Testing my minimal 3D reaction-diffusion setup in Jupyter.
Visualization is slow af.. musing on the best way to port this to #blender. Suggestions welcome. pic.twitter.com/1rOGSQg7Rj
First obvious approach: displacement + alpha, via an exported video texture (not interactive, can't handle 3D data). pic.twitter.com/kOeCTsLPvH
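A minimal sketch of how such an exported texture could be produced, assuming the simulation state is a 2D numpy array of concentrations in [0, 1] (the function name and frame-naming scheme are hypothetical): each step is written out as an 8-bit grayscale PGM, a sequence that Blender's image-sequence loader (or ffmpeg) can assemble into a video texture.

```python
import numpy as np

def frame_to_pgm(frame, path):
    """Write a 2D float array (values in [0, 1]) as a binary 8-bit PGM image."""
    h, w = frame.shape
    data = (np.clip(frame, 0.0, 1.0) * 255).astype(np.uint8)
    with open(path, "wb") as f:
        f.write(f"P5\n{w} {h}\n255\n".encode("ascii"))  # PGM header
        f.write(data.tobytes())                          # row-major pixel data

# Hypothetical usage: dump every simulation step as frame_0000.pgm, ...
# for i, frame in enumerate(frames):
#     frame_to_pgm(frame, f"frame_{i:04d}.pgm")
```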
Other approach: particle system with density controlled by a texture (still neither interactive nor 3D). pic.twitter.com/GwZkgbRQOH
Other approach: moving plane with a particle-system emitter controlled by a video texture. Not interactive, but can render a 2D system's evolution in 3D, or render a still of a 3D volume via exported 2D slicing. pic.twitter.com/0EvLkDVyAU
Simple script to edit an in-memory image used as a texture, with the simulation running in a frame-change handler. Semi-interactive, but still stuck in 2D, as I have no idea how to edit 3D texture data (volumes) programmatically. Code: https://github.com/5agado/data-science-learning/tree/master/graphics/reaction_diffusion pic.twitter.com/soGDYZpdXu
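The key trick here is that Blender's `Image.pixels` is a flat `[r, g, b, a, r, g, b, a, ...]` float sequence. A sketch of the conversion, assuming the simulation state is a 2D numpy field in [0, 1] (the handler, image name, and `step_simulation` are hypothetical; only the buffer conversion is runnable outside Blender):

```python
import numpy as np

def field_to_rgba_pixels(field):
    """Flatten a 2D concentration field (values in [0, 1]) into the flat
    [r, g, b, a, ...] float list that bpy's Image.pixels expects."""
    h, w = field.shape
    rgba = np.empty((h, w, 4), dtype=np.float32)
    rgba[..., 0] = field   # grayscale: same value in R, G, B
    rgba[..., 1] = field
    rgba[..., 2] = field
    rgba[..., 3] = 1.0     # fully opaque
    return rgba.ravel().tolist()

# Inside Blender (not runnable here), the handler body would look roughly like:
# import bpy
# def on_frame_change(scene):
#     step_simulation()                    # hypothetical: advance the system
#     img = bpy.data.images["rd_texture"]  # hypothetical image name
#     img.pixels[:] = field_to_rgba_pixels(field)
# bpy.app.handlers.frame_change_pre.append(on_frame_change)
```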
Testing 3D reaction-diffusion in Blender using stacked planes + alpha from a video texture. Simulation is done via my custom Python code. pic.twitter.com/s5UyVyYNhw
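A sketch of the stacked-planes idea, assuming the simulation produces a 3D volume indexed (z, y, x) (the threshold value and function names are my own assumptions, not from the thread): slice the volume along z, one slice per plane, and map concentration to alpha so empty regions don't occlude the planes behind them.

```python
import numpy as np

def volume_to_slices(volume):
    """Split a 3D concentration volume (z, y, x) into a list of 2D slices,
    one per stacked plane."""
    return [volume[z] for z in range(volume.shape[0])]

def slice_to_alpha(sl, threshold=0.2):
    """Map concentration to per-pixel alpha, fully transparent below the
    threshold so the inner structure of the volume stays visible."""
    alpha = np.clip(sl, 0.0, 1.0)
    alpha[alpha < threshold] = 0.0
    return alpha
```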
'COEFF_A': 0.12, 'COEFF_B': 0.06, 'FEED_RATE': 0.02, 'KILL_RATE': 0.05
#b3d pic.twitter.com/HL3EtN502q
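Those look like Gray-Scott parameters (diffusion coefficients for the two species plus feed and kill rates). A minimal sketch of one explicit-Euler step using them, assuming a periodic 2D grid with unit spacing (the thread's actual code may differ):

```python
import numpy as np

PARAMS = {'COEFF_A': 0.12, 'COEFF_B': 0.06, 'FEED_RATE': 0.02, 'KILL_RATE': 0.05}

def laplacian(Z):
    """Five-point Laplacian on a periodic grid."""
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

def gray_scott_step(A, B, p=PARAMS, dt=1.0):
    """One explicit Euler step of the Gray-Scott model: A is consumed by the
    reaction A + 2B -> 3B and replenished by the feed; B grows from the
    reaction and is removed at the (feed + kill) rate."""
    reaction = A * B * B
    A_next = A + dt * (p['COEFF_A'] * laplacian(A) - reaction
                       + p['FEED_RATE'] * (1 - A))
    B_next = B + dt * (p['COEFF_B'] * laplacian(B) + reaction
                       - (p['FEED_RATE'] + p['KILL_RATE']) * B)
    return A_next, B_next
```

Seeding a small patch of B into an A-filled grid and iterating this step is what produces the spot/stripe patterns in the videos.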
Replying to @Marcel_Hampel
Ahaha, I feel the simulation is indeed too fast, giving a chaotic feeling.. If you look instead at a slice in pure b/w, it is much smoother and more relaxing. pic.twitter.com/WPCNcPaOpa
mesmerizing! :D looks awesome!