Testing my minimal 3D reaction-diffusion setup in Jupyter.
Visualization is slow af... musing on the best way to port this to #blender. Suggestions welcome. pic.twitter.com/1rOGSQg7Rj
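The simulation code itself isn't shown in the thread; as a point of reference, one update step of a Gray-Scott reaction-diffusion system can be sketched in NumPy as below (grid size, seed pattern, and the feed/kill/diffusion parameters are illustrative assumptions, not the author's actual setup):

```python
import numpy as np

def laplacian(Z):
    # 5-point stencil with periodic (wrap-around) boundaries
    return (np.roll(Z, 1, axis=0) + np.roll(Z, -1, axis=0)
            + np.roll(Z, 1, axis=1) + np.roll(Z, -1, axis=1) - 4 * Z)

def gray_scott_step(U, V, Du=0.16, Dv=0.08, f=0.035, k=0.065, dt=1.0):
    # One explicit Euler step of the Gray-Scott two-chemical model
    UVV = U * V * V
    U = U + dt * (Du * laplacian(U) - UVV + f * (1 - U))
    V = V + dt * (Dv * laplacian(V) + UVV - (f + k) * V)
    return U, V

# Seed: U = 1 everywhere, a small square of V in the center
N = 64
U = np.ones((N, N))
V = np.zeros((N, N))
V[N//2 - 3:N//2 + 3, N//2 - 3:N//2 + 3] = 0.5
for _ in range(100):
    U, V = gray_scott_step(U, V)
```

Running many such steps and dumping each frame to an image is what makes a pure-Jupyter visualization slow; the rest of the thread explores pushing the rendering side into Blender instead.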
First obvious approach: displacement + alpha, via exported video texture (not interactive, can't handle 3D data). pic.twitter.com/kOeCTsLPvH
Other approach: particle system with density controlled by texture (still not interactive nor 3D). pic.twitter.com/GwZkgbRQOH
Other approach: moving plane with a particle-system emitter controlled by a video texture. Not interactive, but can render the evolution of a 2D system in 3D, or render a still of a 3D volume via exported 2D slicing. pic.twitter.com/0EvLkDVyAU
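The "exported 2D slicing" step above amounts to splitting a 3D scalar field into a stack of normalized 2D frames that can be saved out as a video texture; a minimal sketch, assuming a NumPy volume (the array shapes and the 8-bit grayscale output are assumptions):

```python
import numpy as np

def volume_to_slices(vol, axis=2):
    """Split a 3D scalar field into normalized 8-bit 2D slices,
    ready to be written out as the frames of a video texture."""
    lo, hi = vol.min(), vol.max()
    norm = (vol - lo) / (hi - lo) if hi > lo else np.zeros_like(vol)
    frames = np.moveaxis(norm, axis, 0)     # slice index becomes axis 0
    return (frames * 255).astype(np.uint8)  # one 2D frame per slice

vol = np.random.rand(32, 32, 16)     # stand-in for a 3D concentration field
frames = volume_to_slices(vol)       # shape (16, 32, 32)
```

Each frame would then be saved as an image (e.g. with any image library) and loaded in Blender as a video texture driving the moving plane's emitter.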
Simple script to edit an in-memory image used as a texture, with the simulation running in a frame-change handler. Semi-interactive, but still stuck in 2D, as I have no idea how to edit 3D texture data (volumes) programmatically. Code: https://github.com/5agado/data-science-learning/tree/master/graphics/reaction_diffusion pic.twitter.com/soGDYZpdXu
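The core of the in-memory-texture trick is converting the simulation field into the flat RGBA float buffer (values in [0, 1]) that Blender's `Image.pixels` attribute accepts; a hedged NumPy-only sketch of that conversion (the grayscale mapping and function name are assumptions, and the bpy usage in the docstring is hypothetical, not the author's actual handler):

```python
import numpy as np

def field_to_pixels(field):
    """Convert a 2D simulation field to a flat RGBA float buffer.

    In Blender one would then assign it inside a frame-change handler,
    roughly: image.pixels[:] = field_to_pixels(field).tolist()
    registered via bpy.app.handlers.frame_change_pre (hypothetical usage).
    """
    lo, hi = field.min(), field.max()
    gray = (field - lo) / (hi - lo) if hi > lo else np.zeros_like(field)
    rgba = np.empty(field.shape + (4,), dtype=np.float32)
    rgba[..., 0] = rgba[..., 1] = rgba[..., 2] = gray  # grayscale RGB
    rgba[..., 3] = 1.0                                 # opaque alpha
    return rgba.ravel()

buf = field_to_pixels(np.random.rand(8, 8))  # length 8 * 8 * 4 = 256
```

Updating `Image.pixels` per frame is what makes this semi-interactive: the texture refreshes as the timeline advances, but the data stays a 2D image.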
Testing 3D reaction-diffusion in Blender using stacked planes + alpha from a video texture. Simulation is done via my custom Python code. pic.twitter.com/s5UyVyYNhw
I can imagine, it's so much fun to do these things. :D