I don't usually retweet my own blog posts, but I feel this one deserves another go now that I have more followers. It's a nice-looking bokeh depth of field in a single pass that you can throw into almost any rendering pipeline with minimal effort. http://blog.tuxedolabs.com/2018/05/04/bokeh-depth-of-field-in-single-pass.html
Thanks for retweeting it, because it gave me an excuse to implement DoF in my engine. I have a question though: am I right that this assumes a reversed depth buffer?
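For context on the reversed-Z question: if the depth texture holds a hyperbolic reversed-Z value in a D3D-style [0, 1] range (an assumption; the blog's shader may instead sample an already-linear depth), recovering view-space distance looks like this sketch:

```python
def linearize_reversed_z(d, near, far):
    """Invert a reversed-Z perspective depth back to view-space distance.

    Assumes a [0, 1] depth range where d = 1 at the near plane and
    d = 0 at the far plane (reversed-Z convention).
    """
    return (far * near) / (near + d * (far - near))

# Sanity checks: the near plane maps to d = 1, the far plane to d = 0.
assert abs(linearize_reversed_z(1.0, 0.1, 100.0) - 0.1) < 1e-6
assert abs(linearize_reversed_z(0.0, 0.1, 100.0) - 100.0) < 1e-6
```

If the engine writes linear normalized depth instead, none of this inversion is needed and the shader's `depth * uFar` already yields a view-space distance.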
-
Also, instead of multiplying the depth sample by uFar, shouldn't it remap the depth from 0-1 to near-far? Near is normally small enough that it doesn't matter, but I'm just trying to make sure I understand everything.
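The point about the near plane can be made concrete. Assuming the depth texture stores linear depth normalized to [0, 1] (an assumption; the thread doesn't say how depth is written), the full remap and the `depth * uFar` shortcut differ by at most `near` world units:

```python
near, far = 0.1, 100.0  # hypothetical clip planes for illustration

def remap_linear(d):
    # Full remap of a linear [0, 1] depth value to [near, far].
    return near + d * (far - near)

def approx(d):
    # The shortcut: just scale the normalized depth by the far plane.
    return d * far

# The absolute error is near * (1 - d), so it never exceeds `near`.
for d in (0.0, 0.25, 0.5, 1.0):
    assert abs(remap_linear(d) - approx(d)) <= near + 1e-9
```

With a near plane of 0.1 units against a 100-unit far plane, the blur radius computed from either value is visually indistinguishable, which is presumably why the shortcut is used.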
-
I am also trying to visualize the getBlurSize() function, and this is what I'm seeing: https://www.desmos.com/calculator/dad9l25lum For DoF in general, this doesn't seem right -- at least not what I would intuit. I'm far from an expert, though.
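The thread doesn't quote getBlurSize() itself, but a common single-pass formulation (an assumption here, not necessarily the blog's exact code) derives a signed circle of confusion from the difference of reciprocal distances, clamps it, and scales it to a pixel radius:

```python
MAX_BLUR_SIZE = 20.0  # hypothetical cap on the blur radius, in pixels

def get_blur_size(depth, focus_point, focus_scale):
    """Sketch of a thin-lens-style CoC: zero at the focus distance,
    growing toward both the near and far sides, clamped to a maximum."""
    coc = (1.0 / focus_point - 1.0 / depth) * focus_scale
    coc = max(-1.0, min(1.0, coc))  # clamp signed CoC to [-1, 1]
    return abs(coc) * MAX_BLUR_SIZE

# Blur is zero exactly at the focus distance and grows away from it.
assert get_blur_size(10.0, 10.0, 2.0) == 0.0
assert get_blur_size(1.0, 10.0, 2.0) > get_blur_size(5.0, 10.0, 2.0)
```

The reciprocal term is why the curve looks asymmetric when plotted against depth: blur ramps up quickly in front of the focus plane but saturates slowly behind it, which matches how a real lens behaves even though it may not match first intuition.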