The number of times I've wanted a 1-bit texture on GPUs has reached double digits now. It's either waste a whole 8 bits, or find a really clever but annoying-to-encode-and-decode way to pack 1 bit into an existing texture. Setting aside filtering, why can't GPUs support this?
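The "annoying to encode & decode" workaround might look something like this CPU-side sketch: manually packing 8 mask texels per byte of an ordinary R8 texture, with the shader doing the mirror-image shift-and-mask on sample. All names here are illustrative, not a real GPU API.

```c
#include <stdint.h>

/* Sketch: store a 1-bit-per-texel mask in a byte buffer that would be
   uploaded as a width/8-wide R8 texture. 8 texels packed per byte. */

static void mask_set(uint8_t *bits, int x, int y, int width, int value) {
    int idx = y * width + x;           /* linear texel index */
    uint8_t m = (uint8_t)(1u << (idx & 7));
    if (value) bits[idx >> 3] |= m;    /* set bit */
    else       bits[idx >> 3] &= (uint8_t)~m; /* clear bit */
}

static int mask_get(const uint8_t *bits, int x, int y, int width) {
    int idx = y * width + x;
    return (bits[idx >> 3] >> (idx & 7)) & 1;
}
```

The shader side would `Load` the byte at `idx >> 3` and extract bit `idx & 7` the same way, which is exactly the decode overhead (plus no filtering) that makes this packing irritating in practice.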
-
Considerably less memory bandwidth for stencil-like things. It avoids the massive problem of having to work new state into an already convoluted stencil setup if you do use the stencil buffer. And if you have a combined depth-stencil buffer, you can't sample both in a shader at the same time, etc.
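The bandwidth claim is easy to put numbers on: a 1-bit mask is 8x smaller than the R8 fallback. A rough sketch (the 3840x2160 resolution is just an example):

```c
#include <stdint.h>

/* Back-of-envelope footprint of a full-screen mask at a given bit depth,
   rounded up to whole bytes. */
static uint64_t mask_bytes(uint64_t w, uint64_t h, uint64_t bits_per_texel) {
    return (w * h * bits_per_texel + 7) / 8;
}
```

For a 3840x2160 mask that's 8,294,400 bytes at 8 bpp versus 1,036,800 bytes at 1 bpp, and every stencil-like read or write moves one eighth the data.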