Conversation

Replying to
Yeah, its specs are pretty similar to a 1080 Ti's, which is the most common card. The 3090 is an absolutely great price point at MSRP for a 24GB card, but they're hard to use in multi-GPU setups. Dataset size doesn't affect the VRAM requirements; model size does.
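
Rough sketch of why that is (a back-of-envelope estimate of my own, not anything official): during training the weights, gradients, and optimizer state all have to sit in VRAM, while the dataset just streams through in batches. Something like:

    # Very rough VRAM estimate for training, in GB. The 4x multiplier is an
    # assumption covering weights + gradients + Adam's two moment buffers;
    # activations add more on top and are ignored here.
    def estimate_training_vram_gb(n_params, bytes_per_param=4, overhead_multiplier=4):
        return n_params * bytes_per_param * overhead_multiplier / 1024**3

    # e.g. a 1.3B-parameter model is ~19 GB before activations, so it fits
    # (barely) on a 24GB 3090 but not on a 12GB card -- regardless of how
    # large the dataset is.
    print(f"{estimate_training_vram_gb(1.3e9):.1f} GB")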
Replying to and
I've mostly used 3090s, as the VRAM on those cards makes them by far the most attractive of the current generation. If you can find the discontinued blower-fan models, even better: you could fit 3 or even 4 in a single 4-slot motherboard without using PCIe risers.
Replying to and
To clarify, the 3060's specs are pretty similar to a 1080 Ti's (roughly 30% faster). Any card with 12GB+ of VRAM will work reasonably; 24GB+ is even better. I keep hearing rumors of Nvidia dropping a 16GB 3000-series card, since the memory bus supports it, but I have yet to see one listed.
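
If you want to check what you've actually got, here's a quick PyTorch snippet (generic, nothing specific to this thread) that lists each card and whether it clears the 12GB bar:

    import torch

    MIN_VRAM_GB = 12  # the rough floor mentioned above

    # List every visible CUDA device with its name and total memory.
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        vram_gb = props.total_memory / 1024**3
        status = "OK" if vram_gb >= MIN_VRAM_GB else "below 12GB"
        print(f"GPU {i}: {props.name}, {vram_gb:.1f} GB ({status})")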