if your ai model has 8 randomly accessible *terabytes*, different kinds of models become possible; different kinds of knowledge can, at least theoretically, be remembered and encoded / compressed into it. can gpt-3 fit? it can, right?
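As a rough sanity check (my numbers, not from the thread): GPT-3's published size is 175 billion parameters, and at fp16 precision each parameter takes 2 bytes, so the weights alone fit in 8 TB with a lot of room to spare.

```python
# Back-of-the-envelope: do GPT-3's weights fit in 8 TB of storage?
# Assumes the published 175e9 parameter count and fp16 (2-byte) weights.
PARAMS = 175e9
BYTES_PER_PARAM = 2      # fp16
TB = 1e12                # decimal terabyte, as storage is marketed

model_tb = PARAMS * BYTES_PER_PARAM / TB
print(f"GPT-3 weights at fp16: {model_tb:.2f} TB")  # ~0.35 TB, well under 8 TB
```

So raw capacity is not the constraint; the open question is whether the access pattern of inference tolerates storage-class latency.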
-
frequently SSDs allow only a limited number of writes to a particular sector before it fails
-
using SSDs as swap space, for example (depending on the SSD technology), is a good way to wear them out very quickly
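To illustrate why swap is hard on SSDs, here is a rough endurance estimate with illustrative numbers I'm supplying (the TBW rating and the daily swap churn are assumptions, not from the thread):

```python
# Rough SSD endurance estimate under heavy swap use.
# Illustrative assumptions: a consumer drive rated for 600 TBW
# (terabytes written) and ~1 TB/day of swap churn under memory pressure.
TBW_RATING = 600          # drive endurance rating, TB
DAILY_WRITES_GB = 1000    # hypothetical swap traffic per day, GB

days_to_exhaust = TBW_RATING * 1000 / DAILY_WRITES_GB
print(f"Rated endurance exhausted in ~{days_to_exhaust:.0f} days "
      f"({days_to_exhaust / 365:.1f} years)")
```

Under these assumptions the drive's rated write budget is gone in well under two years, versus a decade or more of ordinary desktop use.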
-
Intel Optane, 6TB in RAM sockets and/or SSD versions. https://www.intel.com/content/dam/www/public/us/en/documents/product-briefs/optane-persistent-memory-200-series-brief.pdf
-
amazing tech. i’m thinking about uses
-
"i read somewhere that the new SSDs are in some ways as fast as RAM was 10 years ago.
if that's true, it seems like an entirely new scale of machine learning architecture could be performant and convenient if 8 TB of SSD is available on a professional-level *laptop*..."
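The "as fast as RAM was 10 years ago" claim can be roughly checked on sequential bandwidth. These are my ballpark figures, not the thread's: single-channel DDR3-1333 (~2010-era RAM) peaks around 10.6 GB/s, and a typical PCIe 4.0 x4 NVMe SSD reads around 7 GB/s sequentially.

```python
# Rough bandwidth comparison (ballpark figures, supplied by me):
# DDR3-1333 RAM circa 2010 vs a modern PCIe 4.0 NVMe SSD.
DDR3_GBPS = 10.6   # DDR3-1333 peak, single channel, GB/s
NVME_GBPS = 7.0    # typical PCIe 4.0 x4 NVMe sequential read, GB/s

ratio = NVME_GBPS / DDR3_GBPS
print(f"NVMe sequential read is ~{ratio:.0%} of 2010-era single-channel RAM")
```

The caveat is that this only holds for sequential throughput; random-access latency on NVMe is still several orders of magnitude worse than DRAM, which is the real constraint on what model architectures could run out of SSD.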