Conversation

Replying to
The problem is that the biggest models, trained on the biggest piles of hardware, keep winning. The small ones are qualitatively always a generation behind. I suspect the outcome will be public training of large models plus hybrid inference on augmented versions of those large models. This is already the case in vision.
Replying to
Search was orders of magnitude cheaper from the get-go — too cheap to meter even before Google. ML is not. Training is very expensive, and even inference at the bleeding edge still seems to be in the dollars-per-inference range, not sub-penny. The unit economics are closer to video streaming circa 1997.
Replying to
I guess it becomes a question of timing. I think you are suggesting an initial big public aggregate push (biggest player wins most?), which "commoditises the competition," followed by niche / enterprise-specific players?