OpenAI has secretly built GFM-2, a transformer model trained on 4GB of deep learning criticism by Gary F. Marcus. It reaches superhuman performance using only 50,000 parameters, but OpenAI won’t release it to the public because they think it’s way too dangerous.
Replying to @Plinz
Wonder why they limited it to 4GB? Surely Gary’s tome of DL criticisms has reached the terabyte level by now.
Replying to @DeanSHorak
Whenever they used enough material to achieve full model coherence, they encountered serious mode collapse. There did not seem to be any way to debug this while maintaining generativity.
3:18 AM - 30 Nov 2019