1/ #DeepLearning is only good at interpolation. But applications need extrapolation that can reason about scenarios more complex than those seen in training. With current methods, accuracy degrades rapidly as the complexity of test instances grows. Our new work aims to overcome this
2/ We show that augmenting recursive networks with external memory, such as a stack, significantly improves extrapolation to harder examples.
@ForoughArabsha1 @sameer_ Zhichu Lu https://arxiv.org/abs/1911.01545 pic.twitter.com/48P14SIkbU
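For the gist of the mechanism, here is a minimal, hypothetical sketch (not the paper's code) of a recursive cell that reads the top of an external stack when composing two children and then soft-updates the stack; names such as StackAugmentedCell, hidden_dim, and stack_depth are illustrative assumptions.

```python
# Illustrative sketch only: a recursive (tree) cell augmented with a
# differentiable stack, written against a PyTorch-style API.
import torch
import torch.nn as nn


class StackAugmentedCell(nn.Module):
    """Combines two child states and reads/writes an external stack."""

    def __init__(self, hidden_dim: int, stack_depth: int = 16):
        super().__init__()
        # Compose the two children plus the current stack top into a node state.
        self.compose = nn.Linear(3 * hidden_dim, hidden_dim)
        # Logits for a soft choice among push / pop / no-op.
        self.action = nn.Linear(hidden_dim, 3)

    def forward(self, left, right, stack):
        # left, right: (batch, hidden_dim); stack: (batch, stack_depth, hidden_dim)
        top = stack[:, 0, :]
        h = torch.tanh(self.compose(torch.cat([left, right, top], dim=-1)))

        # Blend the three possible next stacks with softmax weights,
        # so the whole structure stays differentiable.
        probs = torch.softmax(self.action(h), dim=-1)  # (batch, 3)
        push = torch.cat([h.unsqueeze(1), stack[:, :-1, :]], dim=1)
        pop = torch.cat([stack[:, 1:, :], torch.zeros_like(stack[:, :1, :])], dim=1)
        new_stack = (
            probs[:, 0:1, None] * push
            + probs[:, 1:2, None] * pop
            + probs[:, 2:3, None] * stack
        )
        return h, new_stack


# Usage: thread the stack through the tree as children are combined bottom-up.
cell = StackAugmentedCell(hidden_dim=8)
left, right = torch.randn(4, 8), torch.randn(4, 8)
stack = torch.zeros(4, 16, 8)
h, stack = cell(left, right, stack)
print(h.shape, stack.shape)  # torch.Size([4, 8]) torch.Size([4, 16, 8])
```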
Replying to @AnimaAnandkumar @ForoughArabsha1
Interesting, it would be nice to try that in connection with this physics task, where we use tree recursive networks. https://arxiv.org/abs/1702.00748
Indeed! I had discussed using recursive networks for physics problems with @cosmo_shirley. That is the right direction. It would be great to see how much external memory helps and how to share stack parameters among different functions.
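On sharing stack parameters among different functions, one illustrative option (an assumption for this sketch, not something claimed by the paper) is to give each operator its own composition layer while reusing a single stack controller across all of them:

```python
# Hypothetical sketch: per-operator composition layers share one stack controller.
import torch
import torch.nn as nn


class SharedStackController(nn.Module):
    """One push/pop gate whose parameters are shared by every operator cell."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.action = nn.Linear(hidden_dim, 3)  # push / pop / no-op logits

    def forward(self, h, stack):
        probs = torch.softmax(self.action(h), dim=-1)
        push = torch.cat([h.unsqueeze(1), stack[:, :-1, :]], dim=1)
        pop = torch.cat([stack[:, 1:, :], torch.zeros_like(stack[:, :1, :])], dim=1)
        return (
            probs[:, 0:1, None] * push
            + probs[:, 1:2, None] * pop
            + probs[:, 2:3, None] * stack
        )


class OperatorCells(nn.Module):
    """Each operator gets its own compose layer; the stack controller is shared."""

    def __init__(self, operators, hidden_dim: int):
        super().__init__()
        self.compose = nn.ModuleDict(
            {op: nn.Linear(3 * hidden_dim, hidden_dim) for op in operators}
        )
        self.stack_ctrl = SharedStackController(hidden_dim)

    def forward(self, op, left, right, stack):
        top = stack[:, 0, :]
        h = torch.tanh(self.compose[op](torch.cat([left, right, top], dim=-1)))
        return h, self.stack_ctrl(h, stack)


# Usage: different operators ("add", "mul") reuse the same stack parameters.
cells = OperatorCells(["add", "mul"], hidden_dim=8)
h, stack = cells("add", torch.randn(2, 8), torch.randn(2, 8), torch.zeros(2, 16, 8))
```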