Impressive! One caveat, though: Mathematica outputs either a right answer or no answer, while this seq2seq NN approach always outputs an answer, which may be right or wrong.
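(The asymmetry above is softer than it sounds: a proposed antiderivative is cheap to verify after the fact by differentiating it. A minimal sketch with SymPy, using a hypothetical model output as the candidate:)

```python
import sympy as sp

x = sp.symbols("x")
integrand = x * sp.cos(x)
candidate = x * sp.sin(x) + sp.cos(x)  # hypothetical answer from the model

# If d(candidate)/dx minus the integrand simplifies to 0, the answer is right.
is_correct = sp.simplify(sp.diff(candidate, x) - integrand) == 0
print(is_correct)  # True
```

So in practice the NN can be wrapped in a checker that only surfaces verified answers, recovering Mathematica-style "right or nothing" behavior.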
This hasn't even been verified yet: 1. How different is the test data from the training data? 2. Has anybody tried to reproduce this? 3. Related to 1: is there data leakage in this set, and how will it work on out-of-domain data? All seem like reasonable questions to ask.
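(Question 3 is at least partly checkable: canonicalize every expression and count how many test problems already appear in the training set. A toy sketch with made-up data standing in for the real generated datasets:)

```python
import sympy as sp

x = sp.symbols("x")
train = [x * sp.cos(x), sp.sin(x) ** 2, sp.exp(2 * x)]
test = [sp.cos(x) * x, sp.log(x)]  # cos(x)*x is x*cos(x) re-ordered

# srepr of a simplified expression gives a canonical string form,
# so trivially re-ordered duplicates hash to the same key.
canon = lambda e: sp.srepr(sp.simplify(e))
train_set = {canon(e) for e in train}
overlap = sum(canon(e) in train_set for e in test)
print(overlap)  # 1: the re-ordered duplicate is caught
```

This only catches exact (up to simplification) duplicates; near-duplicates that differ by a constant would need a fuzzier comparison.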
New conversation
Take that, @GaryMarcus ;) Next, let's train a transformer to write books that explain what neural networks cannot do :)
New conversation
Does it really integrate though?
What does “really” mean in this context?
New conversation
The part where you use a computer algebra system to make the training set seems circular. Deep learning is smarter than a CAS, as long as you can train it with a CAS. pic.twitter.com/wDZvw1ybCR
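(One nuance here: the paper's "backward" data generation needs only differentiation, which is purely mechanical, not a symbolic integrator. A minimal sketch, with a hypothetical toy expression sampler:)

```python
import random
import sympy as sp

x = sp.symbols("x")
atoms = [x, sp.sin(x), sp.cos(x), sp.exp(x), sp.log(1 + x**2)]

def random_pair(rng):
    # Build a small random expression F, then return (dF/dx, F):
    # F's derivative is the training input, F itself is the target.
    F = rng.choice(atoms)
    for _ in range(2):
        F = rng.choice([F + rng.choice(atoms), F * rng.choice(atoms)])
    return sp.diff(F, x), F

rng = random.Random(0)
integrand, antiderivative = random_pair(rng)
# Consistent by construction, with no integrator involved:
assert sp.simplify(sp.diff(antiderivative, x) - integrand) == 0
```

So the circularity worry applies more to the forward-generated portion of the dataset than to pairs built this way.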
This pattern is now quite common, e.g. training a denoising ray-trace pipeline on fully ray-traced images, or training a physics-sim network on the output of a real physics engine.
End of conversation
New conversation
"The ability of the model to recover equivalent expressions, without having been trained to do so, is very intriguing. This suggests that some deeper understanding of mathematics has been achieved by the model." Will the next "Einstein" be a DNN?