The rate of paradigm refinement and fleshing out in ML is just exhausting. Amazed at the people who keep up with the play-by-play. It’s like the early decades after James Watt’s steam engine patents expired and the SOTA galloped forward briskly.
Quote Tweet
Transformers are Sample Efficient World Models
“With the equiv. of 2 hours of gameplay…our approach sets a new SOTA for methods without lookahead search, and even surpasses MuZero.”
Love the simplicity of this approach!
pdf arxiv.org/abs/2209.00588
code github.com/eloialonso/iris
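For readers who want the gist of the linked approach: the paper trains a transformer-based world model and lets the agent learn inside the model's imagined rollouts instead of the real environment. Below is a minimal, illustrative sketch of that imagination loop. All names, sizes, and the toy encoder/dynamics functions here are hypothetical stand-ins, not the actual IRIS implementation (see the linked repo for that).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions -- stand-ins for a discrete-token world
# model (tokenized frames + autoregressive dynamics); illustrative only.
VOCAB = 16             # size of the discrete frame-token vocabulary
TOKENS_PER_FRAME = 4   # tokens per encoded frame
HORIZON = 5            # imagination rollout length

def encode_frame(frame):
    """Stand-in for a discrete (VQ-style) encoder: frame -> tokens."""
    return rng.integers(0, VOCAB, size=TOKENS_PER_FRAME)

def world_model_step(tokens, action):
    """Stand-in for the autoregressive transformer: predict the next
    frame's tokens and a scalar reward from current tokens + action."""
    next_tokens = (tokens + action + 1) % VOCAB
    reward = float(next_tokens.mean()) / VOCAB  # bounded in [0, 1)
    return next_tokens, reward

def imagine(initial_frame, policy, horizon=HORIZON):
    """Roll out the learned world model without touching the real
    environment -- the 'learning in imagination' loop."""
    tokens = encode_frame(initial_frame)
    trajectory = []
    for _ in range(horizon):
        action = policy(tokens)
        tokens, reward = world_model_step(tokens, action)
        trajectory.append((tokens, reward))
    return trajectory

# A trivial policy for the demo: pick an action from the first token.
policy = lambda tokens: int(tokens[0] % 3)

traj = imagine(initial_frame=None, policy=policy)
print(len(traj))  # number of imagined steps
```

The appeal the tweet points at is exactly this separation: once the world model is sample-efficient, the policy can consume as many imagined trajectories as it needs from very little real gameplay.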