I've seen worse :D
-
King's Indian? As a White opening?
-
AlphaZero had the equivalent of ~1700 years of GPU training. Leela is maybe 5% of the way there, and that's being generous, with fewer than 500 nodes contributing. LCZero has even fewer. At least it's a better use of GPU cycles than hashing random functions.
-
Thanks for the good info.
-
Great resource. Would love to get my hands on that
-
Check out https://github.com/glinscott/leela-chess. We are currently setting up the distributed training; self-play games are already working. Volunteers with idle CPU/GPU time are welcome!
-
Thank you for sharing this! I really enjoy your tweets