You should look at this guy. He gives some good advice: https://twitter.com/karpathy/status/1013244313327681536
I know! It just makes it worse. I even wrote a whole blog post pointing this out again. It's a "memory-free" bug: the probability of creating/noticing it is independent of the number of times it has been encountered in one's past.
Next time
#phoneafriend :) We'd be glad to take a look, @karpathy!
>>> loss.backward()
"""Hey there! It looks like you're accumulating gradients. 99% of the time this means you've forgotten to use .zero_grad(). Just fyi. Disable this warning with ...."""
? :)
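The behavior the mock warning above pokes fun at is real: PyTorch accumulates gradients into `.grad` across calls to `.backward()` until you zero them. A minimal sketch (not from the thread, just plain PyTorch) of the bug and its fix:

```python
import torch

# A single trainable parameter; the gradient of (2 * w) w.r.t. w is 2.
w = torch.tensor([1.0], requires_grad=True)

loss = (w * 2).sum()
loss.backward()
print(w.grad)  # tensor([2.])

# Forgetting to zero: a second backward pass ADDS to the stored gradient.
loss = (w * 2).sum()
loss.backward()
print(w.grad)  # tensor([4.])  <- accumulated, not recomputed

# The easy-to-forget step: reset the accumulator before the next pass.
w.grad.zero_()
loss = (w * 2).sum()
loss.backward()
print(w.grad)  # tensor([2.])
```

In a training loop this is usually `optimizer.zero_grad()` at the top of each iteration; skipping it silently mixes gradients from every past batch.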
This is the semicolon of deep learning
I once spent 4 hours debugging, only to find a missing ;
To think you are also in the same boat is heartening.
thank you Karpathy godspeed
I recently had a bug in my peer-to-peer logic that I spent 3 long evenings debugging. The fix was 2 lines of code, but it took multiple servers and connections to reproduce the bug. Almost went mad...
So you're using PyTorch? :)
Of course! He's got a lot of ideas to iterate through quickly and wants a familiar line-by-line debugging experience.