Andrej Karpathy (Verified account)

@karpathy

Director of AI at Tesla. Previously a Research Scientist at OpenAI, and CS PhD student at Stanford. I like to train Deep Neural Nets on large datasets.

Stanford
Joined April 2009

Tweets


  1. Retweeted

    Tesla will hold a super fun AI party/hackathon at my house with the Tesla AI/autopilot team in about four weeks. Invitations going out soon.
  2. Jan 29

    Big congrats on the launch! 🎉 It's awesome to see more examples of cutting-edge deep learning lifted from proxy tasks in a research lab into real-world applications, where the problems/challenges "backprop" to you from economics. Good luck!
  3. Jan 18

    Open Syllabus Project: Open Syllabus is a non-profit organization that maps the curriculum of higher education. A database of / stats from 7M class syllabi 👏
  4. Jan 15

    Why the Future of Farming is in Cities - The Big Money in Vertical Farming
  5. Jan 10

    Stumbled on a thread on Reddit: "Is it theoretically possible to do object recognition with classification algorithms other than NN’s?". Just ~8 years ago you'd be more likely to find "Is it theoretically possible to do object recognition with NN’s?". That was a fun few years.
  6. Retweeted
    Jan 4
  7. Retweeted
    Dec 30, 2019

    I agree with this article that AI Dungeon 2 (the GPT-2-powered text-based game of infinite possibilities) is "one of the coolest video game experiments of 2019". You should definitely check it out: . Great work!
  8. Jan 4
  9. Jan 4

    Incredible video series (and YouTube channel), thank you for the pointer!
  10. Jan 4

    Metabolic Engineering and Synthetic Biology of Yeast - Jens Nielsen 🤯 (the whole channel is quite good). Bio will grow into a major tech stack. We're writing assembly today, but when the AWS is up things will get interesting.
  11. Jan 2
  12. Jan 1

    I made some bets in 2001 on what 2020 (a crazy future at the time, two whole decades away) would be like. And now it's here. As a very common theme, I way over-predicted the physical and way under-predicted the digital. Maybe I can try to do better now for 2040 :)
  13. Still slowly making my way through this year's NeurIPS talks. I especially like stumbling on good talks from slightly different areas; e.g., tonight I liked "ML Meets Single-Cell Biology". Incredible that we're mapping out cell-state Markov chains for tissues.
  14. Biology is able to pass a lot of information from one individual to another, as lots of animals are born “ready to go” in both perception and control. And a large fraction of children rapidly getting better as they age may be the brain maturing, not magic learning.
  15. A 4-year-old child actually has a few hundred million years of experience, not 4. Their rapid learning/generalization is much less shocking/magical considering this fact.
  16. Retweeted
    Dec 13, 2019

    We're releasing "Dota 2 with Large Scale Deep Reinforcement Learning", a scientific paper analyzing our findings from our 3-year Dota project. One highlight: we trained a new agent, Rerun, which has a 98% win rate vs the version that beat .
  17. Retweeted
    Dec 9, 2019

    Classifiers are secretly energy-based models! Every softmax giving p(c|x) has an unused degree of freedom, which we use to compute the input density p(x). This makes classifiers into generative models without changing the architecture.
  18. This week's excitement and adventure in Machine Learning: ! 🎉 Talks & slides are live and being posted online.
  19. I implemented some normalizing flows yesterday (NICE, RealNVP, MAF, IAF) and tried to make the core of it somewhat clean in case it's helpful. I like how flow layers can be structured similarly to backprop: each needs an invert() and emits a log det J "regularization".
  20. ~2 hours of debugging an issue I thought was due to something I misunderstood in the deep mathematics involved, but I had just forgotten to call `_grad()`. This bug really builds character.
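The energy-based-model retweet (item 17) compresses a short derivation. A sketch of the standard construction it describes, in my own notation (not taken from the tweet itself): a classifier's logits f_θ(x) ∈ R^C define the usual softmax, and the same logits can be reused to define a joint and marginal density.

```latex
% The usual classifier, which fixes only differences of logits:
p_\theta(c \mid x) = \frac{\exp\!\big(f_\theta(x)[c]\big)}{\sum_{c'} \exp\!\big(f_\theta(x)[c']\big)}

% Reusing the absolute scale of the logits (the "unused degree of freedom")
% to define a joint density and, by marginalizing over c, an input density:
p_\theta(x, c) = \frac{\exp\!\big(f_\theta(x)[c]\big)}{Z(\theta)}, \qquad
p_\theta(x) = \sum_c p_\theta(x, c) = \frac{\sum_c \exp\!\big(f_\theta(x)[c]\big)}{Z(\theta)}

% Equivalently, an energy-based model with energy
E_\theta(x) = -\log \sum_c \exp\!\big(f_\theta(x)[c]\big)
```

Dividing the joint by the marginal recovers exactly the original softmax (the intractable Z(θ) cancels), which is why the classifier becomes a generative model "without changing the architecture".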
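The normalizing-flows tweet (item 19) notes that each flow layer needs an invert() and emits a log-det-Jacobian term. A minimal sketch of one such layer, a RealNVP-style affine coupling (my toy illustration, not the code the tweet links to; the single linear maps standing in for the scale/shift networks are placeholder assumptions):

```python
import numpy as np

class AffineCoupling:
    """RealNVP-style affine coupling layer.

    Splits x into (x1, x2): x2 passes through unchanged and parameterizes an
    elementwise affine transform of x1. Invertible by construction, and
    log|det J| = sum(s) comes for free.
    """

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.half = dim // 2
        # Toy stand-ins for the scale/shift networks: single linear maps.
        self.Ws = 0.1 * rng.standard_normal((self.half, dim - self.half))
        self.Wt = 0.1 * rng.standard_normal((self.half, dim - self.half))

    def _scale_shift(self, x2):
        # tanh keeps the log-scale bounded for numerical stability
        return np.tanh(self.Ws @ x2), self.Wt @ x2

    def forward(self, x):
        x1, x2 = x[:self.half], x[self.half:]
        s, t = self._scale_shift(x2)
        y1 = x1 * np.exp(s) + t
        # s.sum() is log|det J| -- the term that enters the flow's log-likelihood
        return np.concatenate([y1, x2]), s.sum()

    def invert(self, y):
        y1, x2 = y[:self.half], y[self.half:]
        s, t = self._scale_shift(x2)   # recomputable: x2 passed through unchanged
        x1 = (y1 - t) * np.exp(-s)
        return np.concatenate([x1, x2])

layer = AffineCoupling(dim=4)
x = np.array([0.5, -1.0, 2.0, 0.3])
y, log_det = layer.forward(x)
x_rec = layer.invert(y)   # round-trips back to x
```

Stacking such layers (alternating which half is transformed) gives a full flow; the log-likelihood accumulates each layer's log-det term, much as backprop accumulates per-layer gradients.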
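The bug in the last tweet has a classic shape in PyTorch-style APIs: gradients accumulate across iterations unless explicitly reset. A pure-Python caricature, under the (hypothetical) assumption that the forgotten call was a gradient reset like `optimizer.zero_grad()`:

```python
def grad(w, x, y):
    # d/dw of squared error 0.5 * (w*x - y)**2
    return (w * x - y) * x

def train(reset_grad, steps=5, lr=0.1):
    """Tiny SGD loop fitting w*1.0 -> 1.0 from w=0."""
    w, g = 0.0, 0.0
    for _ in range(steps):
        if reset_grad:
            g = 0.0               # analogous to optimizer.zero_grad()
        g += grad(w, 1.0, 1.0)    # analogous to loss.backward() (accumulates!)
        w -= lr * g               # analogous to optimizer.step()
    return w

correct = train(reset_grad=True)   # converges smoothly toward w = 1
buggy = train(reset_grad=False)    # stale gradients pile up and overshoot
```

The buggy run looks like a subtle optimization/math problem (unstable, overshooting updates) when the cause is a single missing bookkeeping call, which is exactly why it can eat two hours.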
