Tim Kietzmann

@TimKietzmann

Machine Learning meets Neuroscience, Assistant Professor at the Donders Institute, tweets on neuroscience, deep learning, A.I., & all things science, proud dad

Joined October 2016

Media

  1. Jan 31
  2. Jan 30

    Who of you thought it was a good idea to tell the army to collect data via CAPTCHAS? 😄

  3. Jan 29
  4. Jan 24
    Replying to

    Modeling is the rug that ties the room together.

  5. Jan 24
    Replying to

    For anyone else like me who had never heard of meeting owls:

  6. Jan 23
    Replying to

    We think alike. This is the slide that follows the video.

  7. Jan 23

    Totally including this in the introduction lecture when we speak about the relation between the number of neurons and cognitive abilities. I mean, they need to get a feeling for 86 billion after all...

  8. Jan 21
  9. Jan 15
  10. Jan 14
    Replying to
  11. Jan 14

    I literally just hit ‘tab’ on my keyboard to auto-complete a thought. How is everyone else's day going?

  12. Jan 10
    Replying to

    Absolutely. "Networks as participants" is the project slogan.

  13. Jan 10
    Replying to

    Interesting, would love to learn more. We do look at training trajectories in the paper, too, to see when the representations are affected most and when they start to stagnate.

  14. Jan 10

    Dropout can help, but considerable differences remain. This calls into question the practice of using single network instances to derive neuroscientific insight. Going forward, multiple DNNs may need to be analysed (similar to experimental participants). /fin

  15. Jan 10

    What are the origins of this? We argue that the categorization objective does not sufficiently constrain the arrangement of category clusters and exemplars. In addition, the interplay of ReLUs and properties of certain distance measures contributes to differences. 6/7

  16. Jan 10

    Simply changing the random seed leads to considerable individual differences (shared variance in distance estimates can be as low as 44% across networks). The size of the effect is comparable to training networks with completely different image sets. 5/7

  17. Jan 10

    Here we test this by training multiple identical network instances while varying only the random seed during weight initialisation. We compare the learned representations using a technique from systems neuroscience: representational similarity analysis (RSA). 4/7

  18. Jan 10

    Deep neural networks have seen a surge in popularity in neuroscience and psychology, where they are used as a modelling framework to understand (visual) information processing in the brain. 2/7

  19. Jan 9
    Replying to
  20. Jan 9
    Replying to

    Why did I not see this? Of course you are right, AI stands for "Artificial Ingredients". It all makes sense now.
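The seed-variation experiment in the thread above (training otherwise identical network instances that differ only in their random seed, then comparing their learned representations with representational similarity analysis) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual pipeline: the stimuli, layer size, and noise model are placeholder toy data.

```python
import numpy as np

def rdm(activations):
    """Representational dissimilarity matrix: 1 minus the Pearson
    correlation between activation patterns for every pair of stimuli.
    `activations` has shape (n_stimuli, n_units)."""
    return 1.0 - np.corrcoef(activations)

def rsa_compare(acts_a, acts_b):
    """Correlate the upper triangles of two networks' RDMs.
    Returns the correlation and the shared variance (r squared)."""
    rdm_a, rdm_b = rdm(acts_a), rdm(acts_b)
    iu = np.triu_indices_from(rdm_a, k=1)  # unique stimulus pairs only
    r = np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]
    return r, r ** 2

# Toy stand-in for two trained instances ("seeds"): responses of a
# 128-unit layer to 50 stimuli, sharing stimulus structure plus
# seed-specific variation. In real use these would be activations
# extracted from separately trained networks.
rng = np.random.default_rng(0)
shared = rng.normal(size=(50, 128))
net1 = shared + 0.5 * rng.normal(size=(50, 128))
net2 = shared + 0.5 * rng.normal(size=(50, 128))
r, shared_var = rsa_compare(net1, net2)
```

The shared-variance figure quoted in the thread (as low as 44% across networks) corresponds to `shared_var` here: the squared correlation between the two networks' pairwise distance estimates.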
