Stanford NeuroAI Lab

@NeuroAILab

Neuroscience, artificial intelligence, and psychology research at Stanford (PI: Dan Yamins)

Joined November 2017

Tweets


  1. Retweeted
    Dec 4, 2019

    We introduce Dreamer, an RL agent that solves long-horizon tasks from images purely by latent imagination inside a world model. Dreamer improves over existing methods across 20 tasks. Paper, code, and thread 👇

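The "latent imagination" idea can be illustrated with a toy sketch. This is not the Dreamer codebase: the linear dynamics, reward head, and candidate plans below are hypothetical stand-ins for learned networks. The point is that behaviors are scored by rolling out the latent model, never the real environment or pixels.

```python
import numpy as np

# Stand-ins for a *learned* latent world model (hypothetical, not Dreamer's).
A = np.array([[0.9, 0.1], [0.0, 0.95]])  # latent transition matrix
B = np.array([0.0, 1.0])                 # effect of a scalar action
goal = np.array([1.0, 1.0])

def reward(z):
    """Stand-in learned reward head: negative distance to a goal latent."""
    return -np.sum((z - goal) ** 2)

def imagine_return(z0, actions, gamma=0.95):
    """Discounted return of an action sequence, computed purely in latent space."""
    z, ret = z0.copy(), 0.0
    for t, u in enumerate(actions):
        z = A @ z + B * u                # imagined latent transition
        ret += (gamma ** t) * reward(z)
    return ret

# Pick the better of two candidate action sequences by imagined return alone.
z0 = np.zeros(2)
plans = [np.zeros(10), np.full(10, 0.2)]
best = max(plans, key=lambda p: imagine_return(z0, p))
```

In Dreamer the actor is trained on gradients of such imagined returns rather than by comparing a fixed set of plans; the sketch only shows where the return estimate comes from.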
  2. Retweeted
    Dec 4, 2019

    We are co-organizing the next conference on Apr 1, 2020 (no joke!) on the topic "Triangulating Intelligence: Melding Neuroscience, Psychology, and AI". Botvinick, Tenenbaum: save the date!

  3. Retweeted
    Oct 28, 2019

    Best Paper Award: "SinGAN: Learning a Generative Model from a Single Natural Image". Congratulations to the authors, including Tomer Michaeli! The first two authors are women!

  4. Retweeted

    Thank you for inviting me to give the talk, a return to where I began my science career after 18 years! Tomorrow’s technology should be inspired and developed in collaboration with brain and cognitive sciences.

  5. Retweeted
    Oct 16, 2019

    Looking forward to speaking at the Minisymposium "Artificial Intelligence and Neuroscience"! I'll be presenting our work on task-driven recurrent convolutional models of higher visual cortex dynamics. Monday Oct. 21, 9:55-10:15 am, Session 261, Room S406A.

  6. Retweeted
    Oct 15, 2019

    We've trained an AI system to solve the Rubik's Cube with a human-like robot hand. This is an unprecedented level of dexterity for a robot, and is hard even for humans to do. The system trains in an imperfect simulation and quickly adapts to reality:

  7. Retweeted
    Oct 8, 2019

    Heterogeneous graph support is finally here! Many new models: GCMC, RGCN (for heterogeneous graphs), HAN, Metapath2vec. The new DGL-KE package supports efficient training of TransE, ComplEx, and DistMult. Looking forward to new research ideas using the right tool! v0.4 release:

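DGL-KE trains knowledge-graph embedding models such as TransE at scale. As a rough sketch of what the TransE model itself computes (not DGL-KE's implementation; the toy triples and hyperparameters here are made up), a triple (h, r, t) is scored by -||e_h + e_r - e_t||, and training pushes true triples above corrupted ones with a margin ranking loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: entity/relation ids are hypothetical.
n_entities, n_relations, dim = 5, 2, 8
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

def score(h, r, t):
    """TransE score: negative L2 distance between e_h + e_r and e_t."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def sgd_step(h, r, t, t_neg, margin=1.0, lr=0.01):
    """One margin-ranking update: push the true triple above a corrupted one."""
    if score(h, r, t) - score(h, r, t_neg) >= margin:
        return  # margin already satisfied, hinge loss is zero
    # Subgradients of the hinge loss w.r.t. the embeddings.
    d_pos = E[h] + R[r] - E[t]
    d_neg = E[h] + R[r] - E[t_neg]
    g_pos = d_pos / (np.linalg.norm(d_pos) + 1e-9)
    g_neg = d_neg / (np.linalg.norm(d_neg) + 1e-9)
    E[h] -= lr * (g_pos - g_neg)
    R[r] -= lr * (g_pos - g_neg)
    E[t] -= lr * (-g_pos)
    E[t_neg] -= lr * g_neg

# Train on one true triple (0, 0, 1) against a corrupted tail, entity 2.
for _ in range(200):
    sgd_step(0, 0, 1, t_neg=2)
```

After training, the true triple should score higher than the corrupted one; DGL-KE does the same thing with negative sampling, batching, and distributed execution.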
  8. Retweeted

    New work on structuring diverse semantics in 3D space, yielding the 3D Scene Graph! It's showcased on the Gibson database by annotating its models with diverse semantics using a semi-automated method.

  9. Retweeted
    Oct 9, 2019

    The list of accepted papers at the Graph Representation Learning Workshop 2019 is online! (Camera-ready versions will follow later this month). Submission statistics / acceptance rates below 👇

  10. Retweeted
    Oct 2, 2019

    RPN is inspired by classical symbolic planning and modern model-based planning and aims to combine the best of both worlds. We show that RPN can achieve strong zero-shot generalization to multi-step tasks (~50 subgoals) with pixel inputs. (2/3)

  11. Retweeted
    Sep 26, 2019

    Really enjoyed the (non-OpenAI) ICLR submission that trained a transformer on symbolic math. The surprise: it beat Mathematica on symbolic integration and differential-equation solving by a _very_ big margin!

  12. Retweeted
    Sep 25, 2019

    Thank you, Machine Learning community, for getting off to such a great start 🎉. Congratulations to everyone who submitted and for pushing our science to new heights 📈. Read the papers: 2,594 are now online. Next phase: bidding; instructions on Friday.

  13. Retweeted

    We see more significant improvements from training data distribution search (data splits + oversampling factor ratios) than neural architecture search. The latter is so overrated :)

  14. Retweeted
    Sep 10, 2019

    What does deep learning bring to neuroscience? What is the role of theory in the age of deep learning? Answers (and more questions) in our workshop "Brain Against the Machine", Berlin, Sep 17 & 18. Full schedule:

  15. Retweeted
    Sep 10, 2019

    It's hard to scale meta-learning to long inner optimizations. We introduce iMAML, which meta-learns *without* differentiating through the inner optimization path using implicit differentiation. to appear w/

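The trick in the iMAML tweet above, differentiating the *solution* of the inner optimization rather than unrolling its steps, can be seen on a one-dimensional toy problem. This is a hand-made sketch, not the iMAML paper's code; the quadratic objectives are chosen so the implicit derivative has a closed form that the code can check against finite differences.

```python
# Inner objective (iMAML-style proximal regularization toward meta-param phi):
#   G(theta, phi) = 0.5*(theta - a)**2 + 0.5*lam*(theta - phi)**2
# Outer objective: L(theta*) = 0.5*(theta* - b)**2
a, b, lam = 2.0, 0.0, 0.5

def inner_solve(phi, lr=0.1, steps=500):
    """Plain gradient descent on the inner objective; we never backprop through it."""
    theta = phi
    for _ in range(steps):
        grad = (theta - a) + lam * (theta - phi)
        theta -= lr * grad
    return theta

def meta_grad(phi):
    """Implicit meta-gradient: dL/dphi = (dL/dtheta*) * (dtheta*/dphi), where the
    implicit function theorem at the inner optimum (dG/dtheta = 0) gives
    dtheta*/dphi = -(d2G/dtheta2)^-1 * (d2G/dtheta dphi) = lam / (1 + lam)."""
    theta_star = inner_solve(phi)
    dL_dtheta = theta_star - b
    dtheta_dphi = lam / (1.0 + lam)
    return dL_dtheta * dtheta_dphi

# Sanity check against a finite-difference estimate of dL/dphi.
phi, eps = 1.0, 1e-5
L = lambda p: 0.5 * (inner_solve(p) - b) ** 2
fd = (L(phi + eps) - L(phi - eps)) / (2 * eps)
```

In the full iMAML algorithm the Hessian term is a matrix and the inverse is applied approximately with conjugate gradients, but the structure of the meta-gradient is the same: only the inner optimum matters, not the path taken to reach it.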
  16. Retweeted

    One of the coolest aspects of SPIRAL is to use computer programs (as opposed to games) as environments for RL agents. Another is to use a discriminator that provides rewards to the agent.

  17. Retweeted
    Aug 15, 2019
  18. Retweeted
    Aug 15, 2019

    What sizes do these come in? I don’t want to overfit

  19. Retweeted

    Facebook AI researchers have released PHYRE, a new open benchmark for assessing a system's capacity for reasoning about the physical laws that govern real-world environments.

  20. Retweeted
    Aug 13, 2019

    Geoff Hinton: "One big challenge the community faces is that if you want to get a paper published in ML now it's got to have a table in it, with all these different data sets across the top, and all these diff methods along the side, and your method has to look like the best one"


