ChinHuiChen

@jcchinhui

Looking for a computational neuro PhD opportunity. Senior data engineer passionate about computational neuroscience and AGI.

Taipei
Joined August 2010

Tweets


  1. Retweeted
     Feb 2

     “AMD is aiming to disrupt 4K gaming, and build a Radeon GPU line-up to take on Nvidia in a similar way to how the firm has ramped up Ryzen processors to beat out Intel.” What about OpenCL and deep learning?
  2. Retweeted
     Feb 2

     "Giving BERT a calculator" Big step in adding computational abilities to language models. Still no understanding, but it can certainly start to fool you. What happens if you "give BERT" a thousand other similar abilities? cc
  3. Retweeted
     Feb 2

     This blog on the RCN model is just excellent, with amazing visuals! I highly recommend it, especially if you are interested in generative visual models based on the visual cortex.
  4. Retweeted
     Feb 3
     Replying to

     I remember buying that book and trying to train networks using the software that came with it in 1989. No fun when you had to create your own data and do the training on a crappy Olivetti 8086! Thankfully things have moved on since then!
  5. Retweeted
     Feb 2

     The 1986 classic 'Parallel Distributed Processing' uses the term 'threshold function' instead of 'rectified linear unit'. I prefer the 1986 version :)
  6. Retweeted

     If you want to find out more, I recommend checking out the great Game Changer book by &
  7. Retweeted

     Enjoyed this review on how has influenced the phenomenal World Chess Champion, by his coach, the brilliant It has loads of illustrative games from his incredible unbeaten run in 2019!
  8. Retweeted
     Feb 3

     Meet our guest lecturer Tom Schaul, Senior Research Scientist at ! Tom is a seasoned researcher and part of the team, who joins with a lecture and workshop on Deep RL in Games. Registration is now open! Don't miss out!
  9. Retweeted
     Feb 3

     "One of the things I really like about this article is how it integrates work from the fields of artificial intelligence, psychology, neuroscience, and evolutionary theory." Editor picks Reinforcement Learning, Fast and Slow as her review of 2019
  10. Retweeted
      21 hours ago

      builds tools that transform game development by enabling better game testing, content generation, and player understanding. We are at the forefront of bringing the AI revolution to game development. If your game studio needs an AI injection, call us
  11. Retweeted
      21 hours ago

      I might be biased, but I honestly believe we have the best team of game AI researchers of any private company. The leadership includes and myself. As individual researchers, we have helped shape the field of AI and Games.
  12. Retweeted
      21 hours ago

      Our startup, , is focused on delivering AI tools to game developers. We take state-of-the-art AI research and build products that revolutionize how games can be built. We're very pleased to announce that investors believe in us.
  13. Retweeted
      Dec 30, 2019

      The AI field is in its infancy. A great definition of deep learning would allow the field to be more welcoming, to extend its reach to other fields, and to illuminate the state of discourse.
  14. Retweeted
      Jan 2
      Replying to

      That's the position Bengio and are arguing for: as few priors as possible, and mainly for meta-learning rather than specific knowledge. argues for more prior knowledge pre-wired in from the genome
  15. Retweeted
      Jan 2

      Classical symbolic AI, despite its own weaknesses, has strengths that complement those of deep learning: its representations are abstract, compositional, interpretable, and lend themselves well to high-level reasoning
  16. Retweeted
      Jan 2

      This is just huge. And perhaps just the tip of the iceberg; individual neurons may do lots we haven't credited them with doing. & even a single human-specific innovation like this could be deeply illuminating. & it may really reshape our thoughts on "neurally-inspired" networks.
  17. Retweeted
      Jan 2

      [Recently] there has been a great deal of discussion within the community about the relative merits of and symbolic paradigms, and the need to adopt the best of both approaches is widely recognised (and fostered by people like )... Worth reading!
  18. Retweeted
      Dec 30, 2019

      Four Decades, Four AI Papers. A personal perspective on 30+ years of history at the end of the decade. Apologies for the self-promotion -
  19. Retweeted
      Jan 2
      Replying to

      One major difference in this case is that the XOR is solved by a single dendritic transfer function. Another is that targeted inhibition can cause excitation. Stay tuned for more about the power of this transfer function!
  20. Retweeted
      Jan 4

      "Being amazed that you can make a human with only 20k [protein-coding] genes is like being amazed that Shakespeare could write all his plays with only 26 letters." Ace blog on how much knowledge a genome encodes, inspired by the Bengio/ debate:
