Gabriele Sarti

@gsarti_

Research Intern, MSc Candidate in Data Science. Interpretability for social good 🗣👥

Trieste, Italy
Joined November 2017

Tweets


  1. Pinned Tweet
    20 Nov 2019

This Friday at , hosted by , I'll be presenting a brief overview of recent advances in language modelling, with a focus on language understanding! people, did I forget to mention something? Slides available here:

  2. 3 Feb

    Totally agree with this. Learning based solely on distributional properties of language is deceptive and will hold us back in the long term. Research in grounded communication among agents in naturalistic settings seems the most promising way forward!

  3. Retweeted
    2 Feb

This is insane. Has anyone computed the carbon footprint of this? It's time for mandatory checks by an ethics committee, and for redirecting the field towards methods that allow replication.

  4. Retweeted
    28 Jan

New paper: Towards a Human-like Open-Domain Chatbot. Key takeaways: 1. "Perplexity is all a chatbot needs" ;) 2. We're getting closer to a high-quality chatbot that can chat about anything. Paper: Blog: [a minimal perplexity sketch follows the timeline]

  5. Retweeted
    27 Jan

    Join us for the 5th Workshop on Representation Learning for NLP at in Seattle! The first call for papers is out now. Deadline is April 6. More info here:

  6. 23 Jan

    Definitely support the idea of running a shared task for COI detection and reviewer-paper matching. Let the community take advantage of its own skills!

  7. Retweeted
    21 Jan

    The call for tasks is open! The deadline for submitting your proposal is February 7th 2020!

  8. Retweeted
    18 Jan

    Talking about opens a giant can of worms. One worm: what is it that we compose and where does it come from? What is it that does composition, and where does it come from? I've tried to put some thoughts together on . /1

  9. Retweeted
    15 Jan

🎉 2019 🎉 was quite the year for Deep Reinforcement Learning. In today's blog post I list my top 10 papers 🦄💻🧠 What was your favourite paper? Let me know!

  10. Retweeted
    15 Jan

    We are likely to be overestimating the true capabilities of machine commonsense across all these benchmarks. From: To appear in

  11. Retweeted
    14 Jan

I often meet research scientists interested in open-sourcing their code/research and asking for advice. Here is a thread for you. First: why should you open-source models along with your paper? Because science is a virtuous circle of knowledge sharing, not a zero-sum competition.

  12. Retweeted
    11 Jan

    Here's a thread surveying some 'classic' work on . Lots of people seem to be discussing this right now, but with partial references to the whole story. My aim is to highlight some of the philosophical and psychological issues in the history of the concept. 1/

  13. 11 Jan

Hey , can you add a "Most cited" criterion for sorting a paper's citations? I often use your platform to delve into new topics, and this would help in finding related work that is highly influential!

  14. Retweeted
    10 Jan

Very happy to share our latest work accepted at : we prove that a Self-Attention layer can express any CNN layer. 1/5 📄Paper: 🍿Interactive website: 🖥Code: 📝Blog: [a toy sketch of the construction follows the timeline]

  15. Retweeted
    7 Jan

New year, new account! Welcome to the Twitter account of the Italian Association of Computational Linguistics! Follow to stay updated on research, projects, tools, and events!

  16. Retweeted
    5 Jan

This article, dear , is a scandal. It denigrates merit in so farcical a way that one is led to doubt the author's mental lucidity. As a winner of , I am ashamed on your behalf.

  17. Retweeted
    5 Jan

In today's Corriere there is an interesting article, written by an academic from the last century, against merit and European research funding. However, since it is written in a somewhat archaic language, a translation is needed in order to read it. [thread of xx tweets 👇]

  18. Retweeted
    3 Jan

Artificial intelligence is the frontier of technology that promises the most advances and employment prospects. At , professionals in the field are already being trained at university. Interview with Luca Bortolussi, coordinator of the Data Science degree programme at

  19. 3 Jan

Correction: stating that XOR cannot be computed by a 1-layer perceptron is false; it is easy to achieve with Gaussian nonlinearities. However, these have proven ineffective for large ANNs. The statement holds only for the standard activations used today (ReLU, softmax, etc.). [a minimal XOR sketch follows the timeline]

  20. 3 Jan

Big news: single human neurons can compute XOR. An ANN needs two layers to achieve the same result, and the ability appears unique to humans among all species. A gentle reminder that, despite the current outstanding results in deep learning, much work still remains to be done!

  21. Retweeted
    2 Jan

    The 2010s were an eventful decade for NLP! Here are ten shocking developments since 2010, and 13 papers* illustrating them, that have changed the field almost beyond recognition. (* in the spirit of and , exclusively from other groups :)).

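A minimal sketch of the perplexity metric behind item 4's "perplexity is all a chatbot needs": perplexity is the exponential of the average per-token negative log-likelihood a model assigns to held-out text. The probabilities below are invented for illustration; nothing here is taken from the Meena paper.

    import math

    # Probabilities a language model assigned to each successive token of a
    # held-out reply, p(w_i | w_<i). These numbers are made up.
    token_probs = [0.21, 0.05, 0.62, 0.11, 0.33]

    # Perplexity = exp of the mean negative log-likelihood per token.
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    perplexity = math.exp(nll)

    print(f"per-token NLL = {nll:.3f}, perplexity = {perplexity:.3f}")

Lower perplexity means the model is less surprised by human text; the tweet's point is that this single intrinsic number tracks human judgments of chatbot quality surprisingly well.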
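A toy sketch of the construction announced in item 14. The paper's proof works through relative positional encodings; here we substitute the limiting case of hard (one-hot) attention: one head per kernel offset, each head attending only to the position at that offset, with the head's value projection acting as the convolution tap. The names and the 1D simplification are ours, not the paper's.

    import numpy as np

    rng = np.random.default_rng(0)
    T, d = 8, 4                       # sequence length, channel dimension
    X = rng.normal(size=(T, d))       # input sequence
    W = {delta: rng.normal(size=(d, d)) for delta in (-1, 0, 1)}  # conv taps

    # Ordinary 1D convolution with zero padding.
    def conv1d(X, W):
        Y = np.zeros_like(X)
        for i in range(T):
            for delta, W_tap in W.items():
                j = i + delta
                if 0 <= j < T:
                    Y[i] += X[j] @ W_tap
        return Y

    # "Self-attention" with one head per tap: each head's attention matrix is
    # a hard one-hot on a fixed relative offset, and its value projection is
    # the corresponding convolution tap. Head outputs are summed.
    def hard_attention_conv(X, W):
        Y = np.zeros_like(X)
        for delta, W_tap in W.items():
            A = np.zeros((T, T))
            for i in range(T):
                if 0 <= i + delta < T:
                    A[i, i + delta] = 1.0   # attend only to position i + delta
            Y += A @ X @ W_tap
        return Y

    assert np.allclose(conv1d(X, W), hard_attention_conv(X, W))

With K² heads, one per pixel offset, the same argument covers a K×K convolution on images, which is the shape of the paper's result.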
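A minimal sketch of the claims in items 19 and 20: a two-layer ReLU network computes XOR, and, as the correction notes, so does a single unit with a Gaussian nonlinearity, whose bump is centred on the line x1 + x2 = 1 containing exactly the two XOR-true inputs. The hand-picked weights are ours, for illustration only.

    import math

    def relu(z):
        return max(0.0, z)

    # Two-layer ReLU network: XOR(x1, x2) = relu(x1 + x2) - 2 * relu(x1 + x2 - 1)
    def xor_two_layer(x1, x2):
        return relu(x1 + x2) - 2 * relu(x1 + x2 - 1)

    # Single unit with a Gaussian nonlinearity: output is near 1 only on the
    # line x1 + x2 = 1, i.e. exactly for the inputs (0, 1) and (1, 0).
    def xor_gaussian_unit(x1, x2, threshold=0.5):
        return math.exp(-(x1 + x2 - 1) ** 2) > threshold

    for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        assert xor_two_layer(x1, x2) == (x1 ^ x2)
        assert xor_gaussian_unit(x1, x2) == bool(x1 ^ x2)
    print("both units compute XOR on all four inputs")

No single linear-threshold unit can do this, since XOR's positive and negative inputs are not linearly separable; that is the impossibility the correction restricts to standard activations.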
