Tweets

You have blocked @bassed1984

Are you sure you want to view these Tweets? Viewing them won't unblock @bassed1984.

  1. Retweeted
    Jan 30

    Organizations across industries need accurate short-term, tactical forecasts, such as the amount of goods to be ordered and number of employees needed, to keep pace with their growth.

  2. Retweeted
    Feb 2

    The Plato Research Dialogue System enables experts and non-experts alike to quickly build, train, and deploy conversational AI agents.

  3. Retweeted
    Jan 24

    We're releasing mBART, a new seq2seq multilingual pretraining system for machine translation across 25 languages. It gives significant improvements for document-level translation and low-resource languages. Read our paper to learn more:

  4. Retweeted
    Jan 31

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below.

  5. Retweeted
    Dec 3, 2019

    Ever wanted to combine the NLU superpowers of BERT with the generation superpowers of GPT-2? It's now possible in transformers thanks to !

  6. Retweeted

    To make online search results more useful for training , scientists at Facebook AI are condensing the raw text of those results into knowledge graphs for more efficient processing.

  7. Retweeted
    Nov 22, 2019

    How can we learn a sequence of tasks without forgetting, without class labels and with unknown or ambiguous task boundaries? Continual Unsupervised Representation Learning: Paper: Code:

  8. Retweeted
    Nov 21, 2019
  9. Retweeted
    Nov 16, 2019

    The very impressive new ConvoKit from and his Cornell NLP crew provides easy access to lots of conversational datasets and tools:

  10. Nov 7, 2019

    [1911.02150] Fast Transformer Decoding: One Write-Head is All You Need

  11. Nov 5, 2019

    [1910.08282] Unsupervised Context Rewriting for Open Domain Conversation

  12. Retweeted
    Oct 27, 2019

    How can computers answer questions with multi-step information needs? How can it be done efficiently and interpretably? and colleagues explain at . Paper: Blog post:

  13. Oct 13, 2019

    [1904.09675] BERTScore: Evaluating Text Generation with BERT

  14. Retweeted
    Sep 30, 2019

    Who said that training GPT-2 or BERT was expensive? "We use 512 Nvidia V100 GPUs [...] Upon the submission of this paper, training has lasted for three months [...] and perplexity on the development set is still dropping."

  15. Sep 17, 2019

    A View on the Evolution of Representations in the Transformer from the Information Bottleneck Perspective: a post on the EMNLP 2019 paper

  16. Sep 12, 2019

    Introducing a Conditional Transformer Language Model for Controllable Generation

  17. Retweeted
    Aug 27, 2019

    The excellent interactive book "Dive into Deep Learning" has been ported to PyTorch by students at IIT Roorkee. The book was authored by , et al.

  18. Retweeted
    Aug 22, 2019
  19. Retweeted

    Facebook AI researchers are sharing an all-attention layer to simplify the Transformer model and an adaptive attention span method to make it more efficient. Even with a much simpler architecture, these methods match or improve state-of-the-art results.

  20. Retweeted
    Aug 19, 2019

    Introducing Abductive-NLI! A new benchmark dataset to test an 's abductive reasoning & common sense in forming explanations for a set of observations. Paper: Leaderboard and Data:

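The edit-based generation approach retweeted in item 4 (tagging source tokens with edit operations instead of decoding the target from scratch) can be sketched minimally as follows. The tag set, tuple layout, and `apply_edits` function are illustrative assumptions of mine, not the released implementation:

```python
# Sketch of edit-operation text generation: a tagger predicts one edit tag
# per source token (KEEP or DELETE, optionally with tokens to insert before
# it), and a deterministic realizer applies the tags to produce the output.

def apply_edits(tokens, tags):
    """Apply per-token edit tags to a source token list.

    Each tag is a tuple (op, insert): op is "KEEP" or "DELETE", and insert
    is a (possibly empty) list of tokens added before the source token.
    """
    out = []
    for token, (op, insert) in zip(tokens, tags):
        out.extend(insert)          # insertions attach before the token
        if op == "KEEP":
            out.append(token)       # DELETE simply drops the source token
    return out

# Sentence fusion example: merge two sentences by deleting the boundary
# and inserting a connective, copying every other source token verbatim.
source = ["Turing", "was", "born", "in", "1912", ".",
          "He", "died", "in", "1954", "."]
tags = [
    ("KEEP", []), ("KEEP", []), ("KEEP", []), ("KEEP", []), ("KEEP", []),
    ("DELETE", []),              # drop the sentence-boundary "."
    ("DELETE", ["and"]),         # replace "He" with the connective "and"
    ("KEEP", []), ("KEEP", []), ("KEEP", []), ("KEEP", []),
]
fused = apply_edits(source, tags)
print(" ".join(fused))  # → Turing was born in 1912 and died in 1954 .
```

Because most output tokens are copied from the source, the model only has to learn a small tag vocabulary rather than a full decoder, which is what makes edit-based generation faster and less error-prone on tasks like fusion and summarization.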

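The "Fast Transformer Decoding: One Write-Head is All You Need" preprint in item 10 proposes multi-query attention: incremental decoding is dominated by reloading per-head key/value tensors, so all query heads share a single key head and a single value head. A sketch under my own shape conventions (names and dimensions are assumptions, not the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_query_attention(X, Wq, Wk, Wv):
    """Multi-query attention: h query heads, one shared key/value head.

    X:      (n, d_model) token representations
    Wq:     (h, d_model, d_head) per-head query projections
    Wk, Wv: (d_model, d_head) single shared key/value projections
    Returns (h, n, d_head): per-head attention outputs.
    """
    K = X @ Wk                                 # (n, d_head), shared by all heads
    V = X @ Wv                                 # (n, d_head), shared by all heads
    Q = np.einsum("nd,hde->hne", X, Wq)        # (h, n, d_head)
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # (h, n, n)
    return softmax(scores, axis=-1) @ V        # (h, n, d_head)

rng = np.random.default_rng(0)
h, n, d_model, d_head = 4, 5, 8, 2
out = multi_query_attention(
    rng.normal(size=(n, d_model)),
    rng.normal(size=(h, d_model, d_head)),
    rng.normal(size=(d_model, d_head)),
    rng.normal(size=(d_model, d_head)),
)
print(out.shape)  # → (4, 5, 2)
```

The payoff is that the incremental-decoding key/value cache shrinks by a factor of h relative to standard multi-head attention, since only one K and one V tensor are stored and reloaded per step.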