Tweets


  1. Retweeted
    Feb 4

    "How to do machine learning efficiently". There's so much to love about this wonderful article.

  2. Retweeted
    Feb 3
    Replying to

    Meena has exactly the same core issue as ELIZA: it doesn't build a model of what it or the interlocutor has said, and it often contradicts what happened a few turns earlier. Topic without understanding in 1965, topic without understanding in 2020. Here's a sample:

  3. Retweeted
    Feb 3

    The 2.4.0 release of transformers is **𝐌𝐀𝐒𝐒𝐈𝐕𝐄** thanks to our amazing community of contributors. 🔥

  4. Retweeted
    Feb 3

    Today we celebrate the first day of Malvina Nissim as a full professor! Well deserved!

  5. Retweeted
    Feb 2
  6. Retweeted
    Jan 31

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to less errors and faster model execution. Read about it below.

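The announcement above describes generating text by tagging source tokens with edit operations rather than decoding from scratch. A minimal sketch of that idea, with an illustrative tag vocabulary (KEEP, DELETE, optionally paired with an inserted phrase) — the function and tag names here are hypothetical, not the released method's API:

```python
def apply_edits(tokens, tags):
    """Rewrite `tokens` by applying one edit tag per token.

    Each tag is "KEEP", "DELETE", or either of those followed by
    "|phrase", e.g. "DELETE|and" drops the token and inserts "and".
    """
    out = []
    for token, tag in zip(tokens, tags):
        op, _, phrase = tag.partition("|")
        if phrase:
            out.append(phrase)   # insert the added phrase before the token slot
        if op == "KEEP":
            out.append(token)    # copy the source token unchanged
        # op == "DELETE": the source token is dropped
    return " ".join(out)

# Sentence fusion example: merge two sentences into one.
tokens = ["Turing", "was", "born", "in", "1912", ".",
          "He", "died", "in", "1954", "."]
tags = ["KEEP"] * 5 + ["DELETE", "DELETE|and",
                       "KEEP", "KEEP", "KEEP", "KEEP"]
print(apply_edits(tokens, tags))
# -> Turing was born in 1912 and died in 1954 .
```

Because most output tokens are copied from the input, a tagging model of this kind only has to predict a small edit vocabulary, which is where the claimed speed and error-rate advantages over full sequence generation come from.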
  7. Jan 31

    Shamelessly advertised BERTje at yesterday. 😅

  8. Jan 31

    Proud to say that our BERTje from is now available as the default Dutch BERT model in Transformers by ! Some comparisons of BERTje with mBERT, BERT-NL and RobBERT are available on (more coming soon).

  9. Retweeted
    Jan 31

    Transformers 2.4.0 is out 🤗
    - Training transformers from scratch is now supported
    - New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
    - Revamped documentation
    - First multi-modal model, MMBT from , text & images
    Bye bye Python 2 🙃

  10. Retweeted
    Jan 27
  11. Retweeted
    Jan 27

    Transfer learning is increasingly going multilingual with language-specific BERT models:
    - 🇩🇪 German BERT
    - 🇫🇷 CamemBERT, FlauBERT
    - 🇮🇹 AlBERTo
    - 🇳🇱 RobBERT

  12. Retweeted

    So awesome to see the community effort on multi-lingual NLP! In the past month, models in Dutch, Finnish, French, German, Italian, Japanese, Mandarin and Spanish have been added to transformers. Check them all here:

  13. Retweeted
    Dec 23, 2019

    Hot off the press: BERTje. We collected a large and diverse corpus of Dutch and trained a monolingual BERT model. The model is available for download. Paper: joint work by me, Gertjan van Noord & Malvina Nissim


