AllenNLP

@ai2_allennlp

An open-source NLP research library, built on PyTorch

Allen Institute for Artificial Intelligence
Joined August 2018

Tweets

  1. Pinned tweet
    Sep 25, 2019

    AllenNLP v0.9.0 is released today. AllenNLP Interpret is the headline feature. We also now have full compatibility with pytorch-transformers, including RoBERTa (GPT-2 and BERT are backing the LM and masked LM in the demos linked below).

  2. Retweeted
    Jan 22

    Having fun with 's demo page playing with gender bias in BERT

  3. Retweeted
    Jan 13

    🎉 has some new docs! 🎉 These are based on 's unbelievable FastAPI docs, with an additional auto-generated API section directly from allennlp's source. Built with 's great mkdocs-material template. Would recommend!

  4. Retweeted

    A Guide To Machine Reading Comprehension In Production With AllenNLP by Caleb Kaiser

  5. Retweeted
    Nov 19, 2019

    Happy to announce that Matt () and I are working on a free, web-based AllenNLP () course that provides an onboarding to AllenNLP and in-depth tutorials on using the framework and its abstractions for various NLP tasks. Stay tuned!

  6. Retweeted
    Replying to

    No kidding. I've been using recently, and reading their code is a total breath of fresh air on this front.

  7. Nov 15, 2019

    This is today! If you want to come work with us, let us know today.

  8. Retweeted
    Oct 24, 2019

    The team will consider research internship applications on November 15! Please apply by then.

  9. Retweeted
    Oct 24, 2019

    The team will consider predoctoral young investigator applications on November 15 and February 15! Apply here:

  10. Oct 9, 2019

    AllenNLP is looking to hire a research engineer, apply now!

  11. Oct 7, 2019

    Have you built a model on top of AllenNLP? We're looking for examples to showcase. Send us yours:

  12. Retweeted
    Sep 25, 2019

    Try our new Composed Seq2Seq abstractions in AllenNLP v0.9.0. You can now easily experiment with BERT as the encoder, add POS embeddings, or build unusual models like a transformer encoder + LSTM decoder with a few lines of json(net), thanks to the flexibility of allennlp.
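    As a rough illustration of the kind of json(net) the tweet describes, here is a sketch of a composed seq2seq model config pairing an LSTM encoder with an autoregressive decoder. The field names are approximations of AllenNLP 0.9's registered components from memory and may not match the release exactly:

    ```jsonnet
    // Hypothetical sketch, not a verified AllenNLP 0.9 config.
    {
      model: {
        type: 'composed_seq2seq',
        source_text_embedder: {
          token_embedders: {
            // Swapping this for a pretrained-transformer embedder is how
            // you would put BERT on the source side.
            tokens: { type: 'embedding', embedding_dim: 256 },
          },
        },
        // Any registered Seq2SeqEncoder fits here, e.g. a transformer
        // encoder instead of the LSTM.
        encoder: { type: 'lstm', input_size: 256, hidden_size: 512 },
        decoder: {
          decoder_net: {
            type: 'lstm_cell',
            decoding_dim: 512,
            target_embedding_dim: 256,
          },
          target_embedder: { embedding_dim: 256 },
          beam_size: 4,
        },
      },
    }
    ```

    The point of the abstraction is that encoder, decoder net, and embedders are all swapped by editing these config entries rather than writing model code.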

  13. Retweeted
    Sep 25, 2019

    Introducing AllenNLP Interpret: a demo/toolkit for interpreting NLP models. Adversarial attacks, saliency maps, etc. for 𝘢𝘯𝘺 AllenNLP model/task: BERT, SQuAD, NER, ... Demo: Site: w/ [1/7]
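    The gradient-based saliency maps that Interpret provides are easy to illustrate in miniature. The toy below is not AllenNLP's API; it is a self-contained numpy sketch of the underlying idea: score each input token by the magnitude of gradient times embedding for a tiny bag-of-embeddings classifier. All names here are invented for illustration.

    ```python
    import numpy as np

    # Toy model: logistic regression over a bag of token embeddings.
    rng = np.random.default_rng(0)
    emb = {t: rng.normal(size=4) for t in ["the", "movie", "was", "terrible"]}
    w = rng.normal(size=4)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    tokens = ["the", "movie", "was", "terrible"]
    x = np.stack([emb[t] for t in tokens])   # (num_tokens, dim)
    p = sigmoid(w @ x.sum(axis=0))           # predicted probability

    # Gradient of p w.r.t. each token embedding is p * (1 - p) * w here;
    # saliency = L1 norm of (gradient * embedding), per token.
    grad = p * (1 - p) * w
    saliency = np.abs(x * grad).sum(axis=1)

    # Tokens ranked from most to least influential on the prediction.
    ranked = [t for _, t in sorted(zip(-saliency, tokens))]
    ```

    Interpret's saliency interpreters do the analogous computation with real model gradients, for any AllenNLP model, and the demo visualizes the per-token scores.
    
    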

  14. Apr 22, 2019

    AllenNLP Summit 2019: 14 Aug. We're looking for a small group of AllenNLP community members (mostly grad students and postdocs, we expect) to be our guests in Seattle for a day of discussion, brainstorming & networking. Apply by 10 May at

  15. Retweeted
    Mar 3, 2019

    Announcing DROP, a new reading comprehension benchmark that requires discrete reasoning over paragraphs of text. New paper by , , , , , and me.

  16. Retweeted

    Slides for my "notebooks and reproducibility" talk from the workshop on "Reproducible AI":

  17. Jan 8, 2019

    AllenNLP v0.8.1 just released. It follows up our v0.8.0 release--which featured PyTorch 1.0 support--with a number of small improvements.

  18. Dec 13, 2018

    Just merged in a PR to support PyTorch 1.0, expect a release including this (and various other fixes) next week!

  19. Dec 3, 2018

    AllenNLP v0.7.2 is out! We now include a BERT embedder and a tool to help you build model configurations. Full release notes at .
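    For context, wiring the BERT embedder into a model configuration looked roughly like the fragment below. The "bert-pretrained" names approximate the registrables that shipped around 0.7.2 and may differ from the release:

    ```jsonnet
    // Hypothetical config fragment; names are from memory, not verified.
    {
      dataset_reader: {
        token_indexers: {
          bert: { type: 'bert-pretrained', pretrained_model: 'bert-base-uncased' },
        },
      },
      model: {
        text_field_embedder: {
          allow_unmatched_keys: true,
          token_embedders: {
            bert: { type: 'bert-pretrained', pretrained_model: 'bert-base-uncased' },
          },
        },
      },
    }
    ```

    The indexer and embedder keys have to match so the wordpiece ids produced at reading time reach the BERT embedder at embedding time.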

  20. Retweeted
    Nov 16, 2018

    Here's how we beat the state-of-the-art in NLP with HMTL 💪 Happy to finally share our latest paper on multi-task learning: !! And we are also releasing the code!! The training code relies on the AllenNLP library .

