Tweets

  1. Retweeted
    Feb 3

    is more than just public code. It's a mindset of sharing, being transparent and collaborating across organizations. It's about building on the shoulders of other projects and advancing the state of technology together (1/N)

  2. Retweeted
    Jan 28

    It's challenging to keep track of all the latest out there. What was the difference again between and ? What's the core idea behind ? Here's a little (not comprehensive) that we use for workshops

  3. Retweeted
    Jan 20

    GitHub Repo Spotlight №3: a transfer learning library for NLP called FARM: With FARM you can easily use BERT, XLNet, and others for any downstream NLP task. FARM is great for fast prototyping, too.

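    A minimal sketch of that downstream-task flow, assuming FARM's `Inferencer` API from around this era (the checkpoint path and input text are hypothetical, not official examples):

    ```python
    # Hedged sketch: FARM-style inference with a fine-tuned downstream model.
    from farm.infer import Inferencer

    # Hypothetical path to a fine-tuned checkpoint.
    model = Inferencer.load("saved_models/bert-doc-classifier")

    # FARM reuses the training-time preprocessing pipeline on raw dicts.
    result = model.inference_from_dicts(dicts=[{"text": "GPUs are sold out."}])
    print(result)  # predicted labels with confidence scores
    ```
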
  4. Jan 22

    Just migrated our basic inference API in from to . Really love how simple it is! Looking forward to a deeper integration. Great work! FastAPI: Haystack:

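    A minimal sketch of such a FastAPI inference endpoint (the route and `Question` schema are illustrative assumptions, not Haystack's actual API):

    ```python
    # Hedged sketch: a basic inference endpoint in FastAPI.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class Question(BaseModel):
        # Hypothetical request schema for illustration.
        question: str
        top_k: int = 3

    @app.post("/models/{model_id}/question")
    def ask(model_id: int, request: Question):
        # A real service would run the QA model here; stubbed out in this sketch.
        return {"model": model_id, "question": request.question, "answers": []}

    # Run with: uvicorn main:app --reload
    ```
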
  5. Retweeted
    Jan 17

    Are you doing in a non-English language? Try the multilingual XLM-R model! It gave us amazing results in German (for the SOTA chasers: yes, it's also outperforming previous results with BERT & Co). Blog:

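    Trying XLM-R takes a few lines with the transformers library; a minimal sketch ("xlm-roberta-base" is the public checkpoint name, the German sentence is arbitrary):

    ```python
    # Sketch: encode German text with the multilingual XLM-R model.
    import torch
    from transformers import XLMRobertaModel, XLMRobertaTokenizer

    tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
    model = XLMRobertaModel.from_pretrained("xlm-roberta-base")

    # One tokenizer and one model cover ~100 languages, German included.
    inputs = tokenizer.encode("Maschinelles Lernen ist großartig!", return_tensors="pt")
    with torch.no_grad():
        outputs = model(inputs)
    print(outputs[0].shape)  # (batch, seq_len, hidden_size) token embeddings
    ```
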
  6. Retweeted
    Jan 15

    v1.4: customizable mobile builds, distributed model parallelism via the experimental RPC API, Java bindings, and chainable LR schedulers. Summary: Release Notes: Last release for Python 2 (bye bye!)

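    The chainable LR schedulers compose by calling `step()` on each scheduler attached to the same optimizer; a small sketch (the model, optimizer, and scheduler choices are arbitrary):

    ```python
    # Sketch: chaining two LR schedulers, as enabled in PyTorch 1.4.
    import torch

    model = torch.nn.Linear(10, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Each scheduler applies its own factor to the current learning rate.
    decay = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
    drops = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[5, 8], gamma=0.1)

    for epoch in range(10):
        optimizer.step()
        decay.step()   # chained: both step() calls
        drops.step()   # contribute to the final LR
        print(epoch, optimizer.param_groups[0]["lr"])
    ```
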
  7. Retweeted
    Jan 16

    Introducing Reformer, an efficiency-optimized architecture based on the Transformer model for language understanding, that can handle context windows of up to 1 million words, all on a single accelerator with only 16GB of memory. Read all about it ↓

  8. Retweeted
    Dec 13, 2019

    How big is the CO₂ bucket for 1.5°C? Well, the bucket is about to overflow in a few short years, unless we: 1. Turn off the tap (urgently) 2. Put a hole in the bottom to remove CO₂ (negative emissions)

  9. Retweeted
    Dec 15, 2019

    As promised: here are the slides from Malte's talks in Warsaw! - Keynote at : - Talk at HumanTech: Reach out to us if you have a large Polish text dataset (> 10 GB) and want to train a Polish BERT or ALBERT.

  10. Dec 4, 2019

    Honoured to give a keynote at next week about ! Looking forward to the conference, the people and the city. Let me know if you are also there and want to grab a coffee.

  11. Retweeted
    Nov 28, 2019

    Excited to announce two releases with a common theme: Bringing to the industry! 1) : We have a new framework joining the family! Focus: all you need for at scale: indexing docs, retrievers, labeling ...!

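    A hedged sketch of that indexing/retriever/reader flow as early Haystack READMEs showed it (module paths moved around between early releases; the Elasticsearch host and model name are illustrative assumptions):

    ```python
    # Hedged sketch: question answering at scale with early Haystack.
    from haystack import Finder
    from haystack.database.elasticsearch import ElasticsearchDocumentStore
    from haystack.retriever.elasticsearch import ElasticsearchRetriever
    from haystack.reader.farm import FARMReader

    # Index documents once, then retrieve + read at query time.
    document_store = ElasticsearchDocumentStore(host="localhost", index="document")
    retriever = ElasticsearchRetriever(document_store=document_store)
    reader = FARMReader("deepset/bert-base-cased-squad2")  # illustrative model name

    finder = Finder(reader=reader, retriever=retriever)
    answers = finder.get_answers(question="Who created FARM?",
                                 top_k_retriever=10, top_k_reader=3)
    ```
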
  12. Nov 18, 2019

    It was really a pleasure to speak (again) at in Berlin and see so many new faces joining the community. Many of you asked me about the slides and the framework, so here they are: Slides: FARM framework:

  13. Retweeted
    Nov 5, 2019

    We're releasing the 1.5-billion-parameter GPT-2 model as part of our staged release publication strategy. - GPT-2 output detection model: - Research from partners on potential malicious uses: - More details:

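    The 1.5B checkpoint is available in the transformers library under the name "gpt2-xl"; a hedged sampling sketch (prompt and sampling parameters are arbitrary):

    ```python
    # Sketch: sampling from the 1.5B-parameter GPT-2 ("gpt2-xl").
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
    model = GPT2LMHeadModel.from_pretrained("gpt2-xl")  # multi-GB download

    input_ids = tokenizer.encode("Open source NLP is", return_tensors="pt")
    output = model.generate(input_ids, max_length=40, do_sample=True, top_p=0.9)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
    ```
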
  14. Retweeted
    Oct 28, 2019

    We just released 0.3.0 🎉 with two new language models: and are joining the .

  15. Retweeted
    Oct 18, 2019

    Great to see our open-source resources being used at the competition: - Winner Task 2.1 & 2.2: Used GermanBERT (UPB, ) - Winner Task 2.3: Used FARM & GermanBERT ( , )

  16. Retweeted
    Oct 10, 2019

    We just released 0.2.1 🎉. Another major step towards simpler for the industry! - Better parallelization of preprocessing (less RAM) - Multilabel classification - Simpler multitask learning - Upgrading to transformers 2.0

  17. Retweeted
    Oct 8, 2019

    This is a nice diagram by Zhengyan Zhang and that shows how many recent pretrained language models are connected. The GitHub repo contains a full list of relevant papers:

  18. Retweeted
    Aug 30, 2019

    Hear Malte Pietsch talk about "The Imagenet Moment for NLP: Tips & Tricks for Effective Transfer Learning" at Berlin 2019!

  19. Retweeted
    Sep 13, 2019

    It's great to see the growing landscape of NLP transfer learning libraries: - pytorch-transformers by : - spacy-pytorch-transformers by : - FARM by

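    All three build on the same pretrained checkpoints; a minimal feature-extraction sketch with pytorch-transformers (since renamed to transformers; the example sentence is arbitrary):

    ```python
    # Sketch: extracting BERT features with pytorch-transformers.
    import torch
    from pytorch_transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")

    input_ids = torch.tensor([tokenizer.encode("Transfer learning for NLP")])
    with torch.no_grad():
        last_hidden_states = model(input_ids)[0]  # (batch, seq_len, hidden)
    ```
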
  20. Retweeted
    Aug 19, 2019

    We just released FARM 0.2.0 - making faster & easier: - 🚀 Significant speed-up of preprocessing & training - 🙂 More user-friendly processors for custom datasets - 🐞 Several bug fixes Check it out:

