Tweets by @deepset_ai

  1. 3 hours ago

    The new release also includes many other exciting features: checkpointing & caching, AMP, SageMaker integration, flexible LR schedules, early stopping, cross-validation, Windows support, and more! Thanks to all contributors! Details: (4/N)

  2. 3 hours ago

    FARM is built on top of the great transformers by Hugging Face. With today's release of v0.4.1, we take a huge step towards framework compatibility by allowing users to convert models seamlessly between FARM <-> transformers and load models from Hugging Face's model hub. (3/N) (A conversion sketch follows after the timeline.)

  3. 3 hours ago

    That's why frameworks should be compatible with each other instead of building borders. While there are good reasons to have different tooling for different user groups/use cases, we should build an ecosystem rather than silos. (2/N)

  4. 3 hours ago

    Open source is more than just public code. It's a mindset of sharing, being transparent, and collaborating across organizations. It's about building on the shoulders of other projects and advancing the state of technology together. (1/N)

  5. Jan 29

    Today's NLP is heavily fueled by the power of GPUs. Glad to announce that we are now a member of NVIDIA's Inception program! Looking forward to even more GPU power and acceleration of our models!

  6. Jan 28

    It's based on the nice work by Zhengyan Zhang & Xiaozhi Wang 🧡

  7. Jan 28

    As we believe in sharing, you can find the public Google Slides here: Feel free to use them in your own slides & comment on missing LMs!

  8. Jan 28

    It's challenging to keep track of all the latest language models out there. What was the difference between all of them again? What's the core idea behind each one? Here's a little (not comprehensive) overview that we use for workshops.

  9. Retweeted
    Jan 27

    Transfer learning is increasingly going multilingual with language-specific BERT models: - 🇩🇪 German BERT - 🇫🇷 CamemBERT, FlauBERT - 🇮🇹 AlBERTo - 🇳🇱 RobBERT (a loading sketch follows after the timeline)

  10. Retweeted
    Jan 27

    New NLP News: NLP Progress, retrospectives and a look ahead, new NLP courses, independent research initiatives, interviews, lots of resources (via )

  11. Retweeted
    Jan 20

    GitHub Repo Spotlight №3: a transfer learning library for NLP called FARM: With FARM you can easily use BERT, XLNet, and others for any downstream NLP task. FARM is great for fast prototyping too.

  12. Jan 17

    Are you doing NLP in a non-English language? Try the multilingual XLM-R model! It gave us amazing results in German (for the SOTA chasers: yes, it also outperforms previous results with BERT & co). Blog: (A usage sketch follows after the timeline.)

  13. Retweeted
    Jan 15

    PyTorch v1.4: customizable mobile builds, distributed model parallelism via the experimental RPC API, Java bindings, chaining of LR schedulers. Summary: Release notes: Last release for Python 2 (bye bye!) (A scheduler-chaining sketch follows after the timeline.)

  14. Retweeted
    Jan 16

    Introducing Reformer, an efficiency-optimized architecture based on the Transformer model for language understanding that can handle context windows of up to 1 million words, all on a single accelerator with only 16GB of memory. Read all about it ↓

  15. Retweeted
    Jan 1

    We present our new year special: “oLMpics - On what Language Model pre-training captures”, exploring what symbolic reasoning skills are learned from an LM objective. We introduce 8 oLMpic games and controls for disentangling pre-training from fine-tuning.

  16. Retweeted
    Dec 16, 2019

    What does it mean to understand language? We argue that human-like understanding requires complementary memory systems and rich representations of situations. A roadmap for extending ML models toward human-level language understanding:

  17. Dec 15, 2019

    As promised, here are the slides from Malte's talks in Warsaw! - Keynote at : - Talk at HumanTech: Reach out to us if you have a large Polish text dataset (>10GB) and want to train a Polish BERT or ALBERT.

  18. Nov 28, 2019

    2) FARM v0.3.2: Completely redesigned data pipeline, making it simpler & faster than ever to train & use models for QA. We got preprocessing of the dataset down to 42s(!) 🚀. See our blog post for details: (A QA inference sketch follows after the timeline.)

  19. Nov 28, 2019

    Excited to announce two releases with a common theme: bringing NLP to the industry! 1) : We have a new framework joining the family! Focus: all you need for QA at scale: indexing docs, retrievers, labeling ...!

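The sketches below expand on the tweets above. First, the FARM <-> transformers round trip from the v0.4.1 thread (tweet 2). A minimal sketch, assuming FARM >= 0.4.1 and its AdaptiveModel conversion methods; the checkpoint name and the exact task_type value are assumptions to check against the FARM docs.

```python
# Sketch: converting models between FARM and Hugging Face transformers.
from farm.modeling.adaptive_model import AdaptiveModel

# Load a model from the Hugging Face model hub into FARM.
# "deepset/bert-base-cased-squad2" is one example QA checkpoint.
farm_model = AdaptiveModel.convert_from_transformers(
    "deepset/bert-base-cased-squad2",
    device="cpu",
    task_type="question_answering",
)

# ...fine-tune with FARM as usual, then convert back for use
# with the transformers ecosystem.
transformers_model = farm_model.convert_to_transformers()
```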
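The language-specific models in the retweeted list (tweet 9) load through Hugging Face transformers. A minimal sketch, assuming the hub IDs "bert-base-german-cased" and "camembert-base"; check the model hub for the Italian and Dutch checkpoints.

```python
# Sketch: loading language-specific BERT-style models via transformers.
from transformers import AutoModel, AutoTokenizer

# German BERT (by deepset) and French CamemBERT.
de_tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
de_model = AutoModel.from_pretrained("bert-base-german-cased")
fr_tokenizer = AutoTokenizer.from_pretrained("camembert-base")
fr_model = AutoModel.from_pretrained("camembert-base")

# Encode a German sentence and inspect the last hidden states.
inputs = de_tokenizer("Heute ist ein schöner Tag.", return_tensors="pt")
hidden_states = de_model(**inputs)[0]
print(hidden_states.shape)  # (batch, seq_len, hidden_size)
```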
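For the XLM-R recommendation (tweet 12): a minimal sketch of setting up the multilingual model for a downstream classification task, assuming the "xlm-roberta-base" hub checkpoint; the German fine-tuning setup from the blog post is not reproduced here.

```python
# Sketch: XLM-R with an (untrained) classification head for non-English text.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base",
    num_labels=2,  # hypothetical binary task; fine-tune before relying on it
)

# One tokenizer and one model cover ~100 languages.
inputs = tokenizer("Der Film war großartig!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```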
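The PyTorch v1.4 retweet (tweet 13) mentions chaining LR schedulers: since 1.4, two schedulers stepped back to back compose instead of overwriting each other. A minimal sketch:

```python
# Sketch: chaining two LR schedulers (supported since PyTorch 1.4).
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Per epoch: multiply the LR by 0.9, and by a further 0.1 every 3 epochs.
exp_sched = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
step_sched = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)

for epoch in range(6):
    # ... forward/backward pass would go here ...
    optimizer.step()
    exp_sched.step()
    step_sched.step()
    print(epoch, optimizer.param_groups[0]["lr"])
```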
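For the FARM v0.3.2 QA release (tweet 18): a minimal inference sketch using FARM's Inferencer. The checkpoint name and the "questions"/"text" dict format follow FARM's QA examples as I recall them; treat both as assumptions to verify.

```python
# Sketch: extractive question answering with FARM's Inferencer.
from farm.infer import Inferencer

nlp = Inferencer.load(
    "deepset/bert-base-cased-squad2",  # QA checkpoint from the model hub
    task_type="question_answering",
)

qa_input = [{
    "questions": ["Who built FARM?"],  # hypothetical example
    "text": "FARM is an open-source transfer learning framework "
            "for NLP built by deepset.",
}]
result = nlp.inference_from_dicts(dicts=qa_input)
print(result)
```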
