Tweets
-
The new release also includes many other exciting features: checkpointing & caching, AMP, SageMaker integration, flexible LR schedules, early stopping, cross-validation, Windows support and many more! Thanks to all contributors! Details: https://github.com/deepset-ai/FARM/releases/tag/0.4.1 (4/N)
-
#FARM is built on top of the great #transformers by @huggingface. With today's release of v0.4.1, we take a huge step towards framework compatibility by allowing users to convert models seamlessly between FARM <-> transformers and to load models from @huggingface's model hub. (3/N)
-
That's why #opensource #NLP frameworks should be compatible with each other instead of building borders. While there are good reasons to have different tooling for different user groups/use cases, we should build an ecosystem rather than silos. (2/N)
-
#opensource is more than just public code. It's a mindset of sharing, being transparent and collaborating across organizations. It's about building on the shoulders of other projects and advancing the state of technology together. (1/N) @huggingface, @spacy_io, @fastdotai, #NLP pic.twitter.com/YkGjDU6sHu
-
Today's #NLP is heavily fueled by the power of #GPUs. Glad to announce that we are now a member of @NVIDIA's Inception program! Looking forward to even more GPU power and acceleration of our models via #apex & co. @NvidiaAI #NLP #cuda #amp #deeplearning pic.twitter.com/IHG90MzMbr
-
It's based on the nice work by Zhengyan Zhang & Xiaozhi Wang
-
As we believe in #opensource, you can find the public Google Slides here: https://docs.google.com/presentation/d/1INCmHpmFC9s5atS2w5J01wDoEweyzau3dVTktT02gOc/edit?usp=sharing Feel free to use it in your own slides & comment on missing LMs! #openslides
-
It's challenging to keep track of all the latest #languagemodels out there. What was again the difference between #RoBERTa and #BERT? What's the core idea behind #T5? Here's a little (not comprehensive) #cheatsheet that we use for workshops #nlp #NLProc #deeplearning pic.twitter.com/KxBSWv8OTd
-
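In the spirit of that cheatsheet, the headline differences can be captured in a few lines. This is a hypothetical, heavily abridged summary (objectives as described in the BERT, RoBERTa and T5 papers), not the actual workshop sheet:

```python
# Tiny, non-comprehensive pre-training cheatsheet (hypothetical summary,
# not the workshop sheet referenced in the tweet).
LM_CHEATSHEET = {
    "BERT":    "masked LM + next-sentence prediction",
    "RoBERTa": "masked LM only; drops NSP, dynamic masking, more data & longer training",
    "T5":      "text-to-text; every task cast as string-to-string, span-corruption pre-training",
}

def lookup(model: str) -> str:
    """Return the one-line summary for a model name, case-insensitively."""
    for name, summary in LM_CHEATSHEET.items():
        if name.lower() == model.lower():
            return summary
    return "not on the cheatsheet (it is not comprehensive!)"
```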
deepset Retweeted
Transfer learning is increasingly going multilingual with language-specific BERT models:
- German BERT https://deepset.ai/german-bert
- CamemBERT https://arxiv.org/abs/1911.03894 , FlauBERT https://arxiv.org/abs/1912.05372
- AlBERTo http://ceur-ws.org/Vol-2481/paper57.pdf
- RobBERT https://arxiv.org/abs/2001.06286
-
deepset Retweeted
New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources (via @revue) http://newsletter.ruder.io/archive/217744
-
deepset Retweeted
GitHub Repo Spotlight №3: Transfer Learning library for NLP called FARM: https://github.com/deepset-ai/FARM With FARM you can easily use BERT, XLNet, and others for any downstream NLP task. FARM is great for fast prototyping too. #NLP #DataScience #AI
-
Are you doing #NLP in a non-English language? Try the multilingual XLM-R model! It gave us amazing results in German (for the SOTA chasers: yes, it also outperforms previous results with BERT & Co). Blog: https://towardsdatascience.com/xlm-roberta-the-multilingual-alternative-for-non-english-nlp-cf0b889ccbbf
-
deepset Retweeted
v1.4: customizable mobile builds, distributed model parallelism via the experimental RPC API, Java bindings, chaining LR schedulers. Summary: https://pytorch.org/blog/pytorch-1-dot-4-released-and-domain-libraries-updated/ Release notes: https://github.com/pytorch/pytorch/releases/tag/v1.4.0 Last release for Python 2 (bye bye!)
-
deepset Retweeted
Introducing Reformer, an efficiency-optimized #ML architecture, based on the Transformer model for language understanding, that can handle context windows of up to 1 million words, all on a single accelerator with only 16GB of memory. Read all about it ↓ https://goo.gle/2treP7r
-
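The trick behind those memory savings is LSH attention: queries and keys are hashed into buckets via random projections, and each token attends only within its bucket instead of over the full quadratic attention matrix. A minimal sketch of the bucketing step (illustrative only; the real Reformer adds multi-round hashing and chunking):

```python
import numpy as np

def lsh_bucket(vectors: np.ndarray, n_buckets: int, seed: int = 0) -> np.ndarray:
    """Angular LSH: project onto random directions and take the argmax over
    [R, -R] as the bucket id, so that similar vectors tend to collide."""
    assert n_buckets % 2 == 0, "bucket count must be even for the [R, -R] trick"
    rng = np.random.default_rng(seed)
    r = rng.standard_normal((vectors.shape[-1], n_buckets // 2))
    proj = vectors @ r
    return np.argmax(np.concatenate([proj, -proj], axis=-1), axis=-1)
```

Attention is then computed only among tokens that share a bucket id, which is what lets the sequence length grow far beyond what dense attention allows.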
deepset Retweeted
We present our new year special: “oLMpics - On what Language Model pre-training captures”, http://arxiv.org/abs/1912.13283 , exploring what symbolic reasoning skills are learned from an LM objective. We introduce 8 oLMpic games and controls for disentangling pre-training from fine-tuning. pic.twitter.com/ECQ7ZpcKlg
-
deepset Retweeted
Great to see VCs being excited about NLP. Recent examples:
- @Lux_Capital's investment in @huggingface: https://medium.com/@brandon.reeves/our-investment-in-hugging-face-c8497a3519eb
- @Accel's investment in @Rasa_HQ: https://www.accel.com/interests/OurInvestmentInRasa
- "Entering the Golden Age of NLP" by @thresholdvc: https://medium.com/@thresholdvc/neurips-2019-entering-the-golden-age-of-nlp-c8f8e4116f9d
-
deepset Retweeted
What does it mean to understand language? We argue that human-like understanding requires complementary memory systems and rich representations of situations. A roadmap for extending ML models toward human-level language understanding: https://arxiv.org/abs/1912.05877 pic.twitter.com/eKBOekfmgj
-
As promised: here are the slides from Malte's talks in Warsaw!
- Keynote at @pydatawarsaw: https://drive.google.com/file/d/1V81Vn5n5L0z8naiu3tnZfTwt5CUj1Ofp/view?usp=sharing
- Talk at HumanTech: https://drive.google.com/file/d/1uQM3nEGkJh_HWpWJ4TV_FwGDTyCY3MBf/view?usp=sharing
Reach out to us if you have a large Polish text data set (>10GB) and want to train a Polish BERT or ALBERT. pic.twitter.com/noyQpfqzeh
-
2) #FARM v0.3.2: Completely redesigned #QA pipeline, making it simpler & faster than ever to train & use #BERT, #RoBERTa etc. for QA. We got preprocessing of the #SQuAD dataset down to 42s(!). See our blog post for details: https://medium.com/deepset-ai/modern-question-answering-systems-explained-4d0913744097
-
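Under the hood, an extractive QA model scores every token as a potential answer start and answer end; decoding then picks the highest-scoring valid span. A generic sketch of that decoding step (illustrative only, not FARM's actual pipeline code):

```python
def best_span(start_logits, end_logits, max_answer_len=15):
    """Pick (start, end) maximizing start_logit + end_logit, subject to
    start <= end and a length cap -- standard SQuAD-style decoding."""
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best, best_score

# e.g. token 1 is the likeliest start and token 2 the likeliest end:
# best_span([0, 5, 0, 0], [0, 0, 4, 0]) -> ((1, 2), 9)
```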
Excited to announce two releases with a common theme: bringing #QuestionAnswering to the industry! 1) #Haystack: We have a new #NLP framework joining the #FARM family! Focus: all you need for #QA at scale: indexing docs, retrievers, labeling ...! https://github.com/deepset-ai/haystack
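The division of labour in such a QA stack: a retriever first narrows thousands of documents down to a handful of candidates, then a reader model extracts the answer span. A toy word-overlap retriever conveys the idea (purely illustrative; real retrievers use BM25 or dense embeddings, and `retrieve` here is a made-up helper, not Haystack's API):

```python
def retrieve(query: str, docs: list, top_k: int = 2) -> list:
    """Rank documents by how many of their words appear in the query
    (a toy stand-in for BM25 / dense retrieval)."""
    q_terms = set(query.lower().split())
    scored = [(sum(t in q_terms for t in d.lower().split()), i)
              for i, d in enumerate(docs)]
    scored.sort(key=lambda x: (-x[0], x[1]))  # best score first, stable order
    return [docs[i] for score, i in scored[:top_k] if score > 0]
```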