Tweets
-
Pinned Tweet
10 ML & NLP Research Highlights of 2019
New blog post on ten ML and NLP research directions that I found exciting and impactful in 2019. https://ruder.io/research-highlights-2019/ … pic.twitter.com/mPoKbkOcOW
-
Sebastian Ruder Retweeted
NLP Newsletter #3: Flax, Thinc, Language-specific BERT models, Meena, Flyte, LaserTagger,…
featuring: @AnimaAnandkumar, @techno246, @hen_str, @jeremyakahn, @lexfridman, @iamtrask, @seb_ruder, @huggingface. GitHub: https://github.com/dair-ai/nlp_newsletter … Medium: https://medium.com/dair-ai/nlp-newsletter-flax-thinc-language-specific-bert-models-meena-flyte-lasertagger-4f7da04a9060 …
-
Curriculum for Reinforcement Learning "Learning is probably the best superpower we humans have."
@lilianweng explores four types of curricula that have been used to help RL models learn to solve complicated tasks. https://lilianweng.github.io/lil-log/2020/01/29/curriculum-for-reinforcement-learning.html … pic.twitter.com/UTWHt9l3ng
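The simplest of these is a task-level curriculum: train on harder tasks only once easier ones are mastered. A toy sketch of that idea (the environment names and mastery threshold below are made up for illustration, not taken from the post):

```python
import random

# Hypothetical difficulty levels for an RL environment, easiest first.
LEVELS = ["corridor", "small_maze", "large_maze", "maze_with_keys"]

def pick_level(success_rates, threshold=0.8):
    """Train on the easiest level the agent has not yet mastered."""
    for level in LEVELS:
        if success_rates.get(level, 0.0) < threshold:
            return level
    # Everything mastered: sample uniformly to avoid forgetting earlier levels.
    return random.choice(LEVELS)

# The agent solves the corridor reliably but not the small maze yet.
print(pick_level({"corridor": 0.95, "small_maze": 0.4}))  # -> small_maze
```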
-
Sebastian Ruder Retweeted
Given the paucity of annotated data, how can we perform sample-efficient generalization on unseen task-language combinations? Possible solution: a generative model of the neural parameter space, factorized into variables for several languages and tasks. 1/2
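Not the paper's actual model, but a rough sketch of the factorization idea: task-specific parameters are generated from separate language and task embeddings, so an unseen language-task pair can be composed from factors seen in other pairs (class and dimension names here are invented for illustration):

```python
import torch
import torch.nn as nn

class FactorizedParameterGenerator(nn.Module):
    """Generate a classifier weight matrix for a (language, task) pair from
    separate language and task embeddings."""

    def __init__(self, n_langs, n_tasks, emb_dim, hidden_dim, n_classes):
        super().__init__()
        self.lang_emb = nn.Embedding(n_langs, emb_dim)
        self.task_emb = nn.Embedding(n_tasks, emb_dim)
        self.generator = nn.Linear(2 * emb_dim, hidden_dim * n_classes)
        self.hidden_dim, self.n_classes = hidden_dim, n_classes

    def forward(self, lang_id, task_id, features):
        # Compose the two factors, then generate the task head's parameters.
        z = torch.cat([self.lang_emb(lang_id), self.task_emb(task_id)], dim=-1)
        w = self.generator(z).view(self.n_classes, self.hidden_dim)
        return features @ w.t()  # logits for this language-task combination

gen = FactorizedParameterGenerator(n_langs=10, n_tasks=4, emb_dim=32,
                                   hidden_dim=768, n_classes=3)
logits = gen(torch.tensor(7), torch.tensor(2), torch.randn(16, 768))
print(logits.shape)  # torch.Size([16, 3])
```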
-
Sebastian Ruder Retweeted
Humans learn from curricula since birth. We can learn complicated math problems because we have accumulated enough prior knowledge. This could be true for training an ML/RL model as well. Let's see how a curriculum can help an RL agent learn: https://lilianweng.github.io/lil-log/2020/01/29/curriculum-for-reinforcement-learning.html …
-
Sebastian Ruder Retweeted
We are happy to welcome @seb_ruder back to Heidelberg! He will give a talk on 6th Feb about "Cross-lingual transfer learning" https://www.cl.uni-heidelberg.de/english/colloquium/index.mhtml …
-
Sebastian Ruder Retweeted
This year I am keen on featuring ML and NLP tools and projects in the NLP Newsletter. They help to inspire other developers and also promote some of the interesting ideas coming from NLP and ML. Reach out if you are working on something interesting and would like to feature it.
-
Sebastian Ruder Retweeted
When you apply a prototype Transmogrifier to language modelling, you get the Mogrifier LSTM https://openreview.net/forum?id=SJe5P6EYvS … and a couple of state-of-the-art results. Joint work with @TomasKocisky and Phil Blunsom. Code at https://github.com/deepmind/lamb. pic.twitter.com/40qTjXsJbp
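The gist of the mechanism: before the usual LSTM update, the input and the previous hidden state repeatedly gate each other. A minimal PyTorch sketch of that mutual gating (not the authors' implementation; see the lamb repository linked above for the real code):

```python
import torch
import torch.nn as nn

class MogrifierLSTMCell(nn.Module):
    """Input x and hidden state h gate each other for a few rounds,
    then a standard LSTM cell performs the usual update."""

    def __init__(self, input_size, hidden_size, rounds=5):
        super().__init__()
        self.rounds = rounds
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Odd rounds rescale x using h; even rounds rescale h using x.
        self.q = nn.ModuleList([nn.Linear(hidden_size, input_size, bias=False)
                                for _ in range((rounds + 1) // 2)])
        self.r = nn.ModuleList([nn.Linear(input_size, hidden_size, bias=False)
                                for _ in range(rounds // 2)])

    def forward(self, x, state):
        h, c = state
        for i in range(1, self.rounds + 1):
            if i % 2 == 1:
                x = 2 * torch.sigmoid(self.q[i // 2](h)) * x
            else:
                h = 2 * torch.sigmoid(self.r[i // 2 - 1](x)) * h
        return self.cell(x, (h, c))

cell = MogrifierLSTMCell(input_size=256, hidden_size=512)
h, c = cell(torch.randn(8, 256), (torch.zeros(8, 512), torch.zeros(8, 512)))
print(h.shape, c.shape)  # torch.Size([8, 512]) torch.Size([8, 512])
```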
-
Sebastian Ruder Retweeted
Machine Learning Summer School 2020 is in Tuebingen, Germany! Please apply. Deadline: 11 Feb 2020. http://mlss.tuebingen.mpg.de/2020
-
Sebastian Ruder Retweeted
At @CompScienceCU we run a "Data and Knowledge Engineering" seminar every Monday. Check out the amazing list of speakers for the Spring semester, with a great mix of internal and external speakers from both industry and academia! pic.twitter.com/ExxQmWXmlC
-
Sebastian Ruder Retweeted
We have both a Dutch ULMFiT model and a Dutch BERT model (BERT-NL), both available on http://www.textdata.nl The paper on Dutch ULMFiT: https://arxiv.org/abs/1910.00896 (focusing on small training set sizes)
-
Sebastian Ruder Retweeted
A few more ;)
- https://huggingface.co/KB/albert-base-swedish-alpha … https://huggingface.co/af-ai-center/bert-base-swedish-uncased …
- https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1 …
- https://huggingface.co/bert-base-japanese-char … https://huggingface.co/bert-base-japanese … https://huggingface.co/yosuke/bert-base-japanese-char …
The full list compatible with transformers is below 
https://huggingface.co/models
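Any entry on that list can be loaded in a couple of lines; a quick sketch, assuming a recent version of the transformers library (the Finnish model ID is taken from the list above, the example sentence is arbitrary):

```python
from transformers import AutoModel, AutoTokenizer

model_name = "TurkuNLP/bert-base-finnish-cased-v1"  # any model ID from the Hub works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Hyvää huomenta!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```
-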
Looking at the replies, there are a lot of BERT models that I missed:
- RuBERT https://arxiv.org/abs/1912.09582
- BETO https://github.com/dccuchile/beto
- BERTje https://arxiv.org/abs/1912.09582
- Portuguese BERT https://github.com/neuralmind-ai/portuguese-bert …
- https://github.com/dbmdz/berts/blob/master/README.md …
Thanks everyone!
-
Sebastian Ruder Retweeted
We are excited to host @seb_ruder on February 6 at http://heidelberg.ai! He will talk about "Cross-lingual Transfer Learning", more info at: https://heidelberg.ai/2020/02/06/ruder.html … @peterjensen_ @maierhein @saakohl
-
Transfer learning is increasingly going multilingual with language-specific BERT models:
- German BERT https://deepset.ai/german-bert
- CamemBERT https://arxiv.org/abs/1911.03894, FlauBERT https://arxiv.org/abs/1912.05372
- AlBERTo http://ceur-ws.org/Vol-2481/paper57.pdf …
- RobBERT https://arxiv.org/abs/2001.06286
-
New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources (via @revue) http://newsletter.ruder.io/archive/217744
-
If you want to learn about privacy-preserving machine learning, then there is no better resource than this step-by-step notebook tutorial by @iamtrask. From the basics of private deep learning to building secure ML classifiers using PyTorch & PySyft. https://github.com/OpenMined/PySyft/tree/master/examples/tutorials …
-
Emil's Story as a Self-Taught AI Researcher
An interview with @EmilWallner with useful tips on structuring a curriculum, creating a portfolio, getting involved in research, and finding a job. https://blog.floydhub.com/emils-story-as-a-self-taught-ai-researcher/ …
-
Is MT really lexically less diverse than human translation? TL;DR: @marian_nmt analyses WMT19 system outputs and finds no difference in lexical diversity (LD) between MT and human translations and no correlation between LD and MT quality. https://marian-nmt.github.io/2020/01/22/lexical-diversity.html …
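For context, lexical diversity is often approximated with measures like the type-token ratio; a toy illustration (the linked analysis uses more careful, length-robust measures than this):

```python
def type_token_ratio(text):
    """Unique tokens divided by total tokens: higher means more lexically diverse."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0

print(type_token_ratio("the cat sat on the mat"))  # 0.83: 5 unique tokens out of 6
```
-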
I also really like the focus on learning, organised around a collection of top resources for each topic: https://practicalai.me/learn/lessons/
-
For instance, you can follow what I'm currently reading here: https://practicalai.me/@sebastianruder/decks/iy6em/What-I'm-reading/?sortby=latest …