Tweets
Pinned Tweet
Ludwig v0.2 is out! BERT, Audio / Speech, Geospatial and Temporal features, new Visualization API, integrated REST server. Enjoy! https://twitter.com/UberOpenSource/status/1154067074253127681
-
The correct link, because as you should know by now, I can’t type... https://arxiv.org/abs/2001.09977
-
2) the model has 2.5B parameters, 79% score. Adding a single rule to avoid repetition improved performance by about 7%. What do you think twittersphere, can we say that a rule is worth 200M parameters? :) (3/3)
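The Meena paper doesn't appear here in detail, and the exact repetition rule isn't quoted in the tweet, but a common form of such a decoder-side rule is blocking any candidate token that would repeat an n-gram already generated. A minimal sketch, assuming that interpretation (function name and n-gram size are illustrative, not from the paper):

```python
def repeats_ngram(tokens, candidate, n=3):
    """Return True if appending `candidate` to `tokens` would repeat
    an n-gram that already occurs in `tokens` (a simple
    repetition-avoidance decoding rule)."""
    if len(tokens) < n - 1:
        return False
    # n-gram that appending the candidate would create
    new_ngram = tuple(tokens[-(n - 1):] + [candidate])
    # all n-grams already present in the generated sequence
    seen = {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}
    return new_ngram in seen

# Usage: filter candidate continuations during decoding.
history = ["the", "cat", "sat", "on", "the", "cat"]
print(repeats_ngram(history, "sat"))  # "the cat sat" already occurred -> True
print(repeats_ngram(history, "mat"))  # "the cat mat" is new -> False
```

During beam or sampling-based decoding, candidates for which this returns True would simply be excluded before the next token is chosen.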
-
1) The human metric correlates with perplexity, which is pretty unexpected and important (2/3)
-
The paper about Meena (the latest and greatest and biggest chitchat model from Google https://arxiv.org/abs/2001.09977) was really interesting to me for two reasons: (1/3)
-
@hardmaru this is the paper I told you about at EMNLP other than PPLM ;)
-
Surprisingly @VentureBeat already covered it! https://www.google.it/amp/s/venturebeat.com/2020/01/27/ubers-ai-plays-text-based-video-games-with-human-life-self-sufficiency/amp/
-
We just released on arXiv our work on playing text adventure games with Go-Explore, in which we achieve surprising generalization results: https://arxiv.org/abs/2001.08868
@AndreaMadotto @Namazifar @Joost_Huizinga @AdrienLE @tur_gokhan @HuaixiuZheng @Alexand90803163 @chandra_pkhatri
-
Piero Molino Retweeted
Glad to finally release our paper (http://arxiv.org/abs/2001.06463) on Plato, a #ConversationalAI platform (https://github.com/uber-research/plato-research-dialogue-system), with @Alexand90803163 @Namazifar @yichia @w4nderlus7 @tur_gokhan #ArtificialIntelligence #NLProc #NLP #Uber https://twitter.com/arXiv_Daily/status/1219382366566617088
-
-
Piero Molino Retweeted
Is a whole presentation made with xkcd font too much?? Look forward to my first pplm talk at @primer_ai today. With live demo pic.twitter.com/ilIfXbs3jB
-
Piero Molino Retweeted
We failed with declarative for ML long ago ... recently gotten one ε used (Overton/Apple, https://arxiv.org/pdf/1909.05372.pdf) similar to @w4nderlus7's awesome Ludwig https://uber.github.io/ludwig/. IMO declarative helpful when many types of users and model coding not main challenge, c.f. SQL
-
Piero Molino Retweeted
Looks like PPLM is on the front page of Hacker News today
congrats @savvyRL @w4nderlus7 @jasonyo et al.! Thread: https://news.ycombinator.com/item?id=21836473 pic.twitter.com/X7tz9GVnAF
-
Piero Molino Retweeted
Plug and play language models accepted to #ICLR2020. Super excited about this work! Thanks to the amazing @jasonyo and @savvyRL for mentoring and hosting me, and @AndreaMadotto for NLP wisdom! Thanks to @lanjanice @jhung0 @IHaveSweaters @w4nderlus7 for invaluable contributions!
-
Piero Molino Retweeted
Introducing Generative Teaching Networks, which generate entirely synthetic data that is up to 9x faster to train on than real data, enabling state-of-the-art Neural Architecture Search https://eng.uber.com/generative-teaching-networks/ Led by @felipesuch w/ @kenneth0stanley, @joelbot3000, & @AdityaRawaI 1/
-
If you are interested in any elements of the power set of {graph learning, link prediction, meta learning, few-shot tasks} and you are at
#NeurIPS2019 check out our poster at the Graph Representation Learning workshop 11:30-12:30. With @bose_joey @asj_ankit @williamleif
-
Piero Molino Retweeted
This is really cool work from UberAI on a tough question: Is it possible to control the generations of an unconditionally trained language model? We loved it so much that we added it to our repo and made an online demo to play with it! Give it a try
https://transformer.huggingface.co/model/pplm https://twitter.com/savvyRL/status/1202654390630273024 pic.twitter.com/dkcurlIUXJ
-
#PPLM prompted with Dungeons and Dragons stuff and monster topic. pic.twitter.com/EWejNAt5yr
-
Piero Molino Retweeted
PPLM, from the Uber AI team, builds on top of other large transformer-based generative models (like GPT-2), where it enables finer-grained control of attributes of the generated language (e.g. gradually switching topic or sentiment). Try it out: https://transformer.huggingface.co/model/pplm
-
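PPLM itself steers generation by gradient updates to the model's activations using a small attribute model; none of that machinery is shown in these tweets. As a much simpler, loosely related illustration of attribute control at decoding time, here is a toy bag-of-words score bonus on next-token logits (all names and numbers below are made up for the sketch, not from PPLM):

```python
import math

def steer_scores(logits, topic_words, strength=2.0):
    """Toy attribute steering: add a fixed bonus to tokens in a topic
    bag-of-words, then renormalize with a softmax. This is NOT how PPLM
    works internally; it only illustrates biasing generation toward a
    topic at the token-score level."""
    boosted = {tok: s + (strength if tok in topic_words else 0.0)
               for tok, s in logits.items()}
    z = sum(math.exp(s) for s in boosted.values())
    return {tok: math.exp(s) / z for tok, s in boosted.items()}

# Hypothetical next-token scores from a language model:
logits = {"dog": 1.0, "dragon": 0.5, "car": 1.2}
probs = steer_scores(logits, topic_words={"dragon"})
# With the bonus, "dragon" now outranks the other candidates.
print(max(probs, key=probs.get))
```

Raising `strength` shifts the distribution further toward the topic words; PPLM achieves a smoother version of this effect by nudging hidden states rather than final scores.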
You can find the code already added to the @huggingface Transformers repo: https://github.com/huggingface/transformers/blob/master/examples/pplm/README.md
-
Piero Molino Retweeted
PPLM, or how to steer the GPT-2 mammoth with a mouse. Was a super fun project to work on (and the initial foray into NLP for most of us!), led by
@sdathath with tireless collaborators @AndreaMadotto @lanjanice @jhung0 @IHaveSweaters @w4nderlus7 @savvyRL. Blog / arXiv / demo! https://twitter.com/savvyRL/status/1202654390630273024