Tweets
Pinned Tweet
Accept responsibility for your own actions. No excuses, no regrets, no alibis, don't point the finger, don't blame anybody else. - Tony Doherty
Lx Retweeted
I chatted with #MeenaBot about the #coronavirus and her advice is to see a doctor sooner rather than later. I guess it's not a bad one & hope everyone is well! On the other hand, Meena is also excited about technology, especially VR! pic.twitter.com/pKRxfFxp38
Lx Retweeted
That's cool. Pandas has added a `to_markdown()` method for formatting dataframes. pic.twitter.com/nYf0QgrGUO
Lx Retweeted
New blogpost! Transformers from scratch. Modern transformers are super simple, so we can explain them in a really straightforward manner. Includes pytorch code. http://www.peterbloem.nl/blog/transformers … pic.twitter.com/rTLwur0Gwq
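For readers skimming the log: the core operation such posts build up to, scaled dot-product self-attention, can be sketched in a few lines of NumPy. This is a toy single-head version with identity projections, not the blog's actual PyTorch code (a real layer learns separate query/key/value matrices):

```python
import numpy as np

def self_attention(x):
    # x: (seq_len, d) token embeddings. With identity projections,
    # queries = keys = values = x.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)  # (seq_len, seq_len) pairwise similarity
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x  # each output is a weighted mix of all inputs

x = np.random.default_rng(0).normal(size=(4, 8))
out = self_attention(x)
print(out.shape)  # (4, 8): same shape in, same shape out
```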
Lx Retweeted
I've started to upload the videos for the Neural Nets for NLP class here: https://www.youtube.com/playlist?list=PL8PYTP1V4I8CJ7nMxMC8aXv8WqKYwj-aJ … We'll be uploading the videos regularly throughout the rest of the semester, so please follow the playlist if you're interested. https://twitter.com/gneubig/status/1216792330273001472 …
Lx Retweeted
Happy to release NN4NLP-concepts! https://github.com/neulab/nn4nlp-concepts … It's a typology of important concepts that you should know to implement SOTA NLP models using neural nets: https://github.com/neulab/nn4nlp-concepts/blob/master/concepts.md … We'll reference this in CMU CS11-747 this year, trying to maximize coverage. 1/3 https://twitter.com/gneubig/status/1216792330273001472 … pic.twitter.com/ILFeobZmPM
Lx Retweeted
Now that neural nets have fast implementations, a bottleneck in pipelines is tokenization: strings → model inputs.
Welcome Tokenizers: ultra-fast & versatile tokenization led by @moi_anthony:
- encode 1GB in 20sec
- BPE/byte-level-BPE/WordPiece/SentencePiece...
- python/js/rust... pic.twitter.com/1TfJ1Hm1xx
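The "strings → model inputs" step these libraries accelerate can be illustrated with a toy greedy longest-match subword splitter in plain Python. This is the WordPiece inference rule only, with a hand-picked vocabulary; in practice the vocabulary is learned, and the library's speed comes from its Rust implementation:

```python
def wordpiece_tokenize(word, vocab):
    # Greedy longest-match-first: repeatedly take the longest vocab entry
    # that prefixes the remaining characters; continuation pieces get "##".
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            piece = word[start:end] if start == 0 else "##" + word[start:end]
            if piece in vocab:
                tokens.append(piece)
                break
            end -= 1
        else:
            return ["[UNK]"]  # no subword matched at this position
        start = end
    return tokens

vocab = {"token", "##ization", "##izer", "un", "##related"}
print(wordpiece_tokenize("tokenization", vocab))  # ['token', '##ization']
```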
Lx Retweeted
Workera recently published a report on AI career pathways. It doesn't mention hardware. I also don't see the difference b/w SWE-ML & ML Engineer. But it highlights some important distinctions. I also like @josh_tobin_'s talk on the structure of AI teams https://www.youtube.com/watch?v=Qb3RhwNb4EM … pic.twitter.com/KysDChOjgR
Lx Retweeted
How can we rescue a manufacturing sector that can't do "deep learning"? 10 soul-searching questions on industrial AI for 2019 https://www.jiqizhixin.com/articles/2020-01-09 …
Lx Retweeted
Looking back, my last decade was like a neural network. Some parts were linear. Some were nonlinear. I never seemed to get enough data, and always got stuck in local minima. There was a lot of learning. I can't explain how any of it worked, but the results came out alright.
Lx Retweeted
It's incredibly sad for me to say that my time at NVIDIA has ended. I'm grateful for the chance to work w/ so many wonderful people on challenging projects. As I'm going on a new adventure, I put down a quick note on the lessons I learned over the year. https://huyenchip.com/2019/12/23/leaving-nvidia-lessons.html …
Hugging Face raises $15 million to build the definitive natural language processing library – TechCrunch https://techcrunch.com/2019/12/17/hugging-face-raises-15-million-to-build-the-definitive-natural-language-processing-library/ …
Lx Retweeted
New preprint "How Can We Know What Language Models Know?" http://phontron.com/paper/jiang19lpaqa.pdf … Recent work queries LMs for knowledge ("profession") w/ textual questions ("X's profession is Y"). We show you need the *right* Qs: with BERT, just changing how you ask raises accuracy 31% to 38%! pic.twitter.com/VcOwSNB2Ee
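A toy sketch of the paper's core idea, choosing among paraphrased cloze templates for the same relation. The templates and scoring function below are made-up stand-ins; the actual work scores candidates with an LM's fill-in probabilities:

```python
# Several paraphrased ways to ask an LM the same "profession" question.
templates = [
    "{subj}'s profession is [MASK].",
    "{subj} works as a [MASK].",
    "By profession, {subj} is a [MASK].",
]

def best_prompt(subj, score_fn):
    # score_fn is a stand-in for a real LM's confidence in its fill-in;
    # the paper's point is that the choice of template matters a lot.
    prompts = [t.format(subj=subj) for t in templates]
    return max(prompts, key=score_fn)

# Mock scorer that happens to prefer the shortest prompt.
print(best_prompt("Marie Curie", lambda p: -len(p)))
```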
Lx Retweeted
ACL 2019 open-source paper | Attention-based relation prediction for knowledge graphs https://www.jiqizhixin.com/articles/2019-11-15-3 …
Lx Retweeted
XLM-RoBERTa: Amazing results on XLU and GLUE benchmarks from Facebook AI: large transformer network trained on 2.5TB of text from 100 languages. - ArXiv paper: ... https://ai.facebook.com/blog/-xlm-r-state-of-the-art-cross-lingual-understanding-through-self-supervision/ …
From the perspective of researchers who want to use T5, its size is a huge obstacle: the full model is more than thirty times the size of established general-purpose NLP models like BERT. "Google T5 Explores the Limits of Transfer Learning" by @Synced_Global https://link.medium.com/VHTlpJH6r1
"Lessons from How to Lie with Statistics" by @koehrsen_will https://link.medium.com/jFV0Acrsp1
Understanding searches better than ever before @google https://blog.google/products/search/search-language-understanding-bert/ …
"But adults hiding behind children to avoid the difficult conversations that must take place about how to achieve solutions is nothing other than moral cowardice..." https://townhall.com/columnists/benshapiro/2019/09/25/catastrophic-thinking-without-solutions-n2553632 …
Lx Retweeted
I watched the lectures (they are free online), loved the course, recommended it to everyone, and now I'm super excited to be part of it. https://twitter.com/pabbeel/status/1176916292122509313 …
Lx Retweeted
Oh hey, there's a new paper about the YAKE unsupervised keyword extraction algorithm we've been talking about during the live-coding.

Looks like it'll be free to read until November 6: https://authors.elsevier.com/a/1Zl6U4ZQDzlyi
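For context, a minimal frequency-based stand-in for unsupervised keyword scoring. This is not the YAKE algorithm (which combines statistical features such as casing, position, and co-occurrence); it only shows the no-training-data setting the paper works in:

```python
from collections import Counter

def toy_keywords(text, k=3):
    # Score candidates purely by in-document frequency after dropping
    # a tiny hand-made stopword list; YAKE replaces this with a richer
    # per-term statistical score, still with no external corpus or model.
    stop = {"the", "a", "an", "of", "and", "is", "in", "to", "for", "no"}
    words = [w.strip(".,!?;").lower() for w in text.split()]
    counts = Counter(w for w in words if w and w not in stop)
    return [w for w, _ in counts.most_common(k)]

text = "Keyword extraction finds keyword candidates; extraction needs no training."
print(toy_keywords(text))  # repeated terms rank first
```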