Tweets
Pinned Tweet
I can help your company spend “less time reading” English text data and more time “acting on insights.” DM me for details.
#DeepLearning #AI #NLProc Resume: https://docs.google.com/document/d/1ik_tiZMR-cmkZySm8f5XJG_y6jQD2ejVy7M5VGZ3HS0/edit?usp=sharing Recent work: http://famousquotes.ralphabrooks.com Technical blog: https://towardsdatascience.com/best-practices-for-nlp-classification-in-tensorflow-2-0-a5a3d43b7b73
Ralph Brooks Retweeted
I just published my first post on @huggingface Medium: "Is the future of Neural Networks Sparse?" https://link.medium.com/In4bINyeO3 Enjoy!
Ralph Brooks Retweeted
New Study Suggests Self-Attention Layers Could Replace Convolutional Layers on Vision Tasks
#machinelearning #NLP https://medium.com/syncedreview/new-study-suggests-self-attention-layers-could-replace-convolutional-layers-on-vision-tasks-251f518b76a6
Ralph Brooks Retweeted
Hidden Markov Models have gotten a bit less love in the age of deep learning, but they are really nifty models that can learn even from tiny datasets. I've written a notebook introducing HMMs and showing how to implement them in PyTorch—check it out here: https://colab.research.google.com/drive/1IUe9lfoIiQsL49atSOgxnCmMR_zJazKI
Show this thread
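The forward algorithm at the heart of an HMM can be sketched in a few lines of plain Python. The toy weather model below (states, transition and emission probabilities) is invented for illustration and is not taken from the linked notebook:

```python
# Minimal sketch of the HMM forward algorithm in plain Python.
# The toy probabilities below are invented, not from the notebook.

def hmm_forward(obs, states, start_p, trans_p, emit_p):
    """Return P(obs) under the HMM via the forward algorithm."""
    # alpha[s] = P(obs[:t+1], state_t = s), updated one step at a time
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {
            s: sum(alpha[prev] * trans_p[prev][s] for prev in states) * emit_p[s][o]
            for s in states
        }
    return sum(alpha.values())

# Toy weather model: hidden weather states emit observed activities.
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

likelihood = hmm_forward(("walk", "shop", "clean"), states, start_p, trans_p, emit_p)
```

A PyTorch version, as in the notebook, would vectorize the `alpha` update as a matrix product over the transition matrix, which is what makes the model trainable by gradient descent.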
Ralph Brooks Retweeted
Google's Reformer paper is a much bigger deal than it is being made out to be. BERT and GPT-2 were not practical given the insane training times/costs. Reformer has the potential to cause a mini NLP revolution over the next few months! - https://ai.googleblog.com/2020/01/reformer-efficient-transformer.html
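Reformer's efficiency rests on locality-sensitive hashing: queries and keys are hashed so that similar vectors land in the same bucket, and attention is computed only within buckets. A minimal sketch of the random-projection bucketing idea (illustrative only — not the paper's exact scheme or code):

```python
# Illustrative sketch of LSH bucketing, the idea behind Reformer's
# chunked attention. Random hyperplanes assign each vector a bucket id
# from the signs of its projections; similar vectors tend to collide.
import random

def lsh_bucket(vec, planes):
    """Hash a vector to a bucket id from the signs of random projections."""
    bucket = 0
    for plane in planes:
        dot = sum(v * p for v, p in zip(vec, plane))
        bucket = (bucket << 1) | (1 if dot >= 0 else 0)
    return bucket

random.seed(0)
dim, n_planes = 4, 3  # 2**3 = 8 possible buckets
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]

v = [0.5, -1.0, 0.2, 0.9]
# Scaling a vector leaves every projection's sign unchanged,
# so it always lands in the same bucket.
same = lsh_bucket(v, planes) == lsh_bucket([2 * x for x in v], planes)
```

In Reformer, attention then runs only inside each bucket's chunk, dropping the cost from quadratic to roughly O(L log L) in sequence length L.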
Ralph Brooks Retweeted
This textbook on NLP is just beautiful https://web.stanford.edu/~jurafsky/slp3/
Ralph Brooks Retweeted
If you want to go beyond the stuff you will learn in
#MachineLearning MOOCs follow this class from @AnimaAnandkumar (not an easy class if you take it seriously). Video lectures: https://www.youtube.com/playlist?list=PLVNifWxslHCDlbyitaLLYBOAEPbmF1AHg https://twitter.com/AnimaAnandkumar/status/1222114841386577921
Ralph Brooks Retweeted
How to “flex” when you’re a distributed employee: 1. Buy a fancy microphone for Zoom calls - you’ll sound like a talk show host. 2. Add a unique background behind video calls - standard wallpaper is so 2010’s. 3. Lighting is everything - pick good light to look like a TV star.
Ralph Brooks Retweeted
Open source alert
today we are sharing the code that accelerated BERT inference 17x and allowed us to use the model for @Bing web search at scale
code is available for both @PyTorch and @TensorFlow. Thanks @Azure ML for the great collaboration! https://cloudblogs.microsoft.com/opensource/2020/01/21/microsoft-onnx-open-source-optimizations-transformer-inference-gpu-cpu/
Ralph Brooks Retweeted
NLP community: Interpreting text models with Captum – an open source, extensible library for model interpretability built on PyTorch. Sentiment analysis and interpreting BERT models in the tutorials. https://captum.ai/tutorials/IMDB_TorchText_Interpret pic.twitter.com/o0xbPPEsFb
Ralph Brooks Retweeted
NLP Newsletter (Issue #2): Reformer, DeepMath, ELECTRA, TinyBERT for Search, VizSeq, Open-Sourcing ML,…
featuring: @refikanadol, @__MLT__, @Thom_Wolf, @HanGuo97, @WWRob, @iamtrask, ... GitHub: https://github.com/dair-ai/nlp_newsletter Medium: https://medium.com/dair-ai/nlp-newsletter-reformer-deepmath-electra-tinybert-for-search-vizseq-open-sourcing-ml-68d5b6eed057
Ralph Brooks Retweeted
Google Reformer: Transformer that can process text sequences of lengths up to 1 million words on a single accelerator using only 16GB of memory http://ai.googleblog.com/2020/01/reformer-efficient-transformer.html via @googleai
Ralph Brooks Retweeted
Happy to release NN4NLP-concepts! https://github.com/neulab/nn4nlp-concepts It's a typology of important concepts that you should know to implement SOTA NLP models using neural nets: https://github.com/neulab/nn4nlp-concepts/blob/master/concepts.md We'll reference this in CMU CS11-747 this year, trying to maximize coverage. 1/3 https://twitter.com/gneubig/status/1216792330273001472 pic.twitter.com/ILFeobZmPM
Show this thread
Ralph Brooks Retweeted
We have open-sourced wav2letter@anywhere, an inference framework for online speech recognition that delivers state-of-the-art performance. https://ai.facebook.com/blog/online-speech-recognition-with-wav2letteranywhere/ pic.twitter.com/9peZPbUNu4
Ralph Brooks Retweeted
I published a new article on the
@PyTorch blog: Active Transfer Learning with PyTorch. Read about adapting machine learning models with the knowledge that some data points will later get correct human labels, even if the model doesn't yet know the labels: https://medium.com/pytorch/active-transfer-learning-with-pytorch-71ed889f08c1
Show this thread
Ralph Brooks Retweeted
A good listing of tricks for ML research. Probably combining any two of these could get you a paper at a conference. Please don't do that blindly. Slide by
@FrnkNlsn. pic.twitter.com/wxq2ZOwPyR
Ralph Brooks Retweeted
I prepared a new notebook for my Deep Learning class: Joint Intent Classification and Slot Filling with BERT: https://nbviewer.jupyter.org/github/m2dsupsdlclass/lectures-labs/blob/master/labs/06_deep_nlp/Transformers_Joint_Intent_Classification_Slot_Filling_rendered.ipynb This is a step-by-step tutorial to build a simple Natural Language Understanding system using the
@snips voice assistant dataset (English only).
Show this thread
Ralph Brooks Retweeted
I'm running a 60-minute webinar blitz on causal modeling in machine learning with code examples THIS Thursday. Ideal for applied
#DataScience and #MachineLearning practitioners interested in connections between dense causal inference lit and ML tools practice. https://altdeep.lpages.co/causal-modeling-in-ml-webinar/ pic.twitter.com/VTMljYJcUY
Ralph Brooks Retweeted
I came up with some tricks to accelerate Transformer-XL. I hope you will find this post useful. https://www.reddit.com/r/MachineLearning/comments/eg5rzi/d_some_novel_techniques_i_found_that_accelerates/
#NLProc I just put together a notebook which gives BERT unknown text and which uses the @huggingface Transformers library to generate sentiment classification from a SavedModel. https://github.com/ralphbrooks/tensorflow-tutorials/blob/master/3-How-to-use-Saved-Model.ipynb
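The last step of a pipeline like this is turning the raw logits a sentiment model returns into a label. A minimal sketch in plain Python — the example logits and the two-label order ("negative", "positive") are invented for illustration, not taken from the notebook:

```python
# Hedged sketch: converting a sentiment model's output logits into a
# label via softmax + argmax. Logits and label order are invented here.
import math

LABELS = ("negative", "positive")

def softmax(logits):
    exps = [math.exp(l - max(logits)) for l in logits]  # shift for stability
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the top label and its probability for one example's logits."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[idx], probs[idx]

label, conf = classify([-1.2, 2.3])  # e.g. one sentence's two-class logits
```

With a real SavedModel the logits would come from the model's output tensor; everything after that point is exactly this post-processing.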