Tweets
Shawn Le Retweeted
An Opinionated Guide to ML Research: “To make breakthroughs with idea-driven research, you need to develop an exceptionally deep understanding of your subject, and a perspective that diverges from the rest of the community—some can do it, but it’s hard.” http://joschu.net/blog/opinionated-guide-ml-research.html
Shawn Le Retweeted
in today’s Stats 385 lecture, Jeffrey Pennington of Google studies Theory of Neural Nets using Random Matrix Theory
Shawn Le Retweeted
Many aspiring AI engineers ask me how to take the next step and join an AI team. This report from @Workera_, a @deeplearningai_ affiliate, walks you through how AI teams work and which skills you need for different AI career tracks. Download it here: http://bit.ly/2s2HBdU
Shawn Le Retweeted
So, "deep learning" is the idea of doing representation learning via a chain of learned feature extractors. It's all about describing some input data via *deep hierarchies of features*, where features are *learned*. A further question is then: is the brain "deep learning"?
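The definition in the tweet above — representation learning via a chain of learned feature extractors — can be sketched in a few lines of NumPy (a toy illustration only; the layer sizes and initialization are arbitrary assumptions, not any particular framework's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(in_dim, out_dim):
    # One learnable feature extractor: an affine map plus a ReLU nonlinearity.
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    b = np.zeros(out_dim)
    return lambda x: np.maximum(x @ W + b, 0.0)

# A "deep" model is just a chain of such extractors: each layer re-describes
# the previous layer's features at a higher level of abstraction.
extractors = [layer(16, 32), layer(32, 64), layer(64, 8)]

def forward(x):
    for f in extractors:
        x = f(x)
    return x

features = forward(rng.standard_normal((4, 16)))
print(features.shape)  # (4, 8)
```

In a real system the weights `W, b` of every extractor are learned jointly from data, which is the "learned" half of the definition.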
Shawn Le Retweeted
Courses 1 & 2 of @deeplearningai_’s newest Specialization are now available on @Coursera! Training a model is only one step in building a working AI system. These courses teach you how to navigate some key deployment scenarios. Enroll here: http://bit.ly/2Yxg0xi
How Boston Dynamics Is Redefining Robot Agility - IEEE Spectrum https://spectrum.ieee.org/robotics/humanoids/how-boston-dynamics-is-redefining-robot-agility
Shawn Le Retweeted
More on domain randomization here by @OpenAI: https://openai.com/blog/solving-rubiks-cube/
Shawn Le Retweeted
Want to improve accuracy and robustness of your model? Use unlabeled data! Our new work uses self-training on unlabeled data to achieve 87.4% top-1 on ImageNet, 1% better than SOTA. Huge gains are seen on harder benchmarks (ImageNet-A, C and P). Link: https://arxiv.org/abs/1911.04252
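The self-training recipe behind results like the one above — a teacher pseudo-labels the unlabeled pool, then a student retrains on labeled plus pseudo-labeled data — can be sketched on toy blobs. The nearest-centroid "model" below is a stand-in assumption for the paper's network, chosen only to keep the sketch self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    # Toy "model": one centroid per class (stand-in for a real network).
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Small labeled set, larger unlabeled pool drawn from the same two blobs.
X_lab = np.concatenate([rng.normal(-2, 1, (10, 2)), rng.normal(2, 1, (10, 2))])
y_lab = np.array([0] * 10 + [1] * 10)
X_unlab = np.concatenate([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])

# 1) Train the teacher on labeled data only.
teacher = fit_centroids(X_lab, y_lab)
# 2) The teacher pseudo-labels the unlabeled pool.
pseudo = predict(teacher, X_unlab)
# 3) Train the student on labeled + pseudo-labeled data combined.
student = fit_centroids(np.concatenate([X_lab, X_unlab]),
                        np.concatenate([y_lab, pseudo]))

X_test = np.concatenate([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y_test = np.array([0] * 50 + [1] * 50)
acc = (predict(student, X_test) == y_test).mean()
print(f"student accuracy: {acc:.2f}")
```

The paper adds noise (augmentation, dropout) to the student and iterates the teacher/student loop; this sketch only shows the pseudo-labeling core.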
Shawn Le Retweeted
Self-supervised learning opens up a huge opportunity for better utilizing unlabelled data while learning in a supervised manner. My latest post covers many interesting ideas of self-supervised learning tasks on images, videos & control problems: https://lilianweng.github.io/lil-log/2019/11/10/self-supervised-learning.html
Shawn Le Retweeted
Pushy robots learn the fundamentals of object manipulation https://ift.tt/2oUR5Xm #robotics
Shawn Le Retweeted
I now call it "self-supervised learning", because "unsupervised" is both a loaded and confusing term. In self-supervised learning, the system learns to predict part of its input from other parts of its input. In... https://www.facebook.com/722677142/posts/10155934004262143/
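That definition — predict part of the input from other parts of it — can be illustrated with a linear toy example. The data-generating setup below is an invented assumption purely for demonstration; real systems use deep networks and masking over images or text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated "inputs": the masked dims are linear functions of the visible
# dims (plus small noise), so the hidden part is predictable from the rest.
X_vis = rng.standard_normal((500, 6))
A = rng.standard_normal((6, 2))
X_masked = X_vis @ A + 0.01 * rng.standard_normal((500, 2))

# Self-supervision: the "label" is just the masked part of the input itself,
# so no human annotation is needed. Here the predictor is a least-squares fit.
W, *_ = np.linalg.lstsq(X_vis, X_masked, rcond=None)

recon_err = np.abs(X_vis @ W - X_masked).mean()
print(f"mean reconstruction error: {recon_err:.3f}")
```

The learned predictor (here `W`, in practice a network's weights) is the useful byproduct: it has been forced to model the structure of the input.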
Shawn Le Retweeted
Yann LeCun: Deep Learning, Convolutional Neural Networks, and Self-Supervised Learning on … - Yann https://into.ai/blog/news-stories/yann-lecun-deep-learning-convolutional-neural-networks-and-self-supervised-learning-on/ #ai #intoAInews
Shawn Le Retweeted
Self-Supervised Learning of Depth and Motion Under Photometric Inconsistency https://deepai.org/publication/self-supervised-learning-of-depth-and-motion-under-photometric-inconsistency by Tianwei Shen et al. #Statistics #SupervisedLearning
Shawn Le Retweeted
RoBERTa demonstrates the potential for self-supervised training techniques to match or exceed the performance of more traditional, supervised approaches. Read more: https://ai.facebook.com/blog/roberta-an-optimized-method-for-pretraining-self-supervised-nlp-systems/
Shawn Le Retweeted
Neural structured learning in TensorFlow https://medium.com/tensorflow/introducing-neural-structured-learning-in-tensorflow-5a802efd7afd
Shawn Le Retweeted
In our new blog post, we review how brains replay experiences to strengthen memories, and how researchers use the same principle to train better AI systems: https://deepmind.com/blog/article/replay-in-biological-and-artificial-neural-networks
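A minimal replay buffer of the kind used to train such AI systems might look like this (a generic sketch, not DeepMind's implementation; the transition format is an assumption):

```python
import random
from collections import deque

class ReplayBuffer:
    """Stores past transitions and replays random minibatches, so each
    experience can strengthen the learner many times over — the artificial
    analogue of the hippocampal replay described in the post."""

    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)  # oldest memories fall out first

    def add(self, transition):
        self.buffer.append(transition)

    def sample(self, batch_size):
        # Uniform random replay; prioritized variants weight by surprise.
        return random.sample(self.buffer, batch_size)

buf = ReplayBuffer(capacity=1000)
for step in range(50):
    # Hypothetical (state, action, reward, next_state) transitions.
    buf.add((f"state{step}", f"action{step}", 0.0, f"state{step+1}"))

batch = buf.sample(8)
print(len(batch))  # 8
```

Sampling randomly rather than replaying in order also breaks the temporal correlation between consecutive transitions, which stabilizes learning.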
Shawn Le Retweeted
We see more significant improvements from training data distribution search (data splits + oversampling factor ratios) than neural architecture search. The latter is so overrated :)
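A toy version of the training-data distribution search mentioned above — here just a grid search over minority-class oversampling factors scored on a balanced validation split — might look like this. The 1-D data and the tiny logistic-regression model are invented assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_split(n0, n1):
    x = np.concatenate([rng.normal(-1, 1, n0), rng.normal(1, 1, n1)])
    y = np.array([0] * n0 + [1] * n1)
    return x, y

def fit_logreg(x, y, steps=500, lr=0.5):
    # Tiny logistic regression trained by gradient descent; class frequencies
    # directly shape the learned bias, so oversampling actually matters here.
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(w * x + b)))
        g = p - y
        w -= lr * (g * x).mean()
        b -= lr * g.mean()
    return w, b

def accuracy(w, b, x, y):
    return (((w * x + b) > 0).astype(int) == y).mean()

x_tr, y_tr = make_split(200, 20)     # class 1 underrepresented in training
x_val, y_val = make_split(200, 200)  # validation has the true 50/50 balance

def score(k):
    # Oversample the minority class k-fold, then train and validate.
    x = np.concatenate([x_tr] + [x_tr[y_tr == 1]] * (k - 1))
    y = np.concatenate([y_tr] + [y_tr[y_tr == 1]] * (k - 1))
    return accuracy(*fit_logreg(x, y), x_val, y_val)

best = max((1, 2, 5, 10), key=score)
print("best oversampling factor:", best)
```

The search space in the tweet (data splits plus oversampling ratios) is much richer, but the loop structure — propose a distribution, retrain, score on held-out data — is the same.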
Shawn Le Retweeted
Can neural network architectures alone, without learning any weight parameters, encode solutions for a given task? We search for “weight agnostic neural network” architectures that can perform various tasks even when using random weight values. Learn more → https://goo.gle/2Lf77Cx
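The weight-agnostic idea can be demonstrated with a hand-built toy architecture evaluated under many values of a single shared weight, which is the paper's evaluation protocol. The equality task and the network below are invented for illustration, not the paper's search code:

```python
import numpy as np

# Task: given x1, x2 in {-1, +1}, output 1 iff they are equal.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])
y = np.array([1, 0, 0, 1])

def network(x, w):
    # One hidden unit with an `abs` activation; every connection shares the
    # single weight w. If x1 == x2 the hidden sum is ±2w; otherwise it is
    # exactly 0 — so the output is correct for ANY nonzero w. The
    # architecture, not the weight value, encodes the solution.
    hidden = abs(w * x[0] + w * x[1])
    return int(hidden > 0)

# WANN-style evaluation: score the architecture across many shared weights
# instead of training a single weight vector.
weights = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
accs = [np.mean([network(x, w) == t for x, t in zip(X, y)]) for w in weights]
print("accuracy per shared weight:", accs)  # 1.0 for every w
```

The actual paper searches over graph topologies and activation functions to find architectures whose average score across sampled shared weights is high.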
Shawn Le Retweeted
The wonderful @FryRsquared has been working on this podcast with the team for a while and I’m really excited to see the series launch! It was great fun talking to Hannah for episode 8, and I think she's captured @DeepMindAI's culture and research brilliantly. https://twitter.com/DeepMind/status/1162316239898849280
Shawn Le Retweeted
Nice blog post about a series of optimizations to reduce training time of a CIFAR10 image model. Many of these options are likely applicable to lots of different kinds of models. Nice work, @dcpage3! https://twitter.com/dcpage3/status/1163563850442182657