Tweets
Ekin Dogus Cubuk Retweeted
Graphene physicist being asked a question about graphite: "Sorry, that's well outside my expertise, so it would be unreasonable to speculate." Any physicist being asked a question about philosophy: "Allow me to launch into a 30 minute lecture about why this question is trivial."
Ekin Dogus Cubuk Retweeted
FixMatch: focusing on simplicity for semi-supervised learning and improving the state of the art (CIFAR: 94.9% with 250 labels, 88.6% with 40). https://arxiv.org/abs/2001.07685 Collaboration with Kihyuk Sohn, @chunliang_tw, @ZizhaoZhang, Nicholas Carlini, @ekindogus, @Han_Zhang_, and @colinraffel pic.twitter.com/BmeYvpEHzX
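The confidence-thresholded pseudo-labeling at the heart of FixMatch can be sketched as follows. This is a minimal NumPy illustration, not the released code: the function name, threshold value, and the assumption that the model's softmax outputs are already computed are all illustrative.

```python
import numpy as np

CONF_THRESHOLD = 0.95  # confidence cutoff (tau); value is illustrative

def fixmatch_unlabeled_loss(probs_weak, probs_strong):
    """Cross-entropy between hard pseudo-labels taken from the weakly
    augmented view and predictions on the strongly augmented view,
    masked so only confident pseudo-labels contribute.

    probs_weak, probs_strong: (batch, num_classes) softmax outputs.
    """
    pseudo_labels = probs_weak.argmax(axis=1)        # hard pseudo-labels
    mask = probs_weak.max(axis=1) >= CONF_THRESHOLD  # keep confident ones only
    # Probability the strong view assigns to each pseudo-label
    picked = probs_strong[np.arange(len(pseudo_labels)), pseudo_labels]
    losses = -np.log(np.clip(picked, 1e-12, 1.0))
    return (losses * mask).sum() / max(mask.sum(), 1)
```

The key design point the tweet highlights is simplicity: one threshold and one consistency loss, with no learned policy or extra scheduling.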
From mantou to manti! https://twitter.com/ykomska/status/1213908289899126785 …
Ekin Dogus Cubuk Retweeted
I'll start: "Do ImageNet Classifiers Generalize to ImageNet?" Presents evidence that leaderboard chasing has been fruitful. Models degrade on new test sets but: 1) Order among best models preserved. 2) Deterioration due to dist-shift, not overfitting. https://arxiv.org/abs/1902.10811
Ekin Dogus Cubuk Retweeted
RandAugment: Practical automated data augmentation with a reduced search space. Decreasing the search space in a clever way avoids the need to perform a highly expensive computational search, i.e. NAS→EfficientNets, AutoAugment→RandAugment. Might be useful for domain randomization. https://twitter.com/barret_zoph/status/1196621040064974849 …
Ekin Dogus Cubuk Retweeted
JAX now supports Google Cloud TPUs! https://github.com/google/jax/tree/master/cloud_tpu_colabs … I contributed this example, solving a 2D wave equation with a spatially partitioned grid. The code is remarkably simple and all in pure Python! pic.twitter.com/h5NhXkgTm3
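The linked Colab isn't reproduced here, but the numerical core of such a solver, a centered-difference leapfrog step for the 2D wave equation u_tt = c²(u_xx + u_yy), can be sketched in plain NumPy. Grid size, boundary handling, and step sizes below are illustrative assumptions, not taken from the notebook.

```python
import numpy as np

def wave_step(u_prev, u_curr, c=1.0, dx=1.0, dt=0.5):
    """One leapfrog step of u_tt = c^2 * (u_xx + u_yy) on a 2D grid,
    using a 5-point centered-difference Laplacian and zero boundaries."""
    lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0)
           + np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1)
           - 4.0 * u_curr) / dx**2
    u_next = 2.0 * u_curr - u_prev + (c * dt)**2 * lap
    # Pin the edges to zero (fixed boundaries)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0
    return u_next
```

The appeal the tweet points at is that the same array expression runs unchanged under JAX's `jit`/`pmap` once the grid is partitioned across devices.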
Ekin Dogus Cubuk Retweeted
Very compelling talk by @sschoenholz on implementing molecular dynamics with JAX. I think the general strategy of upgrading our simulations to include autodiff (and probprog) will be a major theme of the next 5 years. Those points apply equally well to HEP. #NeurIPS2019 #ML4PS2019 pic.twitter.com/bG6WVhMfbJ
Ekin Dogus Cubuk Retweeted
Tomorrow I'll be talking about JAX MD, a hardware-accelerated, end-to-end differentiable molecular dynamics library, at ML4PS at 9:20am (along with tons of amazing speakers). Paper: https://arxiv.org/abs/1912.04232 Code: https://github.com/google/jax-md Colab: https://colab.sandbox.google.com/github/google/jax-md/blob/master/notebooks/jax_md_cookbook.ipynb … https://twitter.com/DynamicWebPaige/status/1200607460131688448 …
Ekin Dogus Cubuk Retweeted
JAX things!!! Excited to talk about JAX MD on Saturday. Also, check out our neural tangents poster at the Bayesian deep learning workshop (also JAX)! https://twitter.com/DynamicWebPaige/status/1204840425686626304 …
Ekin Dogus Cubuk Retweeted
Entertaining exposition aside, I think the best quote from this paper is "there are usually far more efficient ways to achieve something once we know it's possible." https://twitter.com/Smerity/status/1199529360954257408 …
This is a great description of RandAugment. Thanks @CShorten30 for taking the time to make the video. https://twitter.com/CShorten30/status/1197300422802857987 …
The code is available online; consider trying it on your image classification or object detection task! With collaborators @barret_zoph, Jon Shlens, and @quocleix. Code: http://git.io/Jeopl
However, we also find that the optimal distortion magnitude increases with training set size, which deserves more investigation. pic.twitter.com/TYMNMtGLrQ
Because of its interpretable hyperparameter (a single distortion magnitude), we can study the interaction of data augmentation with different aspects of deep learning. For example, the optimal distortion magnitude goes up with model size, which is to be expected. pic.twitter.com/VtVW0iyxqh
RandAugment has a significantly smaller search space, which allows it to be optimized on the model and dataset of interest (instead of having to use a smaller proxy task). It works on CIFAR-10/100, SVHN, ImageNet, and COCO. https://twitter.com/barret_zoph/status/1196621040064974849 … pic.twitter.com/EcQJH3yotr
Ekin Dogus Cubuk Retweeted
*New paper* RandAugment: a new data augmentation method. Better & simpler than AutoAugment. The main idea is to select transformations at random and tune their magnitude. It achieves 85.0% top-1 on ImageNet. Paper: https://arxiv.org/abs/1909.13719 Code: https://git.io/Jeopl pic.twitter.com/equmk59K2i
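The "select transformations at random" idea reduces the search space to just two scalars: the number of ops N and a single global magnitude M. A toy sketch of that loop follows; the op list and magnitude scaling here are illustrative stand-ins, not the released op set.

```python
import random
import numpy as np

# Toy stand-ins for real augmentation ops. Each takes a float image in
# [0, 1] and a magnitude in [0, 10]; the names and effects are illustrative.
def brightness(img, m):
    return np.clip(img + 0.03 * m, 0.0, 1.0)

def hflip(img, m):  # magnitude-free op; m is ignored
    return img[:, ::-1]

def cutout(img, m):
    size = int(m)
    out = img.copy()
    out[:size, :size] = 0.0  # zero a corner patch scaled by magnitude
    return out

OPS = [brightness, hflip, cutout]

def rand_augment(img, n=2, m=9, rng=random):
    """RandAugment's core loop: sample N ops uniformly at random and
    apply each at the single global magnitude M, so the whole policy
    is just the pair (N, M) instead of a learned schedule."""
    for op in [rng.choice(OPS) for _ in range(n)]:
        img = op(img, m)
    return img
```

Because the policy is only (N, M), it can be tuned by a small grid search directly on the target model and dataset, which is the practical advantage over AutoAugment's proxy-task search.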
Ekin Dogus Cubuk Retweeted
RandAugment was one of the secret ingredients behind Noisy Student that I tweeted last week. Code for RandAugment is now open-sourced. https://twitter.com/barret_zoph/status/1196621040064974849 …
Ekin Dogus Cubuk Retweeted
This morning, at 9:30am, researcher Daniel Park is discussing #SpecAugment, a simple data augmentation method for automatic speech recognition. Stop by the #Interspeech2019 Google booth to learn all about it, and read more at ↓ https://goo.gl/KAPS5d
Ekin Dogus Cubuk Retweeted
At a glance, this seems like a very nice review of a bunch of related ideas. https://twitter.com/hardmaru/status/1168330954147917824 …
Ekin Dogus Cubuk Retweeted
Very saddened to report that Mitchell Feigenbaum passed away over the weekend. Famous for his work in chaos. Highly creative & original, with contributions to cartography, vision and finance not so well known. I was lucky to be one of his friends, and will miss this unique man.


If you will be around