Tweets
You should try http://fast.ai 's training loop even if your model happens to fall into one of the broad categories they support like vision, NLP or tabular data. End of today's praise for http://fast.ai . 4/4
Spot checking the examples returned by `most_confused` and `top_losses` from the Interpretation object revealed mislabelled training examples. These are little touches that are really handy in practice. 3/4
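The idea behind `top_losses` can be sketched in plain Python. This is a hypothetical minimal version, not fastai's actual implementation: rank examples by cross-entropy loss and surface the worst offenders, which is where mislabelled training examples tend to show up.

```python
import math

def top_losses(probs, labels, k=3):
    """Rank examples by cross-entropy loss, highest first.

    probs  -- list of per-class probability lists (one per example)
    labels -- list of true class indices
    Returns (loss, index) pairs for the k worst examples.
    """
    losses = [(-math.log(p[y]), i) for i, (p, y) in enumerate(zip(probs, labels))]
    return sorted(losses, reverse=True)[:k]

# A mislabelled example (index 2) gets a huge loss and floats to the top:
# the model is 99% sure it's class 0, but the label says class 1.
probs = [[0.9, 0.1], [0.8, 0.2], [0.99, 0.01], [0.6, 0.4]]
labels = [0, 0, 1, 0]
worst = top_losses(probs, labels, k=2)
print(worst[0][1])  # prints 2, the index of the most suspicious example
```

Spot checking then means pulling up the inputs at those indices and eyeballing whether the label or the model is wrong.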
The big surprise comes from the fact that I was using exactly the same optimizer and learning rate as in the home-grown training loop. I also disabled weight decay and still http://fast.ai 's training loop was producing a better model. Not sure why. 2/4
I've experimented with swapping a home-grown PyTorch training loop for the http://fast.ai one, keeping the embeddings classification model intact. To my surprise, the classifier's performance made its biggest jump since the model was originally put together. 1/4
Grzegorz Kossakowski retweeted
“In 1990 a generation of baby-boomers, with a median age of 35, owned a third of America’s real estate by value. In 2019 a similarly sized cohort of millennials, aged 31, owned just 4%”https://twitter.com/TheEconomist/status/1217851633155149825 …
The modern Stochastic Gradient Descent is marvelous. I've seen enough examples of its effectiveness and still can't wrap my head around how well it works.
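The whole trick can be shown in a few lines of plain Python, a toy sketch with made-up data rather than any real training setup: take one noisy example at a time, nudge the parameter against the gradient of that example's squared error, and the slope of the underlying line falls out.

```python
import random

random.seed(0)

# Noisy observations of y = 2*x, inputs in (0, 1].
xs = [i / 10 for i in range(1, 11)]
data = [(x, 2.0 * x + random.gauss(0, 0.1)) for x in xs]

w, lr = 0.0, 0.05
for epoch in range(50):
    random.shuffle(data)                 # the "stochastic" part
    for x, y in data:
        grad = 2 * (w * x - y) * x       # d/dw of (w*x - y)^2
        w -= lr * grad                   # one small step per example

print(f"learned w = {w:.2f}")            # close to the true slope 2.0
```

Each individual step uses a single noisy example, yet the noise averages out over many steps and the parameter converges.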
I heard a while back from @jeremyphoward that deep learning models can be trained on noisy data; it's biased data that's problematic. I recently trained a graph embedding model with 50% of the training examples being bogus and the model still achieved remarkable performance
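A toy simulation (my own illustration, not the graph embedding experiment above) shows why symmetric noise is survivable: if half the labels are random coin flips, the signal is diluted but not biased, so the true answer still wins on aggregate.

```python
import random

random.seed(42)
TRUE_CLASS = 1
N = 10_000

# Half the labels are correct; the other half are random coin flips ("bogus").
labels = [TRUE_CLASS if random.random() < 0.5 else random.randint(0, 1)
          for _ in range(N)]

# Symmetric noise dilutes the signal but doesn't bias it:
# P(label == TRUE_CLASS) = 0.5 * 1 + 0.5 * 0.5 = 0.75,
# so a majority vote still recovers the true class.
fraction = sum(labels) / N
print(fraction)             # around 0.75
print(int(fraction > 0.5))  # recovers TRUE_CLASS
```

Biased noise would be a different story: if the bogus labels all pointed the same wrong way, no amount of data would wash them out.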
Grzegorz Kossakowski retweeted
The guy who invented the locomotive had to do an experiment to prove to himself that a vehicle could move itself simply by turning its own wheels
pic.twitter.com/GdsEeo8Jym
Grzegorz Kossakowski retweeted
The feedback will continue until morale improves.
That essay calls ML an "enabling layer" and that's the best term I have seen. Interestingly, Jeff Bezos calls the Internet an enabling layer in his 2007 TED talk: https://www.youtube.com/watch?v=vMKNUylmanQ … It's an entertaining and enlightening talk from the time TED represented quality https://twitter.com/gkossakowski/status/1217356646441046017 …
That essay was one of the sources for my go-to rule that any startup with .ai in its domain name represents either a purposeful acquisition target, triggering the right keyword search, or lousy product thinking https://twitter.com/gkossakowski/status/1217356646441046017 …
The essay was published in 2018 but I stumbled upon it in 2019.
Looking back at 2019 reads, I find Benedict Evans' "Ways to think about machine learning" to be one of the most interesting frameworks for thinking about ML https://www.ben-evans.com/benedictevans/2018/06/22/ways-to-think-about-machine-learning-8nefy … I reread it today and realized that it ages really well
Grzegorz Kossakowski retweeted
the programmer equivalent of a lawyer to represent my interests within various software systems
Grzegorz Kossakowski retweeted
this is absolutely nuts. The AI GPT-2 has learned to play chess moderately well (able to give bad human amateurs a game) – despite only being a text AI, learning from a corpus of chess notation text, and not having any concept of what a chessboard is https://slatestarcodex.com/2020/01/06/a-very-unlikely-chess-game/ …
That paper claims that MC Dropout is indeed used against adversarial examples. Phew, I wasn't off with my intuition. And as it usually goes with good ideas for pressing problems: someone is already on it.
Keeping Dropout on is called Monte Carlo Dropout and can be thought of as having an ensemble of networks instead of a single network. I wondered if MC Dropout could be used to counter adversarial examples that are tuned for particular weights and here it is: https://arxiv.org/abs/1803.08533
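A toy pure-Python sketch of the MC Dropout idea (a made-up one-layer "network", not the paper's setup): keep dropout active at inference, run many stochastic forward passes, and average. The mean approximates the deterministic output while the spread across passes acts as an uncertainty signal, and each pass effectively samples a different sub-network from the implicit ensemble.

```python
import random
import statistics

random.seed(7)

# Toy "network": a fixed linear layer with 8 weights (sum = 1.3).
weights = [0.5, -1.2, 0.3, 0.8, -0.4, 1.1, -0.7, 0.9]
x = [1.0] * 8

def forward(x, p_drop=0.5, dropout_on=True):
    """One forward pass with (inverted) dropout applied to the units."""
    out = 0.0
    for w, xi in zip(weights, x):
        if dropout_on and random.random() < p_drop:
            continue                                  # unit dropped
        scale = 1.0 / (1.0 - p_drop) if dropout_on else 1.0
        out += w * xi * scale                         # inverted-dropout scaling
    return out

# Monte Carlo Dropout: dropout stays ON at inference; average many passes.
samples = [forward(x) for _ in range(1000)]
mean = statistics.mean(samples)
std = statistics.stdev(samples)
deterministic = forward(x, dropout_on=False)

print(mean, std)  # mean near the deterministic output; std > 0
```

Against weight-tuned adversarial inputs the intuition is that each stochastic pass presents slightly different effective weights, so an attack crafted for one fixed set of weights no longer lines up exactly.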