Tweets
Dilip Krishnan Retweeted
Come to our poster on "Fantastic Generalization Measures and Where To Find Them" at
#NeurIPS2019 workshops "ML with Guarantees" and "Science Meets Engineering of DL". @yidingjiang will also give a spotlight talk at 5:40pm in the "Science Meets Engineering of DL" workshop. pic.twitter.com/GF48nwytFB
Dilip Krishnan Retweeted
One of the most comprehensive studies of generalization to date; ≈40 complexity measures over ≈10K deep models. Surprising observations worthy of further investigations. Fantastic Generalization Measures: https://bit.ly/34TqKZs w/
@yidingjiang @bneyshabur @dilipkay S. Bengio pic.twitter.com/POG4DoNaAU
Dilip Krishnan Retweeted
Google AI Residency 2020 applications are open, with positions in many different locations including Europe and Africa. A fantastic program designed to jumpstart a career in Machine Learning. Apply at http://g.co/airesidency/apply … before Dec. 19, 2019.
#AIResidency @GoogleAI
Dilip Krishnan Retweeted
Nice work & repo on knowledge distillation https://github.com/HobbitLong/RepDistiller … dark knowledge remains one of the few amusingly brain-tickling / head-scratching results in neural nets
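The "dark knowledge" idea referenced above can be sketched in a few lines: the student network matches the teacher's temperature-softened class probabilities, so the small logits of non-target classes carry extra signal. This is a minimal pure-Python illustration, not the repo's implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature T; higher T exposes the 'dark knowledge'
    hidden in the small logits of non-target classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Cross-entropy between the softened teacher and student
    distributions, scaled by T^2 as in the original formulation."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    ce = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
    return temperature ** 2 * ce
```

The loss is minimized (down to the teacher's own entropy) when the student reproduces the teacher's full softened distribution, not merely its argmax.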
New paper with
@YonglongT and @phillip_isola : we apply contrastive learning to representation distillation, achieving SOTA results. Seems to be the first method to consistently outperform knowledge distillation for a range of tasks. Details and code: https://hobbitlong.github.io/CRD/ pic.twitter.com/IO7AeuPhWP
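The contrastive objective behind this can be sketched with an InfoNCE-style loss: the student's embedding of an input is pulled toward the teacher's embedding of the *same* input (positive) and pushed away from teacher embeddings of other inputs (negatives). This is an illustrative sketch only; the actual CRD objective differs in its details.

```python
import math

def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def info_nce(student_emb, teacher_embs, positive_idx, temperature=0.1):
    """InfoNCE-style loss: softmax over similarities between one student
    embedding and a batch of teacher embeddings; the target is the
    teacher embedding of the matching input (positive_idx)."""
    sims = [dot(student_emb, t) / temperature for t in teacher_embs]
    m = max(sims)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in sims]
    return -math.log(exps[positive_idx] / sum(exps))
```

Minimizing this loss over a batch encourages the student to agree with its teacher more than with any distractor, which is the sense in which the representation is "distilled" contrastively.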
Dilip Krishnan Retweeted
So it seems the margin-based approach to generalization bounds can be made to work -- provided you measure margin in a way that reflects the stability of all layers jointly. In hindsight, that seems rather natural. More impressive work from
@tengyuma with Colin Wei. https://twitter.com/TheGradient/status/1186408003517370368 …
Dilip Krishnan Retweeted
For quite some time (NeurIPS18, ICLR19), we have empirically observed that margin at intermediate layers carries significant information about generalization of a deep model. Delighted to see
@tengyu has now proved this phenomenon, and provided a cleaner definition of the all-layer margin. https://twitter.com/tengyuma/status/1186403966881615874 …
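The basic quantity behind these two tweets is the classification margin: the gap between the logit of the true class and the largest competing logit. The cited work generalizes this to intermediate layers (the "all-layer margin"); a minimal sketch of the output-layer version:

```python
def output_margin(logits, true_class):
    """Margin of a single prediction: logit of the true class minus the
    largest other logit. Positive = correctly classified, with the gap
    measuring confidence; negative = misclassified."""
    others = [z for i, z in enumerate(logits) if i != true_class]
    return logits[true_class] - max(others)
```

Studying the *distribution* of such margins over a dataset (and over layers) is what connects them to the generalization gap in the work above.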
Dilip Krishnan Retweeted
Applications now open for 2019 Google Faculty Research Award and are due September 30 at 1:00PM PST. The award provides unrestricted gifts as support for world-class technical research in Computer Science, Engineering, and related fields.https://ai.google/research/outreach/faculty-research-awards/ …
Dilip Krishnan Retweeted
#ISRO Today (August 20, 2019), after the Lunar Orbit Insertion (LOI), #Chandrayaan2 is now in Lunar orbit. Lander Vikram will soft land on the Moon on September 7, 2019. pic.twitter.com/6mS84pP6RD
New paper at ICCV 2019! We tackle the problem of image *extrapolation* using generative adversarial networks. This is a much less constrained problem than image interpolation. Check out the paper, and more results here: https://sites.google.com/corp/view/boundless-iccv/home … pic.twitter.com/SbiJaa7eVt
Successful launch, congratulations
#isro! 23 days before the landing sequence is initiated (aiming at the South Pole of the Moon). https://twitter.com/isro/status/1153232244506411008 …
Good luck to ISRO!! Hope the technical issues are fixed soon. https://twitter.com/isro/status/1150520298761936896 …
Dilip Krishnan Retweeted
Our work on the DEMOGEN dataset (the first dataset of trained networks with realistic sizes/architectures) and its use in studying the connection between margin distribution and the generalization gap. https://twitter.com/GoogleAI/status/1148649459468738560 …
Dilip Krishnan Retweeted
My attempt at writing a blog post. https://twitter.com/GoogleAI/status/1148649459468738560 …
with
@yidingjiang @TheGradient and Samy Bengio
New
@GoogleAI blog post on our new dataset of over 700 models for the study of generalization; and our results on (very accurately) predicting the generalization gap in deep networks! https://ai.googleblog.com/2019/07/predicting-generalization-gap-in-deep.html …
A novel regularizer for deep networks that achieves SOTA for Imagenet adversarial robustness! Joint work with Chongli Qin, Pushmeet Kohli and other awesome people at DeepMind: https://arxiv.org/abs/1907.02610
Dilip Krishnan Retweeted
with
@yidingjiang @dilipkay @bneyshabur and S. Bengio.
Dilip Krishnan Retweeted
3/3 We thought of storing a few (e.g. 5) evenly spaced weight snapshots between the initial and final w's. I also thought of storing LPC coefficients (say order n=10) [LPC predicts the next w as a linear combination of the previous w's: (a_1, ..., a_n) = argmin sum_t (w(t) - sum_{i=1}^n a_i w(t-i))^2]. Other thoughts?
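The LPC idea in this tweet can be sketched directly from its formula: fit coefficients a_1..a_n by least squares so each snapshot is predicted from the previous n, then store only the coefficients plus a few seed values instead of the whole trajectory. A minimal scalar-trajectory sketch (illustrative only; real weight trajectories are high-dimensional):

```python
import numpy as np

def fit_lpc(trajectory, order):
    """Least-squares fit of (a_1, ..., a_n) minimizing
    sum_t (w(t) - sum_i a_i * w(t-i))^2 for a scalar trajectory."""
    w = np.asarray(trajectory, dtype=float)
    # Each row holds the lagged window [w(t-1), ..., w(t-order)].
    X = np.array([w[t - order:t][::-1] for t in range(order, len(w))])
    y = w[order:]  # targets: the values being predicted
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(history, coeffs):
    """Predict w(t) from the last n values with fitted coefficients."""
    n = len(coeffs)
    recent = list(reversed(history[-n:]))  # most recent value first
    return float(np.dot(coeffs, recent))
```

For a trajectory that actually follows a linear recurrence, the fit recovers the recurrence exactly; for real SGD trajectories it would only be a lossy summary.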
Dilip Krishnan Retweeted
2/3 We cannot store all intermediate weights (the final weights alone are 15.6GB in the current version of DEMOGEN). Are there some compact statistics from the trajectory that you would wish to have access to besides the final weights?