Tweets
Pinned Tweet
After a ton of work by a bunch of people, we're releasing an entirely new Neural Tangents. Paper: https://arxiv.org/abs/1912.02803 GitHub: https://github.com/google/neural-tangents Colab notebook: https://colab.sandbox.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb
Sam Schoenholz Retweeted
cc @cosmo_shirley @DavidSpergel - see this^ and also @prfsanjeevarora's paper: https://arxiv.org/pdf/1911.00809.pdf. They get AlexNet performance with a GP!! I think this technique is widely applicable for non-parametric Bayesian inference on raw astronomical images.
I was particularly pleased with how easily everything came together in this colab, since it more-or-less reproduces the results of (https://papers.nips.cc/paper/6322-exponential-expressivity-in-deep-neural-networks-through-transient-chaos) and (https://arxiv.org/abs/1611.01232). Of course, here it's easy to play around with the architecture and see what changes.
This includes a bonus-double-vmap through Newton's method that *just worked* and was super fast. Hacked on this with @jaschasd along with a lot of helpful comments from Roman Novak, @hoonkp, @Locchiu, and @TheGregYang.
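The "double-vmap through Newton's method" pattern can be illustrated with a toy example (a minimal sketch of the idiom, not the actual Neural Tangents code; `sqrt_newton` and the batch shapes are invented for illustration):

```python
import jax
import jax.numpy as jnp

def newton_step(x, a):
    # One Newton iteration for f(x) = x**2 - a (fixed point: sqrt(a)).
    f = lambda x: x ** 2 - a
    return x - f(x) / jax.grad(f)(x)

def sqrt_newton(a):
    # Run a fixed number of Newton iterations starting from x = 1.
    x = jnp.ones_like(a)
    for _ in range(20):
        x = newton_step(x, a)
    return x

# vmap over a batch, then vmap again over a batch of batches,
# differentiating straight through the iteration:
batched_sqrt = jax.vmap(jax.vmap(sqrt_newton))
a = jnp.arange(1.0, 7.0).reshape(2, 3)
print(batched_sqrt(a))  # elementwise sqrt of a
```

Because each Newton step is ordinary traced JAX code, the whole solver composes with `vmap` (and with `grad`) without any special handling.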
This is a great question that I've gotten periodically. Previously it would have taken too long to put something together, but using Neural Tangents (http://github.com/google/neural-tangents) it's really easy and fast! Here is the reproduction in a colab: https://colab.sandbox.google.com/github/google/neural-tangents/blob/master/notebooks/phase_diagram.ipynb https://twitter.com/DigantaMisra1/status/1220571913904242688
Sam Schoenholz Retweeted
Thank you to @RepAdamSchiff for charging so brilliantly in what is perhaps one of the last stands Americans can make for truth, right, and democracy. I admire you. https://twitter.com/RepAdamSchiff/status/1220559375938609152
Sam Schoenholz Retweeted
Research on the Neural Tangent Kernel (NTK) almost exclusively uses a non-standard neural network parameterization, where activations are divided by sqrt(width), and weights are initialized to have variance 1 rather than variance 1/width.
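For concreteness, the two parameterizations described above can be sketched in a few lines of NumPy (a minimal illustration of the scaling; the widths and input are arbitrary). At initialization the two give identically distributed pre-activations:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 512, 4096
x = rng.normal(size=n_in)

# Standard parameterization: weights initialized with variance 1/fan_in.
W_std = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_out, n_in))
h_std = W_std @ x

# NTK parameterization: unit-variance weights, activations divided by sqrt(fan_in).
W_ntk = rng.normal(size=(n_out, n_in))
h_ntk = (W_ntk @ x) / np.sqrt(n_in)

# Both pre-activation distributions have standard deviation ~ |x| / sqrt(n_in) ~ 1.
print(h_std.std(), h_ntk.std())
```

The distributions match at initialization; the parameterizations differ in how gradients scale with width during training, which is why NTK work prefers the non-standard one.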
Sam Schoenholz Retweeted
Have you ever wondered what the ML frameworks of the '20s will be? In this essay, I examine the directions AI research might take and the requirements they impose, concluding with an overview of what I believe to be two strong candidates: JAX and S4TF. http://inoryy.com/post/next-gen-ml-tools/
Sam Schoenholz Retweeted
We used JAX for competitive gradient descent (CGD) with Hessian-vector products. Mixed-mode differentiation in JAX makes this efficient (just twice the cost of backprop). We used CGD for training GANs and for constrained problems in RL. This library will be very useful @pierrelux https://twitter.com/hardmaru/status/1219088493583880192
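The Hessian-vector product that CGD needs is the standard mixed-mode (forward-over-reverse) idiom in JAX; the sketch below is a generic illustration of that idiom, not code from the linked repo:

```python
import jax
import jax.numpy as jnp

def hvp(f, x, v):
    # Mixed-mode Hessian-vector product: push the tangent v forward
    # through the reverse-mode gradient of f. Costs roughly twice a
    # gradient evaluation and never materializes the Hessian.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

f = lambda x: jnp.sum(x ** 3)   # toy objective; its Hessian is diag(6x)
x = jnp.array([1.0, 2.0, 3.0])
v = jnp.array([1.0, 1.0, 1.0])
print(hvp(f, x, v))  # [6. 12. 18.]
```

Forward-over-reverse is the usual choice because the outer `jvp` adds only a constant-factor overhead on top of one backprop pass.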
Sam Schoenholz Retweeted
Did you say more tiles?
#generative #everyday #creativecoding #daily #code #nannou #1112 pic.twitter.com/9Qm3XhlYaU
Sam Schoenholz Retweeted
To make my research more reproducible, extensible, and comparable to that of others, and out of a need to homogenize the language we use to express NN pruning methods, I contributed `nn.utils.prune` to #PyTorch 1.4 (see highlights: https://github.com/pytorch/pytorch/releases/tag/v1.4.0). Try it out and build on it!
https://twitter.com/PyTorch/status/1217596623167152129 pic.twitter.com/NqURm90B0E
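A minimal use of the API introduced above might look like this (a sketch against the documented `torch.nn.utils.prune` interface; the layer sizes and pruning fraction are arbitrary):

```python
import torch
import torch.nn.utils.prune as prune

layer = torch.nn.Linear(8, 4)
# L1 unstructured pruning: zero out the 30% smallest-magnitude weights.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# The module now carries weight_orig and a binary weight_mask buffer;
# layer.weight is recomputed as weight_orig * weight_mask.
sparsity = (layer.weight == 0).float().mean().item()
print(f"sparsity: {sparsity:.2f}")  # ~ 0.3
```

The mask-based design means pruning is re-applied automatically on each forward pass, and custom methods can be built by subclassing the same pruning containers.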
Sam Schoenholz Retweeted
Differentiable Digital Signal Processing (DDSP)! Fusing classic interpretable DSP with neural networks.
Blog: http://magenta.tensorflow.org/ddsp
Examples: https://g.co/magenta/ddsp-examples
Colab: http://g.co/magenta/ddsp-demo
Code: http://github.com/magenta/ddsp
Paper: http://g.co/magenta/ddsp-paper
1/ pic.twitter.com/SlxLUOUC6k
Sam Schoenholz Retweeted
pyhf 0.3.4 now supports JAX! And after JIT it's the fastest backend yet for performing particle-physics hypothesis tests (even just on CPU). Thanks @SingularMattrix for some early PRs on einsum :) @HEPfeickert @kratsg pic.twitter.com/mKSU06HFAQ
Sam Schoenholz Retweeted
The Case for Bayesian Deep Learning: “Bayesian or not, the prior will certainly be imperfect. Avoiding an important part of the modelling process because one has to make assumptions, however, will often be a worse alternative than an imperfect assumption.” https://cims.nyu.edu/~andrewgw/caseforbdl.pdf pic.twitter.com/86rV2eqqXD
Sam Schoenholz Retweeted
Reality and simulation converging... https://twitter.com/OlixOliver/status/1215087838296989696
Sam Schoenholz Retweeted
1/2 Why isn't it more common to do explicit Hamiltonian MCMC on a Bayesian neural network's weights, with e.g. the initial condition set to the loss minimum found via SGD? I'm playing around with one in JAX and it seems to be working reasonably even with 5 chains: https://colab.research.google.com/drive/1gMAXn123Pm58_NcRldjSuGYkbrXTUiN2 pic.twitter.com/LbC3kPOMkY
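A stripped-down version of the idea (explicit HMC with the chain initialized at the optimum, standing in for the SGD minimum) can be sketched in JAX on a toy log-posterior. This is not the linked colab; the target density, step size, and trajectory length here are invented for illustration:

```python
import jax
import jax.numpy as jnp

def log_prob(w):
    # Toy stand-in for a BNN log-posterior: standard 2D Gaussian.
    return -0.5 * jnp.sum(w ** 2)

def hmc_step(key, w, step_size=0.1, n_leapfrog=20):
    grad = jax.grad(log_prob)
    key_p, key_u = jax.random.split(key)
    p0 = jax.random.normal(key_p, w.shape)   # resample momentum
    # Leapfrog integration of Hamiltonian dynamics.
    w_new, p = w, p0 + 0.5 * step_size * grad(w)
    for i in range(n_leapfrog):
        w_new = w_new + step_size * p
        if i < n_leapfrog - 1:
            p = p + step_size * grad(w_new)
    p = p + 0.5 * step_size * grad(w_new)
    # Metropolis accept/reject on the joint energy.
    h0 = -log_prob(w) + 0.5 * jnp.sum(p0 ** 2)
    h1 = -log_prob(w_new) + 0.5 * jnp.sum(p ** 2)
    accept = jax.random.uniform(key_u) < jnp.exp(h0 - h1)
    return jnp.where(accept, w_new, w)

# Start the chain at the mode, mimicking "initialize at the SGD minimum".
key, w, samples = jax.random.PRNGKey(0), jnp.zeros(2), []
for _ in range(200):
    key, sub = jax.random.split(key)
    w = hmc_step(sub, w)
    samples.append(w)
samples = jnp.stack(samples)
```

Starting at the mode skips most of the burn-in; the open question in the tweet is whether this scales to the high-dimensional, multimodal posteriors of real networks.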
Sam Schoenholz Retweeted
What I did over my winter break! It gives me great pleasure to share this summary of some of our work in 2019, on behalf of all my colleagues at @GoogleAI & @GoogleHealth. https://ai.googleblog.com/2020/01/google-research-looking-back-at-2019.html
Sam Schoenholz Retweeted
Fenchel-Rockafellar duality is a powerful tool that more people should be aware of, especially for RL! Straightforward applications of it enable off-policy evaluation and off-policy policy gradient/imitation learning, among others: https://arxiv.org/abs/2001.01866 @daibond_alpha
Sam Schoenholz Retweeted
Our review of Machine Learning and the Physical Sciences made the cover of @APSphysics Reviews of Modern Physics! https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.91.045002 (or here: https://arxiv.org/abs/1903.10563). I worked on that image for a while, and it incorporates the famous CNN figure from @ylecun pic.twitter.com/Fkr9ZFrYHR
Sam Schoenholz Retweeted
e & q = tuning forks, γ = phonons, g = ping pong ball pic.twitter.com/1OCvP7Vm7r
Sam Schoenholz Retweeted
We used mixed-mode differentiation in JAX to implement competitive gradient descent, which requires Hessian-vector products. Code repo: https://github.com/gehring/fax @shoyer @SingularMattrix