Diego Francisco Valenzuela Iturra

@diegovogeid

Coffee, Music and Deep Learning

Santiago, Chile
Joined March 2018

Tweets


  1. Retweeted
    14 hours ago

    Why do random embedding methods for high-dimensional Bayesian optimization produce inconsistent results? Find out in our new paper w/ et al. Implementation of new method now in Ax. See for usage + replication materials.

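The retweet above concerns random embedding methods for high-dimensional Bayesian optimization. As a rough, hedged illustration of the underlying idea only (not the new method released in Ax, and not the BO loop itself), here is a minimal NumPy sketch of a random linear embedding: candidates are proposed in a low-dimensional space and mapped into the full search space through a fixed random matrix. The dimensions and names are assumptions for illustration.

```python
import numpy as np

# Sketch of the random-embedding idea behind REMBO-style high-dimensional BO.
# Only the embedding step is shown, not acquisition or model fitting.
D, d = 100, 4                       # ambient and embedding dimensions (assumed)
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))     # fixed random embedding matrix

def embed(y):
    """Map a low-dimensional candidate y (shape (d,)) into the D-dimensional box [-1, 1]^D."""
    return np.clip(A @ y, -1.0, 1.0)

y_candidate = rng.uniform(-1.0, 1.0, size=d)   # point proposed by BO in the small space
x_candidate = embed(y_candidate)               # point where the expensive objective is evaluated
```

Different draws of A (and the clipping back into the box) make different parts of the original space reachable, which is plausibly related to the inconsistency the tweet asks about.
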
  2. Retweeted
    Feb 3
    Replying to

    PyTorch is the most frequently used external tool set/library

  3. Retweeted
    Feb 3

    Our NN is initially in Python for rapid iteration, then converted to C++/C/raw metal driver code for speed (important!). Also, tons of C++/C engineers needed for vehicle control & entire rest of car. Educational background is irrelevant, but all must pass hardcore coding test.

  4. Retweeted
    Replying to

    This has been the most interesting infra to design. It's all about catching the neural net in its uncertain moments (lot of details here!), gathering data in those situations, incorporating it (correctly labeled) into training sets, retraining, and carefully tracking the effects.

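The retweet above describes a data engine built around "catching the neural net in its uncertain moments." The tweet does not say how uncertainty is measured; as one common, generic signal, here is a minimal PyTorch sketch that flags high-entropy predictions as candidates for labeling and retraining. The threshold, names, and shapes are assumptions for illustration, not the approach described in the tweet.

```python
import torch
import torch.nn.functional as F

def predictive_entropy(logits):
    """Entropy of the softmax distribution for each example (higher = more uncertain)."""
    probs = F.softmax(logits, dim=-1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

def select_uncertain(logits, threshold=1.0):
    """Indices of examples whose predictive entropy exceeds the (assumed) threshold.

    These would be the candidates sent for labeling and folded back into the training set.
    """
    return (predictive_entropy(logits) > threshold).nonzero(as_tuple=True)[0]

logits = torch.randn(16, 10)        # e.g. a batch of classifier outputs
print(select_uncertain(logits))     # indices of the "uncertain moments"
```
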
  5. Retweeted

    Help revolutionize the world with full self-driving by joining us at Tesla Autopilot: It is very hard to find other places where AI expertise makes as much of a difference on as big of a problem.

  6. Retweeted
    Feb 3

    Check out the live video recording of our latest ContinualAI Online Meetup on "Generative Models for Continual Learning" 👇

  7. Retweeted
    Feb 3

    It was a great opportunity to discuss generative models, and how they can be used in Continual Learning, with more than 20 researchers! A truly enriching experience. Thanks to the community! 😊

  8. Retweeted
    Jan 31

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below.

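The retweet above is about text generation via edit operations rather than decoding from scratch. As a toy illustration of the general idea only (not the released implementation or its actual tag set), the sketch below applies keep/delete/insert tags to source tokens to fuse two sentences.

```python
def apply_edits(tokens, tags):
    """Apply per-token edit tags: tags[i] is (keep: bool, insert_before: str or None)."""
    out = []
    for token, (keep, insert_before) in zip(tokens, tags):
        if insert_before:
            out.append(insert_before)
        if keep:
            out.append(token)
    return " ".join(out)

# Sentence fusion example: "Turing was born in 1912 . He died in 1954 ."
tokens = ["Turing", "was", "born", "in", "1912", ".", "He", "died", "in", "1954", "."]
tags = [(True, None)] * 5 + [(False, None), (False, None), (True, "and")] + [(True, None)] * 3
print(apply_edits(tokens, tags))  # "Turing was born in 1912 and died in 1954 ."
```

Because the output is mostly copied from the input, the model only has to predict a small set of tags, which is where the speed and error-rate benefits mentioned in the tweet come from.
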
  9. Retweeted
    Jan 31

    Transformers 2.4.0 is out 🤗
    - Training transformers from scratch is now supported
    - New models, including *FlauBERT*, Dutch BERT, *UmBERTo*
    - Revamped documentation
    - First multi-modal model, MMBT from , text & images
    Bye bye Python 2 🙃

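For context on the release above, here is a minimal, hedged sketch of loading one of the newly added checkpoints with the transformers library. The model identifier is an assumption for illustration; any hub checkpoint is loaded the same way.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "flaubert/flaubert_base_cased"   # hub id assumed for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

input_ids = tokenizer.encode("Le chat dort sur le canapé.", return_tensors="pt")
with torch.no_grad():
    outputs = model(input_ids)
print(outputs[0].shape)   # (batch, sequence_length, hidden_size)
```
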
  10. Retweeted
    Jan 31

    We are releasing HiPlot, a lightweight interactive visualization tool to help AI researchers discover correlations and patterns in high-dimensional data.

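The retweet above announces HiPlot. A minimal usage sketch, assuming a Jupyter notebook environment; the data below is made up, and each dict becomes one line in the parallel-coordinates plot.

```python
import hiplot as hip

# Each dict is one experiment/trial; keys become axes in the interactive plot.
data = [
    {"lr": 1e-3, "dropout": 0.1, "layers": 4, "val_loss": 0.42},
    {"lr": 3e-4, "dropout": 0.3, "layers": 6, "val_loss": 0.35},
    {"lr": 1e-2, "dropout": 0.0, "layers": 2, "val_loss": 0.61},
]
hip.Experiment.from_iterable(data).display()  # renders inline in a notebook
```
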
  11. Retweeted
    Jan 30

    Introducing Sapling: Accelerating Suffix Array Queries with Learned Data Models with and .

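The retweet above is about accelerating suffix array queries with learned data models. For readers unfamiliar with the query being accelerated, here is a plain (unlearned) suffix array lookup in Python; the learned-index component from the paper is not shown, and the naive construction is only suitable for small examples.

```python
import bisect

def build_suffix_array(text):
    # Naive O(n^2 log n) construction; fine for small illustrative inputs.
    return sorted(range(len(text)), key=lambda i: text[i:])

def occurrences(text, sa, pattern):
    """Start positions of `pattern` in `text`, via binary search over the suffix array.

    Requires Python 3.10+ for the `key` argument to bisect.
    """
    key = lambda i: text[i:i + len(pattern)]
    lo = bisect.bisect_left(sa, pattern, key=key)
    hi = bisect.bisect_right(sa, pattern, key=key)
    return sorted(sa[lo:hi])

text = "banana"
sa = build_suffix_array(text)
print(occurrences(text, sa, "ana"))   # [1, 3]
```
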
  12. Retweeted
  13. Retweeted
    Jan 30

    We're standardizing OpenAI's deep learning framework on PyTorch to increase our research productivity at scale on GPUs (and have just released a PyTorch version of Spinning Up in Deep RL):

  14. Retweeted
    Jan 27

    Another great result demonstrating that VAEs (deep learning + amortized variational inference) make a lot of sense for data compression. Their loss function directly maximizes compressibility, and the resulting codec is fully parallelizable.

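As a reminder of why the claim above holds (in generic notation, not the paper's): the negative ELBO that a VAE minimizes upper-bounds the marginal negative log-likelihood, and via the bits-back argument it corresponds to an expected code length, so lowering the training loss directly improves compressibility.

```latex
-\log p_\theta(x)
\;\le\;
\underbrace{\mathbb{E}_{q_\phi(z\mid x)}\!\bigl[-\log p_\theta(x\mid z)\bigr]}_{\text{reconstruction cost}}
\;+\;
\underbrace{D_{\mathrm{KL}}\!\bigl(q_\phi(z\mid x)\,\|\,p(z)\bigr)}_{\text{cost of encoding }z}
\;=\;
-\mathrm{ELBO}(x).
```
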
  15. Retweeted
    Jan 28

    Check out Meena, a new state-of-the-art open-domain conversational agent, released along with a new evaluation metric, the Sensibleness and Specificity Average, which captures basic but important attributes of human conversation. Learn more below!

  16. Retweeted
    Dec 20, 2019

    (2) The other paper presents a new approach based on autoregressive flows for molecular graph generation and optimization.

  17. Retweeted
    Dec 20, 2019

    Excited to share that two papers on drug discovery have been accepted to ICLR'20. (1) One is a new unsupervised and semi-supervised approach to learning graph-level representations for property prediction.

  18. Retweeted
    Jan 26

    To deepen (or better structure) your knowledge of deep learning for natural language processing, a course by at , still fresh from 2019!

  19. Retweeted
    Jan 26

    The best part of DeepMind's Graph Nets library is the fact that they have Colab notebooks ready to demo in your browser for 3 different applications (Path Finding, Sorting, and Physics State Prediction). Have a look!

  20. Retweeted
    Jan 26

    Quaternions and Euler angles are discontinuous and difficult for neural networks to learn. They show that 3D rotations have continuous representations in 5D and 6D, which are more suitable for learning, i.e., regress two 3D vectors and apply Gram-Schmidt orthogonalization.

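The retweet above refers to the continuous 6D rotation representation. Below is a minimal PyTorch sketch of the map it describes, regressing two 3D vectors and orthonormalizing them with Gram-Schmidt to obtain a rotation matrix; the function name and shapes are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def rotation_matrix_from_6d(x):
    """Map a 6D vector (two stacked 3D vectors) to a 3x3 rotation matrix via Gram-Schmidt.

    x: (..., 6) tensor, e.g. the raw output of a regression head.
    Returns: (..., 3, 3) rotation matrices.
    """
    a1, a2 = x[..., :3], x[..., 3:]
    b1 = F.normalize(a1, dim=-1)                                           # first basis vector
    b2 = F.normalize(a2 - (b1 * a2).sum(-1, keepdim=True) * b1, dim=-1)    # orthogonalize against b1
    b3 = torch.cross(b1, b2, dim=-1)                                       # completes a right-handed frame
    return torch.stack([b1, b2, b3], dim=-1)                               # columns are the basis vectors

head_output = torch.randn(8, 6)               # e.g. a batch of raw regression outputs
R = rotation_matrix_from_6d(head_output)      # (8, 3, 3) valid rotation matrices
```

Because this map is continuous everywhere, small changes in the network output never cause the large jumps that quaternion or Euler-angle targets can produce.
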
