FloydHub

@FloydHub_

FloydHub (W17) is a zero-setup Deep Learning platform for training and deploying AI models.

Joined February 2017

Tweets


  1. Pinned Tweet
    29 Aug 2018

    Meet - the fastest way to build, train, and deploy AI models. Sign up for free at :

  2. Retweeted
    29 Jan

    This was by far the most difficult (and interesting) post I have worked on. Tokenization is a really exciting field of research in and of itself. Loved working with to get it published

  3. 29 Jan

    How do recent advancements in actually teach models a more efficient way to process words? Take a dive into the nitty-gritty details of tokenization with our newest article authored by :

  4. Retweeted
    20 Jan

    How to get into machine learning without a degree, alternatives to traditional education, building a portfolio, applying for ML jobs, interviews, getting into research, and indie research ideas. I had an ace chat with at . Enjoy!

  5. Retweeted
    15 Nov 2019

I'm excited to share this blog article on the project I've been working on for the past few months: training a tiny Sentiment Analysis model using Knowledge Distillation to remarkably improve accuracy. Article: Code:
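The linked article and code aren't reproduced here, but the core idea of knowledge distillation can be sketched in a few lines: the small "student" model is trained against the large "teacher" model's temperature-softened output distribution, blended with the ordinary hard-label loss. This is an illustrative stdlib-only sketch of the loss (in Hinton et al.'s formulation), not the author's actual implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    """Blend of soft-target loss (KL to the teacher) and hard-label cross-entropy.

    The T**2 factor keeps the soft-loss gradient magnitude comparable
    across temperatures.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    soft_loss = sum(pt * math.log(pt / ps)
                    for pt, ps in zip(p_teacher, p_student))
    hard_probs = softmax(student_logits)  # T = 1 for the real labels
    hard_loss = -math.log(hard_probs[true_label])
    return alpha * (temperature ** 2) * soft_loss + (1 - alpha) * hard_loss

# Toy example: two-class sentiment logits from a student and a teacher
loss = distillation_loss([1.2, -0.8], [2.0, -1.5], true_label=0)
```

In practice both models would be neural networks and the loss would be minimized by gradient descent; the sketch only shows how the two loss terms combine.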

  6. Retweeted
    31 Oct 2019

Once you've created your classifier, you need to analyze its quality. I published a guide with on classifier metrics.
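The guide itself isn't reproduced here, but the classifier metrics it refers to (precision, recall, F1) reduce to simple counts over the confusion matrix. A minimal stdlib-only sketch for the binary case:

```python
def binary_metrics(y_true, y_pred):
    """Precision, recall, and F1 from paired label lists (1 = positive class)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of actual positives, how many were found
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)           # harmonic mean of the two
    return precision, recall, f1

# tp=2, fp=1, fn=1 -> precision and recall are both 2/3
p, r, f1 = binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In real projects you'd typically reach for `sklearn.metrics` rather than hand-rolling these, but the counts make the definitions concrete.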

  7. Retweeted
    16 Oct 2019

Got my first article published, titled "Introduction to Adversarial Machine Learning". The most exciting part is that it uses my library scratchai, built on top of 🤩🤩 Link: Library:

  8. Retweeted
    20 Sep 2019

A dive into deep learning projects, giving you tried-and-true advice, debugging tricks, and an easily understood overview of training neural networks in this guide: *Training Neural Nets: a Hacker's Perspective*

  9. Retweeted
    19 Sep 2019
  10. Retweeted
    18 Sep 2019

    Multiprocessing vs. Threading in Python: What Every Data Scientist Needs to Know via

  11. Retweeted
    16 Sep 2019
  12. Retweeted
    7 Sep 2019

    [FloydHub] Multiprocessing vs. Threading in Python: What Every Data Scientist Needs to Know --> Sooner or later, every data science project faces an inevitable challenge: speed. Working with larger data sets leads to slower processing thereof, so you'l…
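The article's central point is that CPython's global interpreter lock (GIL) lets only one thread execute Python bytecode at a time, so CPU-bound work only scales with processes, not threads. A toy benchmark (illustrative, not taken from the article) makes the distinction concrete:

```python
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_bound(n):
    """Pure-Python busy loop: holds the GIL the whole time it runs."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, n_tasks=4, work=2_000_000):
    """Wall-clock time to run n_tasks copies of cpu_bound under an executor."""
    start = time.perf_counter()
    with executor_cls(max_workers=n_tasks) as ex:
        list(ex.map(cpu_bound, [work] * n_tasks))
    return time.perf_counter() - start

if __name__ == "__main__":
    # Threads serialize on the GIL; processes run truly in parallel.
    print(f"threads:   {timed(ThreadPoolExecutor):.2f}s")
    print(f"processes: {timed(ProcessPoolExecutor):.2f}s")
```

For I/O-bound work (network calls, disk reads) the picture reverses: threads release the GIL while waiting, so they parallelize well and avoid the process-spawn and pickling overhead.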

  13. Retweeted
    21 Aug 2019

    Bigger vs. smaller models, powerful vs. dumb models by via My article "Becoming One with the Data" is also in the newsletter :)

  14. Retweeted
    19 Aug 2019

    This is one of the most astounding demos of Deep Learning I've ever seen! Huge opportunity for new open-source Web Development tools based on merely 200 lines of Keras code. Really opens up the imagination for what can be done with

  15. Retweeted

    The researchers have shared their Keras/TensorFlow code on GitHub!

  16. Retweeted

    Nice detailed post on the applications of deep learning in genetics

  17. Retweeted
    8 Aug 2019

    I looked into this in a bit more detail in my recent blog post on to see which models performed better for general semantic similarity with the default pre-trained models

  18. Retweeted
    6 Aug 2019

    Just published a post on about when choosing the best model might not be the best choice. Big shout out to for their transformers repo which enabled me to use XLNet for comparison.

  19. Retweeted
    4 Aug 2019

    N-Shot Learning: Getting things done with less data |

  20. Retweeted
    2 Aug 2019

Deep learning is great, but it requires a large amount of data to train and predict with such efficiency. Recently, though, a new domain has emerged that trains deep models using as few as zero examples. Check out my new article exploring the domain.
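The article itself isn't reproduced here, but one common idea behind N-shot methods is nearest-class-mean classification: embed the handful of labeled "support" examples per class, average them into prototypes, and label a query by its nearest prototype. A hedged stdlib-only sketch, where the 2-D "embeddings" are toy stand-ins for a real encoder's output:

```python
import math

def prototype(support):
    """Mean of a class's support embeddings (its prototype)."""
    dims = len(support[0])
    return [sum(v[d] for v in support) / len(support) for d in range(dims)]

def classify(query, prototypes):
    """Label of the prototype nearest to the query (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prototypes, key=lambda label: dist(query, prototypes[label]))

# Two classes, two "shots" (labeled examples) each
protos = {
    "pos": prototype([[1.0, 1.2], [0.8, 1.0]]),
    "neg": prototype([[-1.1, -0.9], [-0.9, -1.2]]),
}
label = classify([0.7, 0.9], protos)  # falls near the "pos" prototype
```

The hard part in practice is learning an encoder whose embedding space makes this nearest-prototype rule work; the sketch only shows the classification step.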

  21. 25 Jul 2019

    So stoked to see 's progress over the last couple of years! Very excited for the breakthroughs in AI and brain-machine interfaces. Honored to have as an investor in


