Bradley Boehmke

@bradleyboehmke

Data science enabler & enthusiast. Husband, father and consumer.

Ohio, USA
Joined April 2014

Tweets


  1. Pinned tweet
    Nov 12, 2019

    Great news! Truly enjoyed writing this with my buddy & learned so much in the process. Many thanks to & especially for our "minor" timeline adjustments 😬. Hope the community benefits from the work!

  2. Must read - paper on their process:
     - loss gains vs biz metric gains
     - relative impact of all models
     - RCTs for model impact comparison
     - dealing with latency in production
     📝: More reads:

  3. Retweeted
    Feb 1

    Glad I was part of this team! For anyone who missed it, you can check the material made available by and here 👇

  4. Retweeted
    Feb 1

    floodsung / Deep-Learning-Papers-Reading-Roadmap Deep Learning papers reading roadmap for anyone who is eager to learn this amazing tech!

  5. Jan 31

    Great time teaching with keras & tensorflow workshop at 💯 support by & Ed team, & others... 🙏🙏🙏 Material is CC BY 4.0: ...enjoy!

  6. Retweeted
    Jan 29

    Excited to announce at that is now hosted within the and LF AI! Site: Blog: Slides: Also, sparklyr 1.1 now on CRAN, adds support for with !

  7. Retweeted
  8. Retweeted
    Jan 22

    The latest h2o package, v3.28.0.2, is now on CRAN! Lots of new features to check out, including: ✅ Hierarchical GLMs ✅ New leaderboard metrics ✅ Parallel grid search (super speedup!) ✅ updates ✍️Read more here:

  9. Retweeted
    Jan 20

    Interesting paper showing how dozens of studies have accidentally leaked large amounts of data from train->test dataset, by duplicating data items prior to doing a random split.

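    The fix for this failure mode can be sketched in a few lines. The data and column names below are illustrative, not from the paper: the point is simply that deduplicating before the random split makes train/test leakage via duplicates impossible.

    ```python
    # Toy illustration (assumed data/column names): deduplicate BEFORE the
    # random split so identical items cannot leak from train into test.
    import pandas as pd
    from sklearn.model_selection import train_test_split

    df = pd.DataFrame({
        "text": ["a", "b", "b", "c", "a", "d"],   # note the duplicated items
        "label": [0, 1, 1, 0, 0, 1],
    })

    deduped = df.drop_duplicates()                 # 4 unique rows remain
    train, test = train_test_split(deduped, test_size=0.25, random_state=42)

    # Sanity check: no row in test also appears in train.
    leaked = train.merge(test, how="inner")
    print(len(leaked))  # 0
    ```

    Splitting first and deduplicating afterwards would not help: a duplicated item already assigned to both sides has leaked by then.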
  10. Retweeted

    Geek alert 🤓 I’m super excited to be back at next week in SF 🌉 Hoping to meet lots of new people and see some familiar faces — say hi! 👋🏼 I’ll be a TA in ‘s workshop on both days. He’s created an aces training program 💯

  11. Retweeted
    Jan 16

    “Be curious. Read widely. Try new things. I think a lot of what people call intelligence boils down to curiosity.” - Aaron Swartz

  12. Jan 14

    Neural network hyperparameter optimization can be daunting...but it can also "be reasonably quick if one searches for clues in the test loss early in training." This paper is great for anyone trying to put some logic behind model tuning.

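    The "look early" idea is closely related to successive-halving style searches: give every configuration a cheap short run, then spend the real training budget only on the survivors. A purely hypothetical sketch (the config grid and the loss stand-in are invented for illustration):

    ```python
    # Hypothetical sketch of "look early" tuning: score each config after a
    # short run, keep the best half, then train only the survivors longer.
    import random

    random.seed(0)

    def short_run_loss(config, epochs):
        # Stand-in for "train `epochs` epochs, return validation loss".
        lr, width = config
        return abs(lr - 0.01) * 10 + 1.0 / (width * epochs) + random.gauss(0, 0.01)

    configs = [(lr, width) for lr in (0.1, 0.01, 0.001) for width in (32, 64)]

    # Round 1: cheap 1-epoch runs; keep the most promising half.
    scored = sorted(configs, key=lambda c: short_run_loss(c, epochs=1))
    survivors = scored[: len(scored) // 2]

    # Round 2: longer (10-epoch) runs only for the survivors.
    best = min(survivors, key=lambda c: short_run_loss(c, epochs=10))
    print(best)
    ```

    The payoff is the budget: most configurations only ever cost one cheap epoch instead of a full training run.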
  13. Jan 13

    tip: don't just rely on a constant or decaying learning rate, often cyclical learning rates can improve performance. Read this paper for the why and how.

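    The triangular schedule from that paper fits in a few lines. This is a minimal sketch in the spirit of Smith's cyclical-learning-rate method; the base_lr, max_lr, and step_size values are illustrative, not from the tweet:

    ```python
    # Minimal triangular cyclical learning-rate schedule: the LR ramps
    # linearly from base_lr up to max_lr and back, every 2*step_size steps.
    def cyclical_lr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=2000):
        cycle = iteration // (2 * step_size)
        x = abs(iteration / step_size - 2 * cycle - 1)  # 1 -> 0 -> 1 within a cycle
        return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)

    print(cyclical_lr(0))      # start of cycle: at base_lr
    print(cyclical_lr(2000))   # mid-cycle: peaks at max_lr
    ```

    In practice this would be wrapped in a framework callback that sets the optimizer's learning rate each batch; the function itself is framework-agnostic.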
  14. Jan 13

    Getting excited. Partly because this title slide is far more inviting than the cold and dreary Ohio winter day outside my window!

  15. Jan 11

    Learning curve issue #7: Underrepresented validation data. Val loss < train loss, no matter how many iterations are performed. Often from data leakage or poor sampling procedures. Try: 1. ✓ for duplicate obs 2. ✓ for data leaks 3. ➕ k-fold CV / bootstrap

  16. Retweeted
    Jan 10

    Enjoyed this 2016 article by on optimization algorithms for gradient descent, covers stochastic methods to distributed training -- Definitely still relevant in 2020!

  17. Jan 10

    R tip: use a DESCRIPTION file in any project/repo (it does NOT have to be a 📦) to simplify dependency installation for end users. The end user simply runs devtools::install_deps() to install all 📦s listed under Imports. 's 📚s (i.e. ) are 👍 examples

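    As a sketch, a minimal DESCRIPTION for a non-package analysis repo might look like this (the project name and the packages under Imports are illustrative):

    ```
    Package: myanalysis
    Title: What the Project Does (One Line)
    Version: 0.0.1
    Imports:
        dplyr,
        ggplot2
    ```

    With this file at the repo root, a collaborator who clones the project runs devtools::install_deps() from that directory and gets every package listed under Imports installed in one step.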
  18. Jan 10

    Learning curve issue #6: Underrepresented validation data. Train loss looks to be learning, but validation loss shows noisy movement and little or no improvement. Try: 1. ⬆️ obs in the validation set 2. ➕ k-fold CV / bootstrap

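    The k-fold suggestion can be sketched in a few lines; the model and dataset below are illustrative stand-ins, not from the tweet. Instead of trusting one small, possibly noisy validation split, score the model on k resampled folds and average:

    ```python
    # Minimal k-fold cross-validation sketch: evaluate on 5 different
    # train/validation partitions and average, rather than one split.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
    print(round(scores.mean(), 3))  # mean accuracy across the 5 folds
    ```

    With every observation serving in a validation fold exactly once, the averaged score is far less sensitive to an unlucky split than a single hold-out estimate.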
  19. Retweeted
    Jan 9

    What I did over my winter break! It gives me great pleasure to share this summary of some of our work in 2019, on behalf of all my colleagues at & .

  20. Jan 9
  21. Jan 9

    Excited to see the podcast will be focusing on interpretable in 2020. Quite fitting that the first guest was !

