Search results

  3. Jan 29

    This video explains the amazing new Meena chatbot! An Evolved Transformer with 2.6B parameters trained on 341 GB / 40B words of conversation data achieves remarkable chatbot performance! "Horses go to Hayvard!"

  4. Day 13. Honestly I didn't code. Why? Because I need to understand the algorithms and they're very difficult.🤣😂😁 I know Python but the Machine Learning world is new to me. So I read this:

  5. Day 162/300 - Added 7 object detection pipelines to Monk-GUI, a graphical user interface for deep learning and computer vision built over the Monk libraries:

  6. 23/100 Not really a full day of "butt in seat" time, but I got some good problem solving in. Coding consisted of going back into some math and generalizing the hard numbers I used to originally get that part of the algorithm done.

  7. 8 hours ago

    Round 2, Day 42: I am still going back and forth between the perceptron and gradient descent. Looking for that 'click' moment to move on! Submitted the first ML algorithm and got the new one right after it :D

  8. Feb 1

    Day 8/100 Created and recorded my second machine learning web app, which predicts whether a patient has diabetes or not. GitHub: YouTube playlist:

  9. Feb 1

    D44 Took a break from again and got a good day started with but eventually went back to and hammered out the rest of 😀

  11. Day 17 & 18: Continuing with Python 😅 on the Python for Data Science and Machine Learning Bootcamp. Started pandas and NumPy. A big thanks to Jose Portilla

  12. Today we show you how you can get up and running with W&B in a GCP instance, so you can start tracking your model experiments like a pro. (A minimal sketch of the W&B calls involved follows the list below.)

  13. Day 7: 1) Studied the problem of overfitting and its solution by regularization. 2) Explored neural networks and started the third week's assignment. 3) In parallel, enjoying image processing with OpenCV. (A small regularization sketch follows the list below.)

  14. 3 hours ago

    R1D41: 3 things to get done today: Spatial Analysis by Luc Anselin, Lectures 1-6; Udacity Lesson 8; creation of a dataset with different azimuths and altitudes.

  15. Day 39: Working on my Deep Learning Nanodegree from Udacity. Learned the theory behind the 'black box' of gradient descent and used it to implement a neural network to predict bike sales. (A minimal gradient-descent sketch follows the list below.)

  18. Jan 31

    I highly recommend checking out the lecture series from "Full Stack Deep Learning" on YouTube

  19. If you're doing ML & need a community of people to nerd out with – W&B Forum is it. We have folks entering & winning hackathons together, getting help on weekend projects, talking shop. It's a great time🥰

  20. Jan 31

    Tensors are the basic data structures on which any neural network framework is built; TensorFlow is even named after them. A tensor is basically a generalization of matrices and vectors. (A short tensor example follows the list below.)

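For the W&B-on-GCP tip in result 12, here is a minimal experiment-tracking sketch; the project name, config values, and logged metric are made up for illustration, and it assumes pip install wandb and wandb login have already been run on the instance.

    # Minimal Weights & Biases tracking sketch (illustrative values only).
    import wandb

    run = wandb.init(project="gcp-demo",                  # hypothetical project name
                     config={"lr": 0.01, "epochs": 5})

    for epoch in range(run.config["epochs"]):
        fake_loss = 1.0 / (epoch + 1)                     # stand-in for a real training loss
        wandb.log({"epoch": epoch, "loss": fake_loss})

    run.finish()                                          # flush and close the run
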
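Result 13 mentions overfitting and its remedy by regularization; the sketch below contrasts plain least squares with L2 (ridge) regularization on a tiny synthetic dataset. The data, polynomial degree, and alpha value are illustrative only.

    # L2 (ridge) regularization as one common guard against overfitting.
    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 15).reshape(-1, 1)
    y = np.sin(2 * np.pi * x).ravel() + rng.normal(scale=0.2, size=15)

    X = PolynomialFeatures(degree=9).fit_transform(x)     # deliberately over-flexible model

    plain = LinearRegression().fit(X, y)                  # tends to fit the noise
    ridge = Ridge(alpha=1e-3).fit(X, y)                   # L2 penalty shrinks the weights

    print("unregularized weight norm:", np.linalg.norm(plain.coef_))
    print("ridge weight norm:", np.linalg.norm(ridge.coef_))
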
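Result 15 mentions learning how gradient descent works and using it to train a small neural network; below is a hand-rolled gradient-descent loop for a tiny one-hidden-layer regression network on made-up data, not the Udacity project itself.

    # Hand-rolled gradient descent for a tiny one-hidden-layer regression network.
    # Data, layer sizes, and learning rate are made up for illustration.
    import numpy as np

    rng = np.random.default_rng(42)
    X = rng.normal(size=(100, 3))                         # 100 samples, 3 features
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

    W1 = rng.normal(scale=0.1, size=(3, 8))               # input -> hidden weights
    W2 = rng.normal(scale=0.1, size=(8, 1))               # hidden -> output weights
    lr = 0.05

    for step in range(500):
        h = np.tanh(X @ W1)                               # forward pass
        pred = (h @ W2).ravel()
        err = pred - y                                    # gradient of 0.5 * squared error

        grad_W2 = h.T @ err[:, None] / len(y)             # backpropagate through both layers
        grad_h = err[:, None] @ W2.T
        grad_W1 = X.T @ (grad_h * (1 - h ** 2)) / len(y)

        W2 -= lr * grad_W2                                # gradient-descent updates
        W1 -= lr * grad_W1

    print("MSE at the last step:", np.mean(err ** 2))
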
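Result 20 describes tensors as the generalization of vectors and matrices; the NumPy snippet below simply illustrates that progression of ranks (scalar, vector, matrix, rank-3 tensor).

    # Tensors generalize scalars, vectors, and matrices to arbitrary rank.
    import numpy as np

    scalar = np.array(3.0)                 # rank 0
    vector = np.array([1.0, 2.0, 3.0])     # rank 1
    matrix = np.array([[1.0, 2.0],
                       [3.0, 4.0]])        # rank 2
    tensor3 = np.zeros((2, 3, 4))          # rank 3, e.g. batch x rows x cols

    for t in (scalar, vector, matrix, tensor3):
        print(t.ndim, t.shape)             # rank and shape of each tensor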