1/ Starting my road to learning about AI. I’ve done some courses here and there but I am planning a more consistent effort over the next few years.
Mostly documenting my path for others and for fun. Often people suggest things I didn’t know about, which can be helpful.
2/ I am doing course.fast.ai again. I did Part 1 in 2018, and binged the first 4 lessons Fri-Sun. Jeremy was kind enough to invite me to the ongoing 2022 version.
Wait a few weeks before pursuing this; the new version will show you new tools and a few new concepts.
3/ I made a custom Twitter list to follow certain AI people: twitter.com/i/lists/153949
Mostly to stay on top of the cutting edge.
4/ I’ve been messing around with prompts again in the OpenAI playground, experimenting with DALL-E and GPT-3. Made some album art for a song I plan to release next month and prototyped a new feature I hope we’ll ship later.
Just keeping up the momentum/motivation.
5/ Going to start writing real code and compete in a Kaggle competition to apply my knowledge before prototyping/building things.
s/o jeremyphoward for the guidance
Next step:
8/ I am in heaven now. Vim bindings and all. I can't see the files on the left, but you know what? I can live with that.
docs.paperspace.com/gradient/noteb
9/ Hands down the best experience: all your VS Code extensions (vim, copilot, etc.) in one integrated IDE running in a browser on cloud compute.
Don't bother with anything else.
10/ Trained my first model and submitted to this little example Kaggle competition: kaggle.com/competitions/p
11/ Had a nice conversation with yesterday about a project for Mighty I want to work on. He helped me think through what data to collect, what sort of approach to take, how to avoid getting too complicated, starting with small amounts of data, etc.
Thank you!
17/ Studying/learning this makes me feel pretty stupid. I am so glad a friend sent me this: blog.gregbrockman.com/how-i-became-a
It's almost like learning programming for the first time as gdb mentions.
19/ After a 1.5-hour convo with on a new approach, I’m going to make my second attempt. Turns out the NLP stuff isn’t a total waste.
20/ IT WORKS (I think?)
A lot more testing to validate but huzzah! It was quite confident that the other address bar results were not a match as well which is neat.
A big thank you for helping me get unstuck.
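The real matcher here is an NLP model, but as a rough, hypothetical illustration of how one might score address-bar candidates against a query, here is a toy bag-of-words cosine-similarity sketch (every name and sample string below is made up for illustration):

```python
import math
from collections import Counter

def bow_vector(text):
    """Bag-of-words token counts for a text (lowercased, whitespace-split)."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical query and candidate results
query = "github pull requests"
candidates = ["github pull requests page", "weather forecast today"]
scores = {c: cosine_similarity(bow_vector(query), bow_vector(c)) for c in candidates}
```

A score near 1 means a likely match; unrelated candidates land near 0, which is the "confident it's not a match" behavior described above, just in a much cruder form.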
23/ Quick, simple article that made some fundamentals of preventing overfitting click for me: cs.toronto.edu/~lczhang/360/l
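One of the standard techniques in this space, early stopping, fits in a few lines. This is my own toy sketch, not code from the article: stop once validation loss hasn't improved for `patience` epochs in a row.

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch index at which training would stop: the first
    epoch where validation loss has failed to improve for `patience`
    consecutive epochs, or the last epoch if that never happens."""
    best = float("inf")
    stale = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            stale = 0
        else:
            stale += 1
            if stale >= patience:
                return epoch
    return len(val_losses) - 1
```

The idea: the validation loss curve turning upward while training loss keeps falling is the signature of overfitting, so you keep the weights from before the turn.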
24/ Not super happy with my NLP classification model predictions. Going to take a stab at Tabular after some people suggested it.
I’ll be watching this video tonight:
26/ Learning about LSTM: karpathy.github.io/2015/05/21/rnn
Briefly got intrigued to learn more after skimming the OpenAI Dota 2 paper.
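To internalize the gate equations from Karpathy's post, a single LSTM cell step can be written in plain Python. A scalar (hidden-size-1) sketch with a made-up weight layout, just to show the forget/input/candidate/output gates and the two state updates:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W):
    """One step of a scalar LSTM cell.
    W maps each gate name to (input weight, hidden weight, bias):
    'f' forget gate, 'i' input gate, 'g' candidate, 'o' output gate."""
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])
    c = f * c_prev + i * g   # new cell state: gated memory + gated candidate
    h = o * math.tanh(c)     # new hidden state: gated squashed cell state
    return h, c
```

The cell state `c` is what lets the LSTM carry information across long sequences; the gates just decide how much to keep, write, and expose at each step.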
27/ Worked through lesson 8 with real production data: youtu.be/htiNBPxcXgo
Built a tabular model. Getting faster.
Trying to figure out how to do a multi-modal approach. Increasingly interested in understanding how to build nets from scratch to internalize the fundamentals.
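On the tabular side, a lot of the work is preprocessing. A toy sketch, loosely in the spirit of fastai's Categorify/FillMissing procs (the function and column names here are made up, and fastai's real API differs): map category strings to integer codes, with 0 reserved for unknown/missing, and fill missing continuous values with a middle value.

```python
def preprocess_tabular(rows, cat_cols, cont_cols):
    """Encode categorical columns as integer codes (0 = missing/unknown)
    and fill missing continuous values with the column's middle value."""
    # Build category -> code vocabularies from the data
    vocabs = {c: {} for c in cat_cols}
    for row in rows:
        for c in cat_cols:
            v = row.get(c)
            if v is not None and v not in vocabs[c]:
                vocabs[c][v] = len(vocabs[c]) + 1  # codes start at 1
    # Middle (upper-median) value per continuous column, for fills
    fills = {}
    for c in cont_cols:
        vals = sorted(row[c] for row in rows if row.get(c) is not None)
        fills[c] = vals[len(vals) // 2]
    processed = []
    for row in rows:
        out = {c: vocabs[c].get(row.get(c), 0) for c in cat_cols}
        for c in cont_cols:
            v = row.get(c)
            out[c] = fills[c] if v is None else v
        processed.append(out)
    return processed
```

The integer codes are what then feed embedding layers in a tabular net; keeping 0 for unknowns means unseen categories at inference time still map somewhere.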
