VICTOR BASU

@victor_basu_360

Learning and will always be a learner || Computer Science Engineer || I don't know what to do in life so I just do everything.

Joined October 2018

Tweets

  1. Pinned Tweet
    Jul 2, 2019

    Promoted to Kernels Expert at . A year has passed since I started my journey, and I have learned a lot of new things at Kaggle. I will keep learning; a lot of exciting kernels on ML and DL based on modern research are still to come. Still a long way to go...😤😤

  2. Retweeted
    Jan 27

    You can touch the mountains but you'll still miss your home. ❤️ Such is the cost of distance and sacrifice!

  3. Retweeted
    Jan 27

    When a new competition launches

  4. Jan 8

    My very first implementation of BERT at . Link - This kernel should be very helpful to anyone struggling to find a starting point for implementing BERT for NLP. Please go through the kernel, and like and share it if you find it helpful. (A minimal illustrative sketch of such a starting point appears after this timeline.)

  5. Dec 31, 2019

    NLP with Disaster Tweets (Analysis and Prediction). Link - This is my last open-source kernel this year at . I have learned a lot this year, and thanks to all of you for supporting my work. Learning, and will always be a learner.

  6. Dec 27, 2019

    Also, as a final-year college CSE student, I strongly support this opinion: to study ML, DL, or data science, you just need to practice and cultivate your skill every single day, no matter how small the effort is.

  7. Retweeted
    Dec 26, 2019
    Replying to
  8. Retweeted
    Dec 22, 2019

    All birds find shelter during rain. But an eagle avoids rain by flying above the clouds. - APJ Abdul Kalam

  9. Retweeted

    ImageNet accuracy by year:
    2012: 0.633 (AlexNet)
    2013: 0.663 (Five Base + Five HiRes)
    2014: 0.745 (VGG-19)
    2015: 0.788 (Inception V3)
    2016: 0.823 (ResNeXt)
    2017: 0.829 (PNASNet-5)
    2018: 0.854 (ResNeXt)
    2019: 0.874 (EfficientNet-L2)

  10. Retweeted
    Dec 15, 2019

    Desperate question: How do all the ML people out there keep track of the barrage of new information? Formal note-keeping? Rely on whatever sticks? Some other system? It seems like each year I'm losing ground; I would love tried-and-true tips. (I'd love a RT for visibility. Thanks!)

  11. Retweeted
    Dec 11, 2019

    StyleGAN 2: Mostly We Got Rid of the Blobs. Also: no 'phase artifacts' (I hadn't noticed them, but they're obvious in retrospect), easier-to-navigate latent spaces mean better encoding, and it's ~25% more efficient. pdf: code (coming later):

  12. Dec 11, 2019

    I do have a weakness for making spelling mistakes, confusing spellings that sound similar, or maybe creating a new word by combining them during exams. But the worst thing that happens to me every time is when I finally realize that I have done it again, after the exam.

  13. Dec 9, 2019

    Do you know what's going on between the input and output layers of a neural network during classification? Here is the answer👇👇

  14. Dec 5, 2019

    Hey guys, you should also attend Google Cloud OnBoard Application Development like me. It's very interesting, and you get a lot to learn from it.

  15. Retweeted
    Nov 21, 2019

    EfficientDet: a new family of efficient object detectors. It is based on EfficientNet and is many times more efficient than state-of-the-art models. Link: Code: coming soon

  16. Nov 13, 2019

    I have also faced these problems in job interviews, where the interviewer thinks that machine learning and deep learning are all about creating A.I. robots and terminators, and just keeps asking me all kinds of questions totally out of context.😐😐

  17. Nov 9, 2019

    Am I the only one on earth who is too shy to create a kernel or code on his very own dataset? 😐😐 Link to the dataset -

  18. Retweeted
    Nov 7, 2019

    I have just published my dataset at on Asteroid. I hope this dataset will be very helpful to you in your data science projects and in astronomical research with the help of data science and ML. Credits - , , Link -

  19. Retweeted
    Nov 4, 2019

    Hey guys, I have created a tutorial at . I am creating an entire series for this tutorial. Chapter 1 (The Classification) is released. Link - Please go check it out.☝️☝️ Chapter 2 will be released very soon. Till then, stay tuned. 🤗🤗

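A note on the BERT starting point mentioned in tweet 4: since the kernel link is not preserved in this capture, the following is a minimal sketch of what such a starting point typically looks like, assuming the Hugging Face transformers library with PyTorch. The checkpoint name, example texts, and two-label setup are illustrative assumptions and are not taken from the original kernel.

```python
# Minimal BERT-for-text-classification starting point (illustrative sketch).
# Assumptions: Hugging Face transformers + PyTorch, the "bert-base-uncased"
# checkpoint, and a two-class task; none of this is taken from the kernel.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-uncased"                      # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    num_labels=2,  # classification head is freshly initialised; fine-tune before real use
)

texts = [                                             # hypothetical example inputs
    "the forecast says heavy storms tonight",
    "what a beautiful day",
]
batch = tokenizer(texts, padding=True, truncation=True,
                  max_length=64, return_tensors="pt")

model.eval()
with torch.no_grad():
    logits = model(**batch).logits                    # shape: (batch_size, num_labels)
    probs = torch.softmax(logits, dim=-1)             # per-class probabilities

print(probs)
```

Fine-tuning the classification head on labelled data (for example, a dataset like the Disaster Tweets one mentioned in tweet 5) would follow a standard training loop; only the tokenize-and-predict skeleton is shown here.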
