Marcin Kardas

@misterkardas

Lifelong learner passionate about math, computers and artificial intelligence.

Joined February 2018

Tweets


  1. Pinned Tweet
    Aug 22, 2018

    ULMFiT + sentencepiece = state-of-the-art perplexity for Polish. Our (me and ) model (ppl. 118) easily won the PolEval'18 competition (second-best ppl. 147). Thank you , and for your invaluable contribution to DL & NLP.
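    Perplexity (ppl.) here is the exponential of the average per-token cross-entropy; a minimal Python sketch of the metric (my illustration, not the competition code):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood per token)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# A model that assigns each token probability 1/118 on average
# has perplexity 118.
nlls = [math.log(118)] * 1000
print(round(perplexity(nlls)))  # -> 118
```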

  2. Dec 4, 2019

    Interesting work! I believe that sooner rather than later someone will use neural networks to translate textbooks and papers into a theorem-prover language like Lean or Coq.
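    As an illustration of the kind of target such a translation would produce, a textbook statement like "the sum of two even numbers is even" might be rendered in Lean (a hypothetical sketch, assuming Mathlib's `Even.add`):

```lean
theorem even_add_even (m n : ℕ) (hm : Even m) (hn : Even n) :
    Even (m + n) :=
  Even.add hm hn
```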

  3. Retweeted
    Nov 26, 2019

    Want to export your conda environment but list only the packages you installed explicitly, not every dependency? Use `--from-history`:

    # conda env export --from-history
    name: pyarrow-test
    channels:
      - conda-forge
      - defaults
    dependencies:
      - python=3.7
      - pyarrow
  4. Oct 31, 2019

    Great statistics on deep learning framework usage over time, based on inspecting repos (by ). Interesting to see how research trends compare to search-query trends.

  5. Oct 30, 2019

    I've always (on a deep learning timescale) thought that the SentencePiece unigram model has an advantage over BPE in not keeping intermediate tokens, thus making better use of the vocabulary space. However, this redundancy of BPE may be crucial when subword regularization is used.
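    The redundancy in question: BPE builds words by successive merges, and every intermediate merge product stays in the vocabulary. A toy sketch (hypothetical corpus and merge rules, not the SentencePiece implementation):

```python
# Toy BPE: learn merges on a tiny corpus; every merge result,
# including intermediate subwords, enters the vocabulary.
from collections import Counter

def learn_bpe(words, num_merges):
    # Start from characters; counts come from word frequencies.
    vocab = {tuple(w): c for w, c in Counter(words).items()}
    subwords = set()
    for _ in range(num_merges):
        pairs = Counter()
        for word, count in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        subwords.add("".join(best))       # merged token joins the vocab
        new_vocab = {}
        for word, count in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] = count
        vocab = new_vocab
    return subwords

print(sorted(learn_bpe(["lower"] * 10, 4)))
# -> ['lo', 'low', 'lowe', 'lower']: the intermediate tokens
# 'lo', 'low', 'lowe' all occupy vocabulary slots.
```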

  6. Oct 28, 2019

    CVXPY - Differentiable Convex Optimization Layers for PyTorch and TensorFlow. Read the blog post by with a how-to and an interesting perspective on standard activation functions as convex optimization problems.
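    One instance of that perspective: ReLU(x) is the solution of a tiny convex problem, the projection of x onto the nonnegative half-line, min_y (y - x)^2 subject to y >= 0. A hedged pure-Python sketch solving it by projected gradient descent (an illustration, not the CVXPY API):

```python
def relu_via_projection(x, steps=500, lr=0.1):
    """Solve min_y (y - x)^2 subject to y >= 0 by projected gradient
    descent; the optimum is max(x, 0), i.e. ReLU(x)."""
    y = 0.0
    for _ in range(steps):
        y = y - lr * 2.0 * (y - x)  # gradient step on (y - x)^2
        y = max(y, 0.0)             # project back onto y >= 0
    return y

for x in (-2.0, 0.0, 1.5):
    print(x, "->", round(relu_via_projection(x), 6))
# -2.0 -> 0.0, 0.0 -> 0.0, 1.5 -> 1.5 (matches ReLU)
```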

  7. Retweeted
    Oct 22, 2019

    Most of the world’s text is not in English. We are releasing MultiFiT to train and fine-tune language models efficiently in any language. Post: Paper: With Marcin Kardas

  8. Retweeted
    Oct 10, 2019
    Replying to:

    And as of 20 minutes ago, you can see how much faster DistilGPT-2 is compared to the original GPT-2 small, thanks to a new service by @paperwithcode

  9. Oct 10, 2019

    Speed vs. BLEU-score trade-off for state-of-the-art English-German translation models. See more on the just-launched

  10. Retweeted
    Oct 10, 2019

    🎉 Introducing sotabench: a new service with the mission of benchmarking every open-source ML model. We run GitHub repos on free GPU servers to capture their results: compare to papers and other models, and see speed/accuracy trade-offs. Check it out:

  11. Retweeted
    Oct 4, 2019

    Join us next Thursday at the developer conference for an exciting update on Papers With Code and where we are headed next...

  12. Retweeted
    Aug 5, 2019

    An Animated History of ImageNet: from AlexNet to FixResNeXt-101. See the full table and add more results here:

  13. Jul 30, 2019

    If there's one thing that's impossible for Jeremy, it's... :) Much appreciated. Now, stop scrolling and start contributing. is a great place to start.

  14. Retweeted
    Jul 6, 2019

    Resources from my talk on Applied NLP Lessons at . Thanks to everyone who attended!

  15. Retweeted
    Jun 28, 2019

    After the top-down approach of part 1, part 2 is a comprehensive bottom-up approach that shows every single part needed to train a SOTA model on ImageNet.

  16. Retweeted
    Jun 28, 2019

    Thank you everyone for your patience - our new deep learning MOOC is here! Includes 5 lessons diving into the foundations with , and 2 lessons (co-taught with ) building a new deep learning library in

  17. Jun 15, 2019
  18. Retweeted
    Jun 10, 2019

    Amazing work by the team. Easier than ever before to discover, build upon and extend deep learning models!

  19. Retweeted
    May 31, 2019

    My & presentation ’s ULMFiT at AI NLP day was well received. I had a lot of fun presenting. I hope we see increased use of the library and some contributions back, especially an implementation of Unsupervised Data Augmentation in ULMFiT.

  20. May 26, 2019

    TIL about component framework, the top-trending JS project of this month, by inspecting what was used to implement Activation Atlas (). The great thing is that posts are open source ().

  21. Apr 14, 2019

    Together with we meet weekly with students from a local CS club to teach them DL and . I quickly realized that I'm much better at coding than drawing :) so for a lesson about ResNet internals I used ipyvolume to visualize a few tensors. Hope you like it.
