Tweets


  1. Retweeted

    Today's Python history lesson: Python took its control and data structures from ABC, its identifiers, strings and %-string formats from C, and its regular expressions from Perl. But its # comments (and #!) and -c command line flag came from the UNIX v7 shell.

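    A quick illustration, not from the tweet itself, of the borrowed features it lists:

      #!/usr/bin/env python3
      # '#' comments and the '#!' shebang convention came from the UNIX v7 shell.
      import re

      name = "Python"
      # %-string formats were taken over from C's printf-style formatting.
      print("Hello from %s" % name)

      # Regular expression syntax came via Perl (exposed through the re module).
      print(re.match(r"Py\w+", name).group())

      # The -c flag (also from the v7 shell) runs a one-liner straight from the shell:
      #   python3 -c 'print("hello")'
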
  2. Retweeted
    Jan 28

    💥 Starting the 2020 edition of 's Deep Learning class with ~200 students! 🤩 This year we *will* end up with annotated ✏️ video recordings 🎥 and publishable lecture notes 📖, as we're putting the extra effort to renew 🌟 and reorganise 🧐 the material. – location: NYU Courant Institute of Mathematical Sciences

  3. Retweeted
    Jan 27

    New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources (via )

  4. Retweeted
    Jan 27

    The surveillance state beyond just face recognition: “We need to have a serious conversation about all the technologies of identification, correlation and discrimination” by Bruce Schneier () in

  5. Retweeted
    Jan 27
  6. Retweeted
    Jan 21

    Let me highlight this amazing work I've read recently on in NLP, in which you'll find both: - a deep discussion of what it means for a neural model to be compositional - a deep and insightful comparison of LSTM, ConvNet & Transformers! 👉

  7. Retweeted
    Jan 21

    🚨New lecture series🚨 We've teamed up with to bring you the Deep Learning Lecture Series: 12 lectures covering a range of topics in Deep Learning - all led by DeepMind researchers, all free, and all open to everyone. Info & tickets:

  8. Retweeted
    Jan 14

    A blog post by and François Charton about their deeply-trained symbolic math system that can compute integrals and solve differential equations.

  9. Retweeted
    Jan 14

    Prepared to have your 🤯 ? Check out the stats on 's Big Red 200, the fastest university-owned supercomputer in the world. 🖥️:

  10. Retweeted
    Jan 8

    Many aspiring AI engineers ask me how to take the next step and join an AI team. This report from , a affiliate, walks you through how AI teams work and which skills you need for different AI career tracks. Download it here:

  11. Retweeted
    Jan 10

    Great talk! Explains how to vectorize slow pandas code. Here: replacing .apply when working w conditional statements. Was guilty of using .apply myself a lot recently because I thought of it as elegant. Turns out my old & actually preferred method, numpy.where, is a lot faster!

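    A minimal sketch of the pattern the tweet describes, replacing a row-wise .apply with numpy.where for a conditional column (the column name and threshold are made up):

      import numpy as np
      import pandas as pd

      df = pd.DataFrame({"score": np.random.rand(100_000)})

      # Slow: .apply calls a Python function once per element.
      df["label_apply"] = df["score"].apply(lambda s: "high" if s > 0.5 else "low")

      # Fast: np.where evaluates the condition on the whole column at once.
      df["label_where"] = np.where(df["score"] > 0.5, "high", "low")
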
  12. Retweeted
    Jan 3

    I heard about earlier this week. Tried it for the first time this morning. 🔥 Holy smokes 🔥 I'm not sure I'll ever write a Jupyter notebook again. They might still have some use case? But Streamlit is shockingly nice to use. Thx for telling me about it

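    For context, a minimal Streamlit script of the kind the tweet contrasts with a notebook (the file name and data are invented); it is launched with "streamlit run app.py":

      # app.py
      import numpy as np
      import pandas as pd
      import streamlit as st

      st.title("Quick exploration")

      # The script re-runs top to bottom whenever the slider changes.
      n = st.slider("Number of samples", 100, 10_000, 1_000)
      df = pd.DataFrame({"value": np.random.randn(n)})

      st.line_chart(df["value"])
      st.write(df.describe())
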
  13. Retweeted
    Dec 30, 2019

    📃 Notebooks can be hand-outs too! Our new print styles make sure your notebooks look as good on paper as they do on screen. Simply print your current browser tab, select "Hide headers & footers" and click "Print" or "Save as PDF..."

  14. Retweeted
    Dec 30, 2019

    Feature Selection: Beyond feature importance?

  15. Retweeted
    Dec 28, 2019

    Here's why the circle has 360 degrees: around 2400 B.C., the ancient Sumerians noticed the Sun's annual path across the sky was ~ 360 days. In order to track the Sun's motion, they decided to divide the circle in 360 degrees.

  16. Retweeted
    Dec 26, 2019

    Good summary of ML/DL/AI in 2019: Farewell to a landmark year; language models get literate; face recognition meets resistance; driverless cars stall; deepfakes go mainstream; simulation substitutes for data; the rule-based (symbolist) vs neurons (connectionist) debate

  17. Retweeted
    Dec 13, 2019

    Joblib parallel is getting a Spark backend. This will enable distributed fitting of on clusters. It requires work to be solid: contribute

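    A rough sketch of how a joblib backend drives scikit-learn fitting; the estimator and data are placeholders, and the Spark backend itself comes from the separate joblib-spark project rather than joblib proper:

      from joblib import parallel_backend
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)

      # Any registered joblib backend can run the fit below; once a "spark"
      # backend is registered, the same block would distribute the trees over
      # a cluster instead of local threads.
      with parallel_backend("threading", n_jobs=4):
          clf = RandomForestClassifier(n_estimators=200)
          clf.fit(X, y)
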
  18. Retweeted

    A year ago I called for governments, companies & citizens to come together to protect the web as a force for good. Today, we launch the Contract for the Web — the first global plan of action to build the . Join us.

  19. Retweeted
    Nov 12, 2019

    Want to improve accuracy and robustness of your model? Use unlabeled data! Our new work uses self-training on unlabeled data to achieve 87.4% top-1 on ImageNet, 1% better than SOTA. Huge gains are seen on harder benchmarks (ImageNet-A, C and P). Link:

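    This is not the paper's method, but scikit-learn's SelfTrainingClassifier illustrates the basic self-training loop the tweet refers to: a model trained on the labeled subset pseudo-labels the unlabeled pool and is refit on both (the dataset and base estimator here are stand-ins):

      import numpy as np
      from sklearn.datasets import load_digits
      from sklearn.semi_supervised import SelfTrainingClassifier
      from sklearn.svm import SVC

      X, y = load_digits(return_X_y=True)

      # Pretend roughly 90% of the labels are unknown: -1 marks unlabeled samples.
      rng = np.random.default_rng(0)
      y_partial = y.copy()
      y_partial[rng.random(len(y)) < 0.9] = -1

      # The base classifier must expose predict_proba for confidence thresholding.
      base = SVC(probability=True, gamma="scale")
      self_trained = SelfTrainingClassifier(base, threshold=0.8)
      self_trained.fit(X, y_partial)
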
  20. Retweeted
    Nov 8, 2019

    does not have a standard benchmark for interpretability. I am stoked to announce ERASER: the first-ever effort on unifying and standardizing NLP tasks with the goal of interpretability.

