Francesco Orabona

@bremen79

Assistant professor at Boston University. Formerly at Stony Brook University, , and . I hate hyperparameters in ML algorithms.

Joined February 2010

Tweets


  1. Pinned Tweet
    Nov 6, 2019

    Post-Doc position to work with me on Online Learning/Optimization/Statistical Learning Theory problems. Ideal candidate has multiple theory papers at NeurIPS/ICML/COLT and hates parameters in ML 😀 E-mail me to apply, we can also meet at NeurIPS. Please RT & share it!

  2. Retweeted
    Jan 26
  3. Retweeted
    Jan 23

    Google Dataset Search is now officially out of beta. "Dataset Search has indexed almost 25 million of these datasets, giving you a single place to search for datasets & find links to where the data is." Nice work, Natasha Noy and everyone else involved!

  4. Retweeted
    Jan 4

    I finally gave in and made a twitter account. I'm planning to keep the Machine Learning (Theory) blog for longer form discussions, but I've often found myself wanting to discuss relatively short things for which Twitter seems more natural than a blog.

  5. Jan 1

    As promised, I compiled all my lecture notes on Online Learning in a single PDF. Feedback is welcome! "A Modern Introduction to Online Learning" PS Happy New Year!

  6. Dec 17, 2019

    People of Milan, I will give a talk on December 19th at the University of Milan (via Celoria 18). I will talk about parameter-free machine learning through coin-betting

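The coin-betting idea behind parameter-free learning mentioned in the talk can be made concrete. Below is a minimal, illustrative 1-d version based on the Krichevsky-Trofimov bettor (function name and test objective are my own, a sketch of the general technique rather than the talk's exact algorithm): the learner keeps a "wealth", each round bets a data-dependent fraction of it, and needs no learning rate at all. It assumes subgradients bounded by 1.

```python
def coin_betting_1d(grad, T, eps=1.0):
    """Parameter-free 1-d online learning via Krichevsky-Trofimov coin betting.

    No learning rate: each round we bet a fraction of the current wealth,
    where the fraction is the KT estimate of the average negated gradient
    seen so far.  Assumes |grad(w)| <= 1 for all w.
    """
    wealth = eps        # initial endowment
    neg_grad_sum = 0.0  # running sum of -g_1, ..., -g_{t-1}
    iterates = []
    for t in range(1, T + 1):
        w = (neg_grad_sum / t) * wealth  # bet beta_t * wealth
        g = grad(w)
        wealth -= g * w                  # outcome of the bet on the "coin" -g
        neg_grad_sum -= g
        iterates.append(w)
    return iterates

# Hypothetical usage: minimize f(w) = |w - 10| with no step-size tuning.
ws = coin_betting_1d(lambda w: 1.0 if w > 10 else -1.0, T=20000)
avg = sum(ws) / len(ws)
```

The average iterate converges to the minimizer at roughly the same rate as optimally tuned gradient descent, up to logarithmic factors, which is the point of the "parameter-free" slogan.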
  7. Retweeted
    Dec 16, 2019

    Plz RT!🙏 and I are looking for summer interns to work on privacy-preserving machine learning. If you are interested, apply here: Plz share with PhD students who you think might be interested!

  8. Dec 12, 2019

    In our paper, we close the gap: we design a variant of Ridge Regression that matches the lower bound in *all* the cases. Also, we show novel rates in the case that the Bayes risk is 0. The proof is also interesting: we use online learning tools to prove it. 6/6 The end.

  9. Dec 12, 2019

    Now, a lower bound was known and Ridge Regression was matching it in almost all the regimes. But in some "difficult" regimes, there was a gap between the upper and lower bounds. Until now, it was not even clear if the lower bound was tight or not. 5/6

  10. Dec 12, 2019

    In this case, you can still achieve the performance of the best regression function, even if technically it is outside of your space! The rate will depend on how "infinite" its norm is and how "big" the space is. Papers by are the best reference on this topic. 4/6

  11. Dec 12, 2019

    Now, with kernels things are fuzzier. For example, Universal Kernels can approximate any continuous function. So, for universal kernels the optimal function can always be written in terms of the eigenfunctions of the kernel, but its norm might be infinite! 3/6

  12. Dec 12, 2019

    First, let's see the right way to study things with kernels. In a finite-dimensional space, the optimal regression function is either in the space or outside of it. If it is inside, sooner or later you will converge to it. If it is outside, you cannot achieve its performance. 2/6

  13. Dec 12, 2019

    It seems that Kernels are fashionable again, with all these Neural Tangent Kernels everywhere. However, do you know what is the best that you can do in regression with kernels? At poster #242 (5:00-7:00 PM), , , and myself

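The thread above (tweets 1/6 through 6/6, shown newest first) is about regression with kernels. As a concrete anchor for readers, here is a minimal kernel ridge regression sketch with a Gaussian kernel, which is universal in the sense the thread uses. Function names, parameters, and the test problem are illustrative, not from the paper:

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian kernel k(x, x') = exp(-gamma (x - x')^2); a universal kernel."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def krr_fit(x, y, lam=1e-3, gamma=1.0):
    """Kernel ridge regression on 1-d inputs: solve (K + lam * n * I) alpha = y."""
    n = len(x)
    K = rbf_kernel(x, x, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(x_train, alpha, x_test, gamma=1.0):
    """Predict f(x) = sum_i alpha_i k(x, x_i)."""
    return rbf_kernel(x_test, x_train, gamma) @ alpha

# Illustrative usage: fit a smooth function from samples on a grid.
x = np.linspace(0.0, 3.0, 60)
y = np.sin(2.0 * x)
alpha = krr_fit(x, y, lam=1e-6, gamma=10.0)
pred = krr_predict(x, alpha, x, gamma=10.0)
```

The rates discussed in the thread govern how the test error of exactly this estimator decays with the sample size, as a function of the regularization `lam` and of how well the target sits inside the kernel's function space.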
  14. Dec 12, 2019

    An important note: do you know that SVRG was invented independently by two groups at the same conference? Apparently ~98% of you are missing the second reference: Please do the right thing and always cite both papers!

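For readers who know SVRG only by name: the algorithm is easy to state. A minimal sketch for least squares (variable names and the test problem are my own, not from either paper): each epoch takes a snapshot of the iterate, computes one full gradient there, and then runs cheap stochastic steps whose variance is reduced by the snapshot correction.

```python
import numpy as np

def svrg_least_squares(X, y, step=0.01, epochs=200, seed=0):
    """SVRG for f(w) = 1/(2n) ||Xw - y||^2.

    Per epoch: one full-gradient pass at a snapshot, then n stochastic
    steps using the variance-reduced direction g_i - g_i_snap + full_grad.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        full_grad = X.T @ (X @ w_snap - y) / n        # gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)
            g_i = X[i] * (X[i] @ w - y[i])            # sample gradient at w
            g_i_snap = X[i] * (X[i] @ w_snap - y[i])  # same sample at snapshot
            w = w - step * (g_i - g_i_snap + full_grad)
    return w

# Illustrative usage on a synthetic noiseless problem.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
w_true = np.array([1.0, -2.0, 3.0, 0.5, -1.5])
w_hat = svrg_least_squares(X, X @ w_true)
```

The snapshot correction is what gives SVRG linear convergence on strongly convex problems with a constant step size, unlike plain SGD.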
  15. Dec 10, 2019

    After the lottery for registration, apparently we need a lottery to see the poster session at . They shut the doors and thousands of people are out

  16. Retweeted
    Dec 6, 2019

    To help you plan which posters to check out next week at , consider using the following page, which lists all subject areas and their corresponding posters, for each poster session:

  17. Dec 6, 2019

    ML and Optimization people, we have 2 positions for assistant professors in the ECE department at Boston University. We have strong research, a friendly environment, and we are in a city 🥳 CS+ECE is 13th on CSRankings for ML+CV:

  18. Retweeted

    The double descent phenomenon is described in ~1000 papers & talks over the past year. It's featured in at least 1 slide per talk @ last summer's Simons workshop on Foundations of DL. Why is this post getting so much attention as if it's a new discovery? Am I missing smtg?

  19. Retweeted
    Dec 2, 2019

    Maria-Florina Balcan is an Assoc. Prof. of Computer Science at and has received many awards such as the Career award for her outstanding research. Come today to her talk on recent advances in !

  20. Nov 27, 2019

    New (and last!) lecture note on Online Learning: From Online Learning to Statistical Learning. Here, we show that the existence of a no-regret algorithm implies agnostic-PAC learnability, linking online learning to statistical learning.

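The link from no-regret to statistical learning is usually made through the online-to-batch conversion; a sketch of the standard argument for convex losses (not necessarily the note's exact statement): run the online algorithm on i.i.d. samples and return the averaged iterate.

```latex
% Online-to-batch conversion (standard argument, sketched; convex losses).
% Setting: losses \ell(\cdot, z), i.i.d. samples z_1, \dots, z_T,
% risk F(w) = \mathbb{E}_z[\ell(w, z)], online iterates w_1, \dots, w_T,
% regret R_T = \sum_{t=1}^T \ell(w_t, z_t) - \min_u \sum_{t=1}^T \ell(u, z_t).
%
% Since w_t depends only on z_1, \dots, z_{t-1}, we have
%   \mathbb{E}[\ell(w_t, z_t)] = \mathbb{E}[F(w_t)].
% Hence, with \bar{w} = \frac{1}{T}\sum_{t=1}^T w_t, Jensen's inequality gives
\mathbb{E}[F(\bar{w})]
  \le \frac{1}{T}\sum_{t=1}^T \mathbb{E}[F(w_t)]
  \le \min_u F(u) + \frac{\mathbb{E}[R_T]}{T}.
```

So any algorithm with sublinear regret, $R_T = o(T)$, yields excess risk going to zero, which is exactly the bridge to agnostic-PAC learnability that the lecture note describes.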
  21. Retweeted
    Nov 24, 2019

    proud to announce the program of social events we've put together with Maria for 2019!! once again, many thanks to the community for coming up with all these amazing proposals!!!

