Russ Salakhutdinov

@rsalakhu

UPMC Professor of Computer Science at Carnegie Mellon University

Joined January 2015

Tweets


  1. Feb 1

    Got promoted to full professor at CMU :) Thanks to my amazing students, collaborators, and mentors. CMU is a remarkable place, but what makes it truly unique is its brilliant students! Hard to find a better place to do ML and AI.

  2. Retweeted

    The 2020 Information Theory and Applications Workshop kicks off on Sunday here in San Diego! Researchers from around the country are gathering to apply theory (i.e., deep learning, ML) to diverse areas of science and engineering. Schedule and more info:

  3. Jan 30

    Very excited to see two of my PhD students, Devendra Singh Chaplot and Yao-Hung Hubert Tsai, receive Facebook PhD fellowships. Congratulations!

  4. Retweeted
    Jan 23

    Excited to announce the 2nd workshop on multimodal language! We welcome submissions in all areas of human language, multimodal ML, multimedia, affective computing, and applications! w/ fantastic speakers:

  5. Retweeted
    Jan 15

    ***spread the word!!!*** ITA and ALT team up for a joint symposium day on Feb 8!! Tutorials by Russ Salakhutdinov, Alex Slivkins & Bobby Kleinberg. Talks by Peter D. Grünwald and Cosma Shalizi. Register for ALT and attend:

  6. Jan 6

    Graduate school applications at the CMU School of Computer Science: "60% of apps for the *entire school* mention the phrase 'machine learning' at least once."

  7. 27 Dec 2019
  8. Retweeted
    25 Dec 2019

    Christmas morning academia quotes! You can either try to build it for the next decade or... just keep talking about it, but still claim "thought leadership" if/when things start working in three decades.

  9. Retweeted
    25 Dec 2019

    Video & slides for the LIRE workshop are now up: Check out the talks and panel by Jeff Bilmes, Tom Griffiths & more. Thanks to all speakers & presenters for making the workshop a success!

  10. 25 Dec 2019

    Someone told me once: You can do AI or you can just talk about it.

  11. Retweeted

    From all of us at the Machine Learning Department at CMU, we wish you a happy holiday season and a prosperous New Year!🤗

    Happy holidays greeting from the Machine Learning Department at Carnegie Mellon, the background shows the Gates Hillman Center
  12. Retweeted

    A leader in robotics research and education is recruiting faculty in all areas of expertise. Deadline is Jan. 3.

    A button proclaiming "I (heart) Robots!"
  13. Retweeted
    14 Dec 2019

    The MineRL competition session is about to start - so exciting!! It’s been amazing to work with such a brilliant team, and see the fantastic progress and achievements of the participants!

  14. Retweeted

    Wow, I am so excited! The competition was featured in The Verge! We've come a long way towards Minecraft AI in the past 2 years, but this competition is a great reminder that there are so many more challenges to solve! Read More:

  15. Retweeted
    13 Dec 2019

    Come join the workshop on Learning with Rich Experience. Note the location: West 208+209. Looking forward to the super exciting talks by Jeff Bilmes & Tom Griffiths, and the contributed presentations:

  16. 13 Dec 2019

    BBC article about our competition: Come and check out the winners and talks from the top competitors: Sat Dec 14, 9:00 AM @ West 116 + 117.

  17. Retweeted
    12 Dec 2019

    Our Mixtape is 3.5-10.5x faster than Mixture of Softmaxes, w/ SOTA results in language modeling & translation. The key is to do gating in the logit space, but with vectors instead of scalars (+ sigmoid tree decomposition & gate sharing for efficiency). w/ Zhilin, Russ, Quoc

  18. 12 Dec 2019

    Mixtape: breaking the softmax bottleneck that limits the expressiveness of neural language models. A network with a Mixtape output layer is only 35% slower than a softmax-based network, while outperforming softmax in perplexity & translation quality.

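The vector gating described in the two Tweets above can be sketched roughly as follows. This is not the authors' implementation: all parameter names and shapes are illustrative, the weights are random, and a stick-breaking chain of sigmoids stands in for the paper's sigmoid tree decomposition. The point is only the structural difference from Mixture of Softmaxes: each word gets its own K gate weights over the component logits, and a single softmax is applied at the end.

```python
import numpy as np

rng = np.random.default_rng(0)
d, V, K = 16, 100, 4  # hidden size, vocab size, mixture components (illustrative)

# Hypothetical parameters, random for illustration
E = rng.normal(size=(V, d))           # word embeddings
Wk = rng.normal(size=(K, d, d))       # per-component context projections
Wg = rng.normal(size=(V, K - 1, d))   # per-word gate parameters (K-1 sigmoids -> K gates)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mixtape_probs(h):
    """Gate in logit space with word-specific (vector) gates, then one softmax."""
    hk = np.tanh(Wk @ h)        # (K, d): K component contexts
    logits_k = E @ hk.T         # (V, K): per-component logit for every word

    # K-1 sigmoids per word, combined stick-breaking style so gates sum to 1.
    # (The paper uses a sigmoid tree; this chain is a simplified stand-in.)
    s = sigmoid(Wg @ h)         # (V, K-1)
    g = np.empty((V, K))
    rest = np.ones(V)
    for k in range(K - 1):
        g[:, k] = rest * s[:, k]
        rest = rest * (1.0 - s[:, k])
    g[:, K - 1] = rest

    logits = (g * logits_k).sum(axis=1)  # (V,): word-specific gated mixture of logits
    p = np.exp(logits - logits.max())    # single softmax over the gated logits
    return p / p.sum()

p = mixtape_probs(rng.normal(size=d))
```

Because the mixing happens before the (single) softmax, the output distribution is not constrained to a convex combination of K softmaxes, which is how the bottleneck is avoided while keeping only one normalization per step.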
  19. Retweeted
    10 Dec 2019

    Check out our latest work on modeling the "what's" and "where's" of objects and parts with geometric capsule representations from 3D point clouds. I'm at the conference all week if you want to chat about it!

  20. 10 Dec 2019

    Capsules may replace ConvNets one day.

