New work on Geometric Capsules: learning to group 3D points into parts, and parts into the whole object, in an unsupervised way. Each capsule represents a visual entity consisting of a pose and a feature, representing "where" and "what" it is. https://arxiv.org/abs/1912.03310 With @nitishsr and @Hanlin. pic.twitter.com/WP5GBaxI8y
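The pose-plus-feature idea can be sketched as a tiny container: a capsule holds a rigid pose ("where") and a descriptor vector ("what"). This is a minimal illustration only, not the paper's actual model; the `Capsule` class and its field names are hypothetical.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Capsule:
    """A hypothetical geometric capsule: a pose ('where') plus a feature ('what')."""
    translation: np.ndarray  # (3,) position of the part in the object frame
    rotation: np.ndarray     # (3, 3) orientation of the part
    feature: np.ndarray      # (d,) appearance/shape descriptor

    def to_object_frame(self, local_points):
        """Map points from the capsule's local frame into the object frame."""
        return local_points @ self.rotation.T + self.translation

cap = Capsule(
    translation=np.array([1.0, 0.0, 0.0]),
    rotation=np.eye(3),
    feature=np.zeros(16),
)
print(cap.to_object_frame(np.array([[0.0, 0.0, 0.0]])))  # [[1. 0. 0.]]
```

With an identity rotation, the local origin simply lands at the capsule's translation; a learned model would infer both the pose and the feature from the grouped points.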
-
#CoRL2019 paper from our Apple group on Worst Cases Policy Gradients: learning more robust policies by minimizing long-tail risk, reducing the likelihood of bad outcomes. https://arxiv.org/abs/1911.03618 With Charlie Tang and Jian Zhang. pic.twitter.com/LQaOI0yG49
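Long-tail risk objectives of this flavor are often expressed via CVaR: instead of the mean return, optimize the mean of the worst alpha-fraction of returns. A minimal sketch of that statistic, as a generic illustration and not the paper's exact objective:

```python
import numpy as np

def cvar(returns, alpha=0.1):
    """Conditional Value-at-Risk: mean of the worst alpha-fraction of returns."""
    returns = np.sort(np.asarray(returns, dtype=float))  # ascending: worst first
    k = max(1, int(np.ceil(alpha * len(returns))))       # size of the tail
    return returns[:k].mean()

# Penalizing the tail discourages policies whose average return is fine
# but whose worst episodes are catastrophic.
episode_returns = [1.0, 2.0, 3.0, -10.0, 2.5, 1.5, 2.2, -8.0, 2.8, 3.1]
print(cvar(episode_returns, alpha=0.2))  # -9.0, the mean of the two worst episodes
```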
-
#NeurIPS2019 paper on Multiple Futures Prediction: a sequential generative model that learns multi-step future motions/interactions of agents directly from multi-agent trajectory data, while remaining scalable to a large number of agents. https://arxiv.org/abs/1911.00997 With C. Tang at Apple. pic.twitter.com/rYHK1q9TuA
-
Check out the #NeurIPS2019 workshop on Sets and Partitions, focusing on models with set-based inputs/outputs, models of partitions, and novel clustering methodology: https://www.sets.parts/ pic.twitter.com/6pSAan8haG
-
Congratulations to Zhilin Yang for successfully defending his PhD thesis at CMU in just 4 years! Zhilin introduced XLNet, Transformer-XL, the Mixture-of-Softmaxes high-rank LM, HotpotQA, and GLoMo (unsupervised learning of relational graphs), to name a few: https://kimiyoung.github.io/ pic.twitter.com/UApoDIF7Op
-
Participate in the #MineRL #NeurIPS competition on sample-efficient reinforcement learning using human priors: http://minerl.io/competition/ With @wgussml, Brandon Houghton, et al., and with @MSFTResearch sponsoring the compute: https://www.youtube.com/watch?v=KFMuI4TfC7c&feature=youtu.be
-
It is interesting to see that Transformer-XL can already generate coherent, novel text articles with thousands of tokens; see below. Code, pretrained models, and paper: https://github.com/kimiyoung/transformer-xl XLNet will likely improve over Transformer-XL, and we will make those models available soon. pic.twitter.com/exJMrDS6SY
-
Congratulations to
@dchaplot, Saurabh Gupta for taking the 1st place in RGB-D track & a joint 1st place in RGB track at#CVPR2019 Habitat Challenge - Autonomous Navigation Challenge in Embodied AI https://aihabitat.org/challenge/ Code and paper coming up very soon (with Abhinav Gupta)pic.twitter.com/pTsX7BYRFI -
From your ICML 2019 Program Chairs, with @kamalikac. We are done, my friends! We hope you enjoyed ICML this year! Big thanks to all members of the Organizing Committee and to our workflow chairs for making it a successful conference! pic.twitter.com/G8dg8OfkQu
-
CMU MLD blog @mldcmu: Using Deep Learning to Help Us Understand Language Processing in the Brain: https://blog.ml.cmu.edu/2019/05/31/using-deep-learning-to-help-us-understand-language-processing-in-the-brain/ pic.twitter.com/9ns8R56256
-
New CMU ML blog entry: Your 2 is My 1, Your 3 is My 9: Handling Crazy Miscalibrations in Ratings from People: https://blog.ml.cmu.edu/2019/05/04/your-2-is-my-1-your-3-is-my-9-handling-crazy-miscalibrations-in-ratings-from-people/ pic.twitter.com/NWBkImVUlY
-
Slides from my talk on Integrating Domain Knowledge into Deep Learning at the New York Academy of Sciences @NYASciences. Special shoutout to @ZhitingHu and Bhuwan Dhingra for leading this amazing work and helping me with the slides: https://www.cs.cmu.edu/~rsalakhu/NY_2019_v3.pdf pic.twitter.com/E0SfabdM0B
-
(1/3) ICML 2019 Call for Papers (with @kamalikac): https://icml.cc/Conferences/2019/CallForPapers Key points: 1. Abstract submission deadline: January 18, 2019, 3:59 p.m. Pacific (23:59 Universal Time). 2. Full papers due: January 23, 2019, 3:59 p.m. Pacific (23:59 Universal Time). pic.twitter.com/cgUMqsLxmd
-
New paper: Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context: learning long-term dependency without disrupting temporal coherence, with SOTA on 5 datasets. With Zihang Dai, Zhilin Yang, et al. https://arxiv.org/abs/1901.02860 Code and pretrained models: https://github.com/kimiyoung/transformer-xl pic.twitter.com/Y9erWCuh5O
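The core mechanism behind the longer context is segment-level recurrence: hidden states from the previous segment are cached (with no gradient flowing into them) and reused as extended attention context for the current segment. A minimal single-head NumPy sketch, with illustrative shapes and weight names; the real model also uses relative positional encodings, omitted here:

```python
import numpy as np

def attend_with_memory(h, mem, Wq, Wk, Wv):
    """One attention step with Transformer-XL-style segment recurrence.

    h:   current segment hidden states, shape (seg_len, d)
    mem: cached hidden states from the previous segment (treated as constant)
    """
    context = np.concatenate([mem, h], axis=0)  # extended context: memory + current
    q = h @ Wq                                  # queries come from the current segment only
    k = context @ Wk                            # keys/values also see the cached memory
    v = context @ Wv
    scores = q @ k.T / np.sqrt(h.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over memory + current positions
    return weights @ v

rng = np.random.default_rng(0)
d, seg = 8, 4
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
mem = np.zeros((seg, d))  # empty memory for the first segment
for segment in rng.standard_normal((3, seg, d)):
    out = attend_with_memory(segment, mem, Wq, Wk, Wv)
    mem = segment         # cache this segment as memory for the next one
print(out.shape)          # (4, 8)
```

Because each layer's cache itself attended to the cache before it, the effective context grows with depth, which is how coherent multi-thousand-token samples become possible.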
-
New paper on Point Cloud GAN: learning to generate point clouds using ideas from hierarchical Bayes and implicit generative models: https://arxiv.org/abs/1810.05795 Open review: https://openreview.net/forum?id=ByxAcjCqt7 With Li, Zaheer, Zhang, Póczos. pic.twitter.com/lstJl2QZYZ
-
The Machine Learning Department @mldcmu at CMU has multiple tenure-track and multiple teaching-track positions. Join the best place to do ML and AI. https://www.ml.cmu.edu/Faculty_Hiring.html pic.twitter.com/nWvnsSrNgr
-
Excited to co-chair ICML 2019 with @kamalikac. Call for Papers: https://icml.cc/Conferences/2019/CallForPapers Major change this year: the abstract submission deadline is Jan 18; full papers are due Jan 23, 2019. pic.twitter.com/hLQldt6TYL
-
#nips2018 Deep Generative Models with Learnable Knowledge Constraints: establishing a mathematical connection between posterior regularization (PR) and RL, while expanding PR to learn constraints as the extrinsic reward in RL. https://arxiv.org/abs/1806.09764 With Z. Hu, Z. Yang, et al. pic.twitter.com/6PTdXM9Ld2
-
#nips2018 paper: GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations: learning generic latent relational graphs between words and pixels from unlabeled data, and transferring the graphs to downstream tasks: https://arxiv.org/abs/1806.05662 With Z. Yang, J. Zhao, et al. pic.twitter.com/1S1cY8wKhT