Excited to share our #CVPR2022 paper ObjectFolder 2.0, a multisensory object dataset with visual, acoustic, and tactile data for Sim2Real transfer.
Paper: arxiv.org/pdf/2204.02389
Project page: ai.stanford.edu/~rhgao/objectf
Dataset/Code: github.com/rhgao/ObjectFo
Details with narration👇
Compared to our prior work ObjectFolder 1.0, this new dataset is 10× larger in the number of objects and orders of magnitude faster in rendering time. We also significantly improve the multisensory rendering quality for all three modalities.
Visualization of all 1,000 objects in ObjectFolder 2.0. We leverage existing high-quality scans of real-world objects from online repositories such as Google Scanned Objects and the ABO dataset.
Examples of the acoustic data (impact sounds) from ObjectFolder 2.0, compared with ObjectFolder 1.0 and real impact sound recordings.
Examples of the tactile data from ObjectFolder 2.0 for sample objects. We vary the contact location as well as the contact rotation and the gel penetration depth.
Awesome collaboration with Zilin Si and Wenzhen Yuan.