ObjectFolder 2.0

Multimodal dataset of everyday objects with visual, audio, and touch modalities

Released in: ObjectFolder 2.0: A Multisensory Object Dataset for Sim2Real Transfer

Summary

Objects play a crucial role in our everyday activities. Though multisensory object-centric learning has shown great potential lately, the modeling of objects in prior work is rather unrealistic. ObjectFolder 1.0 is a recent dataset that introduces 100 virtualized objects with visual, acoustic, and tactile sensory data. However, the dataset is small in scale and the multisensory data is of limited quality, hampering generalization to real-world scenarios. The authors present ObjectFolder 2.0, a large-scale, multisensory dataset of common household objects in the form of implicit neural representations that significantly enhances ObjectFolder 1.0 in three aspects. First, the new dataset is 10 times larger in the number of objects and orders of magnitude faster in rendering time. Second, the authors significantly improve the multisensory rendering quality for all three modalities. Third, they show that models learned from virtual objects in their dataset successfully transfer to their real-world counterparts in three challenging tasks: object scale estimation, contact localization, and shape reconstruction. ObjectFolder 2.0 offers a new path and testbed for multisensory learning in computer vision and robotics.
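Each object in the dataset is packaged as an implicit neural representation that can be queried for sensory signals rather than a store of pre-rendered samples. The PyTorch sketch below is a purely illustrative stand-in for what querying such a representation might look like; the class name, the two heads, and all output dimensions are hypothetical and do not reflect the dataset's actual API (the paper's representations also condition on further inputs, e.g. camera viewpoint for vision, while this sketch conditions only on a 3D surface point for brevity).

import torch
import torch.nn as nn

class ObjectFileSketch(nn.Module):
    # Toy stand-in for one object's multisensory implicit representation.
    def __init__(self, hidden=256):
        super().__init__()
        # Touch head: 3D contact point -> flattened tactile image
        # (hypothetical 32x32 resolution).
        self.touch = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(), nn.Linear(hidden, 32 * 32))
        # Audio head: 3D contact point -> impact-sound magnitude spectrum
        # (hypothetical length 512).
        self.audio = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(), nn.Linear(hidden, 512))

    def forward(self, contact_xyz):
        tactile = self.touch(contact_xyz).view(-1, 32, 32)
        spectrum = self.audio(contact_xyz)
        return tactile, spectrum

obj = ObjectFileSketch()
point = torch.tensor([[0.10, -0.20, 0.05]])      # query point on the object surface
tactile_map, impact_spectrum = obj(point)
print(tactile_map.shape, impact_spectrum.shape)  # (1, 32, 32) and (1, 512)

Because each object ships as network weights rather than raw recordings, sensory data is rendered on demand at arbitrary query points, which is what makes the rendering-speed improvement over ObjectFolder 1.0 possible.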

1,000

Objects in dataset

2022

Year Released

Key Links & Stats

ObjectFolder 2.0

@inproceedings{gao2022ObjectFolderV2,
  title     = {ObjectFolder 2.0: A Multisensory Object Dataset for Sim2Real Transfer},
  author    = {Gao, Ruohan and Si, Zilin and Chang, Yen-Yu and Clarke, Samuel and Bohg, Jeannette and Fei-Fei, Li and Yuan, Wenzhen and Wu, Jiajun},
  booktitle = {CVPR},
  year      = {2022}
}

Modalities

  1. 3D Asset

Verticals

  1. General
  2. Home/Office

ML Task

  1. General
  2. Object Detection
  3. Object Recognition
  4. Object Tracking
  5. Scene Understanding

Related organizations

Stanford University

Carnegie Mellon University