26 June 2024

SonicSense: Object Perception from In-Hand Acoustic Vibration

Preprint

Jiaxun Liu, Duke University
Boyuan Chen, Duke University (boyuanchen.com)

Overview

We introduce SonicSense, a holistic hardware and software design that enables rich robot object perception through in-hand acoustic vibration sensing. While previous studies have shown promising results with acoustic sensing for object perception, existing solutions are limited to a handful of objects with simple geometries and homogeneous materials, to single-finger sensing, and to evaluation protocols that train and test on the same objects. SonicSense enables container inventory status differentiation, heterogeneous material prediction, 3D shape reconstruction, and object re-identification on a diverse set of 83 real-world objects. Our system combines a simple but effective heuristic exploration policy for interacting with objects with end-to-end learning-based algorithms that fuse vibration signals to infer object properties. Our framework underscores the significance of in-hand acoustic vibration sensing in advancing robot tactile perception.
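
To make the two-stage idea concrete, below is a minimal Python sketch, not the released implementation: a tap-based exploration loop that collects in-hand vibration recordings, and a small network that fuses per-tap spectral features to predict a material class. All names and parameters here (tap_object, N_TAPS, MaterialNet, the feature dimensions) are illustrative assumptions; see the Codebase section below for the actual system.

# Hypothetical sketch of the explore-then-fuse pipeline; names and
# dimensions are assumptions, not the authors' released code.
import numpy as np
import torch
import torch.nn as nn

SAMPLE_RATE = 16_000   # assumed microphone sampling rate (Hz)
N_TAPS = 8             # assumed number of exploratory taps per object
N_BANDS = 64           # feature dimension per tap (log-spectrum bands)
N_MATERIALS = 9        # assumed number of material classes

def tap_object(tap_idx: int) -> np.ndarray:
    """Placeholder for the heuristic exploration policy: tap the
    object at a contact point and return the recorded in-hand
    vibration. Here we synthesize noise so the sketch runs."""
    rng = np.random.default_rng(tap_idx)
    return rng.standard_normal(SAMPLE_RATE // 2).astype(np.float32)

def spectral_feature(signal: np.ndarray) -> np.ndarray:
    """Crude feature: average |FFT| magnitude into N_BANDS bands.
    A real system would use log-mel features or similar."""
    mag = np.abs(np.fft.rfft(signal))
    bands = np.array_split(mag, N_BANDS)
    return np.log1p(np.array([b.mean() for b in bands], dtype=np.float32))

class MaterialNet(nn.Module):
    """Per-tap encoder, order-invariant mean-pooling fusion across
    taps, then a material classification head."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(N_BANDS, 128), nn.ReLU(),
                                     nn.Linear(128, 128), nn.ReLU())
        self.head = nn.Linear(128, N_MATERIALS)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, n_taps, N_BANDS)
        emb = self.encoder(feats)   # (batch, n_taps, 128)
        fused = emb.mean(dim=1)     # fuse vibration features across taps
        return self.head(fused)     # material logits

if __name__ == "__main__":
    feats = np.stack([spectral_feature(tap_object(i)) for i in range(N_TAPS)])
    logits = MaterialNet()(torch.from_numpy(feats).unsqueeze(0))
    print("predicted material id:", logits.argmax(dim=-1).item())

The mean-pooling fusion is one simple choice that makes the prediction invariant to tap order; the same pattern extends to the other tasks (e.g., regressing per-contact points for shape reconstruction or embedding objects for re-identification) by swapping the head.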

Video

An overview video is available on YouTube.

Paper

Check out our paper on arXiv: https://arxiv.org/abs/2406.17932

Codebase

Check out our codebase at https://github.com/generalroboticslab/SonicSense

Citation

@article{liu2024sonicsense,
  title={SonicSense: Object Perception from In-Hand Acoustic Vibration},
  author={Jiaxun Liu and Boyuan Chen},
  year={2024},
  eprint={2406.17932},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2406.17932},
}

Acknowledgment

This work is supported by the ARL STRONG program under awards W911NF2320182 and W911NF2220113, and by the DARPA FoundSci program under award HR00112490372.

Contact

If you have any questions, please feel free to contact Jiaxun Liu.

Categories

Multimodal Perception, Robot Learning