06 July 2024

ClutterGen:

A Cluttered Scene Generator for Robot Learning

Preprint

Yinsen Jia, Duke University (yjia.net/)
Boyuan Chen, Duke University (boyuanchen.com)

Overview

We introduce ClutterGen, a physically compliant simulation scene generator capable of producing highly diverse, cluttered, and stable scenes for robot learning. Generating such scenes is challenging because every object must satisfy physical constraints such as gravity and collision. As the number of objects grows, finding valid poses becomes increasingly difficult and typically requires substantial human engineering effort, which limits scene diversity. To overcome these challenges, we propose a reinforcement learning method that can be trained with physics-based reward signals provided by the simulator. Our experiments demonstrate that ClutterGen can generate cluttered object layouts with up to ten objects on confined table surfaces. In addition, our policy design explicitly encourages diversity in the generated scenes for open-ended generation. Our real-world robot results show that ClutterGen can be directly used for clutter rearrangement and stable placement policy training.
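To make the reward idea concrete, here is a minimal sketch (not the authors' implementation) of how a physics-based stability signal could be computed with a PyBullet-style simulator: a proposed placement is rolled forward under gravity, and the reward reflects whether the object stays close to where the policy put it. The asset paths, drop pose, thresholds, and simulation horizon below are illustrative assumptions, not taken from the ClutterGen codebase.

# Hedged sketch of a physics-based stability reward, assuming PyBullet.
import numpy as np
import pybullet as p
import pybullet_data

def stability_reward(body_id, settle_steps=240, pos_tol=0.01):
    # Reward is 1.0 if the object stays (approximately) where the policy
    # placed it after the physics settles, otherwise 0.0.
    start_pos, _ = p.getBasePositionAndOrientation(body_id)
    for _ in range(settle_steps):      # let gravity and contacts act
        p.stepSimulation()
    end_pos, _ = p.getBasePositionAndOrientation(body_id)
    drift = np.linalg.norm(np.array(end_pos) - np.array(start_pos))
    return 1.0 if drift < pos_tol else 0.0

if __name__ == "__main__":
    p.connect(p.DIRECT)                # headless physics simulation
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    p.setGravity(0, 0, -9.81)
    p.loadURDF("table/table.urdf")                       # support surface
    cube = p.loadURDF("cube_small.urdf",
                      basePosition=[0.0, 0.0, 0.65])     # pose proposed by the policy
    print("stability reward:", stability_reward(cube))

In practice a denser signal (for example, penalizing the drift magnitude rather than thresholding it) would give the policy richer feedback, but the binary check above captures the core idea of letting the simulator judge physical compliance.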

Video (Click to YouTube)


Paper

Check out our paper on arXiv: https://arxiv.org/abs/2407.05425

Codebase

Check out our codebase at https://github.com/generalroboticslab/ClutterGen.

Citation

@article{jia2024cluttergen,
  title={ClutterGen: A Cluttered Scene Generator for Robot Learning},
  author={Yinsen Jia and Boyuan Chen},
  year={2024},
  eprint={2407.05425},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2407.05425},
}

Acknowledgment

This work is supported by the ARL STRONG program under awards W911NF2320182 and W911NF2220113, and by the DARPA FoundSci program under award HR00112490372.

Contact

If you have any questions, please feel free to contact Yinsen Jia.

Categories

Robot Learning, Transfer Learning