Dobb·E: On Bringing Robots Home
An open-source, general framework for learning household robotic manipulation
109 tasks
10 NYC homes
81% success rate
20 minutes to learn a new task
Videos
Dobb·E in action
Across 10 New York City homes, Dobb·E attempted 109 tasks. Here are sample rollouts from each task.
Hardware
The Stick
We believe one of the largest roadblocks to safe and scalable progress in home robotics, especially for imitation-learning-based approaches, is the lack of a cheap, ergonomic, and easy way to collect demonstrations for robots.
To address this, we built the Stick, a demonstration-collection tool assembled from a $25 reacher-grabber stick, a few 3D-printed parts, and an iPhone.
Dataset
Homes of New York (HoNY)
22 homes
216 environments
5620 trajectories
13 hours
1.5 million frames
Homes of New York (HoNY) is a dataset containing 13 hours of interactions collected with the Stick in 22 different New York City homes. The dataset contains RGB and depth video at 30 fps, together with full action annotations: the 6D pose of the gripper and the gripper's opening angle, normalized to (0, 1).
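Normalizing the gripper opening angle to (0, 1) amounts to a min-max rescale with clipping. A minimal sketch; the calibration limits below are hypothetical placeholders, not values from the dataset:

```python
def normalize_gripper(angle_deg, min_deg=0.0, max_deg=80.0):
    """Map a raw gripper opening angle to (0, 1).

    min_deg/max_deg are hypothetical hardware calibration limits;
    the actual Stick calibration may differ.
    """
    x = (angle_deg - min_deg) / (max_deg - min_deg)
    # Clip so out-of-range readings still land in (0, 1)
    return min(max(x, 0.0), 1.0)
```

With the placeholder limits above, a half-open gripper at 40 degrees maps to 0.5, and readings outside the range saturate at 0 or 1.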
Model
Home Pretrained Representations (HPR)
Home Pretrained Representations (HPR) is a ResNet-34 model pre-trained on the HoNY dataset with the MoCo-v3 self-supervised learning objective. We use it to initialize a robot policy for a new task in a novel environment.
During deployment, we used HPR to initialize a policy whose trunk was simply our pretrained ResNet-34 followed by two linear layers on top.
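The policy structure described above (a frozen visual trunk feeding two linear layers that regress an action) can be sketched with plain NumPy matrix multiplies. The hidden width, the ReLU nonlinearity, and the 7-dimensional action (6D gripper pose plus 1 gripper opening) are illustrative assumptions, not the paper's exact head dimensions:

```python
import numpy as np

def policy_head(features, w1, b1, w2, b2):
    """Two linear layers on top of HPR (ResNet-34) features.

    Sketch under assumed shapes: 512-d trunk features -> 256-d hidden
    -> 7-d action (6D gripper pose + gripper opening).
    """
    h = np.maximum(features @ w1 + b1, 0.0)  # ReLU between the layers (assumption)
    return h @ w2 + b2

# Random weights just to exercise the shapes
rng = np.random.default_rng(0)
feat = rng.standard_normal(512)              # stand-in for ResNet-34 pooled features
w1, b1 = rng.standard_normal((512, 256)), np.zeros(256)
w2, b2 = rng.standard_normal((256, 7)), np.zeros(7)
action = policy_head(feat, w1, b1, w2, b2)   # shape (7,)
```

In practice the trunk and head would be trained jointly on the ~20 minutes of demonstrations collected for the new task, with the trunk initialized from HPR.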
🤗 Get the model at Huggingface
Or, if you are using 🤗 PyTorch Image Models (timm), you can start using it in a couple of lines:
import timm

model = timm.create_model("hf-hub:notmahi/dobb-e", pretrained=True)
Paper
On Bringing Robots Home
@article{shafiullah2023bringing,
title={On bringing robots home},
author={Shafiullah, Nur Muhammad Mahi and Rai, Anant and Etukuru, Haritheja and Liu, Yiqian and Misra, Ishan and Chintala, Soumith and Pinto, Lerrel},
journal={arXiv preprint arXiv:2311.16098},
year={2023}
}
Code