MidasTouch

Monte-Carlo inference over distributions across sliding touch

Sudharshan Suresh  •  Zilin Si  •  Stuart Anderson  •  Michael Kaess  •  Mustafa Mukadam

6th Annual Conference on Robot Learning (CoRL), 2022

Website  •  Paper  •  Presentation  •  YCB-Slide

TL;DR: We track the pose distribution of a robot finger on an object’s surface using geometry captured by a tactile sensor


MidasTouch performs online global localization of a vision-based touch sensor on an object surface during sliding interactions. For details and further results, refer to our website and paper.
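To make the idea concrete, here is a minimal sketch of one Monte-Carlo filtering step. It assumes hypothetical helpers embed(image) (a tactile code from a TDN/TCN-style network) and codebook.score(code, particles) (similarity of the live code against codes pre-rendered at candidate poses), and treats poses as flat arrays rather than surface-constrained transforms; it is illustrative only, not the repository's API.

import numpy as np

def filter_step(particles, weights, motion, tactile_image, embed, codebook):
    # Predict: propagate particles with the sensor's motion plus diffusion noise
    particles = particles + motion + np.random.normal(0.0, 2e-3, particles.shape)
    # Update: weight each particle by how well its pre-rendered tactile code
    # matches the code embedded from the live sensor image
    weights = weights * codebook.score(embed(tactile_image), particles)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses below half the particle count
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = np.random.choice(len(particles), len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights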

Setup

1. Clone repository

git clone git@github.com:facebookresearch/MidasTouch.git
cd MidasTouch
git submodule update --init --recursive

2. Download YCB-Slide dataset

cd YCB-Slide 
chmod +x download_dataset.sh && ./download_dataset.sh
cd ..

3. Download weights/codebooks

chmod +x download_assets.sh && ./download_assets.sh

4. Setup midastouch conda env

sudo apt install build-essential python3-dev libopenblas-dev
conda env create -f environment.yml 
conda activate midastouch
pip install -e .

5. Install PyTorch and the MinkowskiEngine

Follow the conda installation instructions on the NVIDIA MinkowskiEngine webpage.
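Once installed, a quick sanity check (assuming a CUDA-enabled build) is to confirm both libraries import inside the midastouch env:

import torch
import MinkowskiEngine as ME

print(torch.__version__, torch.cuda.is_available())  # expect True on a GPU machine
print(ME.__version__)                                # confirms MinkowskiEngine built correctly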

Run MidasTouch

Run interactive filtering experiments on our YCB-Slide data, from both simulated and real-world tactile interactions. The key=value arguments are Hydra-style config overrides; a sketch of how they compose follows the commands below.

TACTO simulation trajectories

python midastouch/filter/filter.py expt=ycb # default: 004_sugar_box log 0
python midastouch/filter/filter.py expt.obj_model=035_power_drill expt.log_id=3 # 035_power_drill log 3
python midastouch/filter/filter.py expt.off_screen=True   # disable visualization
python midastouch/filter/filter.py expt=mcmaster   # small parts: cotter-pin log 0

Real-world trajectories

python midastouch/filter/filter_real.py expt=ycb # default: 004_sugar_box log 0
python midastouch/filter/filter_real.py expt.obj_model=021_bleach_cleanser expt.log_id=2 # 021_bleach_cleanser log 2
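The overrides above are layered onto the YAML files under midastouch/config. As a rough sketch of the mechanism (the field names mirror the commands above, but the defaults shown are illustrative, not the repo's actual schema):

from omegaconf import OmegaConf

# Base config, as if loaded from a YAML file under midastouch/config
cfg = OmegaConf.create({"expt": {"obj_model": "004_sugar_box", "log_id": 0, "off_screen": False}})
# Command-line overrides such as `expt.log_id=3` are merged on top of the defaults
cli = OmegaConf.from_dotlist(["expt.obj_model=035_power_drill", "expt.log_id=3"])
cfg = OmegaConf.merge(cfg, cli)
print(cfg.expt.obj_model, cfg.expt.log_id)  # 035_power_drill 3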

Codebook live demo

With your own DIGIT, you can simply plug in the sensor and experiment with the image-to-3D and tactile-codes visualizer.

python midastouch/filter/filter_real.py expt.obj_model=025_mug
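Under the hood, visualizing tactile codes amounts to nearest-neighbor queries against a codebook. A minimal sketch, assuming a hypothetical array of unit-normalized codes (N x D) with associated sensor poses:

import numpy as np

def topk_poses(query_code, codes, poses, k=5):
    # Cosine similarity between the live tactile code and every codebook entry
    sims = codes @ (query_code / np.linalg.norm(query_code))
    idx = np.argsort(-sims)[:k]   # indices of the k most similar codes
    return poses[idx], sims[idx]  # candidate contact poses and their scores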

Folder structure

midastouch
├── bash          # bash scripts for filtering, codebook generation
├── config        # hydra config files 
├── contrib       # modified third-party code for TDN, TCN
├── eval          # select evaluation scripts 
├── filter        # filtering and live demo scripts
├── modules       # helper functions and classes
├── render        # DIGIT tactile rendering class
├── tactile_tree  # codebook scripts 
└── viz           # pyvista visualization 

Bibtex

@inproceedings{suresh2022midastouch,
    title     = {{M}idas{T}ouch: {M}onte-{C}arlo inference over distributions across sliding touch},
    author    = {Suresh, Sudharshan and Si, Zilin and Anderson, Stuart and Kaess, Michael and Mukadam, Mustafa},
    booktitle = {Proc. Conf. on Robot Learning, CoRL},
    address   = {Auckland, NZ},
    month     = dec,
    year      = {2022}
}

License

The majority of MidasTouch is licensed under the MIT license; however, portions of the project are available under separate license terms: MinkLoc3D is licensed under the MIT license, FCRN-DepthPrediction under the BSD 2-clause license, and pytorch3d under the BSD 3-clause license. Please see the LICENSE file for more information.

Contributing

We actively welcome your pull requests! Please see CONTRIBUTING.md and CODE_OF_CONDUCT.md for more info.
