---
license: cc-by-nc-4.0
tags:
  - Sparsh-x
  - Digit 360
  - SSL training
---

## Dataset Details

This dataset contains the sequences used for training Sparsh-X, a multisensory touch encoder for the Digit 360 sensor. Sparsh-X fuses multiple touch modalities into a single embedding: tactile images, audio from contact microphones, IMU, and pressure data.

Our Sparsh-X training dataset is generated from two primary sources. The first is an Allegro hand with Digit 360 sensors on its fingertips that performs random motions with objects, such as dipping into a tray filled with various items. The second is a manual picker with the same sensor adapted to its gripping mechanism, used to execute atomic manipulation actions (picking up, sliding, tapping, placing, and dropping objects) against diverse surfaces that vary in roughness, hardness, softness, friction, and texture. An example of the data collection is shown below:

*(Figure: D360 data collection)*

## Dataset Structure and Usage

Each sequence contains a `data.pickle` and a `metadata.yaml`. The pickle file has the following structure, with the raw messages per Digit 360 sensor and tactile modality:

```
├── data.pickle
│   ├── d360_0
│   │   ├── image_raw/compressed  # list of msgs with tactile image @30Hz
│   │   ├── imu_quat_topic        # list of msgs with IMU quaternion data @400Hz
│   │   ├── imu_raw_topic         # list of msgs with raw 3-axis accelerometer data @400Hz
│   │   ├── mic_0                 # list of time-series messages from contact microphone @48kHz
│   │   ├── mic_1                 # list of time-series messages from contact microphone @48kHz
│   │   └── pressure_topic        # list of time-series messages from pressure sensor @200Hz
```

You can load the pickle file as:

```python
import os
import pandas as pd

sequence_path = "<PATH TO SEQUENCE>"
ros_msgs = pd.read_pickle(os.path.join(sequence_path, "data.pickle"))
print(ros_msgs.keys())
```

The output should list the key name for each Digit 360 with recorded data:

```
dict_keys(['d360_0', 'd360_1', 'd360_2', 'd360_3'])
```
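From there, each sensor entry can be explored per modality. Below is a minimal sketch, assuming each `d360_*` entry is a dict keyed by the topic names shown in the tree above, and that compressed image messages carry the encoded bytes in a `.data` field (both are assumptions; adapt to your sequences):

```python
import cv2  # only needed for the optional image decode below
import numpy as np

# Inspect the modality streams recorded for one sensor.
sensor = ros_msgs["d360_0"]
for topic, msgs in sensor.items():
    print(f"{topic}: {len(msgs)} messages")

# Optionally decode the first compressed tactile image; this assumes
# ROS CompressedImage-style messages with the encoded bytes in `.data`.
msg = sensor["image_raw/compressed"][0]
img = cv2.imdecode(np.frombuffer(msg.data, dtype=np.uint8), cv2.IMREAD_COLOR)
print("tactile image shape:", img.shape)
```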

The `metadata.yaml` for each sequence contains useful labels for supervised learning downstream tasks. It describes the object in direct contact with the sensor (e.g., tennis ball), the surface the object is making contact with (e.g., grass), and the action performed during data collection (e.g., tapping). Be aware that not all sequences have all three labels; for instance, some may lack the surface label.
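As a minimal sketch for reading these labels (the exact key names `object`, `surface`, and `action` are assumptions based on the description above):

```python
import os
import yaml

# Load the per-sequence labels; the key names are assumptions, and some
# sequences may omit one or more of them, hence the .get() lookups.
with open(os.path.join(sequence_path, "metadata.yaml")) as f:
    metadata = yaml.safe_load(f)

print("object: ", metadata.get("object"))
print("surface:", metadata.get("surface"))
print("action: ", metadata.get("action"))
```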

Please refer to the Sparsh-X repo for further information about using the dataset for SSL training.

## Citation

If you find this dataset useful for your research, please consider citing:

```bibtex
@inproceedings{higuera2025tactile,
  title={Tactile Beyond Pixels: Multisensory Touch Representations for Robot Manipulation},
  author={Carolina Higuera and Akash Sharma and Taosha Fan and Chaithanya Krishna Bodduluri and Byron Boots and Michael Kaess and Mike Lambeta and Tingfan Wu and Zixi Liu and Francois Robert Hogan and Mustafa Mukadam},
  booktitle={9th Annual Conference on Robot Learning},
  year={2025},
  url={https://openreview.net/forum?id=sMs4pJYhWi}
}

@article{lambeta2024digitizing,
  title={Digitizing touch with an artificial multimodal fingertip},
  author={Lambeta, Mike and Wu, Tingfan and Sengul, Ali and Most, Victoria Rose and Black, Nolan and Sawyer, Kevin and Mercado, Romeo and Qi, Haozhi and Sohn, Alexander and Taylor, Byron and others},
  journal={arXiv preprint arXiv:2411.02479},
  year={2024}
}
```