- 📖 Table of Contents
- 📍 Overview
- 🔑 Key Features
- 📂 Contents
- 🚀 Getting Started
- 📄 License
- 👏 Acknowledgments
- 📋 BibTeX
This repository contains the code and resources related to the paper "Augmenting Tactile Simulators with Real-like and Zero-Shot Capabilities" (under review). The paper introduces SightGAN, a cutting-edge solution for enhancing tactile perception in high-resolution tactile sensors such as the AllSight sensor.
The simulation data was obtained using the allsight_sim package, which utilizes TACTO, a physics-engine simulator for optical-based tactile sensors.
For more information about the simulation package, please see the link provided above.
A bi-directional Generative Adversarial Network built upon CycleGAN and designed to bridge the reality gap between simulated and real tactile data, particularly for high-resolution tactile sensors.
SightGAN introduces contact-specific consistency losses:
- Spatial Contact Consistency loss
- Pixel-wise Contact Region Consistency loss
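Below is a minimal, illustrative PyTorch sketch of how these two terms could be written. It is not the exact formulation used in `diff_cycle_gan_model.py`: the frozen contact-position regressor, the binary contact-region mask, and the exact loss shapes are assumptions made for illustration only.

```python
# Hedged sketch of the two contact-specific consistency losses.
# Assumes a frozen regressor that maps an image batch to (x, y, z) contact
# coordinates and a binary contact-region mask; both are illustrative.
import torch
import torch.nn.functional as F

def spatial_contact_consistency(regressor, source_img, translated_img):
    """Spatial Contact Consistency (illustrative): the contact position
    predicted on the translated image should match the one predicted on
    the source image. The regressor is kept frozen here."""
    with torch.no_grad():
        target_xyz = regressor(source_img)       # (B, 3), assumed output
    pred_xyz = regressor(translated_img)
    return F.mse_loss(pred_xyz, target_xyz)

def pixel_contact_region_consistency(source_img, translated_img, contact_mask):
    """Pixel-wise Contact Region Consistency (illustrative): L1 difference
    restricted to an assumed binary contact-region mask."""
    diff = torch.abs(translated_img - source_img) * contact_mask
    return diff.sum() / contact_mask.sum().clamp(min=1.0)
```

In a CycleGAN-style setup, such terms would be added to the generator objective so that translated images preserve the contact location encoded in the source image.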
The GAN diagram and training process can be represented as follows:
This repository contains several directories, detailed as follows:
Root
| File | Summary |
|---|---|
| requirements.txt | Dependencies file |
| train.py | Train the GAN model |
| test.py | Test the GAN model |
| train_regressor.py | Train the regressor model only |
| train_regressor_finetune.py | Train a pre-trained regressor model with additional data |
Options
Python files forked from the CycleGAN repo.
Models
| File | Summary |
|---|---|
| diff_cycle_gan_model.py | The class of the SightGAN model with its auxiliary losses |
All remaining files were forked from CycleGAN and help with building and managing the models during the training and test procedures.
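As a rough illustration of where the auxiliary terms enter, the sketch below folds them into a standard CycleGAN-style generator objective. The weighting coefficients are placeholders, not the values used in `diff_cycle_gan_model.py`.

```python
# Hedged sketch: combining the contact-specific terms with the usual
# CycleGAN generator objective. The lambda weights are illustrative.
def total_generator_loss(gan_terms, cycle_terms, spatial_term, pixel_term,
                         lambda_cycle=10.0, lambda_spatial=1.0, lambda_pixel=1.0):
    return (sum(gan_terms)                      # adversarial losses (both directions)
            + lambda_cycle * sum(cycle_terms)   # cycle-consistency losses
            + lambda_spatial * spatial_term     # spatial contact consistency
            + lambda_pixel * pixel_term)        # pixel-wise contact region consistency
```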
Data_process
| File | Summary |
|---|---|
| merge_json_sim.py | Merges all simulation datasets into the main JSON database file |
| sim2gan_json.py | Updates the JSON file with the generated image paths |
| filter_real_images.py | Filters the real data and creates a JSON database file |
| transfer_images.py | Transfers images from the sim and real databases to the train/test folders |
| add_diff_frame.py | Updates the JSON file with the subtracted (difference) images |
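For orientation, here is a hedged sketch of what a merge step like `merge_json_sim.py` might look like. The directory layout, file naming, and the assumption that each per-run JSON is a dict keyed by frame are hypothetical; consult the actual script and adjust paths before use.

```python
# Hedged sketch (schema and paths are assumptions, not the actual layout):
# merge several per-run simulation JSON files into one database file.
import json
from pathlib import Path

def merge_sim_json(run_dir: str, out_path: str) -> None:
    merged = {}
    for json_file in sorted(Path(run_dir).glob("*.json")):
        with open(json_file) as f:
            merged.update(json.load(f))  # keys assumed unique per frame
    with open(out_path, "w") as f:
        json.dump(merged, f, indent=2)
```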
Train_allsight_regressor
A Python package for training the spatial contact estimator; it also defines the model classes that the rest of the project builds on.
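To give a sense of what a spatial contact estimator looks like, the sketch below shows a minimal CNN regressor that maps a tactile image to a 3D contact position. The actual package likely uses a different (e.g., pretrained) backbone; this architecture is illustrative only.

```python
# Hedged sketch of a contact-position regressor; not the package's model.
import torch.nn as nn

class ContactRegressor(nn.Module):
    def __init__(self, out_dim: int = 3):  # (x, y, z) contact position
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, out_dim)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))
```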
Util
Python files forked from the CycleGAN repo.
Dependencies
Please ensure you have the following dependencies installed on your system.
The project was tested on:
- Ubuntu 18/20
- python >= 3.8
- Clone the allsight_sim2real repository:
```
git clone https://github.com/RobLab-Allsight/allsight_sim2real
```
- Change to the project directory:
```
cd allsight_sim2real
```
- Install the dependencies:
```
pip install -r requirements.txt
```
NOTE: Be aware of the path adjustments needed to adapt the scripts to your custom dataset.
NOTE: Please be aware of the argument adjustments needed for every script.
Assuming you have a dataset folder and have updated the package with the relevant paths to your dataset:
Filter the real data:
```
python3 data_process/filter_real_images.py
```
Merge sim data:
```
python3 data_process/merge_json_sim.py
```
Add diff frame:
```
python3 data_process/add_diff_frame.py
```
Transfer images to GAN folders:
```
python3 data_process/transfer_images.py
```
Train CycleGAN:
```
python3 train.py <arguments>
```
This repository is licensed under the MIT License. Feel free to use, modify, and distribute the code as per the terms of this license.
@misc{azulay2023augmenting,
title={Augmenting Tactile Simulators with Real-like and Zero-Shot Capabilities},
author={Osher Azulay and Alon Mizrahi and Nimrod Curtis and Avishai Sintov},
year={2023},
eprint={2309.10409},
archivePrefix={arXiv},
primaryClass={cs.RO}
}



