CIVIL

[PDF] [Project Page] [Video] License: MIT

This repository contains the implementation of CIVIL, a framework that enables real robots to learn from multimodal visual-language instruction data via imitation learning.

Getting Started

Set Up DEVA

We use DEVA (Tracking Anything with DEVA) to perform visual tracking on offline data. Set it up by following the instructions in the official repository: Tracking-Anything-with-DEVA

⚠️ Make sure DEVA is installed and working before proceeding to install the environment.
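A quick way to confirm that DEVA is importable before continuing is a minimal Python check. This is only a sketch: the module name `deva` is an assumption based on the official repository layout, so adjust it if your install uses a different name.

```python
import importlib.util

# Look up the DEVA package on the current Python path without importing it.
# The module name "deva" is assumed; change it if your installation differs.
spec = importlib.util.find_spec("deva")
print("DEVA found" if spec else "DEVA not installed")
```

If this prints "DEVA not installed", revisit the DEVA setup instructions before creating the conda environment.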

Install the Environment

Use the provided environment.yml file to create the conda environment:

conda env create -f environment.yml
conda activate CIVIL

Prepare Simulation Dataset

To work with the simulation dataset, refer to the instructions provided by CALVIN.

To generate CALVIN data:

  • Use data_generation/generate_calvin.py to create marker data.
  • Use add_segmentation_calvin.py to generate segmentation masks.

Real-World Dataset Sample

A sample of the real-world dataset we used with Panda robots is available under:

panda_data_example/

For further documentation on training scripts, experiment setup, and user study evaluations, refer to the relevant scripts in the repository.

Check out robot rollouts with CIVIL on our website.
