NeurIPS23 "Flow Factorized Representation Learning"
T-PAMI25 "Unsupervised Representation Learning from Sparse Transformation Analysis"
(Figures: latent flow visualizations on MNIST, Shapes3D, Falcol3D, and Isaac3D.)
CalMS

(Figures: Latent Flow 1, Latent Flow 2, and Latent Flow 3 on CalMS.)
CityScape

(Figures: Latent Flow 1, Latent Flow 2, and Latent Flow 3 on CityScape.)
Illustration of our flow factorized representation learning: at each point in the latent space we have a distinct set of tangent directions ∇u_k that define the different transformations we would like to model in the image space. Along each path, the latent sample evolves to the target on the potential landscape following dynamic optimal transport.
Depiction of our model in plate notation. (Left) Supervised; (Right) weakly supervised. White nodes denote latent variables, shaded nodes denote observed variables, solid lines denote the generative model, and dashed lines denote the approximate posterior. As in a standard VAE framework, our model approximates the initial one-step posterior p(z0|x0), but it additionally approximates the conditional transition distribution p(zt|zt−1, k) through dynamic optimal transport over a potential landscape.
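The latent evolution described above can be sketched in a few lines of PyTorch. Everything below (the PotentialMLP class, the evolve helper, step count, and dimensions) is an illustrative assumption for exposition, not the repository's actual API: a scalar potential u_k is parameterized by an MLP, and the latent sample follows its gradient for a number of Euler steps, mimicking the transition p(zt|zt−1, k).

```python
import torch
import torch.nn as nn

class PotentialMLP(nn.Module):
    """Scalar potential u_k(z); its gradient defines one latent flow field.
    (Illustrative sketch -- not the repository's actual API.)"""
    def __init__(self, latent_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z):
        return self.net(z).squeeze(-1)  # one scalar potential per sample

def evolve(z0, potential, steps=8, dt=0.1):
    """Move latent samples along grad u_k for `steps` Euler steps,
    mimicking the transition p(z_t | z_{t-1}, k)."""
    z = z0
    traj = [z]
    for _ in range(steps):
        z = z.detach().requires_grad_(True)
        u = potential(z).sum()
        (grad,) = torch.autograd.grad(u, z)
        z = z + dt * grad  # z_t = z_{t-1} + dt * grad u_k(z_{t-1})
        traj.append(z.detach())
    return torch.stack(traj)  # (steps + 1, batch, latent_dim)

pot = PotentialMLP(latent_dim=16)
traj = evolve(torch.randn(4, 16), pot)
print(traj.shape)  # torch.Size([9, 4, 16])
```

In the full model the potential additionally satisfies a Hamilton–Jacobi-style constraint so that the paths realize dynamic optimal transport; the sketch only shows the gradient-flow mechanics.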
Overview of Sparse Transformation Analysis (STA).
First, clone the repository and navigate into it:
```shell
git clone https://github.com/KingJamesSong/latent-flow.git
cd latent-flow
```

We recommend setting up a new conda environment for this project. You can do this using the following commands:
```shell
conda create --name latent-flow-env python=3.11
conda activate latent-flow-env
```

Next, install the necessary dependencies. This project requires PyTorch; you can find the installation instructions on the PyTorch setup page.
After installing PyTorch, install the remaining dependencies from the requirements.txt file:
```shell
pip install -r requirements.txt
```

For development purposes, you may also want to install the dependencies listed in requirements_dev.txt:
```shell
pip install -r requirements_dev.txt
```

It is recommended to set your IDE's autoformatter to black and to enable "format on save".
Finally, install the package itself. If you plan on modifying the code, install it in editable mode using the -e option:
```shell
pip install -e .
```

This will allow your changes to be immediately reflected in the installed package.
The code assumes that all datasets are placed in the ./data folder. This folder will be created automatically if necessary.
However, if you'd like to use a different folder for your datasets, you can create a symbolic link to that folder. This can be done using the following commands:
For Unix-based systems (Linux, macOS), use the ln command:

```shell
ln -s /path/to/your/dataset/folder ./data
```

This command creates a symbolic link named ./data that points to /path/to/your/dataset/folder.
For Windows systems, use the mklink command:

```shell
mklink /D .\data C:\path\to\your\dataset\folder
```

This command creates a symbolic link named .\data that points to C:\path\to\your\dataset\folder.
Please replace /path/to/your/dataset/folder and C:\path\to\your\dataset\folder with the actual path to your dataset folder.
Please check the scripts folder for the training and evaluation code.
Please check latent_flow.py for a minimal training example using the spike-and-slab prior.
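For intuition about the prior, a spike-and-slab distribution mixes a near-zero "spike" with a broad "slab", so most transformation codes are inactive at any time. The sampler below is a hand-rolled illustration with made-up parameter names, not the repository's implementation:

```python
import torch

def sample_spike_and_slab(n, dim, pi=0.2, slab_std=1.0, spike_std=1e-3):
    """Draw n codes: each coordinate is 'active' (broad slab) with
    probability pi, otherwise near zero (spike). Illustrative only."""
    mask = (torch.rand(n, dim) < pi).float()   # Bernoulli activity gates
    slab = slab_std * torch.randn(n, dim)      # broad Gaussian component
    spike = spike_std * torch.randn(n, dim)    # near-zero component
    return mask * slab + (1 - mask) * spike

z = sample_spike_and_slab(1000, 8)
print(z.shape)  # torch.Size([1000, 8])
print((z.abs() > 0.1).float().mean().item())  # roughly a pi-sized active fraction
```

This sparsity is what lets only a few transformations act on a sequence at once.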
We pre-define 3 latent flow fields; you just need to plug in your own MLP for each of them.
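As a sketch of how an MLP slots in, each transformation k can get its own scalar potential whose gradient gives the latent velocity for flow k. The class and method names below (FlowBank, velocity) are hypothetical placeholders, not the repository's API:

```python
import torch
import torch.nn as nn

K = 3  # number of pre-defined latent flow fields

class FlowBank(nn.Module):
    """One MLP potential per transformation k; swap in your own MLP
    architecture here. Names and shapes are illustrative only."""
    def __init__(self, latent_dim, hidden=64):
        super().__init__()
        self.potentials = nn.ModuleList(
            nn.Sequential(nn.Linear(latent_dim, hidden), nn.Tanh(),
                          nn.Linear(hidden, 1))
            for _ in range(K)
        )

    def velocity(self, z, k):
        """Flow field k: gradient of the k-th scalar potential."""
        z = z.detach().requires_grad_(True)
        u = self.potentials[k](z).sum()
        (v,) = torch.autograd.grad(u, z, create_graph=self.training)
        return v

bank = FlowBank(latent_dim=16)
v = bank.velocity(torch.randn(5, 16), k=1)
print(v.shape)  # torch.Size([5, 16])
```

Any MLP with a scalar output works as a drop-in replacement for the inner nn.Sequential.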
If you find the code helpful for your research, please consider citing our papers:
```bibtex
@inproceedings{song2023flow,
  title={Flow Factorized Representation Learning},
  author={Song, Yue and Keller, Andy and Sebe, Nicu and Welling, Max},
  booktitle={NeurIPS},
  year={2023}
}

@article{song2024unsupervised,
  title={Unsupervised Representation Learning from Sparse Transformation Analysis},
  author={Song, Yue and Keller, Thomas Anderson and Yue, Yisong and Perona, Pietro and Welling, Max},
  journal={arXiv preprint arXiv:2410.05564},
  year={2024}
}
```
If you have any questions or suggestions, please feel free to contact me via yue.song@unitn.it.