
[ICCV 2025] StrandHead: Text to Hair-Disentangled 3D Head Avatars Using Human-Centric Priors

Xiaokun Sun, Zeyu Cai, Ying Tai, Jian Yang, Zhenyu Zhang
Nanjing University
*Corresponding Author

ArXiv | Project Page | Gallery


(Teaser figure)

🔨 Installation

Tested on Ubuntu 20.04, Python 3.8, NVIDIA A6000, CUDA 11.7, and PyTorch 2.0.0. Follow the steps below to set up the environment.

  1. Clone the repo:
git clone https://github.com/XiaokunSun/StrandHead.git
cd StrandHead
  2. Create a conda environment:
conda create -n strandhead python=3.8 -y
conda activate strandhead
  3. Install dependencies (a quick environment sanity check is sketched after the directory layout in step 5):
pip install torch==2.0.0 torchvision==0.15.1 torchaudio==2.0.1
pip install -r requirements.txt
pip install git+https://github.com/openai/CLIP.git
pip install git+https://github.com/ashawkey/envlight.git
pip install git+https://github.com/NVlabs/nvdiffrast.git --no-build-isolation
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
pip install git+https://github.com/KAIR-BAIR/nerfacc.git@v0.5.2
pip install git+https://github.com/ashawkey/cubvh --no-build-isolation
conda install https://anaconda.org/pytorch3d/pytorch3d/0.7.5/download/linux-64/pytorch3d-0.7.5-py38_cu117_pyt200.tar.bz2 # Note: Please ensure the pytorch3d version matches your CUDA and Torch versions
  4. Download models:
mkdir ./pretrained_models
bash ./scripts/download_humannorm_models.sh
  5. Download other models (e.g., FLAME, Tets) from Google Drive. Make sure you have the following models:
StrandHead
|-- load
    |-- flame_models
        |-- flame
            |-- closed_woeyes_flame_faces.npy
            |-- face_flame_faces.npy
            |-- ...
            |-- NHC_scalp_vertex_idx.pth
            |-- woeyes_flame_index.npy
    |-- strandhead
        |-- init_NHC
            |-- A afro
            |-- A short afro
            |-- ...
            |-- strands00372
            |-- strands00384
        |-- data_dict.json
        |-- haar_head.obj
        |-- init_NHC_dict.json
        |-- strand_ckpt.pth
        |-- USC_head.obj
    |-- tets
        |-- 256_tets.npz
    |-- prompt_library.json
|-- pretrained_models
    |-- controlnet-normal-sd1.5
    |-- depth-adapted-sd1.5
    |-- normal-adapted-sd1.5
    |-- normal-aligned-sd1.5
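
Before moving on, it may be worth confirming that the compiled dependencies built correctly. The snippet below is a minimal sanity check, not part of the repo: it assumes the versions pinned in the steps above and simply imports torch and pytorch3d, then allocates an nvdiffrast CUDA rasterization context, which fails fast if the CUDA extension did not build.

# sanity_check.py -- illustrative environment check, not an official script
import torch

print("torch:", torch.__version__)             # expect 2.0.0
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

import pytorch3d
print("pytorch3d:", pytorch3d.__version__)     # expect 0.7.5

import nvdiffrast.torch as dr
glctx = dr.RasterizeCudaContext()              # raises if the CUDA extension failed to build
print("nvdiffrast CUDA context OK")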

🕺 Inference

# Generate bald head
python ./scripts/generate_bald_head.py --dict_path ./load/strandhead/data_dict.json --bald_head_exp_root_dir ./outputs/bald_head --bald_head_idx 0:1:1 --gpu_idx 0
# Generate strand
python ./scripts/generate_strand.py --dict_path ./load/strandhead/data_dict.json --bald_head_exp_root_dir ./outputs/bald_head --hair_head_exp_root_dir ./outputs/strand_head --idx 0:1:1 --gpu_idx 0
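
Note that the two commands must run in this order: generate_strand.py reads the bald-head results from --bald_head_exp_root_dir. To batch several heads or GPUs, a small driver along the following lines may help. It is a sketch, not an official script, and it assumes the index arguments ("0:1:1") follow a start:stop:step convention; check the scripts' argument parsing before relying on that.

# run_pipeline.py -- hedged batch driver sketch, not an official script
import subprocess

DICT = "./load/strandhead/data_dict.json"
BALD = "./outputs/bald_head"
HAIR = "./outputs/strand_head"

for gpu, idx in [(0, "0:1:1")]:  # extend with more (gpu, index-slice) pairs to batch
    # Stage 1: bald head must finish before the strand stage can read its outputs
    subprocess.run([
        "python", "./scripts/generate_bald_head.py",
        "--dict_path", DICT,
        "--bald_head_exp_root_dir", BALD,
        "--bald_head_idx", idx,
        "--gpu_idx", str(gpu),
    ], check=True)
    # Stage 2: strand generation, conditioned on the bald-head results
    subprocess.run([
        "python", "./scripts/generate_strand.py",
        "--dict_path", DICT,
        "--bald_head_exp_root_dir", BALD,
        "--hair_head_exp_root_dir", HAIR,
        "--idx", idx,
        "--gpu_idx", str(gpu),
    ], check=True)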

🪄 Application

# Hairstyle Editing
python ./scripts/edit_strand_color.py
python ./scripts/edit_strand_cuvr.py
python ./scripts/edit_strand_length.py
python ./scripts/edit_strand_num.py
# Hairstyle Transfer
python ./scripts/transfer_hair.py

If you wish to customize the initial hairstyle, please refer to our initialization approaches for hairstyles from the USC-HairSalon dataset and those generated by HAAR, as implemented in ./scripts/init_NHC_USC.py and ./scripts/init_NHC_HAAR.py, respectively. Be sure to update the configuration files ./load/strandhead/data_dict.json and ./load/strandhead/init_NHC_dict.json accordingly.
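For example, registering a new initialization might look like the sketch below. This is illustrative only: the key name ("my custom braid") and the flat name-to-path mapping are hypothetical, so inspect the existing entries in both JSON files and mirror their actual schema.

# add_custom_hairstyle.py -- illustrative only, not an official script.
# The key name and flat mapping assumed here are hypothetical; open
# data_dict.json / init_NHC_dict.json and copy the real entry structure.
import json

def register(path, key, value):
    with open(path) as f:
        entries = json.load(f)
    entries[key] = value  # assumes a flat JSON object; verify against the file
    with open(path, "w") as f:
        json.dump(entries, f, indent=2)

# Hypothetical entry produced by ./scripts/init_NHC_USC.py or init_NHC_HAAR.py
register("./load/strandhead/init_NHC_dict.json",
         "my custom braid",
         "./load/strandhead/init_NHC/my custom braid")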

⭐ Acknowledgements

This repository is based on many amazing research works and open-source projects: ThreeStudio, HumanNorm, NeuralHaircut, HAAR, etc. Thanks to all the authors for their selfless contributions to the community!

📚 Citation

If you find this repository helpful for your work, please consider citing it as follows:

@inproceedings{sun2025strandhead,
  title={StrandHead: Text to Hair-Disentangled 3D Head Avatars Using Human-Centric Priors},
  author={Sun, Xiaokun and Cai, Zeyu and Tai, Ying and Yang, Jian and Zhang, Zhenyu},
  booktitle={ICCV},
  year={2025}
}
