This repository contains the code for the NeurIPS 2021 paper: Controlled Text Generation as Continuous Optimization with Multiple Constraints
- pytorch >= 1.6
- transformers >= 4.5.1
- (optional for some constraints) sentence-transformers
- (optional for some constraints) POT (Python Optimal Transport)
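The dependencies above can be installed with pip. This is a sketch, assuming the packages go by their usual PyPI names; pin versions as needed for your setup.

```shell
pip install "torch>=1.6" "transformers>=4.5.1"
# optional, only needed for some constraints
pip install sentence-transformers POT
```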
The main entry point for the decoding algorithm is `decode.py`. All models used in this code are loaded via Hugging Face `transformers`.
See the examples in this repository for sample usage.
This code currently supports the following losses:
- Sentence Classification (Cross Entropy)
- Semantic Similarity (cosine similarity, Word Mover's Distance between representations)
- Conditional generation losses (MarianMT, GPT2)
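Each of the losses above is a differentiable scalar computed on model outputs, which is what lets decoding proceed by continuous optimization. The sketch below illustrates the general shape of two such losses in plain PyTorch; it is not the repository's implementation, and the function names (`classification_loss`, `similarity_loss`) are illustrative only.

```python
import torch
import torch.nn.functional as F

def classification_loss(logits, target_label):
    # Cross-entropy toward a desired class label (e.g. sentiment),
    # differentiable w.r.t. the logits.
    return F.cross_entropy(logits, target_label)

def similarity_loss(src_emb, out_emb):
    # 1 - cosine similarity between sentence representations,
    # so that more similar outputs incur lower loss.
    return 1.0 - F.cosine_similarity(src_emb, out_emb, dim=-1).mean()

# Toy example: gradients flow back to the (relaxed) output representation.
logits = torch.randn(1, 2, requires_grad=True)
loss = classification_loss(logits, torch.tensor([1]))
loss.backward()
```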
To add more losses/constraints, follow the examples in `mucoco/losses/`.
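For the exact base-class interface, consult the files in `mucoco/losses/`; the sketch below only conveys the minimal contract a new constraint has to satisfy, namely being a callable that returns a differentiable scalar. The class and method names here (`CustomLoss`, `forward`) are hypothetical, not the repository's actual API.

```python
import torch

class CustomLoss(torch.nn.Module):
    """Hypothetical constraint: penalize outputs whose mean embedding
    drifts from a fixed reference vector."""

    def __init__(self, reference):
        super().__init__()
        self.reference = reference

    def forward(self, output_embeddings):
        # Differentiable scalar: squared L2 distance of mean embeddings.
        return ((output_embeddings.mean(dim=0) - self.reference) ** 2).sum()

loss_fn = CustomLoss(torch.zeros(4))
out = torch.randn(3, 4, requires_grad=True)
loss = loss_fn(out)
loss.backward()
```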
The source code is licensed under the MIT License; see the LICENSE.md file.