On the Effectiveness of Sentence Encoding for Intent Detection Meta-Learning

This repository contains the open-sourced official implementation of the paper:

On the Effectiveness of Sentence Encoding for Intent Detection Meta-Learning (NAACL 2022).
Tingting Ma, Qianhui Wu, Zhiwei Yu, Tiejun Zhao and Chin-Yew Lin

If you find this repo helpful, please cite the following paper

@inproceedings{ma-etal-2022-intentemb,
    title = {On the Effectiveness of Sentence Encoding for Intent Detection Meta-Learning},
    author = {Tingting Ma and Qianhui Wu and Zhiwei Yu and Tiejun Zhao and Chin-Yew Lin},
    booktitle = {2022 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022)},
    year = {2022},
    publisher = {Association for Computational Linguistics},
    url = {https://openreview.net/pdf?id=SzGx4ZQfHZq}
}

For any questions/comments, please feel free to open a GitHub issue or email the first author directly.

Requirements

pip install -r requirements.txt 

The original experiments were conducted with Python 3.7, numpy 1.21.0, and sentencepiece 0.1.94; note that these versions now have known security vulnerabilities. If you run into problems, try reverting to those versions.

Download datasets

For convenience, you can download the intent dataset splits from this repo and put the downloaded data into the data folder. Then run

python gen_labelname.py

Download models

The SP-para. model needs to be downloaded manually by running

wget http://www.cs.cmu.edu/~jwieting/paraphrase-at-scale-english.zip
unzip paraphrase-at-scale-english.zip

Experiments

1. Evaluate the sentence embedding

bash scripts/eval_emb.sh

2. Apply the label name trick

bash scripts/eval_emb_label.sh

3. Evaluate ProtoNet, ProtAugment

Please follow the code released by the authors of ProtAugment to train the model, save your trained model as encoder.pkl, and then evaluate it with our scripts by replacing the output_path argument.
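A minimal sketch of the serialization step above, assuming a pickle-compatible encoder object (the `DummyEncoder` class here is a hypothetical stand-in for whatever encoder ProtAugment's training code produces):

```python
import pickle

# Hypothetical placeholder for the encoder trained with ProtAugment's code;
# substitute your actual trained encoder object here.
class DummyEncoder:
    def __init__(self, dim):
        self.dim = dim

encoder = DummyEncoder(dim=768)

# The evaluation scripts expect the trained model saved as encoder.pkl.
with open("encoder.pkl", "wb") as f:
    pickle.dump(encoder, f)

# Sanity check: the saved file can be loaded back intact.
with open("encoder.pkl", "rb") as f:
    restored = pickle.load(f)
```

After producing encoder.pkl this way, point the evaluation script's output_path argument at the directory containing it.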

Acknowledgement

This repository leverages open-sourced work from multiple sources. We thank the original authors of the repositories below for sharing their code and models.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.