- This repository contains the code for our paper INSIDE: LLMs' Internal States Retain the Power of Hallucination Detection. Download the paper here
- If you have any questions about our paper or code, please don't hesitate to contact us at chench@zju.edu.cn or ercong.cc@alibaba-inc.com; we will update the repository accordingly
- The dataset can be downloaded here ...
@inproceedings{chen2024inside,
title={INSIDE: LLMs' Internal States Retain the Power of Hallucination Detection},
author={Chen, Chao and Liu, Kai and Chen, Ze and Gu, Yi and Wu, Yue and Tao, Mingyuan and Fu, Zhihang and Ye, Jieping},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024}
}