Inspiration
The inspiration for HumanizerGPT came from the desire to make AI-generated text more human-like and undetectable. We wanted to create a system that could take the output of a GPT-4 model and transform it into something that reads as if it were written by a human, with all the nuances and subtleties that entails.
What it does
HumanizerGPT is an interactive system: a user enters a prompt, GPT-4 generates an initial response, and that response is passed through the Parrot transformer model, trained on the PAWS dataset, which paraphrases and humanizes the text. The result is more natural, human-like text that, in our testing, registered 0% on AI-detection tools.
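The two-stage flow described above can be sketched in a few lines of Python. The function names here are illustrative, not the actual HumanizerGPT code, and both stages are stubbed out rather than calling the real GPT-4 API or Parrot model:

```python
# Minimal sketch of the prompt -> GPT-4 -> paraphraser pipeline.
# generate_draft() and humanize() are hypothetical stand-ins for the
# real GPT-4 call and Parrot paraphrasing step.

def generate_draft(prompt: str) -> str:
    # Stage 1: in the real system this calls the GPT-4 API;
    # stubbed here with a canned response for illustration.
    return f"AI draft answering: {prompt}"

def humanize(draft: str) -> str:
    # Stage 2: in the real system this runs the Parrot paraphraser;
    # stubbed here with a trivial rewording.
    return draft.replace("AI draft answering", "Here's a take on")

def pipeline(prompt: str) -> str:
    # Chain the two stages: generate, then humanize.
    return humanize(generate_draft(prompt))

print(pipeline("Explain recursion"))
```

The key design point is that the two stages are independent: the generator and the humanizer can each be swapped out without touching the other.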
How we built it
We built HumanizerGPT using Vue.js for the frontend, Go for the backend, and Python for the machine-learning components. The GPT-4 API generates the initial text from the user's prompt, which is then passed to the Parrot transformer model for paraphrasing. Parrot is a fine-tuned T5 model, built as a paraphrase-based augmentation framework to speed up training NLU models; its author also released a small library for performing paraphrasing with it. Parrot is trained on the PAWS dataset. The entire system runs autonomously, giving the user a seamless experience.
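For reference, this is roughly how the Parrot library is invoked (the model tag `prithivida/parrot_paraphraser_on_T5` is the published Hugging Face checkpoint; the exact return format is our assumption based on the library's README, and the fallback-to-input behavior here is our own addition so the sketch runs without the library installed):

```python
# Hedged sketch of Parrot paraphrasing; requires the parrot library
# (github.com/PrithivirajDamodaran/Parrot_Paraphraser) for real output.

def paraphrase(text: str) -> str:
    try:
        from parrot import Parrot
        # Load the fine-tuned T5 paraphraser checkpoint.
        parrot = Parrot(model_tag="prithivida/parrot_paraphraser_on_T5")
        # augment() returns candidate paraphrases (assumed (phrase, score) pairs).
        candidates = parrot.augment(input_phrase=text)
        if candidates:
            return candidates[0][0]
    except ImportError:
        # Library not installed: fall back to returning the text unchanged.
        pass
    return text

print(paraphrase("Can you recommend some upscale restaurants in New York?"))
```

In the deployed system the Go backend would call into this Python component after receiving the GPT-4 draft.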
Challenges we ran into
One of the main challenges was ensuring the quality of the paraphrased, humanized text: we had to fine-tune the Parrot model so that the output preserved the original meaning while still sounding natural and human-like. In the end we were unable to integrate the Parrot T5 transformer model into our app, so we fell back on the Paraphrase Genius API, whose output quality was disappointing.
Accomplishments that we're proud of
We're proud of creating a system that can take AI-generated text and make it sound human-like to the point of being undetectable as AI-generated. This opens up a lot of possibilities for applications where the 'AI-ness' of the text can be a drawback.
What we learned
We learned a lot about transformer models and how to fine-tune them for specific tasks. We also learned about integrating different technologies (Vue.js, Go, Python) into a cohesive system.
What's next for HumanizerGPT
The next step for HumanizerGPT is to improve the quality of the humanization process even further. We also plan to explore other applications for our system, such as content generation and chatbots. And of course, we plan to implement the transformer paraphrasing model so the humanizer scores 0% on AI detection every time.

