Inspiration

Many people struggle with research, and we wanted to fix that. So we built a tool that provides a concise, informative starting point for research on any topic.

What it does

It uses UiPath to fetch the definition of almost any topic from Wikipedia, giving the user a general overview. It then scrapes the titles of the most relevant news articles to give them a head start on their research.

How we built it

  1. As mentioned above, a UiPath component retrieves the topic's definition from Wikipedia as a general overview. For that, we built a workflow that steps through the process just as a human would. It is tied into the Python code via the start_wikifetcher.py file, which uses pyperclip along with functions from sentence_tools.py.
  2. sentence_tools.py contains the functions responsible for scraping the most relevant information, such as extract_first_sentence(), which extracts the first sentence of the definition, and get_key_points_google(), which collects the headlines from the first page of Google News results on the topic.
  3. The front end is a single index.html file containing the page's HTML. It is fairly basic and easy for the reader to follow.
  4. The graphics were made in PowerPoint.
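As a rough illustration of the helpers named in step 2, here is a hedged sketch. The function names come from sentence_tools.py, but the bodies are our reconstruction, not the actual implementation — in particular, pulling headlines out of `<h3>` tags is an assumption about Google's markup, which changes often:

```python
import re
from html.parser import HTMLParser


def extract_first_sentence(text: str) -> str:
    """Return the first sentence of a block of text.

    Splits on sentence-ending punctuation followed by whitespace —
    a naive heuristic that works for most Wikipedia lead paragraphs.
    """
    match = re.search(r"(.+?[.!?])(?:\s|$)", text.strip(), re.DOTALL)
    return match.group(1) if match else text.strip()


class _HeadlineParser(HTMLParser):
    """Collect the text content of every <h3> tag on a results page."""

    def __init__(self):
        super().__init__()
        self.headlines = []
        self._in_h3 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self._in_h3 = True
            self.headlines.append("")

    def handle_endtag(self, tag):
        if tag == "h3":
            self._in_h3 = False

    def handle_data(self, data):
        if self._in_h3:
            self.headlines[-1] += data


def parse_headlines(html: str) -> list:
    """Extract headline texts from fetched results-page HTML."""
    parser = _HeadlineParser()
    parser.feed(html)
    return [h.strip() for h in parser.headlines if h.strip()]
```

The sentence splitter will trip over abbreviations like "U.S.", which is one reason a production version would need more careful rules than this sketch.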

Note: Please refer to the README file in the Git repository for a more thorough explanation of the installation process.

Challenges we ran into

  1. Google's request limit, which flags your IP as spam after too many queries.
  2. Working remotely.
  3. Managing teammates.
  4. Setting the unrealistic goal of building an AI that writes essays like a human.

Accomplishments that we're proud of

Overcoming each of the challenges above was an accomplishment in itself, so we address them here in the same order.

  1. We limited scraping to the first page of Google results and ran the search only once, rather than searching every combination of the topic's keywords.
  2. We used Discord calls to stay connected.
  3. We built an effective team by picking teammates selectively.
  4. We made human-like essay generation a goal for the future instead.
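The workaround in point 1 amounts to caching each query and spacing out real requests. A minimal sketch of that idea — the class name, default interval, and requests-style callable are our assumptions, not the project's code:

```python
import time


class ThrottledFetcher:
    """Fetch each URL at most once and enforce a minimum delay
    between real requests, so the search engine never sees a burst."""

    def __init__(self, fetch, min_interval=2.0):
        self.fetch = fetch            # e.g. lambda url: requests.get(url).text
        self.min_interval = min_interval
        self.cache = {}
        self._last = 0.0

    def get(self, url):
        if url in self.cache:         # never repeat a search
            return self.cache[url]
        wait = self.min_interval - (time.time() - self._last)
        if wait > 0:
            time.sleep(wait)          # space out real requests
        self._last = time.time()
        self.cache[url] = self.fetch(url)
        return self.cache[url]
```

Taking fetch as a callable keeps the throttling logic independent of any HTTP library and makes it easy to exercise with a stub.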

What we learned

This project taught us both technical and non-technical skills. We had to familiarize ourselves with numerous APIs and new technologies such as UiPath. On the non-technical side, the project succeeded because the work was divided equitably: everyone worked on their part because they truly wanted to.

What's next for Essay Generator

The next step is to use all the data gathered by web scraping to compose an essay the way a human would, which calls for AI techniques such as machine learning and neural networks. This was too hard to figure out within the 36-hour period, but in the coming years we will try to come up with a solution for this idea.

Built With

Python, UiPath, HTML