Inspiration
We were inspired by the communication breakdowns that often happen with the elderly. We noticed a gap in what the elderly were telling loved ones: many are unwilling to share anything that might make them seem weak or reliant on others. With RUAssist, these users have easy access to a friend without a stake in the situation, someone to talk to and vent to without judgment or preconceived notions. We hope this safe space will encourage users to communicate their problems and fears effectively, improving their mental health.
What it does
When the app launches, the user lands on a screen with two options. The first is to stay on that screen, which has buttons for quick access to emergency contacts (i.e., 911, a family member or guardian, and the user's doctor's office); a single tap prompts the user to immediately call that number. The second is the virtual friend. Tapping the virtual-friend button brings the user to another screen where a chat window opens. The user types a message below, taps the send button, and the message appears on the screen. ChatGPT then answers the message, continuing the conversation.
How we built it
We built the entire app in Xcode, writing the UI in Swift. We started with a simple storyboard, adding the OpenAI dependency to our Podfile and updating the workspace. In the ViewController, we created three UIButtons that, on press, dial a phone number via UIApplication's open(_:) method, and we used several UILabels to design and decorate the app. Next, for the virtual assistant screen, we used a navigation controller to connect the first screen (the original ViewController) to the second screen (secondViewController) without any code, providing a toggle between the two screens. On secondViewController, we display ContentView, the SwiftUI file containing the message screen for the conversation. ContentView is embedded in a hosting controller, which calls ChatViewModel; that model contains the Swift code that calls OpenAI with our API key and fetches a response to the user's message. The model then appends the bot's reply to the messages list, which renders it on screen.
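As a rough sketch of the two pieces described above (the action and segue names here are illustrative, not our exact identifiers), the emergency-contact buttons boil down to opening a tel:// URL, and the chat screen is a SwiftUI ContentView embedded via a hosting controller:

```swift
import UIKit
import SwiftUI

class ViewController: UIViewController {
    // Each emergency-contact button is wired to an IBAction like this one.
    // The number is a placeholder; the real app uses 911, a guardian,
    // and the doctor's office.
    @IBAction func callEmergencyContact(_ sender: UIButton) {
        guard let url = URL(string: "tel://911"),
              UIApplication.shared.canOpenURL(url) else { return }
        // iOS prompts the user to confirm before placing the call.
        // Note: dialing only works on a physical device, not the simulator.
        UIApplication.shared.open(url)
    }
}

class SecondViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Embed the SwiftUI chat screen inside this UIKit screen.
        let host = UIHostingController(rootView: ContentView())
        addChild(host)
        host.view.frame = view.bounds
        view.addSubview(host.view)
        host.didMove(toParent: self)
    }
}
```

The hosting controller is what lets the storyboard-based UIKit flow and the SwiftUI chat view coexist in one app.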
Challenges we ran into
We ran into multiple challenges with the IBAction outlets for the UIButtons on the main screen. Because the actions were originally placed in the AppDelegate, nothing happened on press, and since this was a logical error with no compiler warning, it was hard to trace. After moving the functions into the ViewController, we realized the call feature simply cannot run in the simulator and had to be deployed to an actual iPhone to work. Another huge challenge was getting the chatbot to respond to the chat. The first issue was that the view opening on secondViewController did not match the controller's size, causing errors; this was fixed by inserting a delay so the original screen stayed up long enough for the new screen to adjust its size. But the main problem, after heavy debugging, was OpenAI itself. Because my API key was tied to my OpenAI account, it depended on my billing plan, which had expired without my knowledge. With the right OpenAI API key (and an active account to use the OpenAI APIs), the chatbot was finally able to appear.
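The delay fix above can be sketched with Grand Central Dispatch; the 0.5-second value and the "showChat" segue identifier are illustrative, not our exact code:

```swift
import UIKit

// Deferring presentation by a fraction of a second gives the first
// screen time to finish laying out before the chat screen sizes itself.
DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
    self.performSegue(withIdentifier: "showChat", sender: self)
}
```

A delay like this is a workaround rather than a root-cause fix, but it was enough to keep the screen sizes consistent during the transition.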
Accomplishments that we're proud of
We are very proud of both parts of the app, the chatbot in particular. It was a very finicky process: the multiple classes and structs left so much room for error that it was difficult to debug and pinpoint where exactly an error was coming from. When it deployed properly for the first time, we were very relieved.
What we learned
We learned a lot, not only about app development but also about integrating OpenAI in a new language. I had previously done this in Python, but in Swift it was very different. We also learned about the many packages and libraries available that can make a project's functionality much easier to implement, such as using the OpenAISwift package rather than calling the OpenAI API directly, since much of the JSON formatting needed to call the API is already handled.
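As a minimal sketch of what the OpenAISwift package saves us from writing by hand (exact type and method names vary between package versions, and the key is a placeholder):

```swift
import OpenAISwift

// The package wraps the HTTP request and JSON encoding/decoding
// that a raw OpenAI API call would require.
let client = OpenAISwift(authToken: "YOUR_OPENAI_API_KEY")

// Hypothetical helper: send one user message and hand back the reply text.
func send(_ text: String, completion: @escaping (String) -> Void) {
    let messages = [ChatMessage(role: .user, content: text)]
    client.sendChat(with: messages) { result in
        switch result {
        case .success(let response):
            completion(response.choices?.first?.message.content ?? "")
        case .failure:
            completion("Sorry, something went wrong.")
        }
    }
}
```

With the raw API, each of these calls would need a hand-built URLRequest, an Authorization header, and Codable structs mirroring the request and response JSON.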
What's next for RUAssist
In the future, we may work to give the AI model a more caring, friend-like tone when answering messages, mimicking a conversation with a friend rather than the robotic tone that sometimes comes across from ChatGPT. There are also more features we could add, like a connection to messages so the "friend" can remind the user to take their medicine, and so on.