✍️⚔️ What is Draw Your Sword?

  1. You draw a stick.
  2. You try and hit someone on the head with it.

💭 Inspiration

When you were a kid way back when and you picked up a stick, did you ever imagine that it was a sword? Have you ever painted something in VR and wanted to grab it? If you said yes to either of those questions, you already understand the inspiration behind Draw Your Sword. We just thought it would be very cool to draw a stick and have it BE a sword!

👷 How we built it

We built it with 2 Oculus Quest 2s, 2 Asus workstation laptops, 2 iPads, a couple of mice, a couple of Macs, and a lower-end PC. We decided to develop in Unity.

😥 Challenges we ran into

There were four main challenges in this project:

Firstly, setting up the project (done by Unity veteran Mitchell) proved to be problematic. A lot of the problems stemmed from inconsistencies in the Oculus documentation and from how finicky the tooling was with our hardware. We brought 5 Quests with us and only 2 of them were able to connect to our computers. The Oculus PC software would often crash and disconnect from our headsets, and while developing in Unity the headset would randomly disconnect and we would lose several minutes getting it hooked back up.

With every new Oculus Unity plugin update, they would change the names of some function calls and not update the documentation. We lost so much time trying to figure out how to get passthrough to work.

The next two challenges were real-time mesh generation (done by Joyce and Matthew) and multiplayer networking (done by Shin).

Real-time mesh generation is an aspect we really wanted to explore for this project, and we got it working.

With real-time meshes, there were three components we had to worry about: mesh generation, smooth mesh interpolation, and mesh colliders. We got mesh generation and smooth mesh interpolation done relatively quickly, and when we tested the mesh generation on the computer it all worked fine. However, when we got to testing it in VR, things started breaking...

It turned out we had forgotten to remove one of the variables in the vectors we were using to calculate the vertices from the triangles, and once we fixed that, generation started working smoothly!
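To give a sense of the smooth-interpolation step mentioned above, here's a minimal sketch of the kind of smoothing pass you might run over a drawn stroke before turning it into a mesh. This is an illustrative example rather than our exact code; the class and method names are made up for the sketch.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative helper: smooths a stroke of sampled controller positions by
// averaging each interior point with its neighbors before meshing it.
public static class StrokeSmoothing
{
    public static List<Vector3> Smooth(List<Vector3> points)
    {
        var smoothed = new List<Vector3>(points.Count);
        for (int i = 0; i < points.Count; i++)
        {
            // Keep the endpoints unchanged so the stroke doesn't shrink.
            if (i == 0 || i == points.Count - 1)
                smoothed.Add(points[i]);
            else
                smoothed.Add((points[i - 1] + points[i] + points[i + 1]) / 3f);
        }
        return smoothed;
    }
}
```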

The next big issue came when we wanted to pick up the meshes we drew. The default mesh collider that comes with Unity didn't work for us, so we resorted to writing our own collider code with a bunch of box colliders, but we still couldn't reliably pick up the mesh.
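For illustration, the box-collider approach looks roughly like this: place one BoxCollider along each segment of the drawn stroke. This is a hedged sketch with made-up names, not our actual implementation, and it assumes the stroke points are already in world space.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: approximate a drawn stroke's collision by placing a
// BoxCollider along each segment between consecutive points, so the mesh
// can be grabbed without relying on a non-convex MeshCollider.
public class StrokeColliderBuilder : MonoBehaviour
{
    public float thickness = 0.02f; // rough radius of the drawn stroke, in meters

    public void Build(List<Vector3> points)
    {
        for (int i = 0; i < points.Count - 1; i++)
        {
            Vector3 a = points[i];
            Vector3 b = points[i + 1];

            // One child object per segment, oriented from a to b.
            var segment = new GameObject("StrokeSegment");
            segment.transform.SetParent(transform);
            segment.transform.position = (a + b) * 0.5f;
            segment.transform.rotation = Quaternion.FromToRotation(Vector3.forward, (b - a).normalized);

            var box = segment.AddComponent<BoxCollider>();
            box.size = new Vector3(thickness, thickness, Vector3.Distance(a, b));
        }
    }
}
```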

While work was being done on the meshing systems, great progress was being made on multiplayer. Our multiplayer uses Photon's peer-to-peer networking and runs on either standalone or tethered VR.

The main issue we had with multiplayer was replicating the custom meshes players would draw (i.e. player A needs to see what player B makes). It's very difficult with Photon to update GameObjects/prefabs after they've been instantiated: you can instantiate a game object for all players, but out of the box you only get its position and rotation synced. Passing additional modifications into the prefab was a very big hurdle for us.
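As a rough idea of the direction we were exploring, here's a sketch of how one might push a drawn stroke's points to the other player with a Photon PUN RPC, flattening the points into a float array that Photon can serialize. The class, the method names, and the commented-out BuildMeshFromPoints helper are illustrative, not our shipped code.

```csharp
using Photon.Pun;
using UnityEngine;

// Illustrative sketch: send a drawn stroke's vertex positions to the other
// player via an RPC so they can rebuild the same mesh on their side.
public class StrokeSync : MonoBehaviourPun
{
    public void SendStroke(Vector3[] points)
    {
        // Flatten Vector3s into a float[] for network serialization.
        var flat = new float[points.Length * 3];
        for (int i = 0; i < points.Length; i++)
        {
            flat[i * 3]     = points[i].x;
            flat[i * 3 + 1] = points[i].y;
            flat[i * 3 + 2] = points[i].z;
        }
        photonView.RPC(nameof(ReceiveStroke), RpcTarget.Others, flat);
    }

    [PunRPC]
    private void ReceiveStroke(float[] flat)
    {
        var points = new Vector3[flat.Length / 3];
        for (int i = 0; i < points.Length; i++)
        {
            points[i] = new Vector3(flat[i * 3], flat[i * 3 + 1], flat[i * 3 + 2]);
        }
        // A hypothetical helper would regenerate the same mesh locally here:
        // BuildMeshFromPoints(points);
    }
}
```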

Ultimately, we decided to drop custom meshing from multiplayer, since we already had significant multiplayer gameplay functionality implemented.

In terms of design, our designer (mainly Nancy) found that there was a huge lack of references out there for other passthrough games. Passthrough is in black and white, so she had to think outside the box to make the game visually appealing and to keep items visible within the game space. Some questions we thought about together were which elements would be better in 3D versus 2D, such as connection screens or win/lose menus. Did we even want to deal with menus?

Something else we all thought about was the win/lose condition itself: what counts as a win? It quickly became apparent that the only body part we could reliably track was the head, so we settled on a player tapping an object on the other player's head. That should lead to interesting gameplay where people make convoluted weapons to hit someone else's (virtual) head.
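As a rough sketch (with a hypothetical tag name and a commented-out game-manager call), that win check boils down to a trigger on the drawn weapon looking for the opponent's head collider:

```csharp
using UnityEngine;

// Illustrative win check: the drawn weapon carries a trigger collider, and
// touching a collider tagged as the opponent's head ends the round.
public class HeadHitDetector : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("OpponentHead"))
        {
            Debug.Log("Hit! Round over.");
            // GameManager.Instance.EndRound(winner: gameObject); // hypothetical
        }
    }
}
```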

🎖 Accomplishments that we're proud of

We learned how to generate 3D meshes in Unity, and by extension Unreal Engine, since the two engines have very similar approaches to creating meshes. This was something we had wanted to look into for a long time, and now we know how to do it!

We also met some people at this hackathon that we look up to and aspire to be like. So many cool people.

🧠 What we learned

Essentially, you generate your polygon mesh by creating an array of vertices and an array of triangle indices, calculating the normal direction of those triangles, and then passing it all to the engine.
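In Unity terms, a minimal version of that recipe looks something like the following (a single hard-coded quad rather than a drawn stroke, but the same steps):

```csharp
using UnityEngine;

// Minimal sketch of the recipe above using Unity's Mesh API: fill in vertex
// and triangle arrays, let Unity compute the normals, then hand the mesh to
// a MeshFilter so a MeshRenderer can draw it.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class QuadMeshExample : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // Four corners of a 1x1 quad.
        mesh.vertices = new Vector3[]
        {
            new Vector3(0, 0, 0),
            new Vector3(1, 0, 0),
            new Vector3(0, 1, 0),
            new Vector3(1, 1, 0),
        };

        // Two triangles, as indices into the vertex array.
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };

        // Unity derives the normals from the triangle winding.
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();

        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```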

You need to know the basics of 3D modeling when making custom assets for games. Objects can't be too high-poly, and instead of exporting the WHOLE scene you should only export the asset you actually need.

We learned that we shouldn't develop for VR/AR on low-end hardware or on Macs. We swapped laptops with each other on an as-needed basis: if someone was working on VR interactions, they'd swap for a more performant laptop from someone else in the group.

It's also nice to have your own VR headsets for these hackathons and to test that they work beforehand. Only 2 of our 5 headsets worked at this hackathon.

The Media Lab is also very cool and some of us are thinking of going there, but we've also heard that it doesn't have as much funding as it used to.

➡️ What's Next for Draw your Sword?

We think the concept for Draw Your Sword definitely deserves to be fleshed out more. We plan on figuring out multiplayer replication for the custom meshes at some point and adding more features.
