The Full Tech Setup!

A person in both the real and virtual world!

Our little router, in charge of getting info from one computer to another

Left: Game Server (and client). Right: Vision Server

For our sponsors we added short speeches so that even those who could not attend the event could still gain the valuable insight on offer.

Our sponsor room for Bede and Netcraft

Our sponsor room for Marshall Wace, BidFX and QRT

Our sponsors on display to allow them to further promote their companies.
Inspiration
We were inspired by how dull online events can sometimes be, especially when run over Microsoft Teams or similar platforms. We wanted to find a way to give those who cannot attend an event physically the same experience online! This proof of concept shows how a crowd of physical people can be brought into the virtual space.
What it does
Our software takes a real-time video feed from a camera and represents physical DurHack attendees in a virtual re-creation of the space! As people walk around the lobby (wearing their purple DurHack 2023 T-shirts), they appear as players on a Minecraft server, which any unmodified Minecraft client can connect to!
How we built it
We used a mirrorless camera and an HDMI capture card to get a real-time video feed into Python/OpenCV. This applies a series of image-processing filters to produce a “top-down view” of the video and pick out areas of purple, before identifying them and sending them to another computer via websockets.
A second computer, running the Minecraft server and a custom plugin written in Java, creates, moves and deletes NPC entities according to the data received.
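The wire format between the two servers isn't documented here, but a minimal sketch of a plausible JSON protocol (the field names are our invention) might look like this:

```python
import json

def encode_update(players):
    """Serialise detected floor positions as one websocket text frame.

    `players` is a list of (x, z) floor coordinates; ids are assigned
    by list position in this simplified sketch.
    """
    return json.dumps({
        "type": "positions",
        "players": [{"id": i, "x": x, "z": z}
                    for i, (x, z) in enumerate(players)],
    })

def decode_update(message):
    """Parse a frame back into {id: (x, z)} for the plugin side."""
    data = json.loads(message)
    return {p["id"]: (p["x"], p["z"]) for p in data["players"]}
```

Sending full snapshots rather than deltas keeps the plugin side simple: any player id absent from the latest frame can have its NPC deleted.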
Challenges we ran into
We ran into challenges almost every step of the way - from getting video into the vision computer to running Minecraft servers on Eduroam.
Significant issues included:
- The maths involved in translating coordinates on the camera plane onto the floor (later abandoned in favour of a computationally faster photogrammetric approach). There’s a lot of unhelpful information about this online!
- Keeping the websockets between the Vision and Game servers open
- Detecting people reliably with OpenCV’s built-in methods (leading us to create our own T-shirt-based algorithm)
Accomplishments that we're proud of
We’re proud of the whole thing! The final payoff of seeing the project work, and the awe on the faces of passers-by, made it worth the (literal) all-nighter of coding.
We’re also particularly proud of our hackathon ‘side project’: duck.technology! (domain courtesy of the GoDaddy registrar)
What we learned
An awful lot about OpenCV and image recognition, trigonometry, vectors, polar coordinates, and how much Red Bull is too much Red Bull.
What's next for Durcraft Virtual Attendees?
We worked on making the rotation of the in-game characters follow real-world movement, but our progress was hindered by poor performance and minimal time to optimise. With some more thought we could implement this feature properly.
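One cheap way to get that rotation (a sketch of a possible approach, not the code we attempted) is to derive each character's yaw from its last two floor positions, so the NPC faces its direction of travel:

```python
import math

def yaw_from_motion(prev, curr):
    """Yaw in degrees from two successive (x, z) floor positions.

    Uses the Minecraft convention: yaw 0 faces +z (south) and
    increases clockwise, so facing -x (west) is 90 degrees.
    """
    dx, dz = curr[0] - prev[0], curr[1] - prev[1]
    if dx == 0 and dz == 0:
        return None  # stationary: keep the previous yaw
    return math.degrees(math.atan2(-dx, dz))
```

Because it needs only the positions already being streamed, this adds essentially no extra work on the vision side; smoothing over a few frames would stop the NPC jittering when someone stands still.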
Next year, we have our eye on some hardware projects. Did you know you can buy a decommissioned traffic light on eBay for £80…?
A demonstration of the project can be found here.