Submission for the 2026 IEEE Robotech Hackathon at Georgia Tech. This repo contains the code for the humanoid robot (Hercules, from System Technology Works) and its integration with a rover workflow for sample collection and analysis. The concept splits exploration and analysis between a rover and a humanoid, with the humanoid handling inspection and sorting when samples return.
The frontier of scientific exploration in space inspired us to find ways for robots to conduct research on their own. Hercules + Cerberus are a humanoid + rover combo that can explore their environment, collect samples, and evaluate them based on value and relevance to discovering life on foreign planets, demonstrating environmental awareness and autonomous decision making.
Many of the features we implemented were similar to assembly line robotics, such as the sample analysis feature provided by the humanoid robot. We were also inspired by startups pushing for centralized intelligence.
We split the tasks between two robots. Typically, lunar rovers roam the surface alone and take limited samples, which often cannot be analyzed until much later. We streamlined this exploration-and-analysis process by allowing the rover to go out, collect samples, and return to the humanoid, which examines them. The rover uses OpenCV AprilTag/ArUco detection to lock onto fiducials, steer with proportional control, and modulate ESC speed based on tag area (a proxy for distance). The humanoid uses PhotonVision + NetworkTables fiducial IDs, then applies inverse kinematics to pick up and sort samples. A web mission-control dashboard aggregates telemetry, ranks regions, and dispatches the rover to the most promising site.
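The proportional-steering and area-based speed logic can be sketched as follows. The gain, target area, and limits below are illustrative placeholders, not the rover's actual constants (those live in the rover repo):

```python
# Sketch of the rover's tag-following control logic: proportional
# steering from the tag's horizontal offset, and a speed that tapers
# off as the tag grows in the frame (i.e. as the rover gets close).

def steer_and_throttle(tag_cx, frame_w, tag_area,
                       kp=0.4, target_area=9000.0, max_speed=1.0):
    """Return (steering, speed) from a detected tag.

    steering: -1 (full left) .. +1 (full right), proportional to the
    tag's pixel offset from frame center.
    speed: 0 .. max_speed, shrinking as tag_area approaches target_area.
    """
    # Proportional steering on the normalized pixel error (-1 .. 1).
    error = (tag_cx - frame_w / 2) / (frame_w / 2)
    steering = max(-1.0, min(1.0, kp * error))

    # Tag area stands in for distance: bigger tag => closer => slower.
    speed = max(0.0, min(max_speed, 1.0 - tag_area / target_area))
    return steering, speed
```

In practice the steering value would be mapped to a servo PWM pulse and the speed to an ESC duty cycle.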
We initially wanted to provide added functionality to the robot, specifically to get it walking. Unfortunately, the robot model was physically constrained (lacking a knee, top-heavy, etc.), which prevented it from walking reliably. Instead, we decided to keep adding features to the upper body (inverse arm kinematics) and delegate the rest of the work to a rover. In many ways, our process of building was also our way of overcoming challenges that cropped up during the competition.
- Humanoid control (this repo): Python servo control via Pololu Maestro, 3-DOF inverse kinematics, AprilTag-to-arm mapping, and PhotonVision/NetworkTables vision hooks for fiducial localization and pickup sequencing.
- Rover control (Raspberry Pi): pigpio PWM control for steering/ESC, OpenCV AprilTag/ArUco (`DICT_APRILTAG_36h11`) detection, and a Flask `/navigate` endpoint that triggers autonomous drive-to-tag behavior with obstacle avoidance routines. Repo: https://github.com/pamin1/RoboHacks_RaspPi
- Mission control web interface: React + Vite + Tailwind dashboard with Framer Motion animations, a Mapbox/MapLibre Mars basemap (globe projection when `VITE_MAPBOX_TOKEN` is set), and a Lottie rover overlay. The Flask API hosts `/api/analyze`, `/api/dashboard`, and a dispatch endpoint that posts commands to the rover; samples are scored in-memory via rule-based inference (keyword mapping). A NetworkTables bridge listens to PhotonVision `rawBytes`/`fiducialId` and forwards ArUco IDs to the analysis service. Our preliminary site explored LLM-based sample analysis, but time constraints led us to simplify the model using example data. Repo: https://github.com/JustinKhem27/dpap_web_interface_robotech26
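The rule-based scoring can be illustrated with a minimal keyword-mapping sketch. The keywords and weights below are made up for illustration; the actual mapping lives in the web interface repo:

```python
# Minimal sketch of rule-based sample scoring via keyword mapping:
# each matched keyword in a sample's description adds its weight.
# Keywords/weights here are illustrative placeholders.

KEYWORD_SCORES = {
    "hydrate": 30,    # hydrated minerals suggest past water
    "clay": 25,
    "carbonate": 25,
    "sulfate": 15,
    "basalt": 5,
}

def score_sample(description: str) -> int:
    """Score a sample description by summing matched keyword weights."""
    text = description.lower()
    return sum(w for kw, w in KEYWORD_SCORES.items() if kw in text)
```

The dashboard could then rank regions by the total score of samples collected there.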
Our team is very proud of how we dealt with the unexpected challenges of working with humanoid robots. After spending five hours trying in vain to get the humanoid walking, abandoning that plan and pivoting to a brand-new robot was not an easy decision, yet by staying true to our strengths, we got everything up and running in just over twelve hours. We also believe that combining inverse kinematics with computer vision was a unique, non-trivial approach. We are proud of the breadth of skills and features we added across the software stack, from web development to control theory.
We learned much more about the intersection of typical software engineering and the applications of robotics. More abstractly, we learned to step back, reframe, and continue when we faced unexpected challenges with the platforms we were given. Finally, integrating multiple mobile systems with a web server was a unique integration challenge.
Looking ahead, we would love to explore swarm robotics and bring in more rovers. By building small, nimble rovers that specialize in collecting and transporting samples, we can empower faster and more advanced research with specialized robotic scientists like Hercules.
We also want to bring more computing power to the rovers, allowing us to introduce localization and mapping for better decision making.
This repo is mostly Python scripts and helpers for servo control, inverse kinematics, PhotonVision/NetworkTables vision hooks, and audio feedback.
- `maestro.Controller`: Pololu Maestro serial controller wrapper used across arm/head control.
- `vision_manager.py::_Packet`: PhotonVision rawBytes decoder helper.
- `vision_manager.py::VisionManager`: NetworkTables client for PhotonVision target info (hasTarget + fiducial ID).
- `arm.py`: Full-body servo definitions and control helpers for legs, arms, and head; includes IK mapping, AprilTag-to-arm mapping, and motion routines.
- `pickup.py`: Sample pickup state machine that uses `VisionManager` + audio cues to decide grip/throw/putdown sequences.
- `inverse_kinematics.py`: 3-DOF IK solver and servo conversion utilities.
- `ik_config.py`: Shared arm calibration defaults and angle conventions.
- `audio.py`: Offline TTS helper (Piper/pyttsx3).
- `play_audio.py`: Cross-platform MP3 playback utility.
- `photon_test.py`: CLI test for PhotonVision targets via NetworkTables.
- `maestro.py`: Pololu Maestro serial controller implementation.
- `tracking/head_tracking_nt.py`: Head pan/tilt tracking from PhotonVision yaw/pitch via NetworkTables.
- `tracking/head_tracking_haar.py`, `tracking/face_tracking_haar.py`, `tracking/face_detect_haar.py`: Haar-based face tracking/detection demos.
- `inverseKinematics/ik_calibration.py`, `ik_manual_test.py`, `ik_motion_calibration.py`, `ik_joint_control.py`: Calibration and manual control scripts for IK testing.
- `vestigal/`: Earlier assistant/vision demos.
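To illustrate the kind of math behind the IK modules, here is a 2-link planar solver step using the law of cosines. This is a simplified sketch, not the repo's solver: the actual `inverse_kinematics.py` handles 3 DOF and servo-angle conversion, and the elbow-down convention here is an assumption:

```python
import math

def planar_ik(x, y, l1, l2):
    """Return (shoulder, elbow) joint angles in radians that place a
    2-link arm's wrist at (x, y); elbow-down solution.

    l1, l2: upper-arm and forearm link lengths.
    Raises ValueError when the target is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines for the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    # Shoulder = angle to target minus the offset the bent elbow adds.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Calibration scripts like `ik_calibration.py` would then map these joint angles onto servo pulse widths.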
- Most scripts require physical hardware (Pololu Maestro, servos, cameras) and will fail without it.
- Some scripts hardcode Linux serial paths like `/dev/ttyACM*`.
- `pickup.py` and `photon_test.py` import `vision_manager` from the repo root.
python photon_test.py
python pickup.py
python tracking/head_tracking_nt.py
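For reference, the Maestro controller scripts write Pololu's compact serial protocol over `/dev/ttyACM*`; the set-target frame can be sketched as below (a minimal sketch of the documented protocol framing, not a copy of `maestro.py`; the channel and target values are illustrative):

```python
# Pololu Maestro compact-protocol "set target" frame: command byte
# 0x84, the channel number, then the target pulse width (in units of
# quarter-microseconds) split into two 7-bit bytes, low byte first.

def set_target_frame(channel: int, target_qus: int) -> bytes:
    """Build the 4-byte set-target command for one servo channel."""
    return bytes([
        0x84,                      # compact-protocol set-target command
        channel,                   # servo channel (0-based)
        target_qus & 0x7F,         # low 7 bits of the target
        (target_qus >> 7) & 0x7F,  # high 7 bits of the target
    ])

# e.g. centering channel 0 at 1500 us = 6000 quarter-microseconds:
#   serial_port.write(set_target_frame(0, 6000))
```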