TBP 🤖 Robot Hackathon May 2025
We decided to build and test real robots as a critical reality check for Monty.
✅ Does our simulated system translate to the physical world? Yes.
🚫 Is Monty intuitive and plug-and-play for robotics? Not yet.
For understandable reasons, some team members weren’t comfortable flying to the US, and others weren’t comfortable leaving it, so we split the teams.
Half gathered in Redwood City, the other half in Crete 🇬🇷.
While we missed the full in-person energy and spontaneous team moments of past retreats, the experience was still meaningful and trust-building, with plenty of lessons learned.
Meet the Teams!
The three teams were:
Everything is Awesome
The Lego and Raspberry Pi platform team:
- Tristan Slominski
- Ramy Mounir
Ultrasound Perception
The ultrasound team:
- Viviane Clay
- Niels Leadholm
- Will Warren
Icarus
The drone team:
- Scott Knudstrup
- Hojae Lee
- Jeremy Shoemaker
Lego Platform
The Everything is Awesome team built a robotic platform out of Lego. The platform featured a tower housing a sensor that could move up and down, capturing RGB images of the object sitting on the platform. These images were combined with readings from a depth sensor, all collated by a Raspberry Pi.
Pose and Location
Monty needs to know the location of the patch being sensed in order to recognize objects. The Lego team’s platform rotated the object while the sensor moved up and down, so each patch sent to Monty needed the sensor’s coordinates and the sensed depth, along with the surface normal of the object being sensed.
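As a rough sketch (not the team’s actual code), the geometry might look like the following, assuming the sensor views the rotation axis horizontally from a fixed standoff distance; the function name and the 20 cm standoff are illustrative assumptions:

```python
import math

def patch_location(theta_deg, sensor_height, depth, sensor_to_axis=0.20):
    """Estimate where a sensed patch sits in the object's own frame.

    Assumed geometry: the sensor looks horizontally at the platform's
    rotation axis from `sensor_to_axis` meters away, at `sensor_height`
    meters up the tower. The platform has rotated the object by
    `theta_deg`, so we rotate the sensed point back by -theta to express
    it in the object's unrotated frame.
    """
    # Point in the fixed sensor/world frame: along the viewing axis,
    # `depth` meters in front of the sensor.
    x_w = sensor_to_axis - depth  # signed distance from the rotation axis
    y_w = 0.0
    z_w = sensor_height
    # Undo the platform rotation about the vertical axis.
    t = math.radians(-theta_deg)
    x_o = x_w * math.cos(t) - y_w * math.sin(t)
    y_o = x_w * math.sin(t) + y_w * math.cos(t)
    return (x_o, y_o, z_w)
```

With the platform unrotated, a patch sensed 5 cm away at 10 cm height lands 15 cm from the axis in the object frame; after a 90° platform turn, the same reading maps to a point a quarter-turn around the object.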
Ultrasound
Thanks to the generous support of Butterfly Network, we had the opportunity to work with the iQ3, a handheld, iPad-connected ultrasound device that’s compact, ergonomic, and powerful.
Ultrasound imaging, while invaluable, typically demands extensive (and expensive) training to interpret what you’re seeing and know where to scan next.
Our goal was to build a proof of concept where Monty becomes the intelligent system behind both object recognition and the next-best-view strategy, guiding where to move the probe to rapidly disambiguate what’s being scanned.
Butterfly iQ3 Ultrasound
The cross section of a mustard bottle
Pose and Location
To guide Monty’s learning, we need precise location and orientation data from the sensor, in this case the ultrasound probe.
Our solution was to attach a Vive tracker directly to the probe, delivering accurate, real-time 6-DoF pose and location data. This gives Monty exactly what it needs to understand where the sensor is and its orientation in space.
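A minimal sketch of how a 6-DoF tracker pose might be turned into the sensor location Monty needs. The quaternion order, the tracker-to-tip offset, and all function names here are assumptions for illustration, not the team’s implementation:

```python
import numpy as np

def pose_to_matrix(position, quat_wxyz):
    """Convert a 6-DoF pose (position + unit quaternion, assumed
    w-x-y-z order) into a 4x4 homogeneous transform."""
    w, x, y, z = quat_wxyz
    # Standard quaternion-to-rotation-matrix expansion.
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = position
    return T

# The tracker sits on the probe body, not at the imaging tip, so a
# fixed rigid offset (12 cm here, a made-up value) maps tracker pose
# to the tip pose that actually matters for sensing.
TRACKER_TO_TIP = np.array([0.0, 0.0, -0.12, 1.0])

def tip_location(position, quat_wxyz):
    """World-frame location of the probe tip given the tracker pose."""
    return (pose_to_matrix(position, quat_wxyz) @ TRACKER_TO_TIP)[:3]
```

The same 4x4 transform also carries the probe’s orientation, which is what lets Monty relate each scanned patch back to a common reference frame.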
Patch Processing
Ultrasound images are hard to read. Fortunately, we had our very own medical doctor, Niels Leadholm, who knew how to interpret an ultrasound image: which parts were important and which should be ignored.
Captured images from the probe were sent to Monty for processing.
Here, you can read a more in-depth document on the patch processing.
Drone
The drone team acquired a DJI Tello drone that could be flown via an API and had a front-mounted camera.
DJI Tello Drone
Tello Drone (vent for scale)
Pose and Location
The drone system provides no location information, and its camera supplies no depth information. To determine the drone’s location, ArUco markers were used, and an image-to-depth-map transformer estimated depth for the sensed object.
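As an illustrative sketch (the camera intrinsics and function names are assumptions, not the team’s code), an estimated depth value can be lifted into 3D with the standard pinhole model, then moved into the world frame using the drone pose recovered from the ArUco markers:

```python
import numpy as np

# Hypothetical intrinsics for the Tello's front camera; real values
# would come from a calibration procedure.
FX, FY = 920.0, 920.0   # focal lengths in pixels
CX, CY = 480.0, 360.0   # principal point for a 960x720 frame

def backproject(u, v, depth_m):
    """Lift pixel (u, v) with an estimated metric depth into a 3D
    point in the camera frame, via the pinhole camera model."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def to_world(point_cam, R_wc, t_wc):
    """Transform a camera-frame point into the world frame, given the
    drone's rotation R_wc and translation t_wc (e.g. recovered from
    ArUco markers on the floor)."""
    return R_wc @ point_cam + t_wc
```

For example, the principal-point pixel at 1 m estimated depth back-projects to a point 1 m straight ahead of the camera; composing with the marker-derived pose then places that point in the shared world frame Monty reasons in.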
The Winner!
With an astounding success rate, the Everything is Awesome team took 1st place in our first-ever Robot Hackathon.
🎉 This watershed moment is the first time Monty has moved its sensors in the real world. 🎉
Congratulations to Ramy and Tristan!
Their prize was a robot potted meat can!
You can watch all the hackathon robot presentations in this YouTube video:
Getting Involved
The robots project was a lot of fun. We learned a lot about robots, Monty, and ourselves on this robot journey.
All of the projects are available on GitHub for you to use as a foundation for your own robotics projects. We’d recommend starting with the Everything is Awesome project, as it got the furthest in terms of object recognition, and the learning curve is a little steeper for drones and ultrasound.
Take a look at the Project Showcase section of our documentation.
We also have a robotics tutorial available in the documentation here: https://thousandbrainsproject.readme.io/docs/using-monty-for-robotics and you can chat with us about your ideas for Monty and robotics over on our forum – https://thousandbrains.discourse.group/