Ask HN: How do robotics teams manage data and debugging today?
Hi HN,
I’m working on a project in the robotics space and would love to get the community’s perspective.
The problem I’ve seen: robotics teams generate massive amounts of data in fragmented formats (ROS 2 bags, MCAP files, OpenLABEL annotations, etc.), and debugging or analysis often means hours of digging through logs, writing one-off scripts, or converting between formats. For small and medium robotics companies, this can really slow down iteration.
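To make "one-off scripts" concrete, here's a minimal sketch of the kind of throwaway triage script I mean, assuming the `mcap` Python package; the file name is hypothetical and the output is just a message count per topic:

    # Rough per-topic message count for a single recording (illustrative only).
    from collections import Counter
    from mcap.reader import make_reader

    LOG_PATH = "run_2024_test.mcap"  # hypothetical recording

    with open(LOG_PATH, "rb") as f:
        reader = make_reader(f)
        counts = Counter()
        first_log_time = None
        last_log_time = None
        # iter_messages() yields (schema, channel, message) tuples.
        for schema, channel, message in reader.iter_messages():
            counts[channel.topic] += 1
            if first_log_time is None:
                first_log_time = message.log_time
            last_log_time = message.log_time

    # log_time is in nanoseconds.
    duration_s = (last_log_time - first_log_time) / 1e9 if counts else 0.0
    print(f"{sum(counts.values())} messages over {duration_s:.1f}s")
    for topic, n in counts.most_common(10):
        print(f"{topic}: {n}")

Every team seems to end up with a pile of scripts like this, rewritten per format and per project, which is part of what I'm trying to understand better.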
I’m trying to understand:
• How do you/your team currently manage and analyze robot data?
• What are the biggest pain points you face (e.g. debugging failures, comparing test runs, searching across logs)?
• Have you tried tools like Foxglove/Rerun/etc.? What works, what doesn’t?
• If there were a solution that actually made this easier, what would it have to do for you to adopt it?
I also put together a short (5 min) survey here: https://forms.gle/x57UReg8Yj9Gx7qZ8
If you’re willing to share your experiences in more detail, it would really help shape what we’re building.
We’ll anonymize responses and share the aggregated insights back with the community once we’ve collected enough.
Thanks in advance — I know this is a niche problem, but I figured HN has some of the sharpest robotics engineers, founders, and tinkerers out there. Really curious to hear how you’re solving this today.