Moe at the robotics workstation

I am building an Edge AI inspection robot, and the first milestone is complete.

Perception is a core pillar of modern robotics, and today it is inseparable from machine learning. The real challenge is not just training a model; it is getting a custom, lightweight model trained quickly and running reliably on real hardware.

I have now integrated Edge Impulse object detection models directly into the ROS 2 ecosystem.

What is happening in the demo:

  • An Edge Impulse .eim object detection model runs inside a ROS 2 node
  • The node publishes vision_msgs/Detection2DArray messages with bounding boxes
  • A second ROS 2 node consumes those detections
  • The robot's pan-tilt unit actively tracks and points toward the detected object
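To make the last two steps concrete, here is a minimal, ROS-independent sketch of the detection-to-tracking math. It assumes Edge Impulse-style bounding boxes (dicts with pixel `x`, `y`, `width`, `height` from a top-left origin, plus a confidence `value`); the function names, the proportional gain, and the sign conventions are illustrative, not taken from the actual node.

```python
def box_center_offset(box, frame_w, frame_h):
    """Normalized offset of the box center from the image center, in [-1, 1]."""
    cx = box["x"] + box["width"] / 2.0
    cy = box["y"] + box["height"] / 2.0
    return (2.0 * cx / frame_w - 1.0, 2.0 * cy / frame_h - 1.0)

def pan_tilt_step(box, frame_w, frame_h, gain_deg=10.0):
    """Proportional tracking step: turn toward the target by gain * offset.

    Positive pan = turn right, positive tilt = look down; the real signs
    depend on how the servos are mounted and wired.
    """
    off_x, off_y = box_center_offset(box, frame_w, frame_h)
    return (gain_deg * off_x, gain_deg * off_y)

# A box centered in a 320x320 frame needs no correction:
centered = {"x": 140, "y": 140, "width": 40, "height": 40, "value": 0.9}
print(pan_tilt_step(centered, 320, 320))  # (0.0, 0.0)

# A box in the upper-left quadrant steers pan left / tilt up (both negative):
upper_left = {"x": 20, "y": 20, "width": 40, "height": 40, "value": 0.8}
print(pan_tilt_step(upper_left, 320, 320))
```

In the real system the boxes would arrive as `vision_msgs/Detection2DArray` messages, which already carry the box center and size, so only the normalization and gain step remain.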

This is a strong step toward ROS-native Edge AI: fast iteration, embedded-friendly models, and clean integration into real robotic systems.

More updates coming soon, including deeper technical details and source code.

Where do you think Edge AI matters most in robots?