Raspanion by 8Phase:
The Raspanion is a versatile, affordable computer vision carrier board for the Raspberry Pi Compute Module platform. Engineered to integrate effortlessly with cutting-edge peripherals such as Google Coral or Hailo-8 neural accelerators, it supports up to two cameras, two LiDARs, and four servos. Equipped with an onboard power supply, joystick, OLED display, magnetometer, and IMU, the Raspanion brings seamless integration and strong AI and vision performance to drone, robotics, and automation projects.
As an open-source initiative, Raspanion is crafted to complement the ArduPilot ecosystem, fostering a collaborative environment for ongoing innovation and development. The project is currently licensed under the GNU General Public License v3.0 (the same license as ArduPilot); in the future, we plan to allow contributors who have made significant contributions to commercialize their derivative works without restriction, encouraging both a thriving community and sustainable commercial opportunities.
Key Features:
Low-Cost Vision and AI: Unlock advanced computer vision capabilities at an affordable price point.
Seamless Peripheral Integration: Effortlessly connect Google Coral or Hailo NPUs, cameras, LiDARs, and other sensors for a complete drone solution.
Modular Scalability: Built for Raspberry Pi Compute Modules, ensuring flexibility and hardware longevity.
ArduPilot Complement: Designed to integrate seamlessly with ArduPilot, expanding its feature set and capabilities.
Open-Source with Unique Licensing: Benefit from a collaborative, open-source community, while maintaining the opportunity for commercial productization after contributing to the codebase.
With Raspanion, 8Phase aims to make low-cost, lightweight, and accessible computer vision technology for autonomous drones available to everyone.
Here’s what’s happening in this demo (above):
Green Arrows show the optical flow from the drone’s camera, i.e. how the scene shifts from frame to frame.
Blue Arrows represent the gyro-predicted rotational flow. When the drone’s motion is pure rotation, these match the green arrows, confirming that rotation is being cancelled correctly.
Velocity Labels (Vx, Vy, Vz) display the drone’s linear velocities (in cm/s) in its own coordinate frame, after subtracting the rotational component from the optical flow.
Depth Map (the colorful background) is generated in real time from a single camera and single LiDAR. Need more coverage? You can add a second camera and LiDAR, too.
The demo runs a compressed neural network on a Hailo-8L module paired with a Raspberry Pi CM4, delivering efficient, real-time performance with just one camera and one LiDAR.
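The derotation behind the velocity labels can be sketched in a few lines. This is an illustrative NumPy sketch, not the actual Raspanion code: it uses the classical pinhole small-motion model for rotation-induced flow, and the function names (`rotational_flow`, `derotated_velocity`), axis conventions, and the focal length `f` in pixels are our own assumptions.

```python
import numpy as np

def rotational_flow(px, py, f, wx, wy, wz):
    """Per-pixel flow (px/s) induced by body rotation alone.

    Classical pinhole small-motion model; px, py are pixel coordinates
    relative to the optical centre, f is the focal length in pixels,
    (wx, wy, wz) the gyro rates in rad/s. Signs depend on the chosen
    camera/body axis convention.
    """
    u_rot = (px * py / f) * wx - (f + px * px / f) * wy + py * wz
    v_rot = (f + py * py / f) * wx - (px * py / f) * wy - px * wz
    return u_rot, v_rot

def derotated_velocity(u_meas, v_meas, px, py, f, wx, wy, wz, depth_z):
    """Subtract the gyro-predicted flow, then scale the remaining
    (translational) flow by depth to get linear velocity (m/s)."""
    u_rot, v_rot = rotational_flow(px, py, f, wx, wy, wz)
    u_t = u_meas - u_rot           # translational flow component
    v_t = v_meas - v_rot
    vx = depth_z * u_t / f         # near the image centre, u_t ≈ f * Vx / Z
    vy = depth_z * v_t / f
    return vx, vy

# Synthetic check: build a flow field from a known rotation plus a known
# translation, then recover the translation after derotation.
f, z = 300.0, 2.0                              # focal length (px), depth (m)
px, py = np.meshgrid(np.linspace(-80, 80, 5), np.linspace(-60, 60, 5))
wx, wy, wz = 0.05, -0.02, 0.10                 # gyro rates (rad/s)
vx_true, vy_true = 0.5, -0.3                   # true velocity (m/s)

u_rot, v_rot = rotational_flow(px, py, f, wx, wy, wz)
u_meas = u_rot + f * vx_true / z               # measured = rotational + translational
v_meas = v_rot + f * vy_true / z

vx, vy = derotated_velocity(u_meas, v_meas, px, py, f, wx, wy, wz, z)
assert np.allclose(vx, vx_true) and np.allclose(vy, vy_true)
```

In the real system the green arrows come from a flow tracker, the gyro rates from the IMU, and the depth from the LiDAR/camera depth map, but the subtraction itself is exactly this step.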