The Inertial Sense LUNA Platform Overview – Autonomous Navigation and Localization
The LUNA Platform can elevate your autonomous unit to the next level. The real genius behind the hardware is the software and firmware that drive everything. Watch the video below to see how the LUNA Platform can benefit your robotic device's autonomous navigation and localization.
Inertial Sense was founded on wanting to know where something is, where it's going, and how it's going to get there. That information comes from a combination of cameras and sensors within the hardware, working together with the software and firmware to optimize autonomous navigation.
The LUNA Platform provides a full-stack solution for your autonomous robot's navigation and localization needs, including sensor fusion, path generation, and drive control.
We provide communication between the user and the autonomous robotic device through an app, and we connect the entire solution to the cloud for reporting and analytics. No matter the size of your fleet, we have a back-end interface for it.
What separates us from others is our dedication to our customers and to delivering cutting-edge technology that elevates your device to the next level. Whether it's sensors, a platform, fusion, inputs, processing, network communications, analytics reporting, or configuration, at Inertial Sense we do it all, and so much more!
This is what our LUNA Platform looks like today. It's big because it's a prototype; a full production version would be much smaller. But there's a lot going on here.
But let's talk about the hardware itself. The hardware you're looking at is mostly off the shelf; nothing is proprietary except for our sensors, which are inside. The real genius is in the software and the firmware that's driving everything. That has always been at the heart of our DNA.
First of all, there are the sensors inside, handling the localization and navigation. If you don't know where you are, you don't know where you can go. Many customers have told us that without that core sensor DNA, they don't understand how a company can claim to do autonomy. So that's a great foundation for us to work from.
Then it comes down to the different perception inputs arriving from the rest of the robot, whether that's a visual SLAM camera, a lidar unit, ticks from a wheel encoder, GPS, or any number of other inputs coming off the device itself. How those inputs get fused within this platform, prioritized, and then turned into the proper path for the vehicle to take is our secret sauce. It's the hardest part to do, and of course it's what we're best at.
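Conceptually, the fusion and prioritization step described here can be sketched as a weighted blend of pose estimates from each source. This is only an illustrative toy, not Inertial Sense's actual algorithm or API; the `PoseEstimate` type, the `weight` field, and the `fuse` function are all hypothetical names, and a production system would use something like a Kalman filter with proper covariance and angle-wrap handling instead of a plain average:

```python
from dataclasses import dataclass

@dataclass
class PoseEstimate:
    """One sensor's guess at the robot's pose (illustrative only)."""
    x: float        # position, meters
    y: float        # position, meters
    heading: float  # radians (note: naive averaging ignores angle wraparound)
    weight: float   # priority/confidence assigned to this source

def fuse(estimates):
    """Blend estimates from several sources by their priority weights."""
    total = sum(e.weight for e in estimates)
    x = sum(e.x * e.weight for e in estimates) / total
    y = sum(e.y * e.weight for e in estimates) / total
    heading = sum(e.heading * e.weight for e in estimates) / total
    return PoseEstimate(x, y, heading, total)

# Example: GPS says x = 10.0 m, wheel odometry (trusted 3x more) says 10.4 m.
fused = fuse([PoseEstimate(10.0, 0.0, 0.0, 1.0),
              PoseEstimate(10.4, 0.0, 0.0, 3.0)])
# fused.x lands closer to the higher-priority odometry reading
```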
So all of that processing, the sensor fusion, the path generation, and the drive control, is what comes out of this box to the robot itself.
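The three outputs named here form a pipeline: a fused pose feeds a path generator, whose waypoints feed a drive controller. The toy sketch below shows the shape of that dataflow only; the function names, the straight-line planner, and the controller gains are all assumptions for illustration, not the LUNA Platform's real interfaces:

```python
import math

def plan_path(start, goal, n=5):
    """Toy path generator: n evenly spaced waypoints from start to goal."""
    (x0, y0), (x1, y1) = start, goal
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n)
            for i in range(1, n + 1)]

def drive_command(pose, waypoint):
    """Toy drive controller: steer toward the waypoint, creep forward
    only once roughly facing it. Gains (1.5, 0.3 rad, 0.5 m/s) are arbitrary."""
    x, y, heading = pose
    desired = math.atan2(waypoint[1] - y, waypoint[0] - x)
    # wrap the heading error into (-pi, pi]
    error = math.atan2(math.sin(desired - heading), math.cos(desired - heading))
    return {"turn_rate": 1.5 * error,
            "speed": 0.5 if abs(error) < 0.3 else 0.0}

# Fused pose -> path -> first drive command
waypoints = plan_path((0.0, 0.0), (10.0, 0.0))
cmd = drive_command((0.0, 0.0, 0.0), waypoints[0])
```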
Now, there's a whole host of other things the customer needs that we also do. Communication between the user and the robot for configuration through an app? We do that. Network communication of this entire solution to the cloud for reporting and analytics? We do that too, whether you're an OEM managing millions of devices in the field or a commercial fleet manager with tens, maybe a hundred, devices. We've already built the back-end interface for that.
So again, it comes back to this notion that to really have a full-stack solution for robots, you've got to have something that plugs into the customer's business and delivers value across the full spectrum of implementation. I think that's what really separates us. Starting from the bottom up: the sensors, the platform, the fusion, the inputs, the processing, the network communications, the analytics reporting, and the configuration. We do all of that, all as one solution.