Understanding How Robot Vacuums Navigate: And How It Significantly Impacts Your Cleaning Experience

2022-06-06 06:00:00 · Product

Robot vacuums have made significant advances over the past decade. Suction power and features aside, the key technology separating the good robovacs from the “meh” ones is their navigation system.

LiDAR-navigation robot vacuums are gaining ground over older random bump-and-run robot vacuums, as well as camera-based (vSLAM) models. Just how important is navigational prowess in a robot vacuum?

Simply put, how satisfying your robot vacuuming experience is depends largely on how well the machine navigates. A robot vacuum with sophisticated navigational abilities will expertly clean around common household obstacles and automatically return to its charging dock when its run is complete, without fuss. The point is, you wouldn’t want to be constantly “rescuing” your robot vacuum whenever it gets stuck underneath a bed, or even in an open area strewn with toys courtesy of your toddler.

The whole point of having an automated floor cleaner is, well, automation!

LiDAR Navigation - Cream of the Crop?
The application of LiDAR in robot vacuums was popularized by Roborock, which rolled out its first robot vacuum, branded and sold by Xiaomi, back in 2016. Roborock has since introduced multiple award-winning Roborock-branded robot vacuums, including the Roborock S7, which was recognized in TIME Magazine’s Best Inventions of 2021.

The logic behind the mass adoption of LiDAR technology is straightforward: it is simply more efficient, especially compared with the earlier inertial navigation systems, where a robot vacuum bumps randomly around its surroundings and identifies obstacles only on contact. A LiDAR-based robot vacuum uses the reflection of laser beams to accurately locate obstacles within 6 m. It plans the quickest cleaning route while leaving few missed spots and little repetition. This same technology generates a cleaning map, which customers can then use to customize their cleaning preferences, such as setting selective zone or room cleanings.
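To make the idea concrete, here is a minimal sketch of the principle behind laser ranging: the sensor times a laser pulse’s round trip and converts that into a distance and map point. The function name, the beam-angle parameter, and the timing values are hypothetical illustrations, not Roborock’s actual implementation.

```python
# Illustrative sketch only: how a spinning LiDAR turns the timing of a
# laser echo into a 2D obstacle point on a cleaning map.
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second
MAX_RANGE_M = 6.0               # the detection range stated in the article

def echo_to_point(round_trip_s: float, beam_angle_rad: float):
    """Convert one laser echo into an (x, y) obstacle point relative to
    the sensor, or None if the echo lies beyond the usable range."""
    distance_m = SPEED_OF_LIGHT * round_trip_s / 2  # light travels out and back
    if distance_m > MAX_RANGE_M:
        return None
    return (distance_m * math.cos(beam_angle_rad),
            distance_m * math.sin(beam_angle_rad))

# One full rotation of such points becomes a single scan of the room,
# from which the robot builds its cleaning map.
point = echo_to_point(2e-8, 0.0)  # an echo after 20 ns: roughly 3 m straight ahead
```

A real sensor repeats this thousands of times per second while spinning, stitching successive scans into the map used for route planning.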

The application of LiDAR also means that robot vacuums can work just as well in low-light environments, so you do not need to worry about your robot vacuum giving up on itself in dark rooms!

Beyond LiDAR - Introducing Reactive AI 2.0, Roborock’s upgraded navigation technology

Roborock, with its ever-eager and ambitious team of engineers, decided that even the mighty LiDAR has its limits, and in January 2022 introduced its proprietary Reactive AI 2.0 obstacle recognition and avoidance system.

Reactive AI 2.0 is applied to Roborock's new S7 MaxV series, launched in January 2022, where it complements the LiDAR navigation system. It combines three elements: a 3D structured light system, an RGB camera, and an NPU chip, helping the S7 MaxV detect small objects in its cleaning path. This is something LiDAR alone, or even vSLAM-powered robot vacuums, cannot achieve. For example, the top-mounted LiDAR sensor on the S7 MaxV cannot sense small objects at ground level as effectively as a front-facing obstacle recognition system. vSLAM-based robot vacuums, on the other hand, typically have upward-facing cameras, which makes ground-level obstacle recognition impossible.


The Roborock S7 MaxV in test


Reactive AI 2.0 - How Effective Is It?
The 3D structured light system acts as a ruler, accurately measuring the distance between the S7 MaxV and an object in its path. It can also determine the dimensions of the obstacle.

The RGB camera plays the role of the human eye, capturing detailed images of the obstacles, including texture and color.

Both the 3D structured light system and the RGB camera feed information into the onboard processor, powered by an NPU chip (the brain), which runs pre-set machine learning algorithms. The S7 MaxV then cleans around different objects and obstacles at an optimal distance: the less risky the object, the closer the S7 MaxV cleans around it. This means a lower chance of collisions and, at the same time, fewer missed cleaning spots.
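The risk-tiered clearance idea described above can be sketched as a simple lookup: riskier recognized objects get a wider berth, safer ones a tighter pass. The object categories, distances, and function name here are hypothetical examples for illustration, not Roborock’s actual parameters.

```python
# Illustrative sketch only: mapping a recognized obstacle category to a
# clearance margin, so riskier objects are given more room.

CLEARANCE_CM = {
    "cable": 10.0,  # high entanglement risk: keep well clear
    "shoe": 5.0,    # medium risk: moderate margin
    "sock": 3.0,    # low risk: clean close to minimize missed spots
}

def clearance_for(label: str, default_cm: float = 8.0) -> float:
    """Return how closely the robot should clean around a recognized
    object; unrecognized objects get a conservative default margin."""
    return CLEARANCE_CM.get(label, default_cm)
```

In a real system the label would come from the camera-and-NPU recognition pipeline, and the margin would feed into the path planner rather than being read off a static table.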


So while LiDAR takes care of the overall mapping effort, accurately pinpointing distance and routing information for efficient navigation, Reactive AI 2.0 complements it by recognizing and avoiding common household obstacles such as shoes, socks, and cables in the robot vacuum’s path. Because object recognition happens in real time, movement is also taken into account, such as a pet roaming the floor.

As with all machine learning applications, practice makes perfect. There isn’t yet a perfect system, even with the sophistication of this LiDAR and Reactive AI 2.0 combination. Time will tell whether this combination eventually becomes the standard, as Roborock and other manufacturers race to refine their products.

For more information about the Reactive AI 2.0 and on Roborock’s line of products, please visit us.roborock.com
