AI Drives Autonomous Mobile Robots

AMRs benefit from Visual SLAM to map both their fixed and changing surroundings

Localisation and mapping technology combined with 3D vision provide the mechanism for the next stage of autonomous mobile robots

ABB Robotics has transformed its Autonomous Mobile Robots (AMRs) with the addition of Visual Simultaneous Localisation and Mapping (Visual SLAM) technology, enabling its AMRs to make intelligent navigation decisions based on their surroundings.

Using AI-enabled 3D vision to perform location and mapping functions, ABB’s Visual SLAM AMRs make production faster, more flexible, efficient and resilient while taking on dull, dirty and dangerous tasks so people can focus on more rewarding work.

AI combines with 3D vision

Visual SLAM combines AI and 3D vision technologies to deliver superior performance compared with other guidance techniques for AMRs. It offers significant advantages over navigation methods such as magnetic tape, QR codes and traditional 2D SLAM, which require additional infrastructure to function, and Visual SLAM AMRs are being adopted by companies to handle an expanding range of production and distribution tasks.

According to Marc Segura, President of ABB Robotics Division, the introduction of Visual SLAM AMRs radically enhances companies’ operations, making them faster, more efficient and more flexible, while freeing up employees to take on more rewarding work.

“Offering more autonomy and intelligence, our new AMRs operate safely in dynamic, human-populated environments. Visual SLAM technology provides a new level of intelligence for AMRs that transforms robotic applications, from production and distribution through to healthcare,” he says.

Visual SLAM technology

Visual SLAM uses cameras mounted on the AMR to create a real-time 3D map of the objects in the surrounding area. The system differentiates between fixed navigation references, such as floors, ceilings and walls, that need to be added to the map, and objects such as people or vehicles that move or change position. The cameras detect and track natural features in the environment, enabling the AMR to adapt dynamically to its surroundings and determine the safest and most efficient route to its destination. Unlike 2D SLAM, Visual SLAM requires no additional references such as reflectors or markers, saving cost and space while offering positioning accurate to within three millimetres.
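The distinction between fixed references and moving objects described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only (the function, names and threshold are invented for this article, not ABB's or Sevensense's actual implementation): tracked features whose world position stays fixed between frames are treated as static landmarks for the map, while features that move are excluded from it.

```python
# Hypothetical sketch: splitting tracked visual features into static
# map landmarks and dynamic (moving) objects, assuming the feature
# positions are already expressed in the world frame, i.e. the
# robot's own motion has been compensated for.

def classify_features(prev, curr, tol=0.05):
    """Classify tracked features as static or dynamic.

    prev, curr: dicts mapping feature id -> (x, y) world position
                estimated in two consecutive camera frames.
    tol: residual motion (in metres) allowed for a "static" feature.
    Returns (static, dynamic) dicts of feature id -> position.
    """
    static, dynamic = {}, {}
    for fid, (x0, y0) in prev.items():
        if fid not in curr:
            continue  # feature lost between frames; skip it
        x1, y1 = curr[fid]
        residual = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if residual <= tol:
            static[fid] = (x1, y1)   # fixed reference: add to the map
        else:
            dynamic[fid] = (x1, y1)  # moving object: track, don't map
    return static, dynamic
```

A real Visual SLAM pipeline estimates the ego-motion and the map jointly rather than in this simple two-step way, but the sketch captures why walls end up in the map while passing people and vehicles do not.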

By eliminating the need to change the environment, stop production or add infrastructure, Visual SLAM technology helps to reduce commissioning time by up to 20 percent compared to 2D SLAM, significantly reducing the time needed to introduce a new AMR into an existing fleet. The technology can be used at scale, with fleets updated remotely. It is also secure, as it analyses raw data only; no visual images are saved on either the AMR or a server.

SLAM robots go into production

ABB developed its Visual SLAM AMRs in collaboration with Sevensense Robotics, a specialist technology company focused on artificial intelligence and 3D vision.

The technology developed by Sevensense Robotics is being incorporated progressively into ABB's latest generation of autonomous mobile robots over the next two years. It will start with the AMR T702V, available from the third quarter of this year, followed by the AMR P604V later in the year, with further AMR products incorporating Visual SLAM rolled out during 2024 and 2025. Visual SLAM technology is already deployed in industrial projects for customers in automotive and retail, with the potential to replace conventional production lines with intelligent, modular production cells served by AMRs.

Jonathan Newell
