Robots clean up after Covid-19

Phil Duffy, vice president of Innovation at Brain Corp, talks to Nick Flaherty about the development of robots that can be used in public spaces
By Nick Flaherty





Robots deliver transformative value, but until now they have been relegated to tightly controlled operating environments. San Diego-based Brain Corp has created AI software that system developers use to build autonomous machines that can navigate safely and effectively in public indoor spaces such as retail stores, airports and hospitals.

Demand for robot cleaning systems has grown as a result of the Covid-19 pandemic. The company raised $36m (€30m) in April to expand its AI technology globally, bringing total investment to over $160m (€135m). The robots are trained once by an operator to follow a route.

“The world of fully autonomous robots has only been around at scale for the last five years. We started with neuromorphic computing research and we looked at how the brain processes vision and how the brain learns and that gave us a technology that could be applied to robotics to solve navigation in complex environments,” said Phil Duffy, vice president of Innovation at Brain Corp.

“We have over 14,000 robots out in the industry, which we believe is the largest fleet operating in public environments, retail, airports,” he said. “The retail and the cleaning industry adopted robots earlier than anyone else, so they were semi-prepared for Covid-19 – we have seen a 133 percent increase in usage during daytime hours as a result. That leaves staff to sanitise the areas that robots cannot.”

The key is the data showing where the robot has travelled, and customers set their own compliance levels to show that areas are clean.
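As an illustration of this kind of coverage reporting (the data model and names here are hypothetical, not Brain Corp's API), a compliance check can reduce to comparing the floor area the robot actually covered against a customer-set threshold:

```python
# Hypothetical sketch of a cleaning-compliance check: the robot logs the
# cells of a floor grid it has driven over, and the customer sets the
# fraction of the cleanable area that must be covered to count as clean.

def coverage_ratio(cleaned_cells: set, cleanable_cells: set) -> float:
    """Fraction of the cleanable floor grid the robot actually covered."""
    if not cleanable_cells:
        return 0.0
    return len(cleaned_cells & cleanable_cells) / len(cleanable_cells)

def is_compliant(cleaned_cells: set, cleanable_cells: set,
                 threshold: float = 0.95) -> bool:
    """True if coverage meets the customer-set compliance level."""
    return coverage_ratio(cleaned_cells, cleanable_cells) >= threshold
```

In practice the travel log would come from the robot's localisation data rather than a hand-built grid, but the compliance decision is this kind of simple threshold on covered area.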

“One of the reasons we went into the retail space is that it’s complex. Cleaning, inventory delivery and scanning – if we can solve retail then we can go into any area, and it’s very scalable. When we look at robots today, where autonomous robots navigate, such as material handling or cleaning, and then into mechanical arms for industrial automation – there’s a huge opportunity for robots,” he said.

Next step: training robots for autonomy


The technique uses the operators to train the robots, showing them the routes to travel once so the routes can be repeated. “The flow of how the machines are used is integrated into the process of the store itself so we use the operator as the domain expert.”
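This teach-and-repeat flow can be sketched as follows (an illustrative toy, not BrainOS code): during the single training pass the operator's poses are recorded, and in autonomous mode the same waypoint list is replayed.

```python
# Minimal teach-and-repeat sketch: the operator drives the route once
# while poses are recorded; autonomous runs replay the stored waypoints.

class TeachAndRepeat:
    def __init__(self):
        self.route = []  # recorded (x, y, heading) waypoints

    def record(self, pose):
        """Training pass: store each pose the operator drives through."""
        self.route.append(pose)

    def replay(self):
        """Autonomous pass: yield the stored waypoints in order."""
        yield from self.route

robot = TeachAndRepeat()
for pose in [(0, 0, 0), (5, 0, 0), (5, 3, 90)]:
    robot.record(pose)
```

A real system would localise against a map and correct for drift between runs; the point here is only that one demonstration defines the repeatable route.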

“What allowed us to do that was the machine learning environment – that approach of solving individual problems was the way we went to market,” he said. “We have travelled over 4bn sq m over 3.1m autonomous hours and 4.2m km – it’s from the coverage of unique footage that we see the edge cases, thanks to the network effect of large robotic fleets: the larger the data set, the more edge cases you solve.”

One example is a supermarket that used infrared heaters: when these were switched on, they washed out the sensors. Another is ghost pixels. The robotic platform is vision-based, and if it sees an apparent blockage that is not connected to anything, the chances are it’s a ghost pixel – a reflection – especially if it appears consistently in one area. This usually means the machine would stop because it saw something, as filters for reflections don’t work.
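A ghost-pixel check along these lines might look like the following (a hypothetical heuristic, not Brain Corp's detector): a small detection that persists at the same spot across many frames and is not connected to any larger structure is more likely a reflection than a real obstacle.

```python
# Hypothetical ghost-pixel heuristic: a small, free-floating detection
# that recurs at the same grid cell across frames, without being attached
# to any larger obstacle, is flagged as a probable reflection.

def looks_like_ghost(detection, history, min_repeats=5, max_size=3):
    """detection: (x, y, size_in_cells, connected_to_structure).
    history: list of (x, y) cells where detections appeared in past frames."""
    x, y, size, connected = detection
    repeats = sum(1 for (hx, hy) in history if (hx, hy) == (x, y))
    return (not connected) and size <= max_size and repeats >= min_repeats
```

As the article notes, simple filters like this are unreliable on their own, which is why such edge cases still need human review to tune the AI framework.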

Then there are the really odd cases where it takes human eyeballs to solve the problem. At one store in the mid-west of the US, a robot kept stopping for no apparent reason. It turned out that the doors of the store were opening and rubbish was dancing in the air, which the system saw as an obstacle. These edge cases, such as reflections from different materials, objects sticking out from a shelf, and different shopping carts, help to improve the AI framework.

This feeds into cleaning to keep stores clear of Covid-19. Some robot systems use UV-C light for this, but the systems supported by Brain Corp are geared to chemical disinfectants.

“The problem with UV-C is the dwell time is very slow, so even if it is high power it has to sit there for a long time, and it doesn’t clean everything. Chemical disinfectant starts to sanitize the moment it drops, and it fits with a mobile robot. Driving through a retail store at any speed means the dwell time isn’t there.”

The AI platform, BrainOS, is designed to run on off-the-shelf hardware and multisensor arrays using cameras and lidar. The controllers are a mixture of Intel-based chips and Qualcomm’s ARM-based Snapdragon chips.

For the sensors, the platform uses a single or double lidar from German supplier SICK on large systems that need a range of 10m. “We can automate off a single camera, but we do a full safety analysis – size, weight, speed, the environment,” said Duffy. “We use the lidar for mapping, SLAM and navigation and also people detection. Then there is a slanted lidar to create a ‘virtual bumper’, as well as three time-of-flight (ToF) cameras. All these sensors are combined in mapping and navigation with data that overlaps, to map the space through training once to create the routes.”
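Combining overlapping sensors for mapping can be illustrated with a toy occupancy-grid fusion (this is a simplification for illustration, not the BrainOS pipeline): a cell counts as occupied if any sensor reports it occupied.

```python
# Illustrative fusion of overlapping sensor views into one occupancy
# grid, a toy version of combining lidar and ToF-camera data for mapping.

def fuse_occupancy(grids):
    """grids: list of dicts mapping (x, y) cell -> True if that sensor
    sees the cell as occupied. A cell is occupied in the fused map if
    any sensor reports it occupied (conservative OR-fusion)."""
    fused = {}
    for grid in grids:
        for cell, occupied in grid.items():
            fused[cell] = fused.get(cell, False) or occupied
    return fused
```

Real systems weight sensors probabilistically rather than OR-ing them, but the conservative version shows why overlapping fields of view make the map more robust than any single sensor.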

“We have the ability to pair a phone with the machine to select routes – if it finishes, hits a problem or gets blocked, a text with a picture is sent to the operator. It can re-route locally, and it can use a rule-based approach to plan a route back. We are looking at the ability to update the map, for example if there is a regular blockage. We are now developing area fill – drive the perimeter and the robot calculates the best route to fill in and clean the area.”
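The area-fill idea – teach the perimeter, let the robot compute the coverage path – can be sketched with a simple back-and-forth (boustrophedon) sweep. This illustrative version assumes a rectangular, obstacle-free area on an integer grid; production planners handle arbitrary polygons and obstacles.

```python
# Toy area-fill planner: take the taught perimeter, derive its bounding
# box, and generate a back-and-forth lawnmower sweep over the interior.

def boustrophedon(perimeter, lane_width=1):
    """perimeter: list of (x, y) integer corner points driven by the
    operator. Returns the sweep path as a list of (x, y) cells."""
    xs = [p[0] for p in perimeter]
    ys = [p[1] for p in perimeter]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    path, y, left_to_right = [], y_min, True
    while y <= y_max:
        row = [(x, y) for x in range(x_min, x_max + 1)]
        path.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right  # alternate sweep direction
        y += lane_width
    return path
```

The lane width would correspond to the cleaning head's effective width, so adjacent passes just overlap.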

“The Brain Compute Module is a kit for OEMs that includes everything down to the wiring harness,” he said. “We work to integrate that into the machines, and we run the pilot manufacturing line, calibration and end-of-line systems, then move that to the OEM for manufacturing and set up their manufacturing process and their quality systems.”

This is based on Ubuntu Linux with the BrainOS security layer, then a full stack with navigation primitives, then a set of capabilities for the user interface (UI) and cloud connections via various networks, including 5G.

“We built our own hardware abstraction layer (HAL) and we take a platform approach to firmware for the sensors and motor drivers so we are hardware agnostic,” he said.

“Latency is critical to the safety certification, so that is all handled with on-board processing,” he said. “Even with 5G, certain operations may run off board, but not safety-critical ones. There is the low-level deterministic logic for safety perception, which is black and white – the machine is either on or off; it stops without having to process high-level logic. At a higher level it cross-checks the sensors, handles movement detection and so on – that’s what allows us to avoid the latency in the processing.”
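The two-layer split described above can be sketched as follows (an illustrative control loop, not Brain Corp's safety implementation): a deterministic low-level check stops the machine the moment anything enters the safety zone, independently of the higher-level planner.

```python
# Sketch of layered safety logic: the low-level check is deterministic
# and runs on board, so a stop never waits on high-level processing.

def low_level_safety(lidar_ranges_m, stop_distance_m=0.5):
    """Black-and-white check: True (stop) if any lidar return is inside
    the safety zone. No classification, no learning."""
    return any(r < stop_distance_m for r in lidar_ranges_m)

def control_step(lidar_ranges_m, planned_speed_mps):
    """One control cycle: the safety layer gates the planner's output."""
    if low_level_safety(lidar_ranges_m):
        return 0.0                 # immediate stop, no high-level logic
    return planned_speed_mps       # planner output passes through
```

Higher-level cross-checking of sensors and movement detection would run alongside this, but never in the stop path, which is what keeps the safety-critical latency bounded.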

Then there is the opportunity for integrating the frameworks with smart buildings.

“The sensor systems are built in, there is an ability for robots to perform these functions with the focus on retail, for inventory scanning, add scanning towers, pricing, look at lights out operation, checking occupancy levels in the office, temperature control, WiFi and LTE coverage – that’s the next wave we see,” said Duffy.

www.braincorp.com
