How BMW and Malong used edge AI and machine learning to streamline warehouse and checkout systems – VentureBeat

  • Lauren
  • July 18, 2020

Watch all the Transform 2020 sessions on-demand right here.

During a panel today at VentureBeat’s Transform 2020 conference, speakers including BMW Group’s Jimmy Nassif, Red Hat’s Jered Floyd, and Malong CEO Matt Scott discussed the challenges and opportunities in AI with respect to edge computing and IoT. While each came from a different perspective — Nassif from logistics robotics, Floyd from open source infrastructure, and Scott from retail computer vision — all three agreed that AI has the potential to accelerate existing work while enabling entirely new capabilities.
BMW produces a car every 56 seconds, Nassif says. Millions of parts flow into the automaker’s factories from over 4,500 suppliers, involving 203,000 unique part numbers, which translates to about 100 end-customer options. (99% of orders are completely unique.) As BMW’s car sales doubled over the past decade to 2.5 million in 2019, this created a logistics dilemma — one that was solved in part by Nvidia’s Isaac, Jetson AGX Xavier, and DGX platforms. Nassif says BMW is tapping them to develop five navigation and manipulation robots that transport materials around warehouses and organize individual parts.
“The most important thing we do is bring cars to our customers in the cheapest way we can do it, but with better quality. We need to implement new technology like AI and robotics in order to … support our people on the production line to do their job easier and faster to produce more cars,” Nassif said, adding that BMW hopes to have the five manufacturing robots in production by the end of 2021. “I won’t say it’s easy to convince leadership to adopt these technologies, but it isn’t difficult. They understand the importance of it.”
BMW’s robots — two of which have already been deployed in four factories in Germany — perceive the world around them using computer vision techniques including pose estimation. Courtesy of algorithms trained on both real and synthetic data, they’re able to recognize specific parts as well as people and potential obstacles (even occluded obstacles) in a range of challenging lighting conditions. To bolster accuracy, the algorithms are continually retrained in Nvidia’s Omniverse simulator, which BMW engineers from around the world can log into remotely.
Malong applies machine learning to a different problem: recognizing products at retail self-checkouts. Cameras overhead feed footage of objects on scanning beds to algorithms that spot accidental or intentional mis-scans. Malong’s technology looks for common issues like occluded barcodes and products left in shopping carts as well as “ticket switching,” where a product is scanned with a cheaper barcode lifted from another, dissimilar product.
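The core of a check like this is a comparison between what the camera believes is on the scanning bed and what the barcode claims it is. The sketch below illustrates that idea only — the catalog, function names, and confidence threshold are illustrative assumptions, not Malong’s actual implementation.

```python
# Illustrative sketch of a ticket-switching check: flag a scan when the
# camera's product prediction contradicts the scanned barcode with high
# confidence. Catalog, names, and threshold are hypothetical.

# Hypothetical product lookup keyed by barcode.
CATALOG = {
    "012345678905": "instant-coffee",
    "098765432109": "premium-whiskey",
}

def flag_mis_scan(scanned_barcode: str, camera_prediction: str,
                  confidence: float, threshold: float = 0.9) -> bool:
    """Return True when the scan should be escalated to a staff member."""
    expected = CATALOG.get(scanned_barcode)
    if expected is None:
        return True  # unknown barcode: escalate rather than guess
    return confidence >= threshold and camera_prediction != expected

# A bottle of whiskey scanned with a coffee barcode is flagged;
# a correctly scanned jar of coffee is not.
print(flag_mis_scan("012345678905", "premium-whiskey", 0.97))
print(flag_mis_scan("012345678905", "instant-coffee", 0.97))
```

In a real deployment the "camera prediction" would come from a vision model running on the in-store hardware, and the threshold would trade off missed thefts against false alarms that annoy honest customers.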
Malong’s algorithms — which run on on-premises Nvidia hardware — are trained using weakly supervised learning, enabling them to learn to distinguish among products from noisy, limited, and imprecise signals in video feeds. Once they detect an issue, the company’s platform alerts a staff member, who confronts the offending customer.
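The intuition behind weak supervision is that a model can still learn the right class boundaries even when a fraction of its training labels are wrong, because the correct labels dominate each class’s statistics. A minimal toy sketch (using synthetic 2-D data and a nearest-centroid classifier, nothing like Malong’s production models) makes the point:

```python
# Toy demonstration of learning from noisy labels: fit a nearest-centroid
# classifier on labels where 20% have been randomly flipped, then measure
# accuracy against the clean labels. Data and model are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated synthetic "product" clusters in a 2-D feature space.
n = 200
X = np.vstack([rng.normal(0, 1, (n, 2)), rng.normal(4, 1, (n, 2))])
y_true = np.array([0] * n + [1] * n)

# Weak supervision: flip 20% of the training labels at random.
y_noisy = y_true.copy()
flip = rng.random(2 * n) < 0.2
y_noisy[flip] = 1 - y_noisy[flip]

# Fit centroids on the *noisy* labels, predict by nearest centroid.
centroids = np.array([X[y_noisy == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - centroids) ** 2).sum(-1), axis=1)

accuracy = (pred == y_true).mean()
print(f"accuracy vs. clean labels: {accuracy:.2f}")  # well above chance
```

Despite one in five labels being wrong, the recovered centroids land close to the true cluster centers, so accuracy stays high — the same principle that lets production systems learn from imperfect video-derived signals at scale.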
Edge computing comes into play here because of the privacy implications of storing closed-circuit footage in the cloud, Scott says. But edge computing also makes Malong’s platform highly scalable and cost-effective, such that it’s able to span thousands of stores without the latency that might be introduced by server-side processing.
“Making an AI system scalable is very different from making it run,” Scott said. “That’s sometimes a mirage that happens when people are starting to play with these technologies.”
Regardless of the use case, Floyd emphasized the importance of open platforms with respect to edge computing and AI. Popular frameworks and programming notebooks like TensorFlow and Jupyter are open source, he noted, as are container orchestration systems like Kubernetes. “With open source, everyone can bring their best technologies forward. Everyone can come with the technologies they want to integrate and be able to immediately plug them into this enormous ecosystem of AI components and rapidly connect them to applications,” he said.
Red Hat facilitates this collaboration with Open Data Hub, a platform-agnostic blueprint of tools for deploying an end-to-end AI platform. It’s the foundation of the company’s own data science software development stack, and it’s designed to help engineers prototype AI solutions without incurring high costs or having to master every modern machine learning workflow.
“This allows rapid innovation of new applications and new technologies,” Floyd said.