Oct 28, 2019
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
Car factories have come a long way since Henry Ford’s assembly lines. Overall, the manufacturing industry has gotten smarter, safer, and more efficient. And now, more than 100 years after Ford introduced the moving assembly line, factory operations are being transformed again with the integration of artificial intelligence (AI).
Whether it’s your next international trip, your pursuit of wellness, or your everyday responsibilities on the line, on-device AI can make user experiences more intuitive and productive. That’s because it can process massive amounts of data directly on the device, enhancing your experiences without needing to send data to the cloud. By processing AI workloads on the device, platforms can preserve privacy, reduce latency, and increase reliability while saving power and network bandwidth.
Imagine how this technology could be used within an auto factory: A smart security camera could provide you with secure access to the parking lot. As you begin the day’s work, your augmented reality (AR) glasses could overlay guided instructions on your field of vision at your workstation. Your factory’s automated guided vehicles (AGVs) could help you move car parts. And robotic arms powered by on-device AI could remove faulty parts from the assembly line. These are just a few ways on-device AI could enhance today’s factory.
When you arrive at work, you pull up to your factory’s security gate and an AI-powered enterprise security camera scans your face and your car’s license plate, confirming your identity and providing you access to the grounds. You enter quickly and securely.
How it works: Your factory’s security camera could be powered by the Qualcomm Vision Intelligence 300 Platform, a reference design created for IoT applications like surveillance for infrastructure and industry. The platform is based on the Qualcomm QCS603 system-on-chip, one of the first Qualcomm Technologies SoCs designed specifically for the IoT.
The Qualcomm Vision Intelligence 300 Platform uses deep learning and data collected from the camera to recognize faces and objects, such as license plates. Through on-device image processing and AI technology, the platform can analyze the camera data with virtually no lag and without consuming excessive amounts of power. This means it can successfully match your face and plate with those in its system almost instantly, so you’ll be able to head into work securely and on time.
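To make the idea concrete, the gate logic can be sketched as an embedding comparison: a deep learning model on the camera reduces each face to a feature vector, and access is granted only when both the face and the license plate match an enrolled record, all without a cloud round trip. Everything below — the employee IDs, plate strings, four-element embeddings, and the 0.95 threshold — is illustrative, not the platform’s actual pipeline.

```python
import math

# Hypothetical gallery of enrolled employees. In practice the face
# embeddings would come from an on-device deep learning model and have
# hundreds of dimensions; four values are used here for readability.
ENROLLED = {
    "employee_042": {"face": [0.12, 0.85, 0.31, 0.44], "plate": "7ABC123"},
    "employee_117": {"face": [0.91, 0.05, 0.40, 0.22], "plate": "4XYZ987"},
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def grant_access(face_embedding, plate_text, threshold=0.95):
    """Open the gate only if BOTH the face and the plate match one record."""
    for emp_id, record in ENROLLED.items():
        if (cosine_similarity(face_embedding, record["face"]) >= threshold
                and plate_text == record["plate"]):
            return emp_id  # access granted
    return None  # access denied

# A probe embedding very close to employee_042's enrolled vector:
print(grant_access([0.13, 0.84, 0.30, 0.45], "7ABC123"))  # employee_042
```

Requiring both factors to match the same record is what keeps a borrowed car, or a lookalike face, from opening the gate on its own.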
After making your morning coffee and greeting your coworkers, you head to your workstation, put on your AR glasses, and start installing a car’s tail lights into a rear fender. Thanks to the glasses’ embedded technologies, you’re able to use AR as a guide. Instructions are overlaid onto your view of the fender in real time, noting where you’ll need to torque down certain parts and double-check placement.
How it works: As you work on the car’s rear fender, the AR glasses you’re wearing could capture data from their stereo cameras, a depth camera, gyroscope, and accelerometer. The data is then sent to the glasses’ integrated Qualcomm Snapdragon XR1 Platform, which processes the information on the device. Using on-device AI, 6-DOF head pose generation, gesture recognition, and simultaneous localization and mapping (SLAM), the platform can accurately recognize the specific parts you’re working on, understand the environment, and realistically superimpose any relevant instructions over the scene in real time.
Because the Snapdragon XR1 Platform efficiently processes AR and VR workloads on the device, it can render and track objects with low latency and without consuming too much power. It can also enable true-to-life and intuitive AR. This means the AR glasses are aware of your surroundings and field of vision, so the digital instructions can adapt to your position as you move.
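The overlay step boils down to geometry: given the tracked head pose, project each instruction’s 3D anchor point into the wearer’s current view. The sketch below assumes a simple pinhole camera model and, for brevity, collapses the full 6-DOF pose to position plus yaw; the focal length and image center are placeholder values, not XR1 specifics.

```python
import math

def project_point(point_world, head_pose, focal=500.0, cx=320.0, cy=240.0):
    """Project a 3D anchor into pixel coordinates for the current view.

    head_pose is (x, y, z, yaw) — a simplified subset of the full 6-DOF
    pose, with yaw in radians about the vertical axis.
    """
    px, py, pz, yaw = head_pose
    # Translate the anchor into head-relative coordinates.
    dx = point_world[0] - px
    dy = point_world[1] - py
    dz = point_world[2] - pz
    # Rotate by -yaw so the camera looks down its own z-axis.
    cam_x = math.cos(-yaw) * dx - math.sin(-yaw) * dz
    cam_z = math.sin(-yaw) * dx + math.cos(-yaw) * dz
    cam_y = dy
    if cam_z <= 0:
        return None  # anchor is behind the wearer, nothing to draw
    # Pinhole projection to pixel coordinates.
    u = cx + focal * cam_x / cam_z
    v = cy - focal * cam_y / cam_z
    return (u, v)

# An anchor 2 m straight ahead lands at the image center:
print(project_point((0.0, 0.0, 2.0), (0.0, 0.0, 0.0, 0.0)))  # (320.0, 240.0)
```

When the tracker reports a fresh pose each frame, re-projecting the same world-space anchor is what keeps the instruction pinned to the fender as you move around it.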
In addition, AI could be used for predictive machine maintenance, so the factory line is virtually never stalled. Factories could collect massive amounts of sensor data about their machines and the machines’ health, and develop AI models that accurately predict when a machine is close to failing. This early warning could alert your AR glasses, which could then guide you through diagnosing the fault and performing maintenance. For complicated maintenance procedures, an expert in a remote location could connect with your AR glasses, see through your field of vision, and walk you through fixing the machine.
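One simple way to turn that sensor data into an early warning is a rolling statistical check: flag any reading that deviates sharply from its recent baseline. A production model would be far more sophisticated and learned from historical failure data; the window size, z-score threshold, and vibration values here are purely illustrative.

```python
from statistics import mean, stdev

def maintenance_alerts(readings, window=5, z_threshold=3.0):
    """Flag indices where a reading deviates sharply from the trailing window."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu = mean(baseline)
        sigma = stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            alerts.append(i)
    return alerts

# Made-up vibration readings from a machine's health sensor:
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 1.1, 5.0, 1.0]
print(maintenance_alerts(vibration))  # [7]
```

Index 7 is flagged because the 5.0 reading sits dozens of standard deviations above its trailing window — exactly the kind of spike that could push an alert to your AR glasses before the machine actually stalls.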
While you’re working, an AGV delivers ignition coil connectors to the interior department of the factory. It “sees” any obstacles in its path and finds its way around them. It also notes the location of other AGVs to avoid any traffic congestion. If the AGV needs help, it can even send an alert to your AR glasses.
How it works: Your factory’s AGVs could act as autonomous vehicles, using on-device AI, the factory’s on-premise compute, cellular vehicle-to-everything (C-V2X), and data captured from their camera and radar sensors to transport supplies around the facility. If an object is in their way, the AGVs could detect it using computer vision. They could also use machine learning to distinguish between the different sections of the factory and create a precise map of their environment.
With a 5G NR C-V2X connection, the autonomous vehicles could even communicate with each other to avoid potential collisions; the AGVs could share rich sensor data with each other, so not only will you know where they are on the factory floor, but they’ll also know each other’s locations. They could also use this connection to communicate with you, sending notifications when problems arise.
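Once the AGVs share position and velocity over that link, each one can extrapolate the fleet’s trajectories a few seconds ahead and flag pairs on a collision course. This is a deliberately simplified sketch — straight-line motion, a fixed time horizon, and made-up fleet data — not the C-V2X protocol itself.

```python
def predict_position(agv, t):
    """Extrapolate an AGV's position t seconds ahead (straight-line motion)."""
    x, y = agv["pos"]
    vx, vy = agv["vel"]
    return (x + vx * t, y + vy * t)

def collision_warnings(agvs, horizon=5.0, step=0.5, min_gap=2.0):
    """Return pairs of AGV ids predicted to pass within min_gap meters."""
    warnings = set()
    ids = list(agvs)
    t = 0.0
    while t <= horizon:
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                xa, ya = predict_position(agvs[ids[i]], t)
                xb, yb = predict_position(agvs[ids[j]], t)
                if ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 < min_gap:
                    warnings.add((ids[i], ids[j]))
        t += step
    return warnings

# Hypothetical fleet: positions in meters, velocities in meters/second.
fleet = {
    "agv_1": {"pos": (0.0, 0.0), "vel": (1.0, 0.0)},    # heading east
    "agv_2": {"pos": (10.0, 0.0), "vel": (-1.0, 0.0)},  # heading west
    "agv_3": {"pos": (0.0, 50.0), "vel": (0.0, 0.0)},   # parked, far away
}
print(collision_warnings(fleet))  # {('agv_1', 'agv_2')}
```

Because the two oncoming AGVs are predicted to close within 2 meters of each other inside the 5-second horizon, both would be warned early enough to reroute or yield.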
On the factory floor, you work alongside autonomous robotic arms. Because of their reliability, they’re counted on for a variety of tasks, including identifying defective parts and pulling them off the line so that they’re not installed in the cars. The AI technology integrated into their software also helps the robotic arms learn from past mistakes, increasing the factory’s efficiency.
How it works: With the help of on-device AI, robots in the factory could perform meticulous operations that are programmed into their software and course-correct when conditions change. To make this work, an AI model could be developed for the task. Data could then be collected from the robot’s sensors, such as integrated camera sensors, and fed into the model for analysis. Data from successful outcomes could also be fed back as labeled examples for supervised training.
For this specific task, the robot could swiftly identify particular car components and determine each component’s function. If it spots any wrong or defective auto parts on the production line, it could pull them out. As the robots continue to work in the factory, their on-device AI capabilities would improve over time through distributed learning until your factory runs, well, like a well-oiled machine.
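As a toy stand-in for that supervised training loop, the sketch below fits a nearest-centroid classifier: labeled examples of good and defective parts are averaged into per-class centroids, and each new part is assigned to the closer class. The two-value feature vectors (say, surface roughness and color deviation) are invented for illustration — a real system would learn from camera images with a deep network.

```python
def train_centroids(labeled_samples):
    """Average the feature vectors for each label ('ok' vs. 'defective')."""
    sums, counts = {}, {}
    for features, label in labeled_samples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
            counts[label] = 0
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    return {label: [s / counts[label] for s in sums[label]] for label in sums}

def classify(features, centroids):
    """Assign a part to the class whose centroid is nearest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Illustrative training data: (surface_roughness, color_deviation) pairs.
training = [
    ([0.10, 0.20], "ok"), ([0.15, 0.10], "ok"),
    ([0.90, 0.80], "defective"), ([0.85, 0.95], "defective"),
]
centroids = train_centroids(training)
print(classify([0.12, 0.18], centroids))  # ok
```

Each newly confirmed outcome can be appended to the training set and the centroids re-averaged — a crude version of the robots improving over time as they work.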
As your work day winds down, you check to see whether you and your coworkers reached the day’s quota. Thanks to the on-device AI that powers the devices and machines in your factory, you managed to stay productive and exceed your company’s expectations.
Qualcomm Technologies engineers AI solutions that can help create entirely new industries, business models, and experiences. With on-device AI, we’re working to create smarter industrial applications that push the boundaries of what’s possible for you and your machine counterparts.