Not since Henry Ford’s Model T revolutionized transportation has the auto industry seen so much change. What’s driving this revolution? Technology, certainly. But more than that, it’s a growing need for safety, efficiency and convenience.
Shared mobility services (ride-sharing companies like Uber, for example) are already in use and changing the definition of car ownership. And the race toward automated driving is heating up, with trials already underway in the U.S., U.K. and multiple Asian countries. At the same time, innovation continues to transform the car as we know it.
Simply put: Cars are getting smarter. They’re becoming hubs of intelligent sensors that constantly generate and exchange information with the cloud. Every car will be outfitted with hundreds of sensors, from cabin and outward-facing cameras to radar, LIDAR (light detection and ranging), V2X (vehicle-to-everything) connectivity, and GNSS (global navigation satellite system) technology. These sensors can generate enormous amounts of data, projected to reach up to 11 petabytes per year by 2020.
This transformation will not affect just the automotive industry. Everything from a city’s infrastructure to the lives of its inhabitants can change, and smarter transportation systems can play a key role in the evolution of our cities into sustainable, efficient, convenient and cooperative places to live.
Sensing our surroundings
Safer, more efficient driving isn’t just about who’s driving (or isn’t). What’s happening around us is important as well. Today, Waze users share real-time traffic and road conditions, including construction delays and accidents, to assist other drivers.
As connected cars get smarter and learn to see and understand traffic and road conditions, massive numbers of intelligent sensing vehicles will be able to provide real-time road condition updates on their own, generating rich, high-quality maps. This will be key for autonomous driving: driverless systems will require high-definition maps that are far more precise, more frequently updated and richer than what we have today. Instead of relying on a dedicated fleet of mapping cars, as is done today, ordinary vehicles can use an array of sensors, cameras, high-precision positioning and on-device machine learning to create highly accurate and detailed maps. Connected cars will then become live sensors that can deliver alerts about hazardous conditions and detect traffic warnings, lane markers, cracks or potholes in the road, and construction zones, building a detailed map of the road that is updated in real time. The cars can even highlight areas for infrastructure improvement, such as a particularly dangerous intersection or an area prone to accidents.
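To make the crowdsourcing idea concrete, here is a minimal sketch of how hazard reports from many vehicles might be aggregated into confirmed map updates. The grid size, confirmation threshold, and report format are illustrative assumptions, not a description of any actual pipeline:

```python
from collections import defaultdict

GRID = 0.001  # roughly 100 m grid cells; an assumed, illustrative resolution


def cell(lat, lon):
    """Snap a coordinate to a coarse grid cell so nearby reports group together."""
    return (round(lat / GRID), round(lon / GRID))


def confirmed_hazards(reports, min_vehicles=3):
    """Keep only hazards reported by several independent vehicles.

    Each report is (vehicle_id, lat, lon, hazard_type). Requiring multiple
    distinct reporters filters out one-off sensor noise before the shared
    map is updated.
    """
    reporters = defaultdict(set)
    for vehicle_id, lat, lon, hazard in reports:
        reporters[(cell(lat, lon), hazard)].add(vehicle_id)
    return {key: len(v) for key, v in reporters.items() if len(v) >= min_vehicles}


reports = [
    ("car-1", 37.4220, -122.0841, "pothole"),
    ("car-2", 37.4221, -122.0842, "pothole"),
    ("car-3", 37.4219, -122.0840, "pothole"),
    ("car-4", 40.7128, -74.0060, "debris"),  # only one reporter: not confirmed
]
```

A real system would also age out stale reports (a pothole gets repaired) and weight reports by each sensor's confidence.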
Safer driving through data
The key to building a safer car is understanding drivers’ behavior — how they react to varied road and driving conditions, for instance. This information is vital for everyone from fleet management and shared mobility companies to insurance firms and makers of autonomous cars.
For example, cameras will be able to detect whether a driver is tailgating or seems distracted, and this data could then be passed through the network, or directly to other vehicles on the road via V2X, to caution those in the vicinity. In addition, shared mobility companies can improve driver safety, lower costs and minimize risk exposure by getting a comprehensive view of their fleets’ driving behavior. This information, in turn, would allow for more thorough risk assessment.
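As a rough sketch of how a V2X caution might be scoped to nearby vehicles, consider the following. The message shape and radius are hypothetical; real V2X message sets (e.g., SAE J2735) are standardized and far richer than this:

```python
import math
from dataclasses import dataclass


@dataclass
class HazardAlert:
    """Hypothetical alert message; not an actual V2X wire format."""
    lat: float
    lon: float
    kind: str  # e.g., "tailgating", "distracted_driver"


def nearby_vehicles(alert, vehicles, radius_m=300.0):
    """Select vehicles close enough to the hazard to warrant a caution.

    vehicles is a list of (vehicle_id, lat, lon) tuples.
    """
    selected = []
    for vid, lat, lon in vehicles:
        # Equirectangular approximation: adequate over a few hundred meters.
        dx = math.radians(lon - alert.lon) * math.cos(math.radians(alert.lat))
        dy = math.radians(lat - alert.lat)
        dist = 6371000.0 * math.hypot(dx, dy)  # Earth radius in meters
        if dist <= radius_m:
            selected.append(vid)
    return selected
```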
Collecting this data and understanding the minds of drivers is also becoming increasingly important when it comes to autonomous driving. One of the best examples is what’s called the “yield problem.” Today, autonomous car prototypes might wait for long intervals at intersections before finally making a left turn on yield. Monitoring and observing how human drivers handle such situations will be very important. Future autonomous cars would benefit from what is learned from human driver data, and that behavior could even be personalized for individual car owners.
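One way learning from human drivers could inform the yield problem is to estimate, from logged observations, how small a traffic gap humans routinely accept when turning. The data format and quantile choice below are purely illustrative assumptions:

```python
def accepted_gap_threshold(observations, quantile=0.15):
    """Estimate a gap (in seconds) that human drivers accept when yielding.

    observations is a list of (gap_seconds, accepted) pairs logged from
    human driving; `accepted` is True when the driver took that gap.
    Returns a low quantile of the accepted gaps, or None if no data.
    """
    accepted = sorted(gap for gap, took_it in observations if took_it)
    if not accepted:
        return None
    index = int(quantile * (len(accepted) - 1))
    return accepted[index]
```

An autonomous planner could compare available gaps against such a threshold, rather than waiting indefinitely for a perfectly clear intersection.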
Introducing Qualcomm Drive Data Platform
As we embark on a new era in automotive history, the “whys” and “whats” are fairly clear. What we have yet to fully grasp is the “how” — until now. To help get us to this connected and autonomous future, we’re introducing the Qualcomm Drive Data Platform. The platform is designed to intelligently collect and analyze information from a vehicle’s sensors. This means these smart cars can determine their location down to lane-level accuracy, monitor and learn driving patterns, perceive their surroundings, and share this reliable and accurate data with the rest of the world.
The Qualcomm Drive Data Platform is built on three pillars: heterogeneous connectivity, precise positioning and on-device machine learning, all integrated into the Qualcomm Snapdragon solution.
The platform uses a Qualcomm Snapdragon 820Am automotive processor. With an optional integrated X12 LTE modem capable of up to Cat 12 speeds, the Snapdragon 820Am is designed to deliver the next level of intelligence and mobile connectivity (e.g., 802.11p, Wi-Fi and Bluetooth LE).
The platform is designed to fuse data from the vehicle’s camera feed with inertial sensors and navigation data, and is optimized for precise positioning down to lane-level accuracy even in challenging environments such as urban canyons. Additionally, Qualcomm Technologies is offering a deep-learning software development kit (SDK). The SDK, called the Qualcomm Snapdragon Neural Processing Engine (SNPE), is engineered to use the Snapdragon processor’s heterogeneous compute capabilities to provide automakers a powerful, energy-efficient platform for delivering the next level of automotive intelligence.
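The core idea behind fusing inertial data with absolute position fixes can be sketched with a simple one-dimensional complementary filter. This is a teaching sketch under assumed inputs, not the platform's actual fusion algorithm, which would combine camera, inertial, and GNSS data in far more sophisticated ways:

```python
def fuse_track(imu_steps, gnss_fixes, alpha=0.2):
    """Fuse dead-reckoned motion with periodic absolute fixes (1-D sketch).

    imu_steps:  per-tick displacement from inertial integration (drifts
                over time).
    gnss_fixes: absolute position at matching ticks, or None when no fix
                is available (e.g., in an urban canyon).
    alpha:      how strongly each absolute fix corrects the estimate.
    """
    pos = 0.0
    track = []
    for step, fix in zip(imu_steps, gnss_fixes):
        pos += step  # propagate with inertial data every tick
        if fix is not None:
            # Pull the drifting estimate toward the absolute fix.
            pos = alpha * fix + (1.0 - alpha) * pos
        track.append(pos)
    return track
```

The key property: inertial data keeps the estimate smooth and available between fixes, while each GNSS fix bounds the accumulated drift.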
We’ll be back with more technical details on the Qualcomm Drive Data Platform. In the meantime, come see our demo at CES 2017, in North Hall, booth #5609, showing map crowdsourcing and critical safety alert use cases. We are also showcasing the Qualcomm Drive Data Platform, in partnership with Nauto, for fleet-based use cases.