Apr 23, 2020
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
[Editor’s Note: This op-ed was originally published January 2020 in the German-language publication Elektronik Automotive. The publication and author, Thomas Dannemann, have given us permission to republish with region-based edits.]
Let us start with some everyday examples: Driving across town, you come to a stop sign. A large truck is parked on the corner to your left and a construction fence blocks your view to the right. You cannot see whether cross traffic is approaching, so you inch forward carefully, looking anxiously in both directions. What if your car could tell you when traffic is clear, so that you can proceed safely across the intersection?
Or, suppose that you leave the office to drive to an appointment. Congestion and delays cause a half-hour trip to stretch into more than an hour of precious time in the middle of your day. What if your car were a mobile device on wheels, with the artificial intelligence, natural language understanding and driver monitoring to help you use the time productively and safely?
Or, imagine that you’re an automotive designer, trying to differentiate your brand with cockpit and dashboard features. You build in reconfigurable clusters for driver information, a center navigation display, entertainment displays and a head-up display with augmented reality. You realize that more computing hardware means more battery usage and more heat.
The “what ifs” are closer than they appear
Those scenarios are not so far-fetched. In fact, the automotive industry continues to bring the “what ifs” closer to reality. Every time you get into a new car, more of the mobile experience surrounds you, including multi-gigabit LTE and 5G connectivity, electronic clusters, high resolution touch screen displays, streaming media and high-definition video.
As you watch the gradual convergence of systems for driving, infotainment and road safety, pay attention to four technology factors shaping the automobile of the future:
1. The vehicle communicates with infrastructure, other vehicles and the cloud
As described in the first scenario above, drivers never have all the information they need to drive perfectly safely, because there are too many variables. But cellular vehicle-to-everything (C-V2X) can help eliminate some of those variables.
Vehicles will soon have two transmission modes for communicating with the world around them: uplink, or car-to-cloud, and sidelink, or car-to-car. Uplink and downlink include the kind of 4G LTE and 5G internet connections we’re accustomed to for mobile broadband. Sidelink includes connections in dedicated ITS 5.9 GHz spectrum that allow cars to communicate with one another directly, without the need for the wider cellular network.
C-V2X is designed to connect more than just vehicles, as shown in Figure 1:
- Vehicle-to-vehicle (V2V) — Vehicles use sidelink to apprise one another of their presence; for example, approaching from a blind spot or encroaching on lane position.
- Vehicle-to-infrastructure (V2I) — Traffic lights, speed signs and roadside units (RSUs) use sidelink to send the vehicle local information needed for safe navigation.
- Vehicle-to-pedestrian (V2P) — A sidelink alert informs the driver of the presence of a cyclist approaching from the right.
- Vehicle-to-network (V2N) — Through uplink and downlink connectivity, the vehicle receives cloud-based information about construction, weather conditions and road hazards, along with indications for re-routing.
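As a minimal sketch of the distinction above, the mapping of C-V2X message categories to their typical link might look like this in Python. The names and the mapping are illustrative only, not part of any C-V2X standard API:

```python
from enum import Enum

class Link(Enum):
    SIDELINK = "PC5 (direct, 5.9 GHz ITS band)"
    UPLINK = "Uu (cellular network)"

# Illustrative mapping of message categories to their typical link,
# per the list above; the names here are ours, not a standard's.
LINK_FOR = {
    "V2V": Link.SIDELINK,   # vehicle-to-vehicle presence/maneuver messages
    "V2I": Link.SIDELINK,   # traffic lights, speed signs, roadside units
    "V2P": Link.SIDELINK,   # pedestrian and cyclist alerts
    "V2N": Link.UPLINK,     # cloud-based hazard and re-routing data
}

def choose_link(category: str) -> Link:
    """Return the link a given C-V2X message category typically uses."""
    return LINK_FOR[category]
```

Only V2N rides the wide-area cellular network; the other three categories communicate directly over sidelink.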
For sidelink communication, C-V2X runs on the same wireless modem chipset used by existing telematics units, such as the Qualcomm 9150 C-V2X ASIC. That makes C-V2X a feature that can be readily adopted, given the necessary antenna, processor and other hardware.
The result of all the intercommunication is a kind of cooperative perception and maneuvering in which vehicles accumulate information from their surroundings and share it with one another, greatly increasing mobility and safety, in addition to ushering in a new world of cooperative automation.
2. Big data comes to the car — new opportunities for services in car-to-cloud
Smartphones and computers generate huge amounts of data that online marketplaces and advertising platforms can use. Similarly, cars generate big data that automotive companies can use.
As new vehicle models with cameras and sensors begin circulating, auto manufacturers are expected to have the data to update road condition maps of cities, states, and entire countries in real time. In contrast to today’s crowdsourced maps, manufacturers could build real-time maps that better identify the source of problems (construction, accidents, traffic jams, etc.) and provide more accurate recommendations.
Wireless communication has long connected the manufacturer to the vehicle for maintenance alerts and software updates. By taking the next step and connecting the manufacturer to the vehicle owner, car-to-cloud presents a useful channel for promoting upgrades to existing features, along with entirely new features. Armed with internal data from sensors (such as charge or fuel remaining) and external data from surroundings (such as parking availability), manufacturers are best situated to offer automobile-related services that are tightly integrated into the vehicle systems to present to the driver and passengers.
This enables, for example, a manufacturer with a fleet of ride-hailing cars to build more data and variables into its models. Knowing where its drivers are going, how the cars are performing and what traffic is like, it can better plan for how its drivers can meet passenger demand.
Traditional providers of information about restaurants and hotels gain another screen. The bigger advantage goes to manufacturers, who can now develop the insight to promote offerings tailored to a car in motion. Naturally, manufacturers should follow a couple of today’s guidelines:
- They should honor data privacy around the personally identifiable information (PII) they collect in providing services to car owners and drivers. Data used for swarm intelligence should remain anonymous.
- They should ensure that drivers, in treating their car as the next screen, are able to move between their vehicle and their other devices (smartphone, tablet, PC, television) easily and without interruption.
3. The digital cockpit — much more than a dashboard, radio and speedometer
The analog instrument panel continues its evolution toward a fully digital cockpit, extending eventually to a door-to-door series of screens across the front of the passenger cabin that convey information from downlink and sidelink connections (see Figure 2).
In the short term, the cockpit will continue to deliver information traditionally used by the driver — speed, charge or fuel remaining, engine temperature, oil pressure, engine check. In time, video from external cameras is expected to appear on the display, replacing external mirrors and resulting in lower wind resistance and better aerodynamics. The main instrument cluster may be user-configurable so drivers can see the information they want, where they want to see it. Combining information from outside with real scenes of the car ride will create a new way of driving: augmented reality will enhance the driver’s view by overlaying the feed from a front-facing camera with route guidance and surrounding information on a single screen, either combined in the cluster display or projected onto a head-up display (HUD).
Then, with more progress toward autonomous driving, the car could become an extension of the driver’s digital life, in both business and leisure. Drivers and passengers may grow to expect the same user experiences (i.e., apps, media, assets, and content) in the car that they get on their other mobile devices. Displays in the digital cockpit are expected to feature the kinds of information and communication now associated with smartphones, tablets, and PCs, as well as high-resolution displays to accommodate television content and movies.
Auto manufacturers and third-party providers will likely run increasingly varied content in the digital cockpit. As mentioned above, manufacturers can offer content related to driving and safety because of their proximity to sensor data. When a battery fails or a tail light goes out, a manufacturer is in a better position than an internet company to address the problem. Or consider a manufacturer’s app that pools side sensor data from several vehicle brands to help drivers find available parking spaces downtown.
In a dangerous situation, the digital cockpit should take control of a semi-autonomous vehicle and deliver safety-relevant content to the driver. From an overall design perspective, manufacturers are better suited than internet companies to ensure that level of reliability.
4. The growing role of AI as the auto industry evolves
The automotive industry has found that making vehicles that can drive themselves is not very difficult. However, making vehicles that can drive themselves safely is exceptionally difficult.
At all levels of automation, applications depend on data fused from multiple sensors. The main sensors include a camera, radar, and lidar, which jointly provide data the vehicle uses to make predictions about safe behavior. The holistic perspective shown in Figure 3 supplements the sensors in the technology package with innovations in localization and C-V2X.
The way forward in AI for automobiles lies in applying machine learning techniques to different sensors and tasks. In radar perception, a combination of convolutional neural networks (CNN) and long short-term memory (LSTM) detects and classifies objects, then computes their distance, speed, and direction of travel. Camera perception runs through CNNs and temporal attention-gated models (TAGM) to detect and classify lane markers and interpret the status of nearby brake lights and blinkers.
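As an illustration of the kinematic output of such a pipeline (not the learned CNN/LSTM models themselves), here is a simplified Python sketch that estimates an object's radial speed from successive radar range readings:

```python
def radial_speed(ranges_m, timestamps_s):
    """Estimate an object's radial speed (m/s) from successive radar
    range readings using finite differences. A negative value means
    the object is approaching. This is a simplified kinematic
    stand-in for the learned perception models described above."""
    if len(ranges_m) < 2:
        raise ValueError("need at least two readings")
    samples = list(zip(ranges_m, timestamps_s))
    # Average the per-interval rate of change of range.
    rates = [
        (r1 - r0) / (t1 - t0)
        for (r0, t0), (r1, t1) in zip(samples, samples[1:])
    ]
    return sum(rates) / len(rates)

# A car 50 m ahead, closing at 2 m/s, sampled every 0.1 s:
print(radial_speed([50.0, 49.8, 49.6], [0.0, 0.1, 0.2]))  # about -2.0
```

In a real stack, the neural networks would first associate returns with a tracked object; this function only shows the arithmetic that turns its range history into a speed estimate.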
What does AI look like to drivers and consumers? It spans a wide range of applications for smarter, safer driving and advanced driver assistance systems (ADAS):
- Natural voice recognition: Understanding the driver’s oral commands
- Collision avoidance: Detecting unexpected objects in the road, then alerting the driver or braking
- Lane departure warning: Activating haptic feedback in the steering wheel or displaying messages when the vehicle drifts from its lane
- Driver monitoring: Tracking concentration, sleepiness, distraction, and inattentiveness
- Machine learning (inference): Detecting street signs, road hazards, construction warnings, and oncoming vehicles
- Adaptive cruise control: Maintaining a safe distance at high speeds
- Highway co-pilot: Following traffic, passing slow vehicles, and returning smoothly to a lane
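To make the adaptive cruise control item concrete, here is a toy Python sketch of a constant time-headway controller. The gains and parameter names are illustrative assumptions, not drawn from any production ADAS stack:

```python
def acc_target_speed(own_speed_mps, lead_distance_m, lead_speed_mps,
                     headway_s=2.0, speed_gain=0.5, gap_gain=0.1):
    """Toy adaptive-cruise-control step: nudge our speed toward the
    lead vehicle's speed while holding a constant time-headway gap.
    Gains and headway here are illustrative, not production values."""
    desired_gap_m = headway_s * own_speed_mps        # e.g. 2 s of travel
    gap_error_m = lead_distance_m - desired_gap_m    # positive = too far back
    speed_error = lead_speed_mps - own_speed_mps
    # Proportional correction on both the speed and the gap errors.
    return own_speed_mps + speed_gain * speed_error + gap_gain * gap_error_m
```

For example, at 30 m/s with a slower lead car 50 m ahead, the desired gap is 60 m, so the controller commands a lower speed to fall back while matching the lead vehicle's pace.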
Can a car drive itself the way a human can drive it? AI brings human-like autonomous driving into the realm of the conceivable, with behavior planning based on convolutional-social pooling for spatial interactions and LSTM for temporal patterns. Also, behavior planning based on reinforcement learning techniques aims at human-like driving that balances safety and assertiveness.
The future is speeding toward drivers enjoying greater safety; vehicles with increased display areas showing drivers more information about conditions around them; automotive engineers designing for more processing power with a lower thermal profile; and manufacturers and other companies realizing more opportunities to offer content ranging from vehicle control to infotainment.
To envision the automobile of the future, think of the intelligence, connectivity and data behind a smartphone, tablet, PC, and television on wheels. Now apply those elements in the context of safety. Imagine that vehicle moving through its environment and communicating with other vehicles. Add in all the data in the cloud. Combined, it all makes the journey safer, more convenient, and more enjoyable for the driver and passengers.