May 4, 2015
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
We all know our smartphones do lots of useful things for us—like keeping our friends and family up to date on activities, collaborating with co-workers, managing our time, mobile shopping, and keeping up with news that’s relevant to us.
We can also use our phones to learn a new language, play Sudoku, capture a memory, watch videos, graph a Lissajous curve, share thoughts, or, one of my favorites, find nearby sushi. The catch, of course, is that doing a lot with your device often still takes a lot of doing, since we still need to configure, update, and manage the device.
You’d think that greater connectivity and computing capability would make our devices so much smarter, simplifying our lives and allowing us to worry only about important things. Instead, as our devices have become more integrated into our lives and more connected with one another, they run the risk of overloading us with the tedium of administrative tasks. For the smartphone to be truly smart, it needs to understand us, anticipate our needs, and respond accordingly. The word for that might be intuition, and Qualcomm is developing cognitive technologies designed to create new intuitive experiences and natural interactions with our devices.
Cognitive technologies require more than just a more powerful mobile processor. They involve gathering contextual cues and processing them with learning networks, and in some cases with algorithms inspired by the human brain.
Over the years, each advancement made to mobile processors, user interfaces, and cameras—each additional sensor and every updated algorithm—has paved the way for the next, enabling unexpected applications and new types of mobile experiences. Today, for example, the same innovations that let your device’s camera recognize faces will also help autonomous cars, robots, UAVs, and other smart devices to perceive and engage with their surroundings like never before. Innovation is a two-way street. Updated cameras, sensors, and visual processing mean robots and our smartphones get “smarter” about how they perceive the world.
Fully realized cognitive technologies—machine learning, computer vision, always-on sensing, intelligent connectivity, indoor/outdoor position location, and contextual awareness at low power—will allow mobile devices to sense their environments and respond accordingly. Eventually, they’ll use that information to improve our everyday lives with intuitive interactions that personalize our mobile experiences and extend our abilities.
Imagine if your device, ever the self-starter, automatically switched to silent mode in meetings, began taking notes, knew whether you were on time or late, and used always-on listening to display information relevant to the topic at hand. Or if, while you’re traveling, it took into account what’s actually happening to your schedule right now and offered a quick prompt to adjust your reservations, taxi, and so on to keep you going. And wouldn’t it be nice to go straight to your hotel room and confidently, securely unlock the door, skipping the usual registration process if you prefer? Qualcomm is working to enable exactly that.
Scenarios like these aren’t as far away as you might think. The first wave of on-device intelligence is just around the corner. In fact, the next generation of devices built around Qualcomm Snapdragon processors will be optimized to support the cognitive capabilities of the Qualcomm Zeroth platform, just announced in March at Mobile World Congress.
To see an example of our work in “brain-inspired” computing and deep learning, check out the Snapdragon Rover, a robot that can be taught to identify and sort objects based on guided instruction (AKA “deep learning”), rather than detailed programming.
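The key idea behind learning from guided instruction, as opposed to detailed programming, is that the machine infers its own rule from labeled examples rather than having a human write one. A minimal, purely illustrative sketch of that idea is a simple perceptron trained to sort objects; the features here (size and roundness on a 0–1 scale) and labels are hypothetical stand-ins, not anything from the Zeroth platform itself:

```python
# Illustrative sketch: "teach" a classifier with labeled examples
# instead of hand-writing sorting rules. Hypothetical features: each
# object is described by (size, roundness), both on a 0-1 scale.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights from (features, label) pairs; label is +1 or -1."""
    w = [0.0] * (len(examples[0][0]) + 1)  # weights plus a bias term
    for _ in range(epochs):
        for features, label in examples:
            x = [1.0] + list(features)  # prepend constant bias input
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
            if pred != label:           # update weights only on mistakes
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
    return w

def classify(w, features):
    x = [1.0] + list(features)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1

# "Guided instruction": show examples of balls (+1) and blocks (-1).
examples = [
    ((0.9, 0.9), +1), ((0.8, 1.0), +1),   # round objects -> ball
    ((0.2, 0.1), -1), ((0.3, 0.0), -1),   # angular objects -> block
]
w = train_perceptron(examples)
print(classify(w, (0.85, 0.95)))  # a new round-ish object -> 1 (ball)
```

Real deep-learning systems replace these two hand-picked features and single linear unit with many layers of learned features, but the training loop embodies the same principle: the behavior comes from examples, not from explicit rules.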
Machines that are able to see, hear, understand, and appropriately assist us will be much more useful in our daily lives than those we’re using now. For example, picture autonomous and semi-autonomous vehicles in constant communication with each other. We envision a day when autonomous vehicles will help reduce traffic and emissions by driving more efficiently, and avoid potential accidents that human drivers would never have seen coming. In fact, the potential to dramatically improve transportation safety is very real, since the majority of accidents today result from driver error.
Or imagine firefighters barreling through city streets toward a blazing building as autonomous UAVs are dispatched to quickly map the building’s interior, locate people trapped inside, and guide them to the safest area to wait. This sort of assistance would be invaluable to first responders, and could save lives. In fact, drone projects already under way show these types of intelligent devices helping to improve safety.
Whether in phones, vehicles, medical devices, or advanced robotics, cognitive technologies will help our devices expand our human abilities, adapt to us (and our environment), and interact with us in more human-like ways. By understanding our individual routines and behaviors, our technology can begin to take anticipatory actions on our behalf. The goal is to streamline the steps it takes to complete any given task—by putting cognitive (intuitive) technologies to work for us.
In a world where devices perceive their environment, hear, and respond for themselves, your everyday life will be simplified and enriched.
Find out more at Why Wait.