Jan 25, 2018
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
The start of a new year is often a time to look ahead, set resolutions, and make bold predictions. In that New Year’s spirit, we want to share our vision of how technology will play into the future world of ambient computing, in the hopes of putting our Qualcomm Developer Network (QDN) community ahead of the curve. On that same note, I want to give you an insider’s view of what we shared at our Qualcomm Snapdragon Technology Summit last month, with a glimpse into how we see that future taking shape.
What is ambient computing?
A truly ambient world isn’t here yet, but it’s coming. As our devices get smarter and offer more intuitive ways to interact, they will become less intrusive, facilitating a world of powerful computing all around us. We might not see it, but we will experience it. A user won’t think about the technology being there; they will just interact with it naturally. That’s the promise of ambient computing.
What is the experience? How will we interact?
As developers, we know our users don’t want to compromise on their experiences with our applications. As the last decade has shown, technology gains traction quickly only if the user interface and experience are intuitive. The keyboard is not a natural mode of interaction, and we have witnessed how touch screens have allowed more users, young and old, to engage with technology. Ambient computing will take this a step further, moving from screens everywhere to no screens at all, while we still engage with our cars, appliances, thermostats, and other Internet of Things (IoT) devices through more subtle interactions.
In ambient computing, context and human-centered design are essential, and they require knowing what a user wants to achieve, who they are, and where they are. Speech will be a means of interaction, but so will expression and body language, along with activation by presence and by sensors of all kinds, from body chemistry to temperature and motion detection. This kind of sensory recognition enables everything from simple tasks, like turning on the lights when we come home, reminding us of appointments, and recommending activities based on our location and how we are feeling at that moment, to silently monitoring our blood sugar, all through seamless notifications.
How can you get ahead in developing for ambient computing?
It’s already started, or at least most of the building blocks of ambient computing are already here, including extended reality (XR) with virtual and augmented reality, AI and machine learning, edge computing, robotics, embedded computing and IoT, digital health, and more.
In the next phase of the ambient computing journey, mundane, repeatable tasks will be handled by our bots and virtual assistants. Devices are already starting to fade into the background with the advent of voice-driven personal assistants aided by artificial intelligence, natural language processing (NLP), and voice recognition. Wireless head-up displays (HUDs) now provide stunning immersive experiences with six degrees of freedom (6DoF). Smart cameras and computer vision are being built into an amazing array of devices. IoT technology and wearables are being seamlessly integrated with connected devices.
It is still early days for truly ambient applications and experiences, but we encourage you to keep learning and experimenting with these new technical building blocks. Try combining technologies and see how they can merge into a seamless, connected interface.
QDN is excited to support your journey with several tools to help you innovate in ambient computing. Do you have an example of ambient computing? Let us know as we would love to showcase it on our blog.
Happy 2018 everyone!