OnQ Blog

The personification of things [VIDEO]

August 19, 2014

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

Makers of high-tech gear and providers of online services have been touting personalization for more than a decade. You can set up your computer or smartphone just the way you want it. The online shopping sites you frequent remember what you looked at and what you bought, and make suggestions about things you might be interested in. These are basic examples of what Stan Davis, in his seminal 1987 work, Future Perfect, termed “mass customization.” Nearly 30 years on, it is axiomatic that user experiences are enhanced when they are tailored to the preferences of the individual. In the age of the Internet of Everything, this concept will gain new impetus through what I call "the personification of things."

We’ve already seen one-offs. Remember Furbies? Using sensors, these toys could detect people and each other and, over time, “learn” real languages. This led to some amusing scenarios. A friend of mine bought Furbies for his kids back when they were all the rage. After the novelty wore off, the Furbies were banished to the closet where, occasionally, they would wake up and riff off each other, conducting entire nonsensical conversations locked away from any human interaction. Today, we have Siri and Google Now providing iPhone and Android phone users, respectively, with a voice interface that has a “personality.” What happens when everything can talk? We’re going to find out, because in the connected smart home in particular, any connected device can have a voice, delivered through stereo speakers or other devices in the home with speakers (like the TV)... and maybe even a personality.

And just as Furbies interact directly with each other and with the kids playing with them, in the proximal networks of our connected smart homes, things could talk (silently) directly to each other, with no round trip to the cloud necessary, and audibly to us. My friend with the Furbies in his closet has a freezer in his garage. He has found a design flaw: grab something quickly out of this door-on-the-top freezer, slam the door back down, and what do you find? Hours later, a trip into the garage reveals the door standing wide open. Turns out the door bounces. Even if the door had some sort of beeping alarm, it wouldn’t sound until he was well away from the garage. No help there. In the connected smart home, the freezer would notify his TV, his smartphone, and his smartwatch that its door was left open. This could be a text message. But there’s no reason the freezer couldn’t talk. And if it can talk, it can have a personality. My friend is a sarcastic person. His freezer would say: “Trying to cool the garage, are you? My door’s open again!”

OK, so I am making the case for proximal networking: devices interacting directly with each other, like so many 21st-century Furbies. Only these devices are not all made by the same company. How do you get devices, from freezers and fridges to TVs, stereos, thermostats, door locks and alarm systems, to all speak the same language? Well, you could put them all on the Internet (now that sounds secure), hope that each manufacturer shares its proprietary cloud APIs, and then build custom intelligence around each of those APIs so that the devices can start to interact. Or you could use AllJoyn™, a collaborative open source project of the AllSeen Alliance.

AllJoyn exposes the capabilities of all your connected devices in the same way. Via this OS-, platform- and device-agnostic software framework, any type of connected device can understand the capabilities of and the communications from any other connected device. Just as you settle in with your favorite pint of ice cream, your radio alarm clock speaker says “Trying to cool the garage, are you? My door is open again!” And you know it’s the freezer, because only the freezer talks like that.
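To make that flow concrete, here is a minimal sketch in plain Java of how a proximal notification could work. To be clear, this is not the AllJoyn API; the class and method names are hypothetical stand-ins, and a real AllJoyn application would use the framework’s own discovery and notification services. The point is simply that one device emits a message and every nearby peer that understands the shared framework presents it in its own way, as a banner on the TV or as a sarcastic voice from a speaker.

    import java.util.ArrayList;
    import java.util.List;

    // Conceptual sketch only: these types are hypothetical, not the AllJoyn API.
    // One device emits a notification; every peer on the proximal network
    // receives it directly and presents it in its own way. No cloud round trip.

    interface ProximalPeer {
        void onNotification(String sourceDevice, String message);
    }

    class ProximalNetwork {
        private final List<ProximalPeer> peers = new ArrayList<>();

        // "Discovery": a device announces itself and starts listening.
        void join(ProximalPeer peer) {
            peers.add(peer);
        }

        // Peer-to-peer broadcast to everything nearby.
        void broadcast(String sourceDevice, String message) {
            for (ProximalPeer peer : peers) {
                peer.onNotification(sourceDevice, message);
            }
        }
    }

    public class FreezerDemo {
        public static void main(String[] args) {
            ProximalNetwork home = new ProximalNetwork();

            // The TV shows the message as text; the speaker "personifies" it as speech.
            home.join((source, msg) -> System.out.println("[TV banner] " + source + ": " + msg));
            home.join((source, msg) -> System.out.println("[Speaker says] \"" + msg + "\""));

            // The freezer notices its door has bounced open and tells every peer.
            home.broadcast("Garage freezer", "Trying to cool the garage, are you? My door's open again!");
        }
    }

In a real deployment, the framework handles the discovery and transport that this toy network fakes, which is what lets a freezer from one manufacturer reach a TV from another.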

That is the personification of an individual thing. Now comes the personification of the billions of things that will surround us in the Internet of Everything.

With direct peer-to-peer communications, see how IoT devices in a proximal network could interact with one another and, perhaps even more importantly, react to one another, giving “inanimate objects” a voice and even a personality. See how they could come to life. AllJoyn is a collaborative open source project of the AllSeen Alliance.

How cool would it be to almost become part of the video game while you play it? With AllJoyn device discovery, notifications, events and actions, a video game could interact with the AllJoyn-capable devices in the room, as sketched below. Imagine the fun and the level of complexity that this could bring! AllJoyn is a collaborative open source project of the AllSeen Alliance.
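As a thought experiment only, here is the same kind of plain-Java sketch for that game scenario. Again, the types are hypothetical, not the AllJoyn events-and-actions interfaces: the game records the devices it discovers, asks each one what actions it advertises, and fires a matching action when something happens in-game.

    import java.util.ArrayList;
    import java.util.List;

    // Conceptual sketch only: hypothetical types, not the AllJoyn API.
    // A game discovers nearby devices, learns what actions they advertise,
    // and triggers those actions from in-game events.

    interface DiscoveredDevice {
        String name();
        List<String> actions();          // actions this device says it supports
        void invoke(String action);      // ask the device to perform one of them
    }

    class GameBridge {
        private final List<DiscoveredDevice> devices = new ArrayList<>();

        // Called whenever a new device is discovered in the room.
        void onDeviceDiscovered(DiscoveredDevice device) {
            devices.add(device);
        }

        // Map an in-game event to a room-wide action on every capable device.
        void onGameEvent(String event) {
            String wanted = event.equals("explosion") ? "flash" : "dim";
            for (DiscoveredDevice device : devices) {
                if (device.actions().contains(wanted)) {
                    device.invoke(wanted);
                }
            }
        }
    }

    public class GameRoomDemo {
        public static void main(String[] args) {
            GameBridge bridge = new GameBridge();

            bridge.onDeviceDiscovered(new DiscoveredDevice() {
                public String name() { return "Living room lamp"; }
                public List<String> actions() { return List.of("flash", "dim"); }
                public void invoke(String action) {
                    System.out.println(name() + " performs: " + action);
                }
            });

            bridge.onGameEvent("explosion"); // every device that can "flash" does so
        }
    }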

Child’s play can be even more imaginative and creative when AllJoyn-capable devices can connect and begin to interact. Bring to life all of the toys that, when you were a kid, you believed were alive anyway. These inanimate objects can interact with other AllJoyn-capable connected household objects for a new and engaging playtime. AllJoyn is a collaborative open source project of the AllSeen Alliance.