Sep 1, 2017
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
Our “things” are getting smarter.
This includes everything from cars to smartphones to digital assistants, and, yes, even robots. And we’re not just talking about adding groundbreaking new features (though that’s happening, too). Devices, computers, and machines are performing tasks and behaving in ways that are smart. How? Artificial intelligence, also commonly known as AI.
The term “artificial intelligence” was first coined by computer scientist John McCarthy in a 1955 proposal for a summer research study at Dartmouth College. In it, he said, “The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” This description still applies today, with some added levels of complexity.
You’ve likely been hearing the term “artificial intelligence” a lot lately, along with a couple of others, specifically machine learning and deep learning. They’re often used interchangeably, but while they’re related, they’re not the same.
AI is now the overarching descriptor for devices or machines that act in ways that are smart, with machine learning as its subset, and deep learning as a subset of machine learning. In other words, machine learning and deep learning are categorized under AI, but AI isn’t necessarily machine learning or deep learning.
Since it can get confusing, we broke down the distinctions between artificial intelligence, machine learning, and deep learning via a classic trope: comparing apples to oranges.
Artificial Intelligence (AI)
AI, in the broadest sense, describes the different ways a machine interacts with the world around it. Through advanced, human-like intelligence — courtesy of software and hardware — an AI machine or device can mimic human behavior or perform tasks as if it were human.
Today, we read a lot about AI in regard to things like speech recognition (used by intelligent personal assistant devices), facial recognition (used by popular filters in social media), or object recognition (like searching for images of apples and oranges). But how are these features made smart?
Machine Learning
Machine learning is an approach, or subset, of AI, with an emphasis on “learning” rather than just computer programming. Here, a machine uses complex algorithms to analyze a massive amount of data, recognize patterns among the data, and make a prediction — without requiring a person to program specific instructions into the machine’s software. If the system incorrectly identifies, say, a cheese puff as an orange, its pattern recognition improves over time as it learns from the mistake and corrects itself, much as a human would.
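To make the idea concrete, here is a minimal sketch of learning from examples rather than explicit rules: a one-nearest-neighbor classifier that labels a fruit by finding the most similar example it has already seen. The feature values (redness, roundness, diameter) and the data points are hypothetical, purely for illustration.

```python
# A minimal sketch of pattern recognition from data: label a new fruit
# by the closest example in a (hypothetical) set of known fruits.
from math import dist

# (redness 0-1, roundness 0-1, diameter in cm) -> label
training_data = [
    ((0.90, 0.95, 8.0), "apple"),
    ((0.80, 0.90, 7.5), "apple"),
    ((0.30, 0.98, 7.8), "orange"),  # low redness: oranges are, well, orange
    ((0.20, 0.97, 8.2), "orange"),
]

def predict(features):
    """Return the label of the nearest known example."""
    nearest = min(training_data, key=lambda example: dist(example[0], features))
    return nearest[1]

print(predict((0.85, 0.93, 7.7)))  # a red, round fruit -> "apple"
print(predict((0.25, 0.96, 8.0)))  # -> "orange"
```

No rule like “apples are red” is ever written down; the prediction falls out of the stored examples, and adding more (or corrected) examples improves it.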
Deep Learning
Deep learning, a subset of machine learning, takes computer intelligence even further. It uses massive amounts of data and computing power to train deep neural networks. Essentially, these networks imitate the human brain’s connectivity, classifying data sets and finding correlations between them. With its newfound knowledge (acquired without human intervention), the machine can then apply its insights to other data sets. The more data the machine has at its disposal, the more accurate its predictions will be.
For example, a device with deep learning can examine large data sets — such as the color, shape, size, peak season, and origins of a fruit — to determine if an apple is, specifically, a Gala or an orange is a Blood Orange.
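The building block behind those networks can be sketched in a few lines: a single artificial neuron that adjusts its internal parameters after every wrong answer until it can separate two classes on its own. This toy uses one hypothetical feature (redness) and made-up data; real deep networks stack many layers of such neurons over far richer inputs.

```python
# A single artificial neuron learning, by repeated correction, to tell
# apples from oranges using one hypothetical feature: redness (0-1).
import math

def sigmoid(x):
    """Squash any number into the range (0, 1), read as 'probability of apple'."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical training data: redness -> 1 for apple, 0 for orange.
samples = [(0.90, 1), (0.80, 1), (0.85, 1), (0.30, 0), (0.20, 0), (0.25, 0)]

weight, bias = 0.0, 0.0
for _ in range(5000):                      # repeated passes over the data
    for redness, label in samples:
        output = sigmoid(weight * redness + bias)
        error = output - label             # how wrong was the prediction?
        weight -= 0.5 * error * redness    # nudge parameters to shrink the error
        bias -= 0.5 * error

def is_apple(redness):
    return sigmoid(weight * redness + bias) > 0.5

print(is_apple(0.88))  # a red fruit -> True
print(is_apple(0.22))  # an orange-colored fruit -> False
```

Nobody tells the neuron where to draw the line between the two fruits; the boundary emerges from the data, which is the essence of the “learning” in deep learning.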
The differences between artificial intelligence, machine learning, and deep learning aren’t as obvious as those between apples and oranges; they’re much more nuanced. Qualcomm Technologies has integrated these AI technologies into its Qualcomm Snapdragon mobile platforms, engineered to create compelling, intuitive experiences that allow devices to learn more about, well, you.
Learn more about Qualcomm’s work in Artificial Intelligence.