OnQ Blog

Dr. Max Welling on how AI will impact the world, now and in the future

Artificial intelligence (AI) technology will likely be one of the most transformative forces in the future, impacting nearly everything we do. Just as Qualcomm Technologies laid the foundation for the mobile revolution, we’re creating innovations that will define the AI era.

One of the key figures in this space is Dr. Max Welling, a renowned professor at the University of Amsterdam. Welling joined the Qualcomm Technologies family last summer when we acquired his company Scyfer B.V., which created powerful AI solutions for companies across industries.

We spoke with Dr. Welling about the current state of AI and what this technology will be capable of in the future.

[This interview has been edited for clarity and length.]

Let’s talk about the current state of AI. Where are we today?

Right now, we have AI systems that can do one thing very well — better than humans, in fact. For example, AI can analyze medical images and detect if somebody has a melanoma. And AI can beat human world champions in Go or chess.

These are very restricted domains where AI works very well, but it all becomes much more complicated and interesting when we look at “multi-agent systems.” For example, in a self-driving car, the AI will need to understand what the other people on the road and side roads are doing. The multiple agents the self-driving car interacts with have intentions, so we’ll have to develop new planning algorithms and intelligence that anticipates what other road users are going to do.

What’s the next frontier for AI?

With regard to deep learning, I think the next big thing is reasoning. We can do speech recognition and image analysis pretty well. We can identify the objects that are in an image. We can even say what people are doing. Now what if I asked, “What happens next?” or “What caused this particular scene to happen?” The system would apply reasoning and respond.

Does deep learning/AI require more efficient hardware, software, or a combination of both?

People underestimate how important hardware is in AI. There’s no improvement in AI without Moore’s Law keeping pace. But what’s going to happen is that fairly soon, you’re going to see today’s deep learning solutions become too expensive in terms of energy consumption. We’re going to hit a ceiling and will need new solutions.

Currently, our algorithms are extremely energy hungry and inefficient. It’s a real issue — not just for mobile, but also in clouds and data centers. New breakthroughs are necessary. Qualcomm Technologies is leading the way there, and I think people underestimate how important that’ll be. Our newly formed Qualcomm AI Research team understands that this will be the next AI battleground, and is looking at the development of energy-efficient hardware. We’re committed to this mission and that’s why we’re unifying the AI research efforts across various groups within Qualcomm Technologies.

On-device AI has many benefits, as does AI in the cloud. In future AI-based applications, how important will the edge be versus the cloud?

My guess is that there will be a hybrid. Clearly, the cloud has advantages because the data is centralized there. But I think as we get better at learning from distributed data sources, it becomes less and less important that all of that data is located in a central place. It could live in many different places as long as we can easily train models in a distributed way.
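Dr. Welling doesn’t name a specific technique here, but one common approach to training on data that lives in many places is federated averaging: each data holder trains locally and only the model weights travel to a central server. The sketch below assumes a toy linear-regression model; all function names are illustrative, not from any library mentioned in the interview.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, steps=10):
    """One client: a few steps of gradient descent on its own data
    (linear regression, squared loss). The raw data never leaves."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(weights, client_data, rounds=20):
    """Server: repeatedly average the clients' locally trained weights.
    Only model parameters cross the network, not the data itself."""
    w = weights
    for _ in range(rounds):
        local_models = [local_update(w, X, y) for X, y in client_data]
        w = np.mean(local_models, axis=0)
    return w

# Two clients, each holding a private shard generated by w_true = [2, -1]
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true))

w = federated_average(np.zeros(2), clients)
print(np.round(w, 2))  # should land close to [2, -1]
```

The point of the sketch is the data-flow: each shard stays where it is, yet the averaged model fits all of it, which is what makes the central data store "less and less important."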

Is privacy a concern?

I think the public is becoming increasingly aware of, and cautious about, big internet companies collecting their data. So, we want to develop machine learning algorithms, or frameworks, that help protect privacy.

One method is to secure data behind a firewall and never let it leave your protected environment. But to get the most out of AI, you’ll probably need to communicate with the cloud, because that’s where data aggregates and where the software learns and improves. There are a few ways to utilize the cloud while still protecting your data.

The first way, though expensive, is to encrypt your data. This encrypted data can be used to improve the model, while the data remains unreadable. In the second way, called “differential privacy,” you basically compute typical averages of the things you are interested in. Then you add noise to guarantee that whatever’s been sent out of your protected environment will never reveal anything sensitive about an individual.
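The "compute an average, then add noise" idea Dr. Welling describes is usually implemented with the Laplace mechanism: the noise is scaled to how much any single individual could shift the answer. A minimal sketch, with hypothetical function names (the interview specifies no particular scheme):

```python
import numpy as np

def private_mean(values, lower, upper, epsilon, rng):
    """Release the mean of bounded values with epsilon-differential
    privacy by adding Laplace noise scaled to the query's sensitivity."""
    values = np.clip(values, lower, upper)
    # Changing one person's value moves the mean by at most this much:
    sensitivity = (upper - lower) / len(values)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.mean() + noise

rng = np.random.default_rng(42)
ages = rng.integers(18, 90, size=10_000)
print(round(private_mean(ages, 18, 90, epsilon=0.5, rng=rng), 2))
```

With many participants the sensitivity is tiny, so the released average stays useful, yet no individual's value can be recovered from it — the guarantee the interview alludes to.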

How do you see machine learning being utilized in the IoT?

In the home, there will be sensors throughout, including in your appliances. These devices will talk to each other and anticipate your needs, making the home more comfortable and efficient.

The same type of ubiquitous networking will be found in manufacturing. In factories, many machines have to cooperate in a production process. They’ll all be loaded with sensors to figure out how to optimally cooperate, how to perform quality control, and how to detect failure before it happens.

We know AI makes a lot of people nervous. They’re concerned about job loss, for example. How do you respond to that fear?

In the short run, I think it will disrupt society and certain jobs may disappear. It’ll take a little while, but in the long run, a lot of new opportunities will arise. With the right training, new generations might see a much more flexible job market with a lot of new opportunities.

What are you working on now? What’s exciting you about AI?

Within Qualcomm Technologies, there’s a lot of potential for innovation, for instance in reinforcement learning and co-design of algorithms and chips. There are actually three types of machine learning. In broad strokes, unsupervised machine learning is where you basically have no labels. A machine looks at the world and tries to structure it to find repetitive patterns. Then there’s supervised learning, where you have a label. You say, “Here’s the input image, and this is what’s in it. Now predict for me what’s in the next image.”

Reinforcement learning is not just making predictions or finding structure in the world, but actually acting within it. Think of a robot: It makes decisions and performs actions. It can pick something up, look at it, learn from it. Or it may play a game or drive a car.

Another hot topic is the co-design of chips and the algorithms that run on them. For instance, we can train an algorithm to perform well on a chip with limited memory and precision. But conversely, we can design a chip that can execute deep learning algorithms efficiently and at low power. Qualcomm AI Research offers a unique environment where machine learning meets hardware.
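One concrete form of "limited memory and precision" is representing weights as 8-bit integers plus a single scale factor, which low-power hardware can execute cheaply. A minimal sketch of symmetric post-training int8 quantization — one common technique in this space; the interview doesn’t commit to any particular scheme:

```python
import numpy as np

def quantize_int8(w):
    """Map float weights to 8-bit integers plus one scale factor,
    a representation a low-power chip can multiply-accumulate cheaply."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)  # stand-in layer
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).max()
print(f"max weight error after int8 rounding: {err:.4f}")
```

Training the network while simulating this rounding (quantization-aware training) is the algorithm-side half of the co-design Dr. Welling describes: the model learns to be accurate under exactly the precision the chip provides.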

Imagine what can be accomplished if you put together the minds of the brightest AI scientists to solve these kinds of problems.


Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.


Opinions expressed in the content posted here are the personal opinions of the original authors, and do not necessarily reflect those of Qualcomm Incorporated or its subsidiaries ("Qualcomm"). Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries. The content is provided for informational purposes only and is not meant to be an endorsement or representation by Qualcomm or any other party. This site may also provide links or references to non-Qualcomm sites and resources. Qualcomm makes no representations, warranties, or other commitments whatsoever about any non-Qualcomm sites or third-party resources that may be referenced, accessible from, or linked to this site.