At Qualcomm AI Research, we are advancing AI to make its core capabilities – perception, reasoning, and action – ubiquitous across devices. Our mission is to make breakthroughs in fundamental AI research and scale them across industries. By bringing together some of the best minds in the field, we’re pushing the boundaries of what’s possible and shaping the future of AI.
Qualcomm AI Research Introduction
Jun 8, 2018
“Machine learning algorithms such as deep learning already use large amounts of energy. And with AI increasingly moving to power-constrained edge devices, energy efficiency will only become more important. The benchmark that matters will be how much intelligence one can squeeze out of every joule of energy. I believe that Qualcomm is uniquely positioned to address this problem and be a key player in AI.”
- Professor Dr. Max Welling
VP of Technology, Qualcomm Technologies Netherlands B.V.
Our research areas include:
- Model design, compression, quantization, activation, algorithms, and efficient hardware.
- Continuous learning, model adaptation, and privacy-preserved distributed learning.
- Robust learning through minimal data, unsupervised learning, and on-device learning.
- Multi-task and multi-modal learning, sensor fusion, and cloud-edge systems.
Webinar: Pushing the boundaries of AI research
Sep 9, 2020
Webinar: Enabling power-efficient AI through quantization
May 1, 2020
Webinar: 5G+AI: The Ingredients Fueling Tomorrow's Technology Innovations
Feb 19, 2020
AI Research Video
Dec 6, 2019
Qualcomm Innovation Center has open-sourced AIMET (the AI Model Efficiency Toolkit), which includes state-of-the-art quantization and compression techniques. The goals of this open-source project are to collaborate with other leading AI researchers, provide a simple library plugin for AI developers, and help move the ecosystem toward integer inference.
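To illustrate the idea behind integer inference, here is a minimal sketch of symmetric post-training int8 quantization in NumPy. This is a generic illustration of the concept, not AIMET's actual API or algorithm; the function names and the per-tensor scaling scheme are assumptions made for this example.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: map the float range of x
    onto the signed 8-bit integer range [-127, 127].
    (Illustrative sketch only -- not AIMET's implementation.)"""
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from int8 values and the scale."""
    return q.astype(np.float32) * scale

# Quantize a random weight tensor and measure the round-trip error.
rng = np.random.default_rng(0)
weights = rng.standard_normal((64, 64)).astype(np.float32)
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)
max_err = np.max(np.abs(weights - recovered))
```

With symmetric rounding, the round-trip error is bounded by half a quantization step (scale / 2), which is why 8-bit weights often retain model accuracy while cutting memory and compute cost.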
Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.
AI Model Efficiency Toolkit is a product of Qualcomm Innovation Center, Inc.
Qualcomm Innovation Center, Inc. is a wholly-owned subsidiary of Qualcomm Technologies, Inc.