OnQ Blog

Caffe2 and Snapdragon usher in the next chapter of mobile machine learning

Apr 18, 2017

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

Machine learning, at its core, is a method for turning vast amounts of data into useful actions. Most of the attention around machine learning technology has focused on super-fast data processing applications, server farms, and supercomputers. However, far-flung servers don’t help when you’re looking to magically perfect a photo on your smartphone, or to translate a Chinese menu on the fly. Making machine learning mobile, putting it on the device itself, can help unlock everyday use cases for most people.

Qualcomm Technologies’ engineers have been working on the machine learning challenge for years, and the fruits of that work are evident in Qualcomm Snapdragon mobile platforms, which have become a leading choice for on-device mobile machine learning. It’s a core component of the Snapdragon product line, and you’ll find machine learning technologies both in our SoCs (820, 835, and some 600-tier chipsets) and in adjacent platforms such as IoT and automotive.

And we aren’t pushing this technology forward by ourselves. We’re working with a whole ecosystem of tools, savvy OEMs, and software innovators to bring new experiences to consumers. These experiences use on-device machine learning, and many are ones we could not have conceived of on our own.

An exciting development in this field is Facebook’s stepped-up investment in Caffe2, the evolution of the open source Caffe deep learning framework. At this year’s F8 conference, Facebook and Qualcomm Technologies announced a collaboration to support the optimization of Caffe2 for the Qualcomm Snapdragon neural processing engine (NPE) framework. The NPE is designed to do the heavy lifting needed to run neural networks efficiently on Snapdragon, leaving developers with more time and resources to focus on creating innovative user experiences. With Caffe2’s modern computation graph design, minimalist modularity, and portability across platforms, developers gain greater flexibility to tackle a range of deep learning tasks, including computer vision, natural language processing, augmented reality, and event prediction.
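To give a flavor of what that graph-based design looks like in practice, here is a minimal sketch using Caffe2’s Python API; the network, blob names, and shapes are illustrative placeholders rather than a recommended design:

```python
# Minimal sketch of Caffe2's graph-based design (names are illustrative).
# A network is built as a protobuf graph of operators, which is what makes
# it portable across server, mobile, and embedded targets.
import numpy as np
from caffe2.python import workspace, model_helper, brew

model = model_helper.ModelHelper(name="tiny_net")
brew.fc(model, "data", "fc1", dim_in=128, dim_out=10)
brew.softmax(model, "fc1", "pred")

# The graph itself is plain data -- it can be printed, serialized, and
# handed to another runtime unchanged.
print(model.net.Proto())

# Running it locally: feed an input blob, initialize parameters, execute.
workspace.FeedBlob("data", np.random.randn(1, 128).astype(np.float32))
workspace.RunNetOnce(model.param_init_net)
workspace.RunNetOnce(model.net)
print(workspace.FetchBlob("pred").shape)  # (1, 10)
```

Because the net is ordinary protobuf data, the same definition can be trained in one environment and carried to another.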

Caffe2 is deployed at Facebook to help developers and researchers train machine learning models and deliver artificial intelligence (AI)-powered experiences in various mobile apps. Now, developers will have access to many of the same tools, allowing them to run large-scale distributed training scenarios and build machine learning applications for mobile.
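Training follows the same graph-based pattern: attach a loss, let the framework add the backward operators, and run the net. The sketch below is a rough illustration only; the tiny network, dummy data, and hyperparameters are placeholders, not a recipe:

```python
# Hedged training sketch using Caffe2's Python API; all values are dummies.
import numpy as np
from caffe2.python import workspace, model_helper, brew, optimizer

model = model_helper.ModelHelper(name="tiny_trainer")
brew.fc(model, "data", "fc1", dim_in=128, dim_out=10)
softmax = brew.softmax(model, "fc1", "softmax")

# Attach a loss, let Caffe2 add the backward graph, and wire up plain SGD.
xent = model.LabelCrossEntropy([softmax, "label"], "xent")
loss = model.AveragedLoss(xent, "loss")
model.AddGradientOperators([loss])
optimizer.build_sgd(model, base_learning_rate=0.1)

# Feed dummy data and labels, initialize parameters, run a few iterations.
workspace.FeedBlob("data", np.random.randn(16, 128).astype(np.float32))
workspace.FeedBlob("label", np.random.randint(0, 10, 16).astype(np.int32))
workspace.RunNetOnce(model.param_init_net)
workspace.CreateNet(model.net)
for _ in range(10):
    workspace.RunNet(model.net)
    print(workspace.FetchBlob("loss"))
```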

One of the benefits of Snapdragon and the NPE is that a developer can target individual heterogeneous compute cores within Snapdragon for optimal performance, depending on the power and performance demands of their applications. The Snapdragon 835 is designed to deliver up to 5x better performance when processing Caffe2 workloads on our embedded Qualcomm Adreno 540 GPU (compared to CPU). The Hexagon Vector eXtensions (HVX) in the Qualcomm Hexagon DSP are also engineered to offer even greater performance and energy efficiency. The NPE includes runtime software, libraries, APIs, offline model conversion tools, debugging and benchmarking tools, sample code, and documentation. It is expected to be available later this summer to the broader developer community.
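As a rough sketch of where that offline flow begins, a trained Caffe2 model can be serialized into the init/predict protobuf pair that offline conversion tools typically consume. The snippet assumes a trained ModelHelper instance like the `model` in the training sketch above; the NPE’s own conversion and runtime-selection steps aren’t shown here, since the SDK isn’t yet available:

```python
# Hedged sketch: serialize a trained Caffe2 model into the init_net / predict_net
# protobuf pair that offline conversion tools generally take as input.
# Assumes a trained ModelHelper instance named `model`, as in the sketch above.
from caffe2.python import workspace
from caffe2.python.predictor import mobile_exporter

init_net, predict_net = mobile_exporter.Export(workspace, model.net, model.params)

with open("init_net.pb", "wb") as f:
    f.write(init_net.SerializeToString())
with open("predict_net.pb", "wb") as f:
    f.write(predict_net.SerializeToString())
```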

Qualcomm Technologies continues to support developers and customers with a variety of cognitive capabilities and deep learning tools alongside the Snapdragon platform. We anticipate that developers will be able to draw on a wider and more diverse ecosystem of powerful machine learning workloads, allowing more devices to operate with greater security and efficiency.

We don’t yet know the full range of applications for the technology, but we can’t wait to see how it’s used by innovative developers around the world.

Sign up to be notified when the Snapdragon neural processing engine SDK is available later this summer.