OnQ Blog

Where Does Mobile Computing Go from Here?

Jun 24, 2013

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

People always want more. That’s just the way it is. In the case of computing devices, mobile or otherwise, we’ll always want better user experiences and more performance at affordable prices. So far, the industry has delivered, and we’ve all grown to expect it. But is there an end? Is there a point where the mobile experience reaches its peak?

At Qualcomm, we are excited to see many emerging mobile experiences and applications, such as computational photography, augmented reality, realistic physics, and contextual awareness. These experiences are not only computationally intense, but they also introduce new types of workloads with diverse requirements.

In augmented reality, for example, your mobile device has to continuously analyze the camera feed, recognize and track interesting objects, locate them in 3D space, and superimpose perspective-corrected overlay images (the “augmented” part). These workloads demand a lot of compute horsepower! They also rely on evolving algorithms, which means the processors need some level of programmability. Programmability gives the flexibility to compute diverse algorithms, but it comes at a cost in power.
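The per-frame AR loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the pipeline stages (detect, locate in 3D, perspective-correct the overlay); the function names and the toy detector are stand-ins, not any specific Qualcomm API.

```python
# Minimal sketch of one frame of an augmented-reality pipeline.
# detect_objects() is a hypothetical stand-in: a real detector would run
# feature extraction and matching on the camera image.

def detect_objects(frame):
    # Pretend we found one object at screen position (0.4, 0.6), 2.0 units away.
    return [{"id": 1, "x": 0.4, "y": 0.6, "depth": 2.0}]

def project_overlay(obj, focal_length=1.0):
    # Perspective correction: apparent overlay size scales inversely with depth.
    scale = focal_length / obj["depth"]
    return {"id": obj["id"], "x": obj["x"], "y": obj["y"], "scale": scale}

def augment(frame):
    # One frame of the pipeline: detect -> locate in 3D -> overlay.
    return [project_overlay(obj) for obj in detect_objects(frame)]

overlays = augment(frame=None)
print(overlays[0]["scale"])  # 0.5 for an object 2.0 units away
```

A real implementation runs this loop 30+ times per second, which is where the compute demand comes from.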

So, the challenge is to provide these emerging mobile experiences while still satisfying the key mobile device constraints that consumers desire: a sleek, ultra-light device that stays cool and delivers long battery life.

Some tech-savvy people might think that the CPU is the answer, but that’s only part of the answer. As I alluded to earlier, the CPU’s immense flexibility and programmability come at the price of power. In fact, previous methods of scaling the CPU to meet increased compute requirements, within the power and thermal constraints of mobile devices, have delivered diminishing returns. Let’s take a look at a couple of these methods:

  • Single-core CPU scaling improved compute performance by increasing the CPU clock frequency and increasing instructions per clock (IPC) through architectural improvements. CPU clock rates are flattening. Remember the CPU GHz race in the PC market? That race slowed many years ago, and maximum frequencies have plateaued. IPC gains have also slowed, because the micro-architectural complexity needed to squeeze out more performance is not only challenging to design but also power hungry.
  • Multi-core CPU scaling was the next step, addressing the flattening of clock rates. By duplicating CPU cores, semiconductor companies took advantage of the extra transistors and scaled the maximum theoretical compute performance. However, realizing that performance depends on being able to run multiple programs or threads in parallel. You just need to look at Amdahl’s Law to see how quickly returns diminish when programs contain sequential code. In the PC market, the number of CPU cores has mainly settled at quad-core. In addition, since the CPU, as I already noted, is not necessarily the most efficient processor, running multiple CPUs at maximum clock frequency for a sustained period is very challenging in a thermally limited mobile form factor (i.e., things get hot!).
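Amdahl’s Law makes the diminishing returns above concrete. If a fraction p of a program can run in parallel across n cores, speedup is capped at 1 / ((1 − p) + p/n). A quick sketch:

```python
# Amdahl's Law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# the program that parallelizes and n is the number of cores.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of a program parallelized, returns diminish quickly:
for n in (1, 2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.9, n), 2))
# 4 cores yield only ~3.1x, and 16 cores ~6.4x; no core count can ever
# exceed the 1 / (1 - 0.9) = 10x ceiling set by the sequential 10%.
```

This is why simply adding cores stops paying off long before the thermal limits of a phone are even considered.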

So how are we going to continue scaling compute going forward? Just like we have done over and over in the past, we need to change our computing approach so that we can keep increasing the compute performance and give consumers the mobile experiences that they want.

At Qualcomm, we like to give people what they want. So, what’s next? Well, over the next several months I’m going to explain why mobile heterogeneous computing is the next paradigm in mobile computing. By intelligently assigning work to the most appropriate processors, heterogeneous computing improves app performance, battery life, and thermal efficiency to enable the evolution of new mobile experiences.

Look for future blogs and webinars to learn about Qualcomm’s view on heterogeneous computing.

Pat Lawlor

Senior Manager, Technical Marketing


Related News


Get immersed: scene-based audio with MPEG-H

Qualcomm Technologies has been working on audio technology for next-generation television broadcasts using the new MPEG-H standards. The technology is designed to help content creators, content hosts, consumer electronics manufacturers, and broadcasters create, capture, and render true-to-life 3D audio and scene-based audio so the viewer feels immersed in sound.

At the upcoming National Association of Broadcasters (NAB) show, we’ll showcase a comprehensive live production of immersive audio for both traditional TV and VR. The production will use scene-based (Higher Order Ambisonic, or HOA) and object-based audio, and the transmission will use the MPEG-H 3D audio standard. The audio production will draw on multiple audio sources, including ambisonic and spot microphones. The video production will use traditional TV cameras as well as an Omnicast VR camera. The audio production can feed simultaneously into both OTA (for linear TV) and OTT (for both linear TV and VR consumption) transmission.

The production process will show monitoring using sound bars as well as immersive loudspeaker layouts. The playback process will show flexible rendering to any number of loudspeakers, audio rotation over loudspeakers for 360 video, as well as live VR head-mounted displays.

Qualcomm Technologies will also showcase its high-quality HEVC cloud and server-based encoder, which is engineered for over-the-top services and 4K real-time encoding with multi-threading on a single machine. The high-quality HEVC encoder has significantly lower complexity than x265 for the same coding efficiency.

Also at our booth (#SU11013), our friends at b<>com will demonstrate a live VR feed combining scene-based audio with a multiple-camera VR system. HOA scene-based audio encoded using MPEG-H will be binauralized and delivered to a VR headset, where head-tracking demonstrates the essential contribution scene-based audio makes to VR. You can learn more about b<>com at NAB at booth #N2035-FP.

Check out the video below for a quick overview of scene-based audio, and please wear headphones for the best binaural audio experience:

Apr 20, 2017


Hardware-software convergence: Key skills to consider

Hardware-software convergence, the ever-closer coupling of hardware and software systems, illustrates how each empowers (and sometimes literally powers) the other. In our current development environment, this is happening more than ever. Of course, deep technical skills will be of the utmost importance to navigate this technological trend, but the soft skills we apply to our engineering practices are just as important in determining our success.

What skills do developers need to nurture, and how do you put them to good use? In this piece, we’ll cover three soft skills developers can use to stay ahead of the hardware-software convergence, and share resources to help you grow and maintain those skills.

Creative inspiration

First off: creative inspiration. While it’s easy to identify your technical shortcomings and fill those gaps with training and practice, knowing which soft skills to hone can be a lot more complicated. In fact, you could even think of these soft skills as “mindsets,” since they’re more about how you approach a problem than about a tool you use to solve it. For this first skill, it’s important to start approaching challenges antidisciplinarily, rather than relying on existing mental frameworks. That’s what being creative is all about: finding new ways of doing things.

So where do you start? Ask yourself this question: What is the dent you want to make in the universe? Begin from a place of passion – think about what problems and projects keep you up at night, and what issues big or small you want to solve.

Then, understand that creative inspiration is a process. What seems like overnight genius is often the result of many erroneous attempts (e.g., Thomas Edison’s 1,000 or so attempts at creating the lightbulb) and then having the fortitude to gain a deeper understanding of an issue before applying your imagination. We particularly like the design thinking method, which encourages starting from a place of inspired empathy and developing knowledge through lean prototyping and iteration. The Stanford d.school has a Bootcamp Bootleg that you can download for a quick-start guide to this design framework.

Apr 17, 2017


Developer of the month: Computing on the edge with Solstice

The beauty of developer boards like the DragonBoard 410c from Arrow Electronics is their adaptability, and we love hearing about all the different ways people are using them.

We selected Julian Dale and the team at Solstice as our Qualcomm Developer Network Developer of the Month for their use of the DragonBoard 410c as the hub of their edge facilities management solution, a proof-of-concept demo that was among the first to showcase AWS Greengrass, which at the time was available only in limited preview.

We talked to Julian to find out about the challenges of building real-world IoT solutions, and how using the DragonBoard 410c helped them prototype an edge gateway running AWS Greengrass, extending AWS IoT and Lambda functions to add intelligence to what gets sent to the cloud.

Can you tell us about your company and what you develop?        
We’re strategists, researchers, designers and engineers hell-bent on changing the way the world does business. We’re headquartered in Chicago, IL and have delivery offices in New York, NY and Buenos Aires, Argentina.

How was your company started?
In 2001 J Schwan, Founder & CEO, established Solstice Consulting in Chicago as an IT services firm. In 2008, J pivoted the company from developing web-based experiences into mobile. Over these past seven years we’ve grown from a Chicago-based mobile boutique, to a technology firm of over 400 designers and engineers. To reflect this growth, this past fall we rebranded to simply: Solstice.

What is your company's mission?
Solstice is a global innovation and emerging technology firm that helps Fortune 500 companies seize new opportunities through world-changing digital solutions. We exist to prove what businesses are capable of.

Can you share with us your company’s project using DragonBoard 410c and AWS Greengrass? How did your company decide on these technology solutions?
We built a facilities management solution called The Pulse, designed as an elegant way to capture data in existing buildings and derive insights from the vast amount of information. It was first featured at Solstice's annual innovation conference, Solstice FWD, in September 2016, and has since been upgraded to leverage AWS Greengrass. This powerful edge computing use case was showcased during AWS re:Invent in November 2016.

We built a number of sensor packs that tracked temperature, humidity, sound intensity, and motion. We deployed them throughout the conference venue in order to show attendees a heat map of the busiest areas – where exciting things might be happening – or how to get away from everything for a bit. With The Pulse our aim is to help attendees feel more plugged into the conference experience, and help them make informed decisions about what to do next.

The Pulse uses a DragonBoard 410c, a development board based on the Qualcomm Snapdragon 410 mobile platform, as a gateway device that aggregates and batches information from many different sensors. Snapdragon processors have the power and speed to support edge processing, allowing increased local analysis of data for a more secure and reliable IoT implementation. Combined with AWS Greengrass, this brings new possibilities to commercial, industrial, medical and smart city solutions that can’t rely exclusively on the cloud.
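The gateway’s role of aggregating and batching sensor data locally before it reaches the cloud can be sketched as below. This is a hypothetical illustration of the pattern, not the actual Solstice or Greengrass code; the class and method names are invented.

```python
# Sketch of an edge gateway that aggregates readings from many sensors and
# forwards only batched summaries, reducing what must be sent to the cloud.
from statistics import mean

class EdgeGateway:
    def __init__(self, batch_size):
        self.batch_size = batch_size
        self.buffer = []      # raw readings held locally
        self.uploaded = []    # summaries "sent" upstream

    def ingest(self, sensor_id, value):
        self.buffer.append((sensor_id, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # Local analysis at the edge: upload one summary per batch rather
        # than every raw reading.
        summary = {"count": len(self.buffer),
                   "avg": mean(v for _, v in self.buffer)}
        self.uploaded.append(summary)
        self.buffer = []

gw = EdgeGateway(batch_size=4)
for t in (21.0, 21.5, 22.0, 21.5):
    gw.ingest("temp-1", t)
print(gw.uploaded)  # one summary: count 4, avg 21.5
```

In a real deployment, the flush step would run as a Lambda function on the gateway and publish to AWS IoT, but the batching logic is the essence of the edge pattern.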

What does innovation mean to your company? 
Innovation is a core pillar of why Solstice exists as an organization. We are always looking at the now, near, and next technologies that will shape our future. In 2015, we launched Solstice Labs, our internal R&D extension, to ensure we’re constantly investigating, testing and dreaming up the innovative technologies of the near future. From emerging IoT technology to augmented reality and much more, we invest in learning what’s coming next so we can keep our clients always looking ahead.

Share with us a fun fact about your company.
Every week, Solstice employees nominate their peers for the “Awesomeness Award,” which is given out every Thursday at our all-hands company meeting. The winner is chosen by the previous week’s winner, and then has the privilege of sitting in the “Awe-some” Office the following week, often using it as a fun space for daily team standups and meetings. The Awesomeness Award promotes our culture of servant leadership by recognizing people who are making their teammates and projects shine.

Apr 6, 2017


4 new IoT development kits for Bluetooth Low Energy applications

Get your Bluetooth® Low Energy IoT applications ready for a new family of development kits based on the CSR102x modules from Qualcomm Technologies International, Ltd.

The CSR102x family is designed to reduce the development time of the Bluetooth-connected IoT applications your customers are asking for:

  • Heart rate sensors, security tags, general IoT – The CSR102x Starter Development Kit is a good entry point for IoT development, with I/O expansion connectors for off-board sensors and actuators. It’s ideal for software developers looking to make the transition to embedded programming.
  • Lighting, home automation, sensor networks – The CSR102x IoT Development Kit is a package of 3 target boards made for networking Bluetooth devices and equipped with on-board LEDs, buttons, switches and sensors.
  • Beacons, proximity tags, wearables – The CSR102x Bluetooth Node Development Kit is powered by a coin cell battery, comes in a small footpod form factor, and includes a chip antenna, motion sensor, programming connector and internal flash.
  • Health & fitness, keyboards, mice, alert tags, keyless entry – The CSR102x Professional Development Kit is made for flexibility, with a pluggable CSR1025 chip module and multiple power supply options. It’s built to accommodate application-specific plug-in boards, currently the Sports Watch Application Board and the Smart Remote Application Board (sold separately).

Because “IoT” means so many different things to so many different developers, the CSR102x family covers a wide spectrum of application possibilities. It also checks three of the most important boxes on your IoT shopping list.

The CSR102x Development Kits – Low active power, low overall cost and security

Always-on devices are always-need-power devices. The CSR102x modules feature built-in power regulation and low active power consumption, drawing less than 5 mA of active current for transmit and receive operations. In the right applications, a coin cell battery in these modules can last for years.
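A back-of-the-envelope calculation shows why years of coin-cell life is plausible: average current is dominated by how rarely the radio is active. The capacity, sleep current, and duty cycle below are illustrative assumptions, not measured CSR102x figures.

```python
# Rough battery-life estimate for a low-duty-cycle BLE device.
# Assumed figures: 220 mAh coin cell (typical CR2032), 5 mA active current,
# ~1 uA sleep current, radio active 0.1% of the time.

def battery_life_years(capacity_mah, active_ma, sleep_ma, duty_cycle):
    # Time-weighted average current draw.
    avg_ma = active_ma * duty_cycle + sleep_ma * (1 - duty_cycle)
    hours = capacity_mah / avg_ma
    return hours / (24 * 365)

print(round(battery_life_years(220, 5.0, 0.001, 0.001), 1))  # -> 4.2 years
```

The takeaway is that sleep current and duty cycle, not peak active current, dominate battery life in beacon-style applications.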

Somewhere in the family you’ll find a kit with the hardware configuration you’re looking for. To keep your overall system costs low, we’ve designed the CSR102x kits so that very few external components are needed. The modules are implemented with only a single crystal, and you’ll find a direct connection between the antenna and device on each kit. The application boards include all the input controls, sensors and radios for prototyping a sports watch and smart remote control.

Without security, you don’t have much of an IoT story, so the CSR102x modules include application-level security features: encryption, authentication, and over-the-air updates (OTAU), designed to prevent the software running on them from being easily compromised. Downloaded applications may be authenticated using a SHA-256 hash and an RSA-1024 signature, and may (at your option) be encrypted using AES-128, decrypted only when loaded into RAM.
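The integrity step of that chain can be illustrated with a standard-library sketch. This only shows the SHA-256 hash check; the actual CSR102x flow additionally verifies an RSA-1024 signature over the digest and can decrypt an AES-128-encrypted image, and the function name here is illustrative rather than a real API.

```python
# Conceptual sketch of verifying a downloaded application image against its
# expected SHA-256 digest (the digest itself would be covered by an RSA-1024
# signature in the real flow).
import hashlib

def verify_image(image: bytes, expected_sha256_hex: str) -> bool:
    # Recompute the digest of the downloaded image and compare with the
    # expected, signed digest.
    return hashlib.sha256(image).hexdigest() == expected_sha256_hex

firmware = b"application image bytes"
digest = hashlib.sha256(firmware).hexdigest()
print(verify_image(firmware, digest))         # True: image untampered
print(verify_image(firmware + b"!", digest))  # False: image modified
```

A single flipped byte changes the digest entirely, which is what makes the hash useful as a tamper check before the image is ever executed.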

The CSR102x family picks up where the CSR101x family leaves off, adding new storage options for application software, lower power consumption, and support for the higher data throughput and additional security features of Bluetooth 4.2.

Are the CSR102x Development Kits right for you?

If you’re an embedded developer with C programming experience, you can dive right into any kit in the family and start prototyping new apps in no time.

If you’re an app developer who’s been looking for a smooth path into embedded programming, we’ve made the CSR102x Starter Development Kit with you in mind. The IDE that comes with the kit includes full documentation on building sample apps and getting them downloaded to the hardware. The board includes expansion connectors and a programming and debugging interface to connect to the host development PC.

There’s no need for extensive knowledge of processor or Bluetooth technology. You can get the examples up and running quickly with basic knowledge of embedded design and C programming skills.

Next steps

If there’s a one-size-fits-all for IoT, we haven’t found it yet. (And we would know.) That’s why we release our kits in families, with different applications and form factors in mind.

Have a look at our CSR102x Development Kits and find the one that best suits your needs:

  • Starter Development Kit
  • IoT Development Kit
  • Bluetooth Node Development Kit
  • Professional Development Kit and separate Sports Watch Application Board and Smart Remote Application Board

You’re just a couple of clicks away from taking your IoT development to the next level.

Mar 30, 2017