OnQ Blog

Skate to Where the Puck is Going to Be: Three Reasons to Adopt WebGL

Mar 28, 2011

Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.

“I skate to where the puck is going to be, not where it has been.” Ice hockey great Wayne Gretzky said that, and that’s how I feel about the momentum behind WebGL. I think that this web-based graphics library is going to help tear down the walls that separate native app developers from web app developers today and allow them to inhabit the same world.

If you’re a native mobile application developer, what are you trying to do right now? You’re trying to deepen the user experience through high-end, high-performance graphics on the phone. You’ve gone the native-app route because it has been the only way to get close enough to hardware resources like GPS, sensors, the camera, and the graphics processing unit (GPU) for a decent user experience. But you’d probably rather live in the web world, where you’re not bound to an operating system and development environment.

And if you’re a web app developer today, you’re not bound to an OS or development toolset, but you can’t get close enough to the hardware for high-performance, high-quality graphics. You have to work through a browser, and until now browsers have required plug-ins for 3D graphics, so your graphics take a performance hit.

Enter WebGL

At the recent Game Developers Conference (GDC) in San Francisco, Qualcomm, as a member of the Khronos Group, participated in the group’s announcement formally ratifying the WebGL 1.0 standard. In a nutshell, WebGL gives applications written in web languages (HTML, JavaScript, CSS, XML) full access to the hardware acceleration capabilities of today’s GPUs, so they can enjoy native-app performance with web-app portability. It paves the way for hardware-accelerated 3D graphics in HTML5 web browsers without plug-ins, and it moves the world one step closer to developers writing one application in web languages and running it on a wide variety of devices.
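To make that concrete, here is a minimal sketch of what that access looks like from JavaScript: the page grabs a rendering context from an HTML5 canvas element and issues its first hardware-accelerated call, with no plug-in in sight. The canvas id is an assumption for illustration, and some current browsers still expose the context under the experimental-webgl name.

// A minimal sketch: acquire a WebGL context from an HTML5 canvas
// and issue a first GPU-accelerated call. The "scene" id is an
// assumption for illustration; no plug-in is involved.
var canvas = document.getElementById('scene');
var gl = canvas.getContext('webgl') ||
         canvas.getContext('experimental-webgl'); // older builds

if (gl) {
  gl.clearColor(0.0, 0.0, 0.2, 1.0); // dark blue
  gl.clear(gl.COLOR_BUFFER_BIT);     // executed on the GPU
} else {
  alert('This browser does not support WebGL.');
}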

The puck is going to WebGL, so skate there. Here are three proof points:

1. Web-based gaming

Gaming really drives graphics, and it drives dedicated hardware. For example, have a look at the new Sony Ericsson Xperia™ PLAY, which marries a PlayStation™-certified game controller to an Android-based smartphone.

But for every hardcore gamer who will buy a device like the PLAY or pay for a complex game, there are 10 other people (friends, mom, gardener?) who won’t. They want games that are easy to reach and easy to use; they don’t want to download a game just to play it, but they wouldn’t mind playing it over the web. The success of casual titles like FarmVille, which draws tens of millions of daily active users on Facebook, shows how fast web-based gaming is growing. WebGL is made for this audience, because it lets developers bring rich graphics and more realism to games that run right in the browser.

2. Rich web content

Look at build-your-own websites that allow you to configure, say, a car without leaving the page or downloading an app. In the near future, with built-in browser support for WebGL, these sites will provide stunning 3D interaction and customization without the need for a plug-in.
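As a hedged sketch of how such a configurator might work, the snippet below wires an ordinary page control to a WebGL shader uniform so that clicking a color swatch repaints the 3D car model in place. The gl context, the linked shader program, its uBodyColor uniform, and the drawScene() helper are all assumed to exist for illustration.

// Illustrative only: repaint a 3D car body when a swatch is clicked.
// Assumes an initialized gl context, a linked shader `program` with
// a vec3 uniform "uBodyColor", and a drawScene() helper.
var colorLoc = gl.getUniformLocation(program, 'uBodyColor');

document.getElementById('redSwatch').onclick = function () {
  gl.useProgram(program);
  gl.uniform3f(colorLoc, 0.8, 0.1, 0.1); // new body color
  drawScene();                           // re-render, no page reload
};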

How about a graphics-intensive native app? At the Khronos event at GDC, Qualcomm and the Qualcomm Innovation Center, Inc. (QuIC) demonstrated a “web-Neocore” app for Android, in which we had hand-ported Java/OpenGL ES code to WebGL, HTML and JavaScript. It ran on a reference-design handset with our latest dual-CPU Snapdragon™ MSM8660™ processor and Chrome OS, and it performed remarkably well, reaching well over 25 frames per second with smooth graphics throughout.
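What does a hand-port like that involve? WebGL deliberately mirrors the OpenGL ES 2.0 API, so most calls translate almost line for line. The sketch below shows the flavor of the conversion; the buffer and count names are hypothetical, not the actual Neocore source.

// Flavor of an OpenGL ES -> WebGL port; names are illustrative.
//
// Java (OpenGL ES 2.0):
//   GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vertexBuffer);
//   GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
//
// JavaScript (WebGL) equivalent:
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.drawArrays(gl.TRIANGLES, 0, vertexCount);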

3. Optimizations for hardware

To make rich graphics work, we’ve built the Adreno™ GPU into the Snapdragon platform. The many use cases for 3D-enabled mobile web content have inspired Qualcomm and QuIC to optimize the code paths between the WebGL bindings in the WebKit browser and the Adreno GPU driver in the Snapdragon platform. The result is a direct pipe between the rich graphics APIs in WebGL and the dedicated graphics hardware of Adreno. Our goal is to enable web content owners and game developers to build stunning 3D content within the browser at speeds close to those of native code.
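With that pipe in place, the application code itself can stay simple. A plain browser render loop, sketched below, is enough to keep the GPU fed every frame; drawScene() is an assumed helper, and requestAnimationFrame is still vendor-prefixed in some of today’s browsers.

// Minimal render-loop sketch: each pass through drawScene() sends
// WebGL calls straight down the optimized path to the GPU driver.
// drawScene() is an assumed helper that issues the draw calls.
function frame() {
  drawScene();
  window.requestAnimationFrame(frame); // schedule the next frame
}
window.requestAnimationFrame(frame);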

We also joined forces with Sony Ericsson to demonstrate the web-Neocore app running on Android. Sony Ericsson’s engineers developed a port of the WebGL bindings code, and together we were able to demonstrate the same modified Neocore app in an extended Android browser. The hardware we used was the single-CPU Snapdragon MSM8255™ found in the Xperia PLAY and Xperia Arc. Have a look at the WebGL Tech demo on Sony Ericsson’s blog to see how smoothly the application runs and how hard it is to distinguish it from a native application. You can also see this same type of user experience in the video of our demos from Mobile World Congress.

WebGL is one of many core technologies driving the next generation of HTML5-branded applications and content. Like our partners at Khronos and Sony Ericsson, Qualcomm believes that software development and web languages are heading toward the same future of cool, innovative user experiences.

What do you think is bringing native app developers and web app developers into each other’s world? Where do you think the puck is going to be? Let me know in your comments below.

Sayeed Choudhury

Senior Director of Product Management, Internet of Everything


Related News

Snapdragon

Snapdragon Wear 2100 powers high-end fashion smartwatches at Baselworld

Silicon Valley met Switzerland at this year’s Baselworld, the world’s premier event for the watch and jewelry industry, which celebrated its 100th anniversary this year. Several impressive smartwatches made their debut, all touting the Qualcomm Snapdragon Wear 2100 Platform and all powered by Android Wear 2.0. With this reliable platform and OS developed specifically for wearables, it’s no wonder high-end brands are looking beyond basic wearable functions, and combining style with technology to develop chic smartwatches fit for any lifestyle.

Snapdragon Wear 2100, the superior SoC for smartwatches, features an integrated, ultra-low-power sensor hub. It’s 30 percent smaller than previous-generation wearable SoCs, giving OEMs the freedom to develop thinner, sleeker product designs. And because it uses 25 percent less power than its older sibling (the Snapdragon 400), watchmakers can offer even more features and better designs.

The Snapdragon Wear 2100 comes in both tethered (Bluetooth and Wi-Fi) and connected (3G and 4G LTE) versions. The latter allows wearers to do more with their wearables, from streaming music to sending messages to calling a cab, in tandem with — or even without — having to bring their smartphones along.

Each of the touchscreen smartwatches included in this roundup runs Android Wear 2.0, Google’s latest wearable operating system, and can pair with both iOS and Android phones. With Android Wear 2.0, users can personalize their watch faces with chronometer-style complications and create shortcuts to their favorite applications. In addition to the pre-installed Google Fit and calendar apps, more apps can be downloaded directly through the on-watch Google Play store, so wearers can customize their device to their lifestyle.

Android Wear 2.0 brings the Google Assistant to your wrist. Find answers and get things done even when your hands are full. Reply to a friend, set a reminder, or ask for directions. Just hold the power button or say “OK Google”.

Check out some of the Snapdragon Wear powered smartwatches that made a splash at this year’s Baselworld:

Apr 18, 2017

Snapdragon

Caffe2 and Snapdragon usher in the next chapter of mobile machine learning

Machine learning, at its core, is a method by which we can turn huge amounts of data into useful actions. Most of the attention around machine learning technology has involved super-fast data processing applications, server farms, and supercomputers. However, far-flung servers don’t help when you’re looking to magically perfect a photo on your smartphone or to translate a Chinese menu on the fly. Making machine learning mobile — putting it on the device itself — can help unlock everyday use cases for most people.

Qualcomm Technologies’ engineers have been working on the machine learning challenge for years, and the fruits of that work are evident in Qualcomm Snapdragon mobile platforms, which have become a leader in on-device machine learning. It’s a core component of the Snapdragon product line, and you’ll see machine learning technologies both in our SoCs (820, 835, and some 600-tier chipsets) and in adjacent platforms like IoT and automotive.

And we aren’t pushing this technology forward by ourselves. We’re working with a whole ecosystem of tools, savvy OEMs, and software innovators to proliferate new experiences for consumers. These experiences use on-device machine learning, and we could not have conceived of them all by ourselves.

An exciting development in this field is Facebook’s stepped-up investment in Caffe2, the evolution of the open source Caffe framework. At this year’s F8 conference, Facebook and Qualcomm Technologies announced a collaboration to optimize Caffe2, Facebook’s open source deep learning framework, for the Qualcomm Snapdragon neural processing engine (NPE) framework. The NPE is designed to do the heavy lifting needed to run neural networks efficiently on Snapdragon, leaving developers with more time and resources to focus on creating innovative user experiences. With Caffe2’s modern computation graph design, minimalist modularity, and flexibility to port to multiple platforms, developers have greater freedom to tackle a range of deep learning tasks, including computer vision, natural language processing, augmented reality, and event prediction.

Caffe2 is deployed at Facebook to help developers and researchers train machine learning models and deliver artificial intelligence (AI)-powered experiences in various mobile apps. Now, developers will have access to many of the same tools, allowing them to run large-scale distributed training scenarios and build machine learning applications for mobile.

One of the benefits of Snapdragon and the NPE is that a developer can target individual heterogeneous compute cores within Snapdragon for optimal performance, depending on the power and performance demands of their applications. The Snapdragon 835 is designed to deliver up to 5x better performance when processing Caffe2 workloads on our embedded Qualcomm Adreno 540 GPU (compared to CPU). The Hexagon Vector eXtensions (HVX) in the Qualcomm Hexagon DSP are also engineered to offer even greater performance and energy efficiency. The NPE includes runtime software, libraries, APIs, offline model conversion tools, debugging and benchmarking tools, sample code, and documentation. It is expected to be available later this summer to the broader developer community.

Qualcomm Technologies continues to support developers and customers with a variety of cognitive capabilities and deep learning tools alongside the Snapdragon platform. We anticipate that developers will be able to participate in a wider and more diverse ecosystem of powerful machine learning workloads, allowing more devices to operate with greater security and efficiency.

We don’t yet know the full range of applications for the technology, but we can’t wait to see how it’s used by innovative developers around the world.

Sign up to be notified when the Snapdragon neural processing engine SDK is available later this summer.

Apr 18, 2017

Developer

Hardware-software convergence: Key skills to consider

Hardware-software convergence, the trend of hardware and software systems working ever more closely together, shows each empowering (and sometimes literally powering) the other. In our current development environment, this is happening more than ever. Deep technical skills are of the utmost importance for navigating this trend, but the soft skills we apply to our engineering practices are just as important in determining our success.

What skills do developers need to nurture, and how do you put them to good use? In this piece, we’ll cover three soft skills developers can use to stay ahead of the hardware-software convergence, and share resources to help you grow and maintain those skills.

Creative inspiration

First off: creative inspiration. While it’s easy to identify your technical shortcomings and fill those gaps with training and practice, knowing which soft skills to hone can be a lot more complicated. In fact, you could think of these soft skills as “mindsets,” since they’re more about how you approach a problem than about the tools you use to solve it. For this first skill, it’s important to start approaching challenges antidisciplinarily, rather than relying on existing mental frameworks. That’s what being creative is all about – finding new ways of doing things.

So where do you start? Ask yourself this question: What is the dent you want to make in the universe? Begin from a place of passion – think about what problems and projects keep you up at night, and what issues big or small you want to solve.

Then, understand that creative inspiration is a process. What seems like overnight genius is often the result of many failed attempts (Thomas Edison’s 1,000 or so tries at the lightbulb, for example), followed by the fortitude to gain a deeper understanding of the issue and then apply your imagination. We particularly like the design thinking method, which encourages starting from a place of inspired empathy and developing knowledge through lean prototyping and iteration. The Stanford d.school has a Bootcamp Bootleg that you can download for a quick-start guide to this design framework.

Apr 17, 2017

Snapdragon

Artificial intelligence tech in Snapdragon 835: personalized experiences created by machine learning

As our mobile devices have matured, gaining the ability to connect to the Web, we’ve labeled them “smart.” But why settle for just smart? Harnessing the power of the Qualcomm Snapdragon 835 processor, developers and OEMs are taking our devices to the next level, creating new experiences with the aid of machine learning. From superior video and security to your own personal assistant, your Snapdragon device has the ability to operate intelligently — outside of the cloud or a Web connection — allowing you to experience your smarter phone in an entirely new way.

Application developers and device manufacturers understand what their users want. They can create a feature or an application that uses machine learning (more specifically, deep neural networks) to improve the performance of a particular task, such as detecting or recognizing objects, filtering out background noise, or recognizing voices or languages. These applications usually run in the cloud, and depending on the device they’re on, that can be sub-optimal.

The Snapdragon Neural Processing Engine SDK was created to help developers determine where on the processor to run their neural network-powered applications. For example, an audio/speech detection application might run on the Qualcomm Hexagon DSP and an object detection or style transfer application on the Qualcomm Adreno GPU. With the help of the SDK, developers have the flexibility to target the core that best matches the power and performance profile of the intended user experience. The SDK supports convolutional neural networks and LSTMs (Long Short-Term Memory networks) expressed in Caffe and TensorFlow, as well as conversion tools designed to ensure optimal performance on Snapdragon’s heterogeneous cores.

The Hexagon DSP and its Hexagon Vector eXtensions (HVX) offer an impressive power and performance mix for running neural networks on device. Performance is up to 8X faster and 25X more power efficient than using the CPU, which translates to lower battery consumption overall. In addition to support via the Snapdragon Neural Processing Engine, TensorFlow is directly supported on the Hexagon DSP, giving developers multiple options for running their chosen neural network-powered apps.

Here are a few applications that could be facilitated by Snapdragon 835 on-device machine learning tech:

Photography: Machine learning can aid in scene classification, real-time noise reduction, and object tracking, making it easier to take the perfect shot or capture video regardless of the conditions.

VR/AR: With machine learning on your device, VR/AR features can operate faster and with less lag, so everything from gestures and facial recognition to object tracking and depth perception feels immersive.

Voice detection: Your phone’s on-device AI can listen for commands and keywords to help you navigate the data and apps on your device more efficiently, and save power doing so.

Security: With facial recognition software and iris scanning, all operating independently of the cloud, your device can learn to identify you and help protect you.

Connections: Your Snapdragon device has the ability to filter out distracting background noise during calls for clearer conversations with friends and family.

Qualcomm Technologies’ unique machine learning platform is engineered so devices powered by the Snapdragon 835 can run trained neural networks on device, without relying on a connection to the cloud. Pretty innovative, right?

Take a look at our previous deep dives into each of the Snapdragon 835 key components — battery, immersive AR and VR, photos and video, connectivity, and security — all of which combine to make the Snapdragon 835 mobile platform truly groundbreaking.

And sign up to receive the latest Snapdragon news.

Apr 13, 2017

Developer

Developer of the month: Computing on the edge with Solstice

The beauty of developer boards like the DragonBoard 410c from Arrow Electronics is their adaptability, and we love hearing about all the different ways people are using them.

We selected Julian Dale and the team at Solstice as our Qualcomm Developer Network Developer of the Month for their use of the DragonBoard 410c as the hub of their edge facilities management solution. Their proof-of-concept demo was one of the first to showcase AWS Greengrass, which is currently available only in limited preview.

We talked to Julian to find out about the challenges of building real-world IoT solutions, and how using the DragonBoard 410c helped them prototype an edge gateway running AWS Greengrass, extending AWS IoT and Lambda functions to add intelligence to what gets sent to the cloud.

Can you tell us about your company and what you develop?        
We’re strategists, researchers, designers and engineers hell-bent on changing the way the world does business. We’re headquartered in Chicago, IL and have delivery offices in New York, NY and Buenos Aires, Argentina.

How was your company started?
In 2001 J Schwan, Founder & CEO, established Solstice Consulting in Chicago as an IT services firm. In 2008, J pivoted the company from developing web-based experiences into mobile. Over these past seven years we’ve grown from a Chicago-based mobile boutique to a technology firm of over 400 designers and engineers. To reflect this growth, this past fall we rebranded to simply: Solstice.

What is your company's mission?
Solstice is a global innovation and emerging technology firm that helps Fortune 500 companies seize new opportunities through world-changing digital solutions. We exist to prove what businesses are capable of.

Can you share with us your company’s project using DragonBoard 410c and AWS Greengrass? How did your company decide on these technology solutions?
We built a facilities management solution called The Pulse, designed as an elegant way to capture data in existing buildings and derive insights from the vast amount of information. It was first featured at Solstice's annual innovation conference, Solstice FWD, in September 2016, and has since been upgraded to leverage AWS Greengrass. This powerful edge computing use case was showcased during AWS re:Invent in November 2016.

We built a number of sensor packs that tracked temperature, humidity, sound intensity, and motion. We deployed them throughout the conference venue in order to show attendees a heat map of the busiest areas – where exciting things might be happening – or how to get away from everything for a bit. With The Pulse our aim is to help attendees feel more plugged into the conference experience, and help them make informed decisions about what to do next.

The Pulse uses a DragonBoard 410c, a development board based on the Qualcomm Snapdragon 410 mobile platform, as a gateway device that aggregates and batches information from many different sensors. Snapdragon processors have the power and speed to support edge processing, allowing increased local analysis of data for a more secure and reliable IoT implementation. Combined with AWS Greengrass, this brings new possibilities to commercial, industrial, medical and smart city solutions that can’t rely exclusively on the cloud.
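As a rough illustration of that gateway pattern (not Solstice’s actual code), the Node.js sketch below buffers readings from many sensors and publishes them upstream in periodic batches over MQTT using the open source mqtt package. The endpoint, topic, and payload shape are hypothetical.

// Illustrative Node.js sketch of the edge-gateway pattern described
// above; endpoint, topic, and payload shapes are hypothetical.
const mqtt = require('mqtt');
const client = mqtt.connect('mqtts://example-iot-endpoint:8883');

const buffer = [];

// Called by each sensor driver, e.g. { sensor: 'temp-3', value: 22.4 }
function onSensorReading(reading) {
  buffer.push(Object.assign({ ts: Date.now() }, reading));
}

// Publish one aggregated batch every 30 seconds instead of a message
// per reading, so far less traffic has to cross to the cloud.
setInterval(function () {
  if (buffer.length === 0) return;
  client.publish('pulse/venue/batch', JSON.stringify(buffer.splice(0)));
}, 30000);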

What does innovation mean to your company? 
Innovation is a core pillar of why Solstice exists as an organization. We are always looking at the now, near, and next technologies that will shape our future. In 2015, we launched Solstice Labs, our internal R&D extension, to ensure we’re constantly investigating, testing and dreaming up the innovative technologies of the near future. From emerging IoT technology to augmented reality and much more, we invest in learning what’s coming next so we can keep our clients always looking ahead.

Share with us a fun fact about your company.
Every week, Solstice employees nominate their peers for the “Awesomeness Award,” which is given out every Thursday at our all-hands company meeting. The winner is chosen by the previous week’s winner, and then has the privilege of sitting in the “Awe-some” Office the following week, often using it as a fun space for daily team standups and meetings. The Awesomeness Award promotes our culture of servant leadership by recognizing people who make their teammates and projects shine.

Apr 6, 2017