Achieving autonomy and seamless cellular connectivity: Mission Impossible?
For many years, drones and robots have been one-dimensional, executing simple remote control commands. While perception, path planning, and flight control have each advanced, the three fields have not been integrated, preventing true autonomy from being realized. We recognize this capability gap and are tackling the problem at its inception, not only significantly advancing each of the three fields but also pioneering the means for successfully intersecting and interconnecting them, leading to full autonomy.
Beyond autonomy lies the issue of cellular connectivity. Today’s cellular networks are ready to start serving Unmanned Aircraft Systems (UAS), but will need some optimization to handle the inevitable hundreds of autonomous drones poised to flood our skies. Our team is meeting this challenge by studying how to optimize existing 4G LTE networks to handle the increased load, and also driving the design of 5G to natively and optimally support the drones.
Key Research Areas:
New advances in autonomous perception.
We are developing a sensing system that enables drones to simultaneously communicate imagery to their users and leverage that same imagery to detect obstacles. Our team has developed a mobile-optimized depth-from-stereo implementation. Depth from stereo correlates common features across images captured by a pair of cameras and triangulates their depth, determining the drone’s proximity to objects in its field of view and producing a two-dimensional depth map. Each pixel in the depth map contains the distance to that point in space. As the drone flies, depth maps are updated at 30 FPS and can be used for obstacle avoidance.
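The triangulation step can be sketched in a few lines. This is an illustrative example, not our production implementation, and the camera parameters (focal length, baseline) are hypothetical values; for rectified stereo cameras, depth is focal length times baseline divided by disparity:

```python
# Illustrative depth-from-stereo triangulation (hypothetical camera
# parameters; not the actual mobile-optimized implementation).

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 400.0,
                         baseline_m: float = 0.08) -> float:
    """Triangulate the distance (meters) to a feature seen by both cameras.

    disparity_px is the horizontal pixel offset of the same feature
    between the left and right images; larger disparity = closer object.
    """
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_length_px * baseline_m / disparity_px

def depth_map(disparities):
    """Convert a 2D grid of per-pixel disparities into a depth map."""
    return [[depth_from_disparity(d) for d in row] for row in disparities]

# A feature offset by 8 pixels between the two views:
# 400 * 0.08 / 8 = 4.0 meters away.
print(depth_from_disparity(8.0))  # → 4.0
```

Repeating this per pixel yields the two-dimensional depth map described above.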
Visual-Inertial Odometry (VIO) provides the drone’s relative location, indoors and outdoors, using a monocular camera. By combining the camera’s temporal sequence of images with inertial sensor data (gyroscope and accelerometer), VIO generates a 6-DOF pose of the drone in real time with centimeter-level accuracy. For outdoor missions, VIO and GPS data can be fused, yielding a centimeter-accurate 3D position of the drone relative to the world. As the drone flies, it combines VIO and depth map data to form 3D representations of objects, which assist in planning and navigation.
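One common way to fuse drift-prone relative VIO estimates with absolute but noisy GPS fixes is a loosely coupled filter. The sketch below uses a simple complementary filter as a stand-in for the fusion described above (the actual system is not public, and the blend weight is a hypothetical value):

```python
# Hedged sketch of loosely coupled VIO/GPS position fusion via a
# complementary filter. VIO gives smooth relative motion that drifts;
# GPS gives absolute fixes that bound the drift.

class VioGpsFuser:
    def __init__(self, gps_weight: float = 0.02):
        self.gps_weight = gps_weight  # how strongly GPS corrects VIO drift
        self.position = None          # fused (x, y, z) estimate, meters

    def update(self, vio_delta, gps_fix=None):
        """vio_delta: incremental (dx, dy, dz) from VIO since last update.
        gps_fix: optional absolute (x, y, z); None when unavailable
        (e.g. indoors), in which case the filter dead-reckons on VIO."""
        if self.position is None:
            self.position = gps_fix if gps_fix is not None else (0.0, 0.0, 0.0)
        # Dead-reckon with the VIO increment...
        self.position = tuple(p + d for p, d in zip(self.position, vio_delta))
        # ...then pull the estimate toward the GPS fix to bound drift.
        if gps_fix is not None:
            w = self.gps_weight
            self.position = tuple((1 - w) * p + w * g
                                  for p, g in zip(self.position, gps_fix))
        return self.position
```

A production system would typically use an extended Kalman filter with full 6-DOF state, but the blend-relative-with-absolute idea is the same.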
Depth from stereo. A pair of cameras finds the same point in the left and right images, triangulates to get depth, and creates a point cloud by repeating this for every pixel.
Overcoming obstacles in autonomous path planning and navigation.
We have developed autonomous path planning and navigation systems for drones and robots, enabling them to move safely through indoor and outdoor environments. For example, prior to flight, the user designates where the drone should go and the bounds of the area it will fly through. The drone’s path planning algorithm uses a 3D model of the world (generated through voxel mapping) to build a random graph of unoccupied points in space and the safe transitions between them. The graph represents all the collision-free paths the drone could take to reach its goal. The drone may find multiple paths but will pick the shortest one to its destination. Every hundred milliseconds, it updates its 3D voxel map and re-checks the planned path to ensure it is still safe. If the drone encounters a potential hazard or obstacle in its path, it re-vectors to an alternate route based on its internal decision making.
During autonomous path planning, the robot builds a 3D roadmap of safe waypoints and collision-free edges between them. The drone picks the minimum cost (shortest) path to the goal on this graph but may change its path whenever obstacles come into view.
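The graph-building and shortest-path steps above can be sketched as a minimal probabilistic roadmap. The details here are assumptions for illustration (a 2D world with one circular no-fly zone instead of a 3D voxel map): sample collision-free points, connect nearby pairs whose connecting segment avoids obstacles, then run Dijkstra to pick the minimum-cost path.

```python
# Minimal probabilistic-roadmap sketch (assumed 2D world with circular
# obstacles; the real system plans in a 3D voxel map).

import heapq, math, random

def collision_free(p, q, obstacles, steps=20):
    """Check the straight segment p -> q against circular obstacles."""
    for i in range(steps + 1):
        t = i / steps
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        if any(math.hypot(x - ox, y - oy) < r for ox, oy, r in obstacles):
            return False
    return True

def build_roadmap(start, goal, obstacles, n_samples=80, radius=3.0, seed=1):
    """Sample unoccupied points and connect nearby, collision-free pairs."""
    random.seed(seed)
    nodes = [start, goal]
    while len(nodes) < n_samples:
        p = (random.uniform(0, 10), random.uniform(0, 10))
        if collision_free(p, p, obstacles):  # point itself is unoccupied
            nodes.append(p)
    edges = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            d = math.dist(nodes[i], nodes[j])
            if d < radius and collision_free(nodes[i], nodes[j], obstacles):
                edges[i].append((j, d))
                edges[j].append((i, d))
    return nodes, edges

def shortest_path(nodes, edges, src=0, dst=1):
    """Dijkstra over the roadmap; returns waypoints or None if unreachable."""
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, w in edges[u]:
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None
    path, u = [dst], dst
    while u != src:
        u = prev[u]
        path.append(u)
    return [nodes[i] for i in reversed(path)]

obstacles = [(5.0, 5.0, 1.5)]  # one circular no-fly zone
nodes, edges = build_roadmap((1, 1), (9, 9), obstacles)
path = shortest_path(nodes, edges)
```

Re-running the roadmap update every hundred milliseconds against a refreshed obstacle map corresponds to the re-checking described above.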
Innovating advanced flight controls.
We continue to evolve our Snapdragon™ Flight, a highly sophisticated flight controller board that integrates sensors (IMU, cameras, etc.), the main processor, power management systems, and memory all on one open source platform. Snapdragon Flight was designed optimally for robotics applications, both in terms of form factor and input/output capabilities, enabling it to run computer vision, video processing, and flight control functions all on a single system on chip (SoC). Learn more about Snapdragon Flight.
Monitoring and regulating motor speeds has long been a research challenge in drone flight control, because the motors that spin a drone’s propellers did not communicate with the flight controller. The pilot was effectively “flying blind”, unaware of why motors were spinning slower or faster than usual, creating uncertainty about the drone’s flight performance. To overcome this, we pioneered a closed-loop Electronic Speed Control (ESC) system that monitors and regulates motor speeds, ensuring safety and responsiveness. The ESC reports each motor’s RPM to the flight controller, allowing it to take corrective action when needed. For example, should a propeller break, its motor will start spinning faster than usual due to the decreased load. The flight controller receives this data from the ESC and can direct another motor to perform a counter-yaw rotation to control the drone’s yaw and keep it stable.
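The fault-detection side of this loop can be illustrated with a few lines of code. This is a hypothetical sketch (the RPM figures and tolerance are invented example values, not ESC firmware): compare each motor’s reported RPM against its command and flag large deviations, such as a motor over-speeding under the reduced load of a broken propeller.

```python
# Illustrative closed-loop health check (hypothetical numbers; not the
# actual ESC firmware): compare reported RPM against commanded RPM.

def check_motor_health(commanded_rpm, reported_rpm, tolerance=0.15):
    """Return indices of motors whose measured RPM deviates from the
    command by more than `tolerance` (as a fraction), so the flight
    controller can take corrective action."""
    faulty = []
    for i, (cmd, meas) in enumerate(zip(commanded_rpm, reported_rpm)):
        if cmd > 0 and abs(meas - cmd) / cmd > tolerance:
            faulty.append(i)
    return faulty

# Motor 2 reports 9500 RPM against a 6000 RPM command -- consistent
# with a broken propeller letting the motor spin up under reduced load.
print(check_motor_health([6000] * 4, [6050, 5980, 9500, 6010]))  # → [2]
```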
Autonomous visual navigation for drones and robots begins with image collection, whereby a stereo pair of cameras captures images and creates a depth map. During obstacle mapping, VIO is leveraged to accurately stitch the maps together, and this data is used for path planning. The flight controller then takes action, spinning the motors appropriately.
Pioneering new means for autonomous drone cellular connectivity.
We provide the connectivity fabric for autonomous drones. We are optimizing 4G LTE for safe drone operation by leveraging our FAA-authorized test environment, which represents “real world” conditions. To accelerate 5G technology development, we are supporting 5G specifications within 3GPP, specifically for massive deployments of mission-critical drone use cases.
Our early findings from wide-area 4G LTE autonomous drone mission tests conducted over commercial cellular networks proved that today’s LTE networks can serve drones at low altitudes without adversely impacting terrestrial devices. While antennas in today’s networks do not usually point toward the sky, the free-space propagation characteristics seen by drones more than make up for the reduced antenna gains, providing good received signal power at these altitudes. During hundreds of tests, drones demonstrated seamless handovers between base stations with zero link failures. Operating at 400 feet, drones detected base stations from long distances, receiving strong reference signals from them. Our Unmanned Aircraft System (UAS) Flight Center was recently opened to conduct these tests under an FAA Certificate of Authorization. We continue to conduct multi-band and multi-altitude tests within FAA-controlled Class B airspace spanning commercial, residential, and rural environments.
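A back-of-the-envelope link budget illustrates why free-space propagation can offset reduced antenna gain toward the sky. All numbers below are hypothetical examples, not measured values from our tests:

```python
# Link-budget sketch using the Friis free-space path loss model
# (hypothetical example values, not measured test data).

import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def received_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                       distance_m, freq_hz):
    """Received power = transmit power + antenna gains - path loss."""
    return (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
            - free_space_path_loss_db(distance_m, freq_hz))

# Example: a drone 2 km from a 2 GHz cell transmitting at 46 dBm, with
# antenna gain reduced to 3 dBi toward the sky and a 0 dBi drone antenna.
print(round(received_power_dbm(46, 3, 0, 2000, 2e9), 1))  # → -55.5
```

Even with modest antenna gain toward the sky, a near-line-of-sight path keeps the received power comfortably above typical LTE sensitivity levels, consistent with the strong reference signals observed at altitude.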
We are accelerating 5G development specifically for wide-scale deployments of mission-critical drone use cases such as package delivery, search and rescue, agricultural inspection, and many more. 5G will elevate the user experience, offering wider bandwidths and massive MIMO. It will deliver uniform coverage with reliable mobile broadband at different altitudes and speeds for numerous types of drones. Additionally, it will provide direct drone-to-drone communication, multi-hop, and relays for safety and extended coverage.
If you find the work we’re doing in autonomous robotics to be exciting, and you have a background in computer vision, autonomous path planning, flight controls, or drone cellular connectivity, we’d love to hear from you. Please visit us at www.qualcomm.com/company/careers to submit your resume. When creating your Qualcomm profile, please enter the activity code, "Robotics".