Verizon and the Caltech Center for Autonomous Systems and Technologies (CAST) are testing ways to use the carrier’s 5G Ultra Wideband network, mobile edge compute (MEC) technology and artificial intelligence (AI) to enable interpretation of data from weather drones in near real-time.
By shifting the heavy AI processing of the massive amounts of data the drones generate from the aircraft to the network edge, Verizon and CAST hope to enable near-instantaneous in-flight course adjustments.
The tests are being conducted in a three-story aerodrome with more than 2,500 tiny computer-controlled fans that can simulate conditions ranging “from a light gust to a gale.” The fan wall, designed by Caltech graduate students, can be tilted 90 degrees to simulate vertical takeoffs and landings. It also served as the blueprint for the fan wall used to test the Mars Ingenuity helicopter at the Jet Propulsion Laboratory (JPL), which Caltech manages for NASA.
Verizon is funding the one-year drone project and several other 5G CAST projects.
“By collaborating with CAST researchers, we hope to accelerate the innovation process and development of unmanned aerial vehicles that can autonomously navigate using 5G, edge compute and AI,” Nicki Palmer, Verizon’s Chief Product Development Officer, said in a press release. “This research project is just the tip of the iceberg of what we hope to see tested. The facility and areas of exploration that CAST is working on represent the types of use cases that 5G can really take to the next level.”
Relocating computing to the edge of the network drastically reduces latency and makes ambitious applications possible. Last April, Honda and Verizon said they would use 5G Ultra Wideband and MEC platforms to enhance road safety by speeding communications between road infrastructure, vehicles and pedestrians. That research was to be conducted at the University of Michigan’s Mcity test bed for connected and autonomous vehicles.