Professor in wireless engineering exploring means to optimize AR/VR device speeds

Published: May 4, 2022 8:15 AM

By Joe McAdory

Can wireless augmented reality/virtual reality (AR/VR) devices perform at optimal, real-time speed? Shiwen Mao, director of the Wireless Engineering Research and Education Center, will soon find out.

His three-year, $993,319 study, “Learning based Resilient Immersive Media-Compression, Delivery, and Interaction,” was one of 37 selected by the National Science Foundation’s (NSF) Resilient and Intelligent NextG Systems (RINGS) program. RINGS is jointly funded by the NSF, the Office of the Under Secretary of Defense for Research and Engineering (OUSD R&E), the National Institute of Standards and Technology (NIST), and several industry partners. The program seeks to accelerate research in areas that could have a significant impact on emerging Next Generation wireless and mobile communication, networking, sensing, and computing systems.

Mao, who will explore innovative technologies to provide a unified media compression, communication, and computing framework to enable real-time AR/VR, believes the study has the potential to make significant impacts within the research community and society.

“Immersive media, such as augmented reality/virtual reality, has been recognized as a transformative service for Next Gen network systems, while wireless-supported AR/VR will offer great flexibility and an enhanced immersive experience to users and unleash a plethora of new applications,” Mao said.

For instance, retail inventory managers can don wireless AR/VR goggles, connect to specific servers and, voila, view the location in real time, as if they were there. “But the challenge is that it has to be in real time,” Mao said, noting that wireless devices, which allow users more freedom, are not as fast as their wired counterparts, and that immersive media applications typically require large amounts of data to be transmitted.

The project features five thrusts, Mao said. Thrusts one and two focus on learning-based immersive media compression, developing high-efficiency light field and point cloud compression solutions. Thrusts three and four explore the fundamental performance concepts and techniques that facilitate wireless AR/VR transmission and interaction. Thrust five will integrate all of the techniques developed and validate their performance through simulation studies using open-source datasets and experimental studies using an AR/VR testbed and publicly available wireless and cloud platforms.

Mao, the principal investigator, is collaborating with Zhu Li, co-investigator and associate professor in computer science and electrical engineering at the University of Missouri-Kansas City. “Our team is unique,” Mao said. “Dr. Li is a video compression and signal processing expert, and my expertise is in wireless communications and networking. We complement each other, so we can produce results.”

“Receiving this award is more than an honor,” Mao said. “It is also a great responsibility. We need to deliver.”

But Mao wants to take this study a step further than the RINGS program. He wants to “connect the unconnected.”

“Forty-four million households in America have no broadband connection,” he said, “many living in rural areas with limited access to schools or education. But what if we can offer immersive media applications? What if we can bring virtual classrooms to them? We want to use this kind of AR/VR, using tools we find through this study, and deliver real-time education to places that cannot offer this.

“We can make an impact on the world, and that’s what’s most important.”

Media Contact: Joe McAdory, jem0040@auburn.edu, 334.844.3447
