Spatial computing redraws the world of work

As 2023 draws to a close, it’s natural to look back at what has emerged in the networking industry over the past 12 months and assess the impact it has made and could make over the next year. To set a baseline, virtually everyone would agree that this is now truly the age of 5G, artificial intelligence (AI) and virtual/augmented realities, and that companies are beginning to see significant and meaningful applications built on these platforms. Key among them has been the appearance of spatial computing, a phenomenon that has existed in theory for some time but is now making its mark in transforming the world of work.

Immersive technologies are fast moving out of the games and leisure arenas and into the industrial, manufacturing, education and enterprise segments, poised to create new paradigms for each sector and to address the new realities of enterprise working. These emerging technologies are being used to great effect at the intersection of computer vision, artificial intelligence, deep learning, cutting-edge computer graphics, high-throughput, super-low-latency 5G networks and the cloud.

Combining them can create a new, intelligent perception of the world through augmented reality (AR), mixed reality (MR) and virtual reality (VR) devices. Doing so means addressing devices, display technology, compute performance, network performance, perception technology, content, contextual AI, batteries and charging technologies, and connectivity in general.

The spatial computing world is thus one of huge complexity and has created its own ecosystem to address the fundamental functional elements. It is also the fruit of around three to four years of development by leading technology companies.

Moving to spatial computing

For Jérôme Jacqmin, senior director of product marketing at Qualcomm France, the journey to spatial computing starts rather further back than that. Indeed, he suggests we have been on a journey from the 2D images that, for decades and even centuries, people have used to communicate with one another.

“So now we move to spatial computing, and suddenly it’s the world that surrounds you basically that becomes your desktop and [the world] with which you are going to interact,” he says.

“You need to provide users with significant applications that really matter to them. This is going to happen thanks to artificial intelligence and the interaction between on-device computing and cloud computing. And this is where 5G plays a very important role. It’s very high throughput, super low latency and service in real time, with the best user experience between the devices and the cloud.”

Qualcomm believes that to create effective immersive headsets and devices, a number of perception technologies need to be addressed. From a user-understanding perspective, this includes tracking technologies that encompass hand controllers, support six degrees of freedom for head movement, and track eyes and even facial expressions. Alongside this sits environmental understanding, which supports spatial mapping and meshing with scene awareness and object/image recognition.

Now we move to spatial computing, and suddenly it’s the world that surrounds you that becomes your desktop
Jérôme Jacqmin, Qualcomm France

The technology also needs to support 3D plane detection as well as local anchors and image persistence. Running over all of these capabilities, there needs to be optimised processing that covers head-mounted split rendering and low-latency video see-through.
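To make the idea of user and environmental understanding more concrete, the minimal Python sketch below shows how a six-degrees-of-freedom pose, a 3D position plus an orientation quaternion, can be used to pin virtual content to a local spatial anchor. It is an illustrative sketch only; the anchor values and helper names are assumptions and do not come from Qualcomm’s SDKs.

# A minimal sketch (not vendor code) of how a six-degrees-of-freedom pose,
# a 3D position plus an orientation quaternion, can pin virtual content to a
# local spatial anchor. Anchor values and helper names are illustrative only.
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by the unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def anchor_to_world(anchor_position, anchor_orientation, local_point):
    """Transform a point from anchor-local space into world space."""
    return anchor_position + quat_rotate(anchor_orientation, local_point)

# Hypothetical local spatial anchor detected on a desk, 1.5m in front of the user
anchor_pos = np.array([0.0, 0.8, -1.5])
anchor_rot = np.array([1.0, 0.0, 0.0, 0.0])  # identity quaternion: no rotation

# A virtual object placed 10cm above the anchor stays locked to it even as the
# headset's own 6DoF head pose changes from frame to frame
print(anchor_to_world(anchor_pos, anchor_rot, np.array([0.0, 0.1, 0.0])))
# -> [ 0.   0.9 -1.5]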

In October 2023, Qualcomm announced the fruits of a technology partnership with Meta in the form of two new spatial computing platforms – Snapdragon XR2 Gen 2 and Snapdragon AR1 Gen 1 – designed to be the base for the next generation of MR and VR devices, as well as smart glasses.

The Snapdragon XR2 Gen 2 platform is said to contain a graphics processing unit and AI that can deliver what Qualcomm calls “ground-breaking” immersion that seamlessly blends the physical and virtual worlds, while the Snapdragon AR1 Gen 1 platform is said to be the first dedicated processor for smart glasses, delivering premium camera quality, on-device AI and what the tech firm calls “blazing-fast” connectivity. The XR2 Gen 2 platform is also said to be designed to democratise premium MR and VR technology in a single-chip architecture, unlocking gaming, productivity and entertainment experiences in thinner, more comfortable headsets that don’t require an external battery pack.

Offering richer visuals with enhanced clarity, textures and colours at higher frame rates than previous versions, it is also optimised for resolutions up to 3K per eye to deliver better pixel quality and high-fidelity visuals. The platform is also intended to allow users to blend virtual content with their physical surroundings and transition between MR and VR experiences. It provides full-colour video pass-through with latency as low as 12ms for improved visual fidelity and digital comfort in MR experiences.

Qualcomm regards video see-through (VST) latency as a key differentiator. It says traditional systems are built on a stack comprising camera, lens correction, wait-for-frame, render, wait-for-display, warp and display elements, resulting in latencies of over 50ms. By contrast, its solution is based on a stack with just camera, single-step and display elements, bringing latency down to 12ms.
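As a back-of-the-envelope illustration of why collapsing the pipeline matters, the short Python sketch below sums hypothetical per-stage latencies for the two stacks described above. The individual millisecond figures are assumptions for illustration; only the stage names and the overall “over 50ms” versus 12ms totals come from Qualcomm’s comparison.

# A back-of-the-envelope comparison of the two video see-through stacks described
# above. Per-stage millisecond figures are illustrative assumptions; only the
# stage names and the "over 50ms" versus 12ms totals come from the article.
traditional_stack = {
    "camera": 8, "lens correction": 6, "wait-for-frame": 11,
    "render": 11, "wait-for-display": 8, "warp": 4, "display": 6,
}  # sums to 54ms, i.e. "over 50ms"

single_step_stack = {
    "camera": 5, "single step": 3, "display": 4,
}  # sums to 12ms

for name, stack in (("traditional", traditional_stack), ("single-step", single_step_stack)):
    print(f"{name}: {' -> '.join(stack)} = {sum(stack.values())}ms")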

Commercial prospects

In what is almost certain to be a massive boost to the commercial prospects of the processors, both platforms will debut on Meta devices before the end of 2023. Specifically, the Meta Quest 3 will be powered by the Snapdragon XR2 Gen 2 platform, and the next-generation Ray-Ban Meta smart glasses by the Snapdragon AR1 Gen 1 platform.

“At Meta, we’re focused on developing the technologies of the future in mixed reality and smart glasses, as well as the foundational innovations that will one day power our vision for AR glasses,” says Andrew “Boz” Bosworth, Meta’s chief technology officer and head of the social media giant’s Reality Labs.

“We are defining next-generation technologies that deliver massive breakthroughs in power, performance, and AI. Building this future computing platform requires an industry-leading partner, and this is where our longstanding collaboration with Qualcomm Technologies is critical. The latest Snapdragon XR2 Gen 2 and Snapdragon AR1 Platforms, which power Meta Quest 3 and our next-generation Ray-Ban Meta smart glasses, are another testament to the strength of this partnership, and we are thrilled for users around the world to experience them.”

This is a hugely important point – spatial computing is all about ecosystems and partnerships. And while partners and brands probably don’t come much bigger than Meta on the tech side and Ray-Ban in terms of eyewear, Qualcomm revealed in September that it is working with a number of leading European technology companies to support spatial computing in a variety of use cases, in particular gaming, education, engineering and Industry 4.0.

Founded two years ago, France’s Lynx Mixed Reality has been working with Qualcomm to develop the Lynx-R1 headset. Boasting a team of 15 people with backgrounds ranging from designers to electronics engineers and from Unity experts to low-level system architects, the company states that its mission rests on the belief that “virtual reality as a medium is the best storyteller, and augmented reality is basically a superpower.”

With the Lynx device, the company believes it is pioneering standalone mixed reality, and that its AR/VR ecosystem deserves a versatile and open device at a “very affordable” price point, empowering professionals and industries across the globe to transform how they work, train and learn.

At the product’s launch in 2023, Lynx Mixed Reality CEO Stan Larroque said the company had been working with Qualcomm for three years to create what he believed was the first high-end consumer electronics device of its type to come from a European company.

“It’s like Apple Vision Pro, but so much less expensive, but it does kind of the same stuff,” he said during a demo of the product’s capability. In addition to a detailed look around a hugely complex 3D computer-aided design model of an aerospace engine, showing how engineers could effectively step right inside parts and sub-assemblies, the company also demonstrated medical applications with realistic 3D immersive views of skeletons and musculature.

A world of learning

In education, technology company Avantis Systems has been working for the last three years with Qualcomm to develop ClassVR, as part of its mission to inspire a world of learning where “the impossible becomes possible”. ClassVR was conceptualised in 2014 and launched in January 2017, with the company’s devices aiming to improve student outcomes through increased engagement in the classroom and to improve knowledge retention by learning through experience. The set-up comprises virtual reality headsets, a headset management portal and curriculum-aligned content to add value to lessons regardless of subject. The result of this combination is said to give students the ability to “foster curiosity, ignite imagination and enhance engagement.”

Describing itself as an expert in AR/VR and streaming technologies, Germany’s Holo-Light was founded in 2015 and has over 80 people working from offices in Austria and the US, as well as its domestic territory. Focused on the industrial metaverse, it specialises in immersive software and technology for the enterprise market and looks to pave the way for increased adoption and scalability of XR with its XR Streaming platform, which aims to change the way companies design and develop products.

As well as being a partner of Qualcomm, the company has a client list including blue chips such as Northrop Grumman, P&G, BMW and Mercedes-Benz. Its technologies see use in a number of situations, including product development, design reviews, stakeholder engagement, plant design, assembly training and remote support.

Arthur Technologies aims to solve enterprise hybrid work with a mixed reality office solution that introduces what the company calls meaningful presence and collaborative productivity, enabling firms to embrace hybrid work without what it calls the downsides. The technology is deployed in use cases including special events, learning and development, client and team meetings, as well as what it calls agile project work and meetings. At the heart of the solution is Qualcomm’s Snapdragon Spaces XR Developer Platform, which Arthur says allows the company to reach its goals faster in spatial meetings, local spatial anchors and hand tracking.

And while it is totally behind its own partner developers, such as the aforementioned firms, Qualcomm stressed that it was essential for all of the players in the immersive space to prosper, creating a tide that would lift all virtual boats. Jacqmin noted the importance of Apple and its Vision Pro AR/VR product. “We want them to be successful,” he reveals. “It is important for them to be successful. [That will be] very important for the ecosystem of the market and will attract a lot of developers to invest in it.”

The brave new world of spatial computing will indeed be a complex one, with many moving parts and a total dependence on a number of companies co-operating with each other across various parts of an intricate and multifarious ecosystem. The shape of things to come in many industries is set to become a lot clearer.
