The MWC 2020 event was cancelled due to COVID-19, so Qualcomm instead announced a new XR headset reference design at an online press event.
This design combines cameras and various sensors on top of the Snapdragon XR2, the XR platform Qualcomm announced last year, and pairs high-quality mobile XR with 5G connectivity, which Qualcomm positions as key to VR. XR (eXtended Reality) is an umbrella term covering virtual reality (VR), augmented reality (AR), and mixed reality (MR).
Compared to the Snapdragon 835 used in the Oculus Quest and HTC Vive Focus, the Snapdragon XR2 offers twice the CPU and GPU performance, four times the video bandwidth, six times the resolution, and eleven times the AI processing speed. "AI processing" may sound vague, but the on-device machine learning engine is essential for VR and AR applications that approach human senses: finger tracking, facial-expression recognition, speech recognition, and interaction with virtual objects through various sensors.
The reference design supports seven cameras. Two on the inside are for gaze tracking, two on the outside handle head tracking and mapping of the surrounding environment, and two more are RGB cameras for MR, providing see-through of the outside world or overlaying virtual objects onto it. Each manufacturer can add further cameras for facial-expression recognition, lip tracking, or controller tracking.
5G is provided by the Qualcomm X55 modem, which supports both millimeter wave and sub-6 GHz bands. Beyond simply streaming high-definition VR video, this enables AR applications that stream from a nearby PC or connect to an edge computing node with low latency. Edge computing is the idea of placing compute nodes in 5G base stations or routers, integrated with the wireless network, so that large amounts of data can be processed intelligently at low latency. Workloads too heavy for a mobile device, which conventional cloud computing cannot serve in near real time because of round-trip delay, as well as the local processing of the massive data that future AR applications will require, can both be accelerated this way.
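The latency argument above can be made concrete with a simple motion-to-photon budget. The numbers below are illustrative assumptions, not measured figures from Qualcomm or any carrier; the point is only that a distant data centre blows the budget while an edge node at the base station can fit within it.

```python
# Hypothetical motion-to-photon latency budget in milliseconds.
# All figures are illustrative assumptions, not measured values.

def motion_to_photon(network_rtt_ms, render_ms, display_ms):
    """Sum the main contributors to perceived latency."""
    return network_rtt_ms + render_ms + display_ms

# Remote rendering via a distant cloud data centre vs. an edge node
# co-located with the 5G base station (only the network RTT differs).
cloud = motion_to_photon(network_rtt_ms=60, render_ms=8, display_ms=11)
edge = motion_to_photon(network_rtt_ms=5, render_ms=8, display_ms=11)

print(cloud)  # 79 ms: well above the ~20 ms often cited for VR comfort
print(edge)   # 24 ms: close to the comfort threshold
```

Only the edge path comes anywhere near the roughly 20 ms motion-to-photon latency commonly cited as the comfort threshold for VR.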
Other reference design specifications include hand tracking and head tracking, an IR emitter for SLAM, liquid crystal displays at 2K per eye, the Atraxa tracking receiver from partner NDI, and support for Tobii eye tracking used for foveated rendering.
Atraxa is a technology that achieves accurate 6DoF tracking with magnetic and acceleration sensors. Unlike optical tracking, the headset-side sensor measures the controller's magnetic field rather than relying on camera images, so it can track outside the camera's field of view, for example behind the user's back or when occluded by a hand or another controller.
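NDI has not published Atraxa's internals, but fusing a fast, drift-prone inertial signal with a slower, drift-free absolute reference (here, the magnetic measurement) is classically done with a complementary filter. The sketch below is a generic one-dimensional illustration of that idea, not NDI's actual algorithm; the function name and parameters are hypothetical.

```python
def complementary_filter(mag_estimates, inertial_rates, dt, alpha=0.98):
    """Fuse an absolute but noisy signal with a fast but drifting one.

    mag_estimates: absolute position readings (e.g. from the magnetic
        field measurement) -- drift-free but noisy.
    inertial_rates: rates of change from inertial sensing -- responsive
        but accumulating drift when integrated.
    alpha close to 1 trusts the integrated inertial signal short-term,
    while the absolute reading slowly corrects long-term drift.
    Illustrative only: not NDI's published filter.
    """
    estimate = mag_estimates[0]
    fused = []
    for mag, rate in zip(mag_estimates, inertial_rates):
        estimate = alpha * (estimate + rate * dt) + (1 - alpha) * mag
        fused.append(estimate)
    return fused

# A stationary controller: both sources agree, the estimate stays put.
print(complementary_filter([10.0] * 3, [0.0] * 3, dt=0.01))
```

In a real 6DoF tracker the same principle applies per axis of position and orientation, with the weighting typically handled by a Kalman-style filter rather than a fixed alpha.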
Foveated rendering using gaze tracking draws the region around the point of gaze in high detail and, exploiting how human perception works, makes the image look high quality wherever you look, achieving graphics beyond the GPU's raw performance. Existing VR headsets that cannot track the eyes support fixed foveated rendering, which renders the centre of the view at high resolution and the periphery at low resolution. With the fixed variant, if you move only your eyes without turning your head, the image looks rough, and you cannot glance slightly to the side to read text; you have to consciously turn your head and look straight ahead. With gaze tracking, however, the spot you are actually looking at is always drawn in high quality, so subjectively the overall graphics quality seems to improve.
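The core of the technique is choosing a per-region sampling rate from angular distance to the gaze point. The sketch below illustrates this with made-up eccentricity thresholds; the function, field-of-view figure, and falloff steps are assumptions for illustration, not values from the XR2 or Tobii.

```python
import math

def shading_rate(tile_center, gaze, fov_deg=90.0, width_px=2048):
    """Return a sampling-rate multiplier for a screen tile.

    Tiles near the gaze point render at full rate (1.0); the rate
    falls off with angular eccentricity, mimicking the eye's foveal
    acuity. Thresholds and falloff steps are illustrative only.
    """
    deg_per_px = fov_deg / width_px          # rough degrees per pixel
    ecc_deg = math.dist(tile_center, gaze) * deg_per_px
    if ecc_deg < 5.0:        # fovea: full resolution
        return 1.0
    elif ecc_deg < 15.0:     # near periphery: half rate
        return 0.5
    else:                    # far periphery: quarter rate
        return 0.25

# Fixed foveated rendering is the special case gaze = screen centre:
center = (1024, 1024)
print(shading_rate((1024, 1024), center))  # 1.0 at the lens centre
print(shading_rate((2000, 1024), center))  # 0.25 far in the periphery
```

With eye tracking, `gaze` is updated every frame from the tracker instead of being pinned to the centre, which is exactly why a sideways glance no longer lands in the low-resolution zone.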
The reference design announced this time is expected to become available within a few months to OEMs who want to build headsets on the Snapdragon XR2; its adoption in consumer products will require a little more waiting. Related information can be found here.