Apple recently applied for a patent covering technology that tracks inputs such as its users' eyes, gestures, and even facial expressions. The company is developing a mixed reality headset that would combine these inputs with information gathered from outward-facing sensors.
The patent application, titled “Display System Having Sensors,” was filed in March of this year but published only last week. Apple has been working on its own headset for a couple of years now. Using a range of sensors, Apple would be able to track gestures, follow eyebrow and jaw movement, read facial expressions, and gather other data from the wearer to realistically reproduce a user’s facial expression in mixed reality. The patent also mentions that the eye-tracking cameras could be used for biometric authentication.
Apple has already developed facial tracking software for Animoji, the company’s animated AR emoji. Animoji uses an iPhone’s front-facing camera to track facial expressions and then reproduces those expressions in animation.
From the patent application: “In some embodiments, the world sensors may include one or more “video see through” cameras (e.g., RGB (visible light) video cameras) that capture high-quality video of the user’s environment that may be used to provide the user with a virtual view of their real environment.”
Apple has not yet commented publicly on this product or its launch details. In March, Apple analyst Ming-Chi Kuo estimated that the company may begin producing its headset by Q4 of this year and then publicly introduce it in 2020.