Meta supremo Mark Zuckerberg unveiled Orion, a new augmented reality (AR) smart glasses prototype, at the company's annual Connect developer conference. Ten years in the making, and still not expected on high streets until 2027, the glasses promise a new way to meld the real and digital worlds. They will be controlled by the eyes, and by the fingers via a neural interface worn on the wrist.
So what does this mean for the future of AR wearables and how we interface with computers? We asked three tech specialists at Bangor University: Peter Butcher, Llŷr ap Cenydd and Panagiotis Ritsos.
Why has Orion been such a technical challenge?
There are serious technical challenges in packing so much sophisticated technology into something so compact. This includes the holographic display system, hand and eye tracking, off-device processing, cameras, speakers and microphones – all while ensuring the device remains aesthetically appealing and has decent battery life.
Meta’s chief tech officer, Andrew Bosworth, recently captured the scale of the challenge by saying: “In consumer electronics, it might be the most advanced thing that we’ve ever produced as a species.”
The optical design is a huge challenge. Mixed reality headsets such as the Meta Quest 3 and Apple Vision Pro rely on “passthrough” technology, in which external cameras capture real-time video of the user’s surroundings. This is displayed inside the headset, with digital elements overlaid.
In contrast, Orion’s holographic projection allows users to directly see through transparent lenses, with graphics projected into their view. This has demanded substantial R&D.
Are there other notable innovations?
One key factor that determines the immersiveness of mixed reality headsets is their field of view, meaning the angular range that the viewer can see through the headset. The state of the art is the 70° field of view of the Magic Leap 2, bigger holographic AR glasses aimed at businesses. They are made by Magic Leap, a US company whose backers include Google and AT&T.
With Orion, Meta has achieved the same 70° field of view in a much smaller product – a major innovation, and one that is crucial to Zuckerberg’s vision of an unobtrusive wearable device.
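To see why field of view matters so much, a simple pinhole-geometry sketch helps: the wider the angular range, the larger the slice of the world that holograms can occupy. This is an illustrative calculation only, not Meta's optics; the 30° comparison figure is an assumption, roughly in line with earlier AR headsets.

```python
import math

def visible_width(fov_deg: float, distance_m: float) -> float:
    """Width of the real-world slice covered by a symmetric
    horizontal field of view at a given distance (pinhole geometry)."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# At 1 m from the viewer, a 70-degree field of view spans about 1.4 m
# of the scene, while an assumed 30-degree view spans only about 0.54 m.
print(round(visible_width(70, 1.0), 2))  # 1.4
print(round(visible_width(30, 1.0), 2))  # 0.54
```

In other words, doubling the field of view more than doubles the area available for overlaying graphics, which is why a 70° view in a glasses-sized device is such a milestone.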
The neural interface wristband is also vital. It listens to nerve impulses from the brain to the hand, allowing users to control the device using subtle finger gestures such as pinching and swiping thumb against index finger. Newer mixed reality headsets such as Apple Vision Pro are controlled similarly, but rely on external cameras to interpret hand movements.
An advantage of tapping into nerve impulses directly is that gestures do not require line of sight, and eventually might not even require the person to perform the full gesture – only to think about it. The technology also opens up brand new input methods, such as texting via mimicking handwriting, and is likely to mature before consumer-grade holographic displays become available.
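The signal-processing idea behind such a wristband can be sketched in a few lines: surface electromyography (EMG) picks up electrical activity in the forearm muscles, and a gesture shows up as a burst of activity above the resting baseline. The toy example below detects such a burst in a synthetic signal by rectifying and smoothing it; all function names, thresholds and the synthetic data are invented for illustration and bear no relation to Meta's actual pipeline.

```python
import math
import random

def emg_envelope(signal, window=50):
    """Rectify the raw samples and smooth with a moving average,
    giving a rough measure of muscle activity over time."""
    rect = [abs(s) for s in signal]
    env = []
    for i in range(len(rect)):
        lo = max(0, i - window + 1)
        env.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return env

def detect_pinch(signal, threshold=0.5):
    """Flag a gesture when smoothed activity exceeds the threshold."""
    return max(emg_envelope(signal)) > threshold

# Synthetic data: resting noise vs. noise followed by a burst of activity.
random.seed(0)
rest = [random.gauss(0, 0.05) for _ in range(500)]
pinch = rest[:250] + [math.sin(i / 10) + random.gauss(0, 0.05)
                      for i in range(250)]

print(detect_pinch(rest))   # False
print(detect_pinch(pinch))  # True
```

A real system would of course classify many gestures with machine learning rather than a single threshold, but the principle – decoding intent from muscle signals rather than from a camera's view of the hand – is the same.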
Has Orion been more trouble than Meta expected?
Meta initially gave the Orion prototype only a slim chance of success, so it has exceeded expectations. While there is still much work to be done, particularly in reducing costs and miniaturising components, Orion could eventually lead to a consumer-ready device.
Do you think Meta will get an affordable version launched by 2027?
Meta thinks the initial price will be comparable to flagship phones or laptops. We might see development kits released towards the end of the decade, aimed at early adopters and developers, much like how VR headsets were introduced a decade ago.
In the meantime, other AR glasses and mixed reality headsets such as Meta Quest 3 and Apple Vision Pro serve as platforms for developing applications that could eventually run on AR glasses.
Why are the Orion glasses still so expensive?
Holographic AR glasses remain expensive because much of the hardware – including the custom optical components used to optimise light transmission – isn’t yet produced at scale. These components are critical for achieving high-resolution holographic displays, and production constraints are reportedly pushing up Orion unit prices. Even then, battery life is currently limited to around two hours.
Could anyone potentially beat Meta to market?
Thanks to its multibillion-dollar investment in R&D through the Reality Labs subsidiary, Meta has become a leader in virtual and mixed reality headsets, with a robust app ecosystem. However, Apple, Microsoft, Samsung and Google are developing similar technologies.
Devices such as [Snapchat owner] Snap Inc’s Spectacles have made strides in AR, but responses have been mixed due to limitations such as narrow fields of view and lower graphics quality. Orion appears to be ahead in holographic display technology. Another company to particularly watch is Apple, which is refining the Vision Pro and is reported to be working on AR smart glasses.
Will AR glasses change the world?
AR glasses could ignite a transformative “iPhone moment” that redefines how we interact with technology. Zuckerberg envisions them as the next major computing platform, offering a more natural and intuitive alternative to smartphones.
The success of early mass-market smart glasses such as Meta’s Ray-Ban smart glasses, which allow users to make calls, capture videos and interact with Meta AI, hints that AR glasses could see widespread adoption.
Zuckerberg initially believed holographic technology would be necessary for smart glasses to offer functionality beyond the basic features of these Ray-Bans. But the ability to incorporate an AI-powered voice assistant has made Meta realise that smart glasses can be developed from the ground up as a new consumer product category. While the four-hour battery life requires improvement, the positive feedback from both reviewers and users, particularly those sharing footage on Instagram and TikTok, demonstrates the potential.
What does the future look like?
Reading messages, watching a virtual screen on the wall, playing games – all the things you can do with mixed-reality headsets, but shrunk down to a pair of glasses. Friends will be able to teleport into your living room: a video call where both people feel present in the same space.
It gets even stranger when you incorporate AI: virtual assistants can already see what you see, hear what you hear, talk to you, answer questions and follow commands using smart glasses. In future, AI will be able to manifest itself in your vision, and you’ll be able to have natural conversations with it.
By 2030, AI will radically change the ways in which we interact with each other, our physical world and computers. Orion aims to prepare us for a world in which the physical, artificial and virtual co-exist.
Llŷr ap Cenydd, Lecturer in Computer Science, Bangor University; Panagiotis Ritsos, Senior Lecturer in Visualisation, Bangor University, and Peter Butcher, Lecturer in Human Computer Interaction, Bangor University
This article is republished from The Conversation under a Creative Commons license. Read the original article.