Apple is widely expected to unveil its long-rumored mixed reality headset at WWDC 2023. This comes as a surprise to few, not least because Apple has been championing augmented reality since at least WWDC 2017. That's when Apple began laying the groundwork for the technology used in the headset through developer tools on the iPhone and iPad.
Back then, Apple first introduced its ARKit augmented reality framework, which helps developers create immersive experiences on iPhones and iPads.
ARKit was such a focus for Apple in the years that followed that the company devoted sizable chunks of its live keynotes to introducing and demonstrating new AR features. Who could forget the sparse wooden tabletops that served as surfaces for building virtual LEGO sets on stage?
By highlighting these tools, Apple has communicated the importance of augmented reality technology as part of the future of its platforms.
iPhone and iPad software isn't the only thing designed for a mixed reality future. iPhone and iPad hardware have also been increasingly equipped to serve as handheld windows into an augmented reality world.
Starting with Face ID and the Animoji (and later Memoji) feature, Apple optimized the iPhone for AR capabilities. Internally, Apple has tuned the iPhone's Neural Engine to handle augmented reality workloads with ease.
Pro-model iPhones even gained a dedicated LiDAR sensor in the main camera system, the same class of technology that helps lunar rovers navigate the moon's surface and self-driving cars read their surroundings.
There was even a hardware update for the iPad Pro that focused almost entirely on the addition of a LiDAR scanner on the rear camera.
Why? Sure, it helped with focus and depth-sensing for Portrait mode photos, but there were also dedicated iPad apps that let you decorate your room with virtual furniture or try on glasses without actually owning the frames.
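For developers, tapping into that scanner takes very little ceremony. Here's a minimal sketch, assuming a standard ARKit setup, of how an app might check for and enable LiDAR-powered scene reconstruction, the capability that lets those furniture apps understand your floor and walls:

```swift
import ARKit

// A minimal sketch: enable LiDAR-backed scene reconstruction when the
// device supports it. On non-LiDAR devices the check simply fails and
// the app falls back to plane detection alone.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // LiDAR-equipped iPhones and iPad Pros only.
    configuration.sceneReconstruction = .mesh
}
```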
From the beginning, it was clear that ARKit was not intended exclusively for immersive experiences on the iPhone and iPad. The phone's screen is too small to be truly immersive, and the tablet is too heavy to hold up for extended use.
There is definitely a use for AR on iPhones and iPads. Catching pocket monsters in the real world is more adventurous in Pokémon GO than in an all-digital environment. Also, dissecting a virtual creature in a classroom can be more inviting than touching the actual guts.
Still, the most immersive experiences that really make your brain believe you’re actually surrounded by the digital content you’re viewing require goggles.
Does that mean everyone is so interested in AR and VR that the headset will be a hit? The reaction to AR on the iPhone and iPad has at times been that Apple is offering a solution in search of a problem.
Still, there are some augmented reality experiences that are clearly appealing.
Want to see all the dimensions of the announced but unreleased iPhone or MacBook? AR is probably how many people first experienced the Mac Pro and Pro Display XDR.
Projecting a 1:1 scale virtual space rocket in your living room also gives you a visceral sense of just how large those machines are. Experiencing a virtual rocket launch, looking back at Earth like a passenger, can be exhilarating too.
Augmented reality was also the best way to introduce my kids to dinosaurs without risking time travel and bringing the T-Rex back to the present.
As for ARKit, there are a number of ways Apple has been openly building the tools that will power the headset experience we expect to see starting next month.
First off, the framework gave developers the tools, APIs, and libraries they need to build AR apps in the first place. Motion tracking, scene understanding, light estimation, and camera integration are table stakes for any AR app.
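To give a sense of how little code those building blocks require, here's a minimal sketch of standing up an ARKit session with them enabled; `sceneView` is assumed to be an ARSCNView already in your view hierarchy:

```swift
import ARKit

// Assumes `sceneView` is an ARSCNView already added to the view hierarchy.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal     // scene understanding
configuration.isLightEstimationEnabled = true  // match virtual lighting to the room
sceneView.session.run(configuration)

// Light estimates arrive with every frame; renderers use them to shade
// virtual content so it blends in with the real surroundings.
if let estimate = sceneView.session.currentFrame?.lightEstimate {
    print("Ambient intensity: \(estimate.ambientIntensity)")
}
```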
Another important piece is world tracking. ARKit introduced the tools needed to accurately track the position of virtual objects in a real-world environment, fusing data from hardware sensors such as the camera, gyroscope, and accelerometer.
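That sensor fusion surfaces to developers as a camera transform on every frame. Here's a rough sketch of using it to pin a virtual anchor half a meter in front of wherever the device is pointing:

```swift
import ARKit
import simd

// A sketch of the world-tracking idea: ARKit fuses camera, gyroscope,
// and accelerometer data into a camera transform you can read from any
// frame, then pins virtual content to the world with anchors.
func placeAnchor(inFrontOf session: ARSession) {
    guard let frame = session.currentFrame else { return }

    // Half a meter in front of the device, along the camera's -Z axis.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.5
    let transform = simd_mul(frame.camera.transform, translation)

    // The anchor's position stays fixed in the room as the device moves.
    session.add(anchor: ARAnchor(transform: transform))
}
```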
Then there’s face tracking. With ARKit, developers can integrate the same face tracking capabilities that Apple uses to provide facial expression mirroring for Animoji and Memoji.
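Those capabilities are exposed through ARFaceAnchor and its blend shape coefficients. A rough sketch of reading them in an ARSession delegate; face tracking requires a TrueDepth camera, and the `FaceTracker` class here is purely illustrative:

```swift
import ARKit

// Illustrative wrapper: runs face tracking and logs two expression values.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking needs the TrueDepth camera; bail out otherwise.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes are 0...1 coefficients describing the expression.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), smileLeft: \(smile)")
        }
    }
}
```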
AR Quick Look is another technology already mentioned. AR experiences use it to place virtual objects, like products, in the real environment around you. Render those objects at accurate scale and keep their position fixed relative to your device, and the illusion that they're really in the room holds up.
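Developers get much of this for free: hand Quick Look a USDZ file and the system handles placement, scaling, and anchoring. A sketch of presenting one, with `chair.usdz` standing in as a placeholder for any model bundled with the app:

```swift
import UIKit
import QuickLook
import ARKit

// A sketch of presenting a USDZ model with AR Quick Look.
// "chair.usdz" is a placeholder for any USDZ asset in the app bundle.
class ProductPreview: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        // ARQuickLookPreviewItem exposes options like content scaling.
        let item = ARQuickLookPreviewItem(fileAt: url)
        item.allowsContentScaling = true
        return item
    }
}
```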
Recent versions of ARKit have focused on shared AR experiences that can persist between sessions, recognizing objects around you, and occluding virtual content behind people in the scene. Performance has also been steadily tuned over the years, so the core technology that powers virtual and augmented reality experiences in the headset should be pretty solid.
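Both of those additions come with small, focused APIs. A sketch of each, assuming a running ARSession: capturing an ARWorldMap so an experience can be persisted or shared, and opting into people occlusion where the hardware supports it:

```swift
import ARKit

// Persistence: capture the session's world map so an experience can be
// restored later (or sent to another device for a shared session).
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// People occlusion: ask ARKit to segment people out of the camera feed
// so virtual content can appear behind them (on supported devices).
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}
```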
We're expecting our first official look at Apple's headset on Monday, June 5, when Apple kicks off its next keynote event. 9to5Mac will be present at the special event, so stay tuned for up-close, comprehensive coverage. Best of luck to the HTC Vives and Meta headsets of the world.