“Welcome to the age of spatial computing,” Apple announced at the launch of its latest device, a mixed reality headset called Vision Pro. CEO Tim Cook described it as “a new breed of computer that augments reality by seamlessly connecting the real world with the digital world.” The device is powered by a new operating system called visionOS, which Apple says will “bring the building blocks of spatial computing”.
If it’s “a new breed of computer,” as Apple claims, that means a greenfield for developers. So what can developers expect from visionOS and Vision Pro? I watched a WWDC session titled “Getting Started Building Apps for Spatial Computing” to find out.
“Apps launch in Shared Space by default,” began Apple’s Jim Tilander, an engineer on the RealityKit team. “Apps coexist here, much like multiple apps on a Mac desktop. Passthrough keeps people connected to their environment.” (Passthrough is the headset’s live camera view of the physical world, which lets people shift attention between the virtual and the physical.)
He then introduced three new concepts, all SwiftUI scenes: Windows, Volumes, and Spaces. SwiftUI has been around for four years and serves as Apple’s primary user interface framework for its various products. For visionOS, SwiftUI has been enhanced with “completely new 3D capabilities and support for depth, gestures, effects and immersive scene types”.
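To make that concrete, here’s a rough sketch of the Window and Volume scene types in SwiftUI – my own illustration based on the session, not Apple’s sample code, and assuming the visionOS SDK:

```swift
import SwiftUI

// A minimal sketch of two of the new scene types, assuming the visionOS SDK.
@main
struct SpatialApp: App {
    var body: some Scene {
        // A Window: a familiar 2D plane for standard SwiftUI content.
        WindowGroup(id: "main") {
            Text("Hello, spatial computing")
        }

        // A Volume: a bounded 3D region that coexists with other apps
        // in the Shared Space.
        WindowGroup(id: "globe") {
            Text("3D content goes here")
        }
        .windowStyle(.volumetric)
    }
}
```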
Each of the three scene types is fairly self-explanatory, but it’s worth noting that alongside the Shared Space concept, Apple also offers a “Full Space” for when an app wants “a more immersive experience” – only that one app’s content is displayed.
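A Full Space corresponds to the new ImmersiveSpace scene type. Again, a sketch of my own rather than Apple’s code; the scene id and view names are invented:

```swift
import SwiftUI

// A sketch of a Full Space, assuming the visionOS SDK: while an
// ImmersiveSpace is open, only this app's content is displayed.
@main
struct FullSpaceApp: App {
    var body: some Scene {
        WindowGroup {
            EntryView()
        }

        // The Space scene type: content that can take over the
        // person's entire surroundings.
        ImmersiveSpace(id: "immersive") {
            Text("Only this app is visible now")
        }
    }
}

struct EntryView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Full Space") {
            Task { _ = await openImmersiveSpace(id: "immersive") }
        }
    }
}
```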
It’s interesting to note that Apple seems to have a different definition of “presence” than Meta (née Facebook). Meta defines presence as “high fidelity digital representations of people that create a realistic sense of connectedness in the virtual world.” In other words, for Meta, “presence” means total immersion in the virtual world. But based on a chart shown in this session, for Apple “presence” means less immersion – it means letting the physical world into the field of view of your Vision Pro headset.
Pros and cons of privacy protection
Apple claims that the Vision Pro and the visionOS platform treat user privacy as a core principle, while “making it easy for you, as a developer, to leverage APIs to take advantage of the device’s many features.”
Apple’s solution to protecting user privacy is to curate data and interactions for developers. Tilander gave two interesting examples of this.
“Rather than allowing apps to access sensor data directly, the system does it for you, providing apps with events and visual cues. For example, the system knows a person’s eye position and hand gestures in 3D space and delivers these as touch events. Also, the system creates a hover effect on a view when it’s the center of attention, but doesn’t tell the app where the person is looking.”
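Here’s what that model looks like from the app’s side – my own sketch, assuming the visionOS SDK, not code from the session:

```swift
import SwiftUI

// A sketch of the curated interaction model: the system draws a hover
// highlight when the wearer looks at the button and delivers a tap when
// they pinch, but the app never receives raw gaze coordinates.
struct PrivateGazeButton: View {
    var body: some View {
        Button("Select") {
            // Arrives as an ordinary tap event; the app doesn't know
            // where the person was looking beforehand.
            print("Tapped")
        }
        .hoverEffect(.highlight)  // Rendered by the system, outside the app.
    }
}
```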
Sometimes “curated” data isn’t enough for developers. Tilander explained, “In those cases where you actually need access to more sensitive information, the system will ask people for their permission first.”
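Hand tracking is one example of such sensitive data. A sketch of that permission flow, assuming visionOS’s ARKitSession API:

```swift
import ARKit

// A sketch of requesting access to more sensitive data, assuming the
// visionOS ARKit API: the system prompts the person for permission
// before raw hand tracking data reaches the app.
func requestHandTracking() async {
    let session = ARKitSession()
    let results = await session.requestAuthorization(for: [.handTracking])

    if results[.handTracking] == .allowed {
        // A HandTrackingProvider can now deliver hand anchors to the app.
        print("Hand tracking authorized")
    } else {
        // Fall back to the system-curated gesture events.
        print("Hand tracking not authorized")
    }
}
```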
Given Vision Pro’s potential impact on privacy – including the wearer’s, since the device scans eyes for login and tracks where they look – the limitations Apple has imposed on developers sound reasonable.
However, Google developer Brandon Jones pointed out on Twitter that if you want to build AR apps, you have to hand full rendering control to Apple. He generally thinks that’s a good thing – you don’t want ads, for example, to be able to infer how long a user has spent looking at them – but he is less enthusiastic that, to achieve this, Apple quietly invents and circumvents web standards.
In short: Apple’s privacy restrictions on Vision Pro are implemented at the operating system level, giving Apple a high degree of control. Jones acknowledged that most developers will be happy with this, but correctly noted that “Apple (which is already known for limiting what iOS can do) is further limiting the ability to deviate from the chosen patterns.”
The tools
“It all starts with Xcode,” Tilander said of how developers will build apps for visionOS. Xcode is Apple’s integrated development environment (IDE) and comes with a simulator for Vision Pro and an advanced performance analysis tool, Instruments (including a new template, RealityKit Trace).
The frameworks for creating 3D content are ARKit and RealityKit, which handle tracking, rendering, physics, animation, spatial audio, and more.
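As a small illustration (mine, not from the session) of how RealityKit content gets hosted in SwiftUI, assuming the visionOS SDK’s RealityView:

```swift
import SwiftUI
import RealityKit

// A sketch of RealityKit content hosted in a SwiftUI view, assuming the
// visionOS SDK's RealityView. The sphere and its size are arbitrary.
struct SphereView: View {
    var body: some View {
        RealityView { content in
            // A 10 cm sphere with a simple material; RealityKit handles
            // rendering, lighting, and spatial placement.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.05),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            content.add(sphere)
        }
    }
}
```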
For visionOS, Apple is introducing a new editor called Reality Composer Pro, which “enables the preview and preparation of 3D content for your apps”. One Reddit user described it as “like PowerPoint in AR,” so the focus is on ease of use.
No doubt recognizing that it needs to attract more than just existing Apple developers to Vision Pro, Apple has also partnered with Unity, an established 3D platform. In the WWDC 23 keynote, one of the presenters noted that “popular Unity-based games and apps can get full access to visionOS features like passthrough, high-resolution rendering, and native gestures.” Tilander confirmed in his session that no Unity plugins are required and that developers can simply “take your existing content.”
How to get started
To create a new app, you select the default app template for “xrOS” (apparently Xcode’s name for the visionOS platform) in Xcode. From there, you select a “Scene Type,” with the default being “Window.” By default the app launches into the Shared Space, but you can change that.
“And when you’re done with the wizard,” Tilander continued, “you’re presented with a first working app in SwiftUI, displaying familiar buttons mixed with a 3D object rendered with RealityKit.”
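Something in the spirit of that starter app – my own sketch with hypothetical names, not the literal Xcode template – might look like this:

```swift
import SwiftUI
import RealityKit

// A sketch in the spirit of the generated app Tilander describes:
// familiar SwiftUI buttons mixed with a RealityKit-rendered 3D object.
struct StarterView: View {
    @State private var enlarged = false

    var body: some View {
        VStack {
            RealityView { content in
                // Create a 10 cm box once, when the view is set up.
                let box = ModelEntity(mesh: .generateBox(size: 0.1))
                box.name = "box"
                content.add(box)
            } update: { content in
                // React to SwiftUI state: scale the box when toggled.
                if let box = content.entities.first(where: { $0.name == "box" }) {
                    box.setScale(.init(repeating: enlarged ? 1.5 : 1.0),
                                 relativeTo: nil)
                }
            }

            Button(enlarged ? "Shrink" : "Enlarge") {
                enlarged.toggle()
            }
        }
    }
}
```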
You can also easily convert iPhone or iPad apps to visionOS apps, Tilander noted.
Developers can expect more resources, including a developer kit, in July. A first visionOS SDK will be available in Xcode later this month.
Apple wants developers to get into 3D
As is usual when Apple announces a new device, a lot of thought has gone into the developer tools and techniques for building on it. There’s nothing in visionOS that seems out of reach for existing iOS developers, so it should be a fairly seamless transition for Apple’s developer community.
The downside, of course, is that Apple lures developers into yet another closed developer ecosystem. visionOS will have its own app store, we were told at WWDC 23, but you can guarantee it won’t be more open than the iOS app store.
Finally, for developers, it’s important to note that the user interface doesn’t differ much from the iPhone, at least on the first-gen Vision Pro. “On the internet, it’s still just rectangles,” as one Twitter user put it. As others have pointed out, this is likely because Apple wants to make it easier for its existing developers to start building for visionOS. From a user perspective, early reports suggest that Vision Pro could indeed be magical. But from a developer’s perspective, Vision Pro isn’t that revolutionary yet.