Apple Vision Pro changes everything: Why the WVS team is excited to support visionOS developers.

Neville Spiteri

July 6, 2023

“Search and Replace” on reality in real-time!

Having had access to Valve’s early “-1 prototype” VR devkit in 2015, I was pretty obsessed with the concept of “presence” and the sub-20ms motion-to-photon latency required to achieve it (as first described by Michael Abrash at the time). Now, with the M2/R1 architecture, the visual pipeline is under 12ms and sensor-to-display latency is reportedly 4ms. This incredible technological breakthrough, which enables “seamless passthrough,” is the foundation for AR: the layering and synchronization of real and virtual inputs. Hence this is the first Apple device that “you look through versus look at” (as Tim Cook described it). And it’s achieved because the Vision Pro is fundamentally a VR device: the entire visual input comes through the device, sensor to display. We’ll now be able to do search and replace on reality in real time. You want the sky to be purple, your sofa to be black, your skin to look more tanned… no problem!

Spatial computing, not “AR or VR”.

Since it is both an AR and a VR device, it makes perfect sense (and it’s very refreshing) that Apple has opted for the term “spatial” to describe this vision-based computing paradigm, which supports the full continuum from real to virtual. visionOS offers the Shared Space and Full Spaces, with bounded and unbounded volumes, spanning everything from full passthrough to full immersion. This gives developers and users an easy on-ramp from today’s paradigm of 2D interfaces and “windows,” through “volumes,” to full immersion: a much more accessible transition from the current computing paradigm to the next. In retrospect this seems like such an obvious way to go, but I guess that’s the genius of Apple’s approach.
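To make that continuum concrete, here’s a minimal sketch of how those scene types appear in the visionOS SDK: a 2D window and a bounded volume living in the Shared Space, plus a Full Space that can range from passthrough to full immersion. Type names like SpatialApp are illustrative placeholders, not taken from any actual template.

```swift
import SwiftUI
import RealityKit

// Sketch only: illustrative names, not a real starter template.
@main
struct SpatialApp: App {
    var body: some Scene {
        // A familiar 2D window, shown in the Shared Space alongside other apps.
        WindowGroup(id: "window") {
            Text("Hello, spatial computing")
        }

        // A bounded 3D volume that still cohabits the Shared Space.
        WindowGroup(id: "volume") {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.1)))
            }
        }
        .windowStyle(.volumetric)

        // A Full Space: the app takes over the view, from passthrough
        // (.mixed) through .progressive to fully immersive (.full).
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateBox(size: 0.3)))
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed, .progressive, .full)
    }
}
```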

Another iPhone/BlackBerry moment?

The Vision Pro is positioned as a versatile device catering to a wide range of human activities. Its primary focus is not gaming, which sets it apart from the Oculus, Valve/Vive, and Sony headsets that were originally designed around gaming on PCs or consoles. Despite Meta’s emphasis on moving away from the PC with the mobile, all-in-one form factor of the Quest, it unsurprisingly gained popularity through gaming, with apps like Beat Saber. Apple, on the other hand, takes a different approach, emphasizing that the first device the Vision Pro replaces is your TV. In a bold move reminiscent of the iPhone’s “no buttons” position, the Vision Pro ditches controllers, offering a novel eye- and finger-based input method instead. It targets work, rest, and play, seemingly granting each equal importance. Whether or not the Apple Vision Pro becomes another iPhone/BlackBerry moment, I believe its all-purpose design makes it the first of a series of Apple products that will take spatial computing mainstream.
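For a sense of what that controller-free input looks like in practice, here’s a hedged sketch using the public SwiftUI/RealityKit gesture APIs: your eyes pick the target, a finger pinch confirms it. Everything here (the view name, the rotation response) is illustrative, not from Apple’s or Wevr’s code.

```swift
import SwiftUI
import RealityKit

// Sketch of gaze-and-pinch input on visionOS; names are illustrative.
struct PinchDemoView: View {
    var body: some View {
        RealityView { content in
            let box = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .purple, isMetallic: false)]
            )
            // Entities must opt in to input and carry collision shapes
            // before gaze-and-pinch can target them.
            box.components.set(InputTargetComponent())
            box.generateCollisionShapes(recursive: false)
            content.add(box)
        }
        // Eye tracking supplies the target; the pinch fires the tap.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Nudge the tapped entity a bit around the y-axis.
                    value.entity.transform.rotation *= simd_quatf(
                        angle: .pi / 8, axis: [0, 1, 0]
                    )
                }
        )
    }
}
```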

The Apple Vision Pro is one hell of an MVP! I couldn’t be more excited about it both as a consumer and as a developer.

Developing for Apple Vision Pro with Wevr Virtual Studio

At Wevr we’re focused on making it easier for creators and developers to host, build, and publish Vision Pro apps with Wevr Virtual Studio (wvs.io). We’re currently working on starter templates for both Unity and native Swift/RealityKit apps, making wvs.io the best “place” to take your Vision Pro app from first commit to published app in the App Store.

Stay tuned for more.