What Apple calls a spatial computer, some technologists call “mixed reality” — or possibly “augmented reality,” “holographic computing,” “the metaverse” or “XR,” which some people say is shorthand for “extended reality.” Others say the letters don’t stand for anything.
Technologists read different meanings into these terms. And that's puzzling for nearly everyone.
“Even I can’t grasp what things mean all the time,” said Alex Coulombe, co-founder of Agile Lens, which identifies itself as an XR company. Coulombe began to tell me that spatial computing and XR are identical — but changed his mind midsentence.
A term is illuminating when it captures a product or a feeling, like “podcast” or “languishing.” When a technology can’t define itself clearly, that’s a barrier to believing it’s right for you.
Now let’s examine the nonsense words and how the gibberish jousting reveals that none of these computers for your face are what you or their creators truly want.
Oh, and if you’ve assumed the Vision Pro is a virtual reality headset — you’re essentially correct.
What is spatial computing, anyway?
I’m going to define it as an immersive video feed of the physical world plus the internet.
When you strap on the Vision Pro, you can watch a movie on the screen on your face and still see your living room around you. You can open a recipe app through Apple’s headset and position virtual cooking timers above your pots as you follow the instructions.
But you’re not seeing the real world. You’re seeing a nearly live streaming video of your living room or kitchen with apps superimposed on top. Meta’s $500 Quest 3 headset works this way, too.
Some technologists use terms such as “mixed reality” to describe a combination of virtual elements and a digital feed of your physical environment. Or “pass-through.” I’m sorry.
Some experts instead use spatial computing as a catchall term for a range of technologies, including 3D images, virtual reality and smartphone games such as “Pokémon Go.” Other people use XR as a catchall term.
An Apple representative didn’t respond when I asked how the company defines spatial computing.
Even the experts don’t agree! It’s probably best if no one uses any of these words. (I hereby vow to avoid them.)
“The industry loves to argue about these terms,” said Anshel Sag, principal analyst with Moor Insights & Strategy. “Most of the terminologies we use today are irrelevant to the layman.”
Actually, the Vision Pro is mostly VR
After days of conversations that left me dizzy, most experts agreed on a verbal shortcut.
Most of the digital-plus-physical experiences that companies might call spatial computing, the metaverse, mixed reality, blah blah blah, are on a continuum between virtual reality and augmented reality.
You probably know what virtual reality is. You’re immersed in a simulated digital world, typically through computer goggles. You don’t see the real world.
The flip side is augmented reality, or AR. You see the world with your own eyes, and digital images are mixed in.
If you’ve peered through the Pokémon Go app on your phone and seen a real park bench with a virtual monster hopping on it, that’s augmented reality. So are Snap’s experimental Spectacles glasses, through which you might look at a restaurant menu and see it morph from Japanese to English.
By this standard, the Vision Pro and Quest 3 are mostly VR with a dash of augmented reality. The Fortnite game is mostly virtual reality, though you don’t play it with VR goggles.
(Fortnite calls itself a metaverse, which it defines as social and immersive virtual interactions.)
Why these semantics matter
Matthew Ball, an entrepreneur who writes extensively about [whatever we call this stuff], instead suggested that we normal humans call these technologies immersive 3D.
Everything we experience on our phones or computers is a flat simulation. What could be amazing, Ball said, is to feel the lines blur between your reality and more immersive, helpful digital experiences.
Imagine wearing a lightweight, inexpensive pair of glasses and seeing digital walking directions in your field of vision that show you where to turn left. Or imagine sharing a video of your kid’s birthday party that makes others feel like they were there.
Those types of immersive 3D experiences through unobtrusive computers are what Apple wants to build. They’re also what Mark Zuckerberg envisions, and the idea behind Google Glass a decade ago.
The technology simply isn’t ready. Maybe we’re not ready, either.
What you’re getting in the meantime are pricey compromise products described with nonsense words — and a promise that an awesome future is coming.