5 Things Apple Got Right with Vision Pro
Apple’s new AR headset is getting mixed reviews. Let the haters hate. Here are 5 reasons the Apple Vision Pro will make Spatial Computing the norm by 2040
Happy Thursday, friends. It’s been an eventful week: WWDC, where Apple entered the Augmented Reality space; the SEC suing both Binance and Coinbase; and exciting innovations in AI, all against the backdrop of a jobs market that seems confused at best. So, how are you holding up?
5 Things Apple Got Right with Vision Pro
Haters gonna hate, but Apple’s Vision Pro Augmented Reality Spatial Computing device got several things right that will put a surprising number of these devices on the faces of tech workers everywhere over the next few years.
So much so, in fact, that I estimate that by 2040*, 25-30% of iPhone users will own a Vision Pro, with Vision Pro taking over the second-device dominance the iPad currently enjoys. That means hundreds of millions of Vision Pro devices sold every year by 2040.
And here are 5 things Apple got right that are reasons Vision Pro will silently dominate the AR/VR market:
1. Everything just works with the entire Apple ecosystem out of the box. Not me, remembering the hour I spent with another Oculus user trying to simply join a Meta Workrooms conference room together, but the fact that your email, messaging, web, bookmarks, apps and more will just work with Vision Pro on day one is a killer app by itself.
2. You can see your surroundings as much (or as little) as you like while wearing the device. Importantly, if you are on a plane or in a room less (ahem) sleek than the broad living spaces in the Apple videos, you can dial back the passthrough and shut your surroundings out entirely for full immersion.
3. People can see your eyes and expression while you’re wearing the device, and, importantly, they can see when you’re unable to see them, thanks to the “curtain” effect that closes your eyes off from them when you’re fully immersed.
4. Your AR self can appear in video calls (not a stupid cartoon avatar), finally honoring rule number one of person-to-person communication: always make eye contact. There’s nothing dumber in a business call than looking at another person’s cartoon avatar. And the AI-enabled tech here is only going to get better and more realistic over time.
5. They’re focused on getting the experience right now, and will fix the hardware later. This is brilliant because it enables Apple to do what Apple does best: prove the concept, create delightful experiences, and then make the hardware increasingly gorgeous over time to drive additional adoption.
My $0.02: I personally own two Oculus devices, and while I like them (for the price, they are pretty killer tech), if I were Zuck, I would be shaking in my avatarified boots after seeing Apple sweep into the space he has been struggling to drive adoption in.
Why I Don’t Mind the Hardware
The only thing I personally have to gripe about is the form factor. It’s currently clunky, and people will complain about the limited battery life out of the box, but how in the world are you supposed to release a product every year that is “our best one yet” if you have nothing to improve on?
And Apple is inventing a full category of computing here.
If Apple can do anything at all, it is prove the use case and then refine, refine, refine the hardware again and again, building custom parts and pieces and dominating the supply chain to drive costs down. I believe that by 2028, the product will be so stylish it will start to become the default second-screen device people opt for after the iPhone. More and more, people will get an iPhone (and wearables) and then a Vision Pro, the way people currently get an iPhone and an iPad for times when they want to lean back and watch something a little bigger without putting it on the TV.
Given Apple’s proven ability to pivot the software and make hardware so compelling you feel an innate need to own it, by 2030 Vision Pro will replace many people’s need for a laptop computer at all, letting pure creators (programmers, writers, and whatever the AI-enabled knowledge workers of the future are called) do their work entirely in the headset.
The next wave of this?
When Siri is able to take on ChatGPT-like abilities and (finally) change the input of computing from something we unnaturally do with our hands to something we do with a combination of our voice, our eyes, and natural human hand gestures like pointing and swiping, the same way someone would draw on a chalkboard or point at text in a book.
That next leap in AI-human interaction will make Vision Pro so effortless, I won’t be surprised if entire new worlds of computing are created through Siri+AI enablement alone.
Imagine all the value-adds we get from Auto-GPT today, in task lists that complete themselves, multiply that by an AI that knows you as intimately as Siri will, and you truly can live inside your computer from that point forward.
Don’t Forget Services
Keep in mind that Apple’s second-largest revenue-generating division last year was Services, raking in 19% of its revenue in 2022.
From that standpoint, does it matter what device you choose to consume Season 8 of Ted Lasso on when you’re still tied into the Apple ecosystem one way or another, be it iPhone, iPad, Vision Pro, Apple TV or even that Vizio TV in the spare bedroom running the Apple TV app on it?
No. The answer is, no, it does not.
Thanks to Services, almost regardless of whether Vision Pro achieves mass adoption, the fact that it offers even a niche means of accessing Apple’s vast services ecosystem means Apple still wins.
It’s The Experience, Stupid.
Since Apple has always been in the business of creating seamless computing experiences, not just devices, its most important job over the next three years is to find a killer app that encourages even 2% of iPhone users to buy one of these and use it publicly enough that others see it in use (and get FOMO in the process).
So, what do you think?
The haters will complain about this and that, but my chips are on the reality that Apple just made a device that was already seamlessly tied into our computing ecosystem and will now be so seamlessly tied into our vision that we won’t even see the change coming before it’s suddenly here.
*2023 to 2040 is roughly the same number of years as we are now from the launch of the iPhone in 2007