Love it or hate it, the Apple Vision Pro is a game changer
June 7, 2023 by Kevin C. Tofel
I’m sure by now, you’ve seen or heard about Apple Vision Pro, the company's $3,499 spatial computing headset. If you haven't, this nine-minute video will quickly catch you up.
Reactions from those who have used it, and those who haven't, are deeply polarized. The former are uniformly blown away by the technology and experience. The latter mostly don't see the point of this product or are already predicting a flop. After letting my brain marinate on the news and watching some of the Apple WWDC 2023 developer videos, I’m leaning more towards Vision Pro being as much of a game changer as the original iPhone was.
In fact, watching the live introduction of Vision Pro recalled the same feeling I had in January 2007. I was in Las Vegas at the Consumer Electronics Show when, that same week, Steve Jobs introduced the first iPhone at Macworld. Tens of thousands of people across the more than 2 million square feet of CES basically stopped what they were doing to watch Jobs present Apple's newest creation.
It was as if all of the air at CES was sucked out of the many large rooms all at once. To be fair, the first iPhone really wasn't that impressive on paper. It had a slow 2G data connection, no App Store and a crappy 2-megapixel camera. But it changed the phone paradigm forever, thanks to the capacitive multi-touch display technology and software platform that Apple has continued to evolve.
I see the Apple Vision Pro much the same way. Yes, it's extremely expensive, but the price will come down with subsequent models. It also has a new software platform for how we interact with computing experiences. And in a way, it's ahead of that initial iPhone because Apple's massive iOS and iPadOS app library comes along for the ride on day one.
Rather than rehash the basic facts and some of the many features of the headset, let's dig a little deeper into that theme of what makes Vision Pro a game changer, even if most people won't experience it for a few years.
There's no lack of sensors in the Vision Pro. Apple says there are 12 cameras, six microphones and five additional sensors.
This includes a LiDAR sensor on the outside for object and environmental scanning. Inside are infrared sensors that constantly watch your eyes. These authenticate you via Optic ID, Apple's iris-scanning counterpart to Face ID. And, what I think is even more important, they also track your eyes to see where you’re looking.
Why is that important? Because you don't have a mouse, trackpad or touchscreen to use in Apple's new personal computing paradigm. The cursor for interaction is always where you’re looking. In fact, Apple is actually predicting where you’ll look using the sensors and its new R1 chip. (There's also an M2 chip for processing, graphics and connectivity.) That's according to Sterling Crispin, a developer who worked on the Vision Pro project at Apple:
One of the coolest results involved predicting a user was going to click on something before they actually did. That was a ton of work and something I’m proud of. Your pupil reacts before you click in part because you expect something will happen after you click. So you can create biofeedback with a user's brain by monitoring their eye behavior, and redesigning the UI in real time to create more of this anticipatory pupil response.
Think about that for a second. In this new computing paradigm, you’ll no longer need to touch a physical object to effectively use a computer or the apps on it. You’ll simply look at what you want to do. Then, using the downward and side-facing cameras, Vision Pro will detect one of just a few intuitive default gestures for interaction.
If that's not a game-changing computing paradigm, then neither was the first mouse or the first iPhone.
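For developers, the remarkable part is how little of this they have to handle themselves. Apple's WWDC sessions indicate that raw gaze data never reaches an app; the system simply delivers a tap when the user looks at a control and pinches. Here's a minimal SwiftUI sketch of what that means in practice. The LightToggle view is hypothetical; hoverEffect is SwiftUI's existing modifier for highlighting the control a user is targeting:

```swift
import SwiftUI

// A minimal sketch: on Vision Pro, a standard SwiftUI Button is
// "clicked" when the user looks at it and pinches their fingers.
// The app never sees where the eyes are; it just receives the tap.
struct LightToggle: View {
    @State private var isOn = false

    var body: some View {
        Button(isOn ? "Turn lamp off" : "Turn lamp on") {
            isOn.toggle()
        }
        // Asks the system to highlight the control under the user's
        // gaze; standard buttons get a hover effect by default.
        .hoverEffect(.highlight)
    }
}
```

In other words, an ordinary Button written for iPadOS already speaks the look-and-pinch language on Vision Pro.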
By removing the physical interaction, you don't need a battery-powered handheld controller to navigate around. I’m looking at you, Oculus! And of course, by its very definition, you’ve removed the constraints of a fixed-size display: the headset can simulate a better-than-4K screen that appears up to 100 feet wide.
Of course, a new personal computing paradigm will mean not just new experiences but new challenges as well. Aside from the gen-one pricing, the biggest challenge I see with the Apple Vision Pro is how it will socially affect people. Let's be clear: Unlike the iPhone, this is not what I’d consider a "mobile-first" device. Put another way, I wouldn't leave home with a Vision Pro in a backpack or on my face too often. Actually, never in the latter case.
Phones today mostly go everywhere with us. A headset is really meant for stationary use, such as at home, perhaps in an office, or in a location where you’ll have a lot of down time. You’d better hope that location has a nearby power outlet too, because the portable battery pack runs Vision Pro for up to two hours. There's no internal battery: Apple showed the headset corded to the pack at all times, a design choice that helps reduce weight. That further suggests this isn't a mobile-first product. Keep that in mind when reading hot takes, by the way, because I think it's an important bit of information some are overlooking.
So the Vision Pro use case is primarily when you’re sitting or standing in one place, likely for some extended period of time. If you’re wearing one at home and your family is trying to interact with you, how are they going to feel? When you walk through an airport and see someone using Vision Pro while waiting for a plane, won't it look like they’re jacked into "The Matrix?"
Apple is trying hard to say no, you’ll still be socially aware of the people around you, and they can easily interact with you. That's the reason Vision Pro has an external display, which Apple calls EyeSight, on the front. It shows a virtual representation of your eyes. It's not a live video feed of your peepers; it's a 3D projection based on what the internal sensors see your eyes doing. And it's actually kind of neat. But many people will still see that large set of goggles on your face and wish they actually saw your face. And that's going to reduce social interaction.
Along the same lines, nobody else can participate in your Vision Pro activities with you. Unless they have a headset or a supported iOS or macOS device, that is. At WWDC 2023, Apple is spending a large amount of time explaining how developers should create the best SharePlay experiences for this new computing paradigm.
SharePlay, if you’re not familiar with it, is a way for people in different locations to share a computing experience. It might be a watch party for "The Mandalorian" or it could be something more fun, like crunching numbers in a spreadsheet together. It's all built upon Apple's FaceTime technology, so non-FaceTime users need not apply. Based on the context of the app or activity, Apple is recommending three ways to build apps with shared users in mind.
Obviously, this is new for developers and it will be for users as well. Until now, devs only had to worry about a flat display in a select range of sizes when building their apps. Now they need to account for how people might "gather around" in a virtual experience with the apps.
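To make that concrete, here's a minimal sketch of what adopting SharePlay looks like with Apple's GroupActivities framework, which underpins these shared experiences. The WatchParty type and its strings are hypothetical, but the GroupActivity protocol, its metadata, and the activation flow are the framework's real building blocks:

```swift
import GroupActivities

// A minimal SharePlay activity: everyone on the FaceTime call
// who joins this session watches the same show together.
struct WatchParty: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Watch Party"   // shown in the system share UI
        meta.type = .watchTogether   // hints at synced media playback
        return meta
    }
}

// Offer the activity; the system asks the user how to share it.
func startWatchParty() async {
    let activity = WatchParty()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate()  // start the group session
    case .activationDisabled:
        break                               // no FaceTime call; play locally
    default:
        break
    }
}
```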
Honestly though, I think most Vision Pro owners will end up using the device by themselves far more than with others through SharePlay. That's a definite downside to this computing paradigm as we become more isolated in our use of apps and the internet. Oddly, the internet was supposed to bring us closer together, at least in theory.
There seems to be no middle ground here. People think this headset will either be a huge success or flop bigger than the first Apple HomePod. If you’re judging only this first-generation device, then I think you’re missing the bigger picture. I’m not suggesting Vision Pro will sell into the millions when it arrives next year. However, I do think Apple will continue to improve and refine it, bringing the price down year after year until it hits a number that fits the company's typical 35 to 45 percent profit margin.
While I’m generally excited by the product, I’m more excited about the foundational change it can bring to personal computing.
From a smart home perspective, I can envision looking at a smart lamp through Vision Pro and having menu options appear next to the bulb. Forget using voice to control that device (though Vision Pro does support voice input). No need to pull out that ol’ physical phone and open an app with a few taps just to change the bulb color. Instead, I could see white temperature options and an RGB color palette floating in front of my face. All it takes is one look and a finger gesture to get exactly the bulb color and hue I want.
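As a rough sketch of how that interaction could be wired up, here's what writing a color choice to a paired HomeKit bulb looks like in Swift today. The BulbPalette view and its swatch list are hypothetical; HMCharacteristic and its writeValue call are HomeKit's actual API, and on Vision Pro each button press would arrive via look-and-pinch rather than a touchscreen:

```swift
import SwiftUI
import HomeKit

// Hypothetical: one color swatch for the floating palette.
// HomeKit expresses hue in degrees, from 0 to 360.
struct Swatch: Identifiable {
    let id = UUID()
    let name: String
    let hue: Double
}

struct BulbPalette: View {
    // The paired bulb's hue characteristic (HMCharacteristicTypeHue).
    let hueCharacteristic: HMCharacteristic

    private let swatches = [
        Swatch(name: "Red", hue: 0),
        Swatch(name: "Green", hue: 120),
        Swatch(name: "Blue", hue: 240),
    ]

    var body: some View {
        HStack {
            ForEach(swatches) { swatch in
                Button(swatch.name) {
                    // Write the hue; HomeKit pushes it to the bulb.
                    hueCharacteristic.writeValue(swatch.hue) { error in
                        if let error {
                            print("Hue write failed: \(error)")
                        }
                    }
                }
            }
        }
    }
}
```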
Of course, I can "type by voice" in this future, just like I can (and do, very often) with Siri. Apple showed off a virtual keyboard, but without any haptic or other feedback, I’m not a fan. So I’ll just use a traditional keyboard and still get the benefit of virtual autocomplete thanks to Apple's new spatial computing platform.
Even with my enthusiasm for the next big thing in personal computing, I won't be purchasing my own Apple Vision Pro. It's simply unaffordable on my budget, as I’m sure it is for many others. Instead, I’ll wait for a second- or third-generation model with a target price of $1,999. Well, that's my target; I have no idea how Apple will price future versions. And hey, there's always a chance this "Pro" edition will be joined by a cheaper, slightly cut-down model for the rest of us! Here's hoping.
Want the latest IoT news and analysis? Get my newsletter in your inbox every Friday.