Our $0.02 on Apple Vision Pro.

We have now watched the Apple Vision keynote a few times. We have not read much of the other coverage on the event, but from what we gleaned from the headlines, we are as impressed as everyone else by the device and the bigger vision (no pun intended) behind it. Apple’s stated ambition is to introduce a whole new compute platform – joining mobile, personal computers and tablets. And from what we can tell that seems possible, but by no means guaranteed.

By far, the biggest question we have about the device is how the human interface will work. From the keynote, two things about this were clear. First, the device tracks users’ eyeballs in lieu of a mouse to select an object. Second, users can tap two fingers together to initiate an action. Beyond that there is still a lot of mystery. For instance, how do users scroll through a document? Will there be a virtual keyboard? How do you zoom in and out? How do users multi-task when the device is tracking their eyeballs? These are major questions, ones the VR industry has long struggled with. Apple is asking users to learn a whole new ‘language’ for interacting with their compute device. This is incredibly challenging, but Apple is better than every other company on the planet (by far) when it comes to designing these interfaces.

The second thing that was apparent was the vast amount of thought Apple put into the design of the Vision Pro. Beyond the definitional Apple industrial design and intense focus on materials, they seem to have thought through so many usage modes. Glasses? Special inserts. Use of FaceTime? They created highly realistic digital avatars (a stunning technical feat). Audio? Audio ray tracing. The isolation of goggles? See-through screen. Device getting hot? A magical airflow system. The list goes on, and on.

Of course, we were keeping a careful eye out for silicon. The Vision Pro is powered by an M2, a laptop-class CPU, which is a lot of computing power for a device this size. Apple also introduced a new chip – the R Series – which they labeled a signal processor, but which we see as a Sensor Fusion Hub, processing data from 12 cameras, 6 microphones and 5 other sensors. The critical role of this chip is keeping all of that sensor data updated constantly, as visual and audio cues have a big impact on users’ sense of place and balance. In the demo videos, the image seems to move very smoothly as users move their heads. We imagine Apple did this well, but this will be the real test of the device once users have it on their heads. It is important to note that Apple felt the need to build its own chip for this. Our thesis is that it only makes sense for companies to build custom silicon when the chip conveys some form of strategic advantage, and it is easy to see how that applies here.

Another major question will be content. Apple knows this, and if their history has taught them any long-term lessons, it is how to build content for a new device. They are announcing the device at least six months ahead of shipment, a big gap for Apple. But they are launching it at a developer event, and Apple already has a robust-looking set of developer tools ready to go, so this seems like a sensible approach. This looks like a comprehensively thought-out developer plan, and a marked contrast from the Apple Watch and Apple TV, two other platforms which have struggled to gain app developer traction.

So our first impression is generally positive, but beyond that, we are left with a lot of questions. The keynote video had a lot of moments that just seemed odd. For instance, they seemed to imply that every device sale would require a session at an Apple Store to customize the fit of the device. (And were we the only ones to notice a lot of elements of the video that seemed to call directly on episodes of The Twilight Zone?) And of course there is the question of battery life. With the attached battery pack (retail price unspoken), the Vision Pro has a two-hour life, which implies that the device has a very short life without the battery. Our sense is that most usage will happen seated at a desk with the device plugged into persistent power; go back and look at the usage of the device in the videos, and almost everyone is sitting down. At this point, we risk nit-picking – this is a first-generation device on a brand-new platform, and if it catches on, we will see improvements on all these issues.

That being said, the Street did not seem to agree: Apple’s stock actually fell during the keynote. We cannot fathom why. There were enough leaks ahead of time that we all had a pretty good idea of the device’s drawbacks (e.g. the price, the battery pack), while the keynote actually showed off some incredible achievements. So to put all this in perspective, it may be worth thinking through the numbers a bit. The device will be priced ‘starting’ at $3,499. Obviously, this is not for everyone, but the word Pro is right there in the name, so we will probably see a cheaper model before too long. At that price, if they sell 285,000 units next year, it will be a $1 billion business. There are over 1 billion iOS devices on the market, and the company sold ~220 million iPhones last year, so a ~0.1% attach rate does not seem like a big reach. And $1 billion is nothing to dismiss – at that revenue it would be larger than all but a few hundred public companies in the US. Of course, by Apple’s scale this is still small, less than 1% of revenue. We think it is important to keep this figure in mind, because we can already see the headlines in six months declaring the Vision Pro a failure when it ‘only’ ships a few hundred thousand units.
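The back-of-the-envelope math above can be checked in a few lines. All inputs are this post’s own rough estimates (and our recollection of Apple’s roughly $394B FY2022 revenue), not company guidance:

```python
# Sanity-check the figures above. Inputs are rough estimates from the post.
price = 3_499            # USD, announced 'starting' price
units = 285_000          # hypothetical first-year unit sales
iphones_sold = 220e6     # ~220 million iPhones sold last year (estimate)
apple_revenue = 394e9    # Apple FY2022 revenue, roughly $394B (assumption)

revenue = price * units              # ~ $1 billion
attach_rate = units / iphones_sold   # ~ 0.1% of annual iPhone volume
share = revenue / apple_revenue      # well under 1% of Apple's revenue

print(f"revenue: ${revenue / 1e9:.2f}B")        # revenue: $1.00B
print(f"attach rate: {attach_rate:.2%}")        # attach rate: 0.13%
print(f"share of Apple revenue: {share:.2%}")   # share of Apple revenue: 0.25%
```

So the ~0.1% attach rate in the text is slightly conservative rounding of ~0.13%, and the “less than 1% of revenue” claim holds with plenty of headroom.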

In short, Apple has created what looks like an incredible product. If it performs as well in real life as it does in our keynote-inspired imagination, it will be the start of another major compute platform and change the way we interact with data and content once again.

Photo Credit: Apple

2 responses to “Our $0.02 on Apple Vision Pro.”

  1. I got heads on (report up on Techsponential). To quickly answer your key questions: the user interface is simple, flexible, and fluid; Apple nailed it. You can use it sitting, standing, or walking around, but you’ll need to be tethered to power (either to the wall or the small battery pack which fits easily in your pocket). Content is an open question, but that’s why it was introduced at a developer conference well ahead of release. Reach out; happy to collab with your silicon coverage.

    • Thanks Avi. I believe that Apple nailed the Human Interface and a lot of other technology. But now the question is really how consumers will react to this new platform.
