June 5, 2023 will be remembered as the day Apple started something big. It’s not what people expected. It’s more expensive, more ambitious and has a much longer runway.
The Apple Vision Pro looks like magic ski goggles, but it’s actually a computing platform that could eventually take over much of what we do today with smartphones, tablets, and computers. That’s because Apple made augmented reality, not virtual reality, the core of its product.
At WWDC 2023, a live audience of developers and journalists at Apple Park fell silent in surprise when Tim Cook announced that the Vision Pro would be an AR headset rather than the expected VR headset, and for good reason.
Here are my first impressions from the event:
AR is far more important than VR
Most of the anticipation swirling around the launch of the Apple headset centered around it being a VR device with built-in AR capabilities. The reality was quite the opposite. Vision Pro is an AR headset with some VR-like features.
VR has inherent limitations: when you put on a VR headset, you are almost completely cut off from the world around you. That provides an immersive experience that can transport you to different locations, but it also limits how long most people use a headset to 30 minutes or less per day.
Meanwhile, AR glasses could shrink significantly in the next decade and become digital displays that overlay much of our everyday experience.
Tim Cook called it "the first Apple product you look through, and not at."
Connecting the digital world with the real world
Vision Pro is actually a mixed reality headset, a combination of AR and VR. But despite AR and VR being discussed for over a decade, the world still has little understanding of them, so it helps that Apple didn't introduce a whole new set of terminology to confuse people.
Instead, Apple talked about new ways Vision Pro can integrate our daily lives with the online world that many of us spend so much time in today. Cook characterized it as “seamlessly blending the physical and digital worlds.”
Again, AR overlays digital information on top of the real world, allowing developers to build on existing activities, professions, hobbies, and passions rather than digitally recreating them in VR, and opening up an entire category of new experiences.
“Vision Pro blends digital content into the space around us,” summarized Cook.
It’s a “new kind of computer”
One of the biggest surprises for me was when Apple demoed different interfaces for Vision Pro that mimicked the iPad, Apple TV, and Mac. To be honest, the iPad and Apple TV interfaces didn’t surprise me, but the Mac did. Apple has shown that users can create the equivalent of a giant multi-monitor Mac setup within Vision Pro.
Cook didn't try to downplay the importance of this, even calling the Vision Pro "a new kind of computer": "The Mac brought us personal computing, the iPhone brought us mobile computing. In the same way, Apple Vision Pro introduces spatial computing."
He added, “With Vision Pro, you’re no longer limited by your display. It’s an infinite canvas all around you.”
While there are serious questions about how this will work (which we'll get to in a moment), the fact that Apple is positioning the Vision Pro as a work and productivity tool was one of the event's most unexpected and pleasant surprises. It gives the device a much wider range of possibilities than any AR or VR device we've seen so far, and will likely make it more interesting to a wider range of ZDNET readers.
Some big questions
Again, there are a lot of questions about how the Apple Vision Pro will perform in the real world when it goes on sale for $3,499 next year. Over the next few months, we will continue to explore these questions as we learn more about the device and its possibilities.
For me, the concept of Vision Pro as a virtual Mac desktop raises the strongest questions. You could compile these questions into a list long enough to be its own article. But for now, let's focus on the biggest one: how will you interact with a virtual mouse and keyboard in that environment?
Ergonomics aside, mimicking mouse and keyboard movements by waving your hands in the air might work for a few simple gestures, like opening a website in Safari, but it will be less useful for more complex tasks. Perhaps you'll be able to pair a physical Magic Keyboard and Magic Trackpad and use them in your virtual space.
The other thing I'm most curious about is whether the Vision Pro is meant to be used while sitting in your living room or den, or how much Apple anticipates you'll be able to do while moving around with it in the real world. The WWDC demos were all very sedentary and seemed confined to indoor spaces.
One of the strongest long-term attractions of AR glasses is the ability to bring them into the real world and overlay experiences onto activities like hiking to Half Dome in Yosemite. That kind of experience still feels years away, but I'd love to hear more about how Apple sees this product as the first step in that journey.
Apple Vision Pro first take: 3 reasons why this could change everything
https://www.zdnet.com/article/apple-vision-pro-first-take-3-reasons-this-changes-everything/#ftag=RSSbaffb68