
Apple’s AR announcements at WWDC 2021

At its annual WWDC conference this week, Apple released several new augmented reality tools and technologies for software makers. Those technologies could be critical if Apple eventually releases an augmented reality headset or glasses in the next few years.

Apple hasn’t confirmed plans to release augmented reality hardware, but it reportedly could announce a headset as early as this year. Facebook, Snap, and Microsoft are also working on devices that can understand the world around them and display information in front of the user’s eyes.

For augmented reality devices to succeed, Apple will need to give people compelling reasons to use them. That usually comes down to useful software, just as apps such as Maps, Mail, YouTube, and the mobile Safari browser helped spur adoption of the original iPhone. Getting developers to build augmented reality software now increases the likelihood that one or more “killer apps” will be available at launch.

Apple didn’t spend much time on augmented reality during Monday’s WWDC opening keynote, but the updates announced in the more technical parts of the conference show that it remains an important long-term effort for the company. CEO Tim Cook has said AR is the “next big thing.”

“From a high level, this year’s and perhaps next year’s WWDC will be the calm before Apple’s storm of innovation,” Gene Munster, founder of Loup Ventures and a longtime Apple analyst, wrote in an email this week. “What we don’t see today is the intensive, ongoing development inside Apple around new product categories in augmented reality (AR) wearables and transportation.”

What Apple announced

At the week-long conference, Apple walked developers through rapidly improving tools for creating 3D models, using the device’s camera to understand hand gestures and body language, and adding quick AR experiences to the web, as well as new audio technologies such as surround sound for AR content, music, and other audio.

Below are some of Apple’s AR announcements and how they pave the way for the company’s bigger ambitions.

Object Capture. Apple introduced an application programming interface, or software tool, that lets apps create 3D models. 3D models are essential to AR because they are what the software places into the real world: Apple’s machine-vision software can’t put a virtual shoe on your table unless the app has a detailed model file for that shoe.

Object Capture is not an app. Instead, it’s a technology that takes multiple photos of an object, shot with a camera such as the iPhone’s, stitches them into a 3D model, and makes that model available to software within minutes. Previously, scanning objects at this level of detail required precise and expensive camera rigs.

It will eventually be used by third-party developers: Unity, a leading maker of AR engines, plans to incorporate it into its software. For now, it is likely to see the most use in e-commerce.
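For developers, the Object Capture workflow runs through a photogrammetry session in RealityKit on macOS 12. The sketch below shows the general shape of that flow; the photo folder, output path, and detail level are illustrative assumptions, and error handling is omitted for brevity.

```swift
import Foundation
import RealityKit

// Hypothetical paths: a folder of photos of one object, and where to write the 3D model.
let photosFolder = URL(fileURLWithPath: "/path/to/shoe-photos", isDirectory: true)
let outputModel = URL(fileURLWithPath: "/path/to/shoe.usdz")

// Create a photogrammetry session over the captured photos (macOS 12+).
let session = try! PhotogrammetrySession(input: photosFolder)

// Listen asynchronously for progress updates and the finished model.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("3D model written to \(url.path)")
        case .processingComplete:
            exit(0)
        default:
            break
        }
    }
}

// Ask the session to stitch the photos into a medium-detail USDZ model.
try! session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
RunLoop.main.run() // keep the command-line tool alive while processing runs
```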

RealityKit 2. Object Capture is just one of several major updates to RealityKit, Apple’s set of software tools for creating AR experiences. RealityKit 2 also includes many smaller improvements that make app makers’ work easier, such as better rendering options, new ways to organize images and other assets, and new tools for building player-controlled characters in augmented reality scenes.
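Whatever the version, the basic RealityKit pattern is to anchor virtual content to something the camera finds in the real world. Below is a minimal sketch of that pattern, placing a scanned model on a horizontal surface; the view controller setup and the “shoe” asset name (for example, a model produced by Object Capture) are assumptions for illustration.

```swift
import UIKit
import RealityKit

// A minimal RealityKit scene: anchor a 3D model to the first horizontal plane found.
class ARPlacementViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Anchor content to a horizontal plane such as a table or floor.
        let anchor = AnchorEntity(plane: .horizontal)

        // "shoe" is a placeholder asset name, e.g. a USDZ model from Object Capture.
        if let model = try? ModelEntity.loadModel(named: "shoe") {
            anchor.addChild(model)
        }

        arView.scene.addAnchor(anchor)
    }
}
```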

Apple’s new city navigation features in Apple Maps. (Image: Apple)

ARKit 5. ARKit is another set of software tools for creating AR experiences, focused on working out where to place digital objects in the real world. This is the fifth major version of the software since it was first released in 2017.

This year’s version includes what Apple calls “location anchors,” which let software makers program AR experiences pegged to map locations in London, New York, Los Angeles, San Francisco, and several other U.S. cities. In a video session for developers, Apple said it is using the tool to create AR overlays in Apple Maps, a scenario that could be especially useful for head-mounted AR devices.
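In ARKit, location anchors are exposed through geo-tracking APIs. The sketch below shows the general shape of that flow, assuming a device and city where geo tracking is supported; the coordinate is just an example.

```swift
import ARKit
import CoreLocation

// A minimal sketch of ARKit location anchors (geo tracking).
func startGeoTracking(with session: ARSession) {
    // Geo tracking only works on supported devices and in supported cities.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable: \(error?.localizedDescription ?? "unsupported region")")
            return
        }

        session.run(ARGeoTrackingConfiguration())

        // Pin content to a real-world latitude/longitude (example coordinate in San Francisco).
        let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```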

AI that understands hands, people, and faces. Apple’s machine learning and artificial intelligence tools aren’t directly tied to augmented reality, but they represent important capabilities for computer interfaces that operate in 3D space. Apps can call Apple’s Vision framework to detect people, faces, and poses through the iPhone’s camera. Apple’s computer-vision software can now also identify text in images, such as on a billboard, and search photos for objects such as dogs or friends.
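The Vision framework exposes these capabilities as requests run against an image. Here is a hedged sketch that detects human body poses and reads text from a single still image; the function name and the simple print-based handling are illustrative.

```swift
import UIKit
import Vision

// A minimal Vision sketch: find human body poses and read text in one still image.
func analyze(image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let bodyPoseRequest = VNDetectHumanBodyPoseRequest()

    let textRequest = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            if let candidate = observation.topCandidates(1).first {
                print("Found text: \(candidate.string)")
            }
        }
    }

    // Run both requests against the same image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([bodyPoseRequest, textRequest])

    let poseCount = bodyPoseRequest.results?.count ?? 0
    print("Detected \(poseCount) human body pose(s)")
}
```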

Combined with other Apple tools, these AI capabilities could be used to apply effects similar to Snap’s filters. One session at this year’s WWDC also covered how hand poses and movements can be identified, laying the groundwork for the advanced hand gestures that make up much of the interface on today’s AR headsets, such as Microsoft’s HoloLens.
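Hand-pose detection follows the same request pattern. Below is a sketch of how an app might spot a simple pinch gesture from thumb and index fingertip positions; the confidence and distance thresholds are assumptions for illustration, not values from Apple.

```swift
import CoreGraphics
import Vision

// A minimal hand-pose sketch: detect a pinch from thumb and index fingertip positions.
func detectPinch(in cgImage: CGImage) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])

    guard let hand = request.results?.first,
          let thumbTip = try? hand.recognizedPoint(.thumbTip),
          let indexTip = try? hand.recognizedPoint(.indexTip),
          thumbTip.confidence > 0.3, indexTip.confidence > 0.3 else { return }

    // Fingertip locations are in normalized image coordinates (0...1).
    let dx = thumbTip.location.x - indexTip.location.x
    let dy = thumbTip.location.y - indexTip.location.y
    let distance = (dx * dx + dy * dy).squareRoot()

    if distance < 0.05 {
        print("Pinch gesture detected")
    }
}
```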

Source: https://www.cnbc.com/2021/06/10/apple-ar-announcements-at-wwdc-2021.html
