Apple AR announcements at WWDC 2021

Apple released several new augmented reality tools and technologies for software makers at its annual WWDC conference this week. These technologies could prove vital if Apple does indeed release augmented reality headsets or glasses in the years to come.

Apple has never confirmed its intention to release augmented reality hardware, but it could announce a headset as soon as this year. Facebook, Snap, and Microsoft are also working on devices that can understand the world around them and display information in front of the user’s eyes.

To be successful with an augmented reality device, Apple will need to find good reasons for people to use it – and that boils down to useful software, just as apps like Maps, Mail, YouTube, and the Safari mobile browser helped encourage adoption of the original iPhone. Involving developers in creating augmented reality software now increases the chances that one or more “killer apps” will be available at launch.

Apple didn’t spend much time on augmented reality during its WWDC keynote on Monday, but it announced several updates during the more technical portions of the conference that show AR remains an important long-term initiative for Apple. CEO Tim Cook has called AR the “next big thing.”

“At a high level, this year’s, and maybe even next year’s, WWDC event will amount to a calm before an Apple innovation storm,” wrote Gene Munster, founder of Loup Ventures and a longtime Apple analyst, in an email this week. “What is not visible today is Apple’s continued intense development related to new product categories around wearables, augmented reality, and transportation.”

What Apple announced

During the week-long conference, Apple briefed its developers on rapidly improving tools that can create 3D models; use a device’s camera to understand hand gestures and body language; add fast AR experiences on the web, built on a 3D content standard strongly backed by Apple; and deliver intriguing new sound technology that resembles surround sound for music and other audio.

Here are some of the AR announcements made by Apple and how they are paving the way for its bigger ambitions:

Object Capture. Apple introduced application programming interfaces, or software tools, that will allow apps to create 3D models. 3D models are essential for AR because they are what the software places in the real world. If an app doesn’t have an accurate, detailed file for a shoe, it can’t use Apple’s machine vision software to place it on a table.

Object Capture is not an application. Instead, it’s a technology that lets a camera, like the iPhone’s, take multiple photos of an object and then stitch them together into a 3D model that can be used in software within minutes. Previously, detailed object scanning required precise and expensive camera setups.
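For developers, the workflow Apple describes maps onto a photogrammetry API in RealityKit. Here is a minimal sketch of that flow on a Mac, assuming a folder of overlapping photos has already been captured; the folder and output paths are placeholders:

```swift
import Foundation
import RealityKit  // PhotogrammetrySession requires macOS 12 or later

// Turn a folder of overlapping photos into a 3D model file. Both paths are placeholders.
func buildModel() throws {
    let imagesFolder = URL(fileURLWithPath: "/path/to/object-photos", isDirectory: true)
    let outputModel = URL(fileURLWithPath: "/path/to/output/shoe.usdz")

    // Create a photogrammetry session over the photo folder.
    let session = try PhotogrammetrySession(input: imagesFolder)

    // Listen for progress and completion on the session's output stream.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .processingComplete:
                print("Model reconstruction finished.")
            default:
                break
            }
        }
    }

    // Ask for a .usdz model at medium detail; processing runs asynchronously.
    try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
}
```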

Eventually, third-party developers like Unity, a leading maker of AR engines, will include it in their software. For now, it will likely be most widely used in e-commerce.

RealityKit 2. Object Capture is only one part of a major update to RealityKit, Apple’s set of software tools for creating augmented reality experiences. Beyond object capture, RealityKit 2 includes many smaller improvements to make life easier for app makers, including better rendering options, a way to organize images and other assets, and new tools for creating player-controlled characters in augmented reality scenes.
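For context, the basic RealityKit flow for placing a 3D asset (for example, a USDZ model produced by Object Capture) onto a real-world surface looks roughly like this; the asset name is a placeholder, and the RealityKit 2-specific additions are not shown:

```swift
import RealityKit

// Place a 3D asset (e.g. a USDZ model from Object Capture) on a horizontal surface.
// "shoe" is a placeholder asset name bundled with the app.
func placeModel(in arView: ARView) throws {
    let shoe = try ModelEntity.loadModel(named: "shoe")

    // Anchor the model to the first horizontal plane ARKit finds.
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(shoe)
    arView.scene.addAnchor(anchor)
}
```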

Apple’s new city navigation feature in Apple Maps. (Image: Apple)

ARKit 5. ARKit is another set of software tools for creating augmented reality experiences, but it is more focused on determining the location of digital objects in the real world. This is the fifth major version of Apple’s software since its release in 2017.

This year it includes something called “location anchors,” which means software makers can pin augmented reality experiences to map locations in London, New York, Los Angeles, San Francisco, and a few other U.S. cities. In a developer video session, Apple said it uses the tool to create AR direction overlays in Apple Maps – a potentially useful scenario for a head-mounted AR device.
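In code, location anchors are exposed through ARKit’s geo-tracking API. A minimal sketch, with the coordinate below standing in for any supported city location:

```swift
import ARKit
import CoreLocation

// Attach an AR anchor to a real-world map coordinate using ARKit's geo tracking.
func addLocationAnchor(to session: ARSession) {
    // Geo tracking only works in supported cities, so check availability first.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable here: \(String(describing: error))")
            return
        }

        session.run(ARGeoTrackingConfiguration())

        // Placeholder coordinate (Ferry Building, San Francisco).
        let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        // A renderer such as RealityKit then attaches 3D content to this anchor.
    }
}
```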

AI to understand hands, people and faces. While Apple’s machine learning and artificial intelligence tools are not directly related to augmented reality, they represent capabilities that will be important for a computer interface operating in 3D space. Apple’s Vision framework can be called by applications to detect people, faces and poses through the iPhone’s camera. Apple’s computer vision software can now also identify objects inside images, including text on signs, and search for items in photos, such as a dog or a friend.
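These capabilities live in the Vision framework. A minimal sketch of running face detection and text recognition over a single image; in an AR context the pixels would come from the camera feed instead:

```swift
import Vision
import UIKit

// Run face detection and text recognition over a single image.
func analyze(image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    let faceRequest = VNDetectFaceRectanglesRequest()
    let textRequest = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            // Take the highest-confidence reading of each piece of text.
            if let candidate = observation.topCandidates(1).first {
                print("Found text: \(candidate.string)")
            }
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([faceRequest, textRequest])
    print("Faces found: \(faceRequest.results?.count ?? 0)")
}
```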

Combined with other Apple tools, these AI capabilities can apply effects similar to Snap’s filters. A session at this year’s WWDC even explains how to identify the position or movement of a hand, laying the groundwork for advanced hand gestures, which are a large part of the interface of current AR headsets like Microsoft’s HoloLens.
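That hand-tracking capability builds on Vision’s hand landmark detection. A minimal sketch of reading an index fingertip position from a single camera frame, assuming the pixel buffer comes from an ARKit or AVFoundation capture session:

```swift
import Vision
import CoreVideo

// Detect hand landmarks in a camera frame and return the index fingertip position.
func indexFingertip(in pixelBuffer: CVPixelBuffer) throws -> CGPoint? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return nil }

    // Joint positions come back in normalized image coordinates (0...1).
    let tip = try hand.recognizedPoint(.indexTip)
    guard tip.confidence > 0.5 else { return nil }
    return tip.location
}
```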
