Apple announces new augmented reality tools at WWDC 2021

During this week’s annual WWDC conference, Apple released several new augmented reality tools and technologies for software makers. These technologies could prove essential if Apple eventually releases augmented reality headsets or glasses in the years to come.

Apple has never confirmed its intention to release augmented reality hardware, but it is reportedly planning to announce a headset as soon as this year. Facebook, Snap, and Microsoft are also working on devices that can understand the world around the user and display information in front of their eyes.

For augmented reality devices to succeed, Apple needs to find good reasons for people to use them – software as useful as Maps, Mail, YouTube, and the mobile Safari browser, the apps that drove adoption of the original iPhone. Getting developers to build augmented reality software now increases the likelihood that one or more “killer apps” will be available at launch.

Apple didn’t spend much time on augmented reality during its WWDC keynote on Monday. However, the handful of updates announced during the more technical parts of the conference shows that AR remains an important long-term initiative for Apple, whose CEO Tim Cook has called it the “next big thing.”

“At a high level, this year’s WWDC event, and likely next year’s, will be the calm before the storm of Apple innovation,” Gene Munster, founder of Loup Ventures and a longtime Apple analyst, wrote in an email this week. “Out of sight is Apple’s continued intense development of wearable augmented reality devices and a new product category around transportation.”

What Apple announced

During the week-long conference, Apple walked developers through a tool that can quickly create 3D models, ways to use the device’s camera to understand hand gestures and body language, faster augmented reality experiences on the web, and new audio technologies such as spatial sound for AR content, music, and other audio.

Here are some of Apple’s AR-related announcements and how they point toward the company’s larger ambitions.

Object Capture. Apple introduced an application programming interface, or software tool, that lets apps create 3D models. 3D models are essential for AR because they are what the software places into views of the real world. If an app doesn’t have a detailed model of a shoe, it can’t use Apple’s machine vision software to place that shoe on a table.

Object Capture is not an app. Instead, it’s a technology that takes multiple photos of an object, shot with a camera such as the iPhone’s, and stitches them together into a 3D model that can be used in software within minutes. Previously, detailed scanning of objects required precise and expensive camera setups.
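For developers, driving the new API looks roughly like the minimal sketch below, which uses RealityKit’s PhotogrammetrySession on a Mac; the folder and output paths are hypothetical.

import RealityKit
import Foundation

// Minimal sketch: turn a folder of photos into a .usdz model with
// RealityKit's Object Capture API (macOS 12+). Paths are hypothetical.
let photosFolder = URL(fileURLWithPath: "/Users/me/Captures/Shoe")
let outputModel = URL(fileURLWithPath: "/Users/me/Captures/Shoe.usdz")

do {
    // Create a photogrammetry session over the input images.
    let session = try PhotogrammetrySession(input: photosFolder)

    // Listen for progress and completion on a background task.
    Task {
        for try await output in session.outputs {
            switch output {
            case .processingComplete:
                print("Model written to \(outputModel.path)")
            case .requestError(_, let error):
                print("Reconstruction failed: \(error)")
            default:
                break // progress updates and other events
            }
        }
    }

    // Ask for a medium-detail model file.
    try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])
} catch {
    print("Could not start Object Capture session: \(error)")
}

Requesting a lower detail level, such as .reduced, trades fidelity for a smaller file that is better suited to viewing in AR on a phone.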

Unity, the maker of a major AR engine, and other third-party developers will eventually include it in their software. For now, it could be widely used in e-commerce.

RealityKit 2. Object Capture is just one of the major updates to RealityKit, a set of software tools for creating augmented reality experiences. Beyond Object Capture, RealityKit 2 includes many smaller improvements that make it friendlier for app makers, such as improved rendering options, better ways to organize images and other assets, and new tools for building player-controlled characters in augmented reality scenes.
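Among the developer-facing additions, RealityKit 2 lets apps plug custom logic into its entity-component-system update loop. The sketch below is illustrative only; the SpinComponent and SpinSystem names are hypothetical.

import RealityKit
import simd

// Hypothetical component/system pair using RealityKit 2's custom
// entity-component-system hooks: spins any tagged entity each frame.
struct SpinComponent: Component {
    var speed: Float = 1.0  // radians per second
}

class SpinSystem: System {
    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        // Find every entity that carries a SpinComponent.
        let query = EntityQuery(where: .has(SpinComponent.self))
        for entity in context.scene.performQuery(query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the vertical axis, scaled by the frame's delta time.
            entity.transform.rotation *= simd_quatf(
                angle: spin.speed * Float(context.deltaTime),
                axis: [0, 1, 0])
        }
    }
}

// Registration, typically done once at app launch.
SpinComponent.registerComponent()
SpinSystem.registerSystem()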

Apple’s new city navigation feature in Apple Maps. (Image: Apple)

ARKit 5. ARKit is another set of software tools for creating augmented reality experiences, focused on figuring out where digital objects belong in the real world. This is the fifth major release of the software since Apple first introduced it in 2017.

This year, it adds what Apple calls “location anchors,” which let software makers pin AR experiences to specific map coordinates in London, New York, Los Angeles, San Francisco, and several other US cities. Apple said it uses this tool to create the AR walking-direction overlays in Apple Maps, a scenario that could become especially useful on a head-mounted AR device.
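In code, location anchors are used through ARKit’s geo-tracking configuration, roughly as in this sketch; the coordinates are placeholder values, and geo tracking only works in supported cities.

import ARKit
import CoreLocation

// Minimal sketch of ARKit location anchors: check availability, start
// geo tracking, and pin an anchor to a latitude/longitude.
func startGeoTracking(on session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available, error == nil else { return }

        DispatchQueue.main.async {
            // Run geo tracking on the session.
            session.run(ARGeoTrackingConfiguration())

            // Pin an anchor to a real-world coordinate (example values).
            let coordinate = CLLocationCoordinate2D(latitude: 37.7954, longitude: -122.3937)
            session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }
}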

AI to understand hands, people, and faces. Apple’s machine learning and artificial intelligence tools are not directly about augmented reality, but they represent important capabilities for computer interfaces that operate in 3D space. Apple’s Vision framework can be called from an app to detect people, faces, and poses through the iPhone’s camera. Apple’s computer vision software can now also identify text in images, such as on signs, and can search photos for particular subjects, such as dogs or friends.

These AI tools can be combined with other Apple tools to apply effects similar to Snap’s filters. One session at this year’s WWDC also showed developers how to identify hand poses and movements, laying the groundwork for the kind of advanced hand gestures that make up much of the interface on current AR headsets such as Microsoft’s HoloLens.
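A rough sketch of that kind of hand-pose detection with the Vision framework might look like the following; a real app would feed live camera frames rather than a single image, and the function name here is hypothetical.

import Vision
import CoreGraphics

// Minimal sketch: detect hands in an image and read the thumb tip position
// using Vision's human hand pose request.
func detectHandPose(in image: CGImage) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
        guard let observations = request.results, !observations.isEmpty else {
            print("No hands detected")
            return
        }
        for hand in observations {
            // Recognized points come back in normalized image coordinates.
            let thumbPoints = try hand.recognizedPoints(.thumb)
            if let tip = thumbPoints[.thumbTip], tip.confidence > 0.5 {
                print("Thumb tip at \(tip.location)")
            }
        }
    } catch {
        print("Hand pose request failed: \(error)")
    }
}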
