A first look at Apple's new Augmented Reality features

June 23, 2020

Apple's WWDC event is greatly anticipated in the Apple developer community every year. When it comes to the augmented reality world, we only got a quick glimpse of what is new in yesterday's keynotes. Here is an overview of what we know so far, with more details to unravel this week after the engineering sessions. You can find a list of recommended AR sessions to check out from this year's WWDC at the end of the article.

ARKit 4

ARKit helps developers build powerful augmented reality experiences for millions of users worldwide. Here are the three main new features announced this year:

Depth API

The new Depth API, powered by the LiDAR scanner available on the iPad Pro, gives developers access to per-pixel depth information about the surrounding environment. This will make occlusion and placement of 3D content even more realistic. Some examples of how this could be used:

  • Taking more precise measurements
  • Applying effects to a user’s environment
  • Taking body measurements for more accurate virtual try-ons
  • Testing how your room will look with different wall colors

3D depth map created using the output from the Depth API (Image source: Apple Platforms State of the Union)

Available on iPad Pro 11-inch (2nd generation) and iPad Pro 12.9-inch (4th generation)
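
As a rough sketch of how an app might opt into the new per-pixel depth data, the snippet below enables the sceneDepth frame semantics and reads the depth and confidence maps from each frame. The DepthReader class and its surrounding setup are illustrative, not part of Apple's sample code:

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires the LiDAR scanner (the 2020 iPad Pro models at the time of writing).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth carries a per-pixel depth map plus a confidence map for every frame.
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        let confidenceMap: CVPixelBuffer? = depth.confidenceMap
        // Use depthMap / confidenceMap for measurements, depth-aware effects, and so on.
        _ = (depthMap, confidenceMap)
    }
}
```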

Location Anchors

A new session configuration is available for tracking geographic locations: ARGeoTrackingConfiguration, which combines GPS, the device's compass, and world-tracking features to pin back-camera AR experiences to specific geographic locations.

Location anchors (ARGeoAnchor) are used to specify a latitude, longitude, and, optionally, an altitude. For example, when you are close to a landmark, the app could reveal a virtual signpost with historical facts. What makes me really excited about this new feature is how easy it now becomes to build AR street routes by placing a group of location anchors.

To place location anchors with precision, GPS data alone is not sufficient. ARKit will download batches of imagery depicting the physical environment in that area and use it to help the session determine the user's precise geographic location.

This localization imagery captures the view mostly from public streets and routes accessible by car. As a result, geo tracking will be available in limited cities and locations where Apple collected localization imagery in advance.

Requires iPhone XS, iPhone XS Max, iPhone XR, or later. Available in select cities and areas
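
Here is a minimal sketch of how a session could check geo-tracking availability and place a location anchor. The coordinates and the 11-metre altitude are hypothetical values, purely for illustration:

```swift
import ARKit
import CoreLocation

func startGeoTracking(with session: ARSession) {
    // Geo tracking only works where Apple has collected localization imagery in advance.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }
        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())

            // Hypothetical landmark coordinates for a virtual signpost.
            let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
            let signpostAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 11.0)
            session.add(anchor: signpostAnchor)
        }
    }
}
```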

Expanded Face Tracking Support

Face tracking is now supported on the front-facing camera of any device with the A12 Bionic chip or later. Face filters are a big success nowadays on popular apps like Instagram, Snapchat, and TikTok, so opening up this feature to more devices is a great addition. You will be able to track up to three faces at once using the TrueDepth camera.
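
A short sketch of how an app could opt into multi-face tracking, assuming the standard ARFaceTrackingConfiguration flow:

```swift
import ARKit

func startFaceTracking(with session: ARSession) {
    // Face tracking is only available on supported devices.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // supportedNumberOfTrackedFaces reports the per-device limit (up to three in ARKit 4).
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```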


You can read more about ARKit's features in this article about ARKit 3.

RealityKit

Introduced at last year's WWDC, RealityKit is Apple's rendering, animation, physics, and audio Swift framework built from the ground up with augmented reality in mind. Latest improvements include video textures, scene understanding using the LiDAR scanner on iPad Pro, Location Anchors, face tracking, and improved debugging tools.

Video Textures

Video textures can now be added to any part of a scene in RealityKit, bringing objects, surfaces, and even characters to life. Example use cases of this new feature:

  • Animated virtual TV screens with rich videos
  • Making virtual characters smile
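
As an illustration of the virtual TV screen use case, the sketch below maps a VideoMaterial onto a simple plane. The bundled "video.mp4" asset and the makeVideoScreen helper are hypothetical:

```swift
import RealityKit
import AVFoundation

// Builds a simple 16:9 plane that plays a bundled video as its texture.
func makeVideoScreen() -> ModelEntity? {
    // "video.mp4" is a placeholder asset name for this example.
    guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4") else { return nil }

    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)

    let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [material])
    player.play()
    return screen
}
```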

Improved Object Occlusion Rendering

The LiDAR scanner enables many powerful use cases, and being able to place virtual objects more accurately, such as under a table or behind a wall, supports a much more seamless AR experience. A quick glimpse of how this will work was presented in the Platforms State of the Union keynote, and it showed how crisp the definition is.

Video source: Apple Platforms State of the Union
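
In RealityKit this kind of occlusion is driven by the LiDAR scene mesh; a minimal sketch of turning it on in an ARView, assuming a LiDAR-equipped device, might look like this:

```swift
import RealityKit
import ARKit

func enableOcclusion(in arView: ARView) {
    // Scene reconstruction (the LiDAR mesh) is required for this kind of occlusion.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh
    arView.session.run(configuration)

    // Virtual content is then hidden behind real-world geometry such as tables or walls.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```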

Automatic Updates

Location Anchoring, expanded face tracking support, and improved object occlusion rendering are automatically available to apps using RealityKit thanks to its native ARKit integration.

USD

Universal Scene Description is the technology behind USDZ, the AR-focused file format introduced by Apple at WWDC 2018. Newly proposed schema and structure updates to the standard will help with building AR content with interactive properties like anchoring, physics, behaviors, 3D text, and spatial audio.

Recommended AR WWDC20 sessions

  • Explore ARKit 4
  • What's new in RealityKit
  • The artist’s AR toolkit
  • What's new in USD
  • Shop online with AR Quick Look

Resources

  • Augmented Reality - Apple Developer
  • Using USDZ for a better AR experience
  • How to make an augmented reality decorating experience app with AR Quick Look
  • How to convert 3D models to USDZ files using Apple's Reality Converter
  • iPadOS 14 introduces new features designed specifically for iPad

Article Photo by Apple

ios, arkit, realitykit, augmented reality, wwdc20

Author

Roxana Jula

Senior Mobile Developer

👩🏼‍💻 Augmented Reality and Future Tech Enthusiast
