Apple, amid a bumper crop of announcements at its WWDC keynote, has unveiled ARKit 3, a major update to its augmented reality platform. Headline features include People Occlusion, which lets you layer content in front of and behind people, and Motion Capture, which captures human motion so it can drive the AR experience.
People come first
People Occlusion integrates people into the scene to heighten the immersive effect: AR content can realistically pass in front of and behind individuals, or enable a green-screen-style effect. Separately, the new multiple face tracking feature can follow up to three faces at once using the front-facing TrueDepth camera on iPhone X, iPhone XS, iPhone XS Max, iPhone XR, and iPad Pro.
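In code, enabling People Occlusion amounts to opting into a new frame semantic on the session configuration. A minimal sketch, assuming an `arView` whose session is set up elsewhere in the app:

```swift
import ARKit

// Enable People Occlusion on a world-tracking session.
let configuration = ARWorldTrackingConfiguration()

// People Occlusion needs recent hardware, so check support first.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Depth-aware segmentation lets virtual content pass behind people.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

arView.session.run(configuration)  // arView assumed from surrounding app code
```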
The new ARKit also captures human motion with a single camera in real time. It perceives body position and movement, so poses and motion can serve as inputs to a scene or drive animated characters. ARKit can additionally use the front and back cameras simultaneously for face and world tracking, so your face can interact with AR content shown in the back-camera view.
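Motion Capture is exposed through a dedicated body-tracking configuration. A minimal sketch, assuming your class already serves as the session's delegate:

```swift
import ARKit

// Start a body-tracking session on supported hardware.
func startBodyTracking(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}

// ARSessionDelegate callback: read joint transforms from the tracked skeleton.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        // Transform of a single joint (the head) relative to the body's root.
        if let head = bodyAnchor.skeleton.modelTransform(for: .head) {
            print("Head position (body space):", head.columns.3)
        }
    }
}
```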
The update can detect up to 100 images at a time and automatically estimates the physical size of each detected image. It also enhances 3D-object detection for more reliable recognition, and uses machine learning to detect planes in the environment more quickly.
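Image detection is configured much as before; the size estimation is a new flag on the world-tracking configuration. A sketch, assuming reference images live in an asset catalog group named "AR Resources" (a placeholder name) and an `arView` set up elsewhere:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Load reference images from the asset catalog ("AR Resources" is a placeholder).
if let referenceImages = ARReferenceImage.referenceImages(inGroupName: "AR Resources",
                                                          bundle: nil) {
    configuration.detectionImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 4
}

// New in ARKit 3: estimate each detected image's physical size automatically.
configuration.automaticImageScaleEstimationEnabled = true

arView.session.run(configuration)
```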
ARKit 3 also enables live collaborative sessions, in which multiple people contribute to a shared world map, making shared experiences such as multiplayer games easier to build, as demoed on stage with Minecraft Earth.
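Collaboration is opt-in on the configuration; ARKit hands you opaque collaboration data to ship over whatever transport you like. A rough sketch, with the networking (a hypothetical `sendToPeers` helper, e.g. over MultipeerConnectivity) left to the app:

```swift
import ARKit

// Opt in to collaborative sessions.
let configuration = ARWorldTrackingConfiguration()
configuration.isCollaborationEnabled = true
arView.session.run(configuration)

// ARSessionDelegate: forward ARKit's collaboration data to nearby peers.
func session(_ session: ARSession,
             didOutputCollaborationData data: ARSession.CollaborationData) {
    if let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                       requiringSecureCoding: true) {
        sendToPeers(encoded) // hypothetical network send
    }
}

// When data arrives from a peer, feed it back into the local session.
func receivedData(_ data: Data) {
    if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARSession.CollaborationData.self, from: data) {
        arView.session.update(with: collaborationData)
    }
}
```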
Reality Composer and RealityKit
In conjunction with ARKit 3, Apple has also unveiled Reality Composer and RealityKit for easily creating interactive AR experiences. Both are aimed at all developers, but they especially benefit those without extensive 3D experience, and both promise to accelerate the development of AR apps and content.
Reality Composer, a new authoring app for iOS, iPadOS, and macOS, lets developers new to 3D easily prototype and produce AR experiences via a drag-and-drop interface and a prebuilt library of 3D objects and animations. You can also import your own USDZ files.
Live linking gives you the flexibility to move between your desktop Mac, iPhone, and iPad as you build. Animations let you move or scale virtual objects, or add a wiggle or spin, for example, and set them off with a trigger of your choosing. You can place, move, and rotate AR objects to compose an experience, then integrate it directly into your app using Xcode or export it to AR Quick Look.
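When you add a Reality Composer project to an app, Xcode generates a Swift loader for each scene. A minimal sketch, using the `Experience` project and `Box` scene names from Xcode's default AR template (your project and scene names will differ):

```swift
import RealityKit

let arView = ARView(frame: .zero)

do {
    // "Experience" and "loadBox" come from Xcode's generated code for the
    // default Experience.rcproject; substitute your own project and scene.
    let boxScene = try Experience.loadBox()
    arView.scene.anchors.append(boxScene)
} catch {
    print("Failed to load Reality Composer scene:", error)
}
```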
The RealityKit framework handles photorealistic rendering and environment mapping, with support for camera noise, motion blur, environment reflections, and grounding shadows, so virtual and real-world content blend together. It also covers animation, physics, and spatial audio, all exposed to developers through the new RealityKit Swift API.
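Building content in RealityKit directly is similarly compact. A minimal sketch, assuming an `arView` already on screen, that anchors a simple box to the first horizontal plane ARKit finds:

```swift
import UIKit
import RealityKit

// Anchor content to a horizontal plane detected in the real world.
let anchor = AnchorEntity(plane: .horizontal)

// A 10 cm box with a basic physically based material.
let box = ModelEntity(mesh: .generateBox(size: 0.1),
                      materials: [SimpleMaterial(color: .white, isMetallic: false)])
anchor.addChild(box)

arView.scene.addAnchor(anchor)  // arView assumed from surrounding app code
```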
AR Quick Look lets you place 3D objects in the real world and now supports models and scenes created in Reality Composer, so you can build interactive experiences that can be viewed and shared on iOS 12 and later.
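Presenting a USDZ model with AR Quick Look takes little more than a QLPreviewController and a data source. A sketch, with "toy_robot.usdz" standing in for any USDZ file bundled with the app:

```swift
import UIKit
import QuickLook

class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    // Present the system AR Quick Look viewer.
    func presentModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "toy_robot.usdz" is a placeholder asset name.
        let url = Bundle.main.url(forResource: "toy_robot", withExtension: "usdz")!
        return url as NSURL  // NSURL conforms to QLPreviewItem
    }
}
```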
ARKit 3 is available as part of the developer beta release of iOS 13.