Apple secretly adds AR-powered FaceTime eye correction in iOS 13

While we knew iOS 13 was going to contain a lot of useful additions outside of the headline features like Dark Mode, we didn’t expect Apple to add AR-powered eye correction to FaceTime video calls. But that seems to be exactly what it’s done in the most recent update for the iOS 13 public beta.

According to app designer Mike Rundle (and signal-boosted by the surprised folks on Reddit’s Apple subreddit), the iOS 13 public beta now includes an option for “FaceTime Attention Correction.” According to the feature’s tooltip, turning this on will increase the accuracy of your eye contact with the camera during FaceTime video calls. What does that mean? AR black magic trickery, basically.

Haven’t tested this yet, but if Apple uses some dark magic to move my gaze to seem like I’m staring at the camera and not at the screen I will be flabbergasted. (New in beta 3!) pic.twitter.com/jzavLl1zts

— Mike Rundle (@flyosity) July 2, 2019

It all comes down to a minor but irritating flaw that FaceTime — and, admittedly, every other video-calling app — suffers from. When you look at your screen to see the person you're talking to, you're not looking at the camera. And if you're not looking at the camera, you don't appear to be looking at the person you're calling — which leads to a weird disconnect where nobody on the call seems to be looking directly at anyone else.

Apple’s new setting changes that, making subtle alterations to your video stream so it seems as if you’re actually looking directly at the person on the other end of the call. People were quick to try it out, and immediately noticed that the setting is fairly effective.

So how does it work? It’s a combination of Apple’s ARKit augmented reality framework and the TrueDepth camera built into the latest iPhones. FaceTime uses the TrueDepth camera to grab a depth map of your face — much like Face ID — and then runs that data through ARKit, rendering a slightly altered version of your eyes and nose with a new point of focus. Thanks to the processing power of the most recent iPhones, this happens in real time, making the process seamless. In a video, Dave Schukin shows how it’s done.

How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.

Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN

— Dave Schukin (@schukin) July 3, 2019
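The geometry behind the trick is simple to sketch. ARKit’s ARFaceAnchor exposes a lookAtPoint — the point in face-local space where the eyes converge — and comparing that gaze direction with the direction toward the camera gives the angular offset the eye-warp needs to cancel. The Swift snippet below is a minimal illustration of that calculation only; the function and example values are hypothetical, not Apple’s actual implementation:

```swift
import Foundation // for acos and sqrt

// Illustrative sketch (not Apple's code): given where the eyes look
// (e.g. ARFaceAnchor's lookAtPoint) and where the camera sits, both in
// the same coordinate space, compute the angular offset a warp would
// need to cancel so the subject appears to look into the lens.
func correctionAngle(lookAt: SIMD3<Float>, camera: SIMD3<Float>) -> Float {
    func normalize(_ v: SIMD3<Float>) -> SIMD3<Float> {
        v / sqrt((v * v).sum())
    }
    let gaze = normalize(lookAt)
    let toCamera = normalize(camera)
    // The dot product of unit vectors is the cosine of the angle between them.
    let cosAngle = max(-1, min(1, (gaze * toCamera).sum()))
    return acos(cosAngle) // radians; 0 means already looking at the camera
}

// Example: eyes aimed at the screen, camera mounted slightly above it.
let angle = correctionAngle(lookAt: SIMD3<Float>(0, 0, 1),
                            camera: SIMD3<Float>(0, 0.15, 1))
```

In this made-up example the offset works out to roughly 0.15 radians (about 8.5 degrees) — the kind of small, per-frame correction that the TrueDepth depth map makes it possible to apply convincingly.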

As ever, there’s a catch to this amazing new feature. It’s only available on the most recent batch of iPhones — so only iPhone XS and XS Max owners are currently able to experience it. Despite being loaded with the same TrueDepth hardware, the iPhone X misses out. But with Apple being Apple, don’t be surprised if this rolls out for the iPhone X in the full release of iOS 13, or comes to it shortly afterward. At the moment, it’s also unknown whether the feature will come to macOS and iPadOS — but we’d be surprised if it didn’t.

Mark Jansen
Mobile Evergreen Editor