This new FaceTime update fixes one of the most awkward things about video calls
Video chat is an incredibly useful feature that has always been plagued by a weird, 'slightly off' feeling whenever you're trying to talk to someone. A good chunk of that is caused by the video itself — the camera is usually located at the top of your phone or computer, so every time you look at your screen, you appear to be looking downward instead of at the other person. It's such a noticeable flaw that "look at the camera, not the screen" has become a hot tip for job interviews done by webcam. Apple apparently knows this all too well, and an iOS 13 FaceTime update is attempting to tackle the issue to make video chatting look and feel more natural.
A recently discovered feature in the iOS 13 beta, called FaceTime Attention Correction, can be switched on in the settings to make it appear as though you're looking directly at the person you're speaking with, even when you're actually looking at the screen. For now, that's all it does: it's simply an optional setting that adjusts the video to fix the eye contact problem.
On Twitter, tech enthusiast Will Sigmon tested out FaceTime Attention Correction with some comparison photos. In the first pair, he photographs himself looking at the camera and then at the screen with the FaceTime correction turned on. In the second pair, taken with the regular Camera app, the shot where he's looking at the screen shows a noticeable downward gaze.
How does this work? Augmented reality (AR), the same kind of technology already used to turn yourself into an emoji or catch Pokémon in the real world. Now Apple is applying it to make a subtle adjustment to FaceTime chats.
Dave Schukin, co-founder of the driver-monitoring startup Observant AI, posted on Twitter to show where AR comes into play. In his short video, he holds the temple arm of his eyeglasses in front of his face. The thin metal stays straight as he slowly moves it upward, then suddenly warps when it crosses his nose and eyes.
As he explained it, this indicates that FaceTime Attention Correction uses AR to map the features and depth of your face, then adjusts the image of your eyes to match where you'd be looking if you were talking in person.
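Apple hasn't published how the correction is implemented, but ARKit's face-tracking API hints at the kind of data it could draw on. Below is a minimal Swift sketch, not Apple's actual code, that reads the face mesh, eye positions, and gaze direction ARKit exposes on TrueDepth devices; the class and function names are invented for illustration.

```swift
import ARKit

// Illustrative only: reads the face-depth and gaze data that ARKit's
// face tracking exposes. Apple hasn't said how Attention Correction
// works internally; this just shows the inputs such a warp could use.
class FaceDataReader: NSObject, ARSessionDelegate {

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // A 3D mesh of the user's face, giving per-vertex depth.
            let vertexCount = faceAnchor.geometry.vertices.count

            // Positions of each eye and a rough gaze target.
            let leftEye = faceAnchor.leftEyeTransform.columns.3
            let rightEye = faceAnchor.rightEyeTransform.columns.3
            let gaze = faceAnchor.lookAtPoint

            print("mesh vertices: \(vertexCount), gaze: \(gaze), eyes: \(leftEye) \(rightEye)")
        }
    }
}

// Starts a TrueDepth face-tracking session on supported devices.
func startFaceTracking(with delegate: FaceDataReader) -> ARSession? {
    guard ARFaceTrackingConfiguration.isSupported else { return nil }
    let session = ARSession()
    session.delegate = delegate
    session.run(ARFaceTrackingConfiguration())
    return session
}
```

A correction built on data like this would then warp the eye region of the outgoing frame using that mesh, which is consistent with the glasses appearing to bend as they pass over Schukin's eyes.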
FaceTime Attention Correction is currently only available for the iPhone XS and XS Max. This is quite possibly due to the type of AR used for the feature. Software developer Aaron Brager suspected the correction requires ARKit 3, the latest version of Apple's AR development tools, which isn't available for the iPhone X and earlier models.
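If the limitation really does come down to ARKit 3 capabilities, the gating logic would look something like the sketch below. The function name is hypothetical; ARFaceTrackingConfiguration.isSupported and the iOS 13 supportsWorldTracking property are real ARKit APIs, but whether Apple keys the feature off them is an assumption.

```swift
import ARKit

// Hypothetical capability check: how an app might gate an ARKit 3
// face feature by device, which would explain support being limited
// to newer iPhones. Apple's actual check isn't public.
func deviceSupportsARKit3FaceFeatures() -> Bool {
    // TrueDepth face tracking, available since the iPhone X era.
    guard ARFaceTrackingConfiguration.isSupported else { return false }

    // Simultaneous face and world tracking is an ARKit 3 addition
    // that requires newer hardware running iOS 13 or later.
    if #available(iOS 13.0, *) {
        return ARFaceTrackingConfiguration.supportsWorldTracking
    }
    return false
}
```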
There's no word on whether FaceTime Attention Correction will come to iPads or to older iPhone models, but iOS 13 is still in beta, and there's always a chance a few things will change by the time it's fully released to the public.
Additionally, FaceTime Attention Correction isn't the only new feature coming to iOS 13. The entire operating system is getting a revamp that cleans up some menus, offers more privacy options, adds explanations to settings, and redesigns the animations for launching certain apps. It will also include new region-based features, such as allowing German users to use the iPhone as an ID card.
Apple users who enjoy FaceTiming their friends can look forward to these new changes when iOS 13 finally arrives. There is no set release date, but tech enthusiasts have predicted it will debut by fall of this year.