iOS 13 to Use AR to Fix Eye Contact in FaceTime

If you use video calls regularly, you might be familiar with how impersonal the medium can feel because of the lack of eye contact. Maintaining eye contact requires looking directly at the camera, but people naturally look at the center of the screen instead. To address this shortcoming of video calls, Apple has introduced a fix in the third beta of the upcoming iOS 13 that makes eye contact feel more natural during FaceTime.

The new feature, called FaceTime Attention Correction, adjusts your eye contact with the camera while you’re on a call. Mike Rundle recently took to Twitter to share his discovery, as the feature’s description notes, “Your eye contact with the camera will be more accurate during FaceTime Video calls.”

Shortly after sending out the tweet, Rundle revealed that he and Will Sigmon had tested the feature: the correction makes it appear as though you’re making eye contact with the camera even while your eyes stay on the screen.

In a follow-up, Sigmon posted examples of the effect on his Twitter feed to show how it works. He explained that the feature “simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.”
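To make that description a bit more concrete, here is a minimal, purely illustrative Swift sketch of the kind of ARKit face-tracking data Sigmon is referring to. It is not Apple’s implementation; the class name FaceTrackingController and the adjustGaze placeholder are hypothetical, and the sketch only shows how a face anchor’s pose and per-eye transforms can be read, which a renderer could then use to warp the eye regions toward the camera.

```swift
import ARKit
import simd

// Illustrative sketch only: sets up ARKit face tracking and reads the face
// pose and per-eye transforms that a gaze-correction step could rely on.
final class FaceTrackingController: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera; bail out if unsupported.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Position and orientation of the face relative to the camera.
            let facePose = faceAnchor.transform
            // Per-eye transforms; comparing them with the camera position
            // gives an estimate of where the user is actually looking.
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            adjustGaze(facePose: facePose, leftEye: leftEye, rightEye: rightEye)
        }
    }

    private func adjustGaze(facePose: simd_float4x4,
                            leftEye: simd_float4x4,
                            rightEye: simd_float4x4) {
        // Hypothetical placeholder: a real implementation would re-render or
        // warp the eye regions of the video frame so the gaze appears to be
        // directed at the camera. That step is not shown here.
    }
}
```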

AR has come a long way over the past couple of years, and it’s good to see the technology being used to create a more ‘personal’ experience, even if that means creating artificial eye contact. The feature is currently available only in the third developer beta of iOS 13, but you can expect it to reach the corresponding public beta very soon.
