Apple To Use AI For FaceTime Calls

In the third developer beta of iOS 13, Apple has added a new feature that makes it appear as though you are looking at the camera during FaceTime calls, even when you are actually looking at the screen.

The new "FaceTime Attention Correction" feature has to be enabled in the FaceTime settings before it starts working.

This AI-based feature essentially corrects the lack of eye contact that occurs because both you and the person you call are looking at the screen rather than at the camera.

Image source: @shanselman on Twitter

Apple also includes a short description under the toggle, which reads, "Your eye contact with the camera will be more accurate during FaceTime Video calls."

The new feature is available on the iPhone XS, iPhone XS Max, iPhone XR and 2018 iPad Pro in the third developer beta of iOS 13, according to a report by MacRumors.

The Cupertino-based tech giant released the third beta of iOS 13 and iPadOS on July 2.

Apart from FaceTime Attention Correction, the third iOS 13 beta also adds a new pop-up in the Home app that lets you know if your HomeKit-connected cameras don't support secure recording.

In the Find My app, there's also a "Me" tab showing your current location. Finally, Apple has added a new video to the Arcade tab of the App Store highlighting the upcoming Apple Arcade service.

Apple announced the iOS 13 beta for iPhones and iPads in June 2019. It is worth noting, however, that iOS 13 is still in beta, and a beta version will not be as smooth and bug-free as the final release.

Author

Anirudh Muley
Anirudh is the Editor in Chief and Main Writer at Clickdotme. He does not like describing himself in the third-person and had a hard time coming up with these two sentences!