iPhone 12 Pro Models Getting New Tool to Help Visually Impaired Users Know About Their Surroundings

The People Detection feature appears to be part of the Magnifier app and uses the LiDAR sensor and wide-angle camera to function. It is an extension of Apple's ARKit People Occlusion technology.

Apple has added a new tool to the iOS 14.2 Beta that could aid visually challenged users, or anyone with low vision, in detecting how far away they are from other people. The feature, called People Detection, is part of the Magnifier app under the Accessibility settings and uses augmented reality (AR) and machine learning to detect where humans and objects are in a given space. The new tool relies on the LiDAR sensor on the iPhone 12 Pro and iPhone 12 Pro Max, which essentially measures the distance between an object and the smartphone. It also uses the phones' wide-angle camera to capture a wider view of the surroundings. At the moment, the feature appears to be limited to the iPhone 12 Pro and iPhone 12 Pro Max, and is not available on the vanilla iPhone 12 or the iPhone 12 mini.

According to CNET, People Detection leverages technology from Apple's ARKit People Occlusion to detect whether someone is in the camera's field of view and to estimate how far away that person is. If another person is within a range of 5 meters (15 feet), the user gets corresponding sound alerts. Additionally, the sound alerts can be customised based on the distance between the phone and the person detected. The feature can also be useful amid the pandemic, when people are advised to maintain six feet of social distance from others.
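
For developers, the ARKit People Occlusion data that CNET refers to is exposed through public APIs. The following is a minimal, hypothetical sketch, not Apple's actual implementation, of how the person segmentation mask and its estimated depth buffer could be combined on a LiDAR-equipped iPhone to trigger feedback when someone comes within the reported 5-meter range; the PersonDistanceMonitor class name, the threshold, the assumed depth pixel format and the haptic response are all illustrative assumptions.

```swift
import ARKit
import UIKit

// Hypothetical sketch: combine ARKit's person segmentation mask with its
// estimated depth data to alert when a person is within a chosen range.
// Class name, threshold and feedback choice are illustrative only.
final class PersonDistanceMonitor: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let alertDistance: Float = 5.0   // meters, matching the reported alert range

    func start() {
        // Person segmentation with depth is ARKit's "People Occlusion" feature.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics([.personSegmentationWithDepth]) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = [.personSegmentationWithDepth]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let mask = frame.segmentationBuffer,      // 8-bit person mask
              let depth = frame.estimatedDepthData,     // per-pixel depth in meters
              CVPixelBufferGetPixelFormatType(depth) == kCVPixelFormatType_DepthFloat32 // assumed format
        else { return }

        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }

        guard let maskPtr = CVPixelBufferGetBaseAddress(mask)?.assumingMemoryBound(to: UInt8.self),
              let depthPtr = CVPixelBufferGetBaseAddress(depth)?.assumingMemoryBound(to: Float32.self)
        else { return }

        let dw = CVPixelBufferGetWidth(depth), dh = CVPixelBufferGetHeight(depth)
        let mw = CVPixelBufferGetWidth(mask), mh = CVPixelBufferGetHeight(mask)
        let maskStride = CVPixelBufferGetBytesPerRow(mask)
        let depthStride = CVPixelBufferGetBytesPerRow(depth) / MemoryLayout<Float32>.size

        // Find the nearest depth sample that falls on a pixel classified as a person.
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<dh {
            let my = y * mh / dh                  // scale into mask coordinates
            for x in 0..<dw {
                let mx = x * mw / dw
                let d = depthPtr[y * depthStride + x]
                if maskPtr[my * maskStride + mx] != 0, d > 0 {
                    nearest = min(nearest, d)
                }
            }
        }

        if nearest < alertDistance {
            // Illustrative feedback: a haptic pulse; a real app could instead
            // play sound cues that change with the measured distance.
            UIImpactFeedbackGenerator(style: .heavy).impactOccurred()
        }
    }
}
```

In such a sketch, starting the monitor and pointing the camera ahead would pulse the haptics whenever someone enters the chosen radius, loosely mirroring the behaviour CNET describes.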

Additionally, Apple has reportedly added a haptic pulse option to People Detection, which delivers feedback through vibration and can also help users with hearing impairments. Since the feature requires the wide-angle camera on the iPhone 12 Pro and iPhone 12 Pro Max, it may not work well in dark or low-light environments. Meanwhile, some users on Twitter have showcased how the feature works with the iOS 14.2 Beta; however, since it is still in development, the sound feedback appears to be buggy. Apple is reported to roll out the stable version of iOS 14.2 this week.

