Apple working on new AR-based feature to help the visually impaired know how many people are around them

Apple is working on a new feature intended to help people with visual impairments stay aware of how far they are from others. This way, they can maintain social distancing norms without having to rely on someone else to do so.

According to a TechCrunch report, the feature is still in development and is currently part of the iOS 14.2 beta. It will also be available only on the high-end iPhone 12 Pro and iPhone 12 Pro Max, which come with a LiDAR sensor, and is built on ARKit, Apple's augmented reality platform for iOS devices.

Apart from detecting the presence of other people in the vicinity, the system is designed to measure the distance to anyone in view of the iPhone 12 camera. This is made possible by the new depth application programming interface (API) in ARKit 4, which lets the LiDAR sensor on the latest iPhone 12 Pro devices record depth information more accurately.

Apple had earlier announced a feature called 'people occlusion', which is able to detect the shape of people. Combining this with the depth information provided by the LiDAR sensor is what is supposed to produce an accurate map of sorts of the people around.
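For developers, the combination described above maps onto ARKit's frame semantics. The following Swift sketch is an illustration of how a third-party app could request LiDAR scene depth together with person segmentation, not Apple's actual implementation; the class name and the distance-announcement step are hypothetical.

```swift
import ARKit

// Hypothetical example: combine ARKit 4's scene-depth API with
// person segmentation ("people occlusion") on a LiDAR device
// such as the iPhone 12 Pro or iPhone 12 Pro Max.
class PeopleDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Both semantics together require LiDAR hardware.
        let semantics: ARConfiguration.FrameSemantics =
            [.sceneDepth, .personSegmentationWithDepth]
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(semantics) else {
            return  // device has no LiDAR sensor
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = semantics
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.sceneDepth?.depthMap is a CVPixelBuffer of per-pixel
        // distances in metres. An app could sample it where the
        // segmentation mask marks a person and then announce the
        // distance (e.g. via VoiceOver) -- that step is omitted here.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        _ = depthMap
    }
}
```

Checking `supportsFrameSemantics` before running the session is what restricts the feature to LiDAR-equipped devices, matching the hardware limitation the article describes.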

This should help those with visual impairments be more aware of how many people are around them. Apart from the two iPhone 12 Pro models, the iPad Pro also comes with a LiDAR sensor, though it is not yet known whether the feature will be introduced on the tablet as well.
