Apple Adds New AR-Enhanced ‘People Detection’ Accessibility Feature To iOS 14.2 Developer Beta


Apple has included a new accessibility feature called People Detection in today's release of the iOS 14.2 beta. The software, which is a subset of the Magnifier app introduced with iOS 10, uses augmented reality and machine learning to detect where humans and objects are in space. The addition was first spotted in a late September report by Juli Clover of MacRumors.

The purpose of People Detection is to aid blind and low vision users in navigation; this type of application is particularly well suited to the LiDAR sensor in the iPhone 12 Pro. The goal is to help the visually impaired understand their surroundings. Examples include knowing how many people are in the checkout line at the grocery store, how close one is standing to the end of the platform at the subway station, and finding an empty seat at a table. Another use case is in this era of social distancing: the software can tell you if you're within six feet of another person in order to maintain courtesy and safety.

Users can set a minimum distance for alerts, say, six feet for the aforementioned social distancing, and can choose to receive those notifications as haptic feedback. There is also audible feedback: a user wearing a single AirPod, for example, will hear an alert when they are in close proximity to another person. People Detection is fully compatible with VoiceOver, Apple's screen-reader technology.
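
Apple has not said how People Detection works under the hood, but the behavior described above maps closely onto public ARKit APIs. The sketch below is purely illustrative, not Apple's implementation: it uses ARKit's person-segmentation-with-depth frame semantic to estimate the distance to the nearest person in view and fires a haptic alert when that distance drops below a configurable threshold, echoing the six-foot example. The class name, threshold value, and alert behavior are assumptions made for the sake of the example.

```swift
import ARKit
import UIKit

// A minimal sketch, assuming ARKit's public person-segmentation-with-depth API.
// This is NOT Apple's People Detection code; it only illustrates the general idea.
final class PersonProximityMonitor: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)

    /// Alert threshold in meters (~6 feet); the real feature lets users pick this distance.
    var alertDistance: Float = 1.83

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
            print("Person segmentation with depth is unavailable on this device.")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .personSegmentationWithDepth
        session.delegate = self
        session.run(configuration)
    }

    // Called once per camera frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let personDepth = frame.estimatedDepthData,            // per-pixel distance (meters) for person pixels
              let distance = nearestPersonDistance(in: personDepth),
              distance < alertDistance else { return }
        haptics.impactOccurred()                                      // a shipping app could also announce via VoiceOver
    }

    /// Returns the smallest positive depth value in the buffer, i.e. the nearest person pixel.
    private func nearestPersonDistance(in buffer: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }
        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)

        var nearest = Float.greatestFiniteMagnitude
        for row in 0..<height {
            let rowPointer = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
            for column in 0..<width where rowPointer[column] > 0 {
                nearest = min(nearest, rowPointer[column])
            }
        }
        return nearest == .greatestFiniteMagnitude ? nil : nearest
    }
}
```

On LiDAR-equipped hardware, ARKit's depth output is markedly more accurate than camera-only estimates, which is consistent with Apple positioning the feature around the iPhone 12 Pro.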

A crucial limitation of People Detection is that it does not work in the dark.

In a broad sense, People Detection is the high-tech equivalent of the classic orientation and mobility training taught to visually impaired students in school. Classic mobility training teaches living skills such as street crossing, riding public transit, and proper cane technique; People Detection puts a digital spin on the concept by using technology to heighten spatial awareness.

As useful as People Detection may be on the iPhone, the glaringly obvious dot to connect is how useful the feature would be on a pair of glasses. The Apple rumor mill has been churning out story after story speculating on the company's purported work on AR-powered glasses. Apple's notorious reputation for secrecy and surprise notwithstanding, the 2017 debut of ARKit at WWDC was not an accident. For those who follow the company closely, it doesn't take an astrophysicist to figure out that its best-laid plans for future products often are hiding in plain sight. In that context, it wouldn't be far-fetched to wonder whether People Detection is one such idea. It's useful today, but it will be even more useful tomorrow, whenever that is.

The advent of People Detection, while notable, is not the first time Apple has applied its burgeoning machine learning technology to accessibility. As part of its work on iOS 14, Apple enhanced VoiceOver to be more cognizant of context and the physical world. This applies to static situations, such as deciphering an image posted to social media, as well as dynamic ones, such as recognizing a pet in a room. People Detection takes the concept even further by leveraging not only complex ML algorithms but also AR-friendly hardware like the LiDAR sensor in the iPhone 12 Pro (and iPad Pro).
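
Apple does not document the exact models behind these VoiceOver descriptions, but the public Vision framework exposes similar on-device recognition. The short sketch below, which uses Vision's built-in animal recognizer to label cats or dogs in a still image, is only an illustrative stand-in for the kind of recognition that lets VoiceOver mention a pet in the room; the function name and output strings are hypothetical.

```swift
import Vision
import UIKit

// A minimal sketch, assuming the public Vision framework; not Apple's VoiceOver pipeline.
// It asks Vision to recognize animals (cats and dogs) in a still image and returns a
// short description string, roughly the kind of label a screen reader could speak.
func describeAnimals(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else {
        completion("No image data available.")
        return
    }

    let request = VNRecognizeAnimalsRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedObjectObservation],
              !observations.isEmpty else {
            completion("No animals detected.")
            return
        }
        // Each observation carries ranked labels (e.g. "Cat", "Dog") with confidences.
        let names = observations.compactMap { $0.labels.first?.identifier }
        completion("Detected: \(names.joined(separator: ", "))")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion("Vision request failed: \(error.localizedDescription)")
        }
    }
}
```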

Users can find People Detection by tapping on the people icon on the far right of the Magnifier toolbar. People Detection will be available to all customers with the public release of the iOS 14.2 update.
