iPhone 12 Pro can help the blind get around safely with LiDAR
The iPhone 12 Pro is the first iPhone to feature a LiDAR scanner capable of measuring distances. While many reviewers of the new iPhone models wonder what the depth information is ultimately good for, Apple has an answer: the new distance-measuring capability can help blind users avoid bumping into objects in new or unfamiliar surroundings.
Try taking a short break and standing up as you read this news. Close your eyes and walk around the room with outstretched arms. While your colleagues at work are probably watching you in bewilderment, you should start to feel a fear of the unknown rising from within: am I about to hit something, or is a stranger standing in front of me? People with visual impairments navigate such situations with a walking stick or a trained ear. If an iPhone 12 Pro is also nestled in their pocket, they now have another useful aid for everyday life.
LiDAR scanner reveals distances to walls and objects
As TechCrunch discovered in the latest iOS beta, a new feature for people with visual impairments will be implemented in due course. Using this feature, the iPhone relies on information from the LiDAR scanner to measure the distance to people or objects. The most powerful iPhone should even be able to detect whether a detected obstacle is a human being or an inanimate object.
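Apple has not published how this feature reads the scanner, but the same LiDAR depth data is exposed to developers through ARKit's scene-depth API. The following Swift sketch shows how per-pixel distances (in meters) can be read on a LiDAR-equipped device; it is an illustration under that assumption, not Apple's implementation, and the class name DepthReader is our own.

```swift
import ARKit

/// Minimal sketch: reading LiDAR depth data through ARKit's scene-depth API.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device such as the iPhone 12 Pro.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a depth map with per-pixel distances in meters.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map resolution: \(width)x\(height)")
    }
}
```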
This capability is derived from Apple's ARKit, for which Apple developed a feature known as "People Occlusion". It allows virtual objects in augmented reality applications to appear behind people, drawn around the person(s) without clipping or other visual artifacts.
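For developers, people occlusion is a small opt-in on ARKit's world-tracking configuration. A minimal sketch (the function name is our own):

```swift
import ARKit

/// Sketch: enabling ARKit's people occlusion, the building block described above.
func makePeopleAwareConfiguration() -> ARWorldTrackingConfiguration? {
    // Person segmentation with depth is only available on supported devices.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        return nil
    }
    let configuration = ARWorldTrackingConfiguration()
    // ARKit segments people in the camera image and estimates their depth,
    // so virtual content can be drawn behind them without clipping errors.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    return configuration
}
```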
What such optical image recognition can offer blind people
In the future, this feature will be available in the iPhone's Magnifier app. Apple's smartphone will then activate the ultra-wide-angle camera and measure distances using the LiDAR scanner. Voice output informs users whether there are people in the iPhone's field of view and how far away they are. In addition to the voice output, stereo sound from the iPhone, or from connected hardware such as Apple's AirPods, also conveys where in the field of view a person is located.
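Spoken feedback of this kind can be approximated with Apple's public AVSpeechSynthesizer API. The sketch below is our own rough illustration, not Apple's implementation; the distance value is assumed to come from a depth pipeline like the ARKit example above, and stereo placement could similarly be approximated, for instance with AVAudioPlayer's pan property.

```swift
import AVFoundation

/// Sketch: spoken distance feedback, roughly as the article describes it.
final class ProximityAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    /// `distance` (in meters) is assumed to come from a LiDAR depth pipeline.
    func announce(distanceInMeters distance: Float) {
        let utterance = AVSpeechUtterance(
            string: String(format: "Person ahead, %.1f meters", distance)
        )
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```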
Users should be able to learn to gauge the measured distance from experience, for instance by associating the loudness of the sound with certain distances. As TechCrunch noted, this could be especially helpful in the midst of a coronavirus pandemic, where social distancing is of utmost importance. In quiet environments, or where hearing is otherwise impaired, the iPhone 12 Pro can also communicate distances via vibration pulses. However, the feature is limited by the ambient light in the room: because the ultra-wide-angle camera needs enough light to deliver a reliable image to the iPhone, the function does not work in the dark.
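Such vibration pulses could plausibly be built on Apple's Core Haptics framework. The following sketch, which maps shorter distances to stronger taps, is a guess at the behavior rather than Apple's actual code; the class name and the 5-meter scaling are our own assumptions.

```swift
import CoreHaptics

/// Sketch: haptic feedback that gets stronger as an obstacle gets closer.
final class HapticDistanceFeedback {
    private var engine: CHHapticEngine?

    init() {
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    func pulse(forDistanceInMeters distance: Float) {
        // Map 0-5 m to an intensity between 1.0 (very close) and 0.2 (far).
        let intensity = CHHapticEventParameter(
            parameterID: .hapticIntensity,
            value: max(0.2, 1.0 - distance / 5.0)
        )
        let event = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [intensity],
            relativeTime: 0
        )
        guard let engine = engine,
              let pattern = try? CHHapticPattern(events: [event], parameters: []),
              let player = try? engine.makePlayer(with: pattern) else { return }
        try? player.start(atTime: 0)
    }
}
```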
Here’s how people detection works in iOS 14.2 beta - the voiceover support is a tiny bit buggy but still super cool https://t.co/vCyX2wYfx3 pic.twitter.com/e8V4zMeC5C
— Matthew Panzarino (@panzer) October 31, 2020
With this new feature, Apple once again demonstrates its creativity in using the sensors of modern smartphones. Different implementations of such obstacle aids already exist, such as vibration-based control of the iPhone via the keyboard. Apple's smartphones are therefore often a good choice for people with impaired vision or hearing. How well the new Apple smartphones perform in practice has already been revealed by Julia in her first hands-on with the new iPhone 12 models.
Source: TechCrunch