How to use your smartphone to cope with vision loss (2022)
To make the same changes to the Google Assistant, go to Settings > Google > Settings for Google apps > Search, Assistant & Voice, and choose Google Assistant. You may want to tap Lock screen and turn on Assistant responses on lock screen. If you scroll down, you can also adjust the sensitivity, turn on Continued Conversation, and choose which notifications you want the Google Assistant to give you.
How to identify objects, doors and distances
First released in 2019, the Lookout app for Android lets you point your camera at an object to find out what it is. This smart app can help you sort mail, identify groceries, count money, read food labels, and more. The app has different modes for specific situations:
Text mode for signs or labels (short text).
Documents mode can read an entire handwritten letter or a full page of text to you.
Images mode uses Google’s latest machine learning model to give you an audio description of an image.
Food Labels mode can scan barcodes and identify foods.
Currency mode identifies denominations of different currencies.
Explore mode highlights objects and text around you as you move your camera.
The AI-powered features work offline, with no Wi-Fi or data connection required, and the app supports several languages.
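To give a sense of what happens under the hood when a mode like Food Labels scans a barcode: before a product lookup, a scanner typically validates the code's built-in check digit. Lookout's actual pipeline isn't public; this is just an illustrative sketch of the standard EAN-13 check-digit rule, with a hypothetical function name.

```python
def is_valid_ean13(code: str) -> bool:
    """Validate the EAN-13 check digit, the kind of sanity check a
    barcode scanner performs before looking up a product.
    Illustrative sketch only, not Lookout's implementation."""
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    # The first 12 digits are weighted alternately 1, 3, 1, 3, ...
    total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits[:12]))
    # The 13th digit makes the weighted sum a multiple of 10.
    return (10 - total % 10) % 10 == digits[12]

print(is_valid_ean13("4006381333931"))  # True: valid check digit
```

A single transposed or misread digit makes the check fail, which is why scanners can reject a bad read instead of announcing the wrong product.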
Apple has something similar built into its Magnifier app, which relies on a combination of the camera, on-device machine learning, and lidar. Unfortunately, lidar is only available on Pro model iPhones (12 Pro or later), the 12.9-inch iPad Pro (4th generation or later), and the 11-inch iPad Pro (2nd generation or later). If you have one, open the app, tap the gear icon, select Settings, and add Detection Mode to your controls. There are three options:
People Detection will notify you of people nearby and can tell you how far away they are.
Door Detection can do the same for doors, but can also add an outline in your chosen color, provide information on a door’s color, material, and shape, and describe decorations, signage, or text (such as opening hours). This video shows some Apple accessibility features, including Door Detection, in action.
Apple via Simon Hill
Image Descriptions can identify multiple objects around you via text on screen, speech, or both. If you use VoiceOver, you can also go to Settings > Accessibility > VoiceOver > VoiceOver Recognition > Image Descriptions and turn it on to get spoken descriptions of what appears in images you point your iPhone at, such as a painting.
You don’t need a Wi-Fi or data connection to use these features. You can configure things like distance units and whether you want sound, haptic, or speech feedback in the Detectors section at the bottom of Settings in the Magnifier app.
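The core of People Detection is turning a measured distance (from the lidar depth map) into timely feedback. As a toy illustration of that last step, here is a sketch that maps a distance reading to a spoken cue; the function name, thresholds, and wording are all assumptions, not Apple's.

```python
def proximity_announcement(distance_m: float) -> str:
    """Map a measured distance in meters (e.g. from a lidar depth map)
    to a spoken cue. Thresholds and phrasing are illustrative only."""
    if distance_m <= 0:
        return "No person detected"
    if distance_m < 1.0:
        return "Person less than 1 meter away"
    meters = round(distance_m)
    unit = "meter" if meters == 1 else "meters"
    return f"Person about {meters} {unit} away"

print(proximity_announcement(2.4))  # Person about 2 meters away
```

In the real feature, this kind of mapping is what you adjust when you configure distance units and feedback type in the Detectors settings.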
How to Take Better Selfies
Guided Frame is a brand-new feature that works with TalkBack, but it’s currently only available on the Google Pixel 7 and 7 Pro. People who are blind or have low vision can capture the perfect selfie thanks to a combination of precise audio guidance (move right, left, up, down, forward, or back), high-contrast animations, and haptic feedback (different combinations of vibrations). The feature tells you how many people are in the frame, and when you hit the “sweet spot” (which the team used machine learning to find), it counts down before taking a photo.
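The geometric idea behind that audio guidance can be sketched simply: compare where the detected face sits in the frame to where it should be, and announce a move along the axis that is furthest off. Google's real system is ML-based and the direction convention (whether "move right" refers to the phone or your face) depends on camera mirroring, so everything here is an illustrative assumption.

```python
def framing_cue(face_x: float, face_y: float, tol: float = 0.05) -> str:
    """Emit a directional cue from a face center in normalized
    coordinates (0-1, origin top-left, frame center at 0.5, 0.5).
    A toy sketch of the Guided Frame idea, not Google's model."""
    dx = face_x - 0.5
    dy = face_y - 0.5
    if abs(dx) <= tol and abs(dy) <= tol:
        return "Hold still"  # in the sweet spot: start the countdown
    # Guide along whichever axis is further off-center first.
    if abs(dx) > abs(dy):
        return "Move right" if dx > 0 else "Move left"
    return "Move down" if dy > 0 else "Move up"

print(framing_cue(0.7, 0.5))  # Move right
```

Announcing one axis at a time, with a tolerance band around the target, is what makes this kind of guidance followable by ear rather than a stream of conflicting corrections.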
The Buddy Controller feature on iPhone (iOS 16 and later) lets you play a single-player game with someone using two controllers. That gives you the ability to help a friend or family member with a visual impairment when they get stuck in a game (make sure you ask first). To enable the feature, connect the two controllers and go to Settings > General > Game Controller > Buddy Controller.
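Conceptually, the game still sees a single controller; the system merges both players' inputs into one virtual device. Apple hasn't published how it resolves conflicting inputs, so the function below is a purely hypothetical model of one plausible merging rule for an analog axis.

```python
def merged_axis(player: float, buddy: float) -> float:
    """Combine the same analog stick axis from two controllers into
    one virtual input: whichever input is further from neutral wins,
    clamped to the usual [-1, 1] range. A toy model only; not how
    Buddy Controller actually resolves conflicts."""
    combined = player if abs(player) >= abs(buddy) else buddy
    return max(-1.0, min(1.0, combined))

print(merged_axis(0.2, -0.8))  # -0.8: the buddy's stronger input wins
```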
While this guide can’t cover every feature that can help with vision loss, here are some final tips that might help.
You can get voiced directions when out and about on your Android phone or iPhone, and they should be turned on by default. If you use Google Maps, tap your profile picture in the top right, select Settings > Navigation settings, and choose your preferred Guidance volume.
Both Google Maps and Apple Maps offer a feature that lets you see your directions superimposed live on your surroundings simply by raising your phone. For Apple Maps, go to Settings > Maps > Walking (under Directions) and make sure Raise to View is turned on. For Google Maps, go to Settings > Navigation settings and scroll down to make sure Live View, under Walking options, is turned on.
If you’re browsing the web on your Android device, you can always ask the Google Assistant to read a web page by saying, “Ok Google, read it.”
You can find more helpful tips on how technology can assist people with vision impairment from the Royal National Institute of Blind People (RNIB). To find video tutorials for some of the features we’ve discussed, we recommend visiting Hadley’s website and trying the workshops (you will need to register).