Smartphones have gradually become more useful for people with a range of physical abilities, thanks to tools like screen readers and adjustable text sizes.
With the recent release of Apple’s iOS 16 and Google’s Android 13 software, even more accessibility features have been introduced or upgraded, including improved live transcription tools and apps that use artificial intelligence to identify objects. When enabled, your phone can send you a visual alert when a baby is crying, for instance, or a sound alert if you’re approaching a door.
And many accessibility tools, old and new, make using the phone easier for everyone. Here’s a tour.
On either an iOS- or Android-based phone, open the Settings app and choose Accessibility to find all the tools and features available. Take time to explore and experiment.
Swiping and tapping by hand to navigate a phone’s features doesn’t work for everyone, but iOS and Android offer several ways to move through the screens and menus, including quick-tap shortcuts and gestures for performing tasks.
Both platforms support navigation through third-party adaptive devices like Bluetooth controllers, or by using the camera to recognize facial expressions assigned to actions, like looking to the left to swipe left. These devices and actions can be configured in the iOS Switch Control and Head Tracking settings, or in Google’s Camera Switches and Project Activate apps for Android.
Apple and Google provide several tools for those who can’t see the screen. Apple’s iOS software offers the VoiceOver feature, and Android has a similar tool called TalkBack; both provide audio descriptions of what’s on your screen (like your battery level) as you move your finger around.
Turning on the iOS Voice Control or Android’s Voice Access option lets you control the phone with spoken commands. Enabling the iOS Spoken Content or Android’s Select to Speak setting has the phone read aloud what’s on the screen, which can be helpful for audio-based proofreading.
Don’t forget a few classic methods of hands-free interaction with your phone. Apple’s Siri and the Google Assistant can open apps and perform actions with spoken commands. And Apple’s Dictation feature (in the iOS Keyboard settings) and Google’s Voice Typing function let you compose text by speaking.
In their Accessibility settings, iOS and Android include shortcuts to zoom in on sections of the phone screen. But if you’d generally like larger, bolder text and other display adjustments, open the Settings app in iOS, select Accessibility and choose Display & Text Size. In Android, go to Settings, then Accessibility, and select Display Size and Text.
The Magnifier app, Apple’s digital magnifying glass for enlarging objects in the camera’s view, has been upgraded in iOS 16. The app’s new functions are designed to help people who are blind or have low vision use their iPhones to detect doors and people nearby, as well as to identify and describe objects and surroundings.
Google’s recently updated Lookout assisted-vision app (a free download in the Play Store) can identify currency, text, food labels, objects and more. Google introduced Lookout in 2018, and it works on Android 6 and later.
With the iOS 16 update, Apple includes Live Captions, a real-time transcription feature that converts audible dialogue around you into text onscreen. Android’s Accessibility toolbox includes the Live Caption setting, which automatically captions videos, podcasts, video calls and other audio media playing on your phone.