Apple gets inspired by Samsung for some of its iOS 17 features

May 18, 2023

Shortly after Samsung announced new accessibility features for the Galaxy Buds 2 Pro, and just ahead of Global Accessibility Awareness Day (May 18, 2023), Apple announced a set of features to improve accessibility on iPhones, and several of them look like they were inspired by Samsung's existing features.

Apple’s Assistive Access is similar to Samsung’s Easy Mode, while Live Speech is similar to Bixby Text Call

Apple revealed Assistive Access, Live Speech, and Personal Voice. Assistive Access is designed for people with cognitive disabilities and offers larger UI elements and simplified layouts so they can access essential features with ease. It is similar to Samsung’s Easy Mode, which streamlines the UI for easier access, helping older users and people with cognitive impairments. Apple’s feature will work on iPhones and iPads with essential apps such as Calls, Camera, Messages, Music, and Photos, offering bigger touch targets and UI elements. It will roll out later this year as part of the iOS 17 update.

With Live Speech, users can type what they want to say during a voice call. An iPhone, iPad, or Mac will then convert that text to speech and relay it to the other side of the call. Users can also save quick phrases to use during calls. This feature is similar to Samsung’s Bixby Text Call, which transcribes voice to text and converts text to speech during calls.

Apple’s Personal Voice feature is similar to Samsung’s Bixby Custom Voice Creator

Apple’s Personal Voice accessibility feature is designed for users who are at risk of losing their voice. They can create a voice that sounds like them by reading a randomized set of text prompts to record 15 minutes of audio on an iPhone or iPad. This feature appears inspired by Bixby Custom Voice Creator, which Samsung launched earlier this year.

Apart from these features, Apple also announced Detection Mode in Magnifier, which iPhone users with low vision can use to read text from objects. Users can simply point the iPhone’s camera at an object or text, and Detection Mode will read that text aloud. This is similar to the Color Detector, Object Identifier, Scene Describer, and Text Reader features in Samsung’s Bixby Vision.

Other new features include ‘Made for iPhone Hearing Devices’ certification for hearing aids, Voice Control improvements, more text size options in essential Mac apps, the option to pause images with moving elements (for those sensitive to rapid animations), and more natural voices for VoiceOver.

Source: SamMobile