Google officially introduced Live Caption with Android 10, a headline feature that generates captions for videos, podcasts, and audio messages on your phone. The company has been expanding the service’s capabilities ever since and currently uses the same technology to power Live Translate. Now, coming up on three years since Google first announced its solution, Apple’s introducing something similar for iPhones alongside a slew of other accessibility tools.

Live Captions will arrive for iPhone, iPad, and Mac with similar capabilities to its Android counterpart. That means auto-generating on-screen text from any audio content — like a FaceTime call, a social media app, or streaming video — so hearing-impaired users can follow along with conversations more easily.

The display of captions will be fairly flexible, letting users tweak the font size to their preference. On FaceTime calls, the system will automatically attribute transcribed dialogue to each participant, and on Mac, users will be able to type a response during a call and have it spoken aloud. Live Captions on iPhones will be generated on-device just like on Android, so Apple never gets to listen in on any of it.
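For a sense of what on-device transcription looks like in practice, Apple’s existing Speech framework already exposes a flag for keeping recognition local. The Swift sketch below is illustrative only, not the Live Captions implementation (which is a system feature, not a public API); it simply shows an app requesting on-device-only transcription of an audio file, with `audioURL` standing in as a placeholder.

```swift
import Speech

/// Transcribe an audio file without sending it off-device.
/// Minimal sketch; Live Captions itself is a system feature, not this API.
func transcribeLocally(audioURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else {
            print("On-device recognition unavailable")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        request.requiresOnDeviceRecognition = true // audio never leaves the device

        recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error.localizedDescription)")
            }
        }
    }
}
```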

Meanwhile, Apple is also introducing an intriguing door detection feature for users who are blind or have low vision. Door Detection will use the iPhone’s LiDAR scanner, camera, and machine learning to locate a door when a user arrives at a new destination, estimate how far away it is, and note whether it's open or closed. Since LiDAR is a prerequisite for the feature, it’ll likely be limited to the iPhone 12 Pro and 13 Pro models and some iPad Pros.
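There’s no public Door Detection API, but the LiDAR depth data it presumably draws on is already exposed to apps through ARKit. The sketch below is a rough, assumed illustration of just the distance-measuring piece of such a pipeline — reading the scene depth map to estimate how far away whatever sits at the center of the camera frame is; the door-recognition step (camera plus machine learning) is not shown.

```swift
import ARKit

/// Rough sketch: read the LiDAR depth map ARKit exposes to estimate the
/// distance to whatever sits at the center of the camera frame.
/// Recognizing that the object is a door (camera + ML) is a separate, unshown step.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("This device has no LiDAR depth support")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Depth values are Float32 distances in meters; sample the center pixel.
        let row = base.advanced(by: (height / 2) * rowBytes)
        let meters = row.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Object straight ahead is roughly %.2f m away", meters))
    }
}
```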

Apple’s got a ton of other accessibility features it's showing off, including the ability to read signs and symbols around doors. We also get new wearable capabilities, like Apple Watch Mirroring, which lets users control their Apple Watch from their iPhone with assistive features such as Switch Control and Voice Control, along with over 20 additional VoiceOver languages, including Bengali and Catalan.