If you’re living a life free from disability, you have the freedom to do what you want, how you want, on any device. Now think about the one in five Aussies living with a disability, and the challenges they may face doing things you take for granted in this increasingly tech-oriented world.

Apple has long been a leader in accessibility on their devices, and today they’ve announced new features which will come to iOS later this year. Apple won’t say whether that will be in an iOS 15 update or in an iOS 16 release later in the year, but they are coming this year.

4.2 million Aussies are over 65, and of them, more than 50% have a disability. From hearing loss to vision loss, these challenges can limit their access to and use of technology – challenges Apple hopes to reduce with its accessibility features.

Here’s what’s coming.

Door Detection

Yep, we take it for granted: walking down the street, you see the doors of a shop and know what the shop is, whether the doors will open automatically or need to be pushed or pulled, or frankly whether the door is already open.

With Door Detection in the Magnifier app, using the power of the LiDAR sensor in Apple’s Pro phones and on-device machine learning, the iPhone can describe, in words on the screen and/or via a voice prompt, all the things you need to know about the door.

Amazing stuff.

Apple Watch Mirroring

The Apple Watch can do lots of things, and some of its features are even more valuable to people who don’t have the physical ability to fully operate the Watch itself.

Adding Apple Watch Mirroring means the Watch “screen” can be viewed on an iPhone and used like any other app.

Imagine someone with quadriplegia, in a wheelchair, Apple Watch on, iPhone on a mount in their view. They might control their wheelchair via a simple breath or voice, and that’s also how they control the iPhone.

Now, with Apple Watch Mirroring, they can command their Apple Watch to check heart rate, blood oxygen levels and more.

Live Captions on iPhone, iPad and Mac

Once again making full use of the processing power and machine learning capabilities of its devices, Apple is bringing Live Captions to iPhone, iPad and Mac, initially in the US and Canada.

When turned on, Live Captions will transcribe any audio coming out of the device – it doesn’t matter if you’re on a phone call, watching a video or anything else; all the sound can be transcribed.

Additionally, you can switch to “input” mode, so it will transcribe what the microphone hears – great for an in-person conversation.

These are remarkable features, and whatever your view of iPhones or Apple, you simply cannot deny that Apple is putting accessibility at the core of its products. These advances are mind-blowing even to those of us who are able-bodied, without any vision or hearing loss.

Perhaps the day we need them ourselves, we’ll appreciate them most.