At a time when Artificial Intelligence (AI) is all over the news — with people blown away by what it can do, but also worried about where it might go — Apple has announced a feature coming later this year that demonstrates an amazingly positive use for a virtual version of your own voice.

Previewed as part of an announcement featuring a range of updates to Apple’s Accessibility options, “Live Speech” and “Personal Voice” are going to be game changers in this space.

Globally, millions of people are unable to speak or have lost their speech over time.

Live Speech will allow users to type phrases to be heard on a FaceTime call, a phone call, or even an in-person conversation.

You can even save commonly used phrases so you can more quickly chime in on a fast-paced conversation.

But it’s the Personal Voice feature that is set to be the most amazing.

For people who are losing their voice, whether temporarily or to a debilitating disease like Motor Neurone Disease (MND), a future iOS update will make it possible to create a "Personal Voice" by reading a randomised set of text prompts, accumulating 15 minutes of audio on your iPhone or iPad.

That audio will then be processed and a Personal Voice created, so the text prompts you type or choose can be read out on loudspeaker, or spoken on a phone or FaceTime call, sounding very much like you.

In the US, MND is known as ALS, and Philip Green — ALS advocate and board member at the Team Gleason nonprofit — sums up the importance perfectly: "At the end of the day, the most important thing is being able to communicate with friends and family.

“If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world — and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary.”

In addition to Live Speech, Apple is adding a "Detection Mode" in Magnifier, which introduces a Point and Speak feature for those who are blind or have low vision.

This feature allows you to point a finger at a button — on a microwave, for example — and iPhone will read out what that button is. Simple, but important.

Assistive Access is a third amazing feature announced today, designed to help those with cognitive disabilities.

Some will say this is like the "simple mode" found on many Android phones, and in some ways it is. But it has been deeply engineered by Apple to suit not only those who are daunted by the complexity of features available on iPhone, but also those who are using a device for independence with the support of a carer or family member.

It offers large, easy-to-see icons, customised interfaces for things like taking photos, and simple contact lists with labels and images. A great initiative.

No date or iOS version number was listed for these updates, but they are coming "later this year".