By Stephen Mills PwC Director Data & Analytics North
As we draw closer to our #AIHealth Hackathon (30th November – 1st December) at PwC’s new Manchester office, it is important to consider the implications of automating intelligence and what that means for patient care. In this blog in our #AIHealth series, we consider the ethical implications of AI, to ensure that anything we bring into our organisations not only draws on the benefits of this type of automation, but does so responsibly. This is something we at PwC refer to as Responsible AI.
At the heart of Responsible AI in healthcare must be the recognition that we are often dealing with extremely sensitive topics for the patient, something that has always required a good bedside manner from the clinician. How can AI replicate this bedside manner and ensure the patient receives something akin to ‘the human touch’? Should AI even be involved in direct patient interactions?
Let us consider some of the scenarios where the use of AI in healthcare raises ethical questions:
Even a handful of scenarios shows the significant implications of AI making decisions: outcomes could be adverse for patients, or the decisions could embed biases that would not be tolerated in human society.
Alder Hey encountered such ethical challenges during their AI implementation. It was important to ensure that automated intelligence interacting with children operated within a safe environment, that the messages delivered were right for a young audience, and that the language and tone were appropriate.
Whilst the benefits of AI are clear, for any prototype created at the Hackathon and any subsequent implementation we need to bring these ethical questions to bear, to ensure we are doing the right thing by our patients and that people’s human rights are considered in any AI solution we bring into our organisations.
We look forward to welcoming organisations to the Hackathon at the end of November, and will continue our blog series with a post on the practical capabilities required to implement AI solutions. Stay tuned!
If you would like to see our opening blog in the series you can find it here.
Any Trusts or organisations wishing to discuss ‘AI in Health’ should contact:
Stephen Mills, PwC Director Data & Analytics North
Mobile: 07966 265 804