Visiting the doctor in the age of AI

May 31, 2023
Image: Doctor AI – artificial intelligence in modern medical technology.

On 8 May, Four Corners (Artificial Intelligence Rising: the new reality of artificial life) portrayed an isolated man’s relationship with a robotic woman and a sex doll, and, in another scenario, artificial memories generated so that family members could communicate with the long dead – weird stuff.

AI is not new to the work of a doctor, nor to our daily lives, but will future visits to the GP – or to community health services – be changed for the better? So important are these issues that the New England Journal of Medicine (NEJM) has started a new journal devoted solely to AI.

It was thought 40 to 50 years ago that the brave new world of computing would replace the intellectual functions of the physician. A chain of algorithms could link pattern recognition to “illness scripts” and a diagnosis would result. But much machine learning (AI in another guise) was developed for commercial organisations and turned out to be a poor fit to clinical problems. Interest then waned. Now AI has gathered hectic momentum.

AI already has a strong place in medicine; ECGs can be read, differential blood counts calculated, retinae photographed, and skin lesions analysed. Some of these were already in use 50 years ago. AI is also used in imaging, detecting drug interactions and in recording medical notes.

Current electronic medical records are clunky, cluttered, and have inadequate representations of the trajectory of illness. There is much duplication. The average electronic health record in the US has 50% as many words as Shakespeare’s longest play, Hamlet. There is potential to improve and rationalise health care records.

Computers can generate data from an amazing array of sources – social networks, blogs, product reviews, websites, documents, wearable and environmental sensors, and much more – and, with huge added storage capacity, information can be found at a level previously unknowable. With this capability, clinical research could be sped up by efficient recruiting and matching to research criteria. But the issues researched should be clinically meaningful and scalable to populations of patients.

The digital inputs which bombard GPs and specialists could be prioritised, allowing more attention to the needs of patients. And AI could trigger prompts and indicators in treatment and increase diagnostic precision.

The stakes are high. Against such potential stand concerns about the utility and safety of AI in healthcare, which will require the same rigour in testing as is demanded for other advances in medicine.

AI in medicine

The NEJM has published examples of AI in medicine. Using GPT-4 (created by OpenAI), the physician starts a “session” with a “prompt” in natural language, and an immediate “response” in natural language follows. When asked a question, the chatbot gives an accurate answer if there is, in fact, a firm answer. When there are alternative possible answers, the chatbot is less reliable. When asked “What is metformin?”, the chatbot correctly describes metformin’s use in diabetes. When asked whether all people with diabetes can take metformin, the answers are correct. And GPT-4 scores well when answering multiple-choice exam questions.

In a real-life doctor-patient interview using a “smart speaker”, the GPT-4 output provides a cogent record of the interview. When asked to correct the record, an accurate edited version is created. And when asked for advice on a straightforward case, AI produces an appropriate plan of management. But, as with all medical records, records generated by AI need to be carefully checked.

Clinical skills

Generations of medical students have learnt real-life medicine through clerking hospital patients. They were taught to recognise patterns and to apply rules of pathophysiology and behaviour to find the best fit to the patient’s symptoms and signs. However, there is more to evaluating illness and injury than seeing and listening (visible signs and symptoms); it involves touch, feelings, smell, movement, emotion, empathy, pain, and distress.

The skills of a clinician are gained by being exposed over time to a range of health problems, in different contexts, and through tacit learning. In high-performing clinicians, intuition develops from memorised patterns of association – not only patterns of knowledge, but patterns of action and a sense of acuity. That alone is not sufficient: preliminary diagnoses and actions must be checked by reflection and by testing hypotheses, especially when there is complexity.

Not only do these skills define the clinician’s task; the ability to communicate with patients and to work with other members of the healthcare team is a key part of the package. And something computers will never do is take into account the views and observations of others – the mother of the child, the carer of the disabled person, the spouse of the demented patient.

Warnings

When Geoffrey Hinton, a pioneer of neural networks, announced he was leaving Google and warned of the dangers of uncontrolled AI, he was joined by other key researchers and executives – Sam Altman, CEO of OpenAI, Tom Gruber, co-creator of Siri, Professor Toby Walsh from UNSW, and many more. More than 1,000 AI leaders signed an open letter calling for a six-month moratorium. Later, leaders of the Association for the Advancement of Artificial Intelligence released a letter warning of the risks. On 28 May, the AMA urged caution and national regulation to control AI in health care.

The neural networks which Hinton and his colleagues began developing in 1972 could analyse thousands of images and identify common objects. He is now concerned about the proliferation of false information, but his special concern is the use of AI in warfare – “robot soldiers”.

For medicine, artificial intelligence will enable new discoveries and improve processes across the continuum of healthcare; it will also raise questions of ethics and governance, and will require regulatory consideration in these domains.

But for each of us, the intimacies and nuances of illness cannot be emulated simply by technology; we will always need clinicians whose skills have been learnt tacitly through their privileged access to the human suffering of others.
