Public perceptions of AI in healthcare revealed in new study

A recent study by King’s College London found that while there is broad public support for the use of artificial intelligence in healthcare, participants expressed clear concerns about, and limits on, its appropriate application.

Over the last ten years, AI has moved from healthcare research and development into real-world use, with figures suggesting that more than 200 FDA-approved products for radiology alone are already on the market, and more are in the pipeline.

Research into the use of AI in healthcare is growing rapidly, yet data on public perception remain limited in this field, highlighting the need for research into the public’s view of AI in healthcare. To address this, King’s College London surveyed 2,000 visitors to AI: Who’s Looking After Me?, an exhibition at Science Gallery London, about their views on AI in healthcare.

80% of respondents view AI in healthcare as necessary

An anonymous, quantitative questionnaire was given to participants, containing eight questions based on previously validated subject areas designed to assess respondents’ views on the use of AI in healthcare. 

The researchers found that 80% of people said AI should be used in medicine, while just over half (56%) felt it would be safe. However, the participants were wary about trusting AI with significant decisions, with more than 70% rejecting the idea that AI could take over doctors’ roles. Even if AI made fewer mistakes, respondents were uncomfortable letting it act alone. “Most people would not be happy for AI to make decisions without considering their feelings”, the researchers noted.

Public acceptance is essential to the use of AI in healthcare, especially in countries where healthcare is funded through taxation, as in the United Kingdom.

The study had several limitations worth noting. The survey was administered publicly without supervision, so the responses were not quality-controlled. Respondents also skewed young, a group that is generally more accepting of technology and less likely to engage with healthcare regularly.

Bridging the research gap between public perception and AI use

Previous studies have predominantly concentrated on either physician or patient perspectives, often overlooking the broader societal viewpoint. In contrast, the researchers in this study sought to address this gap by examining public perception, offering a more comprehensive understanding of how these issues are viewed beyond the clinical environment.

The researchers also found that older respondents (aged 50 and over) were more likely to consider AI safe (62%) than younger participants (55%). Gender differences were also evident: 88% of men supported AI’s implementation in healthcare systems, compared with 77% of women.

The findings align with existing research on the attitudes of healthcare professionals, specifically radiologists, who generally view AI in healthcare as a complementary tool, not a replacement. The study highlights two critical factors for AI adoption in healthcare: public consent and transparent communication. Trust in AI depends on explaining how these technologies operate and ensuring that human oversight remains central to clinical decision-making.

“As a radiologist and researcher, I read a lot about how AI will replace me. This research shows that the general public, like patients and physicians, are comfortable with AI as a tool, but not as an autonomous decision-maker. This perspective is vital in informing how we implement AI programmes, keeping patients and future healthcare users (i.e. the public) not only informed but at the heart of what we do,” commented Dr Carolyn Horst, NIHR Clinical Lecturer & Radiology Registrar at King’s and GSTT.
