ARTIFICIAL INTELLIGENCE

People are more honest with AI than their doctors, new Aide Health report finds

Digital health company reveals how “adaptive” AI could help rebuild trust between patients and healthcare

People are more likely to open up about their health to artificial intelligence (AI) than to their doctors, according to a new white paper from UK digital health company Aide Health.

The report, “Building Patient Trust in AI,” reveals a striking contradiction at the heart of modern healthcare: while most people still feel uneasy about AI, many are more honest with it than with clinicians.

In NHS primary care, Aide Health’s conversational platform found that 26% of asthma patients admitted to not taking their medication as prescribed – a figure far higher than typically reported in face-to-face consultations. The finding suggests that well-designed AI can uncover hidden health risks often missed in busy clinics. 

Key Findings

  • People are surprisingly open with AI. Patients disclosed sensitive information – including medication non-adherence – far more often to AI systems than to clinicians.
  • Trust is complicated. While 48% of people are comfortable with AI identifying health risks through wearables, only 35% believe it improves care quality.
  • Culture and experience matter. Levels of trust vary widely across communities, reflecting broader health inequalities rather than differences in AI performance.
  • Design makes the difference. When AI systems feel non-judgemental, people report less fear of being criticised and are more willing to tell the truth about their health.

The findings are based on data from Aide Health’s NHS programmes, alongside analysis of peer-reviewed studies.

Ian Wharton, Founder and CEO of Aide Health and author of the report, said:
“Healthcare AI sits at the crossroads of trust and empathy. People trust technology to listen without judgment, but they still want to feel understood. Our research shows we can design systems that do both – helping patients speak honestly while supporting clinicians to respond with compassion.” 

Teaching AI when to listen and when to care

To solve what Aide Health calls the “trust paradox”, the report introduces a new design principle called adaptive neutrality – teaching AI when to be neutral and when to show empathy.

  • In high-neutrality moments – such as medication tracking or lifestyle monitoring – patients often prefer the calm, private space that AI provides.
  • In low-neutrality moments – like diagnosis or treatment decisions – human empathy and connection remain essential.

Wharton added: “AI should never replace a doctor. But when it’s designed to listen, guide and adapt, it can make every conversation between patient and clinician more meaningful.”

Why this matters

Despite rapid growth in AI investment – projected to reach $148.4 billion by 2029 – 60% of patients remain uneasy about its role in healthcare. With the NHS workforce gap expected to reach 360,000 by 2037 and nearly half of U.S. doctors reporting burnout, Aide Health argues that trustworthy AI could ease pressure on overstretched health systems, supporting both patients and clinicians.

The paper also calls for clear standards around transparency, fairness and privacy in AI design:

  • Patients must understand how their data is used and how it benefits them.
  • Systems should be tested across diverse populations to ensure fairness.
  • Interfaces must be simple, inclusive and accessible to all.

“People often reveal more to AI than they expect, and less to clinicians than they intend,” said Wharton. “If we can design technology that respects that honesty, we can make care safer, more personalised and more human.”

Aide Health is already applying its trust-centred design approach in new technologies. The company has just announced Mirror – the UK’s first AI-powered “medical memory” designed entirely for patients.

The app discreetly listens during healthcare consultations and produces a plain-English summary that patients can revisit at any time, ensuring that vital medical advice isn’t forgotten the moment they leave the room. Research shows up to 80% of medical advice is forgotten immediately, and nearly half of what’s remembered is inaccurate – confusion that costs the NHS an estimated £1 billion a year.

