Don’t ask chatbots for mental health help, says NHS chief

You cannot trust AI advice on the issue – and it can even be dangerous, warns expert

Dec 26, 2025 - 07:39
Dr Adrian James says AI chatbots don’t understand the nuances of a serious mental health situation Credit: Justin Grainge

An NHS boss has warned people with mental health problems to avoid using chatbots for support over Christmas.

Dr Adrian James, the national medical director for mental health at NHS England, said using the technology as therapy “can be dangerous”.

A survey published last month showed that more than a third of adults had turned to artificial intelligence for help with their wellbeing.

“As a psychiatrist, I’ve seen an increase in the number of vulnerable patients turning to AI chatbots for mental health support over the last year,” Dr James said.

“Despite AI now being part of everyday life and a fantastic resource when used appropriately, it cannot be relied upon for everything – and in some cases can be dangerous.”

In November, a poll of 2,000 people by the charity Mental Health UK found 37 per cent had used an AI chatbot to support their mental health or wellbeing.

When asked why they had turned to AI, around four in 10 people said it was down to ease of access, while almost a quarter cited long waits for help on the NHS.

‘Potentially dangerous’

Dr James said: “During the festive period, I know Christmas can affect mental health in lots of different ways – whether it be financial pressures or feeling isolated – so it is vital that people know that they can turn to the NHS for help.

“The vast majority of AI chatbots do not have access to your mental health history, cannot fully understand the nuances during a serious mental health situation and can give completely wrong advice, especially when they’re led off their script.

“But my biggest worry is for those users who are at risk of losing touch with reality.

“During an episode of psychosis, people are at higher risk of self-harm and suicide, and chatbots have an in-built preference to agree while lacking the sophistication to pick up on and to challenge problematic thoughts – this could lead to potentially dangerous situations.

“The best support for your mental health comes from a trained healthcare provider, so I would urge anyone concerned to come forward and seek NHS support as soon as possible. You can get urgent support in a crisis by phoning 111.

“If you need support for depression or anxiety, you can refer yourself to an NHS talking therapy service online at nhs.uk or by going to your GP.”

Elsewhere, NHS England said record numbers of people were using the NHS app to manage their health, with more than 39 million registered users.

More than 313,000 people used the app on Christmas Day last year, with over 200 logins every 60 seconds on average.

Jules Hunt, the interim director general for technology, digital and data, said: “Nearly 40 million people in England are now registered with the NHS app and I’d encourage anyone who needs it to log into the app over the festive season to take advantage of the range of features it now offers – from tracking when your prescription is ready to checking the latest health advice.

“As ever, please continue to use A&E and 999 in life-threatening emergencies or use 111 Online and other services through the NHS app for less urgent conditions.”

[Source: Daily Telegraph]