by Candidate A14710
I took a deep breath, and another, preparing myself for the unknown. There was a lump in my throat and a knot in my stomach; thoughts pacing heavily through my brain and fighting to be let out; pain in my heart wanting to be given a warm hug and a band-aid. A friendly-looking woman strode out of a door at the opposite end of the cheerful waiting room, fitted out with multi-coloured chairs and ice lollies, and called my name. I followed her into her dimly lit office and was offered tea, or a blanket if I was cold. It was winter.
“So, what brings you here today? I read your patient survey, and you seem to have been feeling anxious lately.”
I walked out of the session with a sense of validation and a clearer mind, as well as pamphlets on what anxiety is and some meditation techniques.
I also walked out of the session a little bit frustrated for two reasons:
- I felt concerned that my words were inadequate in expressing what was going on in my mind and body. I wanted my counsellor to truly feel and understand what I was feeling.
- I wanted to know a bit more about why I was feeling what I was feeling, and what exactly it was.
In short, I felt like I wanted more answers.
At the Human Neuroimaging Laboratory in Roanoke, Virginia, Pearl Chiu spearheads a project which hopes to allow for clearer answers in the diagnosis and treatment of mental health disorders.
“What we have now just isn’t working,” asserts Chiu. Researchers on this project are highlighting the glaring inconsistencies in the field of mental health, from the lack of a clear and universal scale of diagnosis to the common occurrence of misdiagnosis. In a world where people with bipolar disorder wait an average of 13.2 years for a diagnosis (Bipolar VIC, Royal College), and where psychological ailments are diagnosed solely through a counsellor’s conversation and observation, the hope at HNL is that machine learning can drive a more accurate, data-driven understanding of mental health.
The HNL project seeks to create a far more universal method of diagnosing mental illnesses. Patients participating in the study first complete a clinical survey, then play various behavioural games, and finally sit a test while the algorithm runs and collects the data; weeks later, it produces a report detailing their neural pathways and brain functions in comparison to healthy brains.
“AI, and other advanced statistical techniques, have great potential for clinical application,” says James Cole, a professor in Neuroscience at King’s College London. “The first benefits I see happening will be largely technical, in other words helping sort through the mass of clinical data.” As progressively more data is collected in differing categories of mental health disorder (e.g. bipolar disorder, depression, anxiety), clearer behavioural patterns can be deduced, allowing algorithms to match patients up to these specific patterns in the future.
Breaking boundaries not only in the area of diagnosis itself, Chiu and associates intend to help reduce the stigma that surrounds mental health issues. According to the Mental Health Foundation, one in four people will experience a mental health problem at some point in their lives. According to Diabetes UK, one in ten people are living with Type 2 diabetes. Yet physical maladies are perceived as far more legitimate than mental health disorders, and are thus provided with more funding and clearer avenues of support. Only one tenth of NHS taxpayer money is spent on mental healthcare, and sufferers of mental health disorders are routinely scorned and their problems dismissed.
“Whether an illness affects your heart, your leg or your brain,” says Michelle Obama, “it’s still an illness, and there should be no distinction.” By using data-driven research and logical algorithms to demonstrate that mental health disorders are real, the HNL project aims to prove Obama right and destigmatise mental health.
However, the power and the effectiveness of AI in mental health diagnoses may be limited by its very identity as a machine.
“Just for anything, whether it’s AI for mental health or algorithms in general, music, movies and things, they miss a crucial part in the middle which is that unknown human magic factor,” remarks Chris, 24, a Film Studies graduate of King’s College London. He suggests that perhaps cool logic and data aren’t enough to aid in mental health issues.
“As someone who has used a mental health service before, I know that it’s a lot more complicated than asking a few questions,” reflects Maddy, a study abroad student from Australia attending King’s College London, as she sits and reads in the student courtyard. “It’s about other things like body signals, you know, intonation, really small things, that often require a lot more than simply algorithms.” This raises an ethical question: can a health issue as fundamentally human and vulnerable as mental health truly be gauged accurately by data-crunching and a machine?
Further, the question of bias enters the discourse. If we are trying to eliminate the risk of misdiagnosis and human bias from today’s mental health support system, can we be confident that algorithms are able to do so?
“That’s a really, really, really tough question,” admits Pearl Chiu. She points to the human biases that inevitably manifest themselves in algorithms, which are, ultimately, constructed and written by humans. Doctors’ decisions, in turn, are likely to be influenced by the information AI provides. An entirely unbiased and flawless process may be impossible to achieve; still, current processes can be moulded and fine-tuned towards greater efficacy and accuracy, and efforts are made to keep the entire screening process as blind as possible for participants as well as professionals. Machine learning and clinicians can work in conjunction, continuing to research and learn.
“In my view, it’ll be some years before we see computers making diagnoses,” says Cole. “Even when we do, these will be done to aid clinicians, rather than replace them.”
Thus, a vision for the future is laid out.
In tandem, the expertise of mental health specialists and the cool, inimitable logic of AI are envisioned to merge into a new and improved model of mental health diagnosis: a new era in which mental illness is not stigmatised, and patients are provided with more concrete support and explanations.