Ben* sat across from me, explaining how his low motivation, lethargy and trouble sleeping seemed like depression, based on content he had seen online. I recommended he get his bloodwork done with his GP, who found that Ben was low in vitamin D and iron, which can mimic depressive symptoms. Under the care of his GP, Ben’s symptoms quickly resolved without requiring further psychological intervention.
Thuy* made an appointment with me, armed with information and old school and university records after her colleague was diagnosed with attention deficit hyperactivity disorder. After going through the assessment process, I diagnosed her with inattentive ADHD, a commonly underdiagnosed condition among women and girls. Thuy was relieved and felt as though her life finally made sense to her, after years of assuming she was “just lazy”.
In my clinical work, a new ritual has become commonplace. Clients no longer just describe their symptoms; they often arrive with printouts, screenshots of dense articles, AI chatbot answers and the phrase “I’ve done my research”.
Make no mistake: I am fully supportive of people trying to make sense of their mental health symptoms, and too often, when there are comorbid physical and mental health issues, people have been turned away from health professionals without the care and support they need. Often, like Thuy, people are correct in their hypotheses. Just as often, like Ben, they are not.
What can follow this self-directed research is half-understood statistics, cherry-picked case studies, viral social media threads and anecdotes masquerading as legitimate data. I’ve seen anxiety spiral from misreading a side-effect profile and depressive withdrawal justified by a misinterpreted, dangerously low-quality study.
Client-led research is empowered by the internet’s vast library yet unaided by guidance about how to interpret the information. We are witnessing the rise of the amateur health expert, a well-intentioned but at times costly role. Taking an active interest in your health is positive but the democratisation of information, without the concurrent democratisation of critical research skills, has created a perfect storm for misinformation.
We are drowning in data but missing vital how-to knowledge. The consequence is individual confusion and a collective erosion of trust in the scientific process, fuelled by cognitive biases that run rampant in online echo chambers. Confirmation bias, for example, leads us to seize upon the one outlier study that confirms our fears. The Dunning-Kruger effect allows a few hours on YouTube to foster an illusion of expertise that dismisses experts who have decades of clinical training.
For many, research has become synonymous with reading or searching online. For scientists, reading is merely the first step in a gruelling process. True research involves designing a question that can be tested, selecting an appropriate methodology, navigating ethical reviews, collecting and analysing data, and subjecting every assumption to peer scrutiny. Academia’s barriers, which include paywalls, jargon and complex statistics, reflect this specialised, rigorous work. I believe a public health campaign to increase data literacy would be helpful.
To navigate the research landscape, people must first understand the hierarchy of evidence. Not all information is created equal. At the top of the evidence hierarchy are systematic reviews and meta-analyses, which synthesise all available randomised controlled trials (RCTs) on a topic, offering the highest certainty. Next come RCTs, considered the gold standard for intervention studies. Descending the pyramid, we find cohort studies, case series and, finally, anecdotal evidence: the personal testimonies and “I know someone who …” stories that, while powerful, prove nothing about general efficacy or safety. A viral Instagram reel is anecdote; a meta-analysis of 50 RCTs is evidence. Confusing the two is a critical error.
How can you become a smarter consumer of health information, rather than a casualty of it? When you encounter a claim or a miracle cure that sounds too good to be true, pause and interrogate the source with these questions.
What is the study design? Is it a controlled trial or a single-case report? Locate it on the evidence hierarchy.
Who was studied? Did the research include people like yourself in age, gender, health status or ethnicity? A study on 20-year-old athletes may not apply to a 60-year-old with a chronic condition.
Who is behind it? Check the funding source and author affiliations. Is it published in a reputable, peer-reviewed journal? Be warned: the peer-review system itself is under assault from AI-generated “slop papers” – fake studies churned out to pad academic CVs – making vigilance more important than ever.
What are the numbers? How many participants were involved? Are the results statistically significant and do the authors openly discuss the study’s limitations?
What is the consensus? Is this a lone finding or does it align with the broader body of evidence? What do other independent experts in the field say?
This critical lens is your best defence. It allows you to distinguish between a robust clinical guideline and a compellingly packaged story.
The most vital tip is to turn to the experts you are trying to emulate. Your research should be a prelude to a conversation, not a replacement for one. Browsing online and arriving at a conclusion is not the same as testing your convictions. A qualified professional is trained in the process of questioning, weighing conflicting evidence and applying population-level data to your unique, individual context. This is absolutely not to say that experts are never wrong or that science is infallible.
Science is evolving because it is being tested and retested, and knowledge is being built upon. Medical misogyny, racism and classism still exist in spades, and these too must be attended to immediately to restore public faith in institutions and experts.
In our collective quest for agency over our health, we must not mistake information for understanding, or confidence for competence. An important act of self-care in the digital age may not be finding the answer yourself but developing the wisdom to know who to ask.
*All clients are fictional amalgams