
Utah is allowing an AI system to prescribe psychiatric drugs without a doctor. It’s only the second time the state — and the country — has delegated this kind of clinical authority to AI. State officials say it could bring costs down and ease care shortages, but physicians warn the system is opaque, risky, and unlikely to expand mental health care to those who need it.
The one-year pilot, announced last week, will allow Legion Health’s AI chatbot to renew prescriptions for certain psychiatric medications. The San Francisco startup promises Utah-based patients “fast, simple refills” through a $19-a-month subscription. The program is slated to begin in April, though for now the company is only operating a waitlist.
The program is deliberately narrow in scope, limited both in the medications it covers and in the conditions patients must meet to qualify. According to Legion’s agreement with Utah’s Office of Artificial Intelligence Policy, the chatbot can renew only 15 lower-risk maintenance medications that have already been prescribed by a clinician. That includes fluoxetine (Prozac), sertraline (Zoloft), bupropion (Wellbutrin), mirtazapine, and hydroxyzine, commonly used to treat anxiety and depression. Patients must also be considered stable: Anyone with a recent dose or medication change, or a psychiatric hospitalization in the last year, is excluded, and patients must check in with a healthcare provider every 10 refills or after six months, whichever comes first.
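Legion hasn’t published how these gates are implemented, but as a rough sketch, the rules described in the agreement could reduce to a check like the following. Everything here is hypothetical: the function and field names are invented, the medication list is partial, and the 90-day window for a “recent” change is an assumption, since the agreement’s definition isn’t public.

```python
from datetime import date, timedelta

# Hypothetical sketch of the pilot's eligibility gates as described in
# Legion's agreement with Utah; not Legion's actual implementation.

APPROVED_MEDICATIONS = {  # 15 lower-risk maintenance drugs; partial list
    "fluoxetine", "sertraline", "bupropion", "mirtazapine", "hydroxyzine",
}

def is_eligible_for_ai_refill(
    medication: str,
    last_dose_change: date,
    last_hospitalization: date | None,
    refills_since_clinician_visit: int,
    months_since_clinician_visit: int,
) -> bool:
    """Return True only if the request stays inside the pilot's low-risk box."""
    if medication.lower() not in APPROVED_MEDICATIONS:
        return False  # new prescriptions and excluded drugs go to a clinician
    # "Recent dose or medication change" excludes the patient; the 90-day
    # window here is an assumption, not a published figure
    if date.today() - last_dose_change < timedelta(days=90):
        return False
    # Psychiatric hospitalization in the last year excludes the patient
    if last_hospitalization and date.today() - last_hospitalization < timedelta(days=365):
        return False
    # Mandatory human check-in: every 10 refills or six months, whichever first
    if refills_since_clinician_visit >= 10 or months_since_clinician_visit >= 6:
        return False
    return True
```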
The system cannot issue new prescriptions or handle medications that require closer clinical oversight, including drugs that need blood-test monitoring. Controlled substances are also barred, ruling out many ADHD medications. The exclusion of benzodiazepines, used for anxiety; antipsychotics, used for conditions like schizophrenia and bipolar disorder; and lithium — widely considered the gold-standard treatment for bipolar disorder — leaves many more complex psychiatric cases outside the pilot’s scope.
To use the system, patients must opt in, verify their identity, and prove they already have a prescription, such as by providing a photo of the label or pill bottle. They are then asked about their symptoms and about the medication’s side effects and efficacy, along with screening questions about suicidal thoughts, self-harm, severe reactions, and pregnancy that are meant to log red flags. If any answer falls outside the pilot’s low-risk criteria, the case is supposed to be escalated to a clinician before any refill is issued. Patients and pharmacists can also request human review.
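Purely as illustration, the escalation logic described above might look something like this. The question wording, data shapes, and names are hypothetical, not Legion’s; the key property is that the flow fails closed, so a missing or ambiguous answer routes to a human rather than to an automatic refill.

```python
from dataclasses import dataclass

# Hypothetical sketch of the screening flow described above: red-flag answers
# route the request to a human clinician instead of an automatic refill.
# Question wording and field names are invented for illustration.

RED_FLAG_QUESTIONS = {
    "suicidal_thoughts": "Have you had thoughts of suicide or self-harm?",
    "severe_reaction": "Have you had a severe reaction to this medication?",
    "pregnancy": "Are you pregnant or trying to become pregnant?",
}

@dataclass
class RefillRequest:
    identity_verified: bool
    prescription_proof: bool       # e.g., a photo of the label or pill bottle
    answers: dict[str, bool]       # patient's yes/no answers to the screeners
    human_review_requested: bool   # patients and pharmacists can always ask

def triage(request: RefillRequest) -> str:
    """Return 'escalate' or 'refill'; anything ambiguous goes to a clinician."""
    if not (request.identity_verified and request.prescription_proof):
        return "escalate"
    if request.human_review_requested:
        return "escalate"
    # Any red-flag answer falls outside the pilot's low-risk criteria;
    # an unanswered question is treated as a flag (fail closed)
    if any(request.answers.get(q, True) for q in RED_FLAG_QUESTIONS):
        return "escalate"
    return "refill"
```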
“By safely automating the renewal process for maintenance medications, we are allowing patients to get the care they need much more quickly and affordably,” state officials said when announcing the pilot. Over time, they said, the program could free healthcare providers to “focus their time on more complex, higher-risk patient needs” and help address shortages that have left 500,000 Utah residents without access to mental health care. Legion cofounder and CEO Yash Patel has cast the program in even grander terms, describing it as a global first that will dramatically expand access to healthcare and mark “the beginning of something much bigger than refills.”
Psychiatrists are less convinced. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, told The Verge he thinks the “advantages of an AI-based refill system may be overstated.” He suspects the tool “will not increase access for those who are most in need of care.” The target patient would already have to be on a treatment plan with their psychiatrist to use the service.
Kious suggests the automation could contribute to what he called an “epidemic of over-treatment” in psychiatry, with some patients staying on medication longer than they need to. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and professor of psychiatry at Harvard Medical School, raised a related concern, noting that some people benefit from staying on psychiatric medications long-term, while others may benefit from reducing or stopping them. “They require more active management, changes, and careful consideration,” he said. That’s harder to do if you’re outsourcing refill check-ins to a chatbot.
A bigger worry is whether a chatbot can safely automate even the most routine parts of psychiatric care. Torous said prescribing involves more than just checking for drug interactions, and questioned whether any AI system today “can understand the unique context and factors that go into a person’s medication plan.” Kious made a similar point: “This is something that could be safe in principle, but it all depends on the details.” Those concerns are compounded by how new these systems are — and how opaque they remain to outsiders. “It feels a bit like alchemy right now,” he said. “It would be better if there were greater transparency, more science, and more rigorous testing before people are asked to use this.”
There are more immediate safety concerns, too. Kious said the chatbot could miss something during screening: It may not ask the right questions, a patient may not recognize a side effect, or they may answer inaccurately. Some may simply tell the system what it wants to hear in order to speed up care. He stressed that this is not unique to chatbots; much of psychiatry relies on self-report. But human clinicians usually have access to other information as well, he said, adding that when he sees patients, he pays attention not just to what they say, but also to what they do not say and how they present themselves. And while patients can also mislead human providers, Kious said a chatbot system may make it easier for patients to adjust their answers until they produce the desired outcome.
Torous said there are more overt safety risks as well, which will be familiar to anyone following how chatbots fare in the real world. Legion’s chatbot is Utah’s second experiment with AI prescribing, joining an ongoing, broader pilot focused on primary care with Doctronic that launched last December. Within weeks of going live, security researchers had managed to push Doctronic’s system into spreading vaccine conspiracy theories, generating instructions for cooking meth, and tripling a patient’s opioid dosage. State officials say the more focused program with Legion is designed specifically to target “the state’s mental health shortage.”
Legion says the pilot is operating under tight guardrails. In addition to what it calls “conservative eligibility gates,” its agreement with Utah requires it to provide detailed monthly reports and have the first 1,250 requests closely reviewed by human physicians, with periodic sampling of around 5 to 10 percent of requests thereafter.
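That review cadence is simple enough to state directly. A minimal sketch, assuming random sampling at roughly the stated rate (the function name and the 7.5 percent default are invented; the 1,250-request threshold comes from the agreement):

```python
import random

def needs_physician_review(request_index: int, sample_rate: float = 0.075) -> bool:
    """First 1,250 requests are always reviewed; afterward, sample ~5-10%."""
    if request_index < 1250:
        return True
    return random.random() < sample_rate  # periodic sampling thereafter
```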
Legion cofounder and president Arthur MacWaters told The Verge that “risks exist in any remote care model, whether AI-assisted or fully human-led” and stressed the company’s “workflow does not rely on a single self-reported answer to unlock treatment.” He said key safeguards include the pilot’s narrow limits on medications and patient eligibility, built-in AI safety screens, pharmacist involvement, and the ability to escalate to a clinician. “We see this as critical to expand access to hundreds of thousands of people in Utah who live in mental health shortage areas, as well as an important proving ground for AI in medicine.”
MacWaters would not comment on additional use cases, medications, or expansions to other states, but said the firm is “excited for what the future holds.” He would not offer a timeline on Legion’s expansion plans either, though both MacWaters and Legion have publicly signaled broader ambitions beyond Utah: Legion’s refill site says the service will be available “nationwide 2026” and MacWaters has suggested it “will be in every state very very quickly.”
For the psychiatrists I spoke to, it all seems to raise a rather basic question: What problem is Legion really solving? Established patients often don’t even need an appointment to get a refill, Kious said, explaining that most psychiatrists are probably “happy to refill prescriptions for free and without an appointment” unless they are worried about the patient or the medication carries a meaningful risk. Those are the very cases Legion’s AI is barred from handling.
“I would personally avoid it for now,” Torous said, adding that if you’ve found a good treatment plan that works for you, it’s probably best to stick with that clinician.






