To the Editor: I am a psychiatric survivor. Since moving to Vermont over 12 years ago, I have been doing peer support, education, and advocacy across the state. Recently I learned that Health Care and Rehabilitation Services (HCRS), the largest provider of mental health services in Windham and Windsor counties, has signed a contract with Eleos Health, an Israeli start-up, to provide artificial intelligence (AI) note-taking software in its clinical programs.
I have serious concerns about this action.
Throughout my decades of involvement supporting other psych survivors and service users, one theme has emerged again and again, touching not only our civil and legal rights but also our relationships with providers and with the mental health system as a whole: informed consent.
Lack of informed consent does real harm. It breaks trust and deepens our fears of mistreatment in the system. It reproduces the conditions in which many of us have previously been traumatized: a lack of power and autonomy over our own bodies, our own minds, and our own privacy.
HCRS insists that the consent to treatment obtained during intake covers the use of this technology and that further consent is unnecessary. Clients are technically allowed to ask questions and even opt out, but clinicians are being given training and talking points designed to achieve compliance.
In my own lived experience, I have had to fight for access to complete and accurate information about psych drugs, diagnosis, and many other aspects of treatment in order to make decisions in my own best interest.
In conversations with other psych survivors and service users, I've heard folks ask, "Who trained the AI, and who was it trained on? What type of biases might be trained into the system? Who profits from our healthcare agencies implementing this technology?"
To meaningfully consent to use of Eleos's AI, clients need to know that a founder and COO of Eleos Health boasts that his experience as a drone pilot in the Israeli Air Force informs his approach to the delivery of mental health care. That the bulk of the company's leaders are active members of a military force that has targeted healthcare workers in Gaza and all but destroyed the region's healthcare system. That this same military used AI to determine which families' homes to bomb. That it detonated pagers in Lebanon, some of which were in the hands of children.
For decades, psychiatry has labeled patients as "paranoid" while itself using all manner of surveillance, from secret tape recorders to security cameras. I'm alarmed by the rollout of this controversial technology and concerned that clients' reasonable fears that their most private and vulnerable conversations could be leaked will be dismissed and pathologized.
I'm asking HCRS to reconsider this contract and to seek solutions to providers' note-taking woes that address the root causes and do not harm those it has taken on the responsibility to care for.
Calvin Moen
Dummerston, Oct. 15