
On Social Suicide Prevention, Don’t Let the Perfect be the Enemy of the Good

In a piece in The Guardian and a forthcoming article in the Yale Journal of Law and Technology, Bill of Health contributor Mason Marks recently argued that Facebook’s suicide prediction algorithm is dangerous and ought to be subject to rigorous regulation and transparency requirements. Some of his suggestions (in particular his calls for more data, and those that are really about how we treat potentially suicidal people rather than how we identify them) are powerful and unobjectionable.

But Marks’s core argument, that Facebook’s suicide prediction algorithm is morally problematic unless it is subject to the regulatory regime of medicine and operated on an opt-in basis, is misguided and alarmist.


Artificial Intelligence for Suicide Prediction

Suicide is a global problem, causing 800,000 deaths per year worldwide. In the United States, suicide rates have risen by 25 percent over the past two decades, and suicide now kills 45,000 Americans each year, more than either auto accidents or homicides.

Traditional methods of predicting suicide, such as questionnaires administered by doctors, are notoriously inaccurate. Hoping to save lives by predicting suicide more accurately, hospitals, governments, and internet companies are developing artificial intelligence (AI)-based prediction tools. This essay analyzes the under-explored risks these systems pose to safety, privacy, and autonomy.

Two parallel tracks of AI-based suicide prediction have emerged.

The first, which I call “medical suicide prediction,” uses AI to analyze patient records. Medical suicide prediction is not yet widely used, aside from one program at the Department of Veterans Affairs (VA). Because medical suicide prediction occurs within the healthcare context, it is subject to federal laws, such as HIPAA, which protects the privacy and security of patient information, and the Federal Common Rule, which protects human research subjects.

My focus here is on the second track of AI-based suicide prediction, which I call “social suicide prediction.” Social suicide prediction is essentially unregulated; it uses behavioral data mined from consumers’ digital interactions. The companies involved, which include large internet platforms such as Facebook and Twitter, are generally not subject to HIPAA’s privacy regulations, principles of medical ethics, or rules governing research on human subjects.


Taking action to prevent male suicide

By John Tingle

The issue of male suicide and its prevention seems to have been obscured, or perhaps even forgotten, in reports discussing the care of vulnerable people. The UK media have recently focussed on this issue with the Project Eighty-Four campaign, which aims to raise awareness of male suicide by placing sculptures on top of a London tower block. The sculptures are on top of ITV’s (Independent Television) buildings on London’s Southbank Promenade from 26th March 2018 and are designed to get people talking about the issue. Friends and families of the deceased men helped create them: “Each one, a poignant reminder of a real life lost and a call to society to come together and ultimately take a stand against male suicide.”

BBC News has also covered the event. Project Eighty-Four states that the statistics on male suicide are shocking: every two hours a man in the UK takes his own life. Project Eighty-Four is an initiative of the charity CALM (Campaign Against Living Miserably). CALM is dedicated to preventing male suicide, and it says that male suicide and mental health are big issues that cannot be ignored any longer.

Interestingly, CALM reports in its latest annual report and accounts a modest but noticeable increase in the number of female callers seeking help and advice. CALM’s focus is on men because of the high rate of male suicides. Helpline workers helped to directly prevent 409 suicides in 2016-17, up 19% on the previous year.

“Ex-Gay” Speaker, Attempted Suicide, and HCSMs

On February 16, Jackie Hill-Perry, an outspoken speaker against homosexuality, delivered a controversial, unapologetically homophobic speech at Harvard’s Emerson Hall. Harvard College Faith and Action, the religious student group that invited Hill-Perry, reserved all the center-front seats for attendees “engaged in protest,” who were “welcomed” to its space of worship. This seemingly beneficent seating arrangement, however, meant that many protestors wearing rainbow flags experienced 30 minutes of worship songs with references to sin and redemption before having a close encounter with Hill-Perry. The emphatic speaker then recounted her own journey from initially accepting her same-sex attraction to her eventual embrace of heteronormativity due to her rediscovered Christian faith. A few protestors stormed out of the lecture hall at the height of her speech, when she called on same-sex attracted Christians to practice “self-denial,” the same way a Christian would deny lying, stealing, and other grave “sins.”

As undergraduate and graduate students at Harvard, we are fortunate to have access to resources that may help us deal with and recover from the detrimental effects of a hate-filled speech like this. Though far from perfect, we do have at least limited access to mental health services and other support groups on campus. Intellectually, we have academic resources that can dispute the religious reasoning behind homophobia. In his opening question for Hill-Perry, Professor Jonathan Walton of the Memorial Church quickly challenged the flawed theology she relied on, revealing the parallels between biblically justified racism and biblically justified homophobia. Some students in the audience also pointed out several logical missteps in her reasoning, which led Hill-Perry to exclaim how “smart” people at Harvard are. Perhaps she wasn’t used to speaking to a highly academic audience during her tours. Nonetheless, many non-protesting members of the audience, presumably members of the Harvard Christian group, did nod and clap during her speech. If her remarks could resonate with these Harvard students, how much more persuasive would they be at Christian conferences and churches? Who could stand up for LGBT people, especially the youth, in evangelical communities?

It has long been demonstrated that LGBT youths have much higher rates of suicide and attempted suicide compared to their heterosexual counterparts, both in the United States and abroad. They are also significantly more likely to suffer from mental health issues ranging from depression to self-harm. Moreover, those living in evangelical families or communities where homophobia is still prevalent are especially vulnerable. Listening to a speech like the one delivered by Hill-Perry may worsen their daily struggles and increase their risk of suicide. Given these health risks to LGBT youths, we might expect that evangelical leaders who “love the sinner but hate the sin” would at least care about the health and safety of these minors, or simply respect their dignity as human beings. The reality, however, could be far gloomier, falling short of these minimum expectations. The rest of the essay will turn the discussion toward how LGBT youths might be treated under the practices of Christian health care sharing ministries (HCSMs).


Medical Malpractice and the Middle-Ground Fallacy: Should Victims’ Families Recover Compensation for Emotional Harm?

By Alex Stein

Medical malpractice victims are generally entitled to recover compensation for emotional harm they endure: see, e.g., Alexander v. Scheid, 726 N.E.2d 272, 283–84 (Ind. 2000). But what about a victim’s close family member? Take a person who suffers emotional distress from witnessing the medical mistreatment and consequent injury or demise of a loved one. Should the court obligate the negligent physician or hospital to compensate that person for her emotional harm?

This question has no uniform answer under our medical malpractice laws. Some states allow victims’ families to recover compensation for their emotional harm, while others do not. Three weeks ago, the Connecticut Supreme Court struck a middle ground between these two extremes. Squeo v. Norwalk Hosp. Ass’n, 316 Conn. 558, — A.3d —- (Conn. 2015).

“Proximate Cause” and the Patient Suicide Problem

By Alex Stein

This difficult problem and the underlying human tragedy have recently been adjudicated by the Supreme Court of Mississippi in Truddle v. Baptist Memorial Hosp.-Desoto, Inc., — So.3d —- (Miss. 2014).

A hospital patient suffering from a number of illnesses became agitated and aggressive. He took the IV out of his arm and attempted to leave the hospital. When nurses stopped him and forced him back to his room, he hallucinated that someone was trying to rape him. Despite these psychiatric symptoms, the patient was discharged and treated as an outpatient. During his outpatient treatment, he complained to his doctor that the medications he was taking “make him crazy.” Six days after his release from the hospital and two days after his last outpatient appointment, the patient barricaded himself in his bedroom and committed suicide.