Can cell phones help predict suicides?

Special for Infobae from The New York Times.

(Science Times)

CAMBRIDGE, Massachusetts — In March, Katelin Cruz emerged from her latest psychiatric hospitalization with a familiar mix of feelings. On the one hand, she was relieved to leave the ward, where aides had taken away her shoelaces and sometimes followed her into the shower to make sure she didn’t hurt herself.

Yet her life outside was as unstable as ever, she said in an interview, with a pile of unpaid bills and no permanent home. It was easy to slip back into suicidal thoughts. For fragile patients, the weeks after discharge from a psychiatric facility are a notoriously difficult time, with a suicide rate roughly 15 times the national rate, according to one study.

This time, however, Cruz, 29, was released from the hospital as part of a vast research project that attempts to use advances in artificial intelligence to do something that has eluded psychiatrists for centuries: predict who is likely to attempt suicide and when, and then intervene.

On her wrist she wore a Fitbit electronic bracelet, programmed to record her sleep and physical activity. On her cell phone, an application collected data on her mood, her movements and her social interactions. Each device provided a continuous stream of information to a team of researchers on the twelfth floor of the William James Building, which houses Harvard University’s Department of Psychology.

In the field of mental health, few new areas generate as much buzz as machine learning, which uses computer algorithms to better predict human behavior. At the same time, there is growing interest in biosensors that can track a person’s mood in real time, taking into account musical choices, social media posts, facial expression, and vocal expression.

Matthew K. Nock, a Harvard psychologist and one of the nation’s leading suicide researchers, hopes to marry those technologies into a kind of early warning system that could be used when an at-risk patient is discharged from the hospital.

He offers this example of how it might work: A sensor reports that a patient’s sleep is disturbed, she reports a low mood on questionnaires, and GPS shows she isn’t leaving the house. But the accelerometer on her phone shows that she is moving around a lot, suggesting agitation. The algorithm flags the patient. A notification appears on a dashboard. And, at just the right moment, a clinician reaches out with a call or a message.
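To make that logic concrete, here is a minimal, purely hypothetical sketch in Python of what a rule-based version of such a flag might look like. The signal names, thresholds and the flag_patient function are all illustrative assumptions, not a description of Nock’s actual system, which would presumably rely on a learned model rather than hand-set rules.

```python
from dataclasses import dataclass

@dataclass
class DailySignals:
    """One day of signals for a patient. All field names and
    thresholds below are illustrative, not from the study."""
    sleep_hours: float      # from the wearable
    mood_score: float       # self-reported, 0 (worst) to 10 (best)
    left_home: bool         # derived from phone GPS
    accel_activity: float   # phone accelerometer movement index

def flag_patient(day: DailySignals) -> bool:
    """Toy rule-based version of the early-warning logic described
    above: disturbed sleep, low mood and staying home, combined with
    high in-place movement (suggesting agitation), trigger a flag."""
    disturbed_sleep = day.sleep_hours < 5.0
    low_mood = day.mood_score <= 3.0
    homebound = not day.left_home
    agitated = day.accel_activity > 7.0
    return disturbed_sleep and low_mood and homebound and agitated

# A day matching the scenario in the article would raise a flag,
# which a real system might surface as a dashboard notification.
if flag_patient(DailySignals(sleep_hours=3.5, mood_score=2.0,
                             left_home=False, accel_activity=8.2)):
    print("Notify clinician: patient flagged for follow-up")
```

In practice, a learned model would weigh these signals continuously rather than applying fixed cutoffs, but the inputs would be the same kinds of sensor and questionnaire data described here.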

There are many reasons to doubt that an algorithm can achieve that level of accuracy. Suicide is such a rare event, even among those most at risk, that any effort to predict it will produce false positives, leading to interventions for people who may not need them. False negatives could expose clinicians to legal liability.

The algorithms require long-term granular data from large numbers of people, and it is nearly impossible to observe large numbers of people committing suicide. Finally, the data required for such monitoring raises concerns about the invasion of privacy of some of the most vulnerable people in society.

Nock is aware of all those arguments, but has persisted, partly out of sheer frustration. “With all due respect to people who have been doing this work for decades, for a century, we haven’t learned much about how to identify people at risk and how to intervene,” he said. “The suicide rate now is the same as it was literally a hundred years ago. So if we’re honest, we’re not getting any better.”

A data fire hose

On an August afternoon in the William James Building, a lanky data scientist named Adam Bear sat in front of a monitor in Nock’s lab, dressed in baggy shorts and flip-flops, staring at zigzag graphs of a subject’s stress levels over the course of a week.

When moods are represented as data, patterns emerge, and Bear’s job is to look for them. He spent the summer analyzing the days and hours of 571 subjects who, after seeking medical attention for suicidal thoughts, agreed to continuous follow-up for six months. During the period, two committed suicide, and between 50 and 100 attempted it.

In Nock’s opinion, this is the largest reservoir of information ever collected on the daily lives of people experiencing suicidal thoughts.

The team is most interested in the days preceding suicide attempts, which would allow time to intervene. Some signs have already emerged: Although suicidal impulses do not usually change in the period before an attempt, the ability to resist those impulses does seem to decrease. Something simple—sleep deprivation—seems to contribute to this.

Nock has been looking for ways to study these patients since 1994, when he had an experience that had a profound impact on him. During an undergraduate internship in the UK, he was assigned to a locked unit for violent and self-harming patients. There he saw things he had never seen before: patients had cuts up and down their arms. One of them tore out his own eyeball. A young man he befriended, who seemed to be on the mend, was later found in the Thames.

Another surprise came when he started peppering the doctors with questions about treating these patients and realized how little they knew: he remembers being told, “We give them some medicine, we talk to them, and we hope they get better.”

One reason, he concluded, was that it had never been possible to study large numbers of people with suicidal ideation in the same way that we can observe patients with heart disease or tuberculosis. “Psychology hasn’t advanced as far as other sciences because we’ve largely done it wrong,” he explained. “We haven’t gone looking for this important behavior in nature and observed it as it occurs.”

But with the advent of phone apps and wearable sensors, he added, “we have data from many different channels, and increasingly we have the ability to analyze that data, and observe people as they live.” One of the dilemmas of the study design was what to do when participants expressed a strong desire to harm themselves. Nock decided that they should intervene.

Telling the truth to a computer

It was around 9 pm, a few weeks into the six-month study, when the question popped up on Cruz’s cell phone: “Right now, how strong is your desire to kill yourself?”

Without stopping to think, she dragged her finger to the end of the bar: ten. Seconds later, she was asked to choose between two statements: “I’m definitely not going to kill myself today” and “I’m definitely going to kill myself today.” She chose the second.

Fifteen minutes later, her phone rang. It was a member of the research team calling her. The caller dialed 911 and kept Cruz on the line until police knocked on her door; by then, Cruz had passed out. Later, when she regained consciousness, a medical team was rubbing her sternum, a painful procedure used to rouse people after an overdose.

Cruz has a pale, angelic face and curly, dark hair. She was studying nursing when a cascade of mental health crises took her life in another direction. She retains an A student’s interest in science, joking that the ribcage on her T-shirt is “totally anatomical.”

From the start, she was intrigued by the study, dutifully responding six times a day when the apps on her phone asked about her suicidal thoughts. The prompts were intrusive, but also comforting. “I felt like I wasn’t being ignored,” she said. “It takes some weight off me that someone knows how I feel.”

The night of her attempt, she was alone in a hotel room in Concord, Massachusetts. She didn’t have enough money to spend another night there, and her belongings were piled in garbage bags on the floor. She was tired, she said, “of feeling like I had no one and nothing.” In retrospect, Cruz said, she thought the technology’s anonymity and lack of judgment made it easier to ask for help.

“I think it’s almost easier to tell the truth to a computer,” she added.

Last week, as the six-month clinical trial drew to a close, Cruz filled out her final questionnaire with a twinge of sadness. She would miss the dollar she received for each response. And she would miss the feeling of someone watching her, even if she was faceless, at a distance, through a device.

“Honestly, it makes me feel a little safer to know that someone cares enough to read that data every day, you know?” she said. “I’ll be a little sad when it’s over.”

If you are having suicidal thoughts, call or text 988 to reach the Suicide and Crisis Lifeline, or visit SpeakingOfSuicide.com/resources for a list of additional resources.
