Marquette University


Researching the promise, perils and ethical dimensions of technology’s relationship with mental health

Arts & Sciences


From developing peer support apps to investigating social media users' perceptions of mental health and offering ethical leadership about privacy concerns, faculty and students in the Klingler College of Arts and Sciences are finding insights at this intersection.

  • By Claire Curry
  • August 22, 2024
  • 6 min. read

With nearly one in five U.S. adults living with mental illness, communities across the nation are struggling to meet the care and treatment needs of their residents, especially the most vulnerable. Faculty members and students in the Klingler College of Arts and Sciences are exploring the role technology may play in this area. Can it help improve diagnoses, access to care or peer support, and even flag individuals who are experiencing serious mental health crises? Could technology and social media promote openness and reduce the stigma surrounding mental health, or might they trivialize or glamorize serious conditions? Across the college, questions like these are driving vital discussions and research leadership on the ethical dimensions involved when mental health meets technology.

Peer-to-peer mental health support - in an app

One team of researchers is focusing on U.S. military veterans who are known to be at higher risk for mental health disorders. According to the National Institute of Mental Health, more than 1.7 million veterans receive treatment at VA mental health specialty programs, and organizations like Dryhootch, a nonprofit with coffeehouses and resource centers for military veterans in Milwaukee and Madison, are aiding veterans who are struggling by providing peer support.

"Veterans don't often willingly seek out assistance from health care professionals about anything related to mental health and PTSD," says Dr. Praveen Madiraju, professor of computer science. "They feel more at ease and open when talking to their peers. So Dryhootch started that program where veterans can go get a coffee and have that really cool, no-pressure atmosphere where they can sit and chat with other veterans."

"Marquette was brought in to offer technological expertise - to help bridge the gap between trauma and peer mentorship."

Dr. Praveen Madiraju

Leveraging that success, Marquette faculty, working in partnership with Dryhootch, have co-created a telehealth app called BattlePeer that brings the support of peers - akin to the "battle buddies" who had their backs in wartime - directly to veterans on their mobile phones.

"Marquette was brought in to offer technological expertise - to help bridge the gap between trauma and peer mentorship," Madiraju says. Marquette faculty, including Madiraju, and Dr. Iqbal Ahamed, Wehr Professor of Computer Science, collaborated with Dryhootch founder Bob Curry, a combat veteran, and Dr. Zeno Franco, associate professor at the Medical College of Wisconsin and psychologist at the U.S. Department of Veteran Affairs, to develop the app that matches veteran mentors with mentees, sends weekly check-ins to assess mental health and offers private and group chat features.

The team continues to refine the app and plans to expand its use to support first responders and people with cancer and other chronic diseases. "Sometimes you just need a bridge through whatever you're going through," Curry says. "This technology can be the bridge that gets you through to the other side."

So far, the app has put peer support in the hands of hundreds and has the potential to reach many more, thanks to a recent licensing agreement that now makes BattlePeer available in Apple and Google app stores.

"This has been more than a decade in the making," says Madiraju. "With the licensing, we have a massive scale advantage. Potentially, [BattlePeer] can reach hundreds of thousands of veterans across the nation."

Online storytelling communities help new mothers

Dr. Sabirat Rubya, Northwestern Mutual Data Science Institute Assistant Professor, who contributed to developing BattlePeer, is also exploring how technology can support another at-risk population: new and expecting moms. According to Rubya, one in seven women experiences postpartum depression, and many turn to online communities for information and advice.

"Think of it as a curated space for sharing and finding support through powerful stories."

Dr. Sabirat Rubya

Women are at times reluctant to talk about their challenges as soon-to-be and new parents, even though this type of information sharing offers valuable emotional support, says Rubya. Many women aren't sharing their problems directly with one another because of the stigma around mental health issues related to pregnancy and new motherhood.

Rubya and graduate student Farhat Tasnim Progga analyzed three online venues - Reddit, What to Expect and BabyCenter - and concluded that online storytelling is an effective way to foster support for perinatal mental health. Their findings inspired them to develop "Mom Stories," a web-based application where women can find stories on a range of topics of interest to new and expecting moms, from depression and baby blues to breastfeeding and newborn health. "Think of it as a curated space for sharing and finding support through powerful stories," Rubya says about the application, which is set to launch in late 2024.

Risks and ethical questions for data use

"Is there not an ethical obligation to use every tool, every piece of data at their disposal, to attempt to save lives and help improve public health outcomes?"

Dr. Michael Zimmer

While technology is making inroads in mental health research and treatment, health care professionals and scholars, including those at Marquette, are identifying ethical issues raised by these applications of technology. Collecting and accessing large data sets to advance research may compete with the goal of protecting patient privacy, for example. And deploying AI bots to provide real-time medical advice carries the risk that those bots are biased or insufficiently trained.

Dr. Michael Zimmer, director of the Center for Data, Ethics, and Society at Marquette and professor and vice chair of the Department of Computer Science, shared his thoughts on these ethical tensions in a June webinar hosted by the Center for Suicide Research and Prevention, joined by experts from Harvard Medical School and Northwestern University.

In his presentation, Zimmer pointed to work being done at the Smoller Laboratory at Massachusetts General Hospital, where researchers are using large data sets to help develop suicide risk prediction models. The researchers built a sound predictive model, he says, and are now exploring ways to enhance it by tapping public data sources. While the ultimate goal is to improve health outcomes and save lives, using public data for this purpose raises ethical questions.

Public data and prediction models - a step too far?

"Imagine you had access to a hundred thousand people's health records," says Zimmer, who served as a consultant on the project. "You knew which 10,000 had made a suicide attempt at some point and you started looking at their records to see what was unique compared to everyone else." Such records might reveal, for example, whether an individual had an interaction with law enforcement or a bankruptcy - information that Zimmer says could lead to "unintended consequences."

Using such data in the prediction model could lead to the presumption that people who have had an interaction with law enforcement or experienced a bankruptcy are at risk of suicide - even though many would not be. Reaching out to the "at risk" individuals the model flags on such criteria might help avert tragedies, but it could also represent yet another breach of privacy.

"In the health setting, there's a different kind of ethical calculus at play because these are folks who are trying to find ways to do good things with technology," he explains. "Is there not an ethical obligation to use every tool, every piece of data at their disposal, to attempt to save lives and help improve public health outcomes?"

At the same time, it's critical to look beyond how valuable the data is and recognize that the facts and figures represent real people and, often, vulnerable populations. "Things are moving quickly, but in the mental health space, everyone recognizes there are risks of moving too fast," Zimmer says. "If we want to do good, we have to get this right."

Student researchers investigate how social media users perceive mental health

In addition to faculty, Marquette students and alumni are conducting research on technology and mental health. Psychology major Iza Guzek, Arts '23, was part of a team under the guidance of Dr. Stephen Saunders, professor of psychology, that investigated whether social media helps students open up about their mental health - or instead leads them to trivialize, or even glorify, mental illness. The students surveyed more than a hundred 18- to 28-year-olds to learn how users of TikTok and Instagram perceive people who post on these platforms about mental illness. The majority of respondents described those posters using terms such as "admirable," "cool," "brave" and "strong."

The increasing use of social media to discuss mental health issues has helped to reduce stigma, Saunders and Guzek agree, but the pendulum may have swung a bit too far. "From our findings, we were glad to see that people disagree with the notion that the individuals [posting about mental health issues] come off as 'scary' or 'dangerous,'" Guzek explains. "This shows that they do not view them in the typical 'stigmatizing way' so to speak. But what was surprising, yet supporting our hypothesis, was how a lot of participants strongly agreed to mental illness being portrayed as something 'cool' or 'admirable.'" Early results suggest that this is particularly true of TikTok.

"Getting students involved also shows them, hopefully, that anything about which they are curious can be studied."

Dr. Stephen Saunders

The team, which included Shea O'Connor, Arts '24, and Christina Schmidt, Arts '23, presented its research poster at the Wisconsin Psychological Association and the Marquette Undergraduate Research Symposium and has continued work on the project and corresponding paper. The goal is to present it at the Association for Psychological Science's 2024 Global Psychological Science Summit in October. Guzek is committed to carrying the research forward and building upon it in her future career.

"At Marquette, we are taught cura personalis, which means care for the whole person," Guzek says. "We are advised to Be The Difference. Research such as this is one of the ways I can care for others and make a difference. Even after graduating, I feel it is my mission."

The Guzek-O'Connor-Schmidt team was one of four that conducted independent research under Saunders' mentorship last year.

Another group studied the associations of prosocial and antisocial behavior with mental health both online and offline. Conducted by Liam Pyne, '24, and Pat Swanson, a rising junior psychology and economics major, the study examined the social media behaviors and reactions of 275 survey participants and concluded that prosocial behavior leads to improvements in mental health and self-esteem, especially among people who know each other, while antisocial behavior leads to negative feelings - especially among strangers online.

Saunders says that in addition to learning about the scientific process, students learn about themselves and what they may want to pursue as a career. "Getting students involved also shows them, hopefully, that anything about which they are curious can be studied."
