Week 11: The Impacts of Technology Innovations and the Implications for Data and Privacy as Related to Suicide Prevention Among First Responders

Suicide continues to be a major public health issue and one of the leading causes of death. Technological innovations offer an opportunity to implement useful tools to assess suicide risk, identify high-risk and suicidal behavior, and potentially prevent suicide. In 2015, the Obama Administration's White House hosted the Partnerships for Suicide Prevention event as part of Global Suicide Prevention Month and World Mental Health Day, under the mission statement "Using Data to Strengthen Mental Health Awareness and Suicide Prevention." The White House also organized a five-city suicide prevention "hackathon" that brought together data scientists, innovators, designers, and next-generation technologists from several organizations, challenging them to collaborate and develop products, tools, or data analyses for suicide prevention (1).

One technological advance that could aid in suicide prevention is mobile apps (e.g., Better Stop Suicide, Suicide Prevention App, Suicide Safety Plan, and Suicide Safe by SAMHSA). These apps offer a range of suicide prevention strategies, including public health techniques, screening tools, help accessing support, mental health treatments, and follow-up strategies after a suicide attempt (2).

Facebook joined suicide prevention efforts years ago when it began allowing users to report posts they felt indicated someone was thinking about suicide. Flagged posts were reviewed by trained members of Facebook's Community Operations team, who would connect the poster with support resources as needed. In the last few years, Facebook has begun using machine learning, training computers to recognize suicidal expression, in order to expand its ability to help people in need. This new tool uses signals such as phrases in posts and concerned comments from friends and family to identify posts from people who might be at risk (3).
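
To make the idea concrete, here is a minimal sketch of how a post-scoring classifier of this kind could work. Facebook's actual model, features, and training data are proprietary and not described in the sources cited here, so the toy phrases, the feature choice, and the scikit-learn pipeline below are illustrative assumptions rather than their implementation.

```python
# Minimal sketch of a post-risk scorer in the spirit of the system described
# above. Everything here (the toy examples, the bag-of-words features, the
# 0-to-1 score) is an illustrative assumption, not Facebook's implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled examples: post text combined with comments from friends,
# since concerned replies are one of the signals the post mentions.
texts = [
    "I can't take this anymore | are you ok? please call me",
    "everything feels hopeless | I'm worried about you",
    "great hike this weekend | looks beautiful!",
    "trying a new recipe tonight | save me a plate",
]
labels = [1, 1, 0, 0]  # 1 = flagged for human review, 0 = not flagged

# TF-IDF features plus logistic regression yield a probability in [0, 1],
# analogous to the 0-to-1 risk score discussed later in this post.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

score = model.predict_proba(["nobody would miss me | please talk to someone"])[0, 1]
print(f"risk score: {score:.2f}")  # a high score would route the post to reviewers
```

In practice, a score like this serves only to triage posts for trained human reviewers, as described above; the score itself does not trigger an intervention.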

Some emerging evidence suggests that computerized suicide prevention approaches may also be useful, such as automated cognitive behavioral therapy (CBT) that delivers a course of therapy without the involvement of a human therapist. Speech and facial expressions may also provide a window into detecting suicidal thinking. Using computerized speech analysis, researchers may be able to find differences in how depressed and/or suicidal people talk: people who become suicidal may show differences in the sound frequency of their speech, and research has shown that people with depression exhibit a reduced acoustic range. Current research is also exploring computerized real-time facial emotion monitoring to detect subtle changes in the facial expressions of people with suicidal thoughts (4).
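
As an illustration of the "reduced acoustic range" finding, the short sketch below estimates the pitch contour of a speech recording and summarizes its spread using the librosa audio library. The single feature (voiced pitch range) and the file path are assumptions made for illustration; the research cited above relies on far richer, clinically validated feature sets.

```python
# Minimal sketch of the "reduced acoustic range" idea: estimate the pitch
# (fundamental frequency) contour of a speech recording and summarize its
# spread. The feature and any thresholds are illustrative assumptions only.
import librosa
import numpy as np

def pitch_range_hz(wav_path: str) -> float:
    """Return the voiced pitch range (max - min F0, in Hz) of a recording."""
    y, sr = librosa.load(wav_path, sr=None)  # keep the native sample rate
    f0, voiced_flag, voiced_probs = librosa.pyin(  # probabilistic YIN pitch tracker
        y,
        fmin=librosa.note_to_hz("C2"),  # ~65 Hz, low end of typical speech
        fmax=librosa.note_to_hz("C6"),  # ~1047 Hz, high end of typical speech
        sr=sr,
    )
    f0 = f0[~np.isnan(f0)]  # drop unvoiced frames, which pyin marks as NaN
    return float(f0.max() - f0.min()) if f0.size else 0.0

# A narrower pitch range across comparable recordings would be one crude
# proxy for the flattened, monotone speech associated with depression.
# print(pitch_range_hz("sample_speech.wav"))  # hypothetical file path
```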

As companies embrace new technologies in their efforts to prevent suicide, they must consider the data, privacy, and ethical implications that come with them. Making sure that these new technologies do not cause harm is a priority. Privacy is another significant concern. The algorithm created by Facebook automatically scores all posts in the US and select other countries on a scale of 0 to 1 for risk of imminent harm. While the algorithm was created with the intent of detecting signs of potential self-harm and addressing rising suicide rates, many see it as unethical (5). In a way, Facebook is collecting data and conducting research without the consent or protections individuals would normally receive in a study. There is also a possibility that some people flagged by the algorithm as suicidal may not actually be at risk yet could be forced into treatment. Many questions remain unanswered at this point. What happens to all the sensitive, mental-health-related information that Facebook collects? Should it be protected under HIPAA? What happens if there is a data breach and all this sensitive information becomes public? These questions require careful consideration as we increasingly use technology to prevent suicide.

  1. Patil, D., & Honey, K. (2015). Open data and innovation for suicide prevention: #MentalHealthHackathon. The White House Blog. Retrieved from https://www.whitehouse.gov/blog/2015/12/09/open-data-and-innovation-suicide-prevention-mentalhealthhackathon
  2. Larsen, M., Nichols, J., & Christensen, H. (2017). Apps for suicide prevention: What the research says. Psychiatric Times. Retrieved from https://www.psychiatrictimes.com/telepsychiatry/apps-suicide-prevention-what-research-says
  3. Facebook. (2018). How Facebook AI helps suicide prevention. Retrieved from https://about.fb.com/news/2018/09/inside-feed-suicide-prevention-and-ai/
  4. Vahabzadeh, A., Sahin, N., & Kalali, A. (2016). Digital suicide prevention: Can technology be a game-changer? Innovations in Clinical Neuroscience, 13(5-6), 16-20.
  5. Goggin, B. (2019). Inside Facebook's suicide algorithm: Here's how the company uses artificial intelligence to predict your mental state from your posts. Business Insider. Retrieved from https://www.businessinsider.com/facebook-is-using-ai-to-try-to-predict-if-youre-suicidal-2018-12

2 thoughts on “Week 11: The Impacts of Technology Innovations and the Implications for Data and Privacy as Related to Suicide Prevention Among First Responders”

  1. This was a very interesting post for me to read, especially with the added stressors I'm sure first responders are under these days with the COVID-19 pandemic! I had no idea that Facebook did any sort of research; I rarely use social media, but that is the last company I would have thought to become involved in tracking potential self-harm and suicide risk. I interpret this to mean there is software that analyzes every post or picture uploaded onto any of Facebook's sites and saves it on a large server somewhere. This plays a major role in why I do not partake in social media: the fear that Big Brother is watching every move I make, besides the fact that no one really cares what I had for breakfast or how pretty it was before I started eating. I don't even care for ad tracking, where something I search for pops up on the next website I visit, and I think it is safe to say I am not the only one who finds this an invasion of privacy. Also, as you mentioned, what happens to this data that is collected without people's consent, unless creating an account acts as a form of consent? With HIPAA regulations being waived during this time for Skype, FaceTime, and other non-encrypted video messaging platforms to promote telehealth (Centers for Medicare and Medicaid Services, 2020), it makes me think this will be a progressive change in the future of healthcare, and these platforms will eventually be allowed for use regardless of a pandemic. I am much more a proponent of the mobile apps and cognitive behavioral therapy you discuss in your blog post as a way to combat the high rate of suicide in first responders. Thank you, Natalia.

    References
    Centers for Medicare and Medicaid Services. (2020). Medicare telehealth frequently asked questions. Retrieved from https://www.cms.gov/files/document/medicare-telehealth-frequently-asked-questions-faqs-31720.pdf


  2. Week 11:
    I love this blog; I like the focus on emergency responders. This blog has become even more meaningful with the recent COVID-19 outbreak. There has been so much more stress on our healthcare staff since the spread of COVID-19: stress from limited access to supplies, increased workload, and the increasing loss of patients are major issues for healthcare workers' mental health.
    Most recently, I found an article with promising results (Gold, 2020). A small study done at Stanford found that a new form of magnetic brain stimulation relieved severe depression in about 90% of those who participated (Erickson, 2020). All 21 of these severely depressed patients also had suicidal ideation, but after the treatment, none reported having suicidal thoughts (Erickson, 2020). These same participants had shown no improvement with antidepressant medication (Erickson, 2020). This is great news in the treatment of depression, as no other therapy has achieved more than a 55% improvement for treatment-resistant depression (Erickson, 2020).
    This is great news for people suffering from depression and suicidal thoughts, and it is especially meaningful for the emergency and healthcare workers dealing with moral traumas that can lead to depression and suicide.

    Erickson, M. (2020). Stanford researchers devise treatment that relieved depression in 90% of participants in small study. Retrieved from https://med.stanford.edu/news/all-news/2020/04/stanford-researchers-devise-treatment-that-relieved-depression-i.html

    Gold, J. (2020). The Covid-19 crisis too few are talking about: Health care workers' mental health. Retrieved from https://www.statnews.com/2020/04/03/the-covid-19-crisis-too-few-are-talking-about-health-care-workers-mental-health/


