Suicide prevention is a vexing problem facing the United States, and in particular our Veterans. According to the Department of Veterans Affairs (VA), there were 41,425 suicides among U.S. adults in 2014, and 18% of those individuals (7,403) were identified as Veterans of U.S. military service. The VA also reported that an average of 20 Veterans died from suicide each day that year; however, only 6 of those 20 were users of VA services.
As these statistics make clear, suicide is a significant issue, and it presents at least three challenges: i) identifying whether a patient who receives services at a given facility may be dealing with issues that could lead them to attempt to take their own life; ii) ensuring that individuals determined to be contemplating suicide receive the services and follow-up they need; and iii) identifying those at risk for suicide outside the formal walls of an entity or organization - for example, the 14 Veterans who died from suicide each day in 2014 who were not users of VA services.
In relation to the first challenge - identifying patients who may be at risk for suicide - the good news is that the VA has made significant advances in its ability to predict which of its patients may be contemplating suicide through its REACH VET Initiative, a predictive analytics tool that helps identify Veterans at high risk for suicide.
Iconic Data, via its Patient Case Manager (PCM)™ Suicide Prevention Module™, has also brought to bear its expertise in technology-enabled standardization of complex clinical workflows, process measurement, monitoring, and data visualization to help health care provider organizations like the VA deliver highly reliable services to patients at risk for suicide. This addresses the second challenge above - ensuring those identified as being at high risk for suicide get the services and follow-up they need.
The third challenge - identifying those at risk for suicide outside the formal walls of an entity or organization - is, no doubt, the most difficult of the three to address and the one most likely to lead us into new territory. Two immediate dilemmas are whether individuals need to opt into participation (i.e., agree to be monitored for potential risk for suicide) and how such monitoring might be accomplished. Perhaps a model like the VA’s Million Veteran Program would work here. Other questions follow: which individuals are appropriate and/or authorized to be notified in the event concerning patterns are identified - spouses, family members, others? How can this be navigated while maintaining privacy? Can meaningful insights be gleaned from social media using artificial intelligence - for example, would it be of benefit to use social media application programming interfaces to observe for patterns in individual behavior that might indicate someone is contemplating suicide or having suicidal ideation? And is there a role for facial recognition and facial expression 'analysis of emotion' technologies, both of which may prove particularly interesting in the context of social media?
Suffice it to say, with VA Secretary David Shulkin, M.D.’s ongoing prioritization of suicide prevention as a top clinical priority for the VA, the next decade will continue to be rife with innovation in this important area.