VA’s AI model to prevent suicides is a 'game changer,' official says
The REACH VET program identifies roughly 6,700 veterans per month, according to a top lawmaker.
An official with the Veterans Affairs Department said the use of artificial intelligence technologies and predictive analytics has been a “game changer” when it comes to identifying and supporting veterans who are at risk of suicide.
During a House Veterans’ Affairs Subcommittee on Health field hearing on Monday, VA Assistant Under Secretary for Health Carolyn Clancy told lawmakers that the department’s algorithm-based Recovery Engagement and Coordination for Health — Veterans Enhanced Treatment program, or REACH VET, has led to “decreased incidents in suicidal ideation and suicide attempts.”
The program, which was first launched as a pilot in October 2016 and implemented more broadly across VA in 2017, uses an AI model to identify veterans at the highest risk of suicide at Veterans Health Administration facilities. Identified veterans, those predicted to be in the top 0.1% tier of suicide risk, are then connected with coordinators who provide additional care and support.
“You're actually using AI to prioritize those at the highest risk,” Clancy said, adding, “I think we're going to see a lot more of that in healthcare.”
Rep. Mariannette Miller-Meeks, R-Iowa, who chairs the subcommittee, noted that the REACH VET program is currently active at 28 VA sites and identifies roughly 6,700 veterans per month who need additional support.
“It is with breakthrough technology like this that we're able to make a difference and save veterans' lives,” she added.
Veterans are at a higher risk of suicide than the general U.S. adult population, and the department has been exploring other uses of AI to limit or prevent suicides among former servicemembers. But some lawmakers have expressed concerns about VA’s use of AI and emerging technologies to identify veterans at risk of self-harm.
Rep. Matt Rosendale, R-Mont. — who chairs the House Veterans’ Affairs Subcommittee on Technology Modernization — told VA officials during a hearing in January that “we need to do whatever we can to prevent veteran suicide” but added that he was worried that predictive tools “could lead to a violation of veterans’ rights.”
VA has identified more than 100 use cases for AI across the department, including at least seven that deal wholly or in large part with using emerging technologies to identify and predict signs of suicide risk.
While the REACH VET program is listed as one of the department’s AI use cases, VA’s list also includes an effort focused on the “implementation of natural language processing, predictive modeling and artificial intelligence methods and findings to improve REACH VET suicide prediction” and a project to “extract signals of suicidal risk from clinical progress notes using natural language processing.”