House Technology and Innovation Committee HB524 Proponent Testimony

February 3, 2026, 10:00 A.M. ET

The following statement is attributed to OSPF CEO Tony Coder:

Chair Claggett, Vice Chair Workman, Ranking Member Mohamed, and members of the House Technology and Innovation Committee, thank you for allowing me to testify as a proponent of House Bill 524. My name is Tony Coder, and I am the CEO of the Ohio Suicide Prevention Foundation. I am grateful to Representatives Mathews and Cockley for introducing this bill, as we are beginning to see more examples of young people and even some adults utilizing AI and AI chatbots for therapy and mental-health care, and we are now beginning to see some suicides where AI has played a part.

The latest official report from the Ohio Department of Health shows that 1,777 people died by suicide in 2023, nearly five people every day. In addition, the ODH Ohio Emergency Department Suspected Self-Directed Summary report, which provides suicide attempt and suicidal ideation data from Ohio's emergency departments, shows that approximately 25-30 people are seen every day in Ohio's emergency rooms following a suicide attempt. Nearly 81% of our suicide deaths are adult men, but suicide remains one of the leading causes of death for young people, and it is the second-leading cause of death for children ages 10-14.

Over the past decade, youth mental health has been declining. Researchers at Johns Hopkins have shown that depression, anxiety, and suicidal thoughts increased significantly during and after the COVID-19 pandemic. In Ohio, a child dies by suicide every 36 hours. That number decreased slightly from 2022 to 2023, and, through the investments you have made in suicide prevention, more youth are accessing 988 even as more youth struggle with their mental health. The stigma around talking about suicide is decreasing for young people; we are discussing it more openly, and we are seeing that reflected in the decline in youth suicides.

However, even as we build these positive supports in Ohio to continue bringing youth suicide numbers down, many outside influences beyond brain health continue to contribute to youth suicide. Social media and internet-related issues continue to drive up rates of mental health problems for teens. Problem gambling is now the #1 addiction related to suicides; young adults with gambling problems have a four times higher risk of suicide attempts compared to their peers. Now we have AI, and we are beginning to see children utilize it, and we are hearing more accounts of AI playing a part in some children's suicides.

I want to be clear: OSPF is not anti-technology or anti-AI. We believe in and support advances in technology that could, with some imagination, do amazing things and impact society in positive ways. But we also must protect children from the consequences, especially as youth develop relationships with AI chatbots and place their trust in these entities.

A Common Sense Media report released in 2025 found widespread use of AI companions: 72% of teens have tried an AI companion at least once, and 52% of youth use one on a regular basis.1 AI companions were also far less likely to provide appropriate mental health referrals, responding to a teen mental health crisis with an appropriate referral only 22% of the time. We are also concerned about privacy and youth data with AI companions, and we have just signed a letter of support for HR 6291, the Children and Teens' Online Privacy Protection Act, introduced by Republican Michigan Congressman Tim Walberg, to help strengthen youth data protections nationally.

In research authored by Dr. Laurie O. Campbell of the University of Central Florida, she states, "Adolescents have been induced to die by suicide through conversations with AI. In October 2024, a 14-year-old began communicating frequently with an AI chatbot on Character.AI (Noam Shazeer and Daniel De Freitas). The communication included messages that were sexually explicit and may have been attributed to the teen withdrawing from his family and believing the AI chatbot was real. Ultimately, following encouragement from the chatbot, the teen completed suicide."2

I speak with parents multiple days a week about a child's suicide. Just last Thursday, I spoke with the mother of a teenage boy who died by suicide. She told me he had been struggling with mental health issues, but she thought he had turned a corner. One evening, he messaged her with what she described as a fairly arbitrary message, "I love you, mom," which did not seem to concern her, and she responded, "I love you too." Her son had been driving around town, but a friend then called her, concerned about a message they had received from him. Like a lot of parents today, she had a tracking feature on his phone; she drove to where her son was and found that he had died by suicide. A few days later, she went through his phone and found the last message he wrote to a friend about the struggles he had been secretly having. His friend responded, as a teenage boy might, "You just need to man up, boy." That was the last message her son received. I tell that story not because AI was responsible, but because if people are not getting appropriate messages of support, whether from a human friend or an AI companion, the consequences can be devastating. People who are struggling with suicidal thoughts need the most thoughtful care, not an AI companion that validates their thoughts of dying. If that happens, the chances of saving that young person may be almost nil.

I am asking this committee to support HB524 before we hear more stories of children being encouraged by AI companions to end their lives. I am grateful to Representatives Mathews and Cockley for their foresight and attention to Ohio's kids so that we can reduce the number of suicides among Ohio youth.

1https://www.commonsensemedia.org/research/talk-trust-and-trade-offs-how-and-why-teens-use-ai-companions
2https://pmc.ncbi.nlm.nih.gov/articles/PMC12371289/