Question: Have you compiled 'lessons' learned from schools using monitoring software on devices, particularly with regard to detection of, and response to, youth who may be at suicide risk? We hear about schools having to start from scratch on their learning curves about setting up safe and feasible ways to respond to suicide risk alerts. This report summarizes those and many other challenges.


Answer: Thank you for this question and for sharing the research report. Youth suicide prevention is a critically important topic for schools, as suicide is one of the leading causes of death among youth and young adults (CDC, 2023). To address this concern, some schools have begun using Artificial Intelligence (AI) tools to monitor students' technology use for indicators of suicide risk.

How AI Monitoring Software Works

Generally, AI tools used by schools monitor school-issued devices and school-owned accounts for content that may indicate a risk of suicide or self-harm, such as a post made on social media, an online search, or a message sent to another person. If the AI tool flags content as concerning, it notifies school officials so they can follow up with the student (Ayer et al., 2023).
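
For readers who want a concrete picture of that workflow, the sketch below shows the general pattern in simplified Python. It is purely illustrative: real products use proprietary machine-learning models rather than simple keyword matching, and the RISK_PHRASES list and notify_school_officials helper are hypothetical stand-ins, not any vendor's actual method.

```python
# Purely illustrative sketch of the monitoring workflow described above.
# Real products use proprietary machine-learning models, not keyword
# matching; all names and phrases here are invented for illustration.

RISK_PHRASES = {"want to die", "kill myself", "end it all"}  # invented examples


def notify_school_officials(event: dict) -> None:
    # In practice, this step routes the alert to designated staff for human review.
    print(f"ALERT: {event['source']} activity flagged for review")


def flag_content(text: str) -> bool:
    """Return True if the text contains a phrase associated with risk."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in RISK_PHRASES)


def monitor(events: list) -> None:
    """Scan monitored activity (posts, searches, messages) and alert on matches."""
    for event in events:
        if flag_content(event["text"]):
            notify_school_officials(event)


monitor([
    {"source": "search", "text": "history homework help"},       # not flagged
    {"source": "message", "text": "I just want to end it all"},  # flagged
])
```

The key structural point is the final step: a flag does not determine risk by itself; it triggers follow-up by school officials.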

Potential Benefits and Risks of AI Monitoring Software for Suicide Risk

Research has identified potential benefits and risks associated with using AI tools to monitor suicide risk among students. One obvious but important benefit is the early detection of suicidal ideation (Atmakuru et al., 2024; Ayer et al., 2023) and the ability to provide targeted interventions to at-risk students (Adrian et al., 2020; Atmakuru et al., 2024).

However, research has also identified potential risks to student privacy and equity with the use of AI monitoring tools (Center for Democracy and Technology, 2023). These include biased AI algorithms (Atmakuru et al., 2024; Ayer et al., 2023; Coley et al., 2021), which can misidentify risk for specific populations of students through false positives or false negatives. Students of color and LGBTQ+ students may also face bias in how suicide risk alerts are handled. Students of color are already disproportionately affected by school discipline policies, and AI software alerts may lead to law enforcement contact with lasting consequences (Ayer et al., 2023). LGBTQ+ students who are flagged for online activity related to their identity may be forced to disclose that identity to caregivers or school staff, which could be harmful (Ayer et al., 2023).

Violations of students' privacy and right to confidentiality are also a risk (Adrian et al., 2020; Ayer et al., 2023). Additionally, there are concerns about how parental informed consent is obtained and whether students and families understand how AI monitoring tools work (Ayer et al., 2023; Collins et al., 2021). And while the technology is promising, there is not yet enough data to know how effective AI monitoring technologies are at correctly identifying students who are at risk for suicide.
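
To make the false-positive concern concrete, here is a short, hypothetical calculation; every number below is invented for illustration, not drawn from any study. When the outcome being screened for is rare, even a reasonably accurate tool produces far more false alarms than true alerts.

```python
# Hypothetical base-rate illustration; every number here is invented.
students = 10_000
prevalence = 0.005           # assume 0.5% of students are truly at acute risk
sensitivity = 0.90           # assume the tool flags 90% of at-risk students
false_positive_rate = 0.05   # assume it wrongly flags 5% of everyone else

at_risk = students * prevalence                            # 50 students
true_alerts = at_risk * sensitivity                        # 45 alerts
false_alarms = (students - at_risk) * false_positive_rate  # ~498 alerts

precision = true_alerts / (true_alerts + false_alarms)
print(f"{true_alerts:.0f} true alerts vs. {false_alarms:.0f} false alarms")
print(f"Only {precision:.0%} of alerts involve a student actually at risk")
```

Under these invented assumptions, more than nine out of ten alerts would be false alarms, which underscores why careful human review and well-planned response protocols matter so much.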

Suicide prevention is a critical issue for schools and families, and it is wise to consider any tool that may prevent harm. However, as with many new technologies, it is important that schools implement AI monitoring tools thoughtfully: centering equity, diversity, and inclusion; fully informing families and students about how the tools are being used; and protecting against the other harms associated with AI monitoring.

Recommendations for Schools

Schools that are considering using AI tools to monitor student technology use for signs of suicide risk can reference the Future of Privacy Forum report The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies: Recommendations for Schools for a list of recommendations. These recommendations include:

  • Seek input from the school community, including parents, caregivers, students, and school staff, prior to adopting any new monitoring program. (Center of Excellence note: Consider the purpose and goals of implementing a monitoring program and what steps will be taken to avoid perpetuating bias and inequities across student populations.) If a program has already been adopted, schools should give school community members opportunities to provide input on and ask questions about the monitoring program.
  • Fully evaluate any potential suicide risk monitoring product before purchasing it. (Center of Excellence note: Common Sense Media’s Privacy Program has evaluated the privacy policies of several EdTech companies that provide suicide risk monitoring products. Schools can consider checking these ratings to determine whether an AI monitoring tool meets Common Sense’s data privacy standards.)
  • Ensure students and families are aware of how the AI monitoring software works.
  • Create transparency around school monitoring programs and practices, including explicitly stating which information will be tracked, the purpose of the tracking, who will have access to the information, and how the information will be used and stored. (Center of Excellence note: Consider situations that allow students and families to opt out of this tracking, such as concerns about privacy violations or the potential for biased algorithms.)
  • Provide training for all school personnel about student data privacy, including on Family Educational Rights and Privacy Act (FERPA) requirements and privacy in the context of harm-monitoring programs.
  • Establish privacy protocols and practices that go beyond the requirements of the law. Schools can consider using the Principles for School Safety, Privacy, and Equity as they build these protocols.

The report also includes useful resources for schools, including key questions schools should ask AI monitoring software companies before using their products (Appendix A), a checklist for school districts developing monitoring plans and policies (Appendix B), and a long list of additional resources for schools (Appendix C).

Additional Youth Suicide Prevention Resources


References


Age: 13 - 17

Topics: Monitoring software, suicide, suicide risk, suicide prevention

Role: Other (NIH)

Last Updated

01/16/2025

Source

American Academy of Pediatrics