Imagine your search terms, keystrokes, private chats and photographs being monitored every time they are sent. Millions of students across the U.S. don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts.

“As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth,” the Electronic Frontier Foundation (EFF) says.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second-leading cause of death among American youth ages 10–14, but no comprehensive or independent studies have linked the use of this software to an increase in student safety. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

  • Vodulas [they/them]@beehaw.org · 9 days ago

    Yes. There are tons of enterprise tools to lock devices to certain activities. Surveillance is not necessary and will be used to violate privacy, and I am not talking about just on-device communication. Remember when companies were caught using their employees’ cameras without any indication on the device? The suspected benefits of surveillance are not worth the potential harm.

    • TexMexBazooka@lemm.ee · 9 days ago

      My company exclusively deploys machines with physical coverings for the camera and hardware disconnects for the mics.