NICK EICHER, HOST: Coming up next on The World and Everything in It: facial recognition software and school security.
Students at an upstate New York school district returned from the recent holiday break to find out they were being watched. Not by school staff. By the district’s new biometric security system.
Now, Lockport City School District bought the facial recognition software about two years ago. But administrators didn’t start using it until now.
MARY REICHARD, HOST: The state’s Department of Education delayed the rollout primarily due to privacy concerns. And those concerns remain. There’s also the question of cost: the system accounts for just 1 percent of the district’s overall budget, but that still comes to $1.4 million.
But fears of school shootings mean high-tech security systems are likely to become a part of campus life across the country.
WORLD Radio correspondent Laura Edghill reports now on why some people think that’s a problem.
LAURA EDGHILL, REPORTER: Lockport Superintendent Michelle Bradley’s primary concern is student safety. She told Buffalo ABC affiliate WKBW the cutting-edge technology can help prevent a tragedy.
BRADLEY: The whole idea for this project and this technology is to make our school safer. Unfortunately it’s a priority based on what’s happening across the nation with school shootings.
Lockport’s current database includes sex offenders, suspended staff members, and anyone barred from school property by a court order. When the system detects a flagged individual on campus, it alerts school officials. The software can also recognize 10 common types of guns and issue an alert if it spots one.
But Lockport has not included any student images in the database. Administrators originally planned to upload photos of certain suspended students, but that provision became a major point of contention for parents, privacy advocates, and civil rights activists.
The state’s Education Department ultimately barred the district from including the students.
BRADLEY: We believe in the initial policy that that category should have been included, but the state Education Department wasn’t comfortable with that and that’s why it’s been removed.
Jim Shultz is the parent of a Lockport High School junior. He says students are largely unaware of the swirling controversy.
SHULTZ: Well, so from the student perspective, the big news is they have been completely and totally left in the dark. They have no clue what’s been happening here.
Shultz is a vocal opponent of the district’s decision to purchase the system. For one, he says, the school board didn’t give parents enough opportunity to share their concerns. It only held one public comment session before voting … and that was on a weekday afternoon in the middle of summer.
Shultz also points out that the system has intrinsic flaws.
SHULTZ: The whole system is premised on that you will be able to predict in advance by name and face who a school shooter will be. That you will be able to put their picture in a database and that if they are physically present around the cameras that they, you will trigger some sort of alert that gives you enough extra response time to intervene and that they won’t spend $2.99 at the hardware store and put on a face mask.
But even more worrisome, Shultz says district leaders haven’t carefully considered the full implications of the software’s capabilities.
SHULTZ: Our kids, right now, every day are being scanned with this technology and recorded. If at any point, as long as those recordings exist, they wanted to put my daughter or anybody else’s face in the system, they can go back and retroactively map where they’ve been, who they’ve been with, and all the rest.
Other parents and privacy advocates, including the New York Civil Liberties Union, have also pointed out a variety of potential abuses. For example, the system could be used to monitor the personal daily habits of both staff and students. What time they come and go, who they greet, even what they wear—all information that could be exploited in the wrong hands.
SHULTZ: How we deploy artificial intelligence in our schools is something that we need to do very carefully, very thoughtfully with transparency, and with a real understanding, you know, that once you open up that Pandora’s Box, you know, everything’s on the loose. You can’t put it back.
For now, the system remains online. But the state’s Senate Education Committee is considering a possible legislative fix. If passed, it would enact a moratorium on facial recognition technology in New York schools until July 2022 to allow policymakers more time to study the issue.
Meanwhile, a small but growing number of both private and public schools across the country continue to experiment with the technology.
One school in Texas reported its system successfully alerted administrators to the presence of an expelled student at a football game. In Oklahoma, biometric technology helped school staff verify the whereabouts of a student the family feared had run away.
But parents and privacy advocates remain skeptical of the promised benefits and wary of the vendors lining up to sell the latest gadgets in the multi-billion-dollar school security industry.
Reporting for WORLD Radio, I’m Laura Edghill.