
Facebook gives up facial recognition

But is this a win for privacy or an on-ramp to something even worse?


The Facebook logo on an iPad. Associated Press/Photo by Matt Rourke (File)


In a surprising move, Facebook announced on Nov. 2 that it would shut down or limit use of its facial-recognition technology, a feature central to the platform that helped awaken the world to the possibilities of facial recognition. “People who’ve opted in will no longer be automatically recognized in photos and videos and we will delete more than a billion people’s individual facial recognition templates,” said a statement from Meta, Facebook’s parent company.

For the past several decades, the dominant ideology has presumed that advancements in technology, despite their costs, are a net benefit for society. Facebook’s recent decision, a voluntary admission that society would be better off without one of its major technological accomplishments, suggests that this philosophy may have reached its high-water mark. It’s a small but encouraging development that lends more credibility to those worried about the risks of technology, including social media’s adverse effects on children and mental health.

Of course, Facebook’s motives for this decision may simply be pragmatic. Mark Zuckerberg told The Verge that the rebranding under the new parent company Meta kicked off in late spring 2021, just a few months after Facebook had reached a $650 million settlement with users who alleged it had stored their faces without their permission. Fast forward to late October of this year, when Facebook announced its new brand in the face of heavy criticism. “The most critical issue with this rebranding is that the new brand has been introduced without any substantive change at the company,” wrote Denise Lee Yohn at Harvard Business Review. “[Customers] must be offered something substantially different or see reliable evidence problems have been fixed before they will believe that the company has actually changed.”

It’s not surprising that days after that development, Facebook announced this shutdown of its facial-recognition technology. (It’s almost as if the company anticipated the criticism of its rebrand and was ready to show some “substantive change.”) After all, sacrificing a feature that isn’t particularly critical to its growth (do people really still tag photos?) is a relatively large public relations win at a relatively small cost to Facebook, especially given that the feature has been a key target for lawsuits.

Regardless of Facebook’s motives, the decision is good for society. Facial recognition technology comes with very serious moral concerns. The 2020 documentary Coded Bias argues that facial recognition often exhibits racial bias, and the film highlights numerous cases of police departments using the technology with alarmingly low accuracy rates. In the last several years, dozens of states and cities have limited police use of facial recognition over concerns about misidentification and racial profiling.

Internationally, the tides have been turning against facial recognition as well. In September, United Nations High Commissioner for Human Rights Michelle Bachelet called for a moratorium on state use of artificial intelligence technologies, including facial recognition. Even China has started to add restrictions on facial recognition, requiring many private businesses to get consent from customers before storing their facial data. (Make no mistake, the totalitarian Chinese government still uses widespread facial recognition technology—from tracking children’s time online to identifying and oppressing Uyghur Muslims.)

Putting the brakes on facial recognition technology is clearly a positive development for a society that still acknowledges the importance of private in-person interactions. But don’t celebrate yet: Meta has aggressive new goals to redefine social interactions through activities on its platforms. Zuckerberg is “pushing his teams to build technology that could one day let you show up in a virtual space as a full-bodied avatar, or appear as a hologram of yourself in the real-world living room of your friend who lives across the planet,” reports The Verge.

Will virtual avatars and holograms really help us build stronger connections with those around us, or will they simply accelerate the troubling transition from the physical world to a digital, Meta-controlled environment? As we’ve learned from the Facebook Files, these aren’t questions that Meta enjoys asking. When facial recognition launched on Facebook a decade ago, all that mattered was that it saved you a little bit of time. But lest we think Facebook is taking a step back, there is good reason to believe that the company (whatever it calls itself) has even bigger plans for transforming society. The tech titans never retreat for long.


Daniel Huizinga

Daniel Huizinga is a strategy consultant, a speaker on personal finance, and CFO of a nonprofit supporting community development in Kenya. He has published more than 200 articles on business, financial literacy, public policy, and education.


