
Cover for censorship?



Efforts to rein in Big Tech’s reach could lead to silencing unpopular viewpoints


Mobile phone app logos for, from left, Facebook, Instagram, and WhatsApp, in New York, Tuesday, Oct. 5, 2021. Richard Drew/Associated Press Photo

PAUL BUTLER, HOST: Coming up next on The World and Everything in It: the fallout from a whistleblower’s testimony.

Last week, a former product manager at Facebook testified before a Senate panel, calling on Congress to step in and hold the tech giant accountable.

HAUGEN: I’m here today because I believe Facebook’s products harm children, stoke division, and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.

Frances Haugen said the company develops algorithms that feed hateful or inflammatory content. She said it’s all in an effort to hook users because the more time they spend on social media, the more advertising the company can sell.

MYRNA BROWN, HOST: She also said the company is endangering children and that Facebook will only clean up its act if Congress steps in.

Lawmakers on both sides of the aisle are indeed voicing concern about the power big tech companies have over news and information today.

Facebook responded, saying Haugen's accusations don’t make sense and that she did not directly work on the issues she raised.

Joining us now with more insight is Lora Ries. She is the Director of the Center for Technology Policy at the Heritage Foundation. Lora, good morning!

LORA RIES, GUEST: Good morning, thanks for having me on.

BROWN: Lora, I do want to ask you about where all of this is heading in terms of content censorship, but let me start with this: Is Ms. Haugen alleging that Facebook may have violated the law in its business and content practices?

RIES: Well, Facebook’s owner, Mark Zuckerberg, has testified before Congress many times. If he said anything false before Congress about what his company does and does not do, or about its customer base, that’s one aspect to explore. There are related issues with Facebook and potential anti-competitive tactics; those need to be investigated as well by the Federal Trade Commission. But we’ve all worried as parents about social media and its effects on teenagers, and basically addiction to these platforms and to their phones. And at the hearing where Frances Haugen testified, Senator Blumenthal said it’s an addict’s narrative, which is very troublesome. And parents certainly need more tools to help teach their teens some self-control, some self-moderation, in their use of these platforms.

BROWN: Now Lora, you’ve pointed to one big concern you had about Haugen’s testimony itself with regard to censorship and policing—quote, unquote—“misinformation.” Talk about that, if you would.

RIES: Yes, the big caution I want to raise regarding this hearing with Frances Haugen is the censorship that she’s calling for under the guise of misinformation. She was the lead product manager for civic misinformation at Facebook, and this had to do with elections. And over the past year and a half, we have seen many social media companies label what they claim to be misinformation and take content down—whether it’s about COVID, such as the Wuhan lab being the source of the virus, or masks are bad, then masks are good—and also elections. And we recall the Hunter Biden laptop story was originally taken down as misinformation. Later on, they admitted that, no, it was indeed his laptop. And so misinformation is a catchall for whatever the left doesn’t like or what contradicts their narrative. And so no one should be calling for or implementing more content removal based on “misinformation.”

BROWN: And this is where we see the partisan divide, right? Most Democrats want stricter policing of content. Republicans largely believe these platforms should not selectively censor or edit user-generated content.

And that speaks to legal protections these companies have under Section 230. Would you explain that please?

RIES: Sure. So in 1996, Congress passed Section 230. And the idea was—again, starting with children—that parents needed tools from these companies so that their children wouldn’t be viewing pornography, explicit sexual activity, violence, cyberbullying, things like that. Now there is a catchall phrase in Section 230 that reads “otherwise objectionable.” And these companies are using that very broad, vague language to moderate content, label it, edit it, or take it down completely, based on “otherwise objectionable,” when they’ve gone far afield from the original intent of that liability protection. And so Congress needs to reform that section to return to the original intent and to allow these companies to be sued when they are acting as publishers by labeling or taking down content in ways that have nothing to do with the original intent of the protection.

BROWN: Do you think there’s enough agreement on some issues that we’ll see regulation in the coming months or years? And if so, what elements of these social media platforms do you think lawmakers will target?

RIES: So, I think there’s agreement that some of these big tech companies like Facebook or Amazon or Google need to be investigated for any potential anti-competitive behavior, and, based on current laws and regulations, the Federal Trade Commission or others should investigate and punish if there are violations. Where there is not agreement is around this misinformation. And, as you stated before, generally the left wants more content taken down or labeled as misinformation, and conservatives feel, and are seeing, what seems to be very slanted, very biased content moderation against conservative thought under the guise of misinformation. And so the two parties in Congress want Section 230 reform, but for very different reasons, and it’s unlikely that we’re going to see Congress successfully get a bill through both houses for that reason. So we also need to look to other avenues: What are the states doing to protect users and consumers? What alternative companies are emerging, such as Rumble as an alternative to YouTube and others, to offer public squares where conservatives don’t have to fear that their content is going to be taken down?

BROWN: Okay, Lora Ries with the Heritage Foundation has been our guest. Lora, thanks so much!

RIES: Thanks for having me.


WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.
