MARY REICHARD, HOST: Coming up next on The World and Everything in It: protecting children online.
A quick word to parents: this story deals with material related to exploitation, so you may want to fast-forward about six minutes and come back later.
LINDSAY MAST: On Wednesday, a subcommittee of the House Energy and Commerce Committee will host a hearing about legislation to protect young people on the internet. This follows a Senate Judiciary hearing earlier this year in which lawmakers grilled the heads of tech companies like Meta, Twitter, and TikTok.
REICHARD: Some of these companies have already taken steps to improve. For instance, Instagram last week announced that it’s testing a feature to blur photos that may contain nudity and prompt users to think twice before sending or receiving such images.
But are changes like this too little too late?
MAST: Joining us now to talk about it is Tim Nester. He’s Senior Director of Communications for NCOSE, an acronym for the National Center on Sexual Exploitation.
It’s an advocacy group working through legislation and corporate policy to end sexual exploitation.
REICHARD: Tim, good morning.
TIM NESTER: Good morning. Thanks for having me.
REICHARD: Well, Tim, each year NCOSE compiles a list of twelve corporations that enable sexual exploitation. You call it the “Dirty Dozen” list and it came out last week. What are some of the companies on this list that may surprise people to know are enabling bad actors?
NESTER: Yeah, thanks for asking that. It’s, it’s always a surprise when this list comes out. There are some companies that everyone expects and others that will kind of catch you off guard. Some examples would be Apple, CashApp, Roblox—which, if you have kids who like to play games, they’re going to be familiar with Roblox, so you’ll be surprised to hear that one’s on here. I think LinkedIn has caught a few people off guard this year, but we’ve listed them, as well as Spotify, the popular music streaming app.
REICHARD: Your organization has no doubt sent this list to the offenders. Any response?
NESTER: Yeah, every year, we send a notification letter to these companies before the list is published. Basically, we want to give them a chance to respond and take action to get themselves off the list, which in some cases has happened. We’ve moved them off the main list to a watch list or replaced them with a different target. But every year they have a chance to respond, and we actually already have some victories this year to share, with Apple removing some horrible “nudifying” apps from its app store that were rated for kids 4 and up—they removed those as soon as we flagged them. And you already mentioned with Meta, Instagram is testing out a feature that will blur all explicit photos sent to or from minors up to age 18, which before that was only up to age 15. So that was a huge win as well.
REICHARD: No surprise to see that Meta is on the list…it owns Instagram…plenty of reports about predators exploiting children through that app. What would Meta need to do to get off the “Dirty Dozen” list?
NESTER: You know, there's so much with Meta. They, they own some of the most dangerous platforms for kids, arguably the most dangerous apps for kids, and so what we need them to do is to have safety features on by default. And this only happens through public pressure; we need to make sure they're protecting kids. They just launched end-to-end encryption, which we think is incredibly dangerous, and really enables those who want to perpetuate exploitation against others, especially minors, to work in anonymity. And so we want to make sure that they are protecting kids and calling out those who would perpetrate this kind of exploitation against others. And honestly, we need to see them do proactive detection and removal of all grooming. And they haven't done that yet. They usually wait until they're called out and in trouble and the public moves, and then Meta responds. We want them to be proactive in protecting people.
REICHARD: Tim, do you have any concrete examples of how children have experienced harm on these platforms?
NESTER: I do, and sadly, there are too many examples across many of these platforms. But one that jumps out to me, that hit really close to home for me because I have a 17-year-old son, is the story of, well, a boy we’ll call Alex. It’s not his real name. We’ve changed his name to protect his identity, but it’s a true story.
So a 17-year-old boy named Alex met a teenage girl online named Macy—at least, that’s what he thought her name was, and he thought she was a teenage girl. They began chatting, and over time that conversation quickly escalated to Macy asking Alex for sexually explicit images. And she also, as far as he knew, sent him some sexually explicit images. Unbeknownst to him, Macy was a sextortionist, and was not a teenage girl. We actually don’t know the identity of this individual.
But the long and short of it is, over the course of one evening—it took about five hours—Alex went from a healthy 17-year-old boy, a member of a youth group, from a good Christian family, with a healthy awareness of how to be careful online, to the point where he made a mistake and sent an explicit image. He was immediately told, “If you don’t cooperate and send me $1,000, I’m going to send these pictures to your family, to your friends”—they started listing family and friends from his friends list—“until it goes viral. All you have to do is send me the money and I won’t expose you.” So Alex, being a teenager, did not have a lot of money. He sent everything he had, about $400 total. The person said, “That’s not what I asked for, pay me more now.” He said, “I don’t have any more.” The person essentially mocked him, and then posted the images. And Alex, who of course felt like his life was ruined and that he had made the worst mistake of his life, ended up dying by suicide that night. His family found him the next morning.

And this is just one example of how these platforms should be able to shut that down before it ever gets to that point. Messaging like this should never be happening on Instagram. When someone is creating an account, pretending to be someone they’re not, and extorting another human being, it should not be allowed. And that company, in my opinion, should be held accountable for that.
REICHARD: Final question here: these are huge problems, and huge corporations enabling a lot of wickedness. Speak to Christians now. What are two or three things that you and I can do?
NESTER: Yeah, it starts with prayer, and with good conversation with our kids. I think spending that time talking through these issues honestly and openly with our kids makes a big difference. If I can just share as a dad of teenagers: my kids tell me they’re the only teenagers in our entire city who don’t have a given platform. So we talk about it. When I just tell them, “No,” I’m gonna get resistance, and chances are pretty good they might even try to sneak around and figure out a way to get access. But if I actually sit down and talk through, okay, here are the problems with this platform, here are the stories of things that have been happening on this platform, we can talk through these issues honestly. And maybe we land in a compromise: alright, we’ll create an account, but I get access to your account. Or maybe we just skip it, and we’ll circle back to it when you’re a little bit older. So I think open and honest conversation with your family, with your kids, especially as they become teenagers, is super important.

But I also think getting the word out to other families matters, whether it’s in your church groups, in your neighborhood, or in the workplace—talking with people about not only the dangers of some of these platforms, but also what they can do. And in our case, we encourage people to go to dirtydozenlist.com, where you can see all of the targets. And we’ve made these actions super simple. You click on each target, and a big “take action” button will pop up right when you click on the target. It’ll be anything from contacting your local representative if you’re a U.S. citizen, to emailing the executive of a company to let them know it’s time for them to make a change. So we have actions ready to go; it doesn’t take more than 20 seconds per action to do something about this.
REICHARD: Tim Nester is senior director of communications at the National Center on Sexual Exploitation. Tim, thank you so much for your time, and for your good work on this issue!
NESTER: Thank you. Appreciate it.
WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.