KELSEY REED: Hello, welcome to Concurrently: The News Coach Podcast from WORLD Radio and God’s WORLD News. Our mission is to come alongside you, learning and laboring with you as you disciple kids and teens through culture and current events. I’m Kelsey Reed. I’m here with Jonathan Boes.
JONATHAN BOES: Hello!
KELSEY: Together, we want to model conversation and apply tools you can use at home or in the classroom. We invite you to record or email in your questions for us to address in future episodes. Please send them to firstname.lastname@example.org.
JONATHAN: So I’m excited today because we have a listener question. This one comes from Aaron Friar. I’m just going to use this opportunity again to remind you, we love getting these listener questions, because we want to be talking about the things that matter to you. We want to be talking about the things that you’re talking about with your kids and students. There’s no better way for us to do that than by hearing your questions. So here is a question from Aaron. He writes:
I was trying to explain to my 13-year-old how you should not sell away all your private data to all these companies just so they could throw you free stuff once in a while. He did not get how any of his personal information would be that valuable to anyone or risky for him to provide. I admit too that I came up dry looking for a full explanation.
I am reading now Yeonmi Park's description of escaping from North Korea where everything is controlled by the government. I tried to make an argument that we do voluntarily to these companies what was forced upon N. Korean citizens. Anyway, the scale is different, or is it?
Current news angle of course is all this fuss about TikTok. My son’s generation REALLY [doesn’t] get the fuss about that debate, and again I could use some help too. HELP!
JONATHAN: I love—it ends with just the word “HELP” in all caps. And so thank you so much, Aaron. This is a—man, this is a huge topic. And I’ll say, I relate to your cry for help in this, because it’s such a big thing and so hard to grasp. But we’re going to do our best to explore this issue today. We often call ourselves “professional learners.” I definitely feel that in this subject that is such a mixture of economics and technology and policy, in some cases. But we’re always going to try to bring it back to our discipleship principles, those ordinary responses to extraordinary times.
KELSEY: Thanks for rooting us in that. And I’m going to do something maybe a little bit different, in terms of how we engage the question, because for me, sometimes I need to get my eyes tight in my observation of the question itself. So some of the things that leapt out at me, as I observed Aaron’s question: First of all, seeing the risk, and sometimes being unable to see the risk of selling away private data to companies—so that’s one of the first things I observed—with the hoped-for payoff of “free stuff.” I also observed, there’s this word “value” in there, and how personal information, you know, why is that a valuable thing? I see this connection to a different government than what we live under. So I found that to be very interesting, to toy with this idea of governments, or “Who holds power?” And then we’re always tracing things back to the things of online living, or society. So TikTok, again, rears its ugly head. We’ve had several conversations that allude to TikTok.
And so as I’ve made these observations, I think, for me, in general—I am a generalist. I have a tendency to think about broad, sweeping ideas or principles that we can apply to a topic area, because sometimes I’ll get lost in the mix of the details. They’re overwhelming. They are varied, and they’re complex. So I’ll probably be taking that role of observing some of those broad ideas and principles. That’ll give us some of the structure for our conversation, while the connective tissue, the illustration, and more of those fine details will be refined somewhat by Jonathan’s study and thinking today—I’m not trying to put the full burden on Jonathan in that regard. So me, kind of the broad thinking, the discipleship principles—and I’ll share some of that with Jonathan, too, because he’s got wisdom to share in that area—and Jonathan will give us some connective tissue.
So I want to kind of think through some of these broad things that were brought to mind by this question. And I camped out for a while in value. What makes our data, our private information—what makes that valuable, so valuable that we would be careful about it? And so a general principle to me was to think of the value that is imbued into anything that is related to us as image-bearers. As image-bearers, we have high, high value. We were looked at by the Lord and named as “very good.” That’s a value statement. He looked at us, He looked at all of His creation, when the apex of His creation was in its fullness as man and woman, He looked at it and said, “It’s very good.” So by that nature, we have been named as valuable. It means that what we have in our persons, the way that our bodies are made—those are highly valuable. But then anything that’s an extension of ourselves is also, as I said, imbued with value. A photograph of us has high value. Even our email address, somehow, is imbued with value, because it’s connected to us as a person.
JONATHAN: And you’re talking about this objective value, which is something that I think, as Christians specifically, we can really plant a flag in and say, there are things that have value. If you don’t believe in a God who is good, and who fills the world with goodness, then value is only relative. It’s just, what can I get from it? Why is it valuable to me? And so we need to think a bit on that relative level to understand not just why is data objectively valuable, because it’s connected to image-bearers, but why is it valuable to these companies, or even governments. You mentioned TikTok, which is connected to China. And that is a very relative kind of utilitarian value, where it’s related to financial gain or power.
Transactional. And for company leaders, business cultures that don’t have a higher-order value, that don’t have a sense of the things that truly have value because God has given them value, all that’s left is this relative value of “What can we do with it? What can we gain with it? What power can we hold with it?” That’s a very human level of value with no reference to something transcendent.
I want to step us back just a little bit into the concrete realm, just to ask the basic question: What are we talking about when we talk about data? Because that is such a—I don’t know—it’s an abstract term. Like, we talk about data privacy all the time. But what actually comes into our heads when we think about that? For me, it’s like, I just take a moment to examine the mental picture I receive when I hear the words “data privacy,” and what I immediately envision is just a bunch of ones and zeros on a Matrix-style screen, right? And when that is our idea of data, it’s a lot harder to talk about these things, because they’re just so abstract and vague.
Maybe it would help us to guide our discussion if we just pull this back into what we’re actually talking about. Security.org—this was last updated in February 2023—did a survey on the different data that big tech companies have on you. And they kind of gave each company a letter grade.
Just for the purposes of understanding the types of data a company might collect, some of the things on this list: Personal Information is one of the categories they use. So that would be your name, phone number, your email address, videos and photos you’ve stored, emails you’ve written and received, comments you’ve made on YouTube. Unique Identifiers is another category: IP address—kind of that unique identity of the computer you’re using, your system activity, the websites you’ve visited, what type of device or browser you’re using, the operating system you’re using, the internet carrier you have. Activity is another category of data companies might collect. So this would be things you search for, videos you watch, ads you look at for a certain length of time or click on, the amount of time you spend on the internet, what third-party websites you visit, what numbers you call, what numbers you receive phone calls from and how long you talk to them. Another big category—this is a huge one—Location Information. If you have a device connected to a GPS, companies might log where you are, where that device is connecting, other things nearby, other Wi-Fi access points, other Bluetooth devices, maybe even other stores. And then also even just looking at publicly accessible sources—that’s the last category that Security.org uses, things they might find in local newspapers or other advertising sources.
So that’s a huge list of things, but just for the purpose of giving us an idea of, what are we talking about when we talk about data? And of course, it’s not just all these big disconnected things, but the full picture companies form by looking at all of this data, running it through algorithms, maybe using artificial intelligence, to create an accurate picture of you, the user.
KELSEY: So we’re thinking of all the what of data and starting to piece that into the why of its value. Just as you listed all of those things, I could immediately sense the value of, for example, the numbers that I’ve called, or where I’ve been, or maybe even where my children have been. That is valuable information. And when we connect that why with that what—when we think of the worldview aspects of why something is valuable, like you helped us to think through—some worldviews are not going to sense that the image-bearer themselves is valuable. And therefore all the things associated with them are just treated as something anybody can use, because they’re not seen as distinct persons with distinct needs, ways that they glorify the Lord, and things that they can do in this world that are intrinsic to themselves.
When you start extrapolating all those different areas, for me, as I look at it through the lens of this presupposition—that the person that data reflects is somebody of great worth—then I also have to think about, well, why would it be valuable to a company? So we’re asking more of those why and what questions. Why would that data become something that’s valuable to a company? And then another question would be, how can that data be used in an evil manner by somebody with nefarious intent? So then we’re starting to identify some of those risks, by presupposing that those things even have value and are worthy of care. So I would, I guess, want to go ahead and pose that question to you. What do we see there in terms of why a company would see that as valuable? And maybe, how could that be used in an evil manner?
JONATHAN: Yeah, those are such huge questions. And again, when Aaron wrote to us that, you know, he’s trying to explain to his 13-year-old son why this data privacy issue is important and coming up dry for a full explanation—I so relate to that. Because, yeah—“Data privacy, that’s important! Why?” I feel that so strongly. And the more I dig into it, though, the more I’ve been able to learn. I know that I’m just scratching the surface—again, that professional learner thing—but one of the resources I’ve been looking at is The Age of Surveillance Capitalism by Shoshana Zuboff. I don’t necessarily agree with all of her economics, but she does a really good job of defining this issue that has popped up really over the last few years. Honestly, it’s such a new thing that we are still kind of striving to describe it. And she does good work towards just finding new definitions. And one of the big things is that, again, this idea of relative value—companies want our data—it comes back to the age-old issues of power and wealth. You can sell data to advertisers, and they can use it to create better advertisements.
I mentioned in passing the idea of, you’re walking towards Barnes & Noble and a coupon for Books-A-Million pops up on your phone. You can see why that would be—your location data would be—valuable to the advertisers working for Books-A-Million, because they can steer you away from a competitor. Or they could target certain ads to you online based on your behavior. You will see different advertisements come up on your Facebook feed or in your Google searches. That’s why, if you and I look up the same thing on YouTube right now, we’re going to get different results. Because it’s feeding us things based on our behavioral data, so it can predict what we would most like to watch, and from that, target advertising that we’re more likely to interact with.
If you think about it from that idea of relative value—put yourself in the perspective of the stakeholder, in this case the advertiser or the company selling a product. Let’s say you’re selling shoes, right? I could send a commercial for shoes out into the void of public television. Everyone sees this advertisement, regardless of whether they just bought a new pair of shoes or whether they care about shoes at all. Or I could pay Google or Facebook to tell me who has been searching for shoes, who has been searching for insoles, or, you know, who has logged so many hours of running on their Fitbit. I could pay to get that information, so I can send my commercial to the people who are probably looking for shoes.
And so you can see how all these bits of data that might seem disconnected, from the things you’re searching for to the stuff you wear on your wrist or even the devices around your home that are connected to the internet—people call that the “Internet of Things,” this interconnected web of, your fridge can get on the internet, your TV can get on the internet—you can see all that comes together to create a picture that isn’t this full, good picture of the image of God that is imbued with value, but has traces of that value. But they’re just using it for the economic value of “What can we sell?”
KELSEY: It’s so good that you use that phrase, “traces of that value.” Everything that you’ve said has helped me to think about categories related to a human being—unpacking those categories of value through the perception of a company. You know, they’re looking at a human through that transactional or utilitarian economic filter, because they see the human being as an agent in their overall goals to build their company, to build wealth. The human agent is the thing around which all of that centers. It is a powerful agent. You used the word “power” at some point. Human beings as living, breathing, imagining, active agents in the world—they are a power center. Wealth centers around their activity. They are fascinating, and they are effective in the world. Data in and of itself is nothing without that power center, the one who can make decisions. And so the company uses that data to try to leverage those decisions toward its own goals of building itself, growing larger, and building wealth. So: human as power center, attractive to a company.
So we’ve kind of discussed that first question that I asked about, you know, what makes it attractive? You know, why is data something that is, you know, viewed as valuable? Why is a human being viewed as valuable by a company and what are ways that they seek to use data for their goals? We started moving towards the nefarious usages, or that which is risky, when you mentioned even the smart home. And I was thinking of an example that I heard of—this internet of things, where all these things are connected, our social media somehow also connected to our refrigerator. Don’t ask me how that works.
JONATHAN: There was this thing that went viral online a few years ago, this girl whose parents kept taking away her devices so she couldn’t post on Twitter. And like, it got to the point where she sent a tweet through her smart fridge.
KELSEY: Oh, that’s so hilarious. So yes, the amazing way that all of these things that are smart devices now connect. And you know, in some ways, we can outsmart, maybe, those who are not as adept at technology. I think about what you just gave as an example, that is hilarious, that a refrigerator would communicate to the outside world for this grounded individual. But there are these other ways that that smart and internet-connected set of devices can also be used in a punitive manner against others.
There was an article that came out recently about a man who was locked out of his smart home, because of an altercation that he had had with an Amazon employee. A complaint moved through to a complete lockout of all of the things that his house ran on, to the point where he could not even get into his house. I’m not going into all the details—we can link that article just for the fascination of reading it, knowing the possibilities, and letting that fuel our imagination for understanding some of the consequences of data being linked together and available to others. This is a great place, actually, for the developing learner to begin to employ their imagination: How could this be used toward benign ends, or just economic ends? Or how could this actually be used in a dangerous manner, in a destructive manner, to community or to the individual?
JONATHAN: Those somewhat benign ends, economic ends, are the draw of this use of data, because we like it when Google gives us what we want. When we had Collin Garbarino and Juliana Chan Erikson on the podcast—we need to have them back, because we reference this conversation like every other episode—I think it was Collin who shared, specifically, that Google gives us what we want. And we like that, because we want to type in a search query and get what we’re looking for. That’s how we expect the tool to work. And by using our data to predict what we’re looking for, Google gets better at that. So that’s what makes the forbidden fruit look so sweet. But when you’re talking about these nefarious purposes—you know, that Amazon smart home thing—that’s a great example of what I kind of think of as the “first bucket.”
What for me has helped me think through this issue—and hopefully this is helpful to other people, this is just the mental category I came up with out of the blue to help me sort through it—I think of two big buckets, the first bucket being what I have kind of called “lower order dangers” that we can really grasp, and the second bucket being “higher order dangers” that are a lot harder to understand.
So we’re coming back to Aaron’s issue of trying to explain this to a 13-year-old. And being locked out of your home or your fridge, because all that stuff is connected to your Amazon account—yeah, that’s a danger you can explain very easily to a teenager. And it’s very easy to grasp. The other things that I think are easy to grasp are just the ways companies might mishandle data.
I recently read a book called An Ugly Truth, about the security practices of Facebook over some of the really controversial election years we’ve just gone through. And there was an anecdote in this book about kind of that awkward stage where Facebook was expanding like wild. They were everywhere, billions of users. But internally, they didn’t quite yet understand their own power and growth. They were still operating like a small company. And what that led to was the creation of opportunities for bad actors to misuse all the information they had.

So there was a time at Facebook where they had access to the information of billions of people, and every engineer could access all that information, because the idea was, if they couldn’t, that would impede their work. And almost every month, they were having to fire engineers who misused that information. There was actually an engineer who went on a date for the first time with some girl and then read her messages and used her location data to figure out where she was, because she wouldn’t respond to his text messages. That’s scary stuff.

And that’s one of the more graspable dangers of what we’re doing with this data—the idea that companies might simply fail to handle it well. And that’s a mixture of plain old human fallibility—like not understanding how big your company is, not having good security practices—but then the actual threats, the actual people with ill intentions who want to exploit those vulnerabilities—people like that guy who was stalking the girl he’d gone on a date with, or, you know, the Russians looking to mess with American politics and even, in some instances, make us angrier at each other and increase division in our country. You know, that’s a whole other issue to dig into, but there are warehouses full of people in Russia whose day job is just to go hack things in America. And the more information we give to companies, the more there is for hackers to exploit.
KELSEY: We’re in that realm that, if we were following the Redemptive Narrative to talk about our own narrative today, to give us some of those chapters, we’ve talked about what’s valuable, which really relates to creation being named as good. Now we’re naming things that are broken. And I love the categories that you’ve given. They made me think of a couple of categories that are helpful even as we encourage that use of the imagination, which—I’ll bring some more questions out to help promote that type of learning. But here are some more categories to think of when we name the broken, and name how our brokenness can seek after opportunity, or view things as opportunity to serve self.
And I was thinking about crime. I love to listen to detective novels on audiobook. I love Sherlock Holmes. I’m into that stuff, so easily to mind came the three categories of detecting crime, of understanding who might have done it, “whodunnit.” So there’s means, motive, and opportunity. And so as I was thinking about data, and how we steward our data, I was thinking about—what does it look like for us to be careful not to provide opportunity? This engineer, or these many engineers in Facebook, they had opportunity. And because of their brokenness, they obviously had the motive to take that opportunity and to use it. They had the means for it. They had the skill to look at that data and to interpret it in a way that gave them what they wanted. And so means, motive, and opportunity for turning something that seems benign, or seems, you know—why would anybody even value this? Well, it’s the opportunity to use it in a way in service to self and in service to sin. So we’re naming the broken chapter here.
JONATHAN: Those broken ways people might exploit our information when we give it all to one company, or two or three companies that we’re putting so much trust in—those, again, are all things I put in that mental bucket of “graspable threats.” My oldest is seven, going on eight. I could probably explain to her the danger of somebody using information to track you down, or somebody locking you out of your house.
I think where this issue gets really hard to tangle with is when we get into this other category of the higher order, less graspable dangers. And this is what Shoshana Zuboff calls “surveillance capitalism”: the idea that we are entering an era where there is a massive market built on the sale of people’s data, and not only that, on influencing people’s behavior to act in ways that make them more likely to purchase certain products, whatever you can think of. Aaron, you mentioned how hard it is to explain this issue to your 13-year-old. I would say that’s because it’s hard to explain this issue to anyone. The more I read about it, the more I find that even experts are struggling to find the categories for this, struggling to find the language for this, because it’s a relatively new problem. We’ve talked about the general sin issues behind it—greed, desire for power, those idols—but their specific manifestation here is something that’s only possible in our digital age. And so we don’t have a lot of historical data to look at and figure out exactly what’s going on. Zuboff, in her book, calls this the “horseless carriage problem.” People who first encountered automobiles thought of them as horseless carriages, because they didn’t yet have a category for what a car was. And she says that’s kind of the way we are right now with this whole issue of big data and the way companies are using it. We’re still trying to explain it with old categories, and we don’t yet have all the language and all the mental furniture to totally make sense of what’s going on, and especially what the consequences will be down the road.
But to me, the biggest thing is that, not only are people collecting data on our behavior, but they’re using it to influence our behavior. And I think that’s the part of this that we can grasp and begin to wrestle with, because whenever somebody is influencing our behavior in ways that we’re not entirely aware of, you know, that’s something I think we need to be really careful with.
KELSEY: I was camping out in that word “influence” as well. That’s such a great thing to emphasize, not only in terms of others having an influence on us, but returning to that idea that we are active agents, that we are powerful agents in this world, made to be those who are influencers, those who—maybe it’s a tight community in which our influence lives, but our behaviors influence and shape things around us. When we are lackadaisical with our data, that influences an entire world. When we, as a community, view that as something that’s really not too great a thing—it’s okay for that to be out there for anybody to use—we’re actually, in our inaction or our lack of stewardship about that data, we’re actually having an influence on how companies work, or governments work.
It’s so important to root back again, for me, as I try to wrap my head around these things that are somewhat intangible—I have to have those guiding principles in mind, when I look at new details about this world. You have a great comment that you’ve made about there being nothing new under the Sun. It’s just a new version of exploring old patterns, I would say. And so I need to go back to those things that help guide the patterns of our behavior, as I look at new details—new details to add to old brokenness, or new details that add to a refreshed view of us as powerful agents in this world.
So how do we influence with our stewardship? How do we influence with our behaviors? How do we create a certain perspective in the world through our engagement of the other as somebody who is of high value? And so if I treat them as somebody of high value, then I’m also treating anything they do, anything they create, anything that belongs to them as something that is worthy of my respect, and of my care.
It’s so funny—my parents taught us a ton about manners when I was a kid. And we grew up in the South, as I think we’ve mentioned a little bit at least. There was a bunch of Southern influence. As I moved around the world, the South was still very much the framework through which we viewed manners. And the cultural South is all about “yes, ma’am” and “yes, sir.” But there’s a principle behind “yes, ma’am” and “yes, sir” that I had to learn to dig into deeply when I went over to Dublin, Ireland, and “yes, ma’am” and “yes, sir” were viewed as snarky or sarcastic. So there’s a principle that is at the heart of how I treat somebody with value, how my action engages, and how it also spurs them on towards not only treating me as someone of value, but of treating others around them. So how do we use our influence, as those who recognize that the person across from us is a reflection of the heavenly Father, and that puts us into a place of awe, and of care, and maybe even of standing on holy ground as it were, when we approach the other, or anything that is associated with the other? So I just was struck by that word “influence” and how we can influence.
JONATHAN: And I like that you’re acknowledging that influence is always going to be there, because when we think about a big tech company influencing our behavior, that sounds really scary, partly because I think our initial response is, nothing should influence my behavior. I should be able to make decisions freely, out of my own brain, without anybody, you know, poking around in there and changing the way I see the world. But there was this interesting article that you shared with me from The Conversation, on the subject of neurotechnology. That’s a whole other can of worms: the developing world of technology that directly interfaces with our brains, and whether someday this whole issue of data privacy will extend even to mental privacy, a right to our own brain activity. But something they bring out is this whole issue of thinking for yourself, and the idea of being influenced by others in our behavior. We are influenced by everything in our lives, from the communities we’re in, to the books we read, to the things we watch—all sorts of things influence us. The economic imperative of these companies using our data is that they want to take that influencing power and wield it for gain. In response to that, I think it’d be foolish to try to be uninfluenced. We have this modern, individualistic, make-your-own-truth sort of world, where we have this idea that, in a perfect world, we should be able to be influence-free, make all our own decisions, form all our own beliefs, test the data and see it for yourself—all those good modern values. But in reality, we’re always going to be shaped and influenced. That’s actually one of the core principles of discipleship—that we are shaped. And maybe—I’m going to float this out there—maybe when we look at these big companies trying to influence us, instead of aiming to not be influenced, we should instead aim to form intentional influences.
KELSEY: I absolutely agree. And I love that you brought the relational connection back to the fore here, because discipleship is, at its core, a relational construct, for lack of a better word. It is relational in its intent and in its outcomes, which means that, parent, you are influencing, training up someone who will also become someone who influences and trains up. It is a relational heritage. Discipleship is handed down from one to another, and always has future generations in its outlook. And it’s related to restoring the created order for relationships, where we recognize that we’re not just individualistic—we don’t get to just set our own truth. The truth has been set for us by a good authority. We’re learning how to live under that authority, not just thinking we can set our own rules. And maybe the things that we, as parents, are guarded about, the next generation might grow complacent about. It is for us to instill that guardedness. That’s something not to be blasé about: those things connected to self that also interact in community. And I appreciate what you said about being influenced by companies, in that connection to discipleship, because something is discipling us, not merely influencing, but setting the tone for what we think is true, and what we perceive as normal or good or broken. And so what are we allowing to disciple us? I think that was a really good connection.
I think this is a great place for us to add some more of those questions. There’s so much more detail to get into. And we hope that you will take that conversation further at home or in the classroom. So here’s some questions to further that discussion.
First: Learner, employ your imagination. What could someone do with your information, or with a whole bunch of people’s information?
How is my stewardship of my information also actually a way of loving my neighbor? We used the word “influence” today. But I think we could also reframe it: How is it love of neighbor to steward my information well? And a hint there is, you know, are we providing opportunity for others to succumb to temptation? So there’s a hint tucked into that question.
What is the result of failing to treat something with high value? And see if you can think of a practical example. How does it affect others’ thoughts, attitudes, and even actions when we model complacency or cavalier action towards our own or others’ possessions, whether that be material, digital, or intellectual?
And now apply those thoughts specifically to private data. Also, what are other things that we need to steward intentionally with a high view of their value, and as a part of our work as those who care for culture, those who steward culture?
JONATHAN: I want to speak briefly to our response to this issue, because that, I think, is one of the most confusing places for parents and teachers. Because we have all this information about what companies do with our data—okay, what do we do with that? So I’m saying all this, but I have a Facebook account. I use Google. You could go the extreme route of chopping off all your connections to these companies. Some people do. That’s not the choice for everyone. This is another thing we have to leave in your court, listener, to decide. How are you going to use these technologies? Some people might decide to go cold turkey. Others might just decide to create new parameters to make sure they’re using them with wisdom. It’s going to look different for everyone. Everyone’s going to make different choices. I think a big thing is to have, you know, grace for other people’s choices.
The one thing I would encourage in response is just starting with the awareness. I think it just goes a long way to understand the basics of how these companies work and assume that everything you’re seeing on the internet is monitored and curated. Assume it’s not private and assume that it is being generated for you by somebody. Because we often go on our Google searches, or our social media feeds, and think we’re getting a clearer picture of the world, where most often we’re getting a picture of the world that has been designed to appeal to us. And I think, just by understanding that, we can start to use these technologies with a lot more wisdom.
KELSEY: I think some good diagnostic questions might be helpful as we think about those things. You know: What are my motivations for either turning off all social media and technology, or for continuing to use it and trying to learn how to use it well? The question that is often in my mind is: Where am I looking for an escape? Where am I looking for things to be made easy? Some of those diagnostic questions pull out some of the idols—you know, the heart-level idols that we’ve talked about a number of times. If I’m trying to make life easier, if I’m trying to just turn off and not have to do the hard work of discernment, or of discipleship—those are red flags for me, that I’m engaging something not well, not as a steward, not as a powerful agent in the world.
JONATHAN: It’s ultimately an outpouring of this same kind of convenience idol that drives the use of these technologies. But we’re talking about being aware, but not being anxious. And that reminds me of the scripture that we have pulled for today’s episode.
KELSEY: So Luke 12, it has a number of things that I would want to encourage you to draw out, one of them being to not be anxious, because we are of so much greater value than the sparrows and the ravens that Jesus is mentioning in that passage. We’ve talked also before about Philippians 4, and that being anxious for nothing, but as we discern, you know, in everything by prayer and supplication and with thanksgiving, to seek the Lord’s help as we discern what is good, what is true, what is commendable, what is worthy of praise.
But the explicit verse that I want to share is also from Luke 12, that encourages us: “Stay dressed for action and keep your lamps burning, and be like men who are waiting for their master to come home from the wedding feast, so that they may open the door to him at once when he comes and knocks. Blessed are those servants whom the master finds awake when he comes.”
And I chose that because of the diligence, the way that we need to engage culture and current events with a sharpness of mind, a continued learning process, a willingness to learn about the Lord’s ways in His world and seek to be His powerful agents in it. He has equipped us for that work.
Why does our data matter? How do we explain it to our kids? Today we’re tackling a listener question on data privacy.
Check out The Concurrently Companion for this week’s downloadable episode guide including discussion questions and scripture for further study.
We would love to hear from you. You can send us a message at email@example.com. What current events or cultural issues are you wrestling through with your kids and teens? Let us know. We want to work through it with you.
See more from the News Coach, including episode transcripts.
Listen to our previous episode on social media, “TikTok influencers and the weight of glory.”
What information do tech companies collect from us? Read Security.org’s “The Data Big Tech Companies Have On You.”
Read about the man who was locked out of his smart home over a dispute with an Amazon driver.
Mental privacy? Learn about how neurotechnology might change the data privacy discourse.
Concurrently is produced by God’s WORLD News. We provide current events materials for kids and teens that show how God is working in the world. To learn more about God’s WORLD News and browse sample magazines, visit gwnews.com.
Today’s episode is sponsored by Covenant College.
Looking for an unapologetically Christian college experience? Pursuing knowledge transformed by faith, Covenant College prepares students for their callings and careers. Covenant is located on top of Lookout Mountain, Georgia, 20 minutes from Chattanooga, Tennessee. Students who visit are eligible to receive a grant of $1,200. More at Covenant.edu/world.
WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.