nextTalk

Legislative Work to Keep Kids Safe Online

Tori Hirsch, legal counsel for the National Center on Sexual Exploitation, joins us to discuss the Kids Online Safety Act (KOSA), image-based sexual abuse, child sexual abuse materials, sextortion and the legal ramifications of sharing nude photos.

KEEPING KIDS SAFE ONLINE

Connect with us...
www.nextTalk.org
Facebook
Instagram

Contact Us...
admin@nextTalk.org
P.O. BOX 160111 San Antonio, TX 78280

Speaker 1:

Welcome to the nextTalk podcast. We are passionate about keeping kids safe in an overexposed world. Today we're talking about the Kids Online Safety Act, known as KOSA, and we have brought in a special guest. Her name is Tori Hirsch. She's the legal counsel at the National Center on Sexual Exploitation. Tori, we're glad you're here. Please tell us a little bit about yourself.

Speaker 2:

Yeah, it's so good to be here. Thank you for having me. So I am legal counsel with the National Center on Sexual Exploitation, and we're a nonprofit, nonpartisan organization that is committed to using corporate advocacy, public policy and litigation to fight for a world free from sexual exploitation and abuse. So I'm just here to talk a bit about KOSA, which is a bill that I think most people have heard of at this point. It's very well known. I'll just talk about what it would mean for our kids and social media.

Speaker 1:

Absolutely. I think we see it in the news, we hear about it, and so I want to break it down real simple. What would it do if it passes? What would that mean for the protection of our kids?

Speaker 2:

Yeah, so just kind of generally, KOSA is a bill that targets design features of social media companies and other online platforms, including some video game systems and messaging apps, and requires them to simply design with the safety of minor users in mind. That's really what the bill sets out to do. It is longer than that, but that's the summary. Right now, KOSA is the most comprehensive child safety legislation pending in Congress, and it fundamentally shifts social media platforms' focus away from a duty of profit, of making money and getting more users onto their platforms, and makes them consider children's safety and design with their well-being in mind.

Speaker 1:

So what are some practical things that it would do? Like, say, I have a kid on Instagram and their account is private. How would I see these changes on their account if KOSA passed?

Speaker 2:

So, for one, it requires platforms to default to the highest level of safety for children. This will include disabling private messaging from unknown adults to minor accounts, which, I think we know, is a lot of times the mechanism by which sextortion schemes happen on Instagram. So adults are no longer going to have access to just reach out and message those accounts. Additionally, it's limiting design features, including infinite scrolling, things like auto-playing videos, rewards or incentives to keep users using the platform and keep them addicted, and other features that result in compulsive usage. So, essentially, tech is going to be required to slow the addiction that children have to their platforms. It's going to affect the algorithm, so it's going to disable algorithms that facilitate connections between predators and minor users.

Speaker 2:

And then, on the parental control side, like I said, safety settings are defaulted for minors. So this means private accounts by default, which is huge, and these can be changed or enabled with parental consent. So it kind of puts that back into the hands of the parents, which is extremely important, and it's going to also require easy access to parental tools. So, you know, making it easy to see the account settings their children are using, allowing them to restrict purchases on platforms, to view their screen time and usage, and to set limits on that, things like not being allowed to access the app during certain hours. And a good thing about KOSA is that it does apply to existing accounts as well, so it's not just going to be accounts that are created after KOSA is passed. It's going to be all accounts for minors.

Speaker 1:

And so they will use, I assume, that birth date that you have entered when you started that Instagram account. Is that correct?

Speaker 2:

That should be correct.

Speaker 1:

So, parents, I want you to take note of that, because if your kid has an account you don't know of, where they entered an incorrect birth date, that's going to be an issue. That's why the conversations at home with your kid about staying safe online are so important. But I love all the defaults. As you were running through that list, one of the things you said was that it's going to limit contact between unknown adult predators and child accounts, and I just kept thinking: why is that not the case right now?

Speaker 2:

That's the question we're asking too. I don't have a good answer. This is something that advocates in the space have been asking for, saying this simple fix is going to have meaningful ramifications. It's going to keep actual, preventable harms from occurring.

Speaker 1:

The two things that I really love: the default to the private account, that's something we've always advocated for, that if you have a minor on social media, they need to have a private account. But I also love the restricted hours. Is it called sleep mode or something like that? They referred to it that way, basically meaning social media is off during these hours.

Speaker 2:

Right, and I don't think KOSA particularly uses that language, but I know Instagram's new teen accounts, which you might be familiar with, were conveniently launched the day before KOSA was marked up in the House Energy and Commerce Committee. That's one of the things Instagram is now implementing. KOSA is getting at the design features, and that's a design feature. We know a lot of sextortion happens in the middle of the night, when kids are isolated and alone and on their phones, so if they're not on them during that time, hopefully things like that will cease to happen.

Speaker 1:

Predators are coming into the bedrooms at night through the phones, getting access to our kids and manipulating them, and so the more we can prevent that, the better. I know we advocate no phones in bedrooms at all at bedtime. That's one of our big nextTalk principles. Have you seen a lot of support? Tell us about that, because I think there's a lot of bipartisan support. Educate our audience on that.

Speaker 2:

Absolutely. This is an extremely bipartisan bill. It passed the Senate with 91 votes in favor, so if that doesn't tell you something about the bipartisan support, I don't know what will. And it has that support because I think people are understanding that this really is needed to protect children, and it's the only bill that would comprehensively regulate social media in this way. We haven't seen a comprehensive bill that addresses social media like this before.

Speaker 2:

And I think Congress has seen the harms. They've heard about the harms from parents and from children who have been impacted, and we're at a tipping point where it's finally time to take action and get this passed. Right now, where it's stuck is in the House. There's been a lot of misinformation spreading from big tech about how it would impact First Amendment rights, free speech, conservative, pro-life views, and at the same time, they're spreading messages to Democratic House members saying that it's going to prevent LGBT youth from having access to meaningful resources. Neither of those things is true. This bill gets at design features of platforms, not content. That's very explicit in the bill. So at this point, I think we can really point to big tech and say they don't want to be regulated in this way because they're not liable right now.

Speaker 1:

Well, if they're regulated, then there's liability that comes with that, correct? And they're also losing consumers, because they're losing the time they push for, keeping those kids on and scrolling.

Speaker 2:

And they're losing ad revenue. You know, one of the things the algorithm does is make probably every third or fourth video an ad. That's how they make money, and if they have fewer users on their platforms, they're not going to be able to sell that ad space.

Speaker 1:

Well, I want to go back to how you said the sleep mode was Instagram's. I'm glad you corrected me on that. When they rolled that out, I know some people were frustrated that they conveniently rolled out some new safeguards that were basically some of the things in KOSA. I actually thought it was almost a testament to all the advocacy work, because they know they're in trouble and they've got to do something, and that is all the hours of parents going and testifying about how their kids have been harmed. I mean, I know some of these families personally who have gone and testified and have lost their children to sextortion schemes on Instagram, and so I just think it's wonderful that we're able to use those voices for change, whether it's through Instagram rolling out new safeguards or KOSA. Both would be great. How hopeful are you that it's going to pass? And what do you think about the timeframe? Do you have any kind of idea of what we could expect?

Speaker 2:

So right now, the chance that KOSA has to pass is going to come during the lame duck session, which is later this year. If it passes, it'll pass during the lame duck.

Speaker 1:

So we should keep our eyes on that, and if it doesn't, then next year, but hopefully it's going to pass right after the election is what we would hope for. I just want to thank you for all the work that you're doing to help kids and families. I know at nextTalk we're really focused on parents being involved in their kids' lives and online, and there are millions of kids out there who don't have involved parents, and I feel like that's where KOSA and these new safeguards that Instagram is rolling out will really help protect these kids and put at least some safeguards in place for them.

Speaker 2:

You hit the nail right on the head. Tech has consistently put the responsibility and the duty on the parents, but parents can't do everything. They can't be looking over their kid's shoulder on their phones. These safety settings that exist right now are sometimes really hard to find, sometimes they're buried, and it is time for tech to take some responsibility and make sure kids are safe on their platforms.

Speaker 1:

It also goes to show, when kids open accounts, it's going to be really important going forward that they put their actual birth date in there so that the safeguards can be in place. So, parents out there listening, this matters. Your relationship with your kid matters. We don't want your kid lying to you. We want them learning the platform with you, and that's how they can be healthy and have all these KOSA safeguards in place.

Speaker 2:

You know, thank you to the parent advocates who have been really brave and willing to share their stories, their family stories. I can't even imagine the things they've been through, and I think it's absolutely meaningful for representatives and senators to hear those stories. Those are really what push change forward. So keep being brave, and thank you for using your voices to push for change.

Speaker 1:

So well said, and we know of families who have been able to save their kids because of an awareness story, like, oh, we've heard this story. These parents are sharing their stories through KOSA advocacy, putting a face to a tragedy, and people are actually understanding this is happening. One of the things that has shocked me is I just thought some of these sextortion schemes that were claiming the lives of these kids were so isolated, and the more podcasts we've done with these parents, the more people have come out of the woodwork to say this happened to my kid as well, which is crazy to me, that it's not so isolated anymore.

Speaker 2:

Sextortion is certainly on the rise. We've seen that reflected in statistics from the National Center for Missing and Exploited Children. They publish annual statistics about their reporting every year, and it's absolutely on the rise. We've actually seen really disturbing handbooks from the dark web about how to successfully run a sextortion scheme, with details down to how to ask certain questions or how to build a relationship. So I think it can't be overstated that these harms exist and are real and are out there. It's extremely important to be vigilant, and that's why we need these changes in place too, because as parents, I'm sure it's hard to know what your kid is up against.

Speaker 1:

Well, and I think most parents would agree: it is our responsibility, but we need help. We need help, and I think that's where KOSA comes in, helping parents navigate all of these online challenges that we're faced with.

Speaker 2:

Parents and tech for good is what we want to see.

Speaker 1:

So there's a term floating around that I've been seeing, and I want you to explain it to our listeners and to me: image-based sexual abuse, or IBSA. Tell us what that means.

Speaker 2:

Yeah, so, like you said, IBSA is what we call it in casual conversation, but it is essentially the sexual violation of a person committed through abuse, exploitation or weaponization of any image depicting that person. So it includes creation, distribution, theft, extortion or any use of an image for a sexual purpose without the meaningful consent of the person depicted.

Speaker 1:

Could this include nude photos that were willfully exchanged but are then being used for revenge?

Speaker 2:

Yes, yep, that's included in the definition.

Speaker 1:

And it would also include, obviously, any sextortion cases, any grooming cases, where kids are manipulated into sending nudes to a predator online, that sort of thing.

Speaker 2:

Yes, that's right.

Speaker 1:

So tell me also. We just did a show on AI deepfakes and nudes and how AI is changing the conversation about this. Can you speak to that?

Speaker 2:

AI is so prevalent in creating these deepfakes that there's also the chance that victims just don't care about it, or are less likely to be affected in all the negative ways, because it's so prevalent. That was something interesting I hadn't thought of before. But AI has had a profound effect on IBSA. It just means that anyone's at risk. It can put anyone at risk of having fake, sexually explicit deepfakes created of them, and because these AI tools are so new and because tech in this space is moving so quickly, the laws are still catching up to be able to address this and ensure that victims have accessible routes to justice when they've been violated in this way.

Speaker 1:

So what happens right now? Say somebody takes my image and creates a pornographic video of me using AI. What are the legal steps I could take right now? What would I do?

Speaker 2:

Yeah. So I'll just say it's different for if you're a minor versus if you're an adult. If you're an adult, it's currently not a federal crime to upload non-consensual, sexually explicit content to the internet, but there is a civil cause of action. So if you were to say, this image, this video was taken of me, I didn't consent to this, or a deepfake was created of me, that's not actually me, most websites have a way you can report that as violating their terms of service. Then, really, the only remedy at that point is to follow up with the website and hope it's removed or, in certain cases, file a civil lawsuit. But that's obviously not the best route for a lot of victims. It's not convenient, it's time-consuming, it's expensive and just isn't the most practical.

Speaker 1:

And there's research that has to happen to figure out who you're suing in that civil lawsuit, and so that's a whole challenge as well, correct?

Speaker 2:

Right, absolutely. A lot of times images are taken and uploaded, and they explode and go everywhere, and you can't find the person who created it in the first place, and then a lot of times the distribution is just so widespread it's hard to pinpoint anything.

Speaker 1:

That's so scary for people to think that I could have an AI-generated video uploaded to Pornhub, and I could try and contact them to tell them to take it down, it's not me, and I can try and civilly sue them. But in the meantime my life is kind of ruined, tainted, because if people saw that, they would think it was possibly me, because AI technology looks so real.

Speaker 2:

Right, absolutely, and the harms are the same too. Victims of deepfakes experience mental health issues. They experience depression, PTSD, traumatization, high levels of anxiety, fear that people are going to see the video and recognize them. It's oftentimes the same as if they were an actual victim of a sexual crime. I mean, this is a sexual crime, I should say.

Speaker 1:

Absolutely. Humiliating and traumatic for sure. I can totally see the PTSD happening when you're a victim of that. So what is being done at the federal level to pass a law for IBSA for adults? We'll get to the kid portion in a moment.

Speaker 2:

Yeah, so there are actually a number of really great bills, and some of them have passed the Senate. One of them is called the Take It Down Act, and it does criminalize the uploading of IBSA. So it makes that very explicit. And then it also requires platforms to have IBSA reporting mechanisms in place, something more narrow and specifically for IBSA, so you wouldn't just be reporting, this content is offensive; it's, I'm a victim of IBSA. And then it has requirements for the platform to take it down swiftly. That's kind of called the report-and-remove requirement of the Take It Down Act.

Speaker 2:

And then there's another bill called the DEFIANCE Act, and that creates a civil remedy for victims of a deepfake. So that one explicitly applies to deepfakes, although the argument has been made that IBSA already includes deepfakes in its definition, but that one makes it really explicit that it applies to deepfakes. And it also has privacy protections for victims in court during discovery, so their identities are more protected. The last one I'll mention is called the SHIELD Act, which also establishes federal criminal punishments for those who share explicit or private nude images without consent. That one fills in existing gaps, and it addresses sextortion scams as well. So there's kind of a patchwork of a lot of bills out there right now that address deepfakes specifically, and this would apply across the board, to adults as well as children.

Speaker 1:

You mentioned the Take It Down Act first. Is that the one that's gaining the most traction? You said it had already passed the Senate, is that right? And how hopeful are we that it's going to pass the House?

Speaker 2:

If I'm remembering correctly, I think the Take It Down Act hasn't garnered as much attention in the House. I can't speak to whether it's passed committee or not, but that one is the most prominent in the Senate, as it did pass there.

Speaker 1:

I think this is just a reminder that these laws are so good, but it takes so much time to get everybody moving in the same direction. I assume there's bipartisan support for this.

Speaker 2:

Yes, absolutely.

Speaker 1:

And anyone who is against this, are there reasons for that?

Speaker 2:

So anything that has to do with criminalizing, some offices just don't want to see. They don't want to support a bill that would put people in jail, regardless of what the crime is that's being committed. So it's more things like that. Also, in an election year, I think it's just very unpredictable with Congress to know what's going to happen, what's going to gain traction and what isn't. But one positive thing I will say is that 49 states have state laws that criminalize IBSA. Some even specifically mention deepfakes. So even though we don't have a specific federal bill addressing deepfakes, states have the ability to move very quickly to respond to these needs. So you're not completely out of luck. If you're the victim of a deepfake, almost every single state has laws to address that and give a remedy of some sort.

Speaker 1:

Okay, so it's really state law right now, and we're working on that federal law. So if you've been a victim, it's probably a call to your local non-emergency number, correct?

Speaker 2:

Yeah, absolutely. I don't have the provisions of all the states in front of me, but I think the majority of them are criminal statutes, so it'd be criminalizing the person who created it, uploaded it, distributed it, that sort of thing.

Speaker 1:

So criminally you could go the state law route, and civilly you could go the federal route right now, and then hopefully we're going to have a federal law criminalizing it nationwide. Okay, great. Now, we talked about image-based sexual abuse, IBSA, and that was really for anyone, adults included, but it's different when we're talking about child sexual abuse material. That is different because it involves a minor, so we're talking about anyone under 18 here. What is the federal law for that?

Speaker 2:

The federal law is very clear, and federally it's still called child pornography. But we don't use that term, because pornography implies consent of some sort, and if you're a minor you legally can't consent. So we call it CSAM to more accurately describe it. But technically, in federal law, it's still child pornography, and it's defined as any visual depiction of sexually explicit conduct involving a minor. Very simple. It includes photographs, videos, computer-generated images, deepfakes, adapted or modified images, taking someone's body and putting a different face on it, or taking someone's face and putting it on a different body. That is kind of the definition of CSAM. And what's criminalized is possession, production, distribution, reception, sharing across state lines, which includes posting on the internet, texting somebody a photo, things like that. So federal law for child sexual abuse material is quite black and white, I would say, and quite strong.

Speaker 1:

So we've got a minor, say a 16-year-old kid, who takes their own nude photo and shares it with a boyfriend or girlfriend. What could they be charged with for sharing their own nude photo?

Speaker 2:

Yeah, so legally that does still fall under the definition of child pornography, but again, you would have to have a prosecutor bringing that case against them, and that's not something I've seen in my work. So if you want to be technical, that image does constitute child pornography, but as far as whether they're going to be criminally charged, I can't really speak to that with any certainty.

Speaker 1:

Well, I think what we see is law enforcement's overwhelmed with online crimes, and so they have to take the most severe cases. When we've got a child missing because of an online crime, or deceased because of an online crime, that's going to take precedence over this, and just like schools and parents, we're all overwhelmed dealing with these digital crimes that we're seeing. Okay, so in this scenario, a kid shares a nude. Now the recipient of that airdrops it or posts it. Now we've moved into a different conversation, correct?

Speaker 2:

Yeah, so you've gone from, you know, just possession to distribution and reception. So that's also included under the statute.

Speaker 1:

And I think we see those crimes get prosecuted more than a minor who shares their own, even though that could be the case. We see that less and less.

Speaker 2:

Yeah, absolutely. Hopefully that wouldn't happen, but posting it draws more attention to it, and then more people can report it, maybe an investigation gets opened, things like that, which are more likely to happen than if it's just existing on someone's phone.

Speaker 1:

Yeah, and you even said that distribution across state lines applies if you post it on certain platforms.

Speaker 2:

Yeah, that deals a little bit with what's called interstate commerce, but it's basically commonly recognized that posting on social media crosses state lines, because me in one state and a person in another state will both have access to that material.

Speaker 1:

This has been so helpful. What else would you want to say to parents out there? You know, we deal with a lot of nude photos over at nextTalk, and a lot of them are between peers who are figuring out that this is not okay to do, but sometimes it's a predator, it's a grooming situation. Are there any tips or just practical advice that you would like to pass on to parents?

Speaker 2:

Obviously, just having open conversations with your kids about those harms. I think it never hurts to have the reminder that once you take a photo, it exists forever, even if you delete it, even if you send it on Snapchat to somebody. You don't know who else is in the room taking a photo of that phone or taking a video of it. So just be very mindful of the type of images you're sending. Not everyone who tries to friend you on social media is who they say they are, because a lot of the time, what we see with sextortion schemes is, oh, I'm a friend from your friend's school, we've never met, but do you want to talk on Snapchat? And then things escalate from there.

Speaker 1:

I'm really glad you said that. We see that a lot with the sextortion cases we work too, and predators have just gotten very smart about that. We talk a lot about online strangers with our kids, and so when these predators come in with, hey, I know so-and-so on your friends list, it automatically lowers their guard and makes it so easy for the predator to manipulate your kid, because they think, oh, this is not a stranger, this is somebody who knows so-and-so.

Speaker 2:

Right, exactly, and they've gotten, unfortunately, really creative. They've learned the lingo that kids use, they've learned how to converse about the things kids talk about, things that are popular with them. So it's very easy for them to disguise themselves as a kid or a friend when they're actually not.

Speaker 1:

You were saying that there's stuff on the dark web telling people how to do this, how to run a successful scheme.

Speaker 2:

Yeah, it's disturbing, and I think we've seen too that sextortion is unique because a lot of times it happens very quickly. In some cases I've heard of, it's even overnight. They friend the person and then the extortion happens hours later, and I think that's one of the reasons why it's so alarming and concerning: a lot of times it's not this slow grooming over time. Sometimes it is. But the sextortion handbook we saw on the dark web talked about that. It's, what are you trying to get out of your victim? Are you trying to make them a long-term thing, or just get a couple hundred dollars really quickly if they have it? So, yeah, these predators are really smart, and they know how to use the platforms very well. And, circling back, I think that's why we really need KOSA. It's going to prevent adults from friending kids on platforms like that.

Speaker 1:

That has been a shift I've seen in the last couple of years, actually the last three years: how fast the sextortion schemes are happening, just a couple of hours, and then we've got a deceased child, because there's been so much emotional blackmail and humiliation and these biological responses to what they're being shown on a screen. I can't imagine, as a young kid, being faced with that all alone in your room at night and feeling like your world is over, like everybody's going to see this video that was just recorded of me. That is what I am super passionate about: making sure parents are aware that this is happening, because it is not just one case here and one case there.

Speaker 2:

It's concerning and it's scary, and I think, too, kids need to know what to do in that situation. They need to know that their parent is a safe person they can go to, and not worry about what their peers are going to think if the extorter sends those photos out. It's hard. I mean, it happens at a vulnerable age for a lot of kids, when they're already feeling insecure or self-conscious, or they care a lot about what their friends and peers think of them. I think that's natural, but they need to know that law enforcement can get involved and there are ways to keep that from happening.

Speaker 1:

Yeah, and I think it's just so important, parents: tell your kids, even if you make a mistake on your phone, I'm here for you, I'm always your safe place. Don't ever keep sending, keep engaging. And one of the things I always tell parents: tell your kids, don't ever send money. If somebody online is demanding money from you, red flag alert, you've got to tell your parents right away, because it's only going to get worse. Anything else that you would say? So to wrap this up: IBSA really concerns adults. There's no federal criminalization law in place. We're working on it with the Take It Down Act and some other acts, but there is civil recourse, and there are state laws in place in most states. Correct? And then there's CSAM, which is child sexual abuse material, and that's anyone under 18. There are good federal laws in place for the possession and distribution of that, and so our kids need to be aware that there are legal consequences for sharing somebody's nude photo.

Speaker 2:

Yeah, a few things I'll just add. So the IBSA laws, or just IBSA generally, I should say, are a bit more broad, so IBSA encompasses CSAM. I think that's a good way to look at it. And then, like you said, CSAM applies to minors under 18. I would just add that there are civil remedies for CSAM. So even though it's criminalized, there are also federal civil remedies for victims of CSAM.

Speaker 1:

Okay, so criminal and civil remedies for CSAM victims. Okay, this is great. Tori, I am so thankful for your wisdom, for breaking it down in real simple terms for us and answering our questions. You're welcome back at nextTalk anytime. It's been a pleasure to talk to you.

Speaker 2:

Well, thank you so much for having me.