WEBVTT 00:00:17.960 --> 00:00:22.440 >>...safety for many of Snap's online safety strategy 00:00:22.440 --> 00:00:24.440 and engagement with external partners. 00:00:24.440 --> 00:00:25.720 We also welcome Emily Mulder, 00:00:25.720 --> 00:00:29.400 program director of the Family Online Safety Institute, 00:00:29.400 --> 00:00:31.040 which is an international organization 00:00:31.040 --> 00:00:33.160 working with parents, industry and communities 00:00:33.160 --> 00:00:36.000 to promote a culture of online responsibility 00:00:36.000 --> 00:00:39.200 and encourage a sense of digital citizenship for all. 00:00:39.200 --> 00:00:40.920 We also welcome Steffie Rapp, 00:00:40.920 --> 00:00:43.000 who is a juvenile justice specialist 00:00:43.000 --> 00:00:44.440 at the Office of Juvenile Justice 00:00:44.440 --> 00:00:45.800 and Delinquency Prevention 00:00:45.800 --> 00:00:47.800 at the U.S. Department of Justice. 00:00:47.800 --> 00:00:50.200 She leads the office's national initiatives 00:00:50.200 --> 00:00:52.840 in cyberbullying and youth violence prevention, 00:00:52.840 --> 00:00:54.800 which include the Internet Crimes 00:00:54.800 --> 00:00:56.600 Against Children Task Force Program 00:00:56.600 --> 00:00:58.440 and the Prevention of Youth Hate Crimes 00:00:58.440 --> 00:01:00.320 and Identity-Based Bullying Initiative, 00:01:00.320 --> 00:01:02.200 and she serves on the Federal Interagency 00:01:02.200 --> 00:01:05.280 StopBullying.gov Editorial Board. 00:01:05.280 --> 00:01:07.800 I'm also pleased to introduce Dr. Melissa Mercado, 00:01:07.800 --> 00:01:09.160 lead behavioral scientist 00:01:09.160 --> 00:01:10.840 at the Division of Violence Prevention, 00:01:10.840 --> 00:01:12.040 which is part of the Centers 00:01:12.040 --> 00:01:13.760 for Disease Control and Prevention 00:01:13.760 --> 00:01:16.680 within the U.S. Department of Health and Human Services. 00:01:16.680 --> 00:01:19.600 She is an expert in youth violence and bullying prevention 00:01:19.600 --> 00:01:21.360 and is also on the Federal Interagency 00:01:21.360 --> 00:01:24.120 StopBullying.gov Editorial Board. 00:01:24.120 --> 00:01:27.800 And we welcome Amelia Vance, who is vice president of Youth 00:01:27.800 --> 00:01:30.920 and Education Privacy at the Future of Privacy Forum 00:01:30.920 --> 00:01:33.720 and a leading expert on child and student privacy laws 00:01:33.720 --> 00:01:37.000 and best practices, and who oversees the Student Privacy Compass, 00:01:37.000 --> 00:01:39.080 which is a student privacy resource center 00:01:39.080 --> 00:01:41.800 hosted by the Future of Privacy Forum. 00:01:41.800 --> 00:01:44.400 And as the panel moderator today, 00:01:44.400 --> 00:01:45.920 please welcome Melanie Smith, 00:01:45.920 --> 00:01:47.880 the head of the Digital Analysis Unit 00:01:47.880 --> 00:01:50.320 at the Institute for Strategic Dialogue 00:01:50.320 --> 00:01:52.760 and the leader of a great team of analysts and researchers 00:01:52.760 --> 00:01:54.920 at ISD who combine large-scale 00:01:54.920 --> 00:01:56.760 social media data collection 00:01:56.760 --> 00:01:59.080 and advanced open source investigation techniques 00:01:59.080 --> 00:02:02.280 to study disinformation, hate and extremism. 00:02:02.280 --> 00:02:04.680 So thank you to all the experts here today, 00:02:04.680 --> 00:02:08.880 and Melanie, thank you so much, I will throw it over to you. 00:02:08.880 --> 00:02:10.400 >>Great. Thank you so much. 00:02:10.400 --> 00:02:14.400 Hello everybody. Very excited to get started. 
00:02:14.400 --> 00:02:16.320 I'm only going to take a few minutes 00:02:16.320 --> 00:02:18.760 because, as you just heard, our panelists 00:02:18.760 --> 00:02:21.760 are incredibly qualified to speak about these issues 00:02:21.760 --> 00:02:24.520 and you are here to hear from them. 00:02:24.520 --> 00:02:26.680 However, I will start by laying out 00:02:26.680 --> 00:02:28.680 where I think we're at with online harms 00:02:28.680 --> 00:02:29.920 and I'll touch upon 00:02:29.920 --> 00:02:31.840 where I think we're at with some of the approaches 00:02:31.840 --> 00:02:33.480 to mitigate some of these harms. 00:02:33.480 --> 00:02:36.600 And then we'll open up a little bit for discussion. 00:02:36.600 --> 00:02:39.680 So first, on the online threat picture we're looking at, 00:02:39.680 --> 00:02:42.000 I want to start by saying that I do think 00:02:42.000 --> 00:02:44.600 it's worth keeping in mind throughout this discussion 00:02:44.600 --> 00:02:47.400 that there is an incredibly positive side 00:02:47.400 --> 00:02:49.600 to the internet and social media. 00:02:49.600 --> 00:02:51.720 This might seem like an obvious thing to say, 00:02:51.720 --> 00:02:53.600 but I feel like at the moment, 00:02:53.600 --> 00:02:56.400 particularly, we don't hear about this all that much. 00:02:58.080 --> 00:02:59.720 So just bearing that in mind, 00:02:59.720 --> 00:03:02.680 and as many of you will know, the access to information, 00:03:02.680 --> 00:03:06.080 and now to school and broader learning, is of huge benefit 00:03:06.080 --> 00:03:08.840 to millions of children and adults. 00:03:08.840 --> 00:03:11.720 However, we really have the full spectrum of online 00:03:11.720 --> 00:03:14.200 harms to consider and discuss here today. 00:03:14.200 --> 00:03:16.560 So these range from exploitation, 00:03:16.560 --> 00:03:20.200 bullying and intimidation, access to inappropriate content 00:03:20.200 --> 00:03:22.920 and abuse of private data, to disinformation, 00:03:22.920 --> 00:03:27.120 extremism, hate speech and radicalization. 00:03:27.120 --> 00:03:30.120 We at ISD have been studying those last few 00:03:30.120 --> 00:03:32.840 in the online context for the last ten years, 00:03:32.840 --> 00:03:35.000 and while we don't simply look at the volume 00:03:35.000 --> 00:03:37.280 of that type of content over time, 00:03:37.280 --> 00:03:39.520 the threat presented by disinformation 00:03:39.520 --> 00:03:41.480 and extremist content on social media 00:03:41.480 --> 00:03:45.120 has become much more complex over the last decade. 00:03:45.120 --> 00:03:47.840 These threats are becoming increasingly hybrid, 00:03:47.840 --> 00:03:49.400 and that is particularly concerning 00:03:49.400 --> 00:03:51.840 in the context of the Covid-19 pandemic, 00:03:51.840 --> 00:03:54.560 where trust in institutions and authorities, 00:03:54.560 --> 00:03:58.000 as you know, is incredibly important. 00:03:58.000 --> 00:04:00.400 So as a note more broadly on Covid, 00:04:00.400 --> 00:04:01.760 because I think it's very important 00:04:01.760 --> 00:04:03.720 to the discussion that we're having today, 00:04:03.720 --> 00:04:07.600 I think we all know and feel the impacts of the pandemic 00:04:07.600 --> 00:04:10.600 on our relationship with technology and with devices, 00:04:10.600 --> 00:04:12.360 and here we all are on a Zoom call 00:04:12.360 --> 00:04:15.840 with hundreds of people as testament to that. 
00:04:15.840 --> 00:04:17.320 However, as some of our panelists 00:04:17.320 --> 00:04:19.200 will no doubt note throughout the discussion, 00:04:19.200 --> 00:04:22.240 it has had an outsized impact 00:04:22.240 --> 00:04:25.600 on children and young adults for a number of reasons. 00:04:25.600 --> 00:04:27.800 And unfortunately, in many online settings 00:04:27.800 --> 00:04:30.800 we see groups and movements looking to leverage both 00:04:30.800 --> 00:04:35.320 this kind of wider environment of uncertainty and fear 00:04:35.320 --> 00:04:39.040 and our increasing reliance on community spaces online 00:04:39.040 --> 00:04:41.040 for their own benefit. 00:04:41.040 --> 00:04:42.800 To give a few examples, a study 00:04:42.800 --> 00:04:45.320 that we published last month analyzed data 00:04:45.320 --> 00:04:48.960 from over 200 right-wing extremist channels on Telegram 00:04:48.960 --> 00:04:51.600 and found that many of the most egregious of those channels 00:04:51.600 --> 00:04:55.160 are successfully attracting new audiences by amplifying 00:04:55.160 --> 00:04:57.400 and spreading Covid-19 content, 00:04:57.400 --> 00:05:00.720 specifically misinformation about the virus. 00:05:00.720 --> 00:05:02.840 ISD's also produced extensive studies 00:05:02.840 --> 00:05:06.600 on the presentation of online harms within gaming communities, 00:05:06.600 --> 00:05:09.000 and these are focused around a set of research questions 00:05:09.000 --> 00:05:10.360 that include an assessment 00:05:10.360 --> 00:05:12.400 of the nature of extremist content 00:05:12.400 --> 00:05:14.880 on social gaming platforms like Discord, 00:05:14.880 --> 00:05:16.800 Twitch and DLive. 00:05:16.800 --> 00:05:19.480 We found a very strong extremist presence on Steam 00:05:19.480 --> 00:05:21.120 and DLive, in particular, 00:05:21.120 --> 00:05:24.280 with overtly racist and neo-Nazi content relatively easy 00:05:24.280 --> 00:05:27.240 to find, and influencers operating on those platforms 00:05:27.240 --> 00:05:29.000 with very large audiences. 00:05:29.000 --> 00:05:31.200 So that's just to get us thinking about some of the spaces 00:05:31.200 --> 00:05:32.520 that we're finding this content in 00:05:32.520 --> 00:05:34.800 before we open up the discussion. 00:05:34.800 --> 00:05:37.280 And in terms of where we're at with designing approaches 00:05:37.280 --> 00:05:41.400 to protect users from and prevent online harms, 00:05:41.400 --> 00:05:44.000 there is again a very broad spectrum of programming 00:05:44.000 --> 00:05:46.920 that spans from direct interventions with individuals 00:05:46.920 --> 00:05:50.360 that are deemed to be at risk of any of these threats 00:05:50.360 --> 00:05:53.400 to upstream educational programming, 00:05:53.400 --> 00:05:56.280 which tends to be built on the principles of digital literacy 00:05:56.280 --> 00:05:58.720 and digital citizenship. 00:05:58.720 --> 00:06:00.600 I'll leave the in-depth assessment 00:06:00.600 --> 00:06:02.440 of some of those approaches to the panelists 00:06:02.440 --> 00:06:04.240 and to my colleague, Jenny King, 00:06:04.240 --> 00:06:06.240 who's speaking on the next panel. 00:06:06.240 --> 00:06:08.040 But suffice it to say that we consider 00:06:08.040 --> 00:06:12.800 those programs vital, though we have seen their limitations 00:06:12.800 --> 00:06:14.800 where they're relied upon for prevention. 
00:06:14.800 --> 00:06:16.040 And what we found at ISD 00:06:16.040 --> 00:06:18.800 is that these upstream programs need to be complemented 00:06:18.800 --> 00:06:21.200 by some element of applied learning, 00:06:21.200 --> 00:06:22.720 and really that the best way to do this 00:06:22.720 --> 00:06:25.160 is often to have individuals exploring 00:06:25.160 --> 00:06:27.400 their own sense of civic action 00:06:27.400 --> 00:06:30.440 and making it a more personal endeavor. 00:06:30.440 --> 00:06:32.880 We think of these as what can be taught versus 00:06:32.880 --> 00:06:36.160 what can be facilitated and what can be encouraged. 00:06:36.160 --> 00:06:37.480 And I think one of the big questions 00:06:37.480 --> 00:06:39.440 that we'll want to get into with the panelists 00:06:39.440 --> 00:06:42.800 is how we encourage that broader cultural shift 00:06:42.800 --> 00:06:44.400 so that there is a different relationship 00:06:44.400 --> 00:06:47.800 between kids and young adults and social media. 00:06:47.800 --> 00:06:49.120 So leading on from that, 00:06:49.120 --> 00:06:51.760 I'll put a similar question to our panelists 00:06:51.760 --> 00:06:54.600 so that we can get started with the discussion, 00:06:54.600 --> 00:06:57.000 which is: what is the real relationship 00:06:57.000 --> 00:06:59.920 between social media and social connection, 00:06:59.920 --> 00:07:02.120 and how do you see that in your lines of work? 00:07:07.000 --> 00:07:08.560 >>Hi. 00:07:08.560 --> 00:07:09.560 >>Go ahead, Melissa. 00:07:12.000 --> 00:07:15.600 >>I can get us started. Thank you for that introduction, 00:07:15.600 --> 00:07:17.440 Melanie, and that background 00:07:17.440 --> 00:07:20.200 certainly got me thinking of a lot of things, 00:07:20.200 --> 00:07:23.400 and I was writing notes myself while you were talking. 00:07:23.400 --> 00:07:25.640 For many youth, and we're talking about youth and children, 00:07:25.640 --> 00:07:27.640 the online environment is a direct extension 00:07:27.640 --> 00:07:29.000 of their offline world. 00:07:29.000 --> 00:07:31.560 It's one world altogether. 00:07:31.560 --> 00:07:32.880 It is the new playground, 00:07:32.880 --> 00:07:34.160 it is the new learning ground, 00:07:34.160 --> 00:07:35.400 it's a new workplace, 00:07:35.400 --> 00:07:38.800 a place of worship and an entertainment outlet. 00:07:38.800 --> 00:07:41.800 So we need to understand that foremost 00:07:41.800 --> 00:07:43.840 to understand the impact and the relationship 00:07:43.840 --> 00:07:45.840 between social media and social connection. 00:07:45.840 --> 00:07:47.680 They're having relationships there. 00:07:47.680 --> 00:07:49.600 They're having connectedness there with others, 00:07:49.600 --> 00:07:51.200 which can be very protective, 00:07:51.200 --> 00:07:54.160 which can be very encouraging and can help them grow. 00:07:54.160 --> 00:07:56.200 They're also developing skills 00:07:56.200 --> 00:07:57.600 depending on the type of engagement 00:07:57.600 --> 00:07:59.720 that they're doing in these social media spheres. 00:07:59.720 --> 00:08:02.800 You mentioned gaming; in the gaming sphere 00:08:02.800 --> 00:08:06.080 there's a lot of collaboration that happens. 00:08:06.080 --> 00:08:08.040 And they're online; we need to acknowledge 00:08:08.040 --> 00:08:10.400 that it's no longer a question 00:08:10.400 --> 00:08:12.600 of which kids are online. 00:08:12.600 --> 00:08:14.440 We saw it even before the pandemic. 
00:08:14.440 --> 00:08:18.200 And now we've seen that online environments 00:08:18.200 --> 00:08:21.840 and social media became the platform for social connection, 00:08:21.840 --> 00:08:23.240 especially during those times 00:08:23.240 --> 00:08:26.600 where we had to socially distance or physically distance 00:08:26.600 --> 00:08:29.360 because of shelter-in-place or stay-at-home orders. 00:08:29.360 --> 00:08:32.560 It became the way in which our youth connected for learning, 00:08:32.560 --> 00:08:34.640 but also the way in which they connected with their loved ones, 00:08:34.640 --> 00:08:36.080 with their family and friends. 00:08:39.000 --> 00:08:42.080 >>Hey, Melissa. I can add to that. 00:08:42.080 --> 00:08:44.800 I've been looking 00:08:44.800 --> 00:08:48.000 at the developmental aspects of online learning. 00:08:48.000 --> 00:08:49.240 And young adolescents 00:08:49.240 --> 00:08:52.280 are starting to separate and individuate from their parents. 00:08:52.280 --> 00:08:54.400 So at this stage the peers who they hang out 00:08:54.400 --> 00:08:56.600 with become extremely influential. 00:08:56.600 --> 00:08:57.960 At the same time they're searching 00:08:57.960 --> 00:08:59.840 for a sense of identity and belonging. 00:08:59.840 --> 00:09:02.200 So if they spend a lot of time on technology 00:09:02.200 --> 00:09:04.360 they could be looking to fill that sense of identity 00:09:04.360 --> 00:09:05.920 and belonging online. 00:09:05.920 --> 00:09:08.400 If they find a positive, prosocial peer group online 00:09:08.400 --> 00:09:09.880 with kids who enjoy similar things, 00:09:09.880 --> 00:09:12.200 like Melissa's talking about gaming, 00:09:12.200 --> 00:09:14.240 talking, where they're supportive 00:09:14.240 --> 00:09:17.160 and there isn't cyberbullying involved, online activities 00:09:17.160 --> 00:09:19.600 can be a great resource 00:09:19.600 --> 00:09:23.600 and be developmentally and socially fulfilling and appropriate. 00:09:23.600 --> 00:09:26.800 For kids who live in geographically isolated areas, 00:09:26.800 --> 00:09:29.720 or like Melissa said during the pandemic, positive online 00:09:29.720 --> 00:09:32.200 social interactions can ward off loneliness. 00:09:32.200 --> 00:09:34.120 On the negative side of the issue, 00:09:34.120 --> 00:09:36.360 as we started to hear already, there are gangs, 00:09:36.360 --> 00:09:37.600 sexual predators and hate groups 00:09:37.600 --> 00:09:40.200 that are attempting to groom kids to exploit them, 00:09:40.200 --> 00:09:42.080 victimize them, join them, 00:09:42.080 --> 00:09:45.000 radicalize them or pressure them to commit criminal acts. 00:09:45.000 --> 00:09:48.280 Online predators often forge a fake social connection 00:09:48.280 --> 00:09:49.760 in order to gain the trust of the youth, 00:09:49.760 --> 00:09:52.000 so I think it's extremely important 00:09:52.000 --> 00:09:54.280 that parents start the conversation early with kids 00:09:54.280 --> 00:09:56.720 so they know how to navigate dangers and benefit 00:09:56.720 --> 00:09:59.320 from the positive aspects of online activity. 00:09:59.320 --> 00:10:01.800 And, of course, I think as parents or guardians 00:10:01.800 --> 00:10:04.040 we need to continuously, closely monitor 00:10:04.040 --> 00:10:05.400 who kids are communicating with, 00:10:05.400 --> 00:10:08.280 because we want them to have positive and healthy outcomes. 00:10:11.000 --> 00:10:12.440 >>I definitely agree with all of that. 00:10:12.440 --> 00:10:14.760 Good afternoon everybody. 
00:10:14.760 --> 00:10:19.200 I think Covid has put a really big spotlight 00:10:19.200 --> 00:10:22.520 on the entire issue of social media and connection, 00:10:22.520 --> 00:10:25.000 for all of the reasons that both of you just went over. 00:10:25.000 --> 00:10:26.360 I think of those kids who have, you know, 00:10:26.360 --> 00:10:28.360 been at home doing virtual learning, 00:10:28.360 --> 00:10:30.160 riding out this pandemic with their families. 00:10:30.160 --> 00:10:32.360 And, of course, they were already using digital spaces 00:10:32.360 --> 00:10:33.600 to come together, 00:10:33.600 --> 00:10:37.160 but especially now, not being able to do that in person. 00:10:37.160 --> 00:10:39.200 Anecdotally, just to go to your point, Melanie, 00:10:39.200 --> 00:10:42.080 about there being some aspects of positive use 00:10:42.080 --> 00:10:44.800 amidst some of the other things that we've heard, 00:10:44.800 --> 00:10:46.880 anecdotally we've heard about students going off 00:10:46.880 --> 00:10:50.600 and creating a Discord group on topics with their school class, 00:10:50.600 --> 00:10:53.120 but independent of an organized classroom activity, 00:10:53.120 --> 00:10:55.880 to sort of replicate that experience of a group 00:10:55.880 --> 00:10:58.640 or extracurricular space. 00:10:58.640 --> 00:10:59.680 I think there was also, 00:10:59.680 --> 00:11:01.760 there was a report in 2020 from Roblox 00:11:01.760 --> 00:11:03.560 that came out about midway through the year 00:11:03.560 --> 00:11:06.040 showing that kids were spending just as much 00:11:06.040 --> 00:11:09.880 if not more time with their real-life friends on online platforms 00:11:09.880 --> 00:11:12.560 and how important that was to just sort of mitigate loneliness 00:11:12.560 --> 00:11:13.800 and isolation, 00:11:13.800 --> 00:11:15.480 especially during those initial periods 00:11:15.480 --> 00:11:18.080 of really stringent lockdown. 00:11:18.080 --> 00:11:21.040 So I think it's not to say that everyone's behavior 00:11:21.040 --> 00:11:23.200 was 100 percent perfect during this time, 00:11:23.200 --> 00:11:25.520 because of course the uncertainty and the disruption 00:11:25.520 --> 00:11:27.960 that everyone experienced comes with a lot of highs and lows, 00:11:27.960 --> 00:11:30.360 but ultimately the presence of social media 00:11:30.360 --> 00:11:33.800 was really important for keeping an avenue open for connection 00:11:33.800 --> 00:11:35.400 when kids needed it the most. 00:11:40.400 --> 00:11:42.200 >>If I can jump in, 00:11:42.200 --> 00:11:45.600 I would just note something that Emily said with respect 00:11:45.600 --> 00:11:49.640 to real friends and communicating with real friends. 00:11:49.640 --> 00:11:50.920 I think it's important to note 00:11:50.920 --> 00:11:53.440 that not all of these social platforms 00:11:53.440 --> 00:11:55.400 are kind of created equally. 00:11:55.400 --> 00:11:58.080 So coming from Snapchat, I've only been here four months, 00:11:58.080 --> 00:11:59.840 but what I've learned in those four months 00:11:59.840 --> 00:12:02.400 is that Snap is very different. 00:12:02.400 --> 00:12:04.640 It self-describes as a camera company, 00:12:04.640 --> 00:12:07.000 and it's centered on and built around the camera 00:12:07.000 --> 00:12:10.680 that is on one's cell phone or one's other device. 
00:12:10.680 --> 00:12:14.240 When you open Snapchat it opens to the camera, 00:12:14.240 --> 00:12:17.320 so it's really designed for communications 00:12:17.320 --> 00:12:19.440 between and among close friends, 00:12:19.440 --> 00:12:21.080 meaning people that you know in real life. 00:12:21.080 --> 00:12:23.000 So it was very active during the pandemic 00:12:23.000 --> 00:12:25.520 and continues to be to this day. 00:12:25.520 --> 00:12:30.480 Not just anyone can contact you on Snapchat. 00:12:30.480 --> 00:12:32.160 First they need to know your username, 00:12:32.160 --> 00:12:34.880 and if they know your username and they reach out to you, 00:12:34.880 --> 00:12:38.200 you have to affirmatively accept that person as a friend. 00:12:38.200 --> 00:12:40.400 So there has to be mutual acceptance 00:12:40.400 --> 00:12:42.000 and reciprocated friendship 00:12:42.000 --> 00:12:45.600 before receiving photos and videos or text messages. 00:12:45.600 --> 00:12:49.560 And there are no public friend lists on Snapchat, 00:12:49.560 --> 00:12:52.360 nor are there any peer validation metrics, 00:12:52.360 --> 00:12:56.440 like Likes or Followers or public comments. 00:12:56.440 --> 00:12:59.480 Snapchat's probably best known for its ephemeral messages, 00:12:59.480 --> 00:13:02.880 which are really delete-by-default messages. 00:13:02.880 --> 00:13:05.200 The primary motivation for this feature is again 00:13:05.200 --> 00:13:08.000 to mimic real-life conversations. 00:13:08.000 --> 00:13:11.400 Again, going back to that real-life friends model, 00:13:11.400 --> 00:13:13.800 because in real life there are no verbatim transcripts 00:13:13.800 --> 00:13:16.200 of dialogues between and among friends. 00:13:16.200 --> 00:13:19.040 And to guard against things like bullying or hate speech 00:13:19.040 --> 00:13:20.640 or other harmful content, 00:13:20.640 --> 00:13:22.800 we have policies, we have product features, 00:13:22.800 --> 00:13:25.960 we have messaging and programs in place 00:13:25.960 --> 00:13:28.640 that prevent, detect, remove 00:13:28.640 --> 00:13:31.200 and report illegal and harmful activity. 00:13:33.760 --> 00:13:35.080 >>Thanks, Jacqueline. 00:13:35.080 --> 00:13:36.800 Amelia, did you want to come in on this 00:13:36.800 --> 00:13:39.560 from a privacy point of view? 00:13:39.560 --> 00:13:42.040 >>I don't know that I have too much to add. 00:13:42.040 --> 00:13:46.360 Everyone has put in so much already. 00:13:46.360 --> 00:13:50.200 I'll just say a part of privacy 00:13:50.200 --> 00:13:56.640 is the way that the internet has enabled more privacy. 00:13:56.640 --> 00:14:00.920 It used to be that teens, in particular, 00:14:00.920 --> 00:14:06.840 when they were searching for information on sexuality, 00:14:06.840 --> 00:14:10.440 on LGBT issues, on other things 00:14:10.440 --> 00:14:13.880 where they didn't feel safe going to their parents, 00:14:13.880 --> 00:14:15.440 where they didn't necessarily have peers, 00:14:15.440 --> 00:14:17.320 or their peers said things 00:14:17.320 --> 00:14:19.720 that weren't necessarily medically true, 00:14:22.520 --> 00:14:25.400 they didn't necessarily have the opportunities 00:14:25.400 --> 00:14:28.120 that they have now to seek out community. 00:14:28.120 --> 00:14:31.560 Of course, this comes with a downside: the internet 00:14:31.560 --> 00:14:35.240 has accurate and inaccurate answers. 
00:14:35.240 --> 00:14:41.720 But there's many ways that this has also enabled kids 00:14:41.720 --> 00:14:43.800 to find connection in a way 00:14:43.800 --> 00:14:46.200 that they did not have the opportunity to do 00:14:46.200 --> 00:14:49.800 maybe in a small community, 00:14:49.800 --> 00:14:55.320 in a community that was virulently homophobic, 00:14:55.320 --> 00:14:56.600 that sort of thing, 00:14:56.600 --> 00:15:02.440 and it's important to recognize the advantages 00:15:02.440 --> 00:15:07.400 that come with that versus the dearth of information 00:15:07.400 --> 00:15:10.000 that often existed prior to this. 00:15:12.240 --> 00:15:14.600 >>Yeah, I completely agree with you there. 00:15:14.600 --> 00:15:17.120 And I think I see a lot of nodding panelists, 00:15:17.120 --> 00:15:19.160 which is usually a good sign. 00:15:19.160 --> 00:15:20.800 We touched on this a little bit already, 00:15:20.800 --> 00:15:23.240 and Emily, your comment about the Discord server 00:15:23.240 --> 00:15:26.520 kind of replicating this extracurricular space 00:15:26.520 --> 00:15:30.760 really got me thinking about this, but do you have any reflections, 00:15:30.760 --> 00:15:33.800 any of you, on how that particular relationship 00:15:33.800 --> 00:15:35.600 and the forming of social connections, 00:15:35.600 --> 00:15:39.040 and how that's changed, has affected being a student 00:15:39.040 --> 00:15:41.000 and having relationships with your school 00:15:41.000 --> 00:15:42.280 and with your teachers? 00:15:44.800 --> 00:15:47.160 >>I can talk a little bit about what I was thinking about, 00:15:47.160 --> 00:15:53.240 which was the connection between cyberbullying in schools 00:15:53.240 --> 00:15:54.800 and sexting in schools. 00:15:54.800 --> 00:15:56.880 And with cyberbullying, 00:15:56.880 --> 00:16:00.560 cyberbullying can impact a student, you know, 24/7, 00:16:00.560 --> 00:16:02.200 and that's one of the reasons why it can be 00:16:02.200 --> 00:16:04.520 an extremely painful form of bullying. 00:16:04.520 --> 00:16:06.360 The targeted youth can't get relief 00:16:06.360 --> 00:16:08.520 unless they turn their phone off. 00:16:08.520 --> 00:16:11.280 And there's the ease with which bullying can be shared 00:16:11.280 --> 00:16:14.280 with a click of a button and spread. 00:16:14.280 --> 00:16:17.640 It's extremely damaging and can cause anxiety or depression, 00:16:17.640 --> 00:16:19.200 you know, for youth. 00:16:19.200 --> 00:16:21.880 A student at school may worry about what's being said 00:16:21.880 --> 00:16:24.160 or shared online, or the person 00:16:24.160 --> 00:16:26.280 that's cyberbullying them may be in their classes 00:16:26.280 --> 00:16:28.200 and they have to interact with them 00:16:28.200 --> 00:16:30.720 and know that the bullying's continuing online 00:16:30.720 --> 00:16:32.200 while they're in school with them. 00:16:32.200 --> 00:16:34.000 And that can be traumatizing. 00:16:34.000 --> 00:16:36.120 I think it's important that we involve teachers, 00:16:36.120 --> 00:16:37.400 schools and communities 00:16:37.400 --> 00:16:39.760 in the anti-bullying prevention strategies 00:16:39.760 --> 00:16:42.480 and emphasize a need for like a wraparound, holistic approach 00:16:42.480 --> 00:16:45.200 to show how deep the impact can be on youth 00:16:45.200 --> 00:16:47.760 and to have the best response. 00:16:47.760 --> 00:16:49.560 Schools should be prepared to link youth 00:16:49.560 --> 00:16:51.480 to mental health resources. 
00:16:51.480 --> 00:16:55.480 And just to touch on sexting for a second, 00:16:55.480 --> 00:16:59.000 you know, sexting is, it's illegal for youth 00:16:59.000 --> 00:17:01.640 and it's a problem that can spill over in schools. 00:17:01.640 --> 00:17:04.960 Because, you know, just to define it, 00:17:04.960 --> 00:17:07.360 sexting is sending and receiving sexual messages, 00:17:07.360 --> 00:17:09.800 photos or videos through technology. 00:17:09.800 --> 00:17:11.520 And kids in middle school and high school 00:17:11.520 --> 00:17:13.640 are starting to explore their sexual identity; 00:17:13.640 --> 00:17:15.600 that's part of adolescent development 00:17:15.600 --> 00:17:17.400 that we talked about earlier. 00:17:17.400 --> 00:17:20.000 And in the past, like when I was growing up, 00:17:20.000 --> 00:17:22.400 before cell phones and before the internet, 00:17:22.400 --> 00:17:27.400 maybe you go behind a tree and kiss somebody, 00:17:27.400 --> 00:17:30.520 and you would memorialize that moment, 00:17:30.520 --> 00:17:32.560 and that's how you started dating 00:17:32.560 --> 00:17:35.600 or exploring your sexual identity. 00:17:35.600 --> 00:17:39.000 But now technology's become such a part of our culture 00:17:39.000 --> 00:17:42.000 that kids sometimes think to infuse it in their romances. 00:17:42.000 --> 00:17:45.240 So teens may feel pressure from their significant 00:17:45.240 --> 00:17:47.720 other to send a nude photo of themselves, you know, 00:17:47.720 --> 00:17:49.640 and they think it's going to only be seen 00:17:49.640 --> 00:17:51.200 by their romantic partner, 00:17:51.200 --> 00:17:54.800 but that's not always the case, especially after kids break up, 00:17:54.800 --> 00:17:57.640 and then the next thing you know that picture is out there 00:17:57.640 --> 00:18:00.040 and somebody's upset that they got broken up with 00:18:00.040 --> 00:18:02.360 and that picture's been sent to everybody in school 00:18:02.360 --> 00:18:04.960 or anybody in the world. 00:18:04.960 --> 00:18:07.680 And it's extremely difficult, if not impossible, 00:18:07.680 --> 00:18:09.600 to get a picture, a naked picture 00:18:09.600 --> 00:18:12.680 or a picture you don't want out there, removed from technology. 00:18:12.680 --> 00:18:15.800 So sometimes kids, they don't think necessarily 00:18:15.800 --> 00:18:17.760 about long-term consequences, 00:18:17.760 --> 00:18:20.600 but parents need to help kids craft reasons ahead of time 00:18:20.600 --> 00:18:22.800 why they shouldn't send a nude picture 00:18:22.800 --> 00:18:25.280 if their boyfriend or girlfriend asks them to do so, 00:18:25.280 --> 00:18:27.480 and have the words already like in their head 00:18:27.480 --> 00:18:31.320 so that the peer pressure in the moment doesn't get to them. 00:18:31.320 --> 00:18:33.320 We want kids to feel safe online 00:18:33.320 --> 00:18:36.320 and we want school to stay a positive environment 00:18:36.320 --> 00:18:39.560 for good social interactions and not embarrassment, 00:18:39.560 --> 00:18:41.800 you know, shame, anxiety, or stress. 00:18:45.840 --> 00:18:48.640 >>I'm glad you started to tap into that, Steffie, 00:18:48.640 --> 00:18:53.040 because what happens online doesn't stay online. 00:18:53.040 --> 00:18:54.480 It really can shift offline 00:18:54.480 --> 00:18:59.000 and it affects kids in many different ways offline as well. 
00:18:59.000 --> 00:19:00.680 We're talking about school relationships; 00:19:00.680 --> 00:19:02.800 they're no longer confined to school grounds 00:19:02.800 --> 00:19:04.600 or to in-person teenage interactions, 00:19:04.600 --> 00:19:06.440 because of what Steffie was just mentioning 00:19:06.440 --> 00:19:07.800 and talking to us about, 00:19:07.800 --> 00:19:09.680 about social media, about the internet, 00:19:09.680 --> 00:19:11.640 or about how these relationships continue. 00:19:11.640 --> 00:19:13.440 And now that we're relying much more 00:19:13.440 --> 00:19:15.320 on digital learning platforms, 00:19:15.320 --> 00:19:18.000 that relationship is there too. 00:19:18.000 --> 00:19:20.960 We're not talking just about the traditional social media 00:19:20.960 --> 00:19:23.520 that we think about immediately when we mention it, 00:19:23.520 --> 00:19:24.840 we're also talking about gaming, 00:19:24.840 --> 00:19:26.960 we're talking about the virtual learning environment, 00:19:26.960 --> 00:19:29.680 we're talking about online communities and the sort. 00:19:29.680 --> 00:19:33.760 And, of course, there's good and bad in relationships, right? 00:19:33.760 --> 00:19:36.360 We can be connected. We talked about that already. 00:19:36.360 --> 00:19:38.280 We can learn. We can build skills. 00:19:38.280 --> 00:19:40.400 But there's also a variety of risks 00:19:40.400 --> 00:19:42.400 that affect our relationships. 00:19:42.400 --> 00:19:45.560 So for kids it can be cyberbullying, 00:19:45.560 --> 00:19:47.600 it can be posting hateful messages 00:19:47.600 --> 00:19:51.200 or constantly participating in negative group conversations. 00:19:51.200 --> 00:19:54.360 And we've seen it reported by UNICEF 00:19:54.360 --> 00:19:56.680 and others that increased online activity 00:19:56.680 --> 00:20:00.920 can also result in increased risk for online harm. 00:20:00.920 --> 00:20:04.400 The more you're exposed, the more at risk you are, 00:20:04.400 --> 00:20:07.080 and cyberbullying is one of those risks 00:20:07.080 --> 00:20:09.760 across a wide variety of platforms. 00:20:11.920 --> 00:20:12.760 >>I'm going to jump in. 00:20:12.760 --> 00:20:15.800 I would love if someone could explain to me, 00:20:15.800 --> 00:20:18.920 because this was a term that was relatively unfamiliar 00:20:18.920 --> 00:20:23.400 given that I don't work on kids and teens specifically, 00:20:23.400 --> 00:20:25.080 if someone could explain the relationship 00:20:25.080 --> 00:20:27.040 between digital drama 00:20:27.040 --> 00:20:29.840 and cyberbullying I'd really appreciate that, 00:20:29.840 --> 00:20:32.840 maybe just in the next round of answers. 00:20:32.840 --> 00:20:34.640 >>I'll jump in if I may. 00:20:36.000 --> 00:20:39.200 I look at sort of digital drama, 00:20:39.200 --> 00:20:40.800 I think we should simply call it drama, 00:20:40.800 --> 00:20:44.080 because as we know there's no separation in teens' minds 00:20:44.080 --> 00:20:45.440 between online and offline. 00:20:45.440 --> 00:20:48.440 We've been talking about that already this afternoon. 00:20:48.440 --> 00:20:52.840 But I see this drama piece as a bit of a sister issue 00:20:52.840 --> 00:20:55.400 to online bullying or cyberbullying. 
00:20:55.400 --> 00:20:57.360 And I think teens often refer 00:20:57.360 --> 00:20:59.880 to many of these behaviors as drama 00:20:59.880 --> 00:21:02.600 so as not to invoke the bullying moniker, 00:21:02.600 --> 00:21:05.160 which is all but guaranteed to spark parental 00:21:05.160 --> 00:21:07.120 or educator involvement. 00:21:07.120 --> 00:21:10.600 So I would look at it more akin to a petty quarrel 00:21:10.600 --> 00:21:11.880 or a disagreement, 00:21:11.880 --> 00:21:14.800 but like everything it can escalate. 00:21:14.800 --> 00:21:17.600 Then when I look at online bullying or cyberbullying, 00:21:17.600 --> 00:21:18.800 depending on the source, 00:21:18.800 --> 00:21:22.280 that has a very specific meaning and specific components. 00:21:22.280 --> 00:21:23.920 For the past decade or so 00:21:23.920 --> 00:21:27.000 I've defined it as the use of technology 00:21:27.000 --> 00:21:29.200 to demonstrate behavior, often repeated, 00:21:29.200 --> 00:21:31.000 though it doesn't have to be repeated, 00:21:31.000 --> 00:21:35.800 that teases, demeans or harasses someone less powerful. 00:21:35.800 --> 00:21:39.640 And online bullying can manifest itself in a variety of forms, 00:21:39.640 --> 00:21:42.000 as we've heard: sending hurtful messages, 00:21:42.000 --> 00:21:45.720 disclosing private information, excluding someone from a group, 00:21:45.720 --> 00:21:47.880 impersonating the target of the bullying, 00:21:47.880 --> 00:21:52.520 pretending to befriend someone and then betraying that trust. 00:21:52.520 --> 00:21:55.200 So that's how we look at those types of issues. 00:21:55.200 --> 00:21:58.360 I just wanted to raise one other point if I may in relation 00:21:58.360 --> 00:22:01.920 to what Steffie was saying about sexting. 00:22:01.920 --> 00:22:03.160 When we're looking at these issues 00:22:03.160 --> 00:22:05.440 we also have to look at the consequences 00:22:05.440 --> 00:22:07.160 and the pain that follow. 00:22:07.160 --> 00:22:10.440 One of the consequences, even when someone is sharing pictures 00:22:10.440 --> 00:22:15.400 of oneself of a sexual nature, something that is self-produced, 00:22:15.400 --> 00:22:16.920 is that the obligation is on us; 00:22:16.920 --> 00:22:19.040 it's a legal obligation on platforms 00:22:19.040 --> 00:22:21.560 that when we become aware of this kind of material 00:22:21.560 --> 00:22:22.760 on our platforms 00:22:22.760 --> 00:22:24.840 we are obligated to report it to the National Center 00:22:24.840 --> 00:22:26.800 for Missing and Exploited Children. 00:22:26.800 --> 00:22:28.320 So there are consequences. 00:22:28.320 --> 00:22:31.000 We have to report these things to authorities. 00:22:31.000 --> 00:22:33.000 I don't think that youth often realize 00:22:33.000 --> 00:22:36.080 that when they are in the moment, 00:22:36.080 --> 00:22:38.960 and I understand their self-exploration, 00:22:38.960 --> 00:22:40.680 I understand the experimentation, 00:22:40.680 --> 00:22:43.440 particularly in the 21st Century digital world, 00:22:43.440 --> 00:22:46.160 but I think if they understood the background 00:22:46.160 --> 00:22:47.800 and where some of these things could head 00:22:47.800 --> 00:22:50.320 they might think twice. 00:22:52.600 --> 00:22:55.520 >>And we can apply that also to bullying. 00:22:55.520 --> 00:22:58.200 Bullying is an adverse childhood experience. 00:22:58.200 --> 00:23:01.000 It is a form of youth violence. 
00:23:01.000 --> 00:23:02.000 And it can be manifested, 00:23:02.000 --> 00:23:04.600 unfortunately, through digital platforms 00:23:04.600 --> 00:23:07.280 as cyberbullying, as Jacqueline expressed. 00:23:07.280 --> 00:23:10.600 So we need to remember that, being a form of violence, 00:23:10.600 --> 00:23:12.680 different forms of violence are interconnected. 00:23:12.680 --> 00:23:14.760 So being exposed to and experiencing 00:23:14.760 --> 00:23:16.000 one form of violence 00:23:16.000 --> 00:23:19.480 can result in experiencing others across the lifespan. 00:23:19.480 --> 00:23:23.000 So it can have a long-term impact being exposed to 00:23:23.000 --> 00:23:26.520 and being involved in bullying and this type of violence, 00:23:26.520 --> 00:23:28.800 whether it is as the one being bullied, 00:23:28.800 --> 00:23:30.160 as the one bullying others, 00:23:30.160 --> 00:23:34.000 as the one bullying and being bullied, or as a witness. 00:23:34.000 --> 00:23:36.400 It's the moment where you decide 00:23:36.400 --> 00:23:37.880 whether you're going to be a bystander 00:23:37.880 --> 00:23:40.480 or an upstander to bullying. 00:23:40.480 --> 00:23:44.040 So, yeah, we need to be mindful about that and those long-term, 00:23:44.040 --> 00:23:48.920 lifelong consequences of bullying and these experiences 00:23:48.920 --> 00:23:50.520 that the kids are having online. 00:23:54.240 --> 00:23:56.840 >>I would echo all those really good points again. 00:23:56.840 --> 00:23:58.720 There's really just a lack of distinction 00:23:58.720 --> 00:24:02.000 between real-life drama and digital drama. 00:24:02.000 --> 00:24:03.440 You know, in the day of a teenager 00:24:03.440 --> 00:24:05.040 this is just something that's happening; 00:24:05.040 --> 00:24:08.240 it's not diminished in any way because it's happening online. 00:24:08.240 --> 00:24:10.400 And I think whether you're talking about sexting, 00:24:10.400 --> 00:24:12.640 cyberbullying, all the way up the chain 00:24:12.640 --> 00:24:15.200 to really destructive behavior, 00:24:15.200 --> 00:24:18.560 I think one thing that will be really interesting going forward 00:24:18.560 --> 00:24:22.120 is sort of the parenting approach. 00:24:22.120 --> 00:24:26.000 When we think about, you know, educating parents or educators 00:24:26.000 --> 00:24:29.800 or whoever we may be talking to, as millennials move 00:24:29.800 --> 00:24:32.240 into being sort of the predominant parent group 00:24:32.240 --> 00:24:34.560 they are going to relate more closely to kids 00:24:34.560 --> 00:24:37.040 on what the dynamic can be like online. 00:24:37.040 --> 00:24:40.440 For Baby Boomers and even Gen X parents not very long ago, 00:24:40.440 --> 00:24:43.760 I feel like when conflict would arise online, 00:24:43.760 --> 00:24:45.200 whether it was more trivial drama 00:24:45.200 --> 00:24:47.560 all the way up to really serious cyberbullying, 00:24:47.560 --> 00:24:50.440 the response was more likely to be seeing digital life 00:24:50.440 --> 00:24:53.400 as separate, and the inclination when things went wrong was to say, 00:24:53.400 --> 00:24:54.600 you know, just delete the app, 00:24:54.600 --> 00:24:56.040 or I'm taking your phone altogether, 00:24:56.040 --> 00:24:58.000 you know, it only causes problems. 00:24:58.000 --> 00:25:00.680 And the risk to that of course is that kids will stop sharing 00:25:00.680 --> 00:25:02.200 with their parents when something is wrong. 
00:25:02.200 --> 00:25:04.560 And I think that element of creating secrecy 00:25:04.560 --> 00:25:06.800 and driving bad behavior underground, 00:25:06.800 --> 00:25:10.000 whatever the behavior might be, is how conflicts of all kinds 00:25:10.000 --> 00:25:12.840 then sort of go into this place where they escalate. 00:25:12.840 --> 00:25:15.600 So one interesting thing certainly for us going forward 00:25:15.600 --> 00:25:17.280 is that with this new generation of parents, 00:25:17.280 --> 00:25:21.000 who have also grown up with a certain degree of awareness 00:25:21.000 --> 00:25:24.040 of what it was like to have technology from a young age, 00:25:24.040 --> 00:25:26.680 it will be interesting from a prevention standpoint 00:25:26.680 --> 00:25:29.480 how that understanding might translate into a parenting style 00:25:29.480 --> 00:25:31.320 that really emphasizes tackling online 00:25:31.320 --> 00:25:33.680 related relationships with the same baselines 00:25:33.680 --> 00:25:37.080 that they might use for offline ones. 00:25:37.080 --> 00:25:38.400 >>I'm just going to jump in 00:25:38.400 --> 00:25:41.360 and I'll ask a follow-up question there. 00:25:41.360 --> 00:25:42.640 Given that that's the case, 00:25:42.640 --> 00:25:44.680 so we're going to have this new generation of parents 00:25:44.680 --> 00:25:47.800 who are much more kind of knowledgeable 00:25:47.800 --> 00:25:51.120 about how technology fits in with our children's lives, 00:25:51.120 --> 00:25:54.440 are there educational approaches 00:25:54.440 --> 00:25:56.400 that need to be adjusted for that? 00:25:56.400 --> 00:25:58.200 So is the way that we teach parents 00:25:58.200 --> 00:25:59.480 to talk to their kids 00:25:59.480 --> 00:26:02.800 about some of these issues still going to be appropriate 00:26:02.800 --> 00:26:04.320 for this next generation of parents, 00:26:04.320 --> 00:26:08.520 and how can we kind of encourage that shift? 00:26:08.520 --> 00:26:11.800 >>Well, I think one thing, Melissa, that you just noted is, 00:26:11.800 --> 00:26:13.000 interestingly, 00:26:13.000 --> 00:26:16.200 this is based on a study from one year ago, in 2020, 00:26:16.200 --> 00:26:18.720 that we did where we evaluated parents' attitudes 00:26:18.720 --> 00:26:21.160 towards online safety tools, 00:26:21.160 --> 00:26:22.840 but the younger generation of parents 00:26:22.840 --> 00:26:24.440 actually had the most awareness of the fact 00:26:24.440 --> 00:26:26.600 that their child might actually be a bully, 00:26:26.600 --> 00:26:28.200 which relates to something that you just said, 00:26:28.200 --> 00:26:30.480 whereas the older generations tend to only look at it 00:26:30.480 --> 00:26:32.800 as this thing that happens to their kids. 00:26:32.800 --> 00:26:35.960 So I think one change in tone 00:26:35.960 --> 00:26:37.960 that would definitely be very important 00:26:37.960 --> 00:26:40.360 is looking at not just what you do 00:26:40.360 --> 00:26:43.240 when bullying happens to you, but how do you not be a bully? 00:26:43.240 --> 00:26:44.720 And I think the younger generation 00:26:44.720 --> 00:26:47.400 has a much more well-rounded understanding, 00:26:47.400 --> 00:26:49.600 because they've grown up maybe doing things 00:26:49.600 --> 00:26:51.000 when they're young that they wish 00:26:51.000 --> 00:26:52.600 they hadn't done, or witnessing things 00:26:52.600 --> 00:26:54.680 that they have been able to learn from firsthand 00:26:54.680 --> 00:26:56.320 and impart to their child. 
00:26:56.320 --> 00:26:59.840 So I just think it's a more well-rounded perspective. 00:26:59.840 --> 00:27:03.440 In terms of lessons, I think the only other thing is 00:27:03.440 --> 00:27:04.840 that even a young generation of parents 00:27:04.840 --> 00:27:06.200 is still going to feel overwhelmed. 00:27:06.200 --> 00:27:07.800 We hear that from parents no matter 00:27:07.800 --> 00:27:10.160 what generation they're from; they're still busy, 00:27:10.160 --> 00:27:12.800 they still have kids in the house of multiple ages 00:27:12.800 --> 00:27:15.160 that have different needs when it comes to technology use. 00:27:15.160 --> 00:27:19.400 So I think just the baseline that will always stay the same 00:27:19.400 --> 00:27:22.320 is having the open line of communication. 00:27:22.320 --> 00:27:23.880 Being a successful digital parent 00:27:23.880 --> 00:27:26.880 doesn't look like knowing better than your child 00:27:26.880 --> 00:27:29.280 how to use every single app and every single feature, 00:27:29.280 --> 00:27:31.720 it just means that if something goes wrong 00:27:31.720 --> 00:27:33.520 you know that your child will always come to you 00:27:33.520 --> 00:27:35.000 so that you can figure it out together. 00:27:35.000 --> 00:27:37.080 And that is something I don't think will change 00:27:37.080 --> 00:27:38.560 regardless of generation. 00:27:43.000 --> 00:27:44.280 >>Thanks. 00:27:44.280 --> 00:27:46.680 Does anyone else want to come in on that point 00:27:46.680 --> 00:27:50.320 before we start talking about private information 00:27:50.320 --> 00:27:54.120 and other aspects of online safety? 00:27:54.120 --> 00:28:00.200 >>I just would love to quickly emphasize how much of a difference 00:28:00.200 --> 00:28:07.080 it makes that this information is basically in a child's home, 00:28:07.080 --> 00:28:10.960 and the impact that that has on the well-being of kids: 00:28:10.960 --> 00:28:12.200 that it's following you, 00:28:12.200 --> 00:28:14.480 that it may not just be the people at your school 00:28:14.480 --> 00:28:19.200 but also people you don't know who are judging you 00:28:19.200 --> 00:28:21.920 or piling on or other things. 00:28:21.920 --> 00:28:24.040 And I don't always know, 00:28:24.040 --> 00:28:27.360 I think parents are often thinking back to bullying 00:28:27.360 --> 00:28:31.920 when they were in school, when, even for younger parents, 00:28:31.920 --> 00:28:36.320 there was a limited amount of social media, etc., available. 00:28:36.320 --> 00:28:39.320 I know there was already some cyberbullying 00:28:39.320 --> 00:28:42.440 happening on LiveJournal and MySpace, for example. 00:28:42.440 --> 00:28:45.920 But it's on a very different scale now 00:28:45.920 --> 00:28:49.880 and it's much more intrusive in the day-to-day life of kids. 00:28:49.880 --> 00:28:51.800 And so as we talk about impact, 00:28:51.800 --> 00:28:55.320 as we talk about looking at all of this, 00:28:55.320 --> 00:28:59.200 I think that really needs to be stated again and again. 00:29:01.600 --> 00:29:03.720 >>Melanie, can I also mention a little bit, 00:29:03.720 --> 00:29:08.360 before we move to privacy, about digital hate speech? 00:29:08.360 --> 00:29:09.560 >>Sure, yeah. 00:29:09.560 --> 00:29:11.600 >>Okay. Because OJJDP's 00:29:11.600 --> 00:29:15.600 working on combating youth hate crimes, 00:29:15.600 --> 00:29:18.800 so I just wanted to share that. 
00:29:18.800 --> 00:29:21.040 Just to give a little definition 00:29:21.040 --> 00:29:24.080 of online hate speech, you've already talked about it some, 00:29:24.080 --> 00:29:25.840 but it takes place with the purpose 00:29:25.840 --> 00:29:28.680 of attacking a person or group based on their race, religion, 00:29:28.680 --> 00:29:31.680 ethnic origin, sexual orientation, 00:29:31.680 --> 00:29:33.000 disability or gender. 00:29:33.000 --> 00:29:35.280 And unfortunately, as you were saying, 00:29:35.280 --> 00:29:38.200 kids are becoming more and more the recipients 00:29:38.200 --> 00:29:40.120 of that type of speech. 00:29:40.120 --> 00:29:43.280 And you mentioned a few of the mainstream social media sites 00:29:43.280 --> 00:29:47.000 used by extremist groups to radicalize or recruit youth. 00:29:47.000 --> 00:29:51.400 And I found similar ones: Instagram, TikTok, Spotify. 00:29:51.400 --> 00:29:54.200 Anyways, these hate groups are, you know, 00:29:54.200 --> 00:29:56.920 trying to reach kids also on gaming systems, 00:29:56.920 --> 00:29:58.240 Roblox, Minecraft, 00:29:58.240 --> 00:30:01.480 we talked about Twitch, Fortnite, you know. 00:30:01.480 --> 00:30:05.080 What I found is that 33 percent of youth surveyed 00:30:05.080 --> 00:30:06.600 between eighth and twelfth grade 00:30:06.600 --> 00:30:08.760 reported seeing hateful content online. 00:30:08.760 --> 00:30:11.000 That was from the Simon Wiesenthal Center. 00:30:11.000 --> 00:30:13.280 And I just feel like we really need to beef up 00:30:13.280 --> 00:30:15.640 our anti-bias education, 00:30:15.640 --> 00:30:18.960 teach them how to identify and respond to hate speech 00:30:18.960 --> 00:30:21.680 and also, you know, help them be able to report it. 00:30:21.680 --> 00:30:23.720 But real quick, the Office of Juvenile Justice 00:30:23.720 --> 00:30:25.000 and Delinquency Prevention 00:30:25.000 --> 00:30:27.560 announced this national initiative in October. 00:30:27.560 --> 00:30:29.000 And we're engaging youth, 00:30:29.000 --> 00:30:31.560 identifying evidence-based strategies, 00:30:31.560 --> 00:30:34.680 and we had a two-day symposium, and I gave the link for you 00:30:34.680 --> 00:30:36.160 to be able to see all the panels, 00:30:36.160 --> 00:30:38.480 because they were very interesting and helpful. 00:30:38.480 --> 00:30:40.320 And we're doing a 12-part webinar series, 00:30:40.320 --> 00:30:41.960 which is also on that link, 00:30:41.960 --> 00:30:44.560 which talks about combating identity-based bullying, 00:30:44.560 --> 00:30:46.400 youth hate crimes, hate speech, 00:30:46.400 --> 00:30:49.040 hate groups and strategies to prevent hate crimes 00:30:49.040 --> 00:30:51.640 and mitigating microaggressions and implicit bias. 00:30:51.640 --> 00:30:53.880 So just a little plug. 00:30:53.880 --> 00:30:56.800 If people are interested they can join in; 00:30:56.800 --> 00:31:00.280 we're only on our third webinar. 00:31:00.280 --> 00:31:02.880 So that's all I wanted to say about that. 00:31:02.880 --> 00:31:04.400 Thanks. 00:31:04.400 --> 00:31:06.560 >>Thanks. We appreciate you sharing the resources. 00:31:06.560 --> 00:31:08.200 That's great. 00:31:08.200 --> 00:31:11.040 Melissa, did you want to comment on this last piece? 00:31:11.040 --> 00:31:12.040 >>Yes. 
00:31:13.400 --> 00:31:16.480 Well, I keep on thinking as a parent, 00:31:16.480 --> 00:31:19.800 and I'm assuming there are a lot of parents in our audience, 00:31:19.800 --> 00:31:22.240 that this might feel a little bit overwhelming, 00:31:22.240 --> 00:31:27.000 you know, what are our kids being exposed to online 00:31:27.000 --> 00:31:32.520 that we are not physically able to maybe see or understand? 00:31:32.520 --> 00:31:35.600 And I just wanted to see if we could break out the point 00:31:35.600 --> 00:31:38.280 that there are ways that parents can get help 00:31:38.280 --> 00:31:40.200 with that. At the beginning, 00:31:40.200 --> 00:31:42.480 in the introduction, it was mentioned that Steffie and I 00:31:42.480 --> 00:31:45.200 are members of the StopBullying.gov Editorial Board. 00:31:45.200 --> 00:31:49.000 I would encourage parents to visit StopBullying.gov. 00:31:49.000 --> 00:31:51.200 There are sections there specific to bullying, 00:31:51.200 --> 00:31:52.560 to cyberbullying. 00:31:52.560 --> 00:31:55.560 There is information there for parents and tips 00:31:55.560 --> 00:31:59.000 on what parents can do to help protect children 00:31:59.000 --> 00:32:01.600 from harmful digital behavior. 00:32:01.600 --> 00:32:05.360 So there's resources. There's help out there. 00:32:05.360 --> 00:32:09.000 There's a wide variety of online communities 00:32:09.000 --> 00:32:13.600 and environments that emerge every day. 00:32:13.600 --> 00:32:16.360 And keeping track can be hard, can be difficult. 00:32:16.360 --> 00:32:19.040 But there's a little bit of help for that on StopBullying.gov. 00:32:19.040 --> 00:32:22.200 And also, regarding the gaming sphere, 00:32:22.200 --> 00:32:24.800 sometimes it's just a matter of sitting down with your child 00:32:24.800 --> 00:32:27.800 and trying to play a game or two 00:32:27.800 --> 00:32:30.520 to try to understand that environment. 00:32:30.520 --> 00:32:32.480 It's one of the different suggestions 00:32:32.480 --> 00:32:36.360 that are out there that we can look into. 00:32:36.360 --> 00:32:39.480 >>And Melanie, if I may, I think it bears repeating 00:32:39.480 --> 00:32:43.040 that we need an empowering message for parents. 00:32:43.040 --> 00:32:45.880 You know, it's not just about technology, 00:32:45.880 --> 00:32:47.120 and they're overwhelmed; 00:32:47.120 --> 00:32:49.640 until we narrow that generation gap 00:32:49.640 --> 00:32:53.920 that Emily was speaking of, we're still going to have this place 00:32:53.920 --> 00:32:55.240 where parents feel overwhelmed. 00:32:55.240 --> 00:32:57.960 Technology is so sophisticated. 00:32:57.960 --> 00:32:59.960 The kids are the intelligent ones here. 00:32:59.960 --> 00:33:01.760 They understand and are very astute 00:33:01.760 --> 00:33:03.560 when it comes to the technology, 00:33:03.560 --> 00:33:07.080 whereas for the parents, we have to give them an empowering message. 00:33:07.080 --> 00:33:10.000 And that empowering message is they can work 00:33:10.000 --> 00:33:12.960 with the youth to identify risk, 00:33:12.960 --> 00:33:15.000 because the kids might have the intelligence 00:33:15.000 --> 00:33:17.000 but the parents have the wisdom. 00:33:17.000 --> 00:33:19.080 So they can help the kids identify risk. 00:33:19.080 --> 00:33:20.360 Who is that person you're talking to? 00:33:20.360 --> 00:33:21.640 Do I know that person? 00:33:21.640 --> 00:33:23.400 What is it, that website you're going to? 00:33:23.400 --> 00:33:24.760 Let's talk about that. 
00:33:24.760 --> 00:33:26.480 Just to be able to have that dialogue 00:33:26.480 --> 00:33:30.880 and help youth identify risk and potential threats 00:33:30.880 --> 00:33:33.600 or dangers, that can mean so much. 00:33:33.600 --> 00:33:35.400 And that doesn't just have to be a parent; 00:33:35.400 --> 00:33:36.720 that can be an educator. 00:33:36.720 --> 00:33:38.560 That can be another trusted adult, 00:33:38.560 --> 00:33:41.200 a member of the clergy, a counselor, a coach. 00:33:41.200 --> 00:33:42.480 There's a lot of people that play 00:33:42.480 --> 00:33:45.760 that trusted adult role in youths' lives 00:33:45.760 --> 00:33:48.000 that can help have those conversations. 00:33:49.560 --> 00:33:51.600 >>And we can set the example too. 00:33:51.600 --> 00:33:55.520 I believe the Entertainment Software Association released data from 2021 00:33:55.520 --> 00:33:58.560 saying that 20 percent of all U.S. video game players 00:33:58.560 --> 00:33:59.800 are children. 00:33:59.800 --> 00:34:02.400 The other 80 percent are adults, right? 00:34:02.400 --> 00:34:04.160 When we're talking about that specific type 00:34:04.160 --> 00:34:05.360 of online environment, 00:34:05.360 --> 00:34:08.960 we can model that behavior as well and empower our kids 00:34:08.960 --> 00:34:12.360 through examples for them to follow, 00:34:12.360 --> 00:34:16.120 to be good digital citizens and protect themselves and others. 00:34:18.360 --> 00:34:20.520 >>Completely. 00:34:20.520 --> 00:34:24.600 We've touched upon this a few times throughout, 00:34:24.600 --> 00:34:27.720 with the examples given of sexting, 00:34:27.720 --> 00:34:31.040 but I'm curious if we can hear from a few of you 00:34:31.040 --> 00:34:34.960 on what the implications of sharing private information are, 00:34:34.960 --> 00:34:39.080 maybe aside from the consequences 00:34:39.080 --> 00:34:43.400 of sharing inappropriate content. 00:34:43.400 --> 00:34:47.960 So aside from the obvious, what are the other implications 00:34:47.960 --> 00:34:51.680 of sharing that information, and where does it get stored, 00:34:51.680 --> 00:34:55.760 where does it go, what are the consequences? 00:34:55.760 --> 00:35:00.840 >>I think I can start there as the designated privacy person. 00:35:00.840 --> 00:35:02.800 I think there's a couple of risks here. 00:35:02.800 --> 00:35:07.280 First of all, the sharing of the information in the first place, 00:35:07.280 --> 00:35:10.960 but also not feeling like you can share that information. 00:35:10.960 --> 00:35:15.240 And I'll get to that point maybe last. 00:35:15.240 --> 00:35:20.160 So when information is put out online, 00:35:20.160 --> 00:35:23.160 I think there's often a belief from kids 00:35:23.160 --> 00:35:27.680 that if it's not directly connected to them, 00:35:27.680 --> 00:35:31.480 if it's not a specific name, if it's a username, 00:35:31.480 --> 00:35:32.720 that they won't be found. 00:35:32.720 --> 00:35:37.800 And, of course, anyone whose mom read them the forwarded-email 00:35:37.800 --> 00:35:42.640 urban horror stories of how random stalkers found kids 00:35:42.640 --> 00:35:45.520 through the name of their softball team 00:35:45.520 --> 00:35:51.200 is aware that it really is easy to trace back who someone is. 00:35:51.200 --> 00:35:54.240 But making sure that that is clear, 00:35:54.240 --> 00:35:57.000 that it is part of what is taught in schools, 00:35:57.000 --> 00:35:58.840 what is made available to parents to share 00:35:58.840 --> 00:36:01.680 with their kids, is so important. 
00:36:01.680 --> 00:36:05.480 And the fact that even innocuous information 00:36:05.480 --> 00:36:09.200 can be dangerous in certain circumstances. 00:36:09.200 --> 00:36:13.640 So we often hear from schools, for example, 00:36:13.640 --> 00:36:16.760 well, you know, phone numbers are often publicly available, 00:36:16.760 --> 00:36:19.200 it's not a big deal, 00:36:19.200 --> 00:36:22.200 but you unfortunately have people in the world 00:36:22.200 --> 00:36:25.960 who, you know, in one case you had hackers 00:36:25.960 --> 00:36:27.240 who thought it would be fun 00:36:27.240 --> 00:36:30.240 to call up parents' and students' telephones 00:36:30.240 --> 00:36:32.400 and leave death threats because it was funny. 00:36:34.560 --> 00:36:37.320 So even the things that we don't necessarily 00:36:37.320 --> 00:36:40.200 think of as personal information, 00:36:40.200 --> 00:36:44.160 even things that, you know, 00:36:44.160 --> 00:36:47.800 don't lead to a physical safety threat 00:36:47.800 --> 00:36:52.760 or even necessarily sort of a mental health aspect, 00:36:52.760 --> 00:36:56.200 can be used in ways that are unanticipated. 00:36:56.200 --> 00:37:00.600 And a lot of this is not in what I think is really prevalent 00:37:00.600 --> 00:37:04.520 and available right now in the U.S. 00:37:04.520 --> 00:37:08.760 in terms of digital citizenship curriculum examples; 00:37:08.760 --> 00:37:10.480 there are a lot of good ones 00:37:10.480 --> 00:37:14.600 from Common Sense Media and others. 00:37:14.600 --> 00:37:19.200 But when we talk about privacy it's often so narrowly construed 00:37:19.200 --> 00:37:23.400 as, you know, just don't post your pictures online, 00:37:23.400 --> 00:37:25.920 don't give your name, 00:37:25.920 --> 00:37:28.680 and there isn't necessarily a broader understanding 00:37:28.680 --> 00:37:33.200 that, as I said, even innocuous information can be an issue, 00:37:33.200 --> 00:37:35.760 but also of how information is being collected 00:37:35.760 --> 00:37:40.080 about you generally, you know, by your school, 00:37:40.080 --> 00:37:45.880 by companies, and may feed into the ads you see 00:37:45.880 --> 00:37:51.200 or how information might impact you in the future. 00:37:51.200 --> 00:37:56.280 I think we often wait too long to talk to kids about, well, 00:37:56.280 --> 00:37:59.440 did you know that a student posted this picture 00:37:59.440 --> 00:38:02.160 and it ended up getting their acceptance 00:38:02.160 --> 00:38:03.400 to this college revoked? 00:38:03.400 --> 00:38:07.520 Because kids aren't starting this in high school, 00:38:07.520 --> 00:38:10.200 they have access to devices earlier and earlier, 00:38:10.200 --> 00:38:11.600 and they need to know well 00:38:11.600 --> 00:38:14.840 before that that there are going to be consequences, 00:38:14.840 --> 00:38:18.120 because in terms of development 00:38:18.120 --> 00:38:21.480 that's not necessarily going to matter as much to them. 00:38:21.480 --> 00:38:23.840 There have been a lot of great studies talking about, 00:38:23.840 --> 00:38:25.600 you know, when kids think about privacy 00:38:25.600 --> 00:38:28.240 they think about privacy from their parents, 00:38:28.240 --> 00:38:31.560 privacy from their peers, but not necessarily privacy 00:38:31.560 --> 00:38:33.440 from sort of unknown entities, 00:38:33.440 --> 00:38:36.280 other companies, government, etc. 
00:38:36.280 --> 00:38:40.600 So they're much more willing to talk to maybe, 00:38:40.600 --> 00:38:44.400 you know, an Alexa as opposed to, 00:38:44.400 --> 00:38:47.800 you know, telling something to a friend or a parent. 00:38:47.800 --> 00:38:52.640 And so we really need to upgrade dramatically 00:38:52.640 --> 00:38:54.040 how we're teaching about this, 00:38:54.040 --> 00:38:58.000 expanding the privacy curriculum here. 00:38:58.000 --> 00:39:00.800 We've seen a lot of other countries make those evolutions; 00:39:00.800 --> 00:39:02.520 Italy particularly, 00:39:02.520 --> 00:39:05.800 and Ireland, have built some phenomenal resources 00:39:05.800 --> 00:39:08.760 which are available for free. 00:39:08.760 --> 00:39:10.240 So everyone should check them out. 00:39:10.240 --> 00:39:16.320 Australia also has some amazing resources, Canada as well. 00:39:16.320 --> 00:39:19.960 So systemically we need to do that. 00:39:19.960 --> 00:39:22.200 On the other side, as I said, 00:39:22.200 --> 00:39:23.760 when it comes to personal information 00:39:23.760 --> 00:39:26.680 there's a danger when kids don't feel like they can share it. 00:39:26.680 --> 00:39:28.720 And this is again about making sure 00:39:28.720 --> 00:39:32.880 that they feel like they can ask for help 00:39:32.880 --> 00:39:36.480 or seek out help or seek out community. 00:39:36.480 --> 00:39:40.600 We've seen sort of an outgrowth of monitoring software 00:39:40.600 --> 00:39:43.120 by schools, by parents. 00:39:43.120 --> 00:39:45.480 And that can be really valuable, 00:39:45.480 --> 00:39:48.360 especially when you're talking about younger kids 00:39:48.360 --> 00:39:51.960 who are more likely to run into some of these issues 00:39:51.960 --> 00:39:56.000 where they're accessing sites and may not go to their parents. 00:39:56.000 --> 00:39:59.000 In other cases that may mean 00:39:59.000 --> 00:40:01.560 that they're not looking for the help they need. 00:40:01.560 --> 00:40:04.160 One of the examples that I see is, 00:40:04.160 --> 00:40:10.000 for example, when kids look up mental health help online, 00:40:10.000 --> 00:40:12.400 when they look up, 00:40:12.400 --> 00:40:14.840 say, the Trans Day of Visibility website 00:40:14.840 --> 00:40:16.360 or other things. 00:40:16.360 --> 00:40:19.280 Sometimes that leads to them having a sit-down 00:40:19.280 --> 00:40:21.640 with the principal the next morning saying, 00:40:21.640 --> 00:40:24.560 I know what you did on the internet last night. 00:40:24.560 --> 00:40:28.920 And since we know that it is vital for kids' well-being 00:40:28.920 --> 00:40:31.960 to have trusted relationships with adults, 00:40:31.960 --> 00:40:33.320 those sorts of conversations 00:40:33.320 --> 00:40:35.560 don't build trusted relationships with adults. 00:40:35.560 --> 00:40:38.000 So how do we make sure we are watching kids 00:40:38.000 --> 00:40:44.040 and supporting them without limiting them 00:40:44.040 --> 00:40:47.520 from actually looking for solutions and for help? 00:40:47.520 --> 00:40:49.040 And it's a difficult balance. 00:40:53.960 --> 00:40:57.040 >>Thank you so much, that was really interesting. 00:40:57.040 --> 00:40:58.680 Definitely food for thought. 00:40:58.680 --> 00:41:03.000 Does anybody else want to come in on privacy? 00:41:03.000 --> 00:41:04.560 I have a few more questions to get through, 00:41:04.560 --> 00:41:07.120 but we can stay here if people want to. 00:41:09.400 --> 00:41:10.760 >>Melanie, do you want me 00:41:10.760 --> 00:41:12.400 to talk a little bit about sextortion? 
00:41:12.400 --> 00:41:17.200 Or do you feel like that's already been kind of handled? 00:41:17.200 --> 00:41:19.120 >>I think that would be helpful. 00:41:19.120 --> 00:41:23.280 >>Okay. So I work with the Internet Crimes 00:41:23.280 --> 00:41:25.400 against Children task forces. 00:41:25.400 --> 00:41:30.200 And OJJDP supports 61 coordinated task forces 00:41:30.200 --> 00:41:32.360 representing 5,400 federal, state 00:41:32.360 --> 00:41:33.920 and local law enforcement agencies, 00:41:33.920 --> 00:41:36.560 and they prosecute, investigate and develop responses 00:41:36.560 --> 00:41:37.880 to internet crimes against children. 00:41:37.880 --> 00:41:41.640 So the thing we're focused on right now is sextortion, 00:41:41.640 --> 00:41:45.800 because sextortion is an online exploitation crime 00:41:45.800 --> 00:41:48.400 that's directed specifically towards children 00:41:48.400 --> 00:41:53.560 to coerce them or blackmail them to acquire sexual content, 00:41:53.560 --> 00:41:56.640 or engage in sex, or to obtain money. 00:41:56.640 --> 00:41:59.640 And these perpetrators basically troll the internet 00:41:59.640 --> 00:42:00.960 and they look at kids, 00:42:00.960 --> 00:42:03.080 like they look at their social media, 00:42:03.080 --> 00:42:05.320 and they see who the vulnerable kids are 00:42:05.320 --> 00:42:07.040 and then they go after them 00:42:07.040 --> 00:42:09.400 across a variety of different platforms. 00:42:09.400 --> 00:42:12.760 And they can even steal another child's, 00:42:12.760 --> 00:42:17.800 you know, avatar or personality to actually get a child 00:42:17.800 --> 00:42:21.280 to send them the information or the photograph that they want, 00:42:21.280 --> 00:42:23.840 and then the blackmailing starts. 00:42:23.840 --> 00:42:26.840 So I just wanted to give you some statistics: 00:42:26.840 --> 00:42:29.200 about 60 percent of online victims 00:42:29.200 --> 00:42:32.200 are threatened within the first two weeks of contact, 00:42:32.200 --> 00:42:33.800 and one in six youths surveyed 00:42:33.800 --> 00:42:35.840 between the ages of nine and seventeen 00:42:35.840 --> 00:42:39.600 have shared a naked image of themselves 00:42:39.600 --> 00:42:44.160 with someone who was trying to perform sextortion. 00:42:44.160 --> 00:42:45.960 One in four victims of sextortion 00:42:45.960 --> 00:42:47.640 are 13 years or younger. 00:42:47.640 --> 00:42:50.920 So I feel like what Emily is trying to say, 00:42:50.920 --> 00:42:53.040 and Emilia just said, 00:42:53.040 --> 00:42:55.120 is that we really have to have trusted adults 00:42:55.120 --> 00:42:56.720 and have open communication, 00:42:56.720 --> 00:42:58.680 so parents can talk to their kids 00:42:58.680 --> 00:43:02.160 and help them when these things happen. 00:43:02.160 --> 00:43:04.240 So I just wanted to throw that out there, 00:43:04.240 --> 00:43:07.200 because there are a lot of sexual predators 00:43:07.200 --> 00:43:11.720 who are unfortunately trolling for our kids. 00:43:11.720 --> 00:43:13.800 >>That leads on nicely 00:43:13.800 --> 00:43:16.440 to getting into some of the good stuff. 00:43:16.440 --> 00:43:21.000 So what do we do about all of these huge problems? 00:43:21.000 --> 00:43:22.600 And I want to open up the discussion 00:43:22.600 --> 00:43:25.520 a little bit with reporting. 
00:43:25.520 --> 00:43:27.200 Firstly, identifying: 00:43:27.200 --> 00:43:30.000 how can we help various different communities 00:43:30.000 --> 00:43:33.760 and individuals to identify inappropriate 00:43:33.760 --> 00:43:35.520 and threatening behavior online? 00:43:35.520 --> 00:43:36.880 And then what do we do about it? 00:43:36.880 --> 00:43:39.120 So what do the reporting mechanisms look like? 00:43:39.120 --> 00:43:42.200 And if there are any best practices or training 00:43:42.200 --> 00:43:45.800 on using reporting mechanisms, I would love to hear them. 00:43:48.680 --> 00:43:53.120 >>I'm happy to start as the platform if that helps. 00:43:53.120 --> 00:43:57.800 I can't emphasize enough the criticality, 00:43:57.800 --> 00:44:01.200 the vitalness, of reporting. 00:44:01.200 --> 00:44:04.320 So first, just to start out, our community guidelines 00:44:04.320 --> 00:44:07.400 set the foundation for engaging on Snapchat. 00:44:07.400 --> 00:44:09.080 And all of our community members 00:44:09.080 --> 00:44:11.520 have to abide by these guidelines. 00:44:11.520 --> 00:44:14.280 They cover the risks that we've been talking about today: 00:44:14.280 --> 00:44:18.200 online bullying, hate speech, inciting violence, sextortion, 00:44:18.200 --> 00:44:21.320 other prohibited content and behavior. 00:44:21.320 --> 00:44:24.960 And we offer these easy-to-use in-app tools 00:44:24.960 --> 00:44:26.680 for our community members 00:44:26.680 --> 00:44:30.840 to make us aware of violating content or concerning behavior. 00:44:30.840 --> 00:44:33.840 Now I think there's another thing to first keep in mind. 00:44:33.840 --> 00:44:37.000 It's important to note when to report to authorities 00:44:37.000 --> 00:44:40.160 and when to report to Snapchat or some other platform. 00:44:40.160 --> 00:44:42.320 So any imminent threat to life 00:44:42.320 --> 00:44:44.480 must be reported to local authorities. 00:44:44.480 --> 00:44:47.040 There can be a subsequent report to Snapchat 00:44:47.040 --> 00:44:49.840 or to another platform where you might be seeing this behavior, 00:44:49.840 --> 00:44:52.760 but the report to authorities is critical. 00:44:52.760 --> 00:44:56.480 And Snapchat or other platforms will of course respond 00:44:56.480 --> 00:44:58.760 to legal law enforcement requests 00:44:58.760 --> 00:45:02.080 and cooperate with investigations. 00:45:02.080 --> 00:45:06.600 Meanwhile, where there are violations of our community guidelines, 00:45:06.600 --> 00:45:09.440 they can and should be reported to Snapchat directly. 00:45:09.440 --> 00:45:13.240 You can report content, so you can report photos or videos. 00:45:13.240 --> 00:45:15.240 You can report an account. 00:45:15.240 --> 00:45:19.000 And again I can't stress enough the value of reaching out 00:45:19.000 --> 00:45:22.960 to us in this way. All of the reports are reviewed. 00:45:22.960 --> 00:45:25.840 Every action that needs to be taken, 00:45:25.840 --> 00:45:28.920 appropriate to the given violation, is taken. 00:45:28.920 --> 00:45:30.840 Young people in particular think that, 00:45:30.840 --> 00:45:32.200 oh, it doesn't really matter, 00:45:32.200 --> 00:45:35.360 oh, on Snapchat that content is ephemeral, 00:45:35.360 --> 00:45:38.080 it goes away, how are they going to know what really happened, 00:45:38.080 --> 00:45:40.400 my report's going to go into a black hole. 00:45:40.400 --> 00:45:42.240 None of that is the case. 00:45:42.240 --> 00:45:47.400 Every report is reviewed and appropriate action is taken. 
00:45:47.400 --> 00:45:51.120 It's particularly important that community members reach out 00:45:51.120 --> 00:45:55.880 about content in Snaps and Chat, so photos and text messages, 00:45:55.880 --> 00:45:59.640 because Snapchat is not a traditional social network. 00:45:59.640 --> 00:46:02.320 It's again designed for communications 00:46:02.320 --> 00:46:04.520 between and among real friends, 00:46:04.520 --> 00:46:06.880 and these conversations are private. 00:46:06.880 --> 00:46:09.200 So if something is going on in that private setting 00:46:09.200 --> 00:46:11.760 and someone is uncomfortable, reach out to us; 00:46:11.760 --> 00:46:13.040 we need to know about it 00:46:13.040 --> 00:46:15.480 so that we can help and do something about it. 00:46:20.840 --> 00:46:23.320 >>It's also important to remember, 00:46:23.320 --> 00:46:25.560 and Jacqueline, I think you mentioned this 00:46:25.560 --> 00:46:28.080 at the beginning, that there are some things that need 00:46:28.080 --> 00:46:30.400 to be reported to authorities, right? 00:46:30.400 --> 00:46:31.560 And when talking about bullying, 00:46:31.560 --> 00:46:34.520 it can sometimes overlap with discriminatory harassment 00:46:34.520 --> 00:46:36.240 when it's based on race, religion, 00:46:36.240 --> 00:46:39.440 national origin, color, sex, including sexual orientation 00:46:39.440 --> 00:46:41.960 and gender identity, age, disability. 00:46:41.960 --> 00:46:45.760 We need to be aware of this, especially schools. 00:46:45.760 --> 00:46:47.200 Federally funded schools, 00:46:47.200 --> 00:46:48.840 including colleges and universities, 00:46:48.840 --> 00:46:52.680 have an obligation to resolve harassment on this basis. 00:46:52.680 --> 00:46:56.080 And the Department of Justice can also play a role in this 00:46:56.080 --> 00:46:58.400 at the federal level when there's a hate crime; 00:46:58.400 --> 00:47:00.600 a hate crime is any crime that is committed, 00:47:00.600 --> 00:47:02.760 again, on the basis of race, 00:47:02.760 --> 00:47:04.800 color, religion, national origin. 00:47:04.800 --> 00:47:08.200 So bullying, cyberbullying and the type of risk 00:47:08.200 --> 00:47:11.200 that we are talking about today in the online sphere 00:47:11.200 --> 00:47:13.280 could be associated with this. 00:47:13.280 --> 00:47:16.000 It's not always a crime, but it can be. 00:47:16.000 --> 00:47:20.400 And to learn more about this: there are also jurisdictions 00:47:20.400 --> 00:47:22.840 that have specific laws and policies 00:47:22.840 --> 00:47:25.600 related to cyberbullying and bullying. 00:47:25.600 --> 00:47:27.560 That would need to be looked into specifically 00:47:27.560 --> 00:47:29.400 for the jurisdiction of the person. 00:47:29.400 --> 00:47:31.800 Stopbullying.gov, through the Department of Education, 00:47:31.800 --> 00:47:34.280 has a great resource to be able to look into this, 00:47:34.280 --> 00:47:35.600 at stopbullying.gov. 00:47:38.240 --> 00:47:40.240 >>I think another point that would be great for kids 00:47:40.240 --> 00:47:42.840 to learn as they're sort of gaining independence 00:47:42.840 --> 00:47:45.400 and are online is that there is nothing wrong 00:47:45.400 --> 00:47:47.520 with using safety tools to stay safe. 00:47:47.520 --> 00:47:48.800 That's what they're there for. 00:47:48.800 --> 00:47:50.160 And if they see something concerning, 00:47:50.160 --> 00:47:53.200 it's not starting drama to flag up a behavior 00:47:53.200 --> 00:47:55.280 that is truly harmful or inappropriate. 
00:47:55.280 --> 00:47:57.840 And I'm basing that on our most recent research 00:47:57.840 --> 00:48:00.880 that came out last fall, which was specifically focused 00:48:00.880 --> 00:48:03.680 on teens and young people's use of online safety tools. 00:48:03.680 --> 00:48:04.960 And one thing that we found 00:48:04.960 --> 00:48:07.360 is that, somewhat understandably, for teens, 00:48:07.360 --> 00:48:08.560 where social currency 00:48:08.560 --> 00:48:10.560 is everything, there's a little bit of an element, 00:48:10.560 --> 00:48:14.000 or a perceived element at the very least, of risk 00:48:14.000 --> 00:48:16.560 that if someone finds out that they took an action 00:48:16.560 --> 00:48:18.880 they're going to be seen as, like, a tattletale, 00:48:18.880 --> 00:48:21.320 and there's this fear that somehow, even in the case 00:48:21.320 --> 00:48:23.720 of completely anonymous reporting features, 00:48:23.720 --> 00:48:25.440 it will be connected back to them 00:48:25.440 --> 00:48:27.400 and they'll be cast out of the social group. 00:48:27.400 --> 00:48:30.440 So sort of removing that fear and teaching them 00:48:30.440 --> 00:48:32.680 that it can be very positive to protect each other 00:48:32.680 --> 00:48:37.440 would be a great thing to focus on in an educational effort. 00:48:37.440 --> 00:48:40.760 We also found that some teens had more of a willingness 00:48:40.760 --> 00:48:42.400 to take action based on 00:48:42.400 --> 00:48:45.440 whether or not they knew the person perpetrating the behavior. 00:48:45.440 --> 00:48:47.840 So if it was a stranger or a troll 00:48:47.840 --> 00:48:49.360 who was out spreading content, 00:48:49.360 --> 00:48:52.360 someone they didn't know, they had much less of an issue muting, 00:48:52.360 --> 00:48:54.120 blocking, reporting that person. 00:48:54.120 --> 00:48:56.040 Whereas if it's somebody in their friend group 00:48:56.040 --> 00:48:58.880 who begins posting things that make them uncomfortable 00:48:58.880 --> 00:49:01.040 or that move in a harmful direction, 00:49:01.040 --> 00:49:02.320 there's much more hesitancy 00:49:02.320 --> 00:49:03.840 about taking action against that. 00:49:03.840 --> 00:49:06.000 So they are more likely to feel 00:49:06.000 --> 00:49:08.040 that they can control their own space, 00:49:08.040 --> 00:49:10.440 you know, if they want to not interact with that person, 00:49:10.440 --> 00:49:11.960 they can separate from that behavior, 00:49:11.960 --> 00:49:15.600 but they won't necessarily go the extra step of reporting 00:49:15.600 --> 00:49:19.120 because they don't feel that the offense is bad enough. 00:49:19.120 --> 00:49:21.200 And the only time that that sort of changed a bit 00:49:21.200 --> 00:49:24.080 was if there was a sense of responsibility, 00:49:24.080 --> 00:49:26.360 participants saying, I have, you know, a little sister 00:49:26.360 --> 00:49:27.600 and I thought about her 00:49:27.600 --> 00:49:29.600 and I wouldn't want her to interact with that, 00:49:29.600 --> 00:49:32.920 so I reported it, or, in cases where I truly felt 00:49:32.920 --> 00:49:35.560 someone was going to get hurt, I reported it. 00:49:35.560 --> 00:49:39.120 But I think there is maybe a little bit of a gap, 00:49:39.120 --> 00:49:40.600 from sometimes a little bit 00:49:40.600 --> 00:49:42.680 of a lack of understanding of the tools 00:49:42.680 --> 00:49:44.240 and a little bit of fear that comes from that, 00:49:44.240 --> 00:49:47.360 so I think we should just continue to sort of raise that awareness. 
00:49:47.360 --> 00:49:50.520 And with parents as well; kids are not always going to 00:49:50.520 --> 00:49:51.920 come across these things on their own, 00:49:51.920 --> 00:49:55.080 so it really goes back to that idea of encouraging parents 00:49:55.080 --> 00:49:56.960 to always sit down with their child, 00:49:56.960 --> 00:49:58.480 especially starting from a young age, 00:49:58.480 --> 00:49:59.840 if they download a new app, 00:49:59.840 --> 00:50:02.440 you know, going through with them all of the features 00:50:02.440 --> 00:50:04.040 and the tools that they have at their disposal 00:50:04.040 --> 00:50:07.000 to make sure that they're creating a space for themselves 00:50:07.000 --> 00:50:08.200 where they feel safe. 00:50:10.960 --> 00:50:12.520 >>Just to jump in here, Emily, 00:50:12.520 --> 00:50:14.920 something that you said kind of sparked a thought for me. 00:50:14.920 --> 00:50:16.680 I'm going to put this question to Jacqueline, 00:50:16.680 --> 00:50:18.720 but I obviously don't want to ask you 00:50:18.720 --> 00:50:21.240 to speak on behalf of all social media platforms, 00:50:21.240 --> 00:50:23.880 because I know they deal with reporting 00:50:23.880 --> 00:50:25.800 and with moderation very differently. 00:50:25.800 --> 00:50:27.600 But in terms of how we think 00:50:27.600 --> 00:50:31.080 about destigmatizing the idea of reporting, 00:50:31.080 --> 00:50:33.960 there is, I think, a lot of confusion 00:50:33.960 --> 00:50:37.440 and a lot of debate about whether data on the basis 00:50:37.440 --> 00:50:40.520 of that reporting actually gets communicated 00:50:40.520 --> 00:50:43.760 to people's networks or is communicated to the person 00:50:43.760 --> 00:50:46.000 who's the subject of the reporting. 00:50:46.000 --> 00:50:47.680 It would be great if you could just explain 00:50:47.680 --> 00:50:51.520 from Snap's point of view how that actually happens. 00:50:51.520 --> 00:50:54.480 >>Sure. I was actually going back into some notes 00:50:54.480 --> 00:50:58.960 and looking at the research that Emily cited, 00:50:58.960 --> 00:51:01.480 which I think is really important here. 00:51:01.480 --> 00:51:04.240 Snap sponsored that research last year 00:51:04.240 --> 00:51:06.800 for the Family Online Safety Institute. 00:51:06.800 --> 00:51:09.560 And it was over a third of young people, 00:51:09.560 --> 00:51:11.200 so 34 percent, who said 00:51:11.200 --> 00:51:13.920 that they worried what their friends would think 00:51:13.920 --> 00:51:17.760 if they take action against bad behavior on social media. 00:51:17.760 --> 00:51:22.600 And then almost two in five, it was 39 percent, said 00:51:22.600 --> 00:51:25.080 that they feel pressure not to act 00:51:25.080 --> 00:51:28.800 when someone they personally know behaves badly. 00:51:28.800 --> 00:51:31.800 So we're up against a little bit of a social dynamic here. 00:51:31.800 --> 00:51:34.880 And then it also takes, as you know, Melanie, 00:51:34.880 --> 00:51:39.960 actions from platforms to show that reporting is valuable, 00:51:39.960 --> 00:51:43.000 that it's not tattling on someone else; 00:51:43.000 --> 00:51:46.560 it's actually designed to make the community safer. 00:51:46.560 --> 00:51:50.920 So we want people to know that they can report, as I said, 00:51:50.920 --> 00:51:55.520 photos, videos, accounts in the more public portion of the app. 00:51:55.520 --> 00:51:58.120 On Snapchat you can also report content 00:51:58.120 --> 00:52:00.400 from Discover and Spotlight. 
00:52:00.400 --> 00:52:01.880 How to report? 00:52:01.880 --> 00:52:04.200 It's just a matter of pressing 00:52:04.200 --> 00:52:06.360 and holding on the piece of content, 00:52:06.360 --> 00:52:10.560 or you can actually report via our support site as well. 00:52:10.560 --> 00:52:12.760 Reporting is confidential. 00:52:12.760 --> 00:52:17.320 Snapchat is not going to alert any other Snapchatter 00:52:17.320 --> 00:52:20.200 that they were reported, or by whom. 00:52:20.200 --> 00:52:22.520 So if you report somebody, that person 00:52:22.520 --> 00:52:24.800 is not going to know that you reported them. 00:52:24.800 --> 00:52:28.200 And again, reports are vital. All of our reports are reviewed. 00:52:28.200 --> 00:52:30.480 They're actioned by our safety teams. 00:52:30.480 --> 00:52:34.080 Those safety teams operate 24/7, around the clock, 00:52:34.080 --> 00:52:35.560 around the globe. 00:52:35.560 --> 00:52:38.160 And then the enforcement itself can vary. 00:52:38.160 --> 00:52:41.440 So depending on the nature of the violation of our community 00:52:41.440 --> 00:52:46.360 guidelines, enforcement can range from warning the user up to 00:52:46.360 --> 00:52:49.240 and including deleting the entire account. 00:52:49.240 --> 00:52:52.600 So no action is going to be taken on a report 00:52:52.600 --> 00:52:55.640 for an account that was deemed not to have violated 00:52:55.640 --> 00:52:57.240 Snapchat's community guidelines. 00:52:57.240 --> 00:53:01.160 But again, this is so critical, so important. 00:53:01.160 --> 00:53:04.400 It helps keep everyone safer in the community. 00:53:04.400 --> 00:53:05.760 You're doing a service. 00:53:05.760 --> 00:53:06.960 You are not doing anyone 00:53:06.960 --> 00:53:09.600 or yourself a disservice by reporting. 00:53:09.600 --> 00:53:12.840 Now we don't want false reports. We don't want made-up reports. 00:53:12.840 --> 00:53:19.880 We want reports of legitimate bad conduct, bad behavior, bad content, 00:53:19.880 --> 00:53:22.880 so that we can remove bad actors from the community. 00:53:24.800 --> 00:53:27.600 >>Thank you for that. That makes a lot of sense. 00:53:27.600 --> 00:53:29.200 We only have a few minutes left, 00:53:29.200 --> 00:53:32.040 so I'm going to zoom out a little bit 00:53:32.040 --> 00:53:34.680 and ask for some big-picture thoughts 00:53:34.680 --> 00:53:37.160 to help wrap us up on the question 00:53:37.160 --> 00:53:40.960 of how do we help create safer spaces online, 00:53:40.960 --> 00:53:44.000 particularly with regard to building resilience 00:53:44.000 --> 00:53:45.800 among some of these constituencies 00:53:45.800 --> 00:53:47.040 that we've been talking about? 00:53:55.040 --> 00:53:57.200 >>I can start if you want. 00:53:57.200 --> 00:54:01.280 Basically, first we have to have adults and parents and guardians 00:54:01.280 --> 00:54:04.800 start the conversation with our youth early. 00:54:04.800 --> 00:54:06.480 And make sure that it goes both ways, 00:54:06.480 --> 00:54:08.760 that youth can start the conversation, 00:54:08.760 --> 00:54:12.240 teens can talk to their parents about their online activities, 00:54:12.240 --> 00:54:16.200 and they can sit down together and talk about, like, usage: 00:54:16.200 --> 00:54:19.280 how much is okay, what's appropriate usage, 00:54:19.280 --> 00:54:20.640 what are the consequences 00:54:20.640 --> 00:54:22.960 for breaking technology rules in the house. 
00:54:22.960 --> 00:54:24.600 And it would also help familiarize parents 00:54:24.600 --> 00:54:27.040 and guardians with the youth's interests 00:54:27.040 --> 00:54:29.240 and activities online, 00:54:29.240 --> 00:54:32.080 and what's beneficial about them being online. 00:54:32.080 --> 00:54:34.760 Second, it's important to have privacy controls, I think, 00:54:34.760 --> 00:54:36.560 and regularly check in on those. 00:54:36.560 --> 00:54:39.040 Sometimes kids think, ah, nobody's going to bother me, 00:54:39.040 --> 00:54:42.240 like strangers aren't going to be coming after me. 00:54:42.240 --> 00:54:44.200 But it does matter who they're talking to, 00:54:44.200 --> 00:54:47.840 and you need to know the level of risk for your youth. 00:54:47.840 --> 00:54:49.400 Also we want to make sure that kids know 00:54:49.400 --> 00:54:51.680 that there is never a mistake that's too big. 00:54:51.680 --> 00:54:53.560 We all make mistakes, and if they need help 00:54:53.560 --> 00:54:55.560 they need to be able to come to an adult, 00:54:55.560 --> 00:54:57.920 and we don't want to make them feel like, 00:54:57.920 --> 00:55:01.600 you know, it's not okay to come to us. 00:55:01.600 --> 00:55:03.240 And the other thing I wanted to say 00:55:03.240 --> 00:55:05.560 is there are some resources out there. 00:55:05.560 --> 00:55:07.640 We have the Vigilant Parent Initiative 00:55:07.640 --> 00:55:09.720 from the San Diego ICAC Task Force. 00:55:09.720 --> 00:55:11.960 I gave the website to you guys. 00:55:11.960 --> 00:55:14.240 We have the ICAC Task Forces 00:55:14.240 --> 00:55:16.960 that will come into your schools to build resiliency 00:55:16.960 --> 00:55:18.600 and help protect kids online. 00:55:18.600 --> 00:55:20.320 All anybody out there has to do 00:55:20.320 --> 00:55:23.320 is basically contact their state's ICAC Task Force 00:55:23.320 --> 00:55:25.040 and they'll come to any school, community, 00:55:25.040 --> 00:55:27.880 church, synagogue, anywhere, and basically do 00:55:27.880 --> 00:55:32.640 an online safety and digital citizenship presentation. 00:55:32.640 --> 00:55:35.600 >>Okay. We have time for one more comment before we wrap. 00:55:35.600 --> 00:55:38.280 Steffie, I think you just lined up the next panel 00:55:38.280 --> 00:55:41.320 on education really well. 00:55:41.320 --> 00:55:43.160 Anyone have any final thoughts? 00:55:46.400 --> 00:55:48.680 >>I just want to reemphasize that parents 00:55:48.680 --> 00:55:51.920 and caregivers are key in building resilience. 00:55:51.920 --> 00:55:55.760 Resilience is the ability to overcome a serious hardship 00:55:55.760 --> 00:55:58.960 and adapt well when faced with adverse experiences, right? 00:55:58.960 --> 00:56:01.760 So if we are talking about cyberbullying, 00:56:01.760 --> 00:56:04.360 sexting, these online risks, 00:56:04.360 --> 00:56:06.920 we can help our children build that resilience 00:56:06.920 --> 00:56:09.000 and that ability to overcome it. 00:56:09.000 --> 00:56:11.160 As has been mentioned, talking to our kids, 00:56:11.160 --> 00:56:12.640 spending time with them, 00:56:12.640 --> 00:56:14.960 consistent guidance and structure, 00:56:14.960 --> 00:56:17.200 being aware of their whereabouts and activities, 00:56:17.200 --> 00:56:20.960 and giving warm emotional support can be some very general ways 00:56:20.960 --> 00:56:23.360 in which we can build resilience, 00:56:23.360 --> 00:56:27.520 and there are more specific points that we can go into. 
00:56:27.520 --> 00:56:29.440 Steffie mentioned some major resources. 00:56:29.440 --> 00:56:32.160 I see some are being posted in this chat. 00:56:32.160 --> 00:56:33.760 So thank you for that. 00:56:33.760 --> 00:56:35.480 [inaudible] 00:56:35.480 --> 00:56:38.760 from CDC Violence Prevention. 00:56:38.760 --> 00:56:41.080 We can go there and look at resources 00:56:41.080 --> 00:56:44.680 for different forms of violence and to build resilience as well. 00:56:46.520 --> 00:56:47.800 >>Thank you, Melissa. 00:56:47.800 --> 00:56:50.560 I'm going to pass back to Christie to wrap up the panel. 00:56:50.560 --> 00:56:52.840 But a big personal thank you from me. 00:56:52.840 --> 00:56:54.640 You've been amazing panelists. 00:56:54.640 --> 00:56:59.440 And I hope this has been useful for everybody on the line. 00:56:59.440 --> 00:57:00.720 >>Thank you, Melanie. 00:57:00.720 --> 00:57:02.600 And thank you to all the panelists. 00:57:02.600 --> 00:57:05.400 What amazing information was shared by you 00:57:05.400 --> 00:57:07.560 as the experts and the leaders. 00:57:07.560 --> 00:57:08.880 And really, what a great discussion 00:57:08.880 --> 00:57:10.560 on what we should be aware of online 00:57:10.560 --> 00:57:12.600 and ways to be proactive in the social development 00:57:12.600 --> 00:57:14.560 and use of online resources and platforms. 00:57:14.560 --> 00:57:16.880 And I really love this quote from the panel, that 00:57:16.880 --> 00:57:19.200 "kids have the intelligence, but adults have the wisdom." 00:57:19.200 --> 00:57:21.960 So again, the importance of including all levels 00:57:21.960 --> 00:57:23.520 of the community, from platforms, 00:57:23.520 --> 00:57:26.000 parents, leaders, down to the students themselves, 00:57:26.000 --> 00:57:28.440 to help build resilience and design solutions. 00:57:28.440 --> 00:57:30.240 So thank you again. 00:57:30.240 --> 00:57:32.120 We're going to take a quick break 00:57:32.120 --> 00:57:35.000 before we return to our second panel starting at 2:30. 00:57:35.000 --> 00:57:38.840 So thank you again, panelists from this panel. 00:57:38.840 --> 00:57:41.400 You can turn your videos off and you can go grab some coffee, 00:57:41.400 --> 00:57:46.080 some lunch, and we will break and reconvene at 2:30 00:57:46.080 --> 00:57:47.960 for the panel on visual literacy. 00:57:47.960 --> 00:57:48.760 Thank you all.