WEBVTT 00:00:12.513 --> 00:00:14.982 >>I'd like to kick it over to you. 00:00:14.982 --> 00:00:17.117 >>Thank you, Ryan. That's very kind of you to say. 00:00:17.117 --> 00:00:19.887 And I got to say, we're going to have two hours 00:00:19.887 --> 00:00:23.257 together of this moderated discussion. 00:00:23.257 --> 00:00:27.928 As we've seen, especially in the attack on the US Capitol 00:00:27.928 --> 00:00:29.830 on January 6th, 00:00:29.830 --> 00:00:31.865 technology affects all of us. 00:00:31.865 --> 00:00:34.601 It primarily affects our information ecosystem, 00:00:34.601 --> 00:00:35.802 one of the topics 00:00:35.802 --> 00:00:38.305 that surely will come up today. 00:00:38.305 --> 00:00:41.875 As we can say or as I would say, in my opinion, 00:00:41.875 --> 00:00:47.581 our democracy depends on a vibrant information ecosystem 00:00:47.581 --> 00:00:50.217 and a shared sense of truth. 00:00:50.217 --> 00:00:54.855 And technology obviously affects this in so many different ways. 00:00:54.855 --> 00:00:57.925 And I think what we're trying to do today 00:00:57.925 --> 00:01:00.894 with the esteemed panel that we have assembled 00:01:00.894 --> 00:01:03.297 is discuss how we can improve, 00:01:03.297 --> 00:01:07.301 how we can build a better tech future. 00:01:07.301 --> 00:01:10.904 What we're going to do with our two hours together is, 00:01:10.904 --> 00:01:14.875 first, we're going to hear from each individual panelist. 00:01:14.875 --> 00:01:18.979 And each of these panelists is going to be describing 00:01:18.979 --> 00:01:20.747 a little bit of their background. 00:01:20.747 --> 00:01:24.017 So, we're going to have these mini presentations 00:01:24.017 --> 00:01:25.252 that are going to happen. 00:01:25.252 --> 00:01:29.756 So, what I want you to do, as an attendee, 00:01:29.756 --> 00:01:32.826 one, thank you for being here today. 00:01:32.826 --> 00:01:35.762 But really, as you listen to each of these panelists, 00:01:35.762 --> 00:01:38.198 I want you to start thinking of questions, 00:01:38.198 --> 00:01:41.635 thinking of areas that can be asked 00:01:41.635 --> 00:01:45.839 as we kind of welcome everyone back together. 00:01:45.839 --> 00:01:48.008 After each individual presentation, 00:01:48.008 --> 00:01:50.377 we're going to have time together where we can do 00:01:50.377 --> 00:01:52.245 a little bit of Q&A. 00:01:52.245 --> 00:01:53.780 So without further ado, 00:01:53.780 --> 00:01:56.116 I'll start us off in the order 00:01:56.116 --> 00:01:58.418 that we're going to go through with our panel. 00:01:58.418 --> 00:02:02.389 So, I'm going to introduce each panelist separately, 00:02:02.389 --> 00:02:05.459 and then that panelist is going to share their screen 00:02:05.459 --> 00:02:08.028 and do a little bit of a mini presentation, 00:02:08.028 --> 00:02:11.565 probably around 10 to 15 minutes for each presentation. 00:02:11.565 --> 00:02:14.468 And then we come back and then have a little bit of Q&A 00:02:14.468 --> 00:02:16.269 that I'm going to moderate. 00:02:16.269 --> 00:02:17.738 And then we want to start hearing 00:02:17.738 --> 00:02:21.608 from all of you with our remaining time together. 00:02:21.608 --> 00:02:24.911 So, the first individual that we're going to bring on today 00:02:24.911 --> 00:02:27.848 is Dr. Greg Gerber.
00:02:27.848 --> 00:02:29.816 Greg is the senior advisor 00:02:29.816 --> 00:02:33.487 and case consultant with Safer Schools Together, 00:02:33.487 --> 00:02:34.621 and an assistant professor 00:02:34.621 --> 00:02:39.026 at the New York Institute of Technology in Vancouver. 00:02:39.026 --> 00:02:41.028 Greg also serves as the dean of Science 00:02:41.028 --> 00:02:43.230 and Instructional Technology, 00:02:43.230 --> 00:02:44.865 and is the director of the Center 00:02:44.865 --> 00:02:47.067 for Teaching and Learning. 00:02:47.067 --> 00:02:50.837 So, I'm going to ask Greg right now to put on his camera 00:02:50.837 --> 00:02:52.873 and then share his screen 00:02:52.873 --> 00:02:55.809 so we can learn a little bit more about his background. 00:02:57.544 --> 00:03:00.881 >>Perfect. Thank you, David, and hello to everyone. 00:03:00.881 --> 00:03:04.951 I'm actually going to integrate my screen into the feed, 00:03:04.951 --> 00:03:06.920 so hopefully that works well. 00:03:06.920 --> 00:03:08.822 If there are any issues, let me know, 00:03:08.822 --> 00:03:12.059 and we'll change tracks really quickly. 00:03:12.059 --> 00:03:14.828 Thank you for that introduction. 00:03:14.828 --> 00:03:17.431 My background, to go a little further into it, 00:03:17.431 --> 00:03:19.566 is I actually began this journey 00:03:19.566 --> 00:03:23.170 as a high school teacher and vice principal, 00:03:23.170 --> 00:03:27.507 and worked for about 14 years 00:03:27.507 --> 00:03:31.678 as a senior technical consultant for Fortune 500 companies, 00:03:31.678 --> 00:03:34.114 and ran an IT consulting firm, 00:03:34.114 --> 00:03:36.683 which ultimately, with the combination 00:03:36.683 --> 00:03:40.921 of education and technology, 00:03:40.921 --> 00:03:43.723 landed me in the roles that I hold today. 00:03:43.723 --> 00:03:48.328 Most significantly, of course, is working with the university 00:03:48.328 --> 00:03:52.032 on training future educators, 00:03:52.032 --> 00:03:56.703 but also in the work that we're going to talk about today, 00:03:56.703 --> 00:03:59.706 which is with Safer Schools Together. 00:03:59.706 --> 00:04:01.775 And so, I'm very excited to be here. 00:04:01.775 --> 00:04:04.911 Now, of course, all of that work life 00:04:04.911 --> 00:04:07.447 is coupled with the real life experience 00:04:07.447 --> 00:04:11.985 of being a dad of four young men. 00:04:11.985 --> 00:04:17.958 And I have lived through this technological infusion, 00:04:17.958 --> 00:04:22.596 trying to navigate the goods and the bads, etc., 00:04:22.596 --> 00:04:23.997 that have come with it, 00:04:23.997 --> 00:04:27.434 as now my oldest son is 21 years of age, 00:04:27.434 --> 00:04:30.237 and the youngest is 14. 00:04:30.237 --> 00:04:34.174 That grants an awful lot of perspective. 00:04:34.174 --> 00:04:35.442 Now, really quickly, 00:04:35.442 --> 00:04:36.643 I need to say that 00:04:36.643 --> 00:04:38.245 some of what I'm going to share with you 00:04:38.245 --> 00:04:39.846 in this short presentation 00:04:39.846 --> 00:04:45.051 may be a contextual trigger for people 00:04:45.051 --> 00:04:48.522 who have gone through difficult times in the past. 00:04:48.522 --> 00:04:54.594 And so, just a quick word about your own safety and well-being: 00:04:54.594 --> 00:04:57.998 if there is something that has triggered you in any way, 00:04:57.998 --> 00:05:01.401 please feel free to take a moment away from the screen.
00:05:01.401 --> 00:05:04.371 There will be some graphics 00:05:04.371 --> 00:05:07.174 that I am going to share as a result 00:05:07.174 --> 00:05:09.809 of the observational data 00:05:09.809 --> 00:05:14.214 that we've been collecting with Safer Schools Together. 00:05:14.214 --> 00:05:17.684 Now, in Safer Schools Together, 00:05:17.684 --> 00:05:21.888 our vision, very simply and concisely, 00:05:21.888 --> 00:05:26.493 is to end all school violence. 00:05:26.493 --> 00:05:28.395 It's a lofty dream, there's no doubt. 00:05:28.395 --> 00:05:33.233 But the way that we do this is by investing today 00:05:33.233 --> 00:05:35.001 for a safer tomorrow 00:05:35.001 --> 00:05:41.808 and considering all the learnings that we have garnered 00:05:41.808 --> 00:05:44.878 from working closely with school districts, 00:05:44.878 --> 00:05:51.618 states, provinces, in responding to threats, 00:05:51.618 --> 00:05:56.790 violent situations, self-harm, etc. 00:05:56.790 --> 00:06:02.262 But not only in response to, but instead, very specifically, 00:06:02.262 --> 00:06:04.564 our work is geared towards 00:06:04.564 --> 00:06:07.634 moving into a lane of intervention, 00:06:07.634 --> 00:06:14.341 as opposed to response, as opposed to reaction, okay? 00:06:14.341 --> 00:06:17.210 How do we move towards prevention? 00:06:17.210 --> 00:06:18.445 That's the goal. 00:06:18.445 --> 00:06:20.847 How does tomorrow become safer? 00:06:20.847 --> 00:06:23.883 We apply the learnings 00:06:23.883 --> 00:06:27.621 and engage all community stakeholders 00:06:27.621 --> 00:06:29.089 that we have access to. 00:06:30.457 --> 00:06:33.827 We've learned from years of working with schools, 00:06:33.827 --> 00:06:36.096 law enforcement, and government 00:06:36.096 --> 00:06:41.968 in the aftermath of violent acts that there are commonalities 00:06:41.968 --> 00:06:46.239 and things 20 years ago, 22 years ago, 00:06:46.239 --> 00:06:50.310 when I first started as a vice principal, 00:06:50.310 --> 00:06:53.647 that I never had access to. 00:06:53.647 --> 00:06:57.050 And it would've absolutely transformed the way 00:06:57.050 --> 00:07:00.420 that we would deal with individuals. 00:07:00.420 --> 00:07:02.989 As a vice principal responsible for discipline, 00:07:02.989 --> 00:07:05.325 you can imagine, a student comes in, 00:07:05.325 --> 00:07:07.460 they're in trouble, they've done something wrong, 00:07:07.460 --> 00:07:09.296 they've behaved poorly - 00:07:09.296 --> 00:07:11.965 what is the reaction? 00:07:11.965 --> 00:07:13.600 Discipline. 00:07:13.600 --> 00:07:18.938 We've learned that acting out of a mindset of discipline 00:07:18.938 --> 00:07:25.145 as a primary response often acts as a catalyst 00:07:25.145 --> 00:07:28.782 for that individual to go and cause harm, 00:07:28.782 --> 00:07:31.718 to engage in the violence they were considering. 00:07:31.718 --> 00:07:37.557 What we also learned in the aftermath of the violent acts, 00:07:37.557 --> 00:07:40.660 like Columbine, etc., 00:07:40.660 --> 00:07:47.467 is that nobody wakes up a healthy individual today, 00:07:47.467 --> 00:07:52.839 has a day, wakes up tomorrow, and suddenly snaps. 00:07:52.839 --> 00:07:55.075 You've probably heard that in the media before. 00:07:55.075 --> 00:07:58.411 It just doesn't happen. 00:07:58.411 --> 00:08:00.647 What we see instead is 00:08:00.647 --> 00:08:05.452 that there is a very specific progression or pathway 00:08:05.452 --> 00:08:07.120 towards violence.
00:08:07.120 --> 00:08:10.724 There's some sort of grievance that takes place. I'm upset. 00:08:10.724 --> 00:08:15.261 I feel like I've lost something as an individual. 00:08:15.261 --> 00:08:18.231 And only if that is not repaired, 00:08:18.231 --> 00:08:19.733 if we don't settle the grievance, 00:08:19.733 --> 00:08:26.072 if we don't find a way to do better, 00:08:26.072 --> 00:08:28.708 to repair the relationships normally, 00:08:28.708 --> 00:08:32.379 then it may progress through ideation, planning, preparation, 00:08:32.379 --> 00:08:36.483 and finally, the act of committing violence. 00:08:36.483 --> 00:08:39.586 But this takes time. 00:08:39.586 --> 00:08:40.754 It takes time. 00:08:40.754 --> 00:08:46.126 And the utility of recognizing this kind of progression 00:08:46.126 --> 00:08:48.928 allows us to see, 00:08:48.928 --> 00:08:51.398 it provides us a purview 00:08:51.398 --> 00:08:55.635 into the motivation of any individual, 00:08:55.635 --> 00:08:59.372 a motivation towards some sort of gain. 00:08:59.372 --> 00:09:00.473 A payoff. 00:09:00.473 --> 00:09:02.008 They feel like 00:09:02.008 --> 00:09:07.280 they're getting something out of moving in this direction. 00:09:07.280 --> 00:09:09.416 Not in all cases, of course. 00:09:09.416 --> 00:09:12.919 But in many cases where we see that violence, 00:09:12.919 --> 00:09:16.022 and having considered the information 00:09:16.022 --> 00:09:18.792 after the event has taken place, 00:09:18.792 --> 00:09:22.996 we also see that one of those motivations might be notoriety. 00:09:22.996 --> 00:09:29.202 So the question is, how do we, recognizing this pathway, 00:09:29.202 --> 00:09:31.337 intervene appropriately? 00:09:31.337 --> 00:09:34.274 How can we stop it at the ideation or the planning 00:09:34.274 --> 00:09:37.777 so that we don't see that violent action? 00:09:37.777 --> 00:09:41.314 Well, that's the goal, isn't it? 00:09:41.314 --> 00:09:47.754 First, we recognize that behavior provides a window 00:09:47.754 --> 00:09:49.722 into the psychology 00:09:49.722 --> 00:09:52.625 and the mindset of an individual. 00:09:52.625 --> 00:09:55.295 And while this won't be a philosophy 00:09:55.295 --> 00:09:58.164 or psychology session, 00:09:58.164 --> 00:10:01.701 we've learned from behavioral psychiatrists and philosophers 00:10:01.701 --> 00:10:08.508 alike who recognize how mindset influences behavior. 00:10:08.508 --> 00:10:14.514 And similarly, behavior influences mindset. 00:10:14.514 --> 00:10:18.184 Boy, that sounds simple when I say it, doesn't it? 00:10:18.184 --> 00:10:21.454 For example, self-help gurus 00:10:21.454 --> 00:10:26.159 and body language experts all agree that forcing a smile, 00:10:26.159 --> 00:10:28.928 something as simple as that, can change our feelings. 00:10:28.928 --> 00:10:31.164 Try it right now if you don't believe me. 00:10:31.164 --> 00:10:33.566 What happens inside? 00:10:33.566 --> 00:10:37.804 Or standing in the hero pose can make you feel more confident. 00:10:37.804 --> 00:10:39.906 Okay. We get it. 00:10:39.906 --> 00:10:43.376 How we behave tells us an awful lot 00:10:43.376 --> 00:10:46.246 about what's going on in our minds. 00:10:46.246 --> 00:10:51.217 And, as our technology-infused culture continues to illustrate, 00:10:51.217 --> 00:10:55.021 times have changed. 00:10:55.021 --> 00:10:56.823 Oh boy. [laughter] 00:10:56.823 --> 00:10:58.224 Have they ever.
00:10:58.224 --> 00:11:03.329 Technology is more pervasive and ubiquitous today 00:11:03.329 --> 00:11:06.332 than it has ever been in history. 00:11:06.332 --> 00:11:09.335 It's constantly at our fingertips. 00:11:09.335 --> 00:11:14.240 And with that technology, 00:11:14.240 --> 00:11:18.378 people are sharing more about their innermost thoughts 00:11:18.378 --> 00:11:19.979 than they ever have. 00:11:19.979 --> 00:11:22.348 But they don't necessarily recognize 00:11:22.348 --> 00:11:27.220 that they're sharing the depth of thinking that they are. 00:11:27.220 --> 00:11:30.523 And this is good in terms of being able to move towards 00:11:30.523 --> 00:11:36.095 intervention and prevention over that lane of reaction. 00:11:36.095 --> 00:11:39.332 At Safer Schools Together, we call this leakage. 00:11:39.332 --> 00:11:43.102 We call the ability to access it through open source information 00:11:43.102 --> 00:11:47.874 and social media something of a golden ticket. 00:11:47.874 --> 00:11:51.277 And yes, this is a reference to Charlie and the Chocolate Factory 00:11:51.277 --> 00:11:54.180 and Willy Wonka's golden ticket. 00:11:54.180 --> 00:11:58.117 We are enabled through technology 00:11:58.117 --> 00:12:06.726 to see far more deeply into an individual's inner world. 00:12:06.726 --> 00:12:08.661 And if we take that information 00:12:08.661 --> 00:12:10.964 and construct something of a baseline, 00:12:10.964 --> 00:12:13.466 or, as some people call it, a profile, 00:12:13.466 --> 00:12:17.103 and recognize where there has been a significant change 00:12:17.103 --> 00:12:21.541 in behavior or ideation, things that influence them, 00:12:21.541 --> 00:12:27.046 that information is absolutely instrumental 00:12:27.046 --> 00:12:29.549 as a consideration 00:12:29.549 --> 00:12:35.221 in our multidisciplinary assessment teams. 00:12:35.221 --> 00:12:38.024 Coupled with behavioral threat assessment, 00:12:38.024 --> 00:12:45.698 the digital profile provides an increasingly deep purview. 00:12:45.698 --> 00:12:48.001 People post things that are consistent with 00:12:48.001 --> 00:12:50.403 what they're thinking about. 00:12:50.403 --> 00:12:53.973 When somebody posts a picture of their dinner plate, 00:12:53.973 --> 00:12:56.643 I know they're thinking [laughter] about that dinner, 00:12:56.643 --> 00:13:00.179 but also that it was worth posting for some reason. 00:13:02.649 --> 00:13:04.117 The leakage of one's mindset 00:13:04.117 --> 00:13:08.821 and what they're thinking about, usually, in that social media, 00:13:08.821 --> 00:13:12.725 provides an incredible purview. 00:13:12.725 --> 00:13:16.496 Let's take a look at a few examples. 00:13:16.496 --> 00:13:19.198 What we've been seeing over the course of the pandemic, 00:13:19.198 --> 00:13:26.272 and of course, prior as well, are increased numbers, 00:13:26.272 --> 00:13:34.580 increased incidents of people sharing their difficulties. 00:13:34.580 --> 00:13:39.118 Depression. Low self-esteem. 00:13:39.118 --> 00:13:43.156 Constantly wanting to commit self-die. 00:13:43.156 --> 00:13:47.193 Why would you say that? It's grammatically incorrect. 00:13:47.193 --> 00:13:52.498 Because TikTok's algorithms will take the post down 00:13:52.498 --> 00:13:54.667 if you say suicide.
00:13:54.667 --> 00:13:58.037 So our kids are finding increasingly, 00:13:58.037 --> 00:14:01.974 I would say, creative ways to ensure that their post, 00:14:01.974 --> 00:14:04.277 potentially a cry for help, 00:14:04.277 --> 00:14:09.015 their signal to a loss of power and freedom 00:14:09.015 --> 00:14:12.251 - they want to ensure that it has an audience. 00:14:12.251 --> 00:14:16.489 Why? What does that sort of behavior tell us? 00:14:16.489 --> 00:14:21.127 I can go through slide after slide of real cases 00:14:21.127 --> 00:14:22.729 where you see things like this. 00:14:22.729 --> 00:14:25.064 Reasons I'm going to hell. 00:14:25.064 --> 00:14:31.871 Hm. How many of us would post that sort of thing online? 00:14:31.871 --> 00:14:33.840 What's the end game? 00:14:33.840 --> 00:14:35.308 What are we thinking? 00:14:37.376 --> 00:14:43.216 Do we believe this individual feels accepted or important? 00:14:43.216 --> 00:14:45.451 And we have to ask the question, 00:14:45.451 --> 00:14:48.187 are they posting because they wonder whether or not 00:14:48.187 --> 00:14:51.924 anyone who sees this will care enough to notice? 00:14:51.924 --> 00:14:55.428 Am I seen? Am I accepted? 00:14:55.428 --> 00:14:57.330 And it goes on and on with different trends 00:14:57.330 --> 00:14:59.298 that we've been seeing develop through COVID. 00:14:59.298 --> 00:15:01.033 And you've probably seen this one. 00:15:01.033 --> 00:15:03.970 People noting my firsts. 00:15:03.970 --> 00:15:09.375 And you can see my first drugs, my first sexual encounters, 00:15:09.375 --> 00:15:12.545 my first - right here, 00:15:12.545 --> 00:15:16.082 my first pictures being sent to me on my phone 00:15:16.082 --> 00:15:20.820 of male parts, female parts, etc. 00:15:20.820 --> 00:15:25.625 And so, we're always asking the question, 00:15:25.625 --> 00:15:28.294 what is the payoff? 00:15:28.294 --> 00:15:32.231 Our primary hypothesis is this is a cry for help, 00:15:32.231 --> 00:15:38.237 which allows us to move squarely 00:15:38.237 --> 00:15:43.609 into that lane of intervention with community supports. 00:15:43.609 --> 00:15:47.280 They're simply hoping someone will see, 00:15:47.280 --> 00:15:50.716 someone will recognize that they have a need, 00:15:50.716 --> 00:15:54.587 will care for and love them enough to help. 00:15:59.158 --> 00:16:00.526 And it goes on and on. 00:16:03.863 --> 00:16:06.799 Unalive. Self-die. 00:16:06.799 --> 00:16:11.938 Depression spelt differently, simply to evade the censors, 00:16:11.938 --> 00:16:15.107 to ensure that their message is seen by an audience. 00:16:18.277 --> 00:16:21.314 As a parent, that's troubling. 00:16:21.314 --> 00:16:23.282 So we need to do all that we can. 00:16:23.282 --> 00:16:26.686 Now, some of the trends that we're seeing very specifically - 00:16:26.686 --> 00:16:28.487 and this data doesn't exist anywhere else - 00:16:28.487 --> 00:16:31.157 Safer Schools works with the entire province 00:16:31.157 --> 00:16:32.658 of British Columbia, 00:16:32.658 --> 00:16:37.864 so it works closely with each one of the school districts 00:16:37.864 --> 00:16:39.398 and the government. 00:16:39.398 --> 00:16:40.833 We also work in multiple states 00:16:40.833 --> 00:16:43.603 across the US and in parts of Europe, 00:16:43.603 --> 00:16:45.805 and have data similar to this.
00:16:45.805 --> 00:16:46.606 But what we're seeing, 00:16:46.606 --> 00:16:48.908 and I hope you can see this clearly enough 00:16:48.908 --> 00:16:52.545 - I can zoom in a little bit - 00:16:52.545 --> 00:16:56.515 is the blue line is pre-pandemic. 00:16:56.515 --> 00:17:00.953 The orange-y line is during the pandemic. 00:17:00.953 --> 00:17:02.121 Now, what we're seeing, 00:17:02.121 --> 00:17:04.156 and this probably doesn't surprise anyone, 00:17:04.156 --> 00:17:08.027 is that through our engagements with schools, 00:17:08.027 --> 00:17:09.996 with districts, with law enforcement, 00:17:09.996 --> 00:17:14.533 there have been case consults increasing in proportion 00:17:14.533 --> 00:17:16.836 around threat-related behavior, 00:17:16.836 --> 00:17:20.039 negative digital climate and culture, 00:17:20.039 --> 00:17:24.210 sextortion, hate, racism, or radicalization, 00:17:24.210 --> 00:17:28.948 mental health concerns, cyber bullying, etc. 00:17:28.948 --> 00:17:31.918 We also see that there has been a decline 00:17:31.918 --> 00:17:34.820 in some of the other areas. 00:17:34.820 --> 00:17:36.989 Okay. So what's going on? 00:17:36.989 --> 00:17:39.558 Part of those declines, school community concerns, 00:17:39.558 --> 00:17:42.261 we saw schools were closed for a long time, 00:17:42.261 --> 00:17:44.297 so we didn't receive those reports. 00:17:45.765 --> 00:17:48.234 Needless to say, 00:17:48.234 --> 00:17:51.671 with this increase of threat-related behavior, 00:17:51.671 --> 00:17:57.443 how do we intervene becomes the most important question. 00:17:57.443 --> 00:17:59.679 And so, we engage in a systematic review 00:17:59.679 --> 00:18:01.914 of social media from geocoding, 00:18:01.914 --> 00:18:07.019 so using known locations to find individuals, 00:18:07.019 --> 00:18:09.522 from reports from schools, 00:18:09.522 --> 00:18:12.391 and finally, from anonymous reporting. 00:18:12.391 --> 00:18:15.027 And I'll talk about that in a moment. 00:18:15.027 --> 00:18:21.300 But so, we've seen things like this in the past year. 00:18:21.300 --> 00:18:23.936 Concerning, no doubt. 00:18:23.936 --> 00:18:30.009 First question: is this a stock image, 00:18:30.009 --> 00:18:32.478 or do they really have the weapon? 00:18:32.478 --> 00:18:36.048 That changes our assessment level of risk. 00:18:36.048 --> 00:18:39.151 And similarly, in these images, what else can we find? 00:18:39.151 --> 00:18:40.519 How do we intervene? 00:18:40.519 --> 00:18:41.954 Who are the community stakeholders 00:18:41.954 --> 00:18:43.422 that we need to connect with? 00:18:45.224 --> 00:18:52.031 We're seeing as well the evidence of, 00:18:52.031 --> 00:18:56.502 as I've already spoken about, the mental health concerns. 00:18:56.502 --> 00:18:58.404 This trend, handprints, 00:18:58.404 --> 00:19:00.473 where individuals paint their hands 00:19:00.473 --> 00:19:04.744 and place those paint marks all across their body 00:19:04.744 --> 00:19:06.212 to indicate, 00:19:06.212 --> 00:19:10.716 this is where I've been inappropriately touched or hurt. 00:19:10.716 --> 00:19:14.620 Mm. Ouch. 00:19:14.620 --> 00:19:17.023 Is it a cry for help? 00:19:17.023 --> 00:19:21.627 How do we, as responsible persons, intervene?
00:19:21.627 --> 00:19:25.264 Well, thankfully, as I was just tipping my hat to, 00:19:25.264 --> 00:19:26.866 we are seeing, 00:19:26.866 --> 00:19:29.468 and we are educating, a growing culture 00:19:29.468 --> 00:19:33.939 where people report concerning behaviors or worrisome trends 00:19:33.939 --> 00:19:35.808 if they see them. 00:19:35.808 --> 00:19:37.376 And we advocate very strongly, 00:19:37.376 --> 00:19:40.579 and actually run an anonymous reporting tool 00:19:40.579 --> 00:19:43.082 so that people can do this. 00:19:43.082 --> 00:19:45.151 Now, in the anonymous reporting tool, 00:19:45.151 --> 00:19:48.387 we have the eyes of everyone in the community, 00:19:48.387 --> 00:19:50.923 provided they remember that the tool is there. 00:19:50.923 --> 00:19:55.628 So there's an education campaign that must take place. 00:19:55.628 --> 00:19:56.562 And no surprise, 00:19:56.562 --> 00:19:59.532 because the majority of that reporting 00:19:59.532 --> 00:20:03.035 comes from the students themselves, 00:20:03.035 --> 00:20:08.908 the highest category of tips is cyber bullying, 00:20:08.908 --> 00:20:13.646 school community concerns, relationships, substance abuse, 00:20:13.646 --> 00:20:18.217 and, go figure, mental health. 00:20:18.217 --> 00:20:21.687 And so, we recognize that there is an implicit duty 00:20:21.687 --> 00:20:23.489 of all of us, 00:20:23.489 --> 00:20:26.358 of everyone in the know who has access to resources 00:20:26.358 --> 00:20:29.095 and supports for our students, 00:20:29.095 --> 00:20:32.331 to help them through these mental health concerns, 00:20:32.331 --> 00:20:35.167 and again, move towards intervention. 00:20:35.167 --> 00:20:36.735 How do we do that? 00:20:36.735 --> 00:20:40.439 Digital threat assessment utilizing the technology 00:20:40.439 --> 00:20:45.077 allows us a deep purview into understanding the risks 00:20:45.077 --> 00:20:47.680 associated with somebody's behavior, 00:20:47.680 --> 00:20:51.016 and in particular, the shifts in those baselines. 00:20:51.016 --> 00:20:54.954 Once we extract and preserve the data 00:20:54.954 --> 00:20:58.524 that provides us a profile, 00:20:58.524 --> 00:21:04.597 we then consider that information not in isolation, 00:21:04.597 --> 00:21:06.799 but through the lens of a 00:21:06.799 --> 00:21:11.337 multidisciplinary assessment team, 00:21:11.337 --> 00:21:14.707 engaging all stakeholders, as you see here. 00:21:14.707 --> 00:21:19.912 But the online component has given us a golden ticket, 00:21:19.912 --> 00:21:22.448 which, back when I was in the schools, 00:21:22.448 --> 00:21:28.387 I could only wish we had, or wish I knew about. 00:21:28.387 --> 00:21:30.456 I mean, I could talk about some doozies there. 00:21:32.024 --> 00:21:34.226 On the positive side, 00:21:34.226 --> 00:21:37.129 even though we're seeing things like this and you can read, 00:21:37.129 --> 00:21:41.600 she says here, "I'm sorry if you know what this means." 00:21:41.600 --> 00:21:45.538 She's drawn a butterfly onto her wrist. 00:21:45.538 --> 00:21:49.842 What it means is, it's part of the Butterfly Project. 00:21:49.842 --> 00:21:52.811 And the Butterfly Project states, 00:21:52.811 --> 00:21:55.347 "If you've been engaging in self-harm, 00:21:55.347 --> 00:21:59.485 in cutting, draw a butterfly on yourself." 00:21:59.485 --> 00:22:02.021 And for every day that you don't cut, 00:22:02.021 --> 00:22:04.023 you leave the butterfly there.
00:22:04.023 --> 00:22:07.560 If the butterfly disappears 00:22:07.560 --> 00:22:08.961 and you haven't cut yourself again, 00:22:08.961 --> 00:22:10.896 you just saved a butterfly. 00:22:10.896 --> 00:22:15.034 You progress to the charms. 00:22:15.034 --> 00:22:17.903 For every week that you don't cut yourself, 00:22:17.903 --> 00:22:20.639 add a butterfly charm. 00:22:20.639 --> 00:22:23.008 And so, there are social media movements as well 00:22:23.008 --> 00:22:28.714 that are trying to intervene and promote positive mental health, 00:22:28.714 --> 00:22:31.116 for which I'm thankful. 00:22:31.116 --> 00:22:33.686 So, I'll end with, it's important, 00:22:33.686 --> 00:22:35.321 once we have this information, 00:22:35.321 --> 00:22:38.190 and Safer Schools Together does this very specifically 00:22:38.190 --> 00:22:40.693 with our 13 analysts, 00:22:40.693 --> 00:22:43.729 who survey the internet and collect information, 00:22:43.729 --> 00:22:47.032 and construct something of a digital baseline report, 00:22:47.032 --> 00:22:50.769 preserve the data, provide insights, 00:22:50.769 --> 00:22:55.608 and make notes of what's going on when we see it. 00:22:55.608 --> 00:22:58.510 Because when we share it with a multidisciplinary team, 00:22:58.510 --> 00:23:01.547 they might not understand all the cues. 00:23:01.547 --> 00:23:04.016 In this slide, the individual is talking about 00:23:04.016 --> 00:23:06.619 how he got that bullet hole in his shirt, 00:23:06.619 --> 00:23:09.021 then continues in his video. 00:23:09.021 --> 00:23:10.322 He says, "I'm not gonna tell you 00:23:10.322 --> 00:23:13.158 what's underneath my neck piece." 00:23:13.158 --> 00:23:17.229 A quick look into his online profile, 00:23:17.229 --> 00:23:19.198 and we see all sorts of evidence. 00:23:19.198 --> 00:23:21.500 "I go out because I stand remembering 00:23:21.500 --> 00:23:23.903 how my ex killed herself." 00:23:23.903 --> 00:23:27.973 He's not in a good place, hoping to get hit. 00:23:27.973 --> 00:23:34.213 Even the posting in his other social media profiles 00:23:34.213 --> 00:23:40.586 tips the hat to help us best ascertain the level of risk. 00:23:40.586 --> 00:23:42.254 And so, Safer Schools Together 00:23:42.254 --> 00:23:44.256 does this through multiple services. 00:23:44.256 --> 00:23:45.924 And if you're interested in more, 00:23:45.924 --> 00:23:49.061 I know this is a short time, and my time is now up, 00:23:49.061 --> 00:23:52.498 please do feel free to get in contact with us. 00:23:52.498 --> 00:23:54.733 And I look forward to the rest of the dialogue. 00:23:54.733 --> 00:23:56.068 Over to you, David. 00:23:56.068 --> 00:23:57.369 >>Thank you, Greg. 00:23:57.369 --> 00:24:00.973 And also, thank you for your inspiring work. 00:24:00.973 --> 00:24:04.276 As we've seen, especially all living through this pandemic, 00:24:04.276 --> 00:24:05.744 a trying time, 00:24:05.744 --> 00:24:08.113 both economically, for a lot of individuals, 00:24:08.113 --> 00:24:11.583 and being surrounded by so much turmoil 00:24:11.583 --> 00:24:15.454 with the tragic circumstances with COVID-19 00:24:15.454 --> 00:24:18.857 has definitely been a pressing time for mental health, 00:24:18.857 --> 00:24:20.192 as we've seen.
00:24:20.192 --> 00:24:22.728 I also want to point out, with a lot of your work, 00:24:22.728 --> 00:24:24.797 the cat and mouse game 00:24:24.797 --> 00:24:27.466 that oftentimes teens play with social media, 00:24:27.466 --> 00:24:30.469 and some of the content moderation algorithms 00:24:30.469 --> 00:24:32.671 that you're alluding to as well. 00:24:32.671 --> 00:24:36.141 I will certainly get back to that when we get into the Q&A. 00:24:36.141 --> 00:24:37.409 But without further ado, 00:24:37.409 --> 00:24:40.646 we want to bring on Dr. Vicki Harrison 00:24:40.646 --> 00:24:45.551 for the next mini presentation that she is going to give. 00:24:45.551 --> 00:24:47.820 Dr. Vicki Harrison is the program director 00:24:47.820 --> 00:24:51.690 at the Stanford Center for Youth Mental Health and Wellbeing, 00:24:51.690 --> 00:24:54.226 and the manager of community partnerships 00:24:54.226 --> 00:24:57.029 for the Stanford Department of Psychiatry 00:24:57.029 --> 00:24:58.597 and Behavioral Sciences. 00:24:58.597 --> 00:24:59.531 And on a personal note, 00:24:59.531 --> 00:25:02.034 I've had the pleasure and privilege 00:25:02.034 --> 00:25:06.538 of serving on multiple boards with Vicki over the years, 00:25:06.538 --> 00:25:08.907 and am always inspired by her incredible work. 00:25:08.907 --> 00:25:13.011 So, Vicki, I'd like to bring you on for your mini presentation. 00:25:14.313 --> 00:25:15.714 >>Thank you so much, David. 00:25:15.714 --> 00:25:18.417 Hopefully you can hear me. ->>Yes. 00:25:18.417 --> 00:25:21.887 >>And let me share my screen. 00:25:26.492 --> 00:25:27.659 There we go. 00:25:32.564 --> 00:25:33.999 All right. 00:25:37.336 --> 00:25:38.670 All right. 00:25:39.304 --> 00:25:43.175 Well, thank you again, everyone, for inviting me to be with you. 00:25:43.175 --> 00:25:47.112 And I'm just going to spend the next 10 to 15 minutes 00:25:47.112 --> 00:25:49.715 talking a little bit about the work I do 00:25:49.715 --> 00:25:51.950 and how it's relevant to this conversation. 00:25:51.950 --> 00:25:54.853 And so, the outline I have here is just talking a little bit 00:25:54.853 --> 00:25:58.757 about the benefits and risks of social media for young people, 00:25:58.757 --> 00:26:00.659 especially related to youth mental health, 00:26:00.659 --> 00:26:03.929 some of the vulnerabilities and the opportunities we have. 00:26:03.929 --> 00:26:05.497 To give you some context for 00:26:05.497 --> 00:26:09.568 what I bring to this conversation and where I work, 00:26:09.568 --> 00:26:11.703 at the Stanford Center for Youth Mental Health and Wellbeing, 00:26:11.703 --> 00:26:13.672 we're part of the School of Medicine, 00:26:13.672 --> 00:26:16.208 the Department of Psychiatry and Behavioral Science. 00:26:16.208 --> 00:26:17.810 And we have several initiatives. 00:26:17.810 --> 00:26:19.812 We mostly work out in communities, 00:26:19.812 --> 00:26:22.047 so we're trying to increase access 00:26:22.047 --> 00:26:25.551 to mental health services, decrease stigma, 00:26:25.551 --> 00:26:29.488 really focus on the public health end of the spectrum. 00:26:29.488 --> 00:26:31.390 So, prevention, early intervention. 00:26:31.390 --> 00:26:33.125 So, we do a lot of work with schools 00:26:33.125 --> 00:26:34.259 directly with young people, 00:26:34.259 --> 00:26:35.894 a lot of co-design. 00:26:35.894 --> 00:26:37.596 And media and mental health is one area 00:26:37.596 --> 00:26:40.365 that we care a lot about.
00:26:40.365 --> 00:26:43.335 This slide underscores a lot of the work that we do 00:26:43.335 --> 00:26:44.536 and why we do it. 00:26:44.536 --> 00:26:47.606 And this slide, we've been using for many years. 00:26:47.606 --> 00:26:51.410 So, these are pre-pandemic figures, 00:26:51.410 --> 00:26:54.813 that really half of all lifetime cases of mental illness 00:26:54.813 --> 00:26:56.548 begin by age 14. 00:26:56.548 --> 00:27:00.219 75% of those start by 24. 00:27:00.219 --> 00:27:01.687 So if you look at this graph 00:27:01.687 --> 00:27:04.189 with the yellow blip in the middle, 00:27:04.189 --> 00:27:07.292 that's really when mental health issues present. 00:27:07.292 --> 00:27:10.395 So, adolescence is the time where those emerge 00:27:10.395 --> 00:27:12.931 and where we really need to be supporting people 00:27:12.931 --> 00:27:14.967 and connecting them with services. 00:27:14.967 --> 00:27:17.369 And then I also point out here on the slide, 00:27:17.369 --> 00:27:20.539 the majority of young people do not access care, 00:27:20.539 --> 00:27:23.575 despite having mental health presentations. 00:27:23.575 --> 00:27:26.979 And suicide is currently the second leading cause of death 00:27:26.979 --> 00:27:29.481 for young people after unintentional injury. 00:27:31.216 --> 00:27:33.752 So, for the media work we do, 00:27:33.752 --> 00:27:36.021 I don't think it's a surprise how much media plays a role 00:27:36.021 --> 00:27:37.289 in all of our lives, 00:27:37.289 --> 00:27:39.691 especially the lives of young people. 00:27:39.691 --> 00:27:42.060 We look at it in these three main categories: 00:27:42.060 --> 00:27:45.531 the role of journalism and news media and the narratives there, 00:27:45.531 --> 00:27:49.434 some of which perpetuate some of the stereotypes and stigmas 00:27:49.434 --> 00:27:52.337 that we have around mental health; 00:27:52.337 --> 00:27:54.473 social media, we're going to talk about here; 00:27:54.473 --> 00:27:55.741 and entertainment media 00:27:55.741 --> 00:27:58.977 is obviously a big part of young people's lives. 00:27:58.977 --> 00:28:02.047 And we work with all of these sectors 00:28:02.047 --> 00:28:05.217 to try to improve the content 00:28:05.217 --> 00:28:08.153 of what young people are interacting with, 00:28:08.153 --> 00:28:09.821 and make sure that it's supporting mental health, 00:28:09.821 --> 00:28:12.491 and not contributing to contagion 00:28:12.491 --> 00:28:16.061 and some other risk factors that we'll talk about. 00:28:16.061 --> 00:28:17.729 I just wanted to include a slide here 00:28:17.729 --> 00:28:21.133 about the purposes of why young people are using social media. 00:28:21.133 --> 00:28:23.635 I don't think this is news to anyone, 00:28:23.635 --> 00:28:25.771 but I just want to talk about the benefits. 00:28:25.771 --> 00:28:29.241 Because we all use it and we all enjoy it and get a lot from it. 00:28:29.241 --> 00:28:31.376 And so, that sort of goes without saying. 00:28:31.376 --> 00:28:34.012 But I just wanted to point out some of the primary reasons 00:28:34.012 --> 00:28:35.847 that we hear from young people 00:28:35.847 --> 00:28:39.952 that they go on their phones and their devices so much. 00:28:39.952 --> 00:28:42.321 Connecting with friends is bolded for a reason. 00:28:42.321 --> 00:28:43.488 That's the primary reason. 00:28:43.488 --> 00:28:48.060 And it's hard to not be connected anymore 00:28:48.060 --> 00:28:50.963 when you're trying to interact with peers. 
00:28:50.963 --> 00:28:53.465 Also, they go on there to lift their mood. 00:28:53.465 --> 00:28:56.435 So, it does provide some mood management. 00:28:56.435 --> 00:28:58.203 Creative expression. 00:28:58.203 --> 00:29:00.038 And schoolwork, I added at the bottom, 00:29:00.038 --> 00:29:02.207 that's something that everyone needed, 00:29:02.207 --> 00:29:04.509 especially this past year. 00:29:04.509 --> 00:29:07.346 Even if you were trying not to have your young person 00:29:07.346 --> 00:29:08.580 on a device, 00:29:08.580 --> 00:29:10.716 it's pretty hard now once they get in school. 00:29:10.716 --> 00:29:14.219 They are sent to YouTube to watch school-related videos 00:29:14.219 --> 00:29:16.555 and need to check their grades and their work 00:29:16.555 --> 00:29:17.556 and submit it online. 00:29:17.556 --> 00:29:21.960 So, it's embedded in our lives, whether we want it to be or not. 00:29:21.960 --> 00:29:25.197 These are by no means exhaustive lists. 00:29:25.197 --> 00:29:26.898 But these are some of the challenges 00:29:26.898 --> 00:29:28.567 that we hear from young people 00:29:28.567 --> 00:29:31.403 that they face when they are just going on their phone 00:29:31.403 --> 00:29:34.406 for all of those positive reasons that they're seeking. 00:29:34.406 --> 00:29:39.011 And this is a pretty thorny list of big items 00:29:39.011 --> 00:29:42.481 that they have to face and deal with. 00:29:42.481 --> 00:29:44.316 Obviously, privacy is something 00:29:44.316 --> 00:29:46.918 that we're still trying to work out, 00:29:46.918 --> 00:29:48.253 for young people, especially. 00:29:48.253 --> 00:29:52.624 And the age verification process is very porous 00:29:52.624 --> 00:29:55.160 and not much of a protection. 00:29:55.160 --> 00:29:57.896 And so, young people, elementary school age, 00:29:57.896 --> 00:30:01.900 it's quite normal at this stage for them to all have phones 00:30:01.900 --> 00:30:03.168 and be on their phones. 00:30:03.168 --> 00:30:04.269 And so then, they're encountering 00:30:04.269 --> 00:30:06.838 this type of content 00:30:06.838 --> 00:30:10.442 really without a lot of guidance from parents, 00:30:10.442 --> 00:30:12.811 from school professionals. 00:30:12.811 --> 00:30:15.247 And the platforms are trying to help some, 00:30:15.247 --> 00:30:21.486 but it's pretty hard to prevent finding some of this content 00:30:21.486 --> 00:30:22.821 within just a few clicks. 00:30:24.956 --> 00:30:26.758 These are just some quotes from some of the work 00:30:26.758 --> 00:30:28.393 that we've done directly with young people, 00:30:28.393 --> 00:30:30.595 again, showing the good, the bad, and the ugly. 00:30:30.595 --> 00:30:35.300 So, there obviously is a great deal of benefit and connection, 00:30:35.300 --> 00:30:36.802 especially from some marginalized 00:30:36.802 --> 00:30:41.373 groups like LGBTQ young people, who are finding community 00:30:41.373 --> 00:30:44.042 where it may not exist in their home life 00:30:44.042 --> 00:30:46.611 or in their small communities. 00:30:46.611 --> 00:30:48.513 And so, it's really a lifeline for them 00:30:48.513 --> 00:30:50.716 to see some representation. 00:30:50.716 --> 00:30:54.453 The flip side of that is obviously hate speech 00:30:54.453 --> 00:30:57.155 and terrible content 00:30:57.155 --> 00:31:00.559 that they might see directed towards the characteristics 00:31:00.559 --> 00:31:02.761 that they identify with.
00:31:02.761 --> 00:31:05.831 But then you have the mood management, 00:31:05.831 --> 00:31:07.866 the creativity that they see. 00:31:07.866 --> 00:31:10.268 And then we hear a lot about the ads that they encounter, 00:31:10.268 --> 00:31:13.071 and how they really feel powerless with the ads 00:31:13.071 --> 00:31:15.173 that are being presented to them incessantly, 00:31:15.173 --> 00:31:19.478 and that they can tell are based on searches that they've done, 00:31:19.478 --> 00:31:22.180 whether it's related to searches they've done 00:31:22.180 --> 00:31:24.683 for physical activity or fitness, 00:31:24.683 --> 00:31:26.785 and then they'll start seeing more ads 00:31:26.785 --> 00:31:31.056 directed towards dieting or getting skinny. 00:31:31.056 --> 00:31:36.128 And that obviously is difficult to handle for many of them. 00:31:36.128 --> 00:31:38.697 And also, the violent content that they see, 00:31:38.697 --> 00:31:40.198 the streaming that people do, 00:31:40.198 --> 00:31:42.134 and that they will just be scrolling, 00:31:42.134 --> 00:31:45.604 and they'll see - in this case, they saw a murder online, 00:31:45.604 --> 00:31:47.606 or they'll see something that they can't unsee, 00:31:47.606 --> 00:31:51.109 and they didn't necessarily choose to see it. 00:31:51.109 --> 00:31:53.512 So overall, there's just a lot that young people 00:31:53.512 --> 00:31:55.580 have to deal with online. 00:31:55.580 --> 00:32:00.051 It's a ton of content that's hard to avoid. 00:32:00.051 --> 00:32:03.021 And I would say, as of today, 00:32:03.021 --> 00:32:07.192 these platforms are pretty unsafe for young people. 00:32:07.192 --> 00:32:10.395 It's pretty hard to not encounter this type of content. 00:32:11.463 --> 00:32:14.232 So, I wanted to just talk a moment 00:32:14.232 --> 00:32:17.869 about the role of media in suicide, 00:32:17.869 --> 00:32:19.604 because it's something that I think is relevant 00:32:19.604 --> 00:32:20.806 to these conversations 00:32:20.806 --> 00:32:25.811 and that people don't really - aren't really aware of 00:32:25.811 --> 00:32:28.146 and don't give a lot of weight to, 00:32:28.146 --> 00:32:29.414 where I think 00:32:29.414 --> 00:32:31.683 that it's something we need to look at more deeply. 00:32:31.683 --> 00:32:34.586 And that really is that the media has 00:32:34.586 --> 00:32:39.691 a pretty significant potential impact on suicidal behavior. 00:32:39.691 --> 00:32:40.225 And this is something 00:32:40.225 --> 00:32:43.028 that has quite a bit of research behind it 00:32:43.028 --> 00:32:44.830 over many, many years. 00:32:46.298 --> 00:32:48.466 So much so that these effects, 00:32:48.466 --> 00:32:52.971 which they have named, are based off of literature 00:32:52.971 --> 00:32:56.808 from centuries ago, where the degree of publicity 00:32:56.808 --> 00:32:59.344 given to a suicide is now directly correlated 00:32:59.344 --> 00:33:01.980 with the number of suicides that will follow it. 00:33:01.980 --> 00:33:04.850 And here, I just included three headlines 00:33:04.850 --> 00:33:06.651 from the three buckets of media 00:33:06.651 --> 00:33:10.722 that we look at of how this contagion effect plays out. 00:33:10.722 --> 00:33:13.191 And so, as much as we like to think of suicide 00:33:13.191 --> 00:33:15.660 as a very personal, individual act, 00:33:15.660 --> 00:33:18.930 it really has a social component to it. 00:33:18.930 --> 00:33:21.132 It's got a social construct behind it.
00:33:21.132 --> 00:33:24.603 And that means that the more that you publicize something, 00:33:24.603 --> 00:33:30.976 the more that that individual suicide was sensationalized, 00:33:30.976 --> 00:33:35.714 it can increase suicides to follow, or copycat suicides. 00:33:35.714 --> 00:33:38.049 So we saw, when Robin Williams died, 00:33:38.049 --> 00:33:41.119 there was a 10% increase following his death 00:33:41.119 --> 00:33:45.257 in the same age range and male bracket 00:33:45.257 --> 00:33:48.293 that he was part of, middle-aged men. 00:33:48.293 --> 00:33:51.229 And then the same with 13 Reasons 00:33:51.229 --> 00:33:53.198 Why, which is pretty controversial 00:33:53.198 --> 00:33:57.202 because of the very graphic suicide that it showed. 00:33:57.202 --> 00:33:58.770 And there was a spike after that. 00:33:58.770 --> 00:34:00.205 And then on social media, 00:34:00.205 --> 00:34:02.140 we're seeing more of that related to self-harm. 00:34:02.140 --> 00:34:03.441 So, the good news is that 00:34:03.441 --> 00:34:05.610 there's the so-called Papageno effect, 00:34:05.610 --> 00:34:09.481 which is the flip side - that you can prevent suicides, 00:34:09.481 --> 00:34:12.417 potentially, if you spread different messages 00:34:12.417 --> 00:34:14.753 that are not glorifying the act of suicide 00:34:14.753 --> 00:34:17.589 or giving attention to the behavior. 00:34:17.589 --> 00:34:20.558 And I put here reportingonsuicide.org, 00:34:20.558 --> 00:34:22.394 which is where you will see some of the guidelines 00:34:22.394 --> 00:34:25.630 that are put out there for mass media to follow. 00:34:25.630 --> 00:34:28.600 There's also a set for reporting on mass shootings. 00:34:28.600 --> 00:34:30.335 And ironically or not, 00:34:30.335 --> 00:34:32.771 I was working on this slide deck last week, 00:34:32.771 --> 00:34:34.105 when down the street from me, 00:34:34.105 --> 00:34:36.374 the San Jose shooting took place, 00:34:36.374 --> 00:34:40.011 and was seeing the streaming footage 00:34:40.011 --> 00:34:44.015 on the news of just the event and the aftermath. 00:34:44.015 --> 00:34:47.485 And so, this is the type of attention and publicity 00:34:47.485 --> 00:34:50.989 that can really create contagion. 00:34:50.989 --> 00:34:52.357 And this slide just shows you 00:34:52.357 --> 00:34:57.696 a really compelling set of evidence for this effect. 00:34:57.696 --> 00:35:01.099 And this was something that happened in Vienna in the '80s, 00:35:01.099 --> 00:35:04.102 where they had a lot of people taking their lives 00:35:04.102 --> 00:35:06.237 in the subway system. 00:35:06.237 --> 00:35:09.541 And the safe reporting guidelines 00:35:09.541 --> 00:35:11.443 went into effect in 1987. 00:35:11.443 --> 00:35:14.846 And directly following changes by the media 00:35:14.846 --> 00:35:16.281 in how they reported, 00:35:16.281 --> 00:35:19.317 the suicides and attempts decreased by 80%. 00:35:19.317 --> 00:35:23.021 So, pretty striking. 00:35:23.021 --> 00:35:26.458 And so, the reason I think this is relevant, 00:35:26.458 --> 00:35:28.727 just that obviously, there is some harm 00:35:28.727 --> 00:35:30.962 that is being done by the degree of publicity 00:35:30.962 --> 00:35:34.032 and media attention that we're giving. 00:35:34.032 --> 00:35:38.169 And we can also flip that by doing a better job.
00:35:38.169 --> 00:35:41.306 And really, there are still ways to report on these incidents, 00:35:41.306 --> 00:35:45.377 but doing it in ways that use different types of headlines, 00:35:45.377 --> 00:35:50.982 less graphic imagery, and can really provide some ways to help 00:35:50.982 --> 00:35:52.917 to prevent this type of thing, 00:35:52.917 --> 00:35:56.421 and also influence better behavior 00:35:56.421 --> 00:35:57.756 instead of harmful behavior. 00:35:57.756 --> 00:36:00.825 And I just want to point out that some mass shootings 00:36:00.825 --> 00:36:03.828 and terror attacks are often murder-suicides. 00:36:03.828 --> 00:36:06.598 And so, even though some people will consider 00:36:06.598 --> 00:36:10.869 that kind of a separate bucket from just an individual suicide, 00:36:10.869 --> 00:36:12.971 it's something worth looking at. 00:36:12.971 --> 00:36:16.541 And I think that there's also really a need for further study 00:36:16.541 --> 00:36:19.744 of contagion effects in social media in general. 00:36:19.744 --> 00:36:21.946 And this information is something that I think 00:36:21.946 --> 00:36:24.349 is being talked about and we'll hear more about, 00:36:24.349 --> 00:36:26.685 that is also contagious in many ways. 00:36:28.687 --> 00:36:31.456 So, I think I put this slide here 00:36:31.456 --> 00:36:34.692 because I think that we have a lot of concerns 00:36:34.692 --> 00:36:37.495 about whether the benefits are outweighing the risks 00:36:37.495 --> 00:36:40.198 or if it's starting to tip a little bit. 00:36:40.198 --> 00:36:45.236 This statistic is recent. So, having come out after COVID, 00:36:45.236 --> 00:36:47.739 I think we've seen that hate speech, 00:36:47.739 --> 00:36:49.374 body shaming, racist content, 00:36:49.374 --> 00:36:52.677 that's all increased in the past year. 00:36:52.677 --> 00:36:55.814 So, this is one in four young people saying 00:36:55.814 --> 00:36:57.582 not just that they have encountered this content, 00:36:57.582 --> 00:36:59.717 but that they encounter it often. 00:36:59.717 --> 00:37:02.487 So really, the ecosystem that we have for young people 00:37:02.487 --> 00:37:03.955 is currently just not - 00:37:03.955 --> 00:37:06.024 it's not setting them up for success. 00:37:06.024 --> 00:37:09.427 So, we need a combination of individual skill-building, 00:37:09.427 --> 00:37:14.265 but also, I think, substantial structural changes. 00:37:14.265 --> 00:37:16.968 This is just a couple examples of some of the work 00:37:16.968 --> 00:37:18.903 that we are doing in this area. 00:37:18.903 --> 00:37:22.107 This was a set of guidelines trying to help control 00:37:22.107 --> 00:37:26.344 some of the suicidal content that happens online. 00:37:26.344 --> 00:37:28.680 As you saw from the prior presentation, 00:37:28.680 --> 00:37:29.948 it's hard to do that. 00:37:29.948 --> 00:37:33.218 And the lexicon is always evolving. 00:37:33.218 --> 00:37:35.620 But this is a set of guidelines that we are working - 00:37:35.620 --> 00:37:37.122 we've worked with Orygen in Australia 00:37:37.122 --> 00:37:38.923 around developing these, 00:37:38.923 --> 00:37:43.461 and a second set is coming out soon. 00:37:43.461 --> 00:37:46.898 This is a project that I launched with some young people 00:37:46.898 --> 00:37:49.067 we work with just earlier this year.
00:37:49.067 --> 00:37:52.470 So, this is kind of recognizing that while we work 00:37:52.470 --> 00:37:55.340 on these structural changes that are needed, 00:37:55.340 --> 00:37:57.075 young people are really left to navigate 00:37:57.075 --> 00:38:00.445 these pretty substantial themes and issues on their own. 00:38:00.445 --> 00:38:04.115 And so, it's how can we help them get through this, 00:38:04.115 --> 00:38:05.483 when it's a key part 00:38:05.483 --> 00:38:06.885 of adolescence to be on your phone 00:38:06.885 --> 00:38:08.686 and be interacting with your friends? 00:38:08.686 --> 00:38:10.955 So, this is a peer mentoring campaign 00:38:10.955 --> 00:38:13.792 where we're really trying to help older teens 00:38:13.792 --> 00:38:17.562 who have navigated these issues help younger teens 00:38:17.562 --> 00:38:20.165 that are just starting on their phones, 00:38:20.165 --> 00:38:22.200 to do a better job of protecting their mental health, 00:38:22.200 --> 00:38:25.069 or using strategies that might help them. 00:38:25.069 --> 00:38:26.905 So, these are some of the goals of what we're doing 00:38:26.905 --> 00:38:28.706 with this project, 00:38:28.706 --> 00:38:31.142 which are useful for this project, 00:38:31.142 --> 00:38:32.410 but also the types of themes 00:38:32.410 --> 00:38:34.546 that I would say we need to start 00:38:34.546 --> 00:38:36.247 incorporating into our design. 00:38:36.247 --> 00:38:40.018 When we are looking at how we work on improving platforms, 00:38:40.018 --> 00:38:43.788 we really, in addition to just creating more diversity 00:38:43.788 --> 00:38:48.226 and lived experience in the design process, 00:38:48.226 --> 00:38:50.662 really just need to empower young people 00:38:50.662 --> 00:38:52.897 to promote authenticity, 00:38:52.897 --> 00:38:56.534 get away from this curation and this sense of perfection, 00:38:56.534 --> 00:38:58.636 and this sort of inauthentic world 00:38:58.636 --> 00:39:00.972 that social media has created for them 00:39:00.972 --> 00:39:03.208 that creates a lot of comparison. 00:39:03.208 --> 00:39:05.343 Promoting empathy and belonging, and really just, 00:39:05.343 --> 00:39:08.546 it's the humanization aspect that we need to pull out. 00:39:11.082 --> 00:39:15.353 So, kind of in summary, I think this is just a graph 00:39:15.353 --> 00:39:19.924 that I created to show that as much as those examples 00:39:19.924 --> 00:39:23.294 I just gave are really focused on this individual level 00:39:23.294 --> 00:39:26.731 of what you can control as a person, 00:39:26.731 --> 00:39:28.700 I think there's a lot of work that's put in that area, 00:39:28.700 --> 00:39:30.134 and it's clearly very important. 00:39:30.134 --> 00:39:33.171 Media literacy and getting more control. 00:39:33.171 --> 00:39:35.273 I know that Instagram and Facebook 00:39:35.273 --> 00:39:39.410 rolled out a feature to hide likes last week. 00:39:39.410 --> 00:39:40.945 And so, that is the kind of thing 00:39:40.945 --> 00:39:43.848 that you could create more control individually.
00:39:43.848 --> 00:39:45.984 But really, the majority of the recommendations 00:39:45.984 --> 00:39:49.020 that I think need to happen are more at the structural level, 00:39:49.020 --> 00:39:52.824 where we really need to invest more in age-appropriate design, 00:39:52.824 --> 00:39:56.094 safety by design, research these issues more, 00:39:56.094 --> 00:39:57.662 really have zero tolerance 00:39:57.662 --> 00:40:00.965 for some of the hate that we're seeing online, 00:40:00.965 --> 00:40:04.068 and cracking down on this mis- and disinformation 00:40:04.068 --> 00:40:06.938 in order to make these places safer, 00:40:06.938 --> 00:40:10.241 because we're really creating a vulnerable population 00:40:10.241 --> 00:40:12.744 of young people. 00:40:12.744 --> 00:40:17.081 And I will cede my time and hand it back over to David. 00:40:17.081 --> 00:40:18.583 Thank you very much. 00:40:18.583 --> 00:40:20.218 >>Well, thank you, Vicki, and thank you again 00:40:20.218 --> 00:40:23.254 for your incredibly valuable work 00:40:23.254 --> 00:40:25.757 in trying to build a better tech future. 00:40:25.757 --> 00:40:29.260 As we both know, the social media experience, 00:40:29.260 --> 00:40:30.628 especially for teens, 00:40:30.628 --> 00:40:33.931 is not something that's always a choice anymore. 00:40:33.931 --> 00:40:37.302 It's almost hard not to be on social media platforms. 00:40:37.302 --> 00:40:42.140 So, to your point, I would certainly want to second 00:40:42.140 --> 00:40:44.942 that social media obviously needs to do a better job 00:40:44.942 --> 00:40:47.111 of making sure that it aligns 00:40:47.111 --> 00:40:49.714 with some of the values for youth. 00:40:49.714 --> 00:40:54.986 In particular, as you mentioned, with the kind of metrics 00:40:54.986 --> 00:40:59.424 anxiety where oftentimes teens post what will be liked, 00:40:59.424 --> 00:41:02.593 as opposed to what they authentically like. 00:41:02.593 --> 00:41:05.830 Certainly, we'll come back to that in the Q&A. 00:41:05.830 --> 00:41:07.598 But without further ado, 00:41:07.598 --> 00:41:11.669 we want to bring on Dr. Kelli Dunlap. 00:41:11.669 --> 00:41:15.440 Kelli is a clinical psychologist, game designer, 00:41:15.440 --> 00:41:19.444 and Adjunct Professor of Game Design at American University. 00:41:19.444 --> 00:41:23.481 Also, Kelli is the community manager for Take This, 00:41:23.481 --> 00:41:26.551 a games-focused mental health nonprofit; 00:41:26.551 --> 00:41:28.786 and the chair of the International Game 00:41:28.786 --> 00:41:32.256 Developers Association's Mental Health special interest group. 00:41:32.256 --> 00:41:35.460 So, with that, we'd like to bring on Kelli 00:41:35.460 --> 00:41:39.163 for the mini presentation. 00:41:39.163 --> 00:41:40.798 >>Hello. 00:41:40.798 --> 00:41:44.102 I am just trying to get it to share my screen, 00:41:44.102 --> 00:41:45.903 so give me just a moment. 00:41:47.338 --> 00:41:48.973 I am not familiar with Teams. 00:41:48.973 --> 00:41:50.241 I've gotten every other one down. 00:41:50.241 --> 00:41:51.776 But Teams is a whole [crosstalk]. 00:41:51.776 --> 00:41:53.511 >>[Laughter] There's a lot of platform - 00:41:53.511 --> 00:41:54.779 it should be, yes. 00:41:54.779 --> 00:41:58.516 There are a lot of platform choices we have. 00:41:58.516 --> 00:42:00.618 So, we'll bear with you. 00:42:00.618 --> 00:42:07.525 And if not, I'll do a routine as an in-between moderator. 00:42:10.027 --> 00:42:11.262 -Do you see it? ->>Up at the top.
00:42:11.262 --> 00:42:14.732 I'm getting mail that says it's up at the top. 00:42:14.732 --> 00:42:19.871 >>Do you see a share tray to the right of your microphone? 00:42:19.871 --> 00:42:22.407 >>Yeah. I see open and close share tray. 00:42:22.407 --> 00:42:23.875 But when I open it, nothing happens. 00:42:23.875 --> 00:42:25.076 [Laughter] 00:42:25.076 --> 00:42:27.478 >>Oh, you don't have below that a share screen? 00:42:27.478 --> 00:42:31.182 It should pop up for a share screen desktop window. 00:42:31.182 --> 00:42:32.383 >>Nope. Not there. 00:42:32.383 --> 00:42:34.652 >>That's not popping up? 00:42:34.652 --> 00:42:36.254 >>Okay. 00:42:36.254 --> 00:42:37.789 >>Everything's been going too smooth so far, 00:42:37.789 --> 00:42:39.090 so I need to make sure 00:42:39.090 --> 00:42:41.225 that I bring in some technical challenge. 00:42:41.225 --> 00:42:44.562 >>Oh, not at all. [Laughter] 00:42:44.562 --> 00:42:46.431 If you'd like, I mean, we can play around 00:42:46.431 --> 00:42:47.698 with that a little more, 00:42:47.698 --> 00:42:52.670 or if it's also preferred, if you have your slides nearby, 00:42:52.670 --> 00:42:57.275 we could always do that without the visuals. 00:42:57.275 --> 00:42:58.543 >>Yes, I do have my slides, 00:42:58.543 --> 00:43:02.180 and I can actually send you the link, if that works for you. 00:43:02.180 --> 00:43:03.748 >>And then as a reminder for everyone, 00:43:03.748 --> 00:43:07.251 I did see a few questions in the chat. 00:43:07.251 --> 00:43:10.755 Yes, all of this information will be made available 00:43:10.755 --> 00:43:15.393 to each of you who registered for today. 00:43:15.393 --> 00:43:17.462 And there will be additional information 00:43:17.462 --> 00:43:20.565 that will also be shared from our panelists. 00:43:25.236 --> 00:43:26.537 >>All right. 00:43:26.537 --> 00:43:30.107 So, you should be getting an email in just a second. 00:43:38.015 --> 00:43:39.584 In theory. 00:43:41.252 --> 00:43:43.221 >>Don't worry, Kelli. We're all on the same Teams. 00:43:43.221 --> 00:43:44.489 [Laughter] 00:43:44.489 --> 00:43:45.690 >>There we go. 00:43:45.690 --> 00:43:47.592 A little Microsoft joke, I guess. 00:43:47.592 --> 00:43:48.860 >>Yeah. 00:43:48.860 --> 00:43:50.127 >>Yeah. 00:43:50.127 --> 00:43:51.329 >>You know what? 00:43:51.329 --> 00:43:53.564 I'm just going to drop the link in chat, 00:43:53.564 --> 00:43:56.467 because that's just going to be what it is. 00:43:59.804 --> 00:44:01.405 So, if other people want to follow along, 00:44:01.405 --> 00:44:04.008 they're happy, they're welcome to as well. 00:44:06.043 --> 00:44:08.212 >>Yeah, you can open up the slides 00:44:08.212 --> 00:44:09.547 that were just sent over, 00:44:09.547 --> 00:44:11.983 and then this will be like a museum tour, 00:44:11.983 --> 00:44:14.685 where you can follow along at your own pace. 00:44:14.685 --> 00:44:16.787 >>[Laughter] All right. 00:44:16.787 --> 00:44:18.990 Well, thank you all so much, and thank you, Greg, for that... 00:44:18.990 --> 00:44:21.058 or, I'm sorry, thank you, David, for that lovely introduction, 00:44:21.058 --> 00:44:23.995 and thank you, everyone, for hanging in there with me 00:44:23.995 --> 00:44:27.265 while we figure this out. 00:44:27.265 --> 00:44:29.167 So, yes, my name is Dr. Kelli Dunlap. 00:44:29.167 --> 00:44:31.068 This is slide one. 00:44:31.068 --> 00:44:33.404 And I am the community manager for Take This, 00:44:33.404 --> 00:44:36.440 and that is who I'm here representing today. 
00:44:36.440 --> 00:44:38.409 But as mentioned previously, I am also a licensed 00:44:38.409 --> 00:44:39.911 and practicing clinical psychologist. 00:44:39.911 --> 00:44:43.214 So, my day job is actually seeing clients. 00:44:43.214 --> 00:44:44.382 And I do teach game design. 00:44:44.382 --> 00:44:45.583 I have a master's in game design, 00:44:45.583 --> 00:44:47.818 and I'm very much involved in the game 00:44:47.818 --> 00:44:50.421 and game development spaces. 00:44:50.421 --> 00:44:53.524 So, for slide two, if you're following along at home, 00:44:53.524 --> 00:44:55.693 I want to tell you a little bit about Take This. 00:44:55.693 --> 00:44:58.196 So, Take This was founded in 2012, 00:44:58.196 --> 00:45:01.832 and it is a nonprofit dedicated to decreasing the stigma 00:45:01.832 --> 00:45:04.769 and increasing support around the mental health 00:45:04.769 --> 00:45:08.239 of game players and game developers. 00:45:08.239 --> 00:45:11.008 At Take This, we have three core pillars 00:45:11.008 --> 00:45:13.778 that support our mission and guide our work. 00:45:13.778 --> 00:45:16.614 And those are support, community, and mental wellness. 00:45:16.614 --> 00:45:19.250 So, support is our focus 00:45:19.250 --> 00:45:21.419 on providing mental health education and training 00:45:21.419 --> 00:45:24.221 to game players and game developers. 00:45:24.221 --> 00:45:26.257 Take This works closely with studios and teams 00:45:26.257 --> 00:45:28.859 across the industry to support their mental health needs. 00:45:28.859 --> 00:45:30.161 In the past year, 00:45:30.161 --> 00:45:33.164 we've conducted workshops for game studios of all sizes, 00:45:33.164 --> 00:45:36.334 and we've spoken at and organized numerous conferences 00:45:36.334 --> 00:45:38.903 and events about self-care, community management, 00:45:38.903 --> 00:45:40.838 leadership, mental health accommodations, 00:45:40.838 --> 00:45:43.374 and how to be an advocate. 00:45:43.374 --> 00:45:46.210 Slide five. Our community programs 00:45:46.210 --> 00:45:48.212 focus on connections and resilience 00:45:48.212 --> 00:45:49.947 through support and engagement. 00:45:49.947 --> 00:45:51.682 We know from mental health research 00:45:51.682 --> 00:45:52.917 that feeling connected and supported 00:45:52.917 --> 00:45:55.987 is an important piece of preventative mental healthcare. 00:45:55.987 --> 00:45:57.588 And we use our community programs 00:45:57.588 --> 00:46:01.058 to provide spaces for community to develop. 00:46:01.058 --> 00:46:02.727 So, for example, we have a Twitch channel, 00:46:02.727 --> 00:46:04.395 and we also have a Discord. 00:46:04.395 --> 00:46:06.263 We also highlight and support others 00:46:06.263 --> 00:46:08.332 in the gaming spaces and streaming community 00:46:08.332 --> 00:46:09.900 who share our goals. 00:46:09.900 --> 00:46:10.801 For example, 00:46:10.801 --> 00:46:12.903 our Streaming Ambassador program features streamers 00:46:12.903 --> 00:46:16.140 who use their platforms to advocate for mental health 00:46:16.140 --> 00:46:19.110 and destigmatize mental illness. 00:46:19.110 --> 00:46:21.746 Offline, Take This has a program for creating safe, 00:46:21.746 --> 00:46:25.883 quiet spaces at gaming conventions, such as PAX and E3. 00:46:25.883 --> 00:46:28.052 And these are known as the AFK rooms.
00:46:30.655 --> 00:46:32.657 These rooms are staffed by trained volunteers 00:46:32.657 --> 00:46:34.592 as well as mental health professionals 00:46:34.592 --> 00:46:37.595 so that we can continue to provide education and support 00:46:37.595 --> 00:46:41.799 at the places where gamers gather. 00:46:41.799 --> 00:46:43.567 And then, of course, at our core, 00:46:43.567 --> 00:46:46.404 we are dedicated to mental wellness 00:46:46.404 --> 00:46:48.539 through destigmatization of mental illness 00:46:48.539 --> 00:46:49.774 and increasing access 00:46:49.774 --> 00:46:52.310 to mental health resources and information. 00:46:52.310 --> 00:46:54.845 So, an important distinction here is that 00:46:54.845 --> 00:46:56.781 just because you do not have good mental health 00:46:56.781 --> 00:46:59.216 does not mean that you have a mental illness. 00:46:59.216 --> 00:47:01.218 And so, highlighting those differences 00:47:01.218 --> 00:47:03.054 is really, really important. 00:47:03.054 --> 00:47:06.257 But awareness and education are just the beginning for us. 00:47:06.257 --> 00:47:08.926 Through resource sharing and partnering with organizations 00:47:08.926 --> 00:47:12.496 like Twitch and the Games and Online Harassment Hotline, 00:47:12.496 --> 00:47:15.066 we strive to bring these kinds of resources and wellness 00:47:15.066 --> 00:47:17.802 to game players and developers at a larger scale, 00:47:17.802 --> 00:47:19.837 more than we could ever do on our own. 00:47:19.837 --> 00:47:21.205 And that was slide six. 00:47:21.205 --> 00:47:23.507 So, thank you for putting them up on the screen. 00:47:23.507 --> 00:47:25.042 [Laughter] All right. 00:47:25.042 --> 00:47:29.346 So, that's kind of a quick recap of who we are at Take This. 00:47:29.346 --> 00:47:32.049 Next slide. 00:47:32.049 --> 00:47:35.319 So, bringing resources to the gaming community 00:47:35.319 --> 00:47:38.556 is very much aligned with the old clinical saying of, 00:47:38.556 --> 00:47:40.291 "Meet the client where they're at." 00:47:40.291 --> 00:47:42.393 And at Take This, we bring our expertise 00:47:42.393 --> 00:47:44.195 as mental health professionals 00:47:44.195 --> 00:47:46.097 into gaming spaces and communities, 00:47:46.097 --> 00:47:48.766 because those are the exact same spaces and communities 00:47:48.766 --> 00:47:51.202 that we ourselves are already a part of. 00:47:53.270 --> 00:47:54.538 As I mentioned earlier, 00:47:54.538 --> 00:47:56.674 feeling a sense of connection and community 00:47:56.674 --> 00:47:59.410 is an important part of preventative mental healthcare. 00:47:59.410 --> 00:48:00.878 Given the state of mental healthcare, 00:48:00.878 --> 00:48:02.079 [laughter] 00:48:02.079 --> 00:48:06.083 even before the pandemic, we need all the help we can get. 00:48:06.083 --> 00:48:08.152 This map here on your screen 00:48:08.152 --> 00:48:12.390 represents the unmet need for mental healthcare 00:48:12.390 --> 00:48:13.724 across the United States 00:48:13.724 --> 00:48:17.261 due specifically to healthcare provider shortages. 00:48:17.261 --> 00:48:22.366 And the darker red the state, the greater the unmet need. 00:48:22.366 --> 00:48:24.034 So, it's not good. [Laughter] 00:48:24.034 --> 00:48:27.371 And again, that was 2019, before COVID. 00:48:27.371 --> 00:48:30.274 All right, next slide.
00:48:30.274 --> 00:48:34.545 Prior to COVID, there were 47 million American adults 00:48:34.545 --> 00:48:37.081 and seven million American children 00:48:37.081 --> 00:48:40.384 who met criteria for at least one mental illness. 00:48:40.384 --> 00:48:42.520 And unfortunately, despite the prevalence 00:48:42.520 --> 00:48:44.688 of mental illness throughout the US, 00:48:44.688 --> 00:48:46.857 the majority of those who experience it 00:48:46.857 --> 00:48:50.094 do not receive any kind of therapeutic intervention. 00:48:50.094 --> 00:48:52.963 In fact, next slide, on average, 00:48:52.963 --> 00:48:55.065 less than half of those who meet criteria 00:48:55.065 --> 00:48:57.802 will actually obtain any kind of treatment. 00:48:57.802 --> 00:48:59.403 This is for a variety of reasons, 00:48:59.403 --> 00:49:01.906 from the practitioner shortages that I just mentioned 00:49:01.906 --> 00:49:05.342 to things like cost or lack of access, 00:49:05.342 --> 00:49:08.946 and just general stigma against mental illness and help-seeking. 00:49:11.816 --> 00:49:15.853 So, on our next slide here, in August of 2020, 00:49:15.853 --> 00:49:18.322 after five months of social distancing and lockdown, 00:49:18.322 --> 00:49:20.024 the Centers for Disease Control and Prevention reported 00:49:20.024 --> 00:49:22.359 that the prevalence of anxiety and depression, 00:49:22.359 --> 00:49:24.795 the two most common types of mental illness, 00:49:24.795 --> 00:49:27.765 were three and four times higher, respectively, 00:49:27.765 --> 00:49:30.835 than they were at the same point during the previous year. 00:49:30.835 --> 00:49:33.537 Between January 2020 and September 2020, 00:49:33.537 --> 00:49:38.609 the number of people seeking help for anxiety rose 634%, 00:49:38.609 --> 00:49:42.947 and the number seeking support for depression rose 873%. 00:49:44.481 --> 00:49:47.251 Next slide. 00:49:47.251 --> 00:49:51.388 Suicidality, in terms of suicidal thoughts, 00:49:51.388 --> 00:49:53.023 is also at an all-time high. 00:49:53.023 --> 00:49:56.060 So, 37% of distressed individuals 00:49:56.060 --> 00:49:59.563 reported having suicidal thoughts almost every day. 00:49:59.563 --> 00:50:01.365 And the most common 00:50:01.365 --> 00:50:04.969 and highest rates were reported by LGBTQIA youth. 00:50:07.504 --> 00:50:08.672 Next slide. 00:50:10.774 --> 00:50:13.344 Given the shortage of mental health professionals 00:50:13.344 --> 00:50:16.146 and the significant systemic barriers to treatment - 00:50:16.146 --> 00:50:19.149 again, cost, insurance, location, and stigma - 00:50:19.149 --> 00:50:21.685 and the predictable, yet still alarming spike 00:50:21.685 --> 00:50:23.687 in requests for mental health services, 00:50:23.687 --> 00:50:26.123 it is not a surprise at all that people seek 00:50:26.123 --> 00:50:28.893 and find nontraditional resources, 00:50:28.893 --> 00:50:31.128 like streamers and streaming communities, 00:50:31.128 --> 00:50:32.963 to get their mental health needs met 00:50:32.963 --> 00:50:35.799 and find mental health support. 00:50:35.799 --> 00:50:37.167 Next. 00:50:38.302 --> 00:50:41.272 Speaking of streaming, in April of 2019, 00:50:41.272 --> 00:50:46.010 Twitch, which is the largest broadcaster of game-play streams 00:50:46.010 --> 00:50:48.312 in the world, I think, currently, 00:50:48.312 --> 00:50:52.216 reported 4.2 million active streamers on their platform.
00:50:52.216 --> 00:50:54.285 But one year later, in April of 2020, 00:50:54.285 --> 00:50:56.854 after COVID hit, that number skyrocketed 00:50:56.854 --> 00:50:59.757 to 7.2 million active streamers. 00:50:59.757 --> 00:51:00.624 And the number of people 00:51:00.624 --> 00:51:02.960 watching also dramatically increased, 00:51:02.960 --> 00:51:07.498 from 889 million hours watched in April of 2019 00:51:07.498 --> 00:51:11.802 to 1.8 billion hours watched in April of 2020. 00:51:11.802 --> 00:51:15.239 So, the massive increase in engagement on Twitch 00:51:15.239 --> 00:51:18.275 is largely attributed to the majority of the US population 00:51:18.275 --> 00:51:21.845 being under stay at home orders or other lockdown measures, 00:51:21.845 --> 00:51:25.816 such as the cancellation of professional sporting events. 00:51:25.816 --> 00:51:26.850 Next slide. 00:51:28.585 --> 00:51:31.522 Since the COVID-19 pandemic began, 00:51:31.522 --> 00:51:34.892 Twitch communities have provided an essential social service 00:51:34.892 --> 00:51:36.760 by keeping people connected. 00:51:36.760 --> 00:51:39.697 Increases in depression and anxiety and loneliness, 00:51:39.697 --> 00:51:41.598 and reduced social support 00:51:41.598 --> 00:51:43.600 are very common adverse mental health 00:51:43.600 --> 00:51:46.236 impacts experienced during COVID. 00:51:46.236 --> 00:51:49.006 However, social bonding through collective gatherings, 00:51:49.006 --> 00:51:51.408 like Twitch streams, has been shown to have 00:51:51.408 --> 00:51:55.312 protective psychological effects, 00:51:55.312 --> 00:51:58.315 especially in times of tragedy and crisis. 00:51:58.315 --> 00:52:00.651 Next slide. 00:52:00.651 --> 00:52:02.252 Online platforms like Twitch 00:52:02.252 --> 00:52:04.288 and community-building tools like Discord 00:52:04.288 --> 00:52:06.023 have been psychologically protective 00:52:06.023 --> 00:52:07.992 and supportive environments. 00:52:07.992 --> 00:52:10.327 Research has shown that social gatherings, 00:52:10.327 --> 00:52:13.630 whether in person or online, facilitate bonding 00:52:13.630 --> 00:52:17.101 and enhance a sense of cohesion and social identity. 00:52:17.101 --> 00:52:19.770 It's very much a coming together effect 00:52:19.770 --> 00:52:22.206 that provides mutual support. 00:52:22.206 --> 00:52:24.341 Collective gatherings provide a venue 00:52:24.341 --> 00:52:27.544 for socially acceptable outpouring of emotion. 00:52:27.544 --> 00:52:30.881 It allows us to acknowledge that tragedy is a collective grief, 00:52:30.881 --> 00:52:33.150 rather than maybe an individual loss. 00:52:33.150 --> 00:52:34.651 And it reminds community members 00:52:34.651 --> 00:52:38.689 that they are not alone in their struggles. 00:52:38.689 --> 00:52:40.858 Next slide. 00:52:40.858 --> 00:52:44.094 Each Twitch stream is a public-facing collective 00:52:44.094 --> 00:52:46.764 gathering for members of that community, 00:52:46.764 --> 00:52:49.933 and it contains the potential for meeting core emotional needs 00:52:49.933 --> 00:52:52.202 like belongingness, acceptance, 00:52:52.202 --> 00:52:55.239 and feeling supported by fear [laughter] by peers. 00:52:55.239 --> 00:52:57.307 If these needs go unmet, 00:52:57.307 --> 00:53:00.544 we experience feelings like loneliness and isolation, 00:53:00.544 --> 00:53:03.247 lacking a sense of belonging or identity.
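To make the engagement figures above concrete, here is a small arithmetic sketch in Python. The raw streamer and viewing numbers are the ones cited in the talk; the percentage calculations are added here purely for illustration:

    # Growth in Twitch engagement, April 2019 -> April 2020,
    # using the figures cited in the presentation.
    streamers_2019 = 4.2e6   # active streamers, April 2019
    streamers_2020 = 7.2e6   # active streamers, April 2020
    hours_2019 = 889e6       # hours watched, April 2019
    hours_2020 = 1.8e9       # hours watched, April 2020

    def pct_growth(before, after):
        """Percentage increase from `before` to `after`."""
        return (after - before) / before * 100

    print(f"Streamers:     +{pct_growth(streamers_2019, streamers_2020):.0f}%")  # ~ +71%
    print(f"Hours watched: +{pct_growth(hours_2019, hours_2020):.0f}%")          # ~ +102%

The same arithmetic applies to the help-seeking figures mentioned a moment earlier: a 634% rise means roughly 7.3 people seeking help for anxiety for every one who did before, and an 873% rise means roughly 9.7 for depression.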
00:53:03.247 --> 00:53:06.150 And these experiences produce feelings like anger 00:53:06.150 --> 00:53:09.653 and shame and depression - feelings that predispose us 00:53:09.653 --> 00:53:12.990 to become psychologically vulnerable. 00:53:12.990 --> 00:53:15.259 Next slide. 00:53:15.259 --> 00:53:18.529 For example, shame is a deeply painful emotion, 00:53:18.529 --> 00:53:21.198 where we believe ourselves to be flawed 00:53:21.198 --> 00:53:24.401 and unworthy of love or connection. 00:53:24.401 --> 00:53:28.372 Guilt is "I did something bad." Shame is "I am bad." 00:53:28.372 --> 00:53:31.508 And as a result, we become more likely to seek out 00:53:31.508 --> 00:53:35.312 any social connection. We are driven to fit in, 00:53:35.312 --> 00:53:38.348 no matter how much we ourselves have to change 00:53:38.348 --> 00:53:39.750 in order to become accepted. 00:53:39.750 --> 00:53:40.551 And that, of course, 00:53:40.551 --> 00:53:42.820 is a very vulnerable place to find ourselves, 00:53:42.820 --> 00:53:46.356 especially for a teen or young adult. 00:53:46.356 --> 00:53:47.958 We become more isolated, 00:53:47.958 --> 00:53:50.861 or at least perceive ourselves to be more isolated. 00:53:50.861 --> 00:53:52.429 And we turn inward. 00:53:52.429 --> 00:53:55.099 We become more protective, we become more defensive, 00:53:55.099 --> 00:53:59.036 and we become more vigilant against any perceived threat. 00:53:59.036 --> 00:54:01.572 And then, last, because of the pain of shame 00:54:01.572 --> 00:54:03.707 and feeling more defensive, 00:54:03.707 --> 00:54:06.009 we are more likely to externalize those feelings, 00:54:06.009 --> 00:54:09.713 to lash out as a means of offloading the pain 00:54:09.713 --> 00:54:13.050 of being disconnected from others. 00:54:13.050 --> 00:54:15.619 Next one. 00:54:15.619 --> 00:54:18.489 Online communities, like streaming communities on Twitch 00:54:18.489 --> 00:54:20.724 or surrounding ones on Discord, 00:54:20.724 --> 00:54:23.527 can help people feel seen, heard, valued, 00:54:23.527 --> 00:54:25.462 and socially connected. 00:54:25.462 --> 00:54:28.699 During the pandemic, health services were overrun, 00:54:28.699 --> 00:54:31.368 and many streamers found themselves in the position 00:54:31.368 --> 00:54:35.205 of holding space for people experiencing grief, loss, 00:54:35.205 --> 00:54:37.374 and overwhelming emotions. 00:54:37.374 --> 00:54:41.011 But those people weren't able to obtain mental health services. 00:54:41.011 --> 00:54:43.814 Streamers and the streaming community rose to that challenge 00:54:43.814 --> 00:54:48.585 and have proven time and again that if done well and with care 00:54:48.585 --> 00:54:50.020 and with intention, 00:54:50.020 --> 00:54:51.755 online communities can provide 00:54:51.755 --> 00:54:54.124 psychologically protective spaces 00:54:54.124 --> 00:54:56.693 where vulnerable populations and individuals 00:54:56.693 --> 00:54:59.496 can get at least some of their core 00:54:59.496 --> 00:55:01.798 social and emotional needs met, 00:55:01.798 --> 00:55:04.101 therefore preventing the shame spiral, 00:55:04.101 --> 00:55:05.536 the anger, the isolation, 00:55:05.536 --> 00:55:09.306 and the despair of feeling separated 00:55:09.306 --> 00:55:12.876 and socially disconnected from others. 00:55:12.876 --> 00:55:14.545 And that's it. So, thank you all so much. 00:55:14.545 --> 00:55:16.180 And again, thank you for bearing with me 00:55:16.180 --> 00:55:18.215 through the technical difficulties.
00:55:18.215 --> 00:55:19.716 And it's back to you. 00:55:19.716 --> 00:55:22.486 >>All right, thank you. Thank you, Kelli. 00:55:22.486 --> 00:55:25.756 And also, I will mention, as we were talking about, 00:55:25.756 --> 00:55:28.325 see, we're all in the same Teams together. 00:55:28.325 --> 00:55:32.329 And we'll send over 20 karma points to Greg, 00:55:32.329 --> 00:55:35.265 who was able to quickly share the slides 00:55:35.265 --> 00:55:36.767 that you had sent over. 00:55:36.767 --> 00:55:38.001 >>Thank you. 00:55:38.001 --> 00:55:39.870 >>So, again, thank you for that, 00:55:39.870 --> 00:55:42.439 and thank you for your incredibly valuable work 00:55:42.439 --> 00:55:45.976 that you're doing to increase some of the value 00:55:45.976 --> 00:55:47.177 that we have on social media 00:55:47.177 --> 00:55:49.279 and make it a better environment. 00:55:49.279 --> 00:55:51.481 We'd also, later on, 00:55:51.481 --> 00:55:53.684 after we finish up with the mini presentations, 00:55:53.684 --> 00:55:55.986 love to come back to you around some of the work 00:55:55.986 --> 00:55:57.955 that you're doing with resilience. 00:55:57.955 --> 00:55:59.489 Resilience, certainly a term 00:55:59.489 --> 00:56:02.159 that we're hearing a lot of lately. 00:56:02.159 --> 00:56:08.165 And then also, some of your work around shame 00:56:08.165 --> 00:56:12.035 as an area that creates vulnerability 00:56:12.035 --> 00:56:16.640 and may be connected to other issues, 00:56:16.640 --> 00:56:19.543 including extremism. 00:56:19.543 --> 00:56:21.945 So, with that, we're going to send it on 00:56:21.945 --> 00:56:26.016 to our fourth mini presentation of our two hours together. 00:56:26.016 --> 00:56:28.252 I will also mention two things. 00:56:28.252 --> 00:56:29.953 One, start thinking of questions 00:56:29.953 --> 00:56:31.822 that you might have for the panelists 00:56:31.822 --> 00:56:34.992 that we'll get to near the end of our time together. 00:56:34.992 --> 00:56:38.228 And then, two, as Greg also mentioned 00:56:38.228 --> 00:56:41.231 in the first presentation, 00:56:41.231 --> 00:56:44.067 we are talking about extremely sensitive topics, 00:56:44.067 --> 00:56:48.038 and we're also very aware of the strenuous circumstances 00:56:48.038 --> 00:56:52.042 that we're all living through right now during the pandemic. 00:56:52.042 --> 00:56:53.410 So, please take care of yourself. 00:56:53.410 --> 00:56:58.982 And if there are any issues that are triggering to you, 00:56:58.982 --> 00:57:01.618 please step away from your computer, 00:57:01.618 --> 00:57:03.920 join us at a later time, 00:57:03.920 --> 00:57:09.293 and just always be mindful of your personal state of health. 00:57:09.293 --> 00:57:12.729 So, with that, we're going to bring on our next panelist. 00:57:12.729 --> 00:57:15.098 Dr. Kurt Braddock. 00:57:15.098 --> 00:57:18.702 Kurt is an assistant professor of Public Communication 00:57:18.702 --> 00:57:22.172 in the School of Communication at American University, 00:57:22.172 --> 00:57:24.941 and a faculty fellow at the Center for Media 00:57:24.941 --> 00:57:27.711 and Social Impact, and at the Polarization 00:57:27.711 --> 00:57:31.048 and Extremism Research and Innovation Lab, 00:57:31.048 --> 00:57:33.483 also known by the acronym of PERIL. 00:57:33.483 --> 00:57:35.852 So, Kurt, love to bring you on right now 00:57:35.852 --> 00:57:38.922 to hear a little bit more about your work.
00:57:38.922 --> 00:57:40.257 >>Sure. Thanks, David. 00:57:40.257 --> 00:57:46.763 And I am going to hopefully avoid what we had last time. 00:57:46.763 --> 00:57:49.633 How we looking? Do we see slides in a second? 00:57:49.633 --> 00:57:51.535 >>Yes. Mm-hmm. 00:57:51.535 --> 00:57:54.037 >>All right. There we go. 00:57:54.037 --> 00:57:56.473 Success. Biggest hurdle overcome. 00:57:56.473 --> 00:57:57.808 Okay. 00:57:57.808 --> 00:58:00.410 So, I thought, as the other panelists have done, 00:58:00.410 --> 00:58:02.813 I would talk a little bit about myself, 00:58:02.813 --> 00:58:04.114 where I'm coming from, 00:58:04.114 --> 00:58:09.252 and kind of my role in this space. 00:58:09.252 --> 00:58:10.987 Unlike some of the other panelists, 00:58:10.987 --> 00:58:13.523 who are more embedded in the world of education, 00:58:13.523 --> 00:58:16.893 I'm more embedded in the world of disinformation, 00:58:16.893 --> 00:58:19.629 violent extremism, and these days, 00:58:19.629 --> 00:58:22.599 how the former really perpetuates the latter. 00:58:22.599 --> 00:58:24.801 So, a lot of what I research 00:58:24.801 --> 00:58:26.703 and what I'll be talking about today 00:58:26.703 --> 00:58:28.905 relates to a particular strategy 00:58:28.905 --> 00:58:32.542 that I've been testing in online environments 00:58:32.542 --> 00:58:36.012 for preventing the assimilation of beliefs and attitudes 00:58:36.012 --> 00:58:39.316 that bring about the sorts of problematic outcomes 00:58:39.316 --> 00:58:41.685 that we've heard talked about already. 00:58:41.685 --> 00:58:43.520 I focus on violent extremism. 00:58:43.520 --> 00:58:46.390 And my focus these days is on the far right, 00:58:46.390 --> 00:58:50.460 though I've tested this strategy in other contexts as well, 00:58:50.460 --> 00:58:54.464 and it seems to be pretty robust across contexts - 00:58:54.464 --> 00:58:56.867 this strategy called attitudinal inoculation. 00:58:56.867 --> 00:58:59.202 And when people talk about inoculation, 00:58:59.202 --> 00:59:01.705 the word gets thrown around a bit. 00:59:01.705 --> 00:59:03.740 But as a communication researcher, 00:59:03.740 --> 00:59:04.975 when I talk about it, 00:59:04.975 --> 00:59:07.511 it has a very, very specific meaning. 00:59:07.511 --> 00:59:09.179 But I'll get into that. 00:59:09.179 --> 00:59:11.948 So, what I'm going to do is talk about inoculation theory 00:59:11.948 --> 00:59:14.684 kind of at large, describe what it is. 00:59:14.684 --> 00:59:16.586 I'll describe a study that I did, 00:59:16.586 --> 00:59:19.990 I guess, a year-and-a-half ago now testing the theory. 00:59:19.990 --> 00:59:23.427 I'll talk a bit about work that I'm doing with DHS 00:59:23.427 --> 00:59:25.495 now in the CP3 program. 00:59:25.495 --> 00:59:28.632 And I'll close with a new area I'm going to be going into, 00:59:28.632 --> 00:59:32.602 looking into whether inoculation works online 00:59:32.602 --> 00:59:36.740 for a particular kind of source of extremist messaging. 00:59:36.740 --> 00:59:39.676 But let me start with attitudinal inoculation 00:59:39.676 --> 00:59:41.545 and what it is, 00:59:41.545 --> 00:59:45.615 and how it's been taught for basically decades, 00:59:45.615 --> 00:59:48.151 and kind of where I'm coming from with it. 
00:59:48.151 --> 00:59:49.686 This is all based - all this work 00:59:49.686 --> 00:59:52.789 is based on the idea that we seem to - and by we, 00:59:52.789 --> 00:59:56.960 I mean those of us in the kind of terrorism studies community - 00:59:56.960 --> 01:00:00.430 have gotten better at designing interventions. 01:00:00.430 --> 01:00:02.265 But as good as our ideas are, 01:00:02.265 --> 01:00:06.636 we're not terribly good at gathering data to demonstrate 01:00:06.636 --> 01:00:08.605 whether or not they're actually effective; 01:00:08.605 --> 01:00:10.106 often there is simply no evidence to show 01:00:10.106 --> 01:00:11.775 whether they work or not. 01:00:11.775 --> 01:00:14.478 Most of our arguments as to whether certain interventions 01:00:14.478 --> 01:00:16.913 are effective are based on stories and anecdotes 01:00:16.913 --> 01:00:18.782 and things of that nature. 01:00:18.782 --> 01:00:22.185 So, a couple of years ago, 01:00:22.185 --> 01:00:24.321 being kind of the data nerd that I am, 01:00:24.321 --> 01:00:27.457 I thought I would start doing some quantitative research 01:00:27.457 --> 01:00:29.559 to see whether or not some of these interventions 01:00:29.559 --> 01:00:32.128 that have been tested in communication studies 01:00:32.128 --> 01:00:34.664 can be brought to bear in the development of interventions 01:00:34.664 --> 01:00:38.034 in the violent extremism space, particularly online. 01:00:38.034 --> 01:00:40.437 So, the research driving most of my questions 01:00:40.437 --> 01:00:43.306 relates to how we can use communication science 01:00:43.306 --> 01:00:45.375 and the theories that underpin it 01:00:45.375 --> 01:00:48.745 to provide evidence for what works and what doesn't 01:00:48.745 --> 01:00:52.282 in preventing people from adopting 01:00:52.282 --> 01:00:54.818 kind of extremist mindsets, beliefs, attitudes, 01:00:54.818 --> 01:00:56.086 and intentions 01:00:56.086 --> 01:00:58.421 that are consistent with ideologies 01:00:58.421 --> 01:01:01.224 that promote the use of violence for the sake of terrorism. 01:01:01.224 --> 01:01:04.060 And the first stop here, because of the theory itself, 01:01:04.060 --> 01:01:06.363 is something called inoculation theory. 01:01:06.363 --> 01:01:08.164 And to understand inoculation theory, 01:01:08.164 --> 01:01:10.200 it's important to understand inoculation 01:01:10.200 --> 01:01:12.569 and how it works more generally. 01:01:12.569 --> 01:01:15.639 And I would very much appreciate it if you all are 01:01:15.639 --> 01:01:18.875 very impressed by this slide, because as little as it is, 01:01:18.875 --> 01:01:21.211 it took probably about an hour to make a couple years ago. 01:01:21.211 --> 01:01:24.481 My students are still very impressed by it, so. 01:01:24.481 --> 01:01:27.217 So, the way inoculation works more generally is 01:01:27.217 --> 01:01:31.021 that somebody is exposed to a small bit of a contagion, 01:01:31.021 --> 01:01:32.389 typically a virus. 01:01:32.389 --> 01:01:35.392 And as a function of being exposed to that contagion, 01:01:35.392 --> 01:01:38.161 it replicates in their body to a point 01:01:38.161 --> 01:01:41.331 that it triggers an immune response. 01:01:41.331 --> 01:01:44.768 And that immune response is enough to develop antibodies, 01:01:44.768 --> 01:01:47.370 but not so strong as to make the person sick.
01:01:47.370 --> 01:01:50.106 So, when the person kind of encounters the contagion again 01:01:50.106 --> 01:01:51.308 in the wild, 01:01:51.308 --> 01:01:53.810 they have those antibodies to protect against it 01:01:53.810 --> 01:01:55.946 making them sick in the real world. 01:01:55.946 --> 01:01:57.781 Inoculation theory, which was developed, 01:01:57.781 --> 01:02:02.018 I guess, almost 60 years ago now, 60 years, 01:02:02.018 --> 01:02:06.122 argues that ideas can operate in the same manner. 01:02:06.122 --> 01:02:07.591 That being, 01:02:07.591 --> 01:02:11.027 that we can become resistant to ideas in the same way 01:02:11.027 --> 01:02:13.563 that we're made resistant to viruses 01:02:13.563 --> 01:02:15.098 that we're vaccinated against. 01:02:15.098 --> 01:02:18.068 Now, it's a bit more complicated than what I'm explaining here. 01:02:18.068 --> 01:02:21.438 But generally speaking, and I say this very generally, 01:02:21.438 --> 01:02:24.541 inoculation messages tend to have two components. 01:02:24.541 --> 01:02:28.178 Number one, in your audience, you raise the specter of threat. 01:02:28.178 --> 01:02:30.714 I don't mean threat they're going to be physically hurt, 01:02:30.714 --> 01:02:34.317 but threat, the beliefs and attitudes they currently hold 01:02:34.317 --> 01:02:36.486 are at risk of changing. 01:02:36.486 --> 01:02:39.322 Generally speaking, pretty much everywhere, 01:02:39.322 --> 01:02:42.659 but even stronger so in the United States, 01:02:42.659 --> 01:02:44.527 we like making our own decisions. 01:02:44.527 --> 01:02:46.463 And when we think somebody's influencing those decisions 01:02:46.463 --> 01:02:49.065 or that our decisions might be made for us, 01:02:49.065 --> 01:02:50.800 we push back against it. 01:02:50.800 --> 01:02:52.502 So the first step is to make people feel 01:02:52.502 --> 01:02:55.739 as though the beliefs they have will be under threat. 01:02:55.739 --> 01:02:58.308 The second step is to provide counterarguments 01:02:58.308 --> 01:03:01.444 against the arguments they will encounter in the real world. 01:03:01.444 --> 01:03:03.013 So, in the same way 01:03:03.013 --> 01:03:06.182 that you give somebody a little dose of a virus 01:03:06.182 --> 01:03:09.519 to trigger immune response in the body with vaccination, 01:03:09.519 --> 01:03:11.287 you do the same thing with inoculation. 01:03:11.287 --> 01:03:13.523 You expose the person to a bit of the argument 01:03:13.523 --> 01:03:16.326 they'll run into to raise the specter of threat, 01:03:16.326 --> 01:03:17.727 and then give them the tools they need 01:03:17.727 --> 01:03:19.329 to defend against that threat 01:03:19.329 --> 01:03:22.132 when they encounter it in the real world. 01:03:22.132 --> 01:03:23.099 Now, one of the mechanisms 01:03:23.099 --> 01:03:25.902 by which this works is called reactance. 01:03:25.902 --> 01:03:28.538 And it's a bit difficult to explain reactance. 01:03:28.538 --> 01:03:30.306 But the best way to describe it 01:03:30.306 --> 01:03:33.910 is if you've ever gone into a shop or went shopping for a car, 01:03:33.910 --> 01:03:35.745 and a salesman comes up to you 01:03:35.745 --> 01:03:37.280 and tries to convince you to buy something, 01:03:37.280 --> 01:03:39.683 that negative pushback that you have, 01:03:39.683 --> 01:03:42.852 that you get a little annoyed and angry - that's reactance. 01:03:42.852 --> 01:03:45.622 It's a negative motivational tendency 01:03:45.622 --> 01:03:47.190 to want to make our own decisions. 
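To make the two-component message structure just described concrete, here is a minimal sketch of how such a message might be represented in code. This is purely illustrative: the class, field names, and example wording are assumptions for exposition, not materials from Kurt's actual study.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class InoculationMessage:
        """The two components of an inoculation message, per inoculation theory:
        (1) raise the specter of threat to the beliefs the audience holds, and
        (2) preemptively refute the arguments they will encounter later."""
        threat_warning: str           # component 1: forewarning that beliefs are under threat
        weakened_argument: str        # the small "dose" of the persuasive claim itself
        counterarguments: List[str]   # component 2: refutational preemption

    # Hypothetical example (illustrative wording only):
    message = InoculationMessage(
        threat_warning="Groups online will try to change what you believe about ...",
        weakened_argument="They may argue that violence is the only way to ...",
        counterarguments=[
            "Here is why that argument does not hold up ...",
            "Here is the evidence that contradicts it ...",
        ],
    )

The forewarning component is what primes the defensive pushback, the reactance mechanism being described here.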
01:03:47.190 --> 01:03:49.025 And it's typically been found to be the combination 01:03:49.025 --> 01:03:51.327 of being angry and counter-arguing 01:03:51.327 --> 01:03:54.764 against what the person's going to say. 01:03:54.764 --> 01:03:57.500 The most exciting thing about inoculation for me, 01:03:57.500 --> 01:03:58.702 even prior to doing the study 01:03:58.702 --> 01:04:00.303 I'm going to explain in a second, 01:04:00.303 --> 01:04:02.972 was that it's been shown to work across contexts. 01:04:02.972 --> 01:04:05.108 Inoculation messages have protected people 01:04:05.108 --> 01:04:09.312 in the domains of health communication, politics, 01:04:09.312 --> 01:04:13.083 interpersonal relationships like friendships, marketing. 01:04:13.083 --> 01:04:14.517 All these different contexts have shown 01:04:14.517 --> 01:04:16.352 that if you use these inoculation messages, 01:04:16.352 --> 01:04:19.155 you can prevent people from being persuaded. 01:04:19.155 --> 01:04:22.125 So, I conducted a study to see whether or not we can do that 01:04:22.125 --> 01:04:24.761 when we expose people to an inoculation message, 01:04:24.761 --> 01:04:26.563 and then whether it helps them defend 01:04:26.563 --> 01:04:28.832 against extremist propaganda. 01:04:28.832 --> 01:04:31.634 So, I won't go into the details of the study itself. 01:04:31.634 --> 01:04:34.237 I'll give you a general outline of what it looks like. 01:04:34.237 --> 01:04:35.505 But I know my audience, 01:04:35.505 --> 01:04:38.341 and I know that it's not a particular... 01:04:38.341 --> 01:04:40.477 people get bored by academic stuff. 01:04:40.477 --> 01:04:43.113 So, the basic thing to note here 01:04:43.113 --> 01:04:46.082 is that I tested four different groups. 01:04:46.082 --> 01:04:49.719 I inoculated all four groups, 01:04:49.719 --> 01:04:52.088 and I had one additional control group that wasn't inoculated. 01:04:52.088 --> 01:04:54.457 And I checked to see whether or not inoculation worked, 01:04:54.457 --> 01:04:57.160 depending on whether it was delivered by myself 01:04:57.160 --> 01:04:58.828 or it was delivered by somebody... 01:04:58.828 --> 01:05:03.500 a message that wasn't me, posing as a former extremist. 01:05:03.500 --> 01:05:06.035 And the other factor was left versus right wing propaganda, 01:05:06.035 --> 01:05:09.172 just to test these source outcomes. 01:05:09.172 --> 01:05:10.840 But the main findings 01:05:10.840 --> 01:05:14.144 were linked to the effectiveness of inoculation itself. 01:05:14.144 --> 01:05:17.680 So, the main predictions in the study 01:05:17.680 --> 01:05:19.949 were that, number one, 01:05:19.949 --> 01:05:23.253 if you inoculated people with those messages I just described, 01:05:23.253 --> 01:05:25.955 they would have more of this pushback, this reactance, 01:05:25.955 --> 01:05:27.924 which is the combination of anger and counter-arguing, 01:05:27.924 --> 01:05:29.125 as I said, 01:05:29.125 --> 01:05:32.028 when they were exposed to disinformation propaganda. 01:05:32.028 --> 01:05:34.264 Second, that if you inoculate people, 01:05:34.264 --> 01:05:36.332 they would think the source of the propaganda, 01:05:36.332 --> 01:05:39.836 so the people who authored the extremist propaganda, 01:05:39.836 --> 01:05:42.839 would be less credible than if they weren't inoculated. 01:05:42.839 --> 01:05:44.841 And finally, they would report less intention 01:05:44.841 --> 01:05:48.077 to support the group that produced the propaganda.
01:05:48.077 --> 01:05:49.913 And I believe, if I'm remembering right, 01:05:49.913 --> 01:05:54.083 it would be logistically, financially, ideologically, 01:05:54.083 --> 01:05:55.251 and armed support. 01:05:55.251 --> 01:06:00.123 And I can get into that in the Q&A if anybody wants. 01:06:00.123 --> 01:06:01.925 So, the two groups that I used, 01:06:01.925 --> 01:06:03.860 because I wanted to check kind of polar opposites, 01:06:03.860 --> 01:06:05.094 were left and right wing groups. 01:06:05.094 --> 01:06:06.763 And these are the groups that I used. 01:06:06.763 --> 01:06:09.399 I used the old Weather Underground talking points 01:06:09.399 --> 01:06:11.267 from their actual manifesto. 01:06:11.267 --> 01:06:13.436 And I actually used excerpts 01:06:13.436 --> 01:06:14.904 from the webpage of the National Alliance, 01:06:14.904 --> 01:06:17.440 which is a right wing neo-Nazi group. 01:06:17.440 --> 01:06:19.409 So, the groups that I inoculated 01:06:19.409 --> 01:06:20.844 were warned about this propaganda 01:06:20.844 --> 01:06:22.779 before they were exposed to it. 01:06:22.779 --> 01:06:24.714 Some were exposed to the inoculation methods; 01:06:24.714 --> 01:06:25.982 some were not. 01:06:25.982 --> 01:06:28.918 And then we saw whether or not inoculation prevented people 01:06:28.918 --> 01:06:32.355 from being persuaded by the propaganda when they saw it. 01:06:32.355 --> 01:06:36.659 And I'm going to spare you the technical jargon here. 01:06:36.659 --> 01:06:39.362 But what you're looking at is called a structural model. 01:06:39.362 --> 01:06:41.264 And when you see a positive number, 01:06:41.264 --> 01:06:44.868 that means that there was a positive relationship - 01:06:44.868 --> 01:06:46.903 as one went up, the other went up. 01:06:46.903 --> 01:06:49.372 And a negative number means as one went up, 01:06:49.372 --> 01:06:50.940 the other one went down. 01:06:50.940 --> 01:06:53.309 So, if you look at the arrow between inoculation 01:06:53.309 --> 01:06:55.712 and reactance, for example, that's positive. 01:06:55.712 --> 01:06:58.081 That means that people who I inoculated, 01:06:58.081 --> 01:07:00.917 they had that pushback against the extremist propaganda 01:07:00.917 --> 01:07:03.253 more than people who weren't inoculated. 01:07:03.253 --> 01:07:06.256 You see the negative number on the arrow between inoculation 01:07:06.256 --> 01:07:08.391 and credibility of the extremist group? 01:07:08.391 --> 01:07:10.293 That means those I inoculated perceived the group 01:07:10.293 --> 01:07:12.161 to be less credible. 01:07:12.161 --> 01:07:13.329 Now, the important things to note 01:07:13.329 --> 01:07:15.198 are the negative number on the arrow 01:07:15.198 --> 01:07:17.066 from reactance to intention, 01:07:17.066 --> 01:07:18.735 and the positive arrow 01:07:18.735 --> 01:07:23.339 on the number from credibility to intention. 01:07:23.339 --> 01:07:25.508 So, as reactance went up - 01:07:25.508 --> 01:07:26.876 now remember, as we inoculated people, 01:07:26.876 --> 01:07:28.311 reactance went up - 01:07:28.311 --> 01:07:31.047 their intention to support the group went down. 01:07:31.047 --> 01:07:33.149 So, the more we instilled reactance in people, 01:07:33.149 --> 01:07:35.718 the less they intended to support the group. 01:07:35.718 --> 01:07:37.253 And on the other hand, 01:07:37.253 --> 01:07:38.488 the less credibility 01:07:38.488 --> 01:07:41.391 or the less credible we made the group seem, 01:07:41.391 --> 01:07:43.760 the less they were likely to support the group.
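The sign pattern just described can be summarized schematically. This is a simplified sketch of the reported relationships, not the actual fitted model or coefficient values from the study:

    \[
    \begin{aligned}
    \text{Reactance} &= \beta_{1}\,\text{Inoculation} + \varepsilon_{1}, & \beta_{1} &> 0 \\
    \text{Credibility} &= \beta_{2}\,\text{Inoculation} + \varepsilon_{2}, & \beta_{2} &< 0 \\
    \text{Intention} &= \beta_{3}\,\text{Reactance} + \beta_{4}\,\text{Credibility} + \varepsilon_{3}, & \beta_{3} &< 0,\ \beta_{4} > 0
    \end{aligned}
    \]

Tracing either path, inoculation lowers intention to support the group: it raises reactance, which lowers intention, and it lowers perceived credibility, which also lowers intention.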
01:07:43.760 --> 01:07:44.994 The more credible somebody seems, 01:07:44.994 --> 01:07:46.763 the more likely they're going to support the group. 01:07:46.763 --> 01:07:48.197 So, the less we were able - 01:07:48.197 --> 01:07:49.565 or the more we were able to make them seem 01:07:49.565 --> 01:07:51.034 as though they weren't credible, 01:07:51.034 --> 01:07:53.836 the less likely they were to support the group overall. 01:07:53.836 --> 01:07:56.572 So, we found that inoculation generally did really well 01:07:56.572 --> 01:07:58.574 on both sides 01:07:58.574 --> 01:08:04.047 in terms of getting people to not support the group, 01:08:04.047 --> 01:08:06.182 either with armed support, logistically, 01:08:06.182 --> 01:08:08.584 or all the other things that I have mentioned. 01:08:10.119 --> 01:08:13.089 Now, with that said, what are the major takeaways 01:08:13.089 --> 01:08:16.893 from these findings that we can use in online spaces? 01:08:16.893 --> 01:08:19.362 Number one, the main thing is clearly that inoculation 01:08:19.362 --> 01:08:21.297 seems to have a negative effect on persuasion 01:08:21.297 --> 01:08:22.999 in online contexts. 01:08:22.999 --> 01:08:25.201 So, inoculating against these messages, 01:08:25.201 --> 01:08:27.270 the kinds of messages that many young people 01:08:27.270 --> 01:08:31.708 are at risk of being exposed to, disinformation and otherwise, 01:08:31.708 --> 01:08:34.844 inoculating against them does seem to have some benefit 01:08:34.844 --> 01:08:37.046 in preventing it from being persuasive. 01:08:37.046 --> 01:08:39.515 There seems to be two mechanisms by which it works, 01:08:39.515 --> 01:08:41.484 through reactance and credibility. 01:08:41.484 --> 01:08:44.887 And finally, we can reduce support 01:08:44.887 --> 01:08:48.558 for the source of disinformation and extremist messaging 01:08:48.558 --> 01:08:50.593 when we inoculate people against it. 01:08:50.593 --> 01:08:52.795 So, that was the initial study that I did, 01:08:52.795 --> 01:08:55.832 which is now, I think, two years ago. 01:08:55.832 --> 01:08:59.235 And since then, the work has expanded quite a bit, 01:08:59.235 --> 01:09:03.940 including to work with the new CP3 program. 01:09:03.940 --> 01:09:08.244 When I received funding from DHS, it was TVTP. 01:09:08.244 --> 01:09:12.281 But I received funding to test this theory and this practice 01:09:12.281 --> 01:09:13.516 on a grander scale 01:09:13.516 --> 01:09:16.219 against right wing extremism in the United States. 01:09:16.219 --> 01:09:18.354 So, in a two-year project, 01:09:18.354 --> 01:09:20.823 I am bringing together some of the best minds in the country 01:09:20.823 --> 01:09:24.360 and the world on disinformation and right wing extremism 01:09:24.360 --> 01:09:26.496 to better understand the ways 01:09:26.496 --> 01:09:29.632 that disinformation, as it's perpetuated, 01:09:29.632 --> 01:09:31.701 can promote right wing extremism. 01:09:31.701 --> 01:09:34.904 And just as we want to understand it, 01:09:34.904 --> 01:09:36.372 we also want to prevent it.
01:09:36.372 --> 01:09:39.809 So, taking what we learned from the first phase 01:09:39.809 --> 01:09:42.111 and understanding how disinformation 01:09:42.111 --> 01:09:44.247 affects right wing extremism, 01:09:44.247 --> 01:09:46.249 we are also going to be testing inoculation 01:09:46.249 --> 01:09:51.254 at a much larger scale, in communities and online, 01:09:51.254 --> 01:09:53.923 where right wing extremism can take hold 01:09:53.923 --> 01:09:57.627 and where disinformation is perpetuated more readily. 01:09:57.627 --> 01:09:59.495 So, that'll be over the next two years or so. 01:09:59.495 --> 01:10:02.298 This work is underway right now. 01:10:02.298 --> 01:10:06.369 But even beyond this, there is another bit of work 01:10:06.369 --> 01:10:08.638 that I'm starting now related to a topic 01:10:08.638 --> 01:10:11.674 on which there is very little empirical work. 01:10:11.674 --> 01:10:14.844 And this is what my next book is going to be on. 01:10:14.844 --> 01:10:16.779 And it's a topic that's a bit controversial, 01:10:16.779 --> 01:10:20.183 so I'm sure I'll get pushback on this in the Q&A a little bit. 01:10:20.183 --> 01:10:23.219 But I think it's a topic that needs to be addressed. 01:10:23.219 --> 01:10:27.090 And I think inoculation may have some benefit with respect to it. 01:10:27.090 --> 01:10:29.992 Now, before I get to it, I want to play a little game here. 01:10:29.992 --> 01:10:33.696 And, David, if you could keep an eye on the chat, 01:10:33.696 --> 01:10:36.065 because I'm curious to what the answer's going to be, 01:10:36.065 --> 01:10:39.936 I'm going to put up a couple of quotes here on the next slide. 01:10:39.936 --> 01:10:44.607 Some of them come from actual convicted terrorists, 01:10:44.607 --> 01:10:47.977 and some of them come from people who operate 01:10:47.977 --> 01:10:50.913 in the mainstream and have large online platforms. 01:10:50.913 --> 01:10:53.783 So, what I'd like you to do, after I do all four, 01:10:53.783 --> 01:10:55.051 I'd like you to write in the chat 01:10:55.051 --> 01:10:56.919 which numbers you think come from terrorists 01:10:56.919 --> 01:10:59.689 and which ones you think come from people with platforms 01:10:59.689 --> 01:11:02.925 that have not been convicted of terrorism. 01:11:02.925 --> 01:11:04.393 Number one: "We'll also have no other choice 01:11:04.393 --> 01:11:07.763 but to take matters into our own hands 01:11:07.763 --> 01:11:09.465 and defend our rights on our own 01:11:09.465 --> 01:11:12.335 if you don't act within your powers to defend us." 01:11:12.335 --> 01:11:15.772 This was written as a letter to President Trump 01:11:15.772 --> 01:11:18.141 threatening the use of violence 01:11:18.141 --> 01:11:21.711 if certain rights were not defended by Trump. 01:11:21.711 --> 01:11:23.880 Second. "Rules of war. 01:11:23.880 --> 01:11:26.449 Stop all abortions, no same-sex marriage, 01:11:26.449 --> 01:11:29.519 no idolatry or occultism, no communism. 01:11:29.519 --> 01:11:31.454 And they must obey all biblical law. 01:11:31.454 --> 01:11:34.657 If they yield, they must pay their share of worker taxes. 01:11:34.657 --> 01:11:38.394 If they don't yield, all males must be killed." 01:11:38.394 --> 01:11:40.863 Third. "I'd rather say that I'm driven by my love 01:11:40.863 --> 01:11:44.033 for Europe, European culture, and all Europeans. 01:11:44.033 --> 01:11:45.668 This doesn't mean that I oppose diversity.
01:11:45.668 --> 01:11:47.403 But appreciating diversity doesn't mean 01:11:47.403 --> 01:11:50.273 that you support genocide of your own people." 01:11:50.273 --> 01:11:54.911 And fourth, this was a meme that was posted by somebody: 01:11:54.911 --> 01:11:57.380 "Folks keep talking about another civil war. 01:11:57.380 --> 01:11:59.315 One side has three trillion bullets, 01:11:59.315 --> 01:12:01.918 and the other doesn't know which bathroom to use." 01:12:01.918 --> 01:12:03.319 So, this is not a trick question. 01:12:03.319 --> 01:12:06.656 Some of these are attributed to actual convicted terrorists. 01:12:06.656 --> 01:12:10.359 Some of them are attributed to actual mainstream people. 01:12:10.359 --> 01:12:13.963 So, if you have a guess as to which numbers 01:12:13.963 --> 01:12:16.599 were from convicted terrorists 01:12:16.599 --> 01:12:18.601 and which ones were from mainstream individuals, 01:12:18.601 --> 01:12:20.570 please feel free to do so in the chat. 01:12:20.570 --> 01:12:22.004 I'll give it maybe 15 seconds. 01:12:22.004 --> 01:12:24.106 And then, David, if you could, because I can't see the chat, 01:12:24.106 --> 01:12:25.741 let me know what the consensus is. 01:12:39.889 --> 01:12:41.357 >>All right. It looks like, Kurt, 01:12:41.357 --> 01:12:44.994 a lot of people are saying, let's see. 01:12:44.994 --> 01:12:48.664 Mainstream three and four, convicted one and two. 01:12:48.664 --> 01:12:49.866 Another person writes - 01:12:49.866 --> 01:12:52.668 >>Mainstream three and four, convicted one and two. 01:12:52.668 --> 01:12:55.638 That's the consensus there? 01:12:55.638 --> 01:12:59.842 >>Well, it's hard to build a consensus, because [crosstalk]. 01:12:59.842 --> 01:13:01.177 It's a little bit of disagreement. 01:13:01.177 --> 01:13:03.646 I see another one, one not terrorist, 01:13:03.646 --> 01:13:07.516 two not terrorist, three terrorist, four not terrorist. 01:13:07.516 --> 01:13:09.452 So, it does seem to be hard to decipher. 01:13:09.452 --> 01:13:10.686 >>All over the place. 01:13:10.686 --> 01:13:11.954 >>Yes. 01:13:11.954 --> 01:13:15.291 >>Yeah. Well, that's part of the point. 01:13:15.291 --> 01:13:17.159 To break the mystery here, this first person 01:13:17.159 --> 01:13:21.530 was a prominent politician in Ohio politics, 01:13:21.530 --> 01:13:24.066 the head of the Tea Party, the Portage Tea Party, 01:13:24.066 --> 01:13:27.436 who said this, who took out an ad in the Washington Times, 01:13:27.436 --> 01:13:30.473 threatening that if the election was not protected, 01:13:30.473 --> 01:13:32.942 that they would take matters into their own hands. 01:13:32.942 --> 01:13:37.346 This is from a state rep in Washington, Matt Shea, 01:13:37.346 --> 01:13:40.016 who wrote a manifesto talking about the need 01:13:40.016 --> 01:13:42.518 to instill biblical law. 01:13:42.518 --> 01:13:46.322 He has been linked to right wing terrorist offenses, 01:13:46.322 --> 01:13:48.424 though I don't think he's been convicted. 01:13:48.424 --> 01:13:50.826 Third, whoever said terrorist, Anders Breivik. 01:13:50.826 --> 01:13:52.695 Good call. 01:13:52.695 --> 01:13:55.364 This was in his manifesto about protecting Europe. 01:13:55.364 --> 01:14:00.569 And fourth is Steve King, a US representative from Iowa. 01:14:00.569 --> 01:14:03.005 So, three out of four of these were prominent politicians, 01:14:03.005 --> 01:14:05.007 and the fourth was a convicted terrorist.
01:14:05.007 --> 01:14:08.077 But the fact that there was so much disagreement, 01:14:08.077 --> 01:14:09.545 and there doesn't seem to be a consensus 01:14:09.545 --> 01:14:11.213 that the first two 01:14:11.213 --> 01:14:14.884 and the final one were politicians, shows that this kind of language 01:14:14.884 --> 01:14:17.486 has effects that can be discerned in different ways. 01:14:17.486 --> 01:14:19.855 And this is the area that I'm going with my research now. 01:14:19.855 --> 01:14:23.559 And the topic is called stochastic terrorism. 01:14:23.559 --> 01:14:26.095 Stochastic terrorism, basically speaking, 01:14:26.095 --> 01:14:29.298 is the idea that an individual - 01:14:29.298 --> 01:14:31.334 and this has gotten much more prominent 01:14:31.334 --> 01:14:33.035 in the age of disinformation - 01:14:33.035 --> 01:14:36.005 individuals can spread messages online that, 01:14:36.005 --> 01:14:39.075 on the face of it, do not advocate for violence, 01:14:39.075 --> 01:14:41.744 or they don't overtly call for violence, 01:14:41.744 --> 01:14:44.313 but can be interpreted as calls for violence. 01:14:44.313 --> 01:14:46.449 And the reason it's called stochastic 01:14:46.449 --> 01:14:48.184 is because it comes out of statistics, 01:14:48.184 --> 01:14:49.452 where a stochastic process 01:14:49.452 --> 01:14:51.787 is one that is impossible to predict 01:14:51.787 --> 01:14:54.890 when and where it'll happen, but it will happen. 01:14:54.890 --> 01:14:58.260 So, think about when a storm is rolling into your town, 01:14:58.260 --> 01:14:59.562 a lightning strike. 01:14:59.562 --> 01:15:03.032 Almost impossible to predict where, but very likely to occur. 01:15:03.032 --> 01:15:04.934 The prominent example in the statistics literature 01:15:04.934 --> 01:15:06.369 is often a petri dish. 01:15:06.369 --> 01:15:09.672 If you sneeze in a petri dish, bacteria are going to grow, 01:15:09.672 --> 01:15:11.440 but you can't predict when and where. 01:15:11.440 --> 01:15:14.510 So, stochastic terrorism is the idea 01:15:14.510 --> 01:15:17.980 that when individuals have a large platform, 01:15:17.980 --> 01:15:20.116 especially an online platform, 01:15:20.116 --> 01:15:23.319 and they make statements that seem to implicitly call 01:15:23.319 --> 01:15:25.454 for violence, 01:15:25.454 --> 01:15:27.356 you can't predict when and where someone will act. 01:15:27.356 --> 01:15:30.326 But when they are talking to, say, 88 million people, 01:15:30.326 --> 01:15:33.362 there may be at least one, two, or a dozen 01:15:33.362 --> 01:15:35.364 who take it as a literal call to action. 01:15:35.364 --> 01:15:36.932 And we've seen this happen in the past 01:15:36.932 --> 01:15:39.035 with statements from some individuals 01:15:39.035 --> 01:15:41.837 who have made implicit calls for violence. 01:15:41.837 --> 01:15:44.040 So, I argue, or I think, 01:15:44.040 --> 01:15:46.075 that understanding stochastic terrorism, 01:15:46.075 --> 01:15:49.412 understanding how to prevent it can involve inoculation 01:15:49.412 --> 01:15:53.482 to some degree, and online kind of interventions 01:15:53.482 --> 01:15:56.385 if we know what the talking points might be. 01:15:56.385 --> 01:16:00.489 The challenges here relate to, number one, first and foremost, 01:16:00.489 --> 01:16:02.258 I am a staunch First Amendment advocate, 01:16:02.258 --> 01:16:05.728 and I in no way advocate restricting anybody's speech 01:16:05.728 --> 01:16:09.765 in a way that is incongruent with the First Amendment.
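The "impossible to predict who, but someone will" logic above has a simple rare-event formalization. The following is a hedged back-of-the-envelope sketch: the audience size is the figure cited in the talk, while the per-person probability is an invented illustrative number, not an empirical estimate:

    import math

    # If each listener independently acts on an implicit call to violence
    # with some tiny probability p, the chance that at least ONE of N
    # listeners acts is 1 - (1 - p)^N.
    N = 88_000_000   # audience size mentioned in the talk
    p = 1e-7         # assumed per-person probability (illustrative only)

    p_at_least_one = 1 - (1 - p) ** N
    # For tiny p, this is well approximated by the Poisson form 1 - exp(-N*p).
    poisson_approx = 1 - math.exp(-N * p)

    print(f"P(at least one listener acts) ~ {p_at_least_one:.4f}")   # ~ 0.9998
    print(f"Poisson approximation:          {poisson_approx:.4f}")   # ~ 0.9998

No individual listener is predictable, but at that scale the aggregate event approaches certainty, which is what gives the speaker plausible deniability.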
01:16:09.765 --> 01:16:13.235 That First Amendment commitment notwithstanding, there needs to be a way to understand 01:16:13.235 --> 01:16:14.036 where the line is. 01:16:14.036 --> 01:16:15.604 And to this point, 01:16:15.604 --> 01:16:17.973 we don't really have an idea where that line is. 01:16:17.973 --> 01:16:20.643 Number two, one of the big issues 01:16:20.643 --> 01:16:24.146 in trying to develop large-scale interventions, 01:16:24.146 --> 01:16:26.715 especially those that are implemented by government, 01:16:26.715 --> 01:16:27.950 is getting them to scale. 01:16:27.950 --> 01:16:30.519 It's very difficult to get something to scale that works. 01:16:30.519 --> 01:16:32.455 Stochastic terrorism, on the other hand, 01:16:32.455 --> 01:16:34.123 that can be done to scale very quickly 01:16:34.123 --> 01:16:35.391 and can be shown to be effective, 01:16:35.391 --> 01:16:38.928 because it only needs to affect one or two people. 01:16:38.928 --> 01:16:40.663 And finally, as I said, 01:16:40.663 --> 01:16:43.999 the plausible deniability of the speaker is kind of the meat 01:16:43.999 --> 01:16:46.769 and the potatoes of stochastic terrorism. 01:16:46.769 --> 01:16:48.737 So, these are all things I'm going to be addressing 01:16:48.737 --> 01:16:50.873 in my upcoming research, 01:16:50.873 --> 01:16:52.441 looking at how we might be able to intervene 01:16:52.441 --> 01:16:53.943 in these sorts of processes. 01:16:53.943 --> 01:16:56.612 I don't think that there is, 01:16:56.612 --> 01:16:58.314 and I don't know if there should be, 01:16:58.314 --> 01:16:59.782 legal ramifications for these things 01:16:59.782 --> 01:17:01.150 because of the First Amendment. 01:17:01.150 --> 01:17:03.285 But there needs to be some way to address this sort of thing. 01:17:03.285 --> 01:17:04.487 And that's what I'll be addressing 01:17:04.487 --> 01:17:06.088 in my upcoming research. 01:17:06.088 --> 01:17:11.126 This is all complicated by the death of truth 01:17:11.126 --> 01:17:14.763 and the use of disinformation to motivate one's base, 01:17:14.763 --> 01:17:17.967 something that really makes it difficult 01:17:17.967 --> 01:17:19.802 to develop effective interventions, 01:17:19.802 --> 01:17:23.305 when the idea of reality itself comes under debate. 01:17:23.305 --> 01:17:25.407 So, these are all challenges I'll be contending with. 01:17:25.407 --> 01:17:28.577 I'll be testing inoculation in online contexts. 01:17:28.577 --> 01:17:32.781 And it does apply to places like schools, 01:17:32.781 --> 01:17:35.117 because we find that young people, 01:17:35.117 --> 01:17:38.521 especially in high school, maybe even in college as well, 01:17:38.521 --> 01:17:40.422 are very politically motivated these days, 01:17:40.422 --> 01:17:41.924 and they encounter these messages. 01:17:41.924 --> 01:17:43.459 So, there can be negative effects 01:17:43.459 --> 01:17:46.562 in the online space for these individuals. 01:17:46.562 --> 01:17:47.830 So, as I move forward, 01:17:47.830 --> 01:17:50.933 I'll be looking into this particular topic, 01:17:50.933 --> 01:17:52.601 seeing whether or not we can prevent it 01:17:52.601 --> 01:17:54.870 from going into violence, 01:17:54.870 --> 01:17:57.406 and hopefully gathering data on that front. 01:17:57.406 --> 01:18:00.175 So, with that, I'll stop. If anybody has any questions - 01:18:00.175 --> 01:18:01.744 that email is actually old there. 01:18:01.744 --> 01:18:04.113 It should be braddock@american.edu. 01:18:06.582 --> 01:18:09.418 But I'll be happy to take any questions during Q&A as well.
01:18:09.418 --> 01:18:10.819 So, thanks, David. 01:18:10.819 --> 01:18:11.520 >>Thank you, Kurt.