WEBVTT 00:00:18.685 --> 00:00:20.437 >>Hello, and welcome back. 00:00:20.437 --> 00:00:24.566 Just making sure we have all of our panelists for our Panel 2. 00:00:24.566 --> 00:00:26.235 Welcome back to our second panel today 00:00:26.235 --> 00:00:28.779 for the Digital Forum on Prevention, 00:00:28.779 --> 00:00:30.572 one I am very excited about. 00:00:30.572 --> 00:00:33.200 We'll discuss digital literacy education, 00:00:33.200 --> 00:00:36.119 youth- and peer-led programs that empower digital literacy, 00:00:36.119 --> 00:00:38.247 and digital civility initiatives 00:00:38.247 --> 00:00:41.792 and ways to prevent the spread of harmful misinformation. 00:00:41.792 --> 00:00:44.211 So we have a really great panel today 00:00:44.211 --> 00:00:45.545 that I'm honored to introduce. 00:00:45.545 --> 00:00:48.257 For this panel we welcome Courtney Gregoire 00:00:48.257 --> 00:00:50.425 who is the Chief Digital Safety Officer 00:00:50.425 --> 00:00:51.927 of Microsoft Corporation, 00:00:51.927 --> 00:00:54.346 and she's responsible for Microsoft's companywide 00:00:54.346 --> 00:00:57.015 digital safety strategy to reduce harm from illegal 00:00:57.015 --> 00:00:58.475 and harmful content online 00:00:58.475 --> 00:01:01.353 through technology, policy, and partnerships. 00:01:01.353 --> 00:01:03.355 We also welcome Michelle Ciulla Lipkin, 00:01:03.355 --> 00:01:05.983 the Executive Director of the National Association 00:01:05.983 --> 00:01:07.985 for Media Literacy Education 00:01:07.985 --> 00:01:09.569 which is a professional association 00:01:09.569 --> 00:01:11.780 for educators, academics, activists, 00:01:11.780 --> 00:01:16.368 and students, serving as a leading voice, convener, and resource 00:01:16.368 --> 00:01:19.329 to foster critical thinking and effective communication 00:01:19.329 --> 00:01:21.540 for empowered media participation. 00:01:21.540 --> 00:01:24.334 We also welcome Ebonee Rice who is the Senior Vice President 00:01:24.334 --> 00:01:27.296 of the Educator Network at The News Literacy Project 00:01:27.296 --> 00:01:28.255 which is having their third 00:01:28.255 --> 00:01:31.258 annual National News Literacy Week this week, 00:01:31.258 --> 00:01:32.384 so thank you, Ebonee, 00:01:32.384 --> 00:01:33.802 for being here during what I'm sure 00:01:33.802 --> 00:01:35.971 is a very busy week for you as well. 00:01:35.971 --> 00:01:37.973 We also welcome Tami Bhaumik, 00:01:37.973 --> 00:01:40.851 the Vice President of Civility and Partnerships at Roblox 00:01:40.851 --> 00:01:42.102 where she spearheads 00:01:42.102 --> 00:01:45.188 the platform's digital civility initiative 00:01:45.188 --> 00:01:48.233 working to create positive online experiences. 00:01:48.233 --> 00:01:50.736 Also on the panel is Dr. Kristen Mattson 00:01:50.736 --> 00:01:53.488 who is an educational consultant and adjunct professor 00:01:53.488 --> 00:01:56.408 at the University of Illinois at Urbana-Champaign 00:01:56.408 --> 00:01:59.244 and a leader in digital citizenship curriculum 00:01:59.244 --> 00:02:02.539 and digital citizenship in action. 00:02:02.539 --> 00:02:05.334 Also on the panel we welcome Jennie King, 00:02:05.334 --> 00:02:07.377 the head of Civic Action and Education 00:02:07.377 --> 00:02:10.547 at the Institute for Strategic Dialogue who has co-authored 00:02:10.547 --> 00:02:12.674 the Be Internet Citizens curriculum 00:02:12.674 --> 00:02:16.136 that was formally accredited for schools in the U.K.
00:02:16.136 --> 00:02:18.096 And as our moderator for the panel today, 00:02:18.096 --> 00:02:20.849 we are lucky to welcome Jimmeka Anderson. 00:02:20.849 --> 00:02:22.642 She is a Program Fellow at New America 00:02:22.642 --> 00:02:25.312 and Project Manager for the Cyber Citizenship Initiative 00:02:25.312 --> 00:02:28.523 at the National Association for Media Literacy Education 00:02:28.523 --> 00:02:29.941 who is also busy with her 00:02:29.941 --> 00:02:32.569 PhD in the Curriculum and Instruction, Urban Education 00:02:32.569 --> 00:02:35.447 program at the University of North Carolina at Charlotte. 00:02:35.447 --> 00:02:39.326 So thank you to our esteemed and expert panel here today, 00:02:39.326 --> 00:02:40.869 and without further ado, Jimmeka, 00:02:40.869 --> 00:02:42.454 I will throw it over to you. 00:02:44.790 --> 00:02:46.458 >>Good afternoon, everyone. 00:02:46.458 --> 00:02:50.462 I am super excited to be here. I don't know if you all know, 00:02:50.462 --> 00:02:52.672 but this is the highlight of my week, 00:02:52.672 --> 00:02:55.592 so I'm really excited about this conversation. 00:02:55.592 --> 00:02:59.012 So welcome. Welcome to today's panel titled 00:02:59.012 --> 00:03:01.973 "Navigating Student Safety Online." 00:03:01.973 --> 00:03:03.183 I am Jimmeka Anderson, 00:03:03.183 --> 00:03:05.602 a media literacy education author and media literacy 00:03:05.602 --> 00:03:08.188 consultant for several national organizations 00:03:08.188 --> 00:03:10.732 in the U.S. that focus on media literacy. 00:03:10.732 --> 00:03:12.484 I'm also the current project manager 00:03:12.484 --> 00:03:14.152 for the Cyber Citizenship Initiative 00:03:14.152 --> 00:03:16.655 which focuses on equipping educators 00:03:16.655 --> 00:03:19.616 with easily accessible tools and resources 00:03:19.616 --> 00:03:21.868 to teach media literacy and cybersecurity skills 00:03:21.868 --> 00:03:24.371 to combat misinformation. 00:03:24.371 --> 00:03:26.832 Lastly, I'm a doctoral candidate, okay, 00:03:26.832 --> 00:03:30.419 and I must tell you that my research agenda delves deeply 00:03:30.419 --> 00:03:32.170 and heavily into today's topics 00:03:32.170 --> 00:03:35.173 which is why I'm super excited to serve as the moderator. 00:03:35.173 --> 00:03:36.800 I said this was the highlight of my week 00:03:36.800 --> 00:03:38.093 because I'm going to be taking notes, 00:03:38.093 --> 00:03:41.012 so some of this conversation might end up in a dissertation. 00:03:41.012 --> 00:03:43.765 So why this topic? 00:03:43.765 --> 00:03:47.436 Well, the online world currently provides information 00:03:47.436 --> 00:03:49.771 and connection in every realm of our society 00:03:49.771 --> 00:03:53.483 from our business and politics to the education of our children 00:03:53.483 --> 00:03:55.026 and the ways that they communicate 00:03:55.026 --> 00:03:56.570 and share with each other. 00:03:56.570 --> 00:03:59.448 Yet it is also a realm of manipulation and threat 00:03:59.448 --> 00:04:02.492 that has exponentially grown in recent years. 00:04:02.492 --> 00:04:04.953 False and misleading claims and conspiracy theories 00:04:04.953 --> 00:04:07.581 have gone viral, threatening not just our democracy, 00:04:07.581 --> 00:04:09.332 but even public health.
00:04:09.332 --> 00:04:10.917 At the same time, it's the very ways 00:04:10.917 --> 00:04:13.336 that the online world works that present new challenges 00:04:13.336 --> 00:04:16.423 for students just trying to find reliable information 00:04:16.423 --> 00:04:18.091 for a school research project 00:04:18.091 --> 00:04:20.177 or just make plans for the weekend. 00:04:20.177 --> 00:04:22.596 In facing this new world, new skills are needed, 00:04:22.596 --> 00:04:25.098 digital media literacy skills. 00:04:25.098 --> 00:04:28.685 For context for today's conversation, I love context, 00:04:28.685 --> 00:04:30.854 we define these skills as being the ability 00:04:30.854 --> 00:04:33.440 to access, analyze, evaluate, create, 00:04:33.440 --> 00:04:36.651 and act with various modes of text and media. 00:04:36.651 --> 00:04:38.487 Our focus for today's conversation 00:04:38.487 --> 00:04:40.322 is on skill building 00:04:40.322 --> 00:04:42.157 and what is needed to better navigate 00:04:42.157 --> 00:04:45.994 an increasingly online world safely and effectively, 00:04:45.994 --> 00:04:48.205 including against the potential for manipulation 00:04:48.205 --> 00:04:50.582 at the individual and social level. 00:04:50.582 --> 00:04:53.919 I know you just got to hear about this amazing panel 00:04:53.919 --> 00:04:55.337 and their introductions from Kristi, 00:04:55.337 --> 00:04:57.839 but I would be remiss if I did not highlight 00:04:57.839 --> 00:05:02.177 this esteemed panel of dynamic leaders 00:05:02.177 --> 00:05:04.513 that's doing this work one more time. 00:05:04.513 --> 00:05:06.473 We have Courtney. 00:05:06.473 --> 00:05:09.809 She is the Chief Digital Safety Officer at Microsoft. 00:05:09.809 --> 00:05:11.561 Microsoft is here, y'all. 00:05:11.561 --> 00:05:13.146 We have Michelle Lipkin 00:05:13.146 --> 00:05:14.898 who is the Executive Director 00:05:14.898 --> 00:05:17.776 of the National Association for Media Literacy Education. 00:05:17.776 --> 00:05:20.403 Come on, when we talk about leadership on this topic, 00:05:20.403 --> 00:05:21.821 you all are hearing from experts. 00:05:21.821 --> 00:05:24.824 We have Ebonee Rice with News Literacy Project. 00:05:24.824 --> 00:05:26.993 We have Tami who is the Vice President 00:05:26.993 --> 00:05:29.079 of Civility and Partnerships with Roblox. 00:05:29.079 --> 00:05:32.165 If you have kids or tweens, I have a tween, Roblox, 00:05:32.165 --> 00:05:33.583 you should know what Roblox is. 00:05:33.583 --> 00:05:36.211 Everybody knows what Roblox is if you have a kid. 00:05:36.211 --> 00:05:39.464 We have Dr. Kristen Mattson who is an educational consultant 00:05:39.464 --> 00:05:42.342 and adjunct professor, and we also have Jennie King 00:05:42.342 --> 00:05:45.887 who is head of Civic Action and Education. 00:05:45.887 --> 00:05:47.180 We have an amazing panel. 00:05:47.180 --> 00:05:50.350 This panel today will focus on ways in which schools 00:05:50.350 --> 00:05:52.310 and students are using digital citizenship 00:05:52.310 --> 00:05:53.979 and digital civility skills 00:05:53.979 --> 00:05:56.314 to mitigate hate and harassment online 00:05:56.314 --> 00:05:58.316 including how these programs can also help 00:05:58.316 --> 00:06:00.610 prevent the spread of malicious, violent mis-, 00:06:00.610 --> 00:06:01.903 dis-, and malinformation. 00:06:01.903 --> 00:06:02.779 If you don't know what that is, 00:06:02.779 --> 00:06:04.531 we're going to discuss that in a second.
00:06:04.531 --> 00:06:06.449 The panel will be a roundtable discussion 00:06:06.449 --> 00:06:08.493 with other panelists on ways schools, 00:06:08.493 --> 00:06:12.247 students, parents, and guardians can help build resilience 00:06:12.247 --> 00:06:13.665 against the consumption 00:06:13.665 --> 00:06:16.126 and spread of false and harmful narratives, 00:06:16.126 --> 00:06:18.670 hate, and harassment through digital media literacy, 00:06:18.670 --> 00:06:23.341 critical thinking, and promoting positive online civility. 00:06:23.341 --> 00:06:25.010 As for the outline of today's session, 00:06:25.010 --> 00:06:27.596 the flow of the panel will be broken into three parts. 00:06:27.596 --> 00:06:30.807 Part One will focus on the issue of online polarization 00:06:30.807 --> 00:06:32.017 and misinformation. 00:06:32.017 --> 00:06:34.853 What is it? What does that mean? What's going on, right? 00:06:34.853 --> 00:06:36.563 Part Two will delve into exploring 00:06:36.563 --> 00:06:38.648 the state of media literacy education 00:06:38.648 --> 00:06:41.651 and how it may serve as one potential solution. 00:06:41.651 --> 00:06:44.779 Not the overall solution, 00:06:44.779 --> 00:06:46.990 let me make that clear, but one potential solution 00:06:46.990 --> 00:06:48.658 for combating this issue. 00:06:48.658 --> 00:06:51.244 And lastly, in Part Three of today's panel 00:06:51.244 --> 00:06:53.288 these panelists will share some tips and tools 00:06:53.288 --> 00:06:55.999 to build resilience for educators and parents. 00:06:55.999 --> 00:06:57.292 Each part of today's panel 00:06:57.292 --> 00:06:59.461 is allotted ten minutes for discussion, 00:06:59.461 --> 00:07:02.005 and we will save the last ten minutes at the end 00:07:02.005 --> 00:07:05.050 for questions from the audience. Feel free to drop your questions 00:07:05.050 --> 00:07:06.676 in the chat throughout the panel, 00:07:06.676 --> 00:07:08.470 and we will answer them at the end. 00:07:08.470 --> 00:07:13.099 Now, let's get ready for some great convo. 00:07:13.099 --> 00:07:18.229 All right, so my first question is going to go to Ebonee. 00:07:18.229 --> 00:07:19.981 Let's start off this conversation 00:07:19.981 --> 00:07:22.984 by defining what online polarization is, 00:07:22.984 --> 00:07:26.446 and in what ways false or misleading information 00:07:26.446 --> 00:07:30.033 is spread on the internet to cause this issue. 00:07:30.033 --> 00:07:33.203 How can it lead to offline harm? But let me go back, Ebonee. 00:07:33.203 --> 00:07:35.121 I'm sorry. Let's define misinformation, 00:07:35.121 --> 00:07:38.166 not online polarization. Misinformation. 00:07:38.166 --> 00:07:40.085 >>Great. Let's do that. 00:07:40.085 --> 00:07:43.505 Firstly, I just want to thank you all for coming. 00:07:43.505 --> 00:07:45.382 And thank you for that wonderful introduction. 00:07:45.382 --> 00:07:49.177 It's such a pleasure to be here. Thank you so much, Jimmeka. 00:07:49.177 --> 00:07:53.139 So we at The News Literacy Project 00:07:53.139 --> 00:07:54.683 define misinformation 00:07:54.683 --> 00:07:56.810 really just as information 00:07:56.810 --> 00:08:00.522 that is created or shared that is misleading, 00:08:00.522 --> 00:08:04.234 with disinformation as a subset of misinformation, 00:08:04.234 --> 00:08:06.361 and we break that down into five categories.
00:08:09.614 --> 00:08:12.033 So the first is satire, things that you see online 00:08:12.033 --> 00:08:15.495 that are meant to be jokes or meant to be funny 00:08:15.495 --> 00:08:18.289 but are used as something serious 00:08:18.289 --> 00:08:20.458 and are manipulated into making people think 00:08:20.458 --> 00:08:23.628 that this is something that's actually true or very serious. 00:08:23.628 --> 00:08:27.424 The second is false context which is literally just something 00:08:27.424 --> 00:08:30.719 taken out of context like a quote, an event, 00:08:30.719 --> 00:08:33.388 or something that is just literally taken out of context 00:08:33.388 --> 00:08:37.350 and put into a context that makes it appear true 00:08:37.350 --> 00:08:39.561 or appear in another context that was either not 00:08:39.561 --> 00:08:43.857 originally said or that didn't happen. 00:08:43.857 --> 00:08:46.568 The third is fabricated content 00:08:46.568 --> 00:08:47.986 which is exactly what it sounds like. 00:08:47.986 --> 00:08:50.989 It's like misleading images or things that you see like 00:08:50.989 --> 00:08:55.285 charts or graphs and things that are fabricated to also, 00:08:55.285 --> 00:08:59.289 again, give the impression that it is something that it is not. 00:08:59.289 --> 00:09:02.125 Then of course there's imposter content 00:09:02.125 --> 00:09:05.003 which is when you see like famous people or celebrities 00:09:05.003 --> 00:09:07.547 and they're holding a sign or they're wearing a t-shirt 00:09:07.547 --> 00:09:11.593 that says something or is doing something 00:09:11.593 --> 00:09:14.846 that they probably weren't doing. 00:09:14.846 --> 00:09:16.306 It's an imposter. 00:09:16.306 --> 00:09:19.726 And then the final thing that we categorize as misinformation 00:09:19.726 --> 00:09:22.729 is manipulated content, and that's when you see slogans 00:09:22.729 --> 00:09:25.273 and you see hats and see so much of this 00:09:25.273 --> 00:09:30.111 that is literally just manipulated content online 00:09:30.111 --> 00:09:31.571 or on our phones that we see 00:09:31.571 --> 00:09:33.656 that's just been manipulated or changed, like 00:09:33.656 --> 00:09:35.450 when you see a person who is in a place 00:09:35.450 --> 00:09:37.994 that they weren't actually at or you see a background 00:09:37.994 --> 00:09:40.705 that doesn't represent the real-time event 00:09:40.705 --> 00:09:42.165 that was actually happening at that time 00:09:42.165 --> 00:09:45.084 or the content has just been manipulated to make it seem 00:09:45.084 --> 00:09:46.503 as if it's something else. 00:09:46.503 --> 00:09:49.547 Now, I think the tricky part about misinformation 00:09:49.547 --> 00:09:50.799 and disinformation 00:09:50.799 --> 00:09:54.886 is that the difference there can sometimes come down to intent. 00:09:54.886 --> 00:09:59.349 With misinformation, oftentimes people share that by accident. 00:09:59.349 --> 00:10:00.642 They have good intentions, 00:10:00.642 --> 00:10:02.644 and they're just trying to let you know something 00:10:02.644 --> 00:10:04.854 that they see online that causes a reaction in them, 00:10:04.854 --> 00:10:06.064 and they want the world to know it. 00:10:06.064 --> 00:10:07.482 They want everyone to see it.
00:10:07.482 --> 00:10:08.733 And then on 00:10:08.733 --> 00:10:11.861 the disinformation side of things you have bad actors 00:10:11.861 --> 00:10:16.157 or people that are capitalizing on our emotions, our values, 00:10:16.157 --> 00:10:18.076 and our morals as a country 00:10:18.076 --> 00:10:20.286 or as a person or a racial identity. 00:10:20.286 --> 00:10:25.458 Whatever it is. They're using that to meet a desired end. 00:10:25.458 --> 00:10:29.629 The challenge is that we're not able to judge that intention. 00:10:29.629 --> 00:10:30.964 We don't know these people. 00:10:30.964 --> 00:10:32.257 We don't know where something originated. 00:10:32.257 --> 00:10:34.968 It's really difficult to find those things. 00:10:34.968 --> 00:10:36.678 So when we're talking about misinformation 00:10:36.678 --> 00:10:38.346 at The News Literacy Project 00:10:38.346 --> 00:10:40.598 we are talking about all five of those things 00:10:40.598 --> 00:10:43.768 within that category that I listed earlier. 00:10:45.061 --> 00:10:49.524 >>Wow, thank you, Ebonee, so much for a very cool 00:10:49.524 --> 00:10:52.193 explanation of misinformation. 00:10:52.193 --> 00:10:55.196 So I know that a lot of the challenges 00:10:55.196 --> 00:10:57.699 that we're seeing right now in the online environment 00:10:57.699 --> 00:11:00.869 are such things as online polarization 00:11:00.869 --> 00:11:04.080 and the challenge of creating online safe spaces 00:11:04.080 --> 00:11:05.748 or really even knowing how to navigate online 00:11:05.748 --> 00:11:09.836 which has been a huge issue 00:11:09.836 --> 00:11:12.130 that has been brought to the forefront. 00:11:12.130 --> 00:11:14.591 So it's very clear that this is a global issue. 00:11:14.591 --> 00:11:17.802 It impacts many sectors and fields and facets of our lives 00:11:17.802 --> 00:11:20.638 from politics to health, education, safety, and security. 00:11:20.638 --> 00:11:22.682 This next question is for you, Kristen. 00:11:22.682 --> 00:11:24.183 How important do you think it is 00:11:24.183 --> 00:11:26.561 to take a multi-stakeholder approach 00:11:26.561 --> 00:11:28.813 when it comes to tackling challenges 00:11:28.813 --> 00:11:31.524 in the online space that we're facing? 00:11:31.524 --> 00:11:33.651 What do you think it will take to do so? 00:11:33.651 --> 00:11:36.362 How can we create this kind of space 00:11:36.362 --> 00:11:39.324 to address this collectively? 00:11:39.324 --> 00:11:42.619 >>Yeah, my background is as a former educator 00:11:42.619 --> 00:11:43.828 and school librarian, 00:11:43.828 --> 00:11:46.581 and I now teach preservice school librarians 00:11:46.581 --> 00:11:49.167 and work with practicing educators. 00:11:49.167 --> 00:11:51.961 Education is definitely part of the solution, 00:11:51.961 --> 00:11:54.464 but when I go out and work with schools 00:11:54.464 --> 00:11:56.674 I hear about so many barriers 00:11:56.674 --> 00:12:00.136 to media literacy education initiatives.
00:12:00.136 --> 00:12:01.387 Some of those barriers 00:12:01.387 --> 00:12:04.641 are really outside of the control of the classroom, 00:12:04.641 --> 00:12:07.060 so when we think about policy change 00:12:07.060 --> 00:12:10.647 and we think about availability of high-quality resources 00:12:10.647 --> 00:12:15.318 and think about teacher training and just the skills 00:12:15.318 --> 00:12:18.488 that our educators are going to need to carry out this work, 00:12:18.488 --> 00:12:21.324 it really has to be a comprehensive approach 00:12:21.324 --> 00:12:23.451 where all stakeholders are coming together 00:12:23.451 --> 00:12:28.081 to really pave the way to make this stick in classrooms. 00:12:28.081 --> 00:12:32.669 So I want to give a shoutout to folks like you, Jimmeka, 00:12:32.669 --> 00:12:37.507 and you, Ebonee, and Michelle who work with ISTE, New America, 00:12:37.507 --> 00:12:41.344 All Tech is Human and David Ryan Polgar, I'm sorry, 00:12:41.344 --> 00:12:43.346 and non-profit organizations 00:12:43.346 --> 00:12:45.181 who are doing just a really great job 00:12:45.181 --> 00:12:47.809 of putting people at the table together. 00:12:47.809 --> 00:12:51.104 They're breaking down those silos between work 00:12:51.104 --> 00:12:55.400 that is happening in tech companies like Roblox 00:12:55.400 --> 00:12:59.696 and non-profit spaces and educational institutions. 00:12:59.696 --> 00:13:02.365 It really is just going to take all of us 00:13:02.365 --> 00:13:04.993 coming together to make a change. 00:13:04.993 --> 00:13:06.536 So I appreciate this panel 00:13:06.536 --> 00:13:08.496 because I think it's a great opportunity 00:13:08.496 --> 00:13:09.998 to start the conversation. 00:13:11.332 --> 00:13:15.169 >>Yes, it is, and you're right, Dr. Kristen. 00:13:15.169 --> 00:13:17.046 It's okay for me to call you Dr. Kristen, right? 00:13:17.046 --> 00:13:18.464 Are we informal here? 00:13:18.464 --> 00:13:19.757 >>Yes. 00:13:19.757 --> 00:13:22.677 >>Well, you know, I do want to highlight so much work 00:13:22.677 --> 00:13:23.970 which is now being done, 00:13:23.970 --> 00:13:27.473 and we're seeing cybersecurity sectors coming together 00:13:27.473 --> 00:13:31.436 and seeing national associations 00:13:31.436 --> 00:13:35.106 and policy think tanks and researchers 00:13:35.106 --> 00:13:37.233 and everyone kind of now saying, 00:13:37.233 --> 00:13:38.776 "You know what? This is hitting us. 00:13:38.776 --> 00:13:42.530 Every facet is impacting us just like it's impacting you." 00:13:42.530 --> 00:13:44.365 At first I think it was impacting the kids. 00:13:44.365 --> 00:13:46.284 I think that's something the media literacy educators 00:13:46.284 --> 00:13:47.660 were saying, "Let's help the kids. 00:13:47.660 --> 00:13:48.494 Let's teach the kids." 00:13:48.494 --> 00:13:49.912 I think everyone is now saying 00:13:49.912 --> 00:13:53.374 that it's beyond just education with this issue. 00:13:53.374 --> 00:13:55.835 We all have to come together, right? 00:13:55.835 --> 00:13:58.504 But I think for the most part 00:13:58.504 --> 00:14:02.842 there still is not one consensus 00:14:02.842 --> 00:14:05.887 on really what is happening, right? 00:14:05.887 --> 00:14:09.140 There are so many different conversations 00:14:09.140 --> 00:14:11.517 that are taking place with the general public 00:14:11.517 --> 00:14:12.810 about understanding 00:14:12.810 --> 00:14:15.396 what misinformation or disinformation is, 00:14:15.396 --> 00:14:19.567 so I'm going to throw this question out to the panel.
00:14:19.567 --> 00:14:22.070 Michelle, I think you'll be great for this one. 00:14:22.070 --> 00:14:23.696 What do you believe is the general public's 00:14:23.696 --> 00:14:26.532 understanding of mis and disinformation and what it does? 00:14:26.532 --> 00:14:30.036 What do you wish they knew or understood better? 00:14:30.036 --> 00:14:31.621 >>I will jump in if that's okay, Jimmeka. 00:14:31.621 --> 00:14:32.830 Thank you so much. 00:14:32.830 --> 00:14:35.291 It's so nice to be here, and it's so great. 00:14:35.291 --> 00:14:38.127 I'm already taking notes and learning. 00:14:38.127 --> 00:14:41.923 I have to say, I think that we can't underestimate 00:14:41.923 --> 00:14:45.259 what an impact the change of our communication systems 00:14:45.259 --> 00:14:46.469 over the last decade 00:14:46.469 --> 00:14:50.389 has had on everything about being human, right? 00:14:50.389 --> 00:14:52.141 We have completely changed the way 00:14:52.141 --> 00:14:54.102 that we communicate as a species, 00:14:54.102 --> 00:14:56.854 and that's hard to keep up with and hard to understand. 00:14:56.854 --> 00:15:00.066 I think that thankfully, with efforts like 00:15:00.066 --> 00:15:01.317 The News Literacy Project 00:15:01.317 --> 00:15:06.280 and other organizations, we are getting better at teaching 00:15:06.280 --> 00:15:08.825 and making sure people understand 00:15:08.825 --> 00:15:11.994 mis and disinformation. 00:15:11.994 --> 00:15:14.163 There's a lot of mis and disinformation out there, 00:15:14.163 --> 00:15:16.332 and there's a lot of work to do, 00:15:16.332 --> 00:15:21.462 and I think for me, with the media literacy organization 00:15:21.462 --> 00:15:22.588 that I run, 00:15:22.588 --> 00:15:24.132 I want to focus 00:15:24.132 --> 00:15:27.802 on the "what do you wish the public understood better" part. 00:15:27.802 --> 00:15:32.640 I think we have had such a focus on battling 00:15:32.640 --> 00:15:35.101 and combating mis and disinformation 00:15:35.101 --> 00:15:36.811 that we do have to remember 00:15:36.811 --> 00:15:41.274 that if we magically were able to solve this problem, right, 00:15:41.274 --> 00:15:43.151 and we could disappear 00:15:43.151 --> 00:15:45.987 all the mis and disinformation on the internet, 00:15:45.987 --> 00:15:49.615 there would still be a lot of plain ol' information 00:15:49.615 --> 00:15:51.117 that we don't understand. 00:15:51.117 --> 00:15:53.244 So it's recognizing that the skills 00:15:53.244 --> 00:15:56.831 that we need to understand mis and disinformation 00:15:56.831 --> 00:16:00.042 really apply to all information. 00:16:00.042 --> 00:16:03.087 Information is super complicated, 00:16:03.087 --> 00:16:05.006 and most of the information out there 00:16:05.006 --> 00:16:08.009 isn't completely true and isn't completely false. 00:16:08.009 --> 00:16:09.677 We need to understand that nuance, 00:16:09.677 --> 00:16:12.138 and we need to be willing to dive deep 00:16:12.138 --> 00:16:14.807 into understanding our communication systems. 00:16:14.807 --> 00:16:18.519 So I kind of want everyone to understand 00:16:18.519 --> 00:16:21.814 that this is hard work that we need to be doing, 00:16:21.814 --> 00:16:23.232 and I just want to stress 00:16:23.232 --> 00:16:27.069 that while education might not be the sole answer, 00:16:27.069 --> 00:16:30.823 we cannot do this without education. 00:16:30.823 --> 00:16:35.119 When I see multi-stakeholder groups my biggest frustration 00:16:35.119 --> 00:16:38.539 is if the education conversation isn't part of that.
00:16:38.539 --> 00:16:40.750 We have to really, really continue that. 00:16:40.750 --> 00:16:44.503 Like Dr. Kristen said, it's a multi-stakeholder problem 00:16:44.503 --> 00:16:49.550 that deserves thoughtful multi-stakeholder solutions. 00:16:49.550 --> 00:16:51.135 >>Thank you. I'm not sure if someone else 00:16:51.135 --> 00:16:53.304 wants to jump in on that question. 00:16:53.304 --> 00:16:58.142 >>I'd love to jump in, soon-to-be Dr. Anderson, 00:16:58.142 --> 00:17:01.437 just because I think there's a critical bridge 00:17:01.437 --> 00:17:03.773 between the last two discussions. 00:17:03.773 --> 00:17:07.360 We at Microsoft really try to advance the conversation 00:17:07.360 --> 00:17:10.988 about the broad concept of digital safety online 00:17:10.988 --> 00:17:12.990 whether that is today's important discussion 00:17:12.990 --> 00:17:14.617 about misinformation 00:17:14.617 --> 00:17:17.995 or child sexual exploitation or terrorism content. 00:17:17.995 --> 00:17:20.456 These are truly whole-of-society problems 00:17:20.456 --> 00:17:23.417 that need whole-of-society solutions. 00:17:23.417 --> 00:17:25.461 In other contexts we've brought together 00:17:25.461 --> 00:17:27.964 critical multi-stakeholder organizations 00:17:27.964 --> 00:17:30.299 to tackle those important issues. 00:17:30.299 --> 00:17:33.552 One of the spaces where I love helping advance 00:17:33.552 --> 00:17:35.846 that multi-stakeholder approach is to come from the tech industry 00:17:35.846 --> 00:17:38.766 and say, "Listen, we are a partner here." 00:17:38.766 --> 00:17:40.810 We approach this through prevention, 00:17:40.810 --> 00:17:44.522 and we approach this through internal policies, 00:17:44.522 --> 00:17:46.315 partnerships, and innovation. 00:17:46.315 --> 00:17:50.152 But if you think nerding harder is going to solve the problem of 00:17:50.152 --> 00:17:53.197 global misinformation, it's not. 00:17:53.197 --> 00:17:57.118 It has to be a much more holistic approach. 00:17:57.118 --> 00:17:59.745 So what I loved about what Michelle just said 00:17:59.745 --> 00:18:02.748 is that we've thought about other topical areas. 00:18:02.748 --> 00:18:05.126 We've walked into a really critical framework 00:18:05.126 --> 00:18:07.461 in which we know we have to invest in research 00:18:07.461 --> 00:18:10.339 to understand the threats that users expose themselves to online. 00:18:10.339 --> 00:18:11.966 We need to invest in research to understand 00:18:11.966 --> 00:18:15.761 the connectivity between online harms and offline harms, 00:18:15.761 --> 00:18:18.264 and then we really need to see that research applied 00:18:18.264 --> 00:18:21.350 in practical mechanisms that, to be perfectly frank, 00:18:21.350 --> 00:18:24.812 are not just accessible to big tech companies 00:18:24.812 --> 00:18:27.648 like Microsoft, but enable any new, 00:18:27.648 --> 00:18:32.486 small app developer to access a critical playbook of tools 00:18:32.486 --> 00:18:36.699 to make sure that their platform is safer for the consumers. 00:18:36.699 --> 00:18:39.368 I think I have to underscore it one more time. 00:18:39.368 --> 00:18:43.497 If the public thinks we can delete misinformation online, 00:18:43.497 --> 00:18:45.833 that is a fundamental misunderstanding of the topic 00:18:45.833 --> 00:18:47.418 that we're here to talk about today, 00:18:47.418 --> 00:18:51.672 and it's why it comes right back to that whole-of-society approach. 00:18:51.672 --> 00:18:53.090 >>Jimmeka, if I could jump in as well.
00:18:53.090 --> 00:18:53.799 >>Go ahead. 00:18:53.799 --> 00:18:55.843 >>I think that there's so much important research 00:18:55.843 --> 00:18:58.929 that also needs to be done in the educational space. 00:18:58.929 --> 00:19:01.932 We've tried very hard to empower young people 00:19:01.932 --> 00:19:04.268 with that filter between their ears 00:19:04.268 --> 00:19:05.561 which I'm a huge fan of, right, 00:19:05.561 --> 00:19:08.189 and if we can empower young people with skills 00:19:08.189 --> 00:19:09.648 they're going to be more equipped 00:19:09.648 --> 00:19:12.151 than if we just tell them a bunch of rules to follow. 00:19:12.151 --> 00:19:14.195 But we don't have a lot of research right now 00:19:14.195 --> 00:19:17.406 to prove the effectiveness of these different curricula 00:19:17.406 --> 00:19:19.283 that we've tried to implement. 00:19:19.283 --> 00:19:20.826 I'm teaching a graduate class 00:19:20.826 --> 00:19:23.996 right now, and the discussion that we all had last night 00:19:23.996 --> 00:19:26.540 was whether media literacy education is working, 00:19:26.540 --> 00:19:29.585 and we sort of looked at the points and counterpoints 00:19:29.585 --> 00:19:32.004 between whether we have created a society 00:19:32.004 --> 00:19:34.673 that maybe questions too critically 00:19:34.673 --> 00:19:37.343 and whether we've pushed young people into believing 00:19:37.343 --> 00:19:39.553 that nothing can be trusted. 00:19:39.553 --> 00:19:44.642 What is the danger there when we get young people 00:19:44.642 --> 00:19:46.394 almost overly 00:19:46.394 --> 00:19:48.646 criticizing everything that they come across? 00:19:48.646 --> 00:19:50.773 So I think it's going to be really important to invest 00:19:50.773 --> 00:19:52.566 in research to determine 00:19:52.566 --> 00:19:54.819 if the curriculum that we are implementing in schools 00:19:54.819 --> 00:20:00.157 really is giving us the intended outcome that we desire, 00:20:00.157 --> 00:20:02.326 not just from a space of whether our kids 00:20:02.326 --> 00:20:05.663 can answer a quiz question or produce a paper, 00:20:05.663 --> 00:20:07.081 but are they taking the skills 00:20:07.081 --> 00:20:08.541 that they're learning in the classroom 00:20:08.541 --> 00:20:12.920 and actually applying them out in the wild of the internet? 00:20:12.920 --> 00:20:14.130 >>Thank you, Dr. Kristen. 00:20:14.130 --> 00:20:15.339 >>Jimmeka, if I might... 00:20:15.339 --> 00:20:17.091 ->>Oh, go ahead, Jennie. ->>[Laughter] 00:20:17.091 --> 00:20:19.385 We're all very, very excited about this question clearly, 00:20:19.385 --> 00:20:22.096 but to build on what's being said, 00:20:22.096 --> 00:20:25.307 at ISD as an organization we have an enormous evidence base 00:20:25.307 --> 00:20:27.476 in terms of the proliferation of harms online. 00:20:27.476 --> 00:20:29.353 That was where we started as an organization 00:20:29.353 --> 00:20:32.523 with researching extremist movements, 00:20:32.523 --> 00:20:35.526 electoral integrity, disinformation, and hate, 00:20:35.526 --> 00:20:40.322 and there's always an instinct to rewrite your resources 00:20:40.322 --> 00:20:42.992 every time a new phenomenon occurs in the online space. 00:20:42.992 --> 00:20:44.577 So, you know, QAnon happens, 00:20:44.577 --> 00:20:45.995 so do we need to be teaching young people 00:20:45.995 --> 00:20:47.246 directly about QAnon 00:20:47.246 --> 00:20:49.874 without providing more oxygen for these things?
00:20:49.874 --> 00:20:52.334 What we have always said to educators is, 00:20:52.334 --> 00:20:54.879 if you do that, you are automatically reducing 00:20:54.879 --> 00:20:57.298 the shelf life of your materials 00:20:57.298 --> 00:20:59.550 because the reality is that the fad 00:20:59.550 --> 00:21:02.011 or the thing that everyone is the most worried about today 00:21:02.011 --> 00:21:04.346 will not be the case potentially in a day, 00:21:04.346 --> 00:21:07.141 let alone a week, let alone six months. 00:21:07.141 --> 00:21:09.393 What that does is create an overfixation 00:21:09.393 --> 00:21:11.854 on individual types of content 00:21:11.854 --> 00:21:14.815 or individual trends in disinformation 00:21:14.815 --> 00:21:16.859 rather than the relationship 00:21:16.859 --> 00:21:19.069 that we as individuals have to content 00:21:19.069 --> 00:21:25.451 and how it plays upon our biases and our unconscious biases 00:21:25.451 --> 00:21:28.787 as well as our own echo chambers, etc. 00:21:28.787 --> 00:21:30.998 So when we come to developing resources 00:21:30.998 --> 00:21:33.000 we're always very keen to put it back 00:21:33.000 --> 00:21:35.169 to those constituent building blocks, right, 00:21:35.169 --> 00:21:38.422 of how do you navigate 00:21:38.422 --> 00:21:40.508 this terrain of online information? 00:21:40.508 --> 00:21:43.385 Also as Michelle was saying, it's the fact that 00:21:43.385 --> 00:21:45.679 this is not a completely different experience 00:21:45.679 --> 00:21:47.598 or set of learning outcomes than you would want 00:21:47.598 --> 00:21:49.767 from general citizenship education. 00:21:49.767 --> 00:21:52.853 The fact is we just have a rubric for what it means 00:21:52.853 --> 00:21:55.314 to be a good citizen in the offline space 00:21:55.314 --> 00:21:57.650 which might mean that you pay your taxes 00:21:57.650 --> 00:21:58.984 or you vote in elections, 00:21:58.984 --> 00:22:02.488 and we do not have a comparable framework 00:22:02.488 --> 00:22:05.157 of what it means to be a positive contributor 00:22:05.157 --> 00:22:07.493 or participant in online spaces. 00:22:07.493 --> 00:22:11.413 So the two things are very much interconnected, for us anyway, 00:22:11.413 --> 00:22:12.998 and that's always been the core principle 00:22:12.998 --> 00:22:15.793 of our material development rather than, 00:22:15.793 --> 00:22:18.337 "Oh, god, we need to teach them explicitly about deepfakes 00:22:18.337 --> 00:22:19.547 because who knows 00:22:19.547 --> 00:22:21.507 whether deepfakes are going to be the thing 00:22:21.507 --> 00:22:25.678 or whether TikTok is going to continue to exist," etc. 00:22:28.347 --> 00:22:29.848 >>So I'm with Roblox, 00:22:29.848 --> 00:22:32.059 and for those of you who are unfamiliar with Roblox, 00:22:32.059 --> 00:22:39.066 we are a platform that allows and educates young developers 00:22:39.066 --> 00:22:42.778 to create experiences to bring people together 00:22:42.778 --> 00:22:46.365 to learn about one another in diverse and inclusive worlds. 00:22:46.365 --> 00:22:49.159 We have all of the parental settings 00:22:49.159 --> 00:22:51.495 and the tools available just like Microsoft 00:22:51.495 --> 00:22:54.665 and just like so many of these other platforms. 00:22:54.665 --> 00:22:58.961 We can do everything that we can do to educate parents, kids, 00:22:58.961 --> 00:23:01.463 and teens to be able to use those tools, 00:23:01.463 --> 00:23:05.509 but it's not just about Roblox or Microsoft.
00:23:05.509 --> 00:23:08.470 I'm incredibly encouraged by this panel 00:23:08.470 --> 00:23:10.681 and some of the incredible work 00:23:10.681 --> 00:23:12.474 that is going on out there 00:23:12.474 --> 00:23:15.436 to really, really put a spotlight 00:23:15.436 --> 00:23:20.524 on the necessity for education in order to make sure 00:23:20.524 --> 00:23:25.195 that this next generation thrives for all of us. 00:23:25.195 --> 00:23:30.284 Quite frankly, the whole digital world is still in its infancy, 00:23:30.284 --> 00:23:32.202 and for those of us that have kids 00:23:32.202 --> 00:23:36.749 who are growing up in this world there's no hard line 00:23:36.749 --> 00:23:39.001 between the digital world and the physical world, 00:23:39.001 --> 00:23:41.670 so it's really, really important to understand 00:23:41.670 --> 00:23:44.340 and to listen to them and understand 00:23:44.340 --> 00:23:45.674 what they're going through 00:23:45.674 --> 00:23:50.220 so that we can adapt new features, new policies, etc. 00:23:50.220 --> 00:23:52.598 that are not fear-based. 00:23:52.598 --> 00:23:55.142 So many of our parents come to us and say, 00:23:55.142 --> 00:23:56.977 "Well, I don't know really what to do. 00:23:56.977 --> 00:24:01.440 I don't really know what Roblox is." 00:24:01.440 --> 00:24:04.568 Honestly, that's not okay anymore. 00:24:04.568 --> 00:24:11.659 I think it's a necessity for this older generation 00:24:11.659 --> 00:24:14.495 to be able to empower this younger generation. 00:24:18.707 --> 00:24:22.836 Dr. Mattson, you mentioned All Tech is Human. 00:24:22.836 --> 00:24:25.547 We're deeply, deeply involved with that. 00:24:25.547 --> 00:24:29.301 There's a lot of research with incredible academics 00:24:29.301 --> 00:24:31.887 through Boston Children's Hospital 00:24:31.887 --> 00:24:35.766 with the Digital Wellness Lab and Harvard Medical School. 00:24:35.766 --> 00:24:37.976 Then there's Common Sense Media 00:24:37.976 --> 00:24:40.562 and Family Online Safety Institute. 00:24:40.562 --> 00:24:43.357 All of these incredible organizations 00:24:43.357 --> 00:24:46.068 have digital citizenship curricula 00:24:46.068 --> 00:24:48.487 and hopefully will help level the playing field 00:24:48.487 --> 00:24:50.447 so that we can really, 00:24:50.447 --> 00:24:54.410 really make sure this next generation thrives. 00:24:54.410 --> 00:24:59.039 >>Thank you all for jumping in and tackling that question. 00:24:59.039 --> 00:25:03.377 So many amazing thoughts came to mind, 00:25:03.377 --> 00:25:07.381 and I think that one of the main premises of this dialogue 00:25:07.381 --> 00:25:08.924 was what's working? 00:25:08.924 --> 00:25:13.679 What's not? What's happening? What needs to still happen? 00:25:13.679 --> 00:25:16.265 I know, Dr. Kristen, you brought up a lot 00:25:16.265 --> 00:25:19.768 about even assessing media literacy education 00:25:19.768 --> 00:25:22.563 and whether it's really being effective. 00:25:22.563 --> 00:25:24.189 If we were in person I would shift 00:25:24.189 --> 00:25:26.650 and turn to Michelle like this and say, 00:25:26.650 --> 00:25:29.236 "Hey, Michelle." [Laughter] 00:25:29.236 --> 00:25:32.656 You all are doing a lot of work at NAMLE 00:25:32.656 --> 00:25:34.700 that has really been assessing the landscape 00:25:34.700 --> 00:25:36.577 of media literacy education, 00:25:36.577 --> 00:25:39.788 so what can you share with us about what's happening?
00:25:39.788 --> 00:25:40.998 What does this landscape 00:25:40.998 --> 00:25:43.333 look like with media literacy education? 00:25:43.333 --> 00:25:45.836 What has the organization seen as successes 00:25:45.836 --> 00:25:47.921 or still challenges that are out there? 00:25:47.921 --> 00:25:49.757 >>Great. So thank you for that question, 00:25:49.757 --> 00:25:53.135 and I do want to say that I usually answer that question 00:25:53.135 --> 00:25:54.678 in an hour-long presentation, 00:25:54.678 --> 00:25:58.849 so if anyone wants to talk more about it just contact me. 00:25:58.849 --> 00:26:00.142 Let me just summarize. 00:26:00.142 --> 00:26:03.645 So we are an organization of 7,000 members, 00:26:03.645 --> 00:26:08.692 and within our organization we have 90 organizational partners. 00:26:08.692 --> 00:26:11.153 The reason I tell you that is because what I know 00:26:11.153 --> 00:26:12.863 and what I see on a daily basis 00:26:12.863 --> 00:26:17.159 is incredible work being done in this space. 00:26:17.159 --> 00:26:19.828 Every organization, every practitioner, 00:26:19.828 --> 00:26:22.748 and every researcher, what they're doing in their classroom 00:26:22.748 --> 00:26:26.376 and what they're bringing to their community is incredible. 00:26:26.376 --> 00:26:27.711 That's the good news. 00:26:27.711 --> 00:26:30.589 The bad news is that media literacy education 00:26:30.589 --> 00:26:33.717 is not a national priority in our country. 00:26:33.717 --> 00:26:35.511 It is not across the board. 00:26:35.511 --> 00:26:39.473 We are still working from bottom up. 00:26:39.473 --> 00:26:41.141 We are going to have a really, really 00:26:41.141 --> 00:26:45.604 difficult time scaling and really making systemic change 00:26:45.604 --> 00:26:48.232 unless we change some of that, right? 00:26:48.232 --> 00:26:49.733 So let me just tell you this quickly. 00:26:49.733 --> 00:26:51.985 Where is it being practiced? What have we noticed? 00:26:51.985 --> 00:26:55.364 We kind of separate practice into three buckets. 00:26:55.364 --> 00:26:59.159 One is PreK-12, one is Higher Ed, and one is Communities. 00:26:59.159 --> 00:27:00.994 These are enormous buckets. 00:27:00.994 --> 00:27:04.373 Just within PreK-12 you might see media literacy 00:27:04.373 --> 00:27:05.958 being taught in a subject area 00:27:05.958 --> 00:27:08.168 or being taught by the library media specialist 00:27:08.168 --> 00:27:10.504 or being taught by a tech integration specialist. 00:27:10.504 --> 00:27:13.173 It might be taught from an outside organization 00:27:13.173 --> 00:27:15.926 like News Literacy Project coming into a school. 00:27:15.926 --> 00:27:17.511 In Higher Education you can see it 00:27:17.511 --> 00:27:18.929 as a stand-alone course 00:27:18.929 --> 00:27:21.223 or integrated into existing courses. 00:27:21.223 --> 00:27:23.225 You might see it in an education school, 00:27:23.225 --> 00:27:25.477 or you might see it in a communications school. 00:27:25.477 --> 00:27:26.687 There are strands of studies. 00:27:26.687 --> 00:27:30.399 There are degrees and programs and growing scholarship. 00:27:30.399 --> 00:27:32.067 With communities there are tons 00:27:32.067 --> 00:27:34.319 of non-profit community-based organizations 00:27:34.319 --> 00:27:36.613 or public libraries taking on this work. 00:27:36.613 --> 00:27:39.491 There's after-school programs and community centers.
00:27:39.491 --> 00:27:40.826 Our public media system 00:27:40.826 --> 00:27:43.871 is taking media literacy incredibly seriously, 00:27:43.871 --> 00:27:45.330 and then we have our partnerships 00:27:45.330 --> 00:27:47.165 with the media and tech companies, 00:27:47.165 --> 00:27:49.877 all of whom are taking some responsibility 00:27:49.877 --> 00:27:51.920 in moving media literacy forward. 00:27:51.920 --> 00:27:54.047 So the opportunities are vast, right? 00:27:54.047 --> 00:27:56.091 We see urgency is evident. 00:27:56.091 --> 00:27:57.384 I have to tell you, 00:27:57.384 --> 00:28:00.637 my organization has existed since 1997, 00:28:00.637 --> 00:28:04.558 and the fact that urgency is evident now is a huge victory. 00:28:04.558 --> 00:28:08.353 For the first five years of my job I had to show up 00:28:08.353 --> 00:28:11.440 and prove that this was important to talk about, 00:28:11.440 --> 00:28:15.444 and now I can't stop talking about it because everyone cares. 00:28:15.444 --> 00:28:17.696 That is a huge opportunity for us. 00:28:17.696 --> 00:28:19.573 Growing attention is attached to that. 00:28:19.573 --> 00:28:21.992 There's increased support, increased practice, 00:28:21.992 --> 00:28:23.243 and increased research. 00:28:23.243 --> 00:28:26.079 This is amazing. 00:28:26.079 --> 00:28:29.249 I can't even tell you, from the time I started in 2012 00:28:29.249 --> 00:28:33.295 and the time we're at now, it's a world of difference. 00:28:33.295 --> 00:28:37.174 So where are the big challenges? 00:28:37.174 --> 00:28:39.259 Just listen to what I said about the buckets. 00:28:39.259 --> 00:28:41.345 There's no standard model, right? 00:28:41.345 --> 00:28:44.348 There's no way I can go into a school or school district 00:28:44.348 --> 00:28:46.975 and say, "Oh, this is what you have to do." 00:28:46.975 --> 00:28:49.561 It depends on community, it depends on population, 00:28:49.561 --> 00:28:52.606 and it depends on a lot of individual, 00:28:52.606 --> 00:28:55.108 kind of unique, circumstances. 00:28:55.108 --> 00:28:57.402 It still relies on individuals, like I said. 00:28:57.402 --> 00:29:00.614 It relies on the teacher, the organization, or the school. 00:29:00.614 --> 00:29:02.783 Who is going to say this is important? 00:29:02.783 --> 00:29:05.953 It's a very, very difficult place to scale from. 00:29:05.953 --> 00:29:08.330 There's still limited funding in this space, 00:29:08.330 --> 00:29:10.290 and there's a lot of us that need it, 00:29:10.290 --> 00:29:11.583 so that competition 00:29:11.583 --> 00:29:14.670 can really kind of muddy the waters a little bit. 00:29:14.670 --> 00:29:17.339 Then there is that lack of public understanding, 00:29:17.339 --> 00:29:20.592 so while the conversation about fake news and misinformation 00:29:20.592 --> 00:29:22.719 and disinformation is vital 00:29:22.719 --> 00:29:24.471 and has really brought all of this 00:29:24.471 --> 00:29:27.349 into the cultural conversation, thank goodness, 00:29:27.349 --> 00:29:29.142 there's still a lack of public understanding 00:29:29.142 --> 00:29:31.979 about what media literacy skills are 00:29:31.979 --> 00:29:35.148 and what we are really trying to do 00:29:35.148 --> 00:29:37.359 when it comes to people understanding 00:29:37.359 --> 00:29:39.152 the media ecosystem. 00:29:39.152 --> 00:29:41.279 So bottom line, it's a great practice 00:29:41.279 --> 00:29:45.534 with amazing organizations and amazing educators out there, 00:29:45.534 --> 00:29:48.036 but we need to be doing more in the U.S.
00:29:48.036 --> 00:29:51.665 to make systemic change. 00:29:51.665 --> 00:29:53.834 ->>Thank you. ->>Can I cut in, Jimmeka? 00:29:53.834 --> 00:29:55.127 >>Oh, okay. 00:29:55.127 --> 00:29:58.630 >>Oh, I'm sorry. Were you going to ask another question? 00:29:58.630 --> 00:30:00.298 >>Yeah, but it's okay. Go ahead. [Laughter] 00:30:00.298 --> 00:30:03.093 >>Okay, I'll be brief because this is like the thing 00:30:03.093 --> 00:30:04.970 that we do at The News Literacy Project. 00:30:04.970 --> 00:30:08.640 We work with NAMLE and a bunch of other organizations, 00:30:08.640 --> 00:30:11.059 so super-quick context. 00:30:11.059 --> 00:30:15.939 So with NLP our whole mission is to help people discern fact 00:30:15.939 --> 00:30:19.484 from fiction in news and other forms of content, 00:30:19.484 --> 00:30:22.946 and we've been doing that since 2008, which I think is important. 00:30:22.946 --> 00:30:25.824 And to Michelle's point, 00:30:25.824 --> 00:30:27.993 we specifically focus on news literacy. 00:30:27.993 --> 00:30:30.829 Media literacy and fake news 00:30:30.829 --> 00:30:32.164 and all these kinds of buzzwords 00:30:32.164 --> 00:30:35.250 have been popularized in recent years, 00:30:35.250 --> 00:30:37.753 but we've been doing this work for a very long time 00:30:37.753 --> 00:30:40.213 and really believe that we have a framework 00:30:40.213 --> 00:30:44.176 that has proved the efficacy of news literacy education. 00:30:44.176 --> 00:30:47.721 I mean, just today the state of Delaware 00:30:47.721 --> 00:30:49.639 passed a media literacy bill 00:30:49.639 --> 00:30:51.641 that mandates media literacy education, 00:30:51.641 --> 00:30:56.021 and those are the kinds of wins that we want to see. 00:30:56.021 --> 00:30:58.315 The same is true of Texas, 00:30:58.315 --> 00:30:59.941 and the same is true for Illinois. 00:30:59.941 --> 00:31:01.193 There are a couple of states 00:31:01.193 --> 00:31:03.820 that are really understanding this issue, 00:31:03.820 --> 00:31:06.490 and right now what my team focuses on 00:31:06.490 --> 00:31:10.243 is really creating what we call news literacy champions 00:31:10.243 --> 00:31:15.373 who are in districts, in cities, and in states across the country 00:31:15.373 --> 00:31:17.417 where maybe you are the only school librarian 00:31:17.417 --> 00:31:18.668 in your entire district, 00:31:18.668 --> 00:31:21.505 but you understand how important these issues are. 00:31:21.505 --> 00:31:23.882 So we want our resources to come alongside you, 00:31:23.882 --> 00:31:25.175 we want to be in news deserts, 00:31:25.175 --> 00:31:27.302 and we want to be in places where we know 00:31:27.302 --> 00:31:30.555 that the communities need this information the most. 00:31:30.555 --> 00:31:33.225 In 2020 in fact we expanded our offering, 00:31:33.225 --> 00:31:35.644 so it's not just for educators, 00:31:35.644 --> 00:31:38.480 but for the general public as well because we recognized 00:31:38.480 --> 00:31:42.484 that there was a critical need for news literacy education 00:31:42.484 --> 00:31:44.903 because in order to be civically engaged 00:31:44.903 --> 00:31:48.448 and in order to be an equal participant in democracy 00:31:48.448 --> 00:31:50.242 we know that news literacy 00:31:50.242 --> 00:31:52.369 is a cornerstone of that in our society.
00:31:52.369 --> 00:31:56.039 There's literally no other way to be able to participate 00:31:56.039 --> 00:31:57.791 and to be engaged equally 00:31:57.791 --> 00:32:00.168 if you don't understand news literacy 00:32:00.168 --> 00:32:02.462 and if you don't understand how the system works. 00:32:02.462 --> 00:32:05.173 It's just not going to happen. 00:32:05.173 --> 00:32:08.343 We believe, as a lot of folks have alluded to, 00:32:08.343 --> 00:32:11.596 that education is one of the solutions to that 00:32:11.596 --> 00:32:13.515 because educators are on the frontlines 00:32:13.515 --> 00:32:15.517 in the fight against misinformation, 00:32:15.517 --> 00:32:18.019 and it's also a great way to empower students 00:32:18.019 --> 00:32:20.272 to really take responsibility 00:32:20.272 --> 00:32:22.482 and to understand the digital ecosystem 00:32:22.482 --> 00:32:23.275 that they've inherited. 00:32:23.275 --> 00:32:25.610 It's incredibly complicated, 00:32:25.610 --> 00:32:28.572 but they need to know what it means to cause harm online 00:32:28.572 --> 00:32:32.325 so they can be good stewards of their social media, 00:32:32.325 --> 00:32:36.204 their phones, or their computers and what have you, 00:32:36.204 --> 00:32:37.914 and we have resources and tools 00:32:37.914 --> 00:32:41.501 that have been developed over time on our website for that. 00:32:41.501 --> 00:32:44.004 We're promoting them this entire week, 00:32:44.004 --> 00:32:45.630 and I'm really proud of that. 00:32:45.630 --> 00:32:48.216 I'm proud of how that's evolved and changed 00:32:48.216 --> 00:32:49.968 and gotten better over time. 00:32:49.968 --> 00:32:51.178 And to Jennie's point, 00:32:51.178 --> 00:32:54.222 it really focuses on some key tenets of news literacy 00:32:54.222 --> 00:32:56.683 and not just like very specific things 00:32:56.683 --> 00:32:58.727 that are happening right now. 00:32:58.727 --> 00:33:00.645 Conspiracy theories that we're seeing in the news media 00:33:00.645 --> 00:33:03.190 right now, those things won't be as relevant later, 00:33:03.190 --> 00:33:06.318 so it's our goal to not just be responsive to the moment, 00:33:06.318 --> 00:33:09.112 but to be very forward-thinking and intentional 00:33:09.112 --> 00:33:11.740 about making sure that educators have the support that they need 00:33:11.740 --> 00:33:14.409 and that the general public has the support and resources 00:33:14.409 --> 00:33:18.663 that they need to understand this incredibly critical moment 00:33:18.663 --> 00:33:20.999 that we're living in right now 00:33:20.999 --> 00:33:23.627 and have the tools to be news literate 00:33:23.627 --> 00:33:25.837 and have the tools to participate in 00:33:25.837 --> 00:33:29.966 our democracy and to be engaged in the things 00:33:29.966 --> 00:33:33.470 that are happening around their communities every day. 00:33:33.470 --> 00:33:35.597 And again, I just wanted to share that 00:33:35.597 --> 00:33:37.265 and to both highlight those resources, 00:33:37.265 --> 00:33:40.101 but also to talk about how we have seen 00:33:40.101 --> 00:33:41.478 how news literacy education 00:33:41.478 --> 00:33:43.939 is not just becoming an important thing right now, 00:33:43.939 --> 00:33:46.233 but how states and districts are recognizing 00:33:46.233 --> 00:33:47.984 that and putting funding behind that. 00:33:47.984 --> 00:33:49.653 We could use a lot more of that happening, 00:33:49.653 --> 00:33:52.322 and that's something that we're pushing for and working for.
00:33:52.322 --> 00:33:55.825 Also we've seen the efficacy of news literacy education 00:33:55.825 --> 00:33:57.869 as more students and more parents 00:33:57.869 --> 00:34:01.331 and more people are really rallying behind this issue, 00:34:01.331 --> 00:34:02.749 and I think we're going to continue 00:34:02.749 --> 00:34:04.793 to see more of that in the future. 00:34:06.670 --> 00:34:08.672 >>Thank you so much, Ebonee, for sharing. 00:34:08.672 --> 00:34:12.425 Shoutout to News Literacy Project and NAMLE, 00:34:12.425 --> 00:34:14.719 and shoutout to Media Literacy Now. 00:34:14.719 --> 00:34:19.266 I see that Michelle dropped Media Literacy Now in the chat. 00:34:19.266 --> 00:34:20.976 They just came out with a new report 00:34:20.976 --> 00:34:22.560 when we were talking about what's happening 00:34:22.560 --> 00:34:24.646 in the states with policy. 00:34:24.646 --> 00:34:27.941 They're focused on trying to move policy 00:34:27.941 --> 00:34:29.526 in these states to get media literacy education 00:34:29.526 --> 00:34:34.155 in schools such as what's happening in Washington State. 00:34:34.155 --> 00:34:36.449 They're investing in media literacy 00:34:36.449 --> 00:34:38.952 and putting funding in media literacy. 00:34:38.952 --> 00:34:41.329 There's so much that has been happening, 00:34:41.329 --> 00:34:46.251 so just kudos to all of those on the ground with bootstraps 00:34:46.251 --> 00:34:49.004 tied who are all in the game and all in the fight to [Laughter] 00:34:49.004 --> 00:34:52.465 get media literacy valued in the states. 00:34:52.465 --> 00:34:54.426 I also want to shout out this. 00:34:54.426 --> 00:34:55.719 I see Jenn in the chat who said, 00:34:55.719 --> 00:34:57.304 "Look at this amazing panel of women," 00:34:57.304 --> 00:34:59.097 so I'm just shouting you all out too, 00:34:59.097 --> 00:35:00.849 the amazing women on this panel. 00:35:00.849 --> 00:35:03.310 All right, so we learned about what's going on 00:35:03.310 --> 00:35:05.228 with media literacy in the U.S., 00:35:05.228 --> 00:35:08.815 but I know that that doesn't really paint a picture 00:35:08.815 --> 00:35:10.984 of what's happening on a global level. 00:35:10.984 --> 00:35:12.819 There's a lot of things that are happening 00:35:12.819 --> 00:35:16.072 beyond the U.S. in this field, 00:35:16.072 --> 00:35:18.575 and, Jennie, I know that you've done a lot of work 00:35:18.575 --> 00:35:22.329 on an international level and looking at where the strides 00:35:22.329 --> 00:35:24.205 are that have been happening in media literacy. 00:35:24.205 --> 00:35:25.790 I know Michelle provided context 00:35:25.790 --> 00:35:27.667 to kind of what's happening in the U.S., 00:35:27.667 --> 00:35:29.878 but I'm hoping you can share with us 00:35:29.878 --> 00:35:32.672 kind of what does media literacy look like on a global level, 00:35:32.672 --> 00:35:34.090 and how does it compare to some of the things 00:35:34.090 --> 00:35:35.592 that we just discussed? 00:35:35.592 --> 00:35:38.678 >>Sure. I mean, I think that provision is extremely scarce 00:35:38.678 --> 00:35:40.555 in large regions of the world, 00:35:40.555 --> 00:35:42.807 so if you look at perhaps Africa 00:35:42.807 --> 00:35:45.477 or across MENA and South Asia 00:35:45.477 --> 00:35:48.146 this is still an extremely nascent conversation, 00:35:48.146 --> 00:35:51.191 and certainly the idea of having a formalized mechanism 00:35:51.191 --> 00:35:56.112 through the education system is miles away in most contexts.
00:35:56.112 --> 00:35:58.907 The reality is also that we have an extremely poor understanding 00:35:58.907 --> 00:36:00.617 of how any of these phenomena play out 00:36:00.617 --> 00:36:04.037 in a non-English language context across social media, 00:36:04.037 --> 00:36:06.331 and most of the tech platforms themselves 00:36:06.331 --> 00:36:10.001 do not have internal teams or capacity to understand, 00:36:10.001 --> 00:36:14.339 for example, the real nature and proliferation of disinformation 00:36:14.339 --> 00:36:19.010 and hate speech in Arabic or Urdu or Swahili, 00:36:19.010 --> 00:36:23.139 so there's both a supply side issue 00:36:23.139 --> 00:36:25.183 and then a lack of infrastructure 00:36:25.183 --> 00:36:27.811 to deliver some of this education on the ground. 00:36:27.811 --> 00:36:31.147 In Europe I would say it's now a fairly well-established 00:36:31.147 --> 00:36:33.650 and well-developed conversation. 00:36:33.650 --> 00:36:35.652 The approach will vary slightly depending 00:36:35.652 --> 00:36:38.571 on how centralized an education system is. 00:36:38.571 --> 00:36:41.032 You know, I do not envy America in that respect 00:36:41.032 --> 00:36:44.244 because you do not have that sense of 00:36:44.244 --> 00:36:46.496 a national curriculum. 00:36:46.496 --> 00:36:49.582 Even in the U.K., where there is a huge amount of autonomy 00:36:49.582 --> 00:36:51.042 provided to schools in the way 00:36:51.042 --> 00:36:53.002 that they structure teaching and learning, 00:36:53.002 --> 00:36:54.421 and in comparison to other systems 00:36:54.421 --> 00:36:55.964 it's quite decentralized, 00:36:55.964 --> 00:37:00.385 there are still statutory mandates and a lot of guidance. 00:37:00.385 --> 00:37:02.345 In theory there's that sort of uniformity 00:37:02.345 --> 00:37:06.766 between the core principles of what schools are teaching. 00:37:06.766 --> 00:37:10.603 Last year the Department for Digital, Culture, Media and Sport, 00:37:10.603 --> 00:37:12.856 which has sort of claimed responsibility 00:37:12.856 --> 00:37:16.234 for the national media literacy strategy in the U.K., published 00:37:16.234 --> 00:37:19.320 that strategy, which had been a long time in the making, 00:37:19.320 --> 00:37:22.824 and that does have obligations on the education system 00:37:22.824 --> 00:37:24.200 to be delivering this. 00:37:24.200 --> 00:37:26.703 The question, which I know we might come to later 00:37:26.703 --> 00:37:28.037 in the session, is, 00:37:28.037 --> 00:37:30.665 how do you fit that into the working day? 00:37:30.665 --> 00:37:33.710 There have been different approaches taken across Europe 00:37:33.710 --> 00:37:36.546 based on really the history of the education system. 00:37:36.546 --> 00:37:39.841 You know, some countries have a very well-developed precedent 00:37:39.841 --> 00:37:44.137 of civics education or some form of general studies education, 00:37:44.137 --> 00:37:46.723 and there is allocated time in the school day 00:37:46.723 --> 00:37:49.225 where I guess this is kind of slotted in. 00:37:49.225 --> 00:37:51.519 In the U.K. it's primarily been delivered 00:37:51.519 --> 00:37:54.355 by what we call personal, social, and health education, 00:37:54.355 --> 00:37:56.024 or PSHE, 00:37:56.024 --> 00:38:01.404 and I think there is a growing sort of clamor from people 00:38:01.404 --> 00:38:04.824 working in this space to not view it as a new subject 00:38:04.824 --> 00:38:06.868 that needs to be taught independently.
00:38:06.868 --> 00:38:10.330 So it's not like you go from a history class 00:38:10.330 --> 00:38:14.125 into a digital citizenship class. 00:38:14.125 --> 00:38:17.045 Treating it as a standalone subject means that you're probably only going to engage 00:38:17.045 --> 00:38:19.047 a very small proportion of your staff 00:38:19.047 --> 00:38:21.174 in understanding any of these issues, 00:38:21.174 --> 00:38:24.427 or it becomes the sole responsibility of one educator, 00:38:24.427 --> 00:38:29.182 or it silos it or sort of compartmentalizes it 00:38:29.182 --> 00:38:32.810 from the broader learning that children are doing at school. 00:38:32.810 --> 00:38:34.729 In some countries there's more discussion now 00:38:34.729 --> 00:38:37.023 about how to build some of the competencies 00:38:37.023 --> 00:38:39.317 that the other panelists have been talking about today 00:38:39.317 --> 00:38:44.239 into science classes or English classes or history classes. 00:38:44.239 --> 00:38:46.449 You can't teach the history of World War II 00:38:46.449 --> 00:38:49.494 without teaching about disinformation and propaganda. 00:38:49.494 --> 00:38:51.996 You might have not used that vocabulary previously, 00:38:51.996 --> 00:38:54.624 but equally you can sort of use that as a lens to talk 00:38:54.624 --> 00:38:58.253 about echo chambers or "us versus them" dynamics. 00:38:58.253 --> 00:39:01.965 There are plenty of opportunities to talk about conspiracy 00:39:01.965 --> 00:39:05.635 theories or disinformation in terms of vaccines 00:39:05.635 --> 00:39:08.763 and looking at the history of vaccines or MMR. 00:39:08.763 --> 00:39:10.139 You know, it's about 00:39:10.139 --> 00:39:12.809 trying to find points of entry, 00:39:12.809 --> 00:39:16.563 I think, at the moment to have as many educators as possible 00:39:16.563 --> 00:39:19.566 have the confidence to at least touch 00:39:19.566 --> 00:39:22.110 on some of these topics. 00:39:22.110 --> 00:39:24.737 The final thing I'd say is that it's very patchy 00:39:24.737 --> 00:39:25.989 across countries 00:39:25.989 --> 00:39:29.617 and across the EU if we're going to take that as a collective, 00:39:29.617 --> 00:39:33.204 but there are probably slightly more formalized 00:39:33.204 --> 00:39:36.124 cross-border institutions 00:39:36.124 --> 00:39:37.959 now than there ever have been before. 00:39:37.959 --> 00:39:39.335 For example, a few years ago 00:39:39.335 --> 00:39:42.797 the European Digital Media Observatory was founded, 00:39:42.797 --> 00:39:44.674 and one of its core mandates 00:39:44.674 --> 00:39:49.137 is to promote robust media literacy education, 00:39:49.137 --> 00:39:53.141 which includes putting quite a lot of EU investment 00:39:53.141 --> 00:39:55.810 into monitoring and evaluation exercises. 00:39:55.810 --> 00:39:58.062 So it's creating, again, frameworks 00:39:58.062 --> 00:40:00.899 or sort of easily transferable tools 00:40:00.899 --> 00:40:03.901 which allow you to talk about the meaningfulness 00:40:03.901 --> 00:40:06.321 and the impact of those. 00:40:06.321 --> 00:40:10.116 So it's not that we are miles ahead of the U.S. by any means. 00:40:10.116 --> 00:40:11.993 You can go into pockets of the U.K. 00:40:11.993 --> 00:40:13.828 where no one has ever heard 00:40:13.828 --> 00:40:16.122 a single word related to media literacy, 00:40:16.122 --> 00:40:18.166 and then you can go into some schools 00:40:18.166 --> 00:40:20.543 where it's bread and butter now 00:40:20.543 --> 00:40:23.212 and has been integrated into general studies for years.
00:40:23.212 --> 00:40:27.216 And that could apply in relation to entire countries 00:40:27.216 --> 00:40:28.468 across the region, 00:40:28.468 --> 00:40:33.222 but really the main thrust of discussion now is all about M&E, monitoring and evaluation. 00:40:33.222 --> 00:40:37.477 That's what everyone is interested in. 00:40:37.477 --> 00:40:39.395 We want digital systems education, 00:40:39.395 --> 00:40:42.815 and we want media literacy, but how do we measure it? 00:40:45.443 --> 00:40:47.987 It's not just whether it's leading to educational outcomes, 00:40:47.987 --> 00:40:49.864 but whether it's leading to behavioral change. 00:40:51.658 --> 00:40:55.161 >>Wow. So you pretty much kind of set me up 00:40:55.161 --> 00:40:57.747 for the next question I wanted to ask. 00:40:57.747 --> 00:41:00.166 I know I see [Phonetic] in the chat 00:41:00.166 --> 00:41:03.127 kind of threw out there the challenges 00:41:03.127 --> 00:41:06.089 with media literacy 00:41:06.089 --> 00:41:09.592 from the library standpoint in schools. 00:41:09.592 --> 00:41:13.137 I worked in the library world for over ten years, 00:41:13.137 --> 00:41:16.140 and I know one of the conversations 00:41:16.140 --> 00:41:17.975 that has come to the forefront 00:41:17.975 --> 00:41:20.978 even now with doing media literacy 00:41:20.978 --> 00:41:24.315 is the perspective of it being political. 00:41:24.315 --> 00:41:28.945 So we see there are some limitations and barriers 00:41:28.945 --> 00:41:32.990 that are kind of preventing media literacy 00:41:32.990 --> 00:41:38.037 from being presented and taught. 00:41:38.037 --> 00:41:39.706 This is an open question for the panel, 00:41:39.706 --> 00:41:43.000 but I know I want to get us to Part Three 00:41:43.000 --> 00:41:45.002 where we can really go more into the tools 00:41:45.002 --> 00:41:47.463 that are out there and where we can share more resources. 00:41:47.463 --> 00:41:48.798 But I'm just going to open it up 00:41:48.798 --> 00:41:51.467 for maybe one or two people to touch on. 00:41:51.467 --> 00:41:54.220 What do you see are the limitations right now, 00:41:54.220 --> 00:41:56.097 specific limitations, 00:41:56.097 --> 00:42:00.268 with teaching media literacy education in schools? 00:42:00.268 --> 00:42:01.644 Anyone? 00:42:01.644 --> 00:42:03.479 >>If I'm answering it correctly, Jimmeka, 00:42:03.479 --> 00:42:07.900 and I'm hoping that I understood the question, 00:42:07.900 --> 00:42:13.489 evidence actually shows that teachers want to bring this 00:42:13.489 --> 00:42:16.743 into the classroom, but they don't have the time 00:42:16.743 --> 00:42:18.369 or they don't have the resources. 00:42:18.369 --> 00:42:20.997 They might not have the administrative support. 00:42:23.499 --> 00:42:26.919 So I think what we do notice across the research 00:42:26.919 --> 00:42:30.465 when talking to teachers is that there is that desire, 00:42:30.465 --> 00:42:33.801 but it's just the how, right?
00:42:33.801 --> 00:42:37.388 I think in the experience right now 00:42:39.515 --> 00:42:42.101 we've always seen a little bit of liberal pushback 00:42:42.101 --> 00:42:43.186 against media literacy, 00:42:43.186 --> 00:42:45.605 but I think that as a community 00:42:45.605 --> 00:42:48.316 we need to continue to push back on that 00:42:48.316 --> 00:42:54.071 because information is something 00:42:54.071 --> 00:42:56.032 that we're all trying to understand, 00:42:58.951 --> 00:43:02.455 and we're trying to bring in as many opportunities as possible 00:43:02.455 --> 00:43:06.250 to recognize that we're all working to understand these new systems. 00:43:06.250 --> 00:43:07.210 I think that's really important, 00:43:07.210 --> 00:43:09.253 but I think it's also important to note 00:43:09.253 --> 00:43:11.631 what we have seen, and the research shows 00:43:11.631 --> 00:43:15.676 that there's an intense interest in this work and this training, 00:43:15.676 --> 00:43:18.429 but there aren't always the resources, the time, 00:43:18.429 --> 00:43:20.473 or the administrative support. 00:43:20.473 --> 00:43:22.850 I don't know if others have heard similar things 00:43:22.850 --> 00:43:24.977 in their studies. 00:43:26.229 --> 00:43:29.607 >>Yes, the time and the support. 00:43:29.607 --> 00:43:34.070 Dr. Kristen, you wanted to answer that question? 00:43:37.448 --> 00:43:39.700 >>[Laughter] I clicked it twice. 00:43:39.700 --> 00:43:43.204 Yeah, I think another big barrier, 00:43:43.204 --> 00:43:46.833 coming from the high school side 00:43:46.833 --> 00:43:49.418 and having these conversations with teachers, is that, 00:43:49.418 --> 00:43:50.586 yes, this is important, 00:43:50.586 --> 00:43:52.213 and, yes, I think we should prioritize this, 00:43:52.213 --> 00:43:55.508 but we're still in such a very old model where kids go 00:43:55.508 --> 00:43:58.261 from Math to Science to Social Studies to English. 00:43:58.261 --> 00:44:00.012 It's all very compartmentalized, 00:44:00.012 --> 00:44:01.806 and there's still so much pressure 00:44:01.806 --> 00:44:03.641 to get a high SAT score, 00:44:03.641 --> 00:44:07.478 to get a high ACT score, or to get into college. 00:44:07.478 --> 00:44:11.315 Unfortunately those national standardized tests 00:44:11.315 --> 00:44:15.111 are not assessing things like news, media, and information literacy 00:44:15.111 --> 00:44:16.696 and digital citizenship, 00:44:16.696 --> 00:44:19.031 so until we have a national collective conversation 00:44:19.031 --> 00:44:21.659 about what we really want our high school graduates 00:44:21.659 --> 00:44:23.744 leaving school with 00:44:23.744 --> 00:44:27.540 and how we're going to assess them on those skills, 00:44:27.540 --> 00:44:30.585 I'm not sure that we're going to see this huge change 00:44:30.585 --> 00:44:31.752 that we're looking for 00:44:31.752 --> 00:44:35.715 because there is still so much pressure to clear 00:44:35.715 --> 00:44:40.636 all of these other academic bars 00:44:40.636 --> 00:44:44.140 that have been set by folks outside of the classroom. 00:44:44.140 --> 00:44:45.433 So that's one big barrier. 00:44:45.433 --> 00:44:48.352 I think another big barrier is just 00:44:48.352 --> 00:44:51.439 a lack of quality professional development 00:44:51.439 --> 00:44:54.525 and updates to the teacher preparation program.
00:44:54.525 --> 00:44:55.276 When I think about 00:44:55.276 --> 00:44:59.488 how stale many teacher preparation programs still are 00:44:59.488 --> 00:45:03.242 and how many of our teachers are not going into schools 00:45:03.242 --> 00:45:05.328 with the media and information literacy 00:45:05.328 --> 00:45:08.331 or digital citizenship and technology integration skills, 00:45:08.331 --> 00:45:09.999 they're having to pick those up later 00:45:09.999 --> 00:45:12.793 as they're also just trying to figure out how to be teachers. 00:45:12.793 --> 00:45:14.295 That's really, really difficult, 00:45:14.295 --> 00:45:16.797 so updates to that preservice teaching program 00:45:16.797 --> 00:45:20.009 are going to be vital. 00:45:20.009 --> 00:45:22.136 >>Thank you so much. 00:45:22.136 --> 00:45:25.723 I know I want to transition us over to Part Three 00:45:25.723 --> 00:45:28.559 to look at the solutions, as time is flying by, 00:45:28.559 --> 00:45:30.144 and to look at these resources. 00:45:30.144 --> 00:45:33.439 If there are limitations that you all are experiencing, 00:45:33.439 --> 00:45:36.025 if you're educators, drop them in the chat. 00:45:36.025 --> 00:45:39.820 Let's continue the conversation 00:45:39.820 --> 00:45:41.280 because I know there are other barriers 00:45:41.280 --> 00:45:43.324 that we probably didn't touch on. 00:45:43.324 --> 00:45:44.867 I would love to see some of the challenges 00:45:44.867 --> 00:45:47.870 that you all are experiencing with doing media literacy work. 00:45:47.870 --> 00:45:50.831 So let's talk about what's out there to help us. 00:45:50.831 --> 00:45:52.166 [Laughter] 00:45:52.166 --> 00:45:53.751 What's out there to help that teacher 00:45:53.751 --> 00:45:56.003 that doesn't have that much time, 00:45:56.003 --> 00:45:58.965 that could make it efficient to do this? 00:45:58.965 --> 00:46:02.134 What organizations are out there that they need to connect 00:46:02.134 --> 00:46:03.886 with and partner with to do this work? 00:46:03.886 --> 00:46:07.431 Let's have that conversation. 00:46:07.431 --> 00:46:09.475 I'm going to first touch 00:46:09.475 --> 00:46:15.815 on this concept of digital civility. 00:46:15.815 --> 00:46:19.902 What I want to talk about is, one, let's define what this is 00:46:19.902 --> 00:46:22.738 because this was a new term to me not too long ago. 00:46:22.738 --> 00:46:24.115 There might be people out there 00:46:24.115 --> 00:46:25.324 that don't know what that term means, 00:46:25.324 --> 00:46:27.159 so what is digital civility, 00:46:27.159 --> 00:46:31.414 and how does peer led education around civility and citizenship 00:46:31.414 --> 00:46:33.874 work effectively in the metaverse? 00:46:33.874 --> 00:46:35.668 Some people might need to know what the metaverse is. 00:46:35.668 --> 00:46:37.294 [Laughter] 00:46:37.294 --> 00:46:38.587 I don't want to throw out buzzwords 00:46:38.587 --> 00:46:39.922 and assume everyone knows, 00:46:39.922 --> 00:46:43.801 but I do want to direct these questions to Tami and Courtney 00:46:43.801 --> 00:46:46.554 because I know you all do a lot of work in this space 00:46:46.554 --> 00:46:49.682 with the youth digital civility campaigns. 00:46:49.682 --> 00:46:52.268 So let's just talk about it. What is digital civility, 00:46:52.268 --> 00:46:55.146 and how effective is peer led education around civility 00:46:55.146 --> 00:47:00.026 and citizenship? How can we scale it? 00:47:00.026 --> 00:47:03.195 How do we measure impact? Let's go, Tami or Courtney.
00:47:03.195 --> 00:47:04.530 [Laughter] 00:47:04.530 --> 00:47:07.658 >>So I'll jump in first in terms of the metaverse. 00:47:07.658 --> 00:47:09.493 Courtney, I'm sorry. I spoke over you. 00:47:09.493 --> 00:47:10.786 [Laughter] 00:47:10.786 --> 00:47:15.416 So at Roblox the metaverse 00:47:15.416 --> 00:47:19.128 has suddenly become this very overused term. 00:47:19.128 --> 00:47:23.466 Quite simply, at Roblox we look at our platform 00:47:23.466 --> 00:47:26.761 as bringing people together from all over the world 00:47:26.761 --> 00:47:28.054 to learn about each other 00:47:28.054 --> 00:47:31.891 and to make connections and create friendships. 00:47:31.891 --> 00:47:36.312 And because our platform is 100 percent user-generated content, 00:47:36.312 --> 00:47:40.232 which means that young developers create experiences 00:47:40.232 --> 00:47:42.902 and then publish them on our platform, 00:47:42.902 --> 00:47:47.782 it puts the power of creation into these young people's hands. 00:47:47.782 --> 00:47:53.996 We have seen firsthand how smart these young people are. 00:47:53.996 --> 00:47:55.581 We're having all of these conversations, 00:47:55.581 --> 00:47:58.709 and earlier we were talking about politicizing 00:47:58.709 --> 00:48:01.128 some of the information that's out there. 00:48:01.128 --> 00:48:06.634 Well, we feel that we've learned so much from our community, 00:48:06.634 --> 00:48:08.427 and the power that these young people 00:48:08.427 --> 00:48:11.680 have to teach us is enormous. 00:48:11.680 --> 00:48:14.308 I think that some of the challenges right now 00:48:14.308 --> 00:48:17.978 with educators are, one, lack of time, 00:48:17.978 --> 00:48:23.067 and also, as Dr. Mattson pointed out, 00:48:23.067 --> 00:48:25.861 a lot of our educators don't feel confident 00:48:25.861 --> 00:48:29.448 with their digital citizenship knowledge. 00:48:29.448 --> 00:48:33.994 So in order to make sure that we're listening to 00:48:33.994 --> 00:48:36.831 and learning from our young people, 00:48:38.833 --> 00:48:40.126 and this is probably something 00:48:40.126 --> 00:48:43.587 that Microsoft would agree with as well, 00:48:43.587 --> 00:48:46.799 we need to make sure that the features that we're developing 00:48:46.799 --> 00:48:51.470 are based upon evidence from our users, 00:48:51.470 --> 00:48:54.932 from the industry itself, and from researchers, 00:48:54.932 --> 00:49:00.437 so that we prioritize features that put our users first. 00:49:00.437 --> 00:49:03.357 In terms of civility and the definition of that, 00:49:03.357 --> 00:49:07.862 it's really putting the knowledge and the tools 00:49:07.862 --> 00:49:09.655 into young people's hands 00:49:09.655 --> 00:49:13.159 so that they can really thrive and be able to create 00:49:13.159 --> 00:49:15.828 positive experiences for themselves 00:49:15.828 --> 00:49:19.415 and be in the driver's seat versus a victim. 00:49:19.415 --> 00:49:22.042 You know, I think that we are so focused 00:49:22.042 --> 00:49:26.338 and so passionate about educating young people, 00:49:26.338 --> 00:49:29.175 parents, and policymakers 00:49:29.175 --> 00:49:32.803 as to how to use the online tools, 00:49:32.803 --> 00:49:36.015 advocating externally as well as taking that research 00:49:36.015 --> 00:49:39.351 and advocating internally within Roblox 00:49:39.351 --> 00:49:42.688 to prioritize and inform our roadmap. 00:49:45.399 --> 00:49:46.609 >>Thank you, Tami. 00:49:46.609 --> 00:49:48.277 I'm going to let Courtney jump in.
00:49:48.277 --> 00:49:51.071 >>Well said, Tami. 00:49:51.071 --> 00:49:53.115 I actually like drawing a distinction here, 00:49:53.115 --> 00:49:53.824 if this makes sense. 00:49:53.824 --> 00:49:55.743 My job and responsibility 00:49:55.743 --> 00:49:58.454 as Microsoft's Chief Digital Safety Officer 00:49:58.454 --> 00:50:00.831 is handling some of the darkest parts of the internet. 00:50:00.831 --> 00:50:03.792 How do we prevent radicalization, 00:50:03.792 --> 00:50:05.127 terrorism and violent extremism, 00:50:05.127 --> 00:50:08.255 child sexual exploitation, and cyber harassment? 00:50:08.255 --> 00:50:10.007 When we talk about digital civility, 00:50:10.007 --> 00:50:13.594 we're really trying to call on the better angels 00:50:13.594 --> 00:50:17.223 for how to have a conversation to promote safer, healthier, 00:50:17.223 --> 00:50:21.143 and more respectful online interactions among all people. 00:50:21.143 --> 00:50:23.312 And really to further that conversation, 00:50:23.312 --> 00:50:25.648 Microsoft has invested in annual research, 00:50:25.648 --> 00:50:29.985 now six years running, where we globally survey 00:50:29.985 --> 00:50:33.614 teens and adults about their online experiences 00:50:33.614 --> 00:50:36.909 and their exposure to 21 different online risks 00:50:36.909 --> 00:50:39.119 across the spectrum from behavioral 00:50:39.119 --> 00:50:43.290 to sexual to reputational to personal and intrusive. 00:50:43.290 --> 00:50:46.710 Now, I can't release our data 00:50:46.710 --> 00:50:48.545 that will be coming on Safer Internet Day 00:50:48.545 --> 00:50:52.591 on February 8, 2022, so please stay tuned, 00:50:52.591 --> 00:50:55.052 but looking back we've had roughly 00:50:55.052 --> 00:50:58.305 16,000 respondents in 32 markets, 00:50:58.305 --> 00:51:01.642 and what we do is we provide a digital civility index score 00:51:01.642 --> 00:51:03.310 for each one of those countries 00:51:03.310 --> 00:51:07.147 where it's really a relative exposure to risk. 00:51:07.147 --> 00:51:10.693 Last year we found with the year-over-year trend by age group 00:51:10.693 --> 00:51:12.736 that teens were actually responsible for 00:51:12.736 --> 00:51:17.032 the most noticeable improvement in online civility overall, 00:51:17.032 --> 00:51:20.244 and to be perfectly blunt, it fell among adults. 00:51:20.244 --> 00:51:23.497 Maybe it's somewhat because of that Covid experience. 00:51:23.497 --> 00:51:26.417 And as Tami indicated, we also want to make sure 00:51:26.417 --> 00:51:30.587 that we really are more deeply understanding the experience 00:51:30.587 --> 00:51:33.007 that these young people are having online. 00:51:33.007 --> 00:51:36.302 We have had several occasions to bring together what we call 00:51:36.302 --> 00:51:39.805 our Youth Engagement Council to have a deeper conversation 00:51:39.805 --> 00:51:42.641 so that they understand how those on my team 00:51:42.641 --> 00:51:44.768 think about the trust and safety issues, 00:51:44.768 --> 00:51:49.815 or how a privacy team does, from product design to product enhancement, 00:51:49.815 --> 00:51:52.109 and where their voice could be heard.
00:51:52.109 --> 00:51:53.777 We asked them to engage deeply 00:51:53.777 --> 00:51:56.196 and write their own personal manifesto 00:51:56.196 --> 00:51:58.657 as to how they want their online environment to be 00:51:58.657 --> 00:52:00.451 and how they can help promote it, 00:52:00.451 --> 00:52:01.827 and then we asked them to take 00:52:01.827 --> 00:52:03.704 the Digital Civility Challenge online, 00:52:03.704 --> 00:52:07.791 which, to be perfectly blunt, is a lot like the golden rule: 00:52:07.791 --> 00:52:10.419 that they'll act with empathy, compassion, and kindness, 00:52:10.419 --> 00:52:12.463 that they respect differences, 00:52:12.463 --> 00:52:14.131 and then there's those critical steps 00:52:14.131 --> 00:52:15.966 when you think about the online environment. 00:52:15.966 --> 00:52:18.093 Pause before replying. 00:52:18.093 --> 00:52:20.429 I liked Dr. Mattson's comment to use that thing 00:52:20.429 --> 00:52:23.640 between your two ears, and to stand up for myself 00:52:23.640 --> 00:52:26.310 and others in the online environment. 00:52:26.310 --> 00:52:28.270 Oftentimes they're more ready to do that 00:52:28.270 --> 00:52:30.522 on the playground, and they need to understand 00:52:30.522 --> 00:52:34.109 that in the anonymous space as well. 00:52:34.109 --> 00:52:37.488 I will just take a pause here to share my principle 00:52:37.488 --> 00:52:39.323 for the metaverse, which is that we need to acknowledge 00:52:39.323 --> 00:52:41.075 that there will be greater risks, 00:52:41.075 --> 00:52:43.535 but with the principles we're setting for the internet 00:52:43.535 --> 00:52:46.497 and the lessons we've learned about what has gone wrong 00:52:46.497 --> 00:52:48.540 when there's too much control in certain spaces, 00:52:48.540 --> 00:52:51.627 those mistakes should not be duplicated in a new opportunity 00:52:51.627 --> 00:52:53.295 to have a decentralized metaverse. 00:52:53.295 --> 00:52:58.133 That gives better control to all of our global citizens 00:52:58.133 --> 00:53:00.636 over their own online environment. 00:53:00.636 --> 00:53:02.262 Hope that helps. 00:53:02.262 --> 00:53:05.724 >>That did help. Wow, so much information. 00:53:05.724 --> 00:53:09.269 So many gems have been dropped today, 00:53:09.269 --> 00:53:11.605 and so many resources have been dropped in the chat. 00:53:11.605 --> 00:53:14.400 If you haven't seen some of those resources 00:53:14.400 --> 00:53:15.818 [Laughter], the News Literacy Project 00:53:15.818 --> 00:53:17.653 has shared their resources, 00:53:17.653 --> 00:53:21.824 their Checkology resources for educators. 00:53:21.824 --> 00:53:25.828 I'm also seeing that they do amazing PDs. 00:53:25.828 --> 00:53:28.622 NAMLE has dropped their list of partners 00:53:28.622 --> 00:53:31.583 if you're looking for people or organizations to partner with. 00:53:31.583 --> 00:53:35.129 PD is Professional Development, Jordan. 00:53:35.129 --> 00:53:37.631 So, yes, Professional Development, 00:53:37.631 --> 00:53:40.217 if you need them to come in and help train some teachers 00:53:40.217 --> 00:53:42.636 on how to teach media literacy, 00:53:42.636 --> 00:53:46.348 or news literacy because their focus is on news literacy, 00:53:46.348 --> 00:53:49.852 yes, please, look at the chat. There are so many resources. 00:53:49.852 --> 00:53:51.728 Now we only have five minutes, 00:53:51.728 --> 00:53:54.273 and I want to make sure that we answer your questions.
00:53:54.273 --> 00:53:57.109 So let's go and check out some of the questions 00:53:57.109 --> 00:53:59.736 that you all have. 00:53:59.736 --> 00:54:04.158 We have one question that says it's for Jennie specifically. 00:54:04.158 --> 00:54:05.617 It says, "Wasn't media literacy 00:54:05.617 --> 00:54:08.662 a bit of a political football in England 00:54:08.662 --> 00:54:11.290 when conservatives rejected efforts 00:54:11.290 --> 00:54:13.959 to advance media studies?" 00:54:13.959 --> 00:54:15.961 Jennie, are you familiar with this topic? 00:54:15.961 --> 00:54:17.379 This is directed towards you. 00:54:17.379 --> 00:54:19.882 How do we address the inevitably political nature 00:54:19.882 --> 00:54:21.675 of media literacy education? 00:54:21.675 --> 00:54:23.218 >>Well, that particular controversy 00:54:23.218 --> 00:54:25.804 was a little bit before my time, 00:54:25.804 --> 00:54:28.390 but when it comes to the current landscape 00:54:28.390 --> 00:54:32.436 I would say it feels less politicized than it once did. 00:54:32.436 --> 00:54:34.688 The government has released 00:54:34.688 --> 00:54:36.648 a whole national media literacy strategy 00:54:36.648 --> 00:54:38.859 which applies partially to schools. 00:54:38.859 --> 00:54:43.780 And because they won't design a curriculum which is mandatory, 00:54:43.780 --> 00:54:46.325 what they will tend to do is sort of pass the buck over 00:54:46.325 --> 00:54:48.410 to schools and say, "You deal with this. 00:54:48.410 --> 00:54:50.704 You create resources and forms of teaching 00:54:50.704 --> 00:54:52.706 that make the most sense for your school community." 00:54:52.706 --> 00:54:55.751 So in that way it can avoid some of the major pitfalls 00:54:55.751 --> 00:55:00.797 of having to have a standardized curriculum, for example. 00:55:00.797 --> 00:55:03.258 There's more politicization 00:55:03.258 --> 00:55:07.596 really in where some of the funding for this comes from, 00:55:07.596 --> 00:55:10.307 and it is the case across much of Europe 00:55:13.143 --> 00:55:16.480 that the tech companies themselves have put the majority 00:55:16.480 --> 00:55:18.524 of the investment into these programs, 00:55:18.524 --> 00:55:19.775 and it's not that 00:55:19.775 --> 00:55:21.109 they're not acting in good faith, 00:55:21.109 --> 00:55:23.445 but very often those programs are being funded 00:55:23.445 --> 00:55:26.532 out of marketing departments or PR departments 00:55:26.532 --> 00:55:28.158 rather than educational departments, 00:55:28.158 --> 00:55:31.078 and what that's created is a complete dependency 00:55:31.078 --> 00:55:33.956 for the third sector on tech companies 00:55:33.956 --> 00:55:36.083 who will obviously have red lines on what they do 00:55:36.083 --> 00:55:39.461 and don't want to broach within the content of curricula. 00:55:39.461 --> 00:55:42.422 So if there's going to be politicization anywhere, 00:55:42.422 --> 00:55:45.217 I think it's in how you balance some of those things 00:55:45.217 --> 00:55:46.802 and sort of how you deliver an education 00:55:46.802 --> 00:55:48.095 that is candid about 00:55:48.095 --> 00:55:51.265 some of the architectural issues of the internet 00:55:51.265 --> 00:55:56.270 but is not accusatory against one platform over the other, 00:55:56.270 --> 00:55:57.604 and I wouldn't say that that was 00:55:57.604 --> 00:55:59.815 a government controversy particularly, 00:55:59.815 --> 00:56:06.113 but it is sort of something that the sector is contending with.
00:56:06.113 --> 00:56:07.906 >>Thank you so much, Jennie. 00:56:07.906 --> 00:56:12.744 We do have some requests to drop your emails into the chat. 00:56:12.744 --> 00:56:16.331 Some of these audience members want to get in contact 00:56:16.331 --> 00:56:18.208 with you experts and leaders, 00:56:18.208 --> 00:56:22.504 so please drop your email in the chat. 00:56:22.504 --> 00:56:24.715 Right now I don't see any more questions, 00:56:24.715 --> 00:56:27.968 and I know we are coming to an end, 00:56:27.968 --> 00:56:30.679 but I do want to make sure that I thank the panel 00:56:30.679 --> 00:56:33.348 for their time, for their wisdom, 00:56:33.348 --> 00:56:35.976 for sharing such insightful information, 00:56:35.976 --> 00:56:38.854 and for sharing all of the resources 00:56:38.854 --> 00:56:43.275 that their organizations and their companies are working on 00:56:43.275 --> 00:56:46.653 to help combat this issue with misinformation 00:56:46.653 --> 00:56:50.449 and online polarization and just creating a space for us 00:56:50.449 --> 00:56:53.160 to better educate our kids on these topics. 00:56:53.160 --> 00:56:55.996 So thank you all for doing the work 00:56:55.996 --> 00:56:58.206 that you do and for being here today. 00:56:58.206 --> 00:56:59.291 >>Thank you, Jimmeka. 00:56:59.291 --> 00:57:01.793 -You're awesome. Thank you. ->>Yes. 00:57:01.793 --> 00:57:03.045 >>Thank you, Dr. Anderson. 00:57:03.045 --> 00:57:04.296 >>I'm going to turn it over. [Laughter] 00:57:04.296 --> 00:57:07.466 I'm going to turn it back over to Kristi. 00:57:07.466 --> 00:57:09.968 >>Thank you, Jimmeka. Thank you to that whole panel 00:57:09.968 --> 00:57:12.846 so much for that wisdom and knowledge shared. 00:57:12.846 --> 00:57:15.724 Thank you all. I know we had a lot going on in the chat, 00:57:15.724 --> 00:57:17.059 so thank you all to the panelists 00:57:17.059 --> 00:57:20.187 that were answering questions as we went through as well. 00:57:20.187 --> 00:57:21.605 Just a reminder to everyone attending 00:57:21.605 --> 00:57:24.483 that we'll have all those resources available to you 00:57:24.483 --> 00:57:26.818 in a document post-event, 00:57:26.818 --> 00:57:28.445 so you don't have to sift through all of that 00:57:28.445 --> 00:57:29.863 because there was a lot of information. 00:57:29.863 --> 00:57:32.282 So thank you all. 00:57:32.282 --> 00:57:36.328 I really want to emphasize how important hearing 00:57:36.328 --> 00:57:37.954 from all sectors of society 00:57:37.954 --> 00:57:41.792 was today, from industry and from non-profits, 00:57:41.792 --> 00:57:43.168 and being able to host you all to 00:57:43.168 --> 00:57:45.796 have that conversation. I think it's really important, 00:57:45.796 --> 00:57:47.297 so thank you again for all of that. 00:57:47.297 --> 00:57:48.507 What an amazing conversation 00:57:48.507 --> 00:57:51.134 we had about media literacy education, 00:57:51.134 --> 00:57:54.513 supporting youth engagement in digital civility programs, 00:57:54.513 --> 00:57:59.017 and learning tools and skills as a path to empowerment and agency. 00:57:59.017 --> 00:58:01.853 So just thank you all again for joining us.