
Fostering Students’ Digital Media Literacy, an Interview with Dr. Sarah McGrew, Dr. Patricia Brooks, and Jessica Brodsky

The following interview took place on April 17th, 2020. Maya Rose (Scholarly Communication Fellow) interviewed Dr. Sarah McGrew, Dr. Patricia Brooks, and Jessica Brodsky about their timely research on media literacy. Bios for each contributor are included at the end of the post.

Maya Rose (MR): First off, thank you so much for making the time to speak with me. I know this is an unprecedented and stressful time, so I really appreciate it! Each of you has been involved in research projects to foster students' digital media literacy so that students are better able to evaluate the credibility of information they encounter online. This interview is very timely, given the spread of misinformation about the global COVID-19 pandemic and the transition to working and learning online. Let's first go around and briefly introduce ourselves and tell each other about our backgrounds and research interests.

Sarah McGrew (SM): I'm Sarah McGrew and I'm an assistant professor in the College of Education at the University of Maryland. I started out as a high school history teacher in Washington, DC. I did that for five years and then went to Stanford to work on my PhD. For the last several years, I have studied how young people evaluate information online and how we can better support them––thinking about both building assessments and curriculum to support young people and also about the teacher education efforts that need to accompany any curricular and assessment interventions.

Patricia Brooks (PB): I’m a professor at the College of Staten Island (CSI) and the Graduate Center of CUNY. I’ve been at CUNY for over 20 years where my research has focused on language and literacy development. In recent years, my research has shifted to include digital media literacy. It started with a graduate student project on middle school students’ conceptions of the Internet. We found that students grasped how to use the Internet (its functional affordances), but lacked technical understanding of its structure as a complex network. About three years ago I became involved in the Digital Polarization Initiative (DPI)––an effort of the American Association of State Colleges and Universities [American Democracy Project] to teach college students how to fact-check online information. CSI was one of eleven institutions asked to take part in the DPI and, as I was working on digital literacy, I had an opportunity to take a leadership role in that initiative on campus.

Jessica Brodsky (JB): I’m Jessica Brodsky. I am a doctoral student at the Graduate Center, CUNY. I work closely with Patty on researching media literacy with college students and adolescents. I’ve been working with her to support the instructors using the DPI fact-checking curriculum. I am also doing research on algorithm literacy in college students and hoping to expand that to adolescents as well. 

MR: Sarah, can you talk a bit about your research on civic online reasoning interventions and assessments of students’ online reasoning?

SM: My work as part of the Stanford History Education Group (SHEG) on civic online reasoning curriculum builds on two areas of research that we did before the curricular work. First is the development of assessments. We realized that in thinking about how young people engage with online information, there weren't many good, relatively direct measures. There are a lot of surveys, focus group interviews, and multiple-choice assessments that are more knowledge-based. These don't really give us a sense of what young people actually do when they are interacting with online information. So we started building assessments in both paper-and-pencil (to make them usable in younger grades or for classes that don't have access to computers) and online forms. These assessments provided quick snapshots of how young people approached different parts of evaluations––whether it was identifying sponsored content or researching the sponsor of a cloaked website. We used the assessments to build a picture of the struggles that young people have––middle school, high school, and college students––when they try to evaluate information.

The second area was devoted to developing a portrait of what skilled practice looked like. If we were going to try to teach young people how to evaluate information, what would we want to teach them? Sam Wineburg and I did a study with professional fact checkers, historians, and Stanford freshmen to understand the strategies and heuristics that fact checkers used when searching for and evaluating online information. We tried to understand those differences between the fact checkers on one hand, and the historians and students on the other hand, and distill the fact checkers’ strategies into a set of practices that we felt were teachable. 

Finally, we used both that assessment work and our growing understanding of expertise to build and test educational interventions. Starting pretty small-scale, I did a study in a high school history classroom, and then SHEG did a study in a first-year college course. Results from both of these were promising. Last year, we did a larger experimental study in a big district that showed promising gains; the students in the experimental group who had six online reasoning lessons outperformed their peers who were in a regular government class. The work is ongoing, but it’s promising that teaching these skills explicitly seems to help students improve their evaluation skills. 

MR: Can you briefly tell us about your research on teaching undergrads skills used by expert fact-checkers, the goals of the DPI, and your work with the civics course at CSI?

PB: I can talk about the course and let Jessica talk about the fact-checking lessons. COR100 is a general education civics course that is taken by the majority of first-year students at CSI. Donna Scimeca, the course coordinator, was part of the team asked to participate in the DPI. In the first semester of the project, we implemented the fact-checking lessons that Michael Caulfield [of Washington State University] had prepared for the DPI, with the exception that we collected data on our own platform with a control group so we could evaluate the curriculum’s effectiveness. After the first semester, we worked closely with Donna and the instructors to improve the lessons so that they would connect more directly with specific topics discussed in the civics course, such as immigration policy, the impeachment proceedings, and the economic response to COVID-19. Jessica can talk more about what we did to tweak the fact-checking lessons to meet the course objectives.

JB: We decided to do a couple of things differently after our first semester. We moved all the instruction online, both to ensure fidelity of implementation of the curriculum and to help instructors who had different levels of comfort with teaching the fact-checking skills. All of the instruction now happens through a series of online homework assignments, which are graded pass/fail.

We also worked with the instructors to align each of the assignments to content covered in the class. For example, in Fall 2019, the students were learning about Supreme Court cases that were scheduled to be heard and so each assignment was related to a specific Supreme Court case. In the assignment, students were asked to fact-check information about that case and they were prompted to use “lateral reading” strategies to investigate articles, images, and social media posts. Lateral reading involves opening additional browser windows to investigate sources (i.e. the individuals and organizations making claims), trace information back to the original source, and verify the information using trusted sources, including fact-checking websites like Snopes.com. 

In these assignments, we ask students to first try to assess the credibility of something, and then we show them how we would do it. So they get feedback in terms of seeing how someone would fact-check the information using lateral reading strategies. Then at the very end of the assignment, we ask them to find some new pieces of online information (not just the ones that we’ve given them) and figure out which is most credible and explain why. The instructors are really on board with this approach because it clearly ties to the curriculum and relates what the students are doing for homework to what they’re covering in class. And it makes it really relevant to students’ lives and what’s going on in the world around them right now. 

PB: Yes, I think the online assignments help students in class because they are reading all these varied articles on course-relevant topics. Even if they’re reading the articles at a superficial level, they’re still coming into class knowing more about these Supreme Court cases than they would if they hadn’t done the assignment. One of the topics was the reform of the New York State bail law, which isn’t something that students would normally read about and yet it’s critical to our society. What is fun for us as educators is that the COR100 class is different each semester because the instructors are always tying the curriculum to things that are currently in the news. So during the (Spring 2020) COVID crisis the students were looking at articles on the Paycheck Protection Program and how to reopen the economy. One thing we are trying to understand more is why some students don’t show gains in fact-checking after completing the curriculum. We have a very diverse population here at CSI due to our open admissions policies. One of our hypotheses is that gains in fact-checking information may be tied to students’ reading comprehension skills. 

MR: So that our readers have a better understanding of digital literacy, what are the steps for successful lateral reading? 

SM: Lateral reading is what professional fact checkers in our study did when they landed on an unfamiliar site. They prioritized investigating the source or sponsoring organization of that website, and they did that research by leaving the website and opening up tabs on the lateral, or horizontal, axis of their browser and searching for information about the organization outside of the site itself. They only returned to the original source when they had a sense of its perspective, authority, where it was coming from, and why it might be presenting the information.

PB: And one of the best places to find information about a source is Wikipedia. We have really butted heads with the belief that students shouldn’t use Wikipedia as a source. This belief is entrenched in both the students and the faculty. We’ve been seeing in our data that students report at the end of the semester that they’re using Wikipedia more, but they’re not changing their views about it (i.e. whether it’s trustworthy). 

MR: In shifting the focus a bit, do you think academic researchers are susceptible to making the same kinds of mistakes that students make in terms of evaluating online information? How can we help early career scholars develop fact-checking skills?

SM: Our expert study made me pretty pessimistic about even really talented adults. The historians, who were university professors with PhDs in history, looked in many cases much more like the students. On these tasks that asked them to search for and evaluate information on the open Internet, they were often not any better than students. We might assume that training in critical thinking, years in school, or experience online makes us better at these things––but the performance of the historians calls that into question. On the other hand, we have evidence that if we explicitly teach these strategies, students can develop them. I think that absolutely applies to academics, perhaps especially on issues we care about or have deep beliefs about. One thing we noticed was that historians seemed overconfident that they could reason through online content themselves––that they could stay on a webpage, even one they weren't familiar with, and figure out if it was reliable just by reading carefully or looking at characteristics of the page. We might be able to evaluate information in areas where we have specific expertise, but outside that fairly narrow area, we're not very good at evaluating a website just by judging the content of the information. For example, I can't evaluate scientific studies about COVID in a sophisticated way. I just need to go read what the WHO or CDC say.

MR: It’s hard! It takes a lot of work when it is not your area of expertise!

SM: Right, we need to try to be clear about where our own expertise ends and then defer to people who are more expert than us.

MR: That seems like a skill in and of itself.

JB: Yes, this is the first step. We want students to stop reading [the article or social media post] and start going to other [trusted] sources for information on the topic. I also think teaching about cognitive biases and heuristics can be beneficial in this case, just to raise awareness of the fact that these are at play when we're consuming content. So Introductory Psychology classes that teach about those topics [heuristics and cognitive biases] can have a role in this as well.

PB: Yes, we talk about confirmation bias [in Introductory Psychology] and use the concrete example of how the information we gather from the Internet tends to match our pre-existing biases. I'm on a mission to go to Fox News every day to try to reprogram my computer to think that I'm interested in other perspectives. But I'm also trying to get a sense of what in the information feed is driving other people's behavior, e.g., what made crowds of armed individuals gather at the State House in Michigan during the COVID outbreak to protest the stay-at-home orders? I'm trying to understand where their ideas are coming from. I think that if we can't figure out ways to get the Internet to give us both sides of the story, we're really in trouble. Jessica, did you want to talk a little bit about our work that looks at dissociations between what students think they're doing and what we actually see them doing when we record them fact-checking information?

JB: We had students come into the lab and do some fact-checking tasks, and we screen-recorded them while they were doing the tasks. I think that's somewhat similar to what Sarah did with the students, faculty, and fact checkers. But we also asked students to self-report how frequently they use lateral reading strategies. We found that they say they use these strategies in the range of "sometimes" to "frequently", but the screen recordings show they rarely leave the original article. They might go and look up the image, but they are not doing a reverse image search. So it highlights students' overconfidence in their own skills: they believe they know how to fact-check, while in reality they're not using the skills that are most effective and efficient for reaching a sound assessment of the content. That's been pretty interesting to observe, and it gets back to the idea of how we raise awareness of our own biases to help combat this.

SM: Have you done those before and after the intervention? Do they get closer together after the lesson?

PB: We haven't done that yet. In this particular study, the students came into the lab just one time. We gave them a problem and then we showed how we would solve it. Then we gave them another problem and again showed them how we would solve it. We did this with four problems in total, and even though we kept showing them how we would solve the problem, most of them didn't apply the strategies to the next problem. Maybe it was an issue of not being able to flexibly apply the instruction. You (SM) talk about seeing improvements after six lessons. We also see results after about five lessons [in COR100] where the students are getting exposed to these kinds of problems and getting lots of opportunities to review and practice the skills. But in the one-session study, the intervention wasn't sufficient to get students to apply the skills. Now what we really want to find out is whether the COR100 students who showed gains in their fact-checking skills are still using them a year later. Hopefully we can start calling back students who took the course a year ago to see if they retained the skills.

MR: Why is it important for students and scholars to understand why there is a pressing need for fact-checking information now (e.g., data collection, algorithmic filtering, digitally polarized environments driven by biases and filtering)? 

PB: The DPI was attractive to me because it aimed to answer the question of how we ended up with such a polarized digital landscape. I grew up in Indiana but have lived in New York City for decades. How did it get to the point where the information in my news feed is so different from that in the news feeds of high school classmates who remained in Indiana? That to me is the crux of it. This piece is not part of the traditional media literacy curriculum taught in school and it isn’t in the DPI curriculum, which focuses on developing students’ fact-checking skills. Everyone today needs to understand how the Internet works. I want to see more integration of the DPI curriculum with instruction focusing on Internet algorithms (and filtering and personalization of search results), which may help students to understand why our news feeds differ so much. Otherwise, we’re not really getting at the problem of how we became such a divided country. And it’s not because people are illogical. We all have the same basic reasoning! People are getting different information and that’s a consequence of how the Internet works. 

JB: We've been doing some research with CSI students on their understanding of how their information is being manipulated and filtered using algorithms. It's interesting because they are aware of targeted advertising, since they see ads pop up for whatever they were searching previously. But when it comes to social media feeds and even Google search results, students are not aware of the fact that things are being filtered and that someone in another state, or even the person sitting next to them, will get different results. There is a gap in knowledge that traditional media literacy instruction is not addressing. So building this algorithm literacy to help students understand how we get this polarization is a critical next step.

SM: I agree with all that. I think COVID is drawing more attention to misinformation––it's drawing attention to the terrible effects that misinformation can have. Information is connected to our decisions and actions, and it can lead us in really terrible directions. That argument holds true for social and political information as well as health information. It might be a little less direct, but I think it's really important to continue making the argument that our decisions, and the information we use to make those decisions, matter. Cultivating accuracy-oriented motivations in people is really important so that they have the motivation to stop and take the time to fact-check before they post something to their social media feeds. It's important to keep talking about why it matters and the effects it has.

MR: Have you had informal conversations with scholars about this or with colleagues or students? Are they apprehensive about lateral reading or are they interested in taking that next step?

SM: I think most people are excited. One apprehension I hear from teachers, and I mostly work with secondary teachers, is from people who work in more politically divided areas and are worried about politicization. I have had teachers ask me, “What do I do if I talk about why the New York Times is a reliable source and a student just says, oh that’s fake news?” I think there are real questions there about how teachers deal with political controversy in general in the classroom.

PB: That’s why I love the Supreme Court cases because here we’re just talking about two different positions: let’s see what the energy companies have to say and let’s see what the environmental groups have to say. It’s not about right or wrong, it’s just that there are different ways of looking at things and the world is complicated. I feel like we’ve been able to skirt the issue of what is accurate by focusing on who is writing this story and what is their bigger agenda. People have different interests: profit may be one and preservation of the planet may be a different one. Let’s look at the information and understand that it’s always going to have a bias because we all have perspectives on the issues. I think it makes it easier for the instructors as well because we’re not delving into students’ political views but focusing on the legal debate.

Jessica E. Brodsky is a doctoral student in the Educational Psychology Program at The Graduate Center, CUNY. Her research interests include media literacy assessment and instruction in college students and adolescents. She works closely with instructors of the College of Staten Island’s general education civics course to develop and assess online lessons for teaching fact-checking skills. She is also currently conducting research on algorithm literacy in college students. Jessica is very interested in scholarship of teaching and learning and is involved in the development and evaluation of online resources for teaching scientific and quantitative literacies in Introductory Psychology. She is the current chair of the Society for the Teaching of Psychology (American Psychological Association Division 2) Graduate Student Teaching Association (GSTA).

Sarah McGrew is an Assistant Professor in the College of Education at the University of Maryland, College Park. Her research focuses on young people’s civic online reasoning—how they search for and evaluate online information on contentious social and political topics—and how schools can better support students to learn effective evaluation strategies. As part of the Stanford History Education Group, Dr. McGrew developed assessments of students’ online reasoning, conducted research on fact checkers’ strategies for evaluating digital content, and tested curriculum designed to teach these strategies to secondary and college students. Her current research focuses on how best to support teachers to learn online reasoning themselves and design lessons for their students, and how to teach online reasoning using civic and community issues that students care about. 

Patricia J. Brooks is Professor of Psychology at the College of Staten Island and The Graduate Center, CUNY. Her research program focuses on language and literacy development, digital learning, and evidence-based teaching. Brooks is Head of the Editorial Board of Language Development Research and serves on the Executive Committee of the International Association for the Study of Child Language and on the Board of the Eastern Psychological Association. Previously, she was Faculty Advisor to the Graduate Student Teaching Association (GSTA) of the Society for the Teaching of Psychology (American Psychological Association, Division 2). She received the President’s Dolphin Award for Outstanding Teaching by a Member of the Full-Time Faculty at the College of Staten Island (2016) and Special Recognition from the Society for the Teaching of Psychology (2019).
