
Analysis | Russia’s fake ‘fact-checking’ Ukraine videos and other news literacy lessons



The material in this post comes from the Sift, the News Literacy Project’s newsletter for educators, which has more than 23,000 subscribers. Published weekly during the school year, it explores timely examples of misinformation, addresses media and press freedom topics, discusses social media trends and issues, and includes discussion prompts and activities for the classroom. Get Smart About News, modeled on the Sift, is a free weekly newsletter for the public.

The News Literacy Project’s browser-based e-learning platform, Checkology, helps educators teach middle and high school students how to identify credible information, seek out reliable sources, and know what to trust, what to dismiss and what to debunk.

It also gives them an appreciation of the importance of the First Amendment and a free press. Checkology, and all of NLP’s resources and programs, are free. Since 2016, more than 37,000 educators in all 50 states, the District of Columbia and more than 120 other countries have registered to use the platform. Since August 2020, more than 3,000 educators and more than 125,000 students have used Checkology.

Here’s material from the March 14 edition of the Sift:

1. Russia is pushing fake “fact-checking” videos of fabricated examples of misinformation as a tactic to spread disinformation and uncertainty about its invasion of Ukraine, according to an investigation by Clemson University’s Media Forensics Hub and ProPublica. These fake debunk videos are designed “to inject a sense of doubt among Russian-language audiences as they encounter real images” of the war, including those showing Russian losses. The investigation found that some of the videos contain metadata proving that both the alleged “fake” and the authentic footage were created together — not independently, as would be the case with actual pieces of viral misinformation.

  • Discuss: Why would Russia push fake misinformation and attribute it to Ukraine? How does widespread confusion about what information can be trusted serve Russian interests in Ukraine?

2. About 90 percent of Americans think social media makes it easier to spread misinformation, harassment and extreme viewpoints, but they are divided over how — if at all — to address harmful and false content online, according to a new report by the Gallup polling firm and the Knight Foundation. Americans’ complex attitudes about Internet regulation do not always neatly follow partisan divisions, but many share deep concerns about technology, with 62 percent of U.S. adults saying elected officials “pay ‘too little’ attention” to tech issues, the report found.

  • Discuss: Did any of the report’s findings surprise you? How do you think social media has affected social and civic life? List some positive and negative effects. How should social media companies decide what content to prohibit? How often should these policies be reevaluated and updated? How much do you trust information on social media? What steps can you take to verify information before sharing it?
  • Idea: Have students take this quiz to see how their views compare with those of survey respondents on who — if anyone — should regulate harmful and false content online.

NO: It’s not true that megachurches in the United States have failed to offer support to Ukrainians during the Russian invasion.

NewsLit takeaway: People and partisan groups often find ways to use major news events such as the war in Ukraine to score political points. Megachurches and celebrity pastors — including Joel Osteen of Lakewood Church in Houston, which is pictured in this meme — have previously been targets of misinformation, particularly in the aftermath of disasters. This post also is a good example of how a “sheer assertion” — or a claim made without evidence — can be shared widely when it connects with people’s existing beliefs and biases.

  • TikTok may be known for short, funny videos, but clips focused on the conflict in Ukraine are flooding the app, presenting new and urgent challenges to content moderators, who are struggling to keep up. (Learn more about what makes TikTok vulnerable to misinformation in this new report from the Media Manipulation Casebook.)
  • False and misleading information is easy and cheap to produce compared with high-quality information, argues Richard L. Hasen in this March 7 New York Times op-ed. But we can’t regulate our way out of this problem, argues Jay Caspian Kang, in a separate March 7 opinion piece in the Times. Instead, we need to build “an educated and resilient public that can spot and then ignore” misinformation.
  • Only 21 percent of the top editors across 240 major news organizations are women — far below the 40 percent of journalists who are women in these markets, according to a recent report by the Reuters Institute for the Study of Journalism.
  • Nearly two years after George Floyd’s murder sparked a “racial reckoning” in American newsrooms, some journalists are reflecting on news organizations’ efforts to promote diversity, equity and inclusion. Their experiences highlight that “meaningful, sustainable, and lasting change — especially when it comes to institutional racism and discrimination — takes time,” writes Nieman Lab’s Hanaa’ Tameez.
