“In the new school year, thousands of Oklahoma students will be required to learn about 2020 election fraud conspiracy theories as part of a new curriculum developed by the state’s controversial superintendent, Ryan Walters. Walters, who has come under fire in recent months for an effort to require Oklahoma classrooms to stock Bibles and display the Ten Commandments, has said that the addition “empowers students to investigate and understand the electoral process.”
Under the state’s new curriculum, high school students will be taught to “identify discrepancies in 2020 elections results by looking at graphs and other information, including the sudden halting of ballot-counting in select cities in key battleground states, the security risks of mail-in balloting, sudden batch dumps, an unforeseen record number of voters, and the unprecedented contradiction of ‘bellwether county’ trends.”
While it’s not necessarily unreasonable to want students to learn about the dispute over the 2020 election, the standards’ framing of the controversy (investigations of which turned up no evidence of election fraud) and Walters’ comments about it make clear that teachers are meant to cast doubt on the veracity of the election.”
…
“The standards also contain passages directing teachers to ensure that students can “identify the source of the COVID-19 pandemic from a Chinese lab,” and “explain the effects of the Trump tax cuts, child tax credit, border enforcement efforts.””
…
“The curriculum change is just one of a battery of recent attempts to inject partisan politics into public school curricula. While blue states have faced criticism from the right for injecting critical race theory into the classroom, many red states have engaged in far more galling efforts to politicize classroom instruction.”
“It’s a startling reality about Gen Z, backed up by multiple studies and what we can all see for ourselves: The most online generation is also the worst at discerning fact from fiction on the internet.
That becomes an issue when the internet — and specifically, social media — has become the main source of news for the younger generation. About three in five Gen Zers, those between the ages of 13 and 26, say they get their news from social media at least once a week. TikTok is a particularly popular platform: 45 percent of those between the ages of 18 and 29 said they were regular news consumers on the app.”
…
“although people of all ages are bad at detecting misinformation — which is only getting harder amid the rise of AI — members of Gen Z are particularly vulnerable to being fooled. Why? There’s a dangerous feedback loop at play. Many young people are growing deeply skeptical of institutions and more inclined toward conspiracy theories, which makes them shun mainstream news outlets and immerse themselves in narrow online communities — which then feeds them fabrications based on powerful algorithms and further deepens their distrust. It’s a pattern of media consumption that differs drastically from that of older generations, who spend far more time with mainstream media, and the consequences can be grim.”
…
“Only 16 percent of Gen Zers have strong confidence in the news. It’s no surprise then that so many young people are shunning traditional publications and seeking their news on social media, often from unverified accounts that do little fact-checking.
The ramifications are potentially huge for American politics. Without some sort of course correction, a growing share of the electorate will find itself falling victim to fake news and fringe conspiracy theories online — likely driving the hyperpolarization of our politics to new heights.”
…
“Gen Zers are uniquely vulnerable to misinformation compared with older age groups not just because of their social media habits, says Rakoen Maertens, a behavioral scientist at the University of Oxford, but because they have less lived experience and knowledge with which to discern reality.
Maertens, who helped create a test that measures a person’s likelihood of being duped by fake headlines, says that while Gen Zers are the most likely to fall for fake news now, there is hope that, as time passes, they’ll become better at detecting falsehoods, just like the generations before them.
There’s also another, far more depressing alternative that may be just as likely — that the rest of the population will go the way of Gen Z.”
“Since buying the platform in 2022, Musk has helped turn X into an epicenter of election misinformation. With 203 million followers, Musk has the biggest reach on X and is the platform’s most prominent pusher of anti-immigrant conspiracy theories and right-wing propaganda. At Musk’s request last year, X changed the site’s algorithm to put his posts in more people’s feeds — posts that increasingly urged people not to trust the outcome of the election. The nonprofit Center for Countering Digital Hate estimates that Musk’s misleading posts about the election have been viewed more than 2 billion times this year.”
“Based on an analysis of posting behavior and subsequent suspensions on Twitter, Oxford Internet Institute professor Mohsen Mosleh and four other researchers confirmed that Republicans and conservatives were much more likely to run afoul of moderators than Democrats and progressives were. But they also found that right-leaning social media users were much more likely to share information from “low-quality news sites.” Those findings, the authors say, suggest that “differences in misinformation sharing can lead to politically asymmetric sanctions.”
I know what you’re thinking: Since “misinformation” is a vague, subjective, and highly contested category, it can easily serve as a cover for bias against particular opinions or ideologies. But Mosleh et al. took that possibility into account by judging the quality of news sites based on “trustworthiness ratings” by a nationally representative and “politically balanced” sample of 970 Republicans and Democrats. They also considered how sites ranked when they were rated only by the Republicans.”
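To make the methodology described above more concrete, here is a minimal sketch, in Python, of what a “politically balanced” trustworthiness score might look like. The domain names, ratings, and function names are illustrative assumptions, not the study’s actual data or code; the point is simply that each site’s score averages the Democratic and Republican raters’ means so neither group dominates, with a Republicans-only ranking computed as a robustness check.

```python
# Hypothetical sketch of a "politically balanced" trustworthiness score,
# in the spirit of the approach Mosleh et al. describe. All data and
# names below are made up for illustration.
from statistics import mean

# Made-up ratings (0 = untrustworthy, 1 = trustworthy) of three
# fictional news domains by Democratic and Republican raters.
ratings = {
    "example-broadsheet.com": {"dem": [0.9, 0.8, 0.85], "rep": [0.7, 0.75, 0.8]},
    "example-partisan-left.com": {"dem": [0.8, 0.7], "rep": [0.3, 0.4]},
    "example-partisan-right.com": {"dem": [0.2, 0.3], "rep": [0.6, 0.7]},
}

def balanced_score(site_ratings: dict) -> float:
    """Average the Democratic mean and Republican mean so each group counts equally."""
    return mean([mean(site_ratings["dem"]), mean(site_ratings["rep"])])

def republican_only_score(site_ratings: dict) -> float:
    """Score sites using only Republican raters, as a robustness check."""
    return mean(site_ratings["rep"])

for site, r in ratings.items():
    print(f"{site}: balanced={balanced_score(r):.2f}, rep_only={republican_only_score(r):.2f}")
```

If the sharing asymmetry persists even under the Republicans-only scores, it is harder to attribute the result to raters’ partisan bias against conservative outlets, which is the robustness logic the excerpt describes.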
…
“Mosleh et al. found further evidence that “the tendency to share misinformation” is politically skewed when they analyzed data from seven other sources, including information about “YouGov respondents’ on-platform Facebook sharing in 2016,” “Prolific respondents’ on-platform Twitter sharing in 2018,” and “the on-platform sharing of Twitter users sampled in various ways in 2021.” And again, that association was apparent based on the “politically balanced” trustworthiness assessments as well as “fact-checker ratings.”
These results are consistent with previous research, Mosleh et al. say. They cite studies finding that “links to websites that journalists and fact-checkers deemed to be low-quality ‘fake news’ sites were shared much more by conservatives than liberals on Facebook” during the 2016 election and the 2020 election and on Twitter during the 2016 election and during Donald Trump’s first impeachment.
Other studies have found that “conservatives on Twitter were much more likely to follow elites [who] made claims fact-checkers rated as false compared with Democrats” and that “Republican-oriented images on Facebook were much more likely to be rated as misleading than Democratic-oriented images.” Mosleh et al. also note evidence from surveys that “present participants with politically balanced sets of headlines,” which “typically find that conservatives indicate higher sharing intentions for articles deemed to be false by professional fact-checkers than liberals.”
Such associations can be seen in other countries as well as the United States. “A survey experiment conducted in 16 countries found widespread cross-cultural evidence of conservatives sharing more unambiguously false claims about COVID-19 than liberals,” Mosleh et al. note. “An examination of Twitter data found that conservative political elites shared links to lower-quality news sites than liberal political elites in the USA, Germany and the UK.””
…
““differential treatment of those on one versus the other side of the aisle does not on its own constitute evidence of political bias on the part of social media companies.””
“The law, Assembly Bill 2839, makes it illegal for an individual to knowingly distribute “an advertisement or other election communication, as defined, that contains certain materially deceptive content” within 120 days of an election and up to 60 days after. Affected candidates can file a civil action to enjoin distribution of the media and seek damages from its creator.”
…
“content creator Christopher Kohls filed a lawsuit arguing the law was overbroad, violating his First Amendment right to make parody content. Kohls has a YouTube channel with more than 300,000 subscribers, and his videos often consist of parodies featuring political candidates seemingly mocking themselves.”
…
“Judge John A. Mendez of the United States District Court for the Eastern District of California sided with Kohls, ruling that the law doesn’t pass constitutional muster because it does not use “the least restrictive means available for advancing the State’s interest.”
“Counter speech is a less restrictive alternative to prohibiting videos such as those posted by Plaintiff, no matter how offensive or inappropriate someone may find them,” Mendez’s opinion reads. “AB 2839 is unconstitutional because it lacks the narrow tailoring and least restrictive alternative that a content based law requires under strict scrutiny.”
Mendez’s ruling argues that the law, which is aimed at cracking down on “deepfakes” and other forms of false speech intended to misrepresent an opponent’s views and actions, ends up outlawing a much wider range of speech than those specific statements.
“While Defendants attempt to analogize AB 2839 to a restriction on defamatory statements, the statute itself does not use the word ‘defamation’ and by its own definition, extends beyond the legal standard for defamation to include any false or materially deceptive content that is ‘reasonably likely’ to harm the ‘reputation or electoral prospects of a candidate.'”
While the law did contain a provision exempting parody content that contains a disclosure, the requirement was onerous, mandating that it be “no smaller than the largest font size of other text appearing in the visual media.”
Just one part of the law was found to pass constitutional muster: a requirement that audio-only media include a disclosure at the beginning of the message and every two minutes for the duration of the content.
“While the Court gives substantial weight to the fact that the California Legislature has a ‘compelling interest in protecting free and fair elections,’ this interest must be served by narrowly tailored ends,” Mendez writes. “Supreme Court precedent illuminates that while a well-founded fear of a digitally manipulated media landscape may be justified, this fear does not give legislators unbridled license to bulldoze over the longstanding tradition of critique, parody, and satire protected by the First Amendment.””