In 2021 and 2022, during the disastrous Covid lockdowns, the U.S. government gave professors at the University of Wisconsin-Madison upwards of $5.7 million to develop software to help correct other people’s “misinformation” when it appears online. The fact-checking engine they have built, called “Course Correct,” is supposed to help journalists identify trending scientific and political “misinformation” and “fix” false claims in real time. It uses machine learning and natural language processing to check social media posts.
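The researchers have not disclosed how Course Correct actually works, so any specifics are guesswork. As a purely hypothetical illustration of the general approach such tools take, a crude version might score each post against a watch list of "flagged claims" and mark close matches for review; every name, phrase, and threshold below is invented for illustration only:

```python
# Purely illustrative toy sketch of a claim-matching "misinformation" flagger.
# Nothing here reflects Course Correct's actual, undisclosed methods; real
# systems use trained NLP models, not keyword overlap like this.
from collections import Counter

# Hypothetical watch list of claims to be "corrected".
FLAGGED_CLAIMS = [
    "the election was stolen",
    "the vaccine does not work",
]

def tokenize(text: str) -> Counter:
    """Lowercase bag-of-words token counts."""
    return Counter(text.lower().split())

def overlap_score(post: str, claim: str) -> float:
    """Fraction of the claim's tokens that also appear in the post."""
    post_tokens, claim_tokens = tokenize(post), tokenize(claim)
    shared = sum(min(post_tokens[t], c) for t, c in claim_tokens.items())
    return shared / sum(claim_tokens.values())

def flag_post(post: str, threshold: float = 0.6) -> bool:
    """Flag a post whose wording closely matches any watch-list claim."""
    return any(overlap_score(post, c) >= threshold for c in FLAGGED_CLAIMS)
```

The sketch makes the article's underlying worry concrete: whoever writes the watch list, not the algorithm, decides what counts as "misinformation."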
“Challenges of misinformation are not restricted to elections or Covid or to a particular community,” said one of the professors who received the grant. “Countering misinformation will require vigilance and adaptation.” What it ought to require is strict oversight by people committed to the First Amendment. Instead, the National Science Foundation is funding the research.
Presumably this search engine is being developed for the day when Facebook or Twitter can no longer be relied upon to censor views at odds with the preferred establishment narrative. It is a daring move for the feds: the revelations about how the FBI and CIA worked with social media and tech companies to censor online views that did not comport with the administration’s chosen explanations have not gone over well.
It must be asked: what is misinformation? Is it a genuine, factual mistake? 2+2=5? Or is it just a difference in opinion over what the facts are? To ask is to answer. In the wake of the aggressive censorship that accompanied the developing knowledge of Covid-19 and the attendant disagreements over masks, vaccines, lockdowns, social distancing, and school closings, we know that the government was rabid in suppressing anything that challenged its preferred narrative of the moment. People were kicked off social media and worse for suggesting that the Covid virus originated in a lab at the Wuhan Institute of Virology. The New York Times reporter Apoorva Mandavilli called that view “racist.” These days, the lab-leak theory is the preferred explanation of the FBI and many fellow agencies.
So how legitimate is it to use tax dollars to develop “Course Correct,” which, if we are being honest, exists in order to more efficiently suppress information, whether it is factually incorrect or merely a politically incorrect point of view? After all, one wonders why it is such a big deal if someone gets a fact wrong, even a fact pertaining to science, in an online discussion. People have been wrong about things forever; the truth emerges from contention, not reflexive agreement. On the other hand, it is clear why the government would be politically challenged by seeing people assert as factual things that challenge its preferred path of action regarding a pandemic, or climate change, or the way inflation works. Even more so about elections, which are another potential field of application for Course Correct.
“Democracy and public health in the United States rely on trust in institutions,” the grant announcement states.
Skepticism regarding the integrity of U.S. elections and hesitancy related to Covid-19 vaccines are two consequences of a decline in confidence in basic political processes and core medical institutions. Social media serve as a major source of delegitimizing information about elections and vaccines, with networks of users actively sowing doubts about election integrity and vaccine efficacy, fueling the spread of misinformation.
Apparently, the Biden administration regards it as a crisis that many citizens question the integrity of some U.S. elections and are skeptical of Covid-19 vaccines. Indeed, half the country believes there was cheating in 2020. And a great many people have come to mistrust the mRNA vaccines, which have had deeply problematic side effects and have provided relatively little protection to those who got or were forced to take them. But the Biden administration chalks it all up to misinformation, instead of legitimate disagreement. “Both of these crises are fueled by online misinformation,” according to the grant document.
Sometimes, of course, skepticism regarding government actions and diminished trust in government institutions have been earned by government behavior. Such skepticism is valuable. It should prompt the government to do better. It should help other parties get elected. The views, opinions, and counter-narratives, many of which may well be true, should not be wiped away because an administration or bureaucracy is unhappy that the public does not trust its information. In our democracy, we have always lived with conflicting narratives about the nature of, reasons for, and appropriate response to whatever is happening. Course Correct sounds a bit too Orwellian for comfort.
Nor is it clear how Course Correct determines the truth about a question or situation. When queried by a conservative reporter at “The College Fix,” the researchers at Wisconsin did not provide an answer. Yet we are to accept that the program transcends bias?
The saddest fact about Course Correct is that the $5.7 million, and the idea itself, are but drops in the bucket of new U.S. government technology and personnel put in place to monitor and control the speech and thought of the American people. The State Department’s Global Engagement Center was designated by President Obama to run the U.S. counter-disinformation campaign. That has cost billions. It is a leviathan with tendrils reaching into all corners of the internet. And it has thoroughly overridden whatever protections remained under the Privacy Act of 1974, which was meant to prohibit the government from spying on U.S. citizens.
Sadly, the age of such protections is long gone. And Course Correct is the least of it.
Every leap in cyber technology is accompanied by its own display of inbred biases, and the biases always point in the same direction: leftward, wokeward. ChatGPT is the latest obvious example. There is no longer any room for “benefit of the doubt.” Conservatives should take it for granted that all these wonderful new innovations will harbor leftward biases, and should make their usage decisions accordingly.