The Bongino Report

Matt Palumbo: How We Know PolitiFact Is Biased

This article is an excerpt from Matt Palumbo’s ‘Fact-Checking the Fact-Checkers: How the Left Hijacked and Weaponized the Fact-Checking Industry.’

The existence of bias at PolitiFact has been studied and analyzed many times over the years—and all with the same result.

That so-called fact-checkers are just biased left-wing gatekeepers is obvious to anyone who has followed their track record, and PolitiFact in particular has done a poor job of concealing its partisanship. In one case, the site assigned different ratings to nearly identical statements based on the ideology of the person making them. When Ron Paul (a libertarian who was a registered Republican at the time) and Jim Webb (a Democrat) each said that the U.S. had no income tax prior to 1913, they received different rulings: Paul’s statement was determined to be only “Half True,” while Webb’s was “Mostly True.”

In one amusing case of reality slapping them in the face, PolitiFact initially rated Obama’s campaign promise that “if you like your health-care plan, you can keep your health-care plan” as “True.” PolitiFact would later have to reevaluate that claim, which they determined to be the “lie of the year” in 2013.

Both the article rating Obama’s “if you like your plan, you can keep your plan” claim as true and the one rating it the lie of the year years later were written by the same person, PolitiFact’s editor-in-chief Angie Drobnic Holan. She made no mention of her own earlier positive evaluation while penning her 2013 “lie of the year” article, in which she admitted that “Obama’s promise was impossible to keep.”

But she didn’t realize that years prior?

The RealClearPolitics Fact Check Review examined 434 PolitiFact articles and found that 15 percent of its “fact-checks” were really “opinion-checks.”

A University of Minnesota School of Public Affairs survey of all five hundred statements that PolitiFact rated from January 2010 through January 2011 found that of the ninety-eight statements rated false, seventy-four came from Republicans.

The Center for Media and Public Affairs at George Mason University found in a 2013 study that in the first four months of Obama’s second term, PolitiFact flagged Republicans as being dishonest at three times the rate of Democrats.

The Media Research Center’s Tim Graham noted of the report: “Even while the Obama scandals piled up—from Benghazi to the IRS to the DOJ phone-records scandals—Republicans are still being flagged as worse than Democrats, with 60 percent of the website’s selective claims rated as false so far [in May 2013] compared to 29 percent of their Democratic statements—a 2 to 1 margin.”

“As for the entire four months, PolitiFact rated 32 percent of Republican claims as ‘false’ or ‘pants on fire,’ compared to 11 percent of Democratic claims—a 3 to 1 margin. Conversely, PolitiFact rated 22 percent of Democratic claims as ‘entirely true’ compared to 11 percent of Republican claims—a 2 to 1 margin.”

“A majority of Democrat statements (54 percent) were rated as mostly or entirely true, compared to only 18 percent of Republican statements. By contrast, a majority of Republican statements (52 percent) were rated as mostly or entirely false, compared to just 24 percent of Democrat arguments.”

Meanwhile, statements by Republicans were rated entirely false (“False” or “Pants on Fire”) twice as often as Democrats (29 percent vs. 15 percent). At the time Graham reported on this study, the “Pants on Fire” page on PolitiFact’s website displayed eighteen false claims by Republicans and two by Democrats.

The Federalist’s Matt Shapiro ran the numbers in December 2016 and further confirmed that there really is a double standard when it comes to how PolitiFact evaluates claims.

After evaluating thousands of PolitiFact articles and assigning their ratings a score ranging from 0 (True) to 5 (Pants on Fire), he found that Democrats had an average rating of 1.8, which is between “Mostly True” and “Half True.” The average Republican rating was 2.6, which is between “Half True” and “Mostly False.”

“Mostly False” was the most common rating given to Republicans other than Donald Trump, and PolitiFact’s ratings made Hillary Clinton out to be the second-most honest politician (lol).

When it came to the worst possible ruling, “Pants on Fire,” Donald Trump accounted for half of those ratings. But even with Trump excluded, PolitiFact has a penchant for hitting Republicans with that rating, which signals an outrageous, bald-faced lie, while when Democrats make false claims, PolitiFact manages to find some semblance of truth in them so that they only have to be rated “Mostly False” or “Half True.” During the 2012 election season, PolitiFact assigned Mitt Romney nineteen “Pants on Fire” ratings, while ALL Democrats combined received twenty-five “Pants on Fire” ratings from 2007 to 2016.

Are we to believe that Romney was more of a liar during the 2012 campaign season than all Democrats over nearly a decade? That’s about as believable as the claim that Hillary Clinton was the second-most honest politician in America.

Speaking of Hillary, a Media Research Center analysis during the 2016 election found that PolitiFact gave its “Pants on Fire” rating to Trump fifty-seven times, compared to only seven times to Hillary Clinton, the woman who played a role in creating a conspiracy theory that Russia was behind a secret plot to elect her opponent. Trump’s statements received “False,” “Mostly False,” or “Pants on Fire” ratings 77 percent of the time, compared to only 26 percent for Hillary.

As the race entered its endgame, from September to Election Day, Republicans received the “Pants on Fire” rating twenty-eight times, half of which went to Trump, while Democrats received it only four times, one of which went to Hillary.

The inconsistencies in ratings are everywhere, and while they aren’t always as explicit as giving nearly identical claims different ratings, they aren’t hard to find.

The obvious bias has only continued into the Biden era.

During Joe Biden’s first one hundred days in office, PolitiFact evaluated only thirteen of his claims while fact-checking 106 claims made about him by others, largely in his defense.

Of the thirteen Biden statements, eight were some sort of falsehood, but not one earned him a “Pants on Fire” rating. Of the 106 fact-checks of claims about Biden, ninety-one were rated “Mostly False” or worse, including twenty-four rated “Pants on Fire.”

There wasn’t exactly a shortage of false claims they could have evaluated; by that point, even the Washington Post’s fact-checker, Glenn Kessler, had concluded that Biden told seventy-eight lies that earned a “Four Pinocchios” rating (though, as mentioned previously, Kessler announced he would not continue his database of Biden falsehoods beyond the first one hundred days in office).

This was just the beginning of PolitiFact treating the Biden presidency with kid gloves.

According to Graham, the same pattern held over Biden’s first year. Biden was fact-checked forty times by PolitiFact in his first year, while his critics were checked 230 times. Excluding the aforementioned first one hundred days, Biden was checked twenty-eight times, compared to 124 fact-checks of his critics. While nearly half of the Biden fact-checks were “Mostly False” or worse, 201 of the 230 claims from his critics (87.4 percent) were rated the same. Only three statements from Biden’s critics were rated true. PolitiFact also didn’t assign a single “Pants on Fire” rating to any of Biden’s comments during his first year; he has received only six such ratings since the site launched in 2007. As Graham puts it, “Biden can say the evacuation from Afghanistan was an ‘extraordinary success,’ and the Republicans are pushing ‘Jim Crow on steroids,’ and there will be no fact checks.”

By contrast, many of the “Pants on Fire” ratings for Biden’s critics went to claims laughably pointless to fact-check: that “Biden is computer generated,” photoshopped pictures of Biden sniffing people (though there are plenty of real ones), and a report that Biden was handed a “vial of blood” by a child on a walk back to the White House. (Similarly, over at Snopes, they’ve gone so far as to get to the bottom of whether or not Biden really pooped his pants on a trip to Rome.)

Similar to the Ron Paul and Jim Webb example, PolitiFact’s bias leads to inconsistent standards in fact-checking. In one case, PolitiFact’s writers couldn’t even agree with one another on what it means to “cut” government spending.

PolitiFact rated Republican claims during the 2012 election that Obama had cut Medicare by $700 billion as false because it wasn’t an outright $700 billion cut but a $700 billion reduction in future spending.

Meanwhile, in a fact-check defending Britain’s National Health Service (NHS) against criticism from Donald Trump, PolitiFact reported that “while the NHS has lost funding over the years, the march that took place was not in opposition to the service, but a call to increase funding and stop austerity cuts towards health and social care.” They were referring to Trump mistaking a pro-NHS protest for an anti-NHS protest, but the relevant point here is that the NHS had not seen its budget cut; the “lost funding” was a reduction in future spending.

It’s a common trick for politicians to frame reductions in planned spending increases as cuts, and PolitiFact correctly identified the ploy in the former example yet accepted it in the latter, the only difference being the party of those being fact-checked.

