Facebook Releases ‘Content Distribution Guidelines,’ Will Target ‘Untrusted’ News
On Wednesday, Facebook published its “Content Distribution Guidelines” (CDGs), a document outlining the types of content that are actively demoted by the social media giant in its “News Feed.”
While Facebook’s “Community Standards” — the rules used to justify the removal of content from the platform — have long been public, the company released its CDGs as part of a supposed effort to improve “transparency” around its ranking decisions.
Nick Clegg, a former British politician and now Vice President of Global Affairs for Facebook, released a blog post in March titled, “You and the Algorithm: It Takes Two to Tango,” which hinted at the release of these CDGs. (Before accepting a seven-figure salary from Facebook, Clegg wrote in 2016 that he was “not especially bedazzled by Facebook,” and that he found “the messianic Californian new-worldy-touchy-feely culture of Facebook a little grating.”)
“Other measures coming this year include providing more transparency about how the distribution of problematic content is reduced; making it easier to understand what content is popular in News Feed; launching more surveys to better understand how people feel about the interactions they have on Facebook and transparently adjusting our ranking algorithms based on the results; publishing more of the signals and predictions that guide the News Feed ranking process; and connecting people with authoritative information in more areas where there is a clear societal benefit, like the climate science and racial justice hubs,” Clegg wrote earlier this year.
“Our Content Distribution Guidelines outline some of the types of content that receive reduced distribution in News Feed. As these guidelines develop, we will continue to provide transparency about how we define and treat problematic or low quality content,” Facebook announced on Wednesday. “Our enforcements to reduce problematic content in News Feed are rooted in our commitment to the values of Responding to People’s Direct Feedback, Incentivizing Publishers to Invest in High-Quality Content, and Fostering a Safer Community.”
These guidelines — which govern the demotion of content — are separated into three categories: “Responding to People’s Direct Feedback,” “Incentivizing Publishers to Invest in High-Quality Content,” and “Fostering a Safer Community.”
The first category, “Responding to People’s Direct Feedback,” appears to be based on user research. It includes the demotion of “clickbait links,” which “lure people into clicking on an included link by creating misleading expectations about the post or article’s content”; “engagement bait,” which explicitly seeks “votes, shares, comments, tags, likes, or other reactions” for bad-faith reasons; and a range of “low quality” posts. The latter covers low-quality “browsing experiences,” “comments,” “events,” and “videos,” with the definition of quality being arguably fluid — if not entirely subjective — since it rests on user “feedback.”
The second category, “Incentivizing Publishers to Invest in High-Quality Content,” claims to encourage publishers to produce “interesting, new material.” This category includes efforts to demote “domains with limited original content,” articles “debunked” as “False, Altered or Partly False” by “non-partisan, third party fact-checking organizations,” and posts from “untrusted” news publishers or publishers without “transparent authorship.”
Perhaps the most controversial item in this category is the demotion of “Links to Domains and Pages with High ‘Click-Gap,’” under which posts linking to websites that receive a “particularly disproportionate amount of their traffic directly from Facebook compared to the amount of traffic the websites receive from the rest of the Internet” are demoted.
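Facebook has not published the formula behind this signal, but its description suggests a simple ratio: the share of a site’s referral traffic that arrives via Facebook rather than from the rest of the web. The sketch below is a purely hypothetical illustration of how such a “click-gap” style metric could be computed; the function names and the 0.9 threshold are assumptions for demonstration, not Facebook’s actual implementation.

```python
# Hypothetical sketch of a "click-gap" style signal. Facebook has not
# published its formula, so the names and threshold here are
# illustrative assumptions only.

def click_gap_score(facebook_referrals: int, external_referrals: int) -> float:
    """Return the share of a domain's traffic that arrives via Facebook.

    A value near 1.0 means nearly all visits come from Facebook -- the
    kind of "disproportionate" pattern the guidelines describe.
    """
    total = facebook_referrals + external_referrals
    if total == 0:
        return 0.0
    return facebook_referrals / total


def should_demote(facebook_referrals: int, external_referrals: int,
                  threshold: float = 0.9) -> bool:
    """Flag a domain whose Facebook-driven traffic share exceeds a threshold."""
    return click_gap_score(facebook_referrals, external_referrals) > threshold


# Example: a site with 95,000 Facebook referrals and 5,000 from elsewhere
# would be flagged, since 95% of its traffic comes via Facebook.
print(should_demote(95_000, 5_000))  # True
```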
Finally, the third category, “Fostering a Safer Community,” addresses content that “may be problematic for our community, whether or not it’s intended that way.” This includes so-called “borderline” content, content from Groups and Pages associated with “Violence-Inducing Conspiracy Networks” such as “QAnon,” and content that might violate Facebook’s Community Standards or other policies.
Andy Stone, Facebook’s “Policy Communications Director” and former Press Secretary for then-Senator Barbara Boxer (D-CA), tweeted, “The Content Distribution Guidelines outline what content receives reduced distribution in News Feed because it’s problematic or low quality — things like misinfo, clickbait, and ad farms.”
“Our Community Standards describe what we remove because we don’t allow it on the platform. The Content Distribution Guidelines focus on what we reduce via ranking,” he added.
Stone sparked controversy during the 2020 election cycle when he announced that the now-infamous New York Post article involving Hunter Biden and alleged links between Joe Biden and Ukraine was “eligible to be fact checked,” and would be demoted on Facebook in the meantime. While multiple news outlets disputed the story’s authenticity at the time, it was never debunked by a mainstream fact-checking organization.
While I will intentionally not link to the New York Post, I want be clear that this story is eligible to be fact checked by Facebook’s third-party fact checking partners. In the meantime, we are reducing its distribution on our platform.
— Andy Stone (@andymstone) October 14, 2020
" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
Now loading...