U.S. conservatives say Facebook needs ‘significant work’ to address concerns: former senator

FILE PHOTO: Small toy figures are seen in front of Facebook logo in this illustration picture, April 8, 2019. REUTERS/Dado Ruvic/Illustration/File Photo

WASHINGTON (Reuters) – A review by a former Republican U.S. senator concludes that political conservatives believe Facebook Inc needs to do “significant work” to satisfy their concerns that the social media website is biased, describing policies and examples that they found problematic without laying out evidence of systemic partisanship.

The report by former Senator Jon Kyl, commissioned by Facebook and released on Tuesday, drew on interviews with about 133 political conservatives, many of whom opposed Facebook policies they believed undermined free speech by conservatives, such as bans on “hate speech.”

It is the latest effort by Facebook to address rising anger among Republicans over alleged conservative bias as some lawmakers call for legislation that would revoke the liability shield big tech companies have for content posted by users.

Those interviewed also pointed to anecdotal examples of what they called unfair treatment of conservative viewpoints, such as the unjustified removal of language from the Bible, which they suggested were symptoms of broader problems with enforcement of Facebook’s policies.

Facebook said in response it has hired staff dedicated to “working with right-of-center organizations and leaders.”

U.S. President Donald Trump and many Republicans in Congress accuse various social media firms of anti-conservative bias, while tech companies and Democrats have rejected the charge.

U.S. Representative David Cicilline, a Democrat who chairs a House panel on antitrust issues, questioned the review, noting the “‘audit’ was conducted by a conservative former Republican Senator who now works as a federal lobbyist.”

Republican Senator Josh Hawley said the report was not a real audit but a “smokescreen disguised as a solution. Facebook should conduct an actual audit by giving a trusted third party access to its algorithm, its key documents, and its content moderation protocols.”

Facebook and other large tech firms have acknowledged mistakes in handling some specific content issues.

Facebook executive Nick Clegg said in a blog post on Tuesday that the company needs “to take these concerns seriously and adjust course if our policies are in fact limiting expression in an unintended way.”

The Kyl report noted Facebook has made changes including more transparent decisions on why people see specific posts, ensuring page managers can see enforcement actions, launching an appeals process and creating a new content oversight board made up of people with diverse ideological views.

Republican senators have held hearings over the last two years with Facebook, Twitter Inc and Alphabet Inc’s Google accusing them of bias. Last month, two Republican senators asked the Federal Trade Commission to probe how major tech companies curate content.

Democrats say the bias allegations are without merit. Democratic Senator Mazie Hirono said in April that “we cannot allow the Republican party to harass tech companies into weakening content moderation policies that already fail to remove hateful, dangerous and misleading content.”

The report noted Facebook’s advertising policies prohibit “shocking and sensational content” and the company has historically rejected images of “medical tubes connected to the human body.”

This resulted in some anti-abortion advertisements being rejected. Facebook has revised its policies to prohibit only ads depicting “someone in visible pain or distress or where blood and bruising is visible.” The change expands the scope of advocacy available to groups seeking to use previously prohibited images.

The report by Kyl – who represented Arizona in the Senate from 1995 to 2013 and again in 2018 – focused on six areas of concern. These included how Facebook chooses content for readers, content rules such as those banning hate speech, potential political bias in content enforcement, ad policies such as the prohibition of “shocking or sensational content,” enforcement of ad policies, and a belief that Facebook’s workforce lacks political diversity.

Reporting by David Shepardson in Washington; Editing by Chizu Nomiyama and Matthew Lewis