
Soon Facebook Will Show You What Russian Propaganda You’ve Been Exposed To

By Jake Anderson

Facebook users will soon be able to access a portal showing them if they liked or followed pages created by the Internet Research Agency, Russia’s so-called troll farm that created thousands of fake accounts in order to influence American voters. The move comes after the social network came under protracted scrutiny for its role in facilitating a ‘disinformation’ campaign during the 2016 presidential election.

In a blog post titled “Continuing Transparency on Russian Activity,” published on November 22nd, Facebook stated its intent:

It is important that people understand how foreign actors tried to sow division and mistrust using Facebook before and after the 2016 US election. That’s why as we have discovered information, we have continually come forward to share it publicly and have provided it to congressional investigators. And it’s also why we’re building the tool we are announcing today.

According to estimates, 29 million Americans saw Russian propaganda directly in their newsfeeds, and 126 million were reached once posts shared by friends are counted. When Instagram posts are factored in, the number rises to roughly 150 million.

The portal will show users if they liked or followed Internet Research Agency-linked Facebook pages or Instagram accounts between January 2015 and August 2017. However, they will not be notified if one of these pages’ posts appeared in their newsfeed via a friend.

Earlier this month, Facebook, Alphabet Inc.’s Google, and Twitter Inc. appeared before Congress to testify regarding the extent to which Russia leveraged Internet platforms to influence voters with incendiary posts on a wide range of issues, including gun rights, immigration, religion, and race relations.

Facebook representatives, including spokesman Andy Stone, acknowledged the new portal would not be able to show users the full extent of how many troll ads and propaganda pages were shared during the election.

Jonathon Morgan, chief executive of New Knowledge, a firm that studies the spread of information online, stated:

People are much more affected by content shared by their friends, they are more likely to click on it and spend time reading it and considering its merits when a trusted friend shares it on their Facebook page. People don’t know the extent to which they are influenced by what their trusted social circles post online.

The New York Times article where this statement was originally published concluded with the line, “The lesson? Choose your friends carefully,” underscoring the deeply divisive nature of ‘Russian disinformation’ and corresponding ‘fake news.’

When Facebook said it was “much more challenging” to gauge who paid attention to specific content, members of Congress pushed back, noting that the company’s business model depends on tracking the effects of targeted ads on its platform.

The effort comes after the announcement of a larger initiative by a massive consortium of tech companies and publishers — including Facebook, Google, and the Washington Post — called the Trust Project, which will seek to label online content with “Trust Indicators” and potentially censor it.

Creative Commons / Anti-Media
