If you prefer audio, listen to my conversation with @CaseyNewton: https://t.co/LDdSnQuH2V
This piece is lengthy, but the questions surrounding technology’s power – and the companies behind it – are complex and require thoughtful consideration.
People are only going to feel more comfortable with the algorithms behind News Feed if they have more visibility into how they work and more control over what they see. So we’re rolling out new tools to help people better understand and shape their News Feed.
But the caricatures of social media's algorithms don't tell the full story: You are an active participant in the experience. The personalized “world” of your News Feed is shaped heavily by your choices and actions.
Some of this criticism is fair: companies like Facebook need to be frank about how the relationship between you and their ranking systems really works. And they need to give you more control.
The debate over social media's role in society often centers on concerns that people are being manipulated by algorithms they can’t control. Some say the machines have already won. Read my take here: https://t.co/B1a7UbxEzZ
The board’s recommendations touch on some of the trickiest content moderation issues Facebook faces, where there are often no easy answers. Today’s announcement shows that an external body has the power to effect long-lasting change across our policies, products and operations.
When Facebook created the Oversight Board, we hoped its impact would come both from its decisions on individual cases and from its broader recommendations on how we can improve our policies and practices. These first 17 recommendations show the latter in action.
In addition to implementing the Oversight Board’s binding rulings on content, we are committed to considering its recommendations and communicating transparently about the actions we take.
Today Facebook is responding to the first policy recommendations made by the @OversightBoard. Of the seventeen recommendations, we are committing to change on eleven, still assessing the feasibility of five, and taking no further action on one. https://t.co/g0mb6BNq5c