In the month Oxford Dictionaries declared post-truth the word of the year, my news feeds were overflowing with pieces about social media, fake news and the American elections. These were my favorites:

1) A New Yorker article featuring interviews with Barack Obama.

Before the elections Obama was voicing concerns about how information moves and spreads on social media, especially in relation to the (at the time still upcoming) American elections.

One matter that the article’s author, David Remnick, reports Obama “talked almost obsessively” about was a group of people in Macedonia publishing a number of pro-Trump websites, often reporting fake news, yet with hundreds of thousands of followers on social media (Facebook). And he doesn’t hide his concern about it.

The new media ecosystem “means everything is true and nothing is true,” Obama told me later. “An explanation of climate change from a Nobel Prize-winning physicist looks exactly the same on your Facebook page as the denial of climate change by somebody on the Koch brothers’ payroll.”

And somehow it all came together after the election, when he finally tapped into a key matter: social media relies on engagement, and engagement is a currency measured in emotions, not factual rigor…

“What I’m suggesting is that the lens through which people understand politics and politicians is extraordinarily powerful. And Trump understands the new ecosystem, in which facts and truth don’t matter. You attract attention, rouse emotions, and then move on.”


2) Benjamin Bratton’s note on Facebook and (what he calls) algorithmic populism (on Facebook)…

One point of his long (and complex) text was that software design choices can acquire political significance. Facebook’s algorithms, which caused (or at least didn’t prevent) the spread of fake news about the American elections, are not the only possible mechanism for controlling which posts appear in people’s news feeds.

Google’s PageRank (and other Search and Display algorithms) was designed to surface the most credible sources on a topic, modeled originally on peer-review citation mechanisms, and in theory FB could explore and implement something similar. Doubtless FB has debated, and is now re-debating, that turn.
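The citation intuition behind PageRank can be sketched in a few lines of Python. This is a toy illustration of the general idea, not Google’s actual implementation; the graph, names, and parameters are invented for the example:

```python
# Toy PageRank sketch: a page is "credible" roughly in proportion to how
# many credible pages link to it -- a web analogue of citation counting.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # split rank among its links
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Hypothetical mini-web: two pages cite "physicist", only one cites "denier".
toy = {
    "physicist": [],
    "denier": [],
    "blogA": ["physicist"],
    "blogB": ["physicist", "denier"],
}
ranks = pagerank(toy)
```

On this toy graph, “physicist” ends up ranked above “denier”, simply because more of the (equally weighted) sources point to it — which is the sense in which a citation-style algorithm can privilege better-sourced pages over engagement-driven ones.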

And shedding light on this matter reveals a problematic ambiguity at the core of Facebook (and of social platforms with business obligations in general).

Zuckerberg is now in the awkward position of having to convince people that FB does influence purchasing decisions but does not influence voting decisions. How so?


3) …and Mark Zuckerberg’s thoughts on Facebook and the election (on Facebook)

Whose bottom line is:

we take misinformation seriously


The problems here are complex, both technically and philosophically.