Image: Brian Solis

Facebook is coming under criticism, partly because of the view that it has helped spread fake news, and partly because of the suspicion that it acts as an echo chamber, reinforcing our own biases. But its boss, Mark Zuckerberg, has responded.

Sometimes, misinformation is enough to make one despair. Some of the stories that have done the rounds across the internet of late beggar belief, yet oddly, many people do believe them.

And Facebook has got the blame.

There’s another point, and this relates to a view first expressed by Eli Pariser in his book The Filter Bubble. It’s the argument that the internet feeds us the news we want to hear – it reinforces our biases.

So, let’s say we are either strongly in favour of ‘gun control’, or strongly in favour of the ‘right to bear arms’. We go to Google, and it knows us, and feeds us only information that confirms our point of view, and we become more convinced than ever that we are right because there is such a paucity of information that contradicts us.

Others put it differently: they call it an echo chamber.

And what about Facebook – does it spread lies and echo our thoughts, confirming our biases?

But Mark Zuckerberg has responded.

His counter-argument comes in two forms.

He said: “Personally, the idea that fake news on Facebook . . . influenced the election in any way is a pretty crazy idea; voters make decisions based on their lived experience.” He added: “We really believe in people. You don’t generally go wrong when you trust that people understand what they care about and what is important to them, and you build systems that reflect that.”

He continued: “There is a profound lack of empathy in asserting that someone voted the way they did because they saw some fake news. If you believe that, then I don’t think you have internalised the message that Trump supporters have been trying to send in this election.”

In any case, he asked why anyone would think there was more fake news on one side than the other, and pointed out that hoax news is a very small part of Facebook. He said: “we are doing our best so people can report [fake news].”

The second point relates to the idea of the filter bubble.

He said: “I really care about this. I want people to have a diversity of information . . . all the research we have suggests this is not really a problem.” He said that if you go back 20 years, there were a few major TV networks . . . and a few major newspapers, each of which had an editorial opinion – and that opinion became yours. You got all your news filtered through them.

“Almost everyone has some friends who are on the other side.” He said that Facebook research has found that even if 90 per cent of your friends are Democrats, you will have some friends who are Republicans. He continued: “you can go through religion, or ethnic background, and this kind of diversity is true – we all know people who have a different perspective on the world . . . that means that the media diversity and diversity of information that you get through a social system like Facebook is going to be inherently more diverse than you would have got from watching one of the three news stations.”

He makes a good point, and let’s face it, many newspapers are hardly paragons of objectivity!

But here is an alternative point of view. We are seeing a move towards apps. Time was, we could jump from an article in the Guardian to one in the Telegraph or the Times; we are losing that ability.

Increasingly, news sits behind paywalls, or is simply cosseted in apps.

Other content is so surrounded by ads that it is hard to read.

You can’t blame the publishers; they have to make money somehow. Yet this, and not social media, may pose the biggest threat to our receiving balanced news and opinion – and fixing that may be a bigger imperative than worrying about Facebook.