The outrage following the US election has catapulted the once benevolent digital institutions of Facebook and Google into the headlines as being the 'useful idiots' of extreme political agents and propaganda mills. Oh how the mighty have fallen!
But I feel this is an unduly harsh reaction from the now politically dispossessed who, like a frog, were boiled so slowly they didn't even notice the temperature rising.
We are all to blame here, and if we first look at Facebook, then the infamous 'echo chamber' isn't something imposed on you but is of your own creation. You aren't viewing an impartial 'news feed' but the aggregated opinions of your self-determined peer group. So get better (more diverse) friends, remove the nutters from your profile and mute those with extreme views (unless you're into that sort of thing, in which case follow the Guardian to shake things up a bit).
While you're at it, why don't you ditch the Daily Mail (in all its forms), stop spreading clickbait, actually read an article before sharing it, and learn how to spot real, fact-checked journalism before you question why you're suddenly goose-stepping around.
If you don't, or 'can't be bothered', then you can't blame Facebook, which (btw) has removed extreme accounts from its ad exchanges and actively demoted fake, clickbait news. Attempting to do anything more would cause an uproar.
We seem to think social media is intended to be a democracy, but in fact it's more of a free market where fear and herd mentality ride roughshod over the facts.
Google, on the other hand, is designed to deliver the best results for a searcher's query, which by design leaves it vulnerable to slander and manipulation, because it prioritises legitimately negative news that can remain damaging far longer than is reasonable (just ask Rick Santorum lol).
If Facebook exposes the ugliness of modern tribalism then Google's 'enlightened' algorithms sadly fall victim to the same human condition they try and overcome.
Here's how it works. Google wants to be impartial and will judge a page on its relevance and authority (ironically decided by links and social shares), matching the best page to the intent of a searcher.
However intent is a tricky thing for Google to judge, particularly if not many people have asked a question before. So, quite rightly, the algorithms constantly test what they serve each user, to continually improve and deliver a better experience. It's how machines learn and make sense of human intent and natural language.
This means that searches for people and businesses will return a mixture of photos, profiles, Wikipedia entries and videos, giving the user a smorgasbord of choice that is refined over time into a more precise representation of what Google thinks you're looking for.
But where this goes wrong is when Google tests 'fresh content' and tries to balance not just the variety of content but the sentiment too, presenting good and bad, real and fake in equal measure to see what we choose.
And sadly that's where man, machine and impartiality break down, because just like on Facebook we'll more often than not slow down to check out the car-crash headline or scandalous fake news story, clicking on the only negative thing in the results.
In turn these stories get served to more and more users, accumulating even more clicks, until an article reaches the top of the page and stays there because the search engine trusts what user behaviour is telling it.
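That feedback loop can be sketched as a toy simulation. To be clear, this is an illustration, not Google's actual algorithm: the click probabilities, the exploration rate and the idea of ranking purely by click-through rate are all assumptions made up for the example.

```python
import random

def simulate(results, rounds=10000, seed=42):
    """Toy click-feedback ranker (illustrative only, not Google's algorithm).

    Results are ordered by observed click-through rate (CTR). Mostly the
    current top result is served, but occasionally another result is
    'tested' on the user, mimicking how fresh content gets trialled.
    """
    rng = random.Random(seed)
    impressions = {r: 1 for r in results}  # start at 1 to avoid divide-by-zero
    clicks = {r: 0 for r in results}

    for _ in range(rounds):
        # Rank by current CTR: the feedback loop.
        ranked = sorted(results,
                        key=lambda r: clicks[r] / impressions[r],
                        reverse=True)
        # 10% of the time, test a random result instead of the leader.
        shown = rng.choice(results) if rng.random() < 0.1 else ranked[0]
        impressions[shown] += 1
        # Assumption: the scandalous headline is clicked far more often.
        p_click = 0.9 if "scandal" in shown else 0.3
        if rng.random() < p_click:
            clicks[shown] += 1
    return ranked

results = ["official profile", "company homepage", "scandal: shocking exposé"]
final = simulate(results)
```

Run it and the scandalous headline ends up at the top of `final`: a few exploratory impressions plus our rubbernecking clicks are enough for the loop to lock it into first place.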
So in both of these fantastic systems, in which we spend vast amounts of time each day, we are the problem because Google and Facebook's shortcomings are a reflection of our worst nature.
So be better people, think before you click!
I like to point out that on the information superhighway, when there's an accident, people tend to be rubberneckers. It only makes sense that when you see a link in search results with a scandalous or highly negative headline, your attention is captured. It's only human nature. If you're searching for information about a person or a company, seeing such a headline makes you curious, and you want to find out more.