I must admit that I enjoy the targeted marketing that hits me on, for instance, Facebook. Compared to advertising on linear TV and in other media, I rarely see anything there that does not have some relevance. It is only when I turn on linear TV or the radio that I get exposed to random advertising – and it annoys me. Like, really annoys me! Stop talking to me about new cars or crappy beer I do not want. Facebook would never do that to me, because Facebook knows that I don't care about a new car and that my palate prefers decent beer.

Having worked "behind the scenes" around Facebook advertising for a few years, I know some of the many options presented to the advertiser. Obvious choices like age, gender and geography sit alongside fuzzier settings like "interests". And it is all run by algorithms.
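To make it concrete: a targeting specification could look roughly like the little Python sketch below. The field names and structure are my own invention for illustration – not Facebook's actual Marketing API – but the principle is the same: hard demographic filters combined with fuzzy, inferred interests.

```python
# A hypothetical targeting specification - the field names are invented
# for illustration, not Facebook's actual Marketing API schema.
targeting_spec = {
    "age_min": 25,
    "age_max": 45,
    "genders": ["female"],
    "geo": {"countries": ["DK"]},
    # The "fluffy" part: interests inferred from likes, clicks and watches.
    "interests": ["soccer", "craft beer"],
}

def matches(user: dict, spec: dict) -> bool:
    """Decide whether a user profile falls inside the targeting spec."""
    if not spec["age_min"] <= user["age"] <= spec["age_max"]:
        return False
    if user["gender"] not in spec["genders"]:
        return False
    if user["country"] not in spec["geo"]["countries"]:
        return False
    # Interest targeting: any overlap between the user's inferred
    # interests and the advertiser's list counts as a match.
    return bool(set(user["interests"]) & set(spec["interests"]))

# This user sees the ad; change any field and she may not.
user = {"age": 34, "gender": "female", "country": "DK",
        "interests": ["soccer", "running"]}
print(matches(user, targeting_spec))  # True
```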

Algorithms make the world go round

Algorithms: the new buzzword on everybody's lips, but properly understood by few. Algorithms – complex mathematical models – decide who gets to see what on platforms like Facebook, LinkedIn and Google. And that is definitely good. The whole idea of "googling" stuff requires selection methods; otherwise, too much nonsense would appear. Great algorithms were the reason Google won the search engine game so easily – who would ever be "AltaVista-ing" anything?

Recently I read an article about how LinkedIn favours some posts from influencers over others. And maybe that is OK; however, it is impossible for the users – or in this case the contributors of content – to know how this favouring works.

Another recent article implies something even more troublesome: analysis of big data in connection with crime reinforces racial biases, based on names predominantly used in certain ethnic groups. Link to the article here.

So are algorithms bad and evil? By no means, not at all. Algorithms are our only way of using the vast amounts of big data piling up in all sectors and companies. We have no possible way of analysing all the patterns hidden in these data without advanced data modelling, done by algorithms. But it is important to consider the hidden undemocratic implications.

On Facebook, the algorithms take into account what I like, click, watch and so on. I am a huge soccer fan, and whenever a video or post on this topic appears, I usually click/watch/like/share. This confirms what Facebook's algorithms already know: she is a soccer fan, and if we feed her posts on this topic, they will be successful!

Self-fulfilling presets

And what is wrong with that? Not necessarily anything. However, this "self-fulfilling" algorithm can create what evolutionary biologists would call a drift: the more you confirm certain interests, the more of the same stuff you get – and, and now it becomes important, the less you get of other stuff. Obviously you know what you see, but not what you don't see. And maybe, just maybe, this could pose a threat to democracy. Democracy is based on free information and the awareness of the people, but what if we, the people, are presented with different versions of the truth without knowing that other versions exist? One could argue that this is already the case: different news media show different stories. But what if these stories become even more specific, and advanced algorithms on the news sites choose never to show you even the headlines of stuff you normally do not click on? At some point you might not even be aware that it is there.
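The mechanics of that drift are almost embarrassingly simple. Here is a toy simulation in Python – my own hypothetical model, not anybody's real ranking system – of a feed that shows topics in proportion to past clicks. The user is a huge soccer fan who only occasionally clicks on anything else, and the feedback loop does the rest:

```python
import random

random.seed(42)

topics = ["soccer", "politics", "science", "culture"]
# Every topic starts with the same exposure weight.
weights = {t: 1.0 for t in topics}
# True click probabilities: a huge soccer fan who only
# occasionally clicks on anything else.
preference = {"soccer": 0.9, "politics": 0.3, "science": 0.3, "culture": 0.3}
shown_count = {t: 0 for t in topics}

ROUNDS = 500
for _ in range(ROUNDS):
    # The feed picks a topic in proportion to past engagement.
    shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
    shown_count[shown] += 1
    # A click reinforces the weight, so the topic is shown even more.
    if random.random() < preference[shown]:
        weights[shown] += 1.0

for t in topics:
    print(f"{t:10s} shown {shown_count[t] / ROUNDS:.0%} of the time")
```

Run it, and soccer ends up dominating the feed while the other topics fade towards the margins – not because the algorithm is wrong about the preference, but because every confirmed click buys the next one more exposure.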

We desperately need algorithms to show the way through overwhelming amounts of data. Especially in the health sector, miraculous solutions appear through big data analysis. But we do need to be careful, demand more transparency and apply caution. Algorithms run on parameters, and we need to understand that these parameters are set by humans. We have ways of ensuring that algorithms do not create racial drift, or drift towards separate groups of people with no awareness of what goes on in the other groups. As in all other data analysis, we need to think! And we need access to, and control over, how the major algorithms govern our access to data. When we are in a situation where algorithms control how we perceive the world, based on prior behaviour (not necessarily our own, but that of people who look like us!), then we need to look out.
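One example of such a human-set parameter could be an exploration rate: some fixed fraction of the feed deliberately ignores past engagement and shows the user something outside her established profile. Again a hypothetical sketch, extending the toy feed from above:

```python
import random

random.seed(7)

topics = ["soccer", "politics", "science", "culture"]
weights = {t: 1.0 for t in topics}
preference = {"soccer": 0.9, "politics": 0.3, "science": 0.3, "culture": 0.3}
shown_count = {t: 0 for t in topics}

# A human-set parameter: this fraction of the feed ignores past
# engagement and samples all topics uniformly instead.
EXPLORATION_RATE = 0.2

ROUNDS = 500
for _ in range(ROUNDS):
    if random.random() < EXPLORATION_RATE:
        # Deliberate exposure to "other stuff", regardless of history.
        shown = random.choice(topics)
    else:
        shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
    shown_count[shown] += 1
    if random.random() < preference[shown]:
        weights[shown] += 1.0

for t in topics:
    print(f"{t:10s} shown {shown_count[t] / ROUNDS:.0%} of the time")
```

Soccer still dominates – as it should – but no topic can fall much below roughly EXPLORATION_RATE / len(topics) of the feed. That floor exists only because a human decided it should, which is exactly the kind of parameter we ought to be able to see and debate.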
