Algorithms form a key layer of the new power framework of surveillance capitalism. They require huge datasets, a data “surplus,” in order to learn to recognise new patterns, refine their ability to “predict” the future, and transform themselves into ever more refined systems of rules and functions. At the same time, the assumptions on which they rest, the criteria by which they classify, and the rules governing their operation are usually hidden from public view. This stems not only from users’ unfamiliarity with programming languages but also from patent law and trade secrets. The creators and owners of algorithms often prefer that their workings remain hidden, for this allows them to exert greater influence over users’ choices, behaviours, and needs.
Algorithms are increasingly skilled at recognising our predilections, behaviours, needs, desires, and habits. They can discover patterns in datasets of many kinds, from the natural sciences, medicine, and economics, through socio-political processes, to technologies of scrutiny, control, and surveillance. They not only “notice” rules and correlations but also generate hypotheses. Many of our individual and collective choices, whom we talk to, and what content reaches us often stem from the activity of algorithms. Yet we usually have no access to knowledge about how they function: it is sealed in a “black box.” We often do not know the source code, the datasets on which they are trained, or their other input data. Algorithms form an invisible framework of our world, somewhat like an invisible labyrinth, or invisible threads animating our bodies, affects, and cognitive processes. In this way they become a system of hidden power, devices of surveillance and social control.
An important task today is to break this impasse, to uncover the political contexts of technology, and to show that the direction of its development is not determined by obscure and arcane powers but depends on our agency as humans and on the political decisions we make.