Algocracy and surveillance capitalism: we live in a world governed by algorithms

Nicoletta Boldrini
May 30, 2017

--

They can save lives, streamline bureaucratic processes and make organizational models more efficient, and even give rise to new business models. However, they can also affect (directly or indirectly) people’s lives and their work, even going as far as manipulating the economy, society or politics. Between algocracy and surveillance capitalism, the role of algorithms is becoming increasingly significant, concentrating global power in the hands of very few corporations.


Instructions for solving a problem and carrying out activities: that is what algorithms are. The code of a computer, an application, the Internet, a GPS itinerary, buying suggestions for a trip or a book. Today, everything is based on algorithms, even a recipe. Artificial Intelligence (AI) is nothing but a set of algorithms: what we see on social networks is the result of the work of algorithms, and even the news we view on the Internet or on a mobile device is the result of a selection carried out by mathematical models “translated” into computer code. Most financial transactions take place by means of algorithms; the screen readers we now use on a daily basis work thanks to algorithms, just like the facial recognition systems that allow us to “organize” our photographs (and do much more, such as verifying people’s identity), or those that build automobiles or drive them on our roads with no one behind the wheel.

These are mostly invisible instruments that can help and “augment” our lives, assisting us in our everyday activities, at work and in business. The other side of the coin is that, in what many already call the age of algorithms, the world may come to be governed by artificial intelligence, concentrating “power” (social, economic and political) in the hands of those capable of modeling and controlling algorithms. Hence, in this delicate phase of computer science and human history, it becomes a priority to study and understand not only the potential and opportunities offered by these systems, which are undoubtedly many, but also their macro-economic, social, political and organizational impacts, in order to define a new culture (based on code) and the rules by which it can expand.

I had the chance to take an in-depth look at and analyze these aspects thanks to a piece I wrote for ZeroUno. In this post, I go back and re-examine some of them in greater detail, with the support of Aneesh Aneesh, Senior Director of International Affairs and Outreach and Professor of Sociology and Global Studies at the University of Wisconsin-Milwaukee (and a former professor at Stanford University), theoretician of so-called algocracy, and of Shoshana Zuboff, Charles Edward Wilson Professor Emerita of Business Administration at Harvard Business School.

The theory of algocracy

As already mentioned, Aneesh Aneesh is the professor and researcher who theorized the concept of algocracy (an organizational model guided not by conventional bureaucratic checks but by code and algorithms) after carrying out a number of ethnographic studies and analyses “in the field”, mainly between India and the United States, exploring how technological developments are changing the way a job can be done and organized.

In a book written in 2006, entitled “Virtual Migration”, Aneesh examines labor and worker migration in depth, comparing the online work flows of software developers located in India (labor migration from the US towards India) with what he calls “body shopping”, the physical migration of workers (in other words, the flows of Indian workers moving to the United States to perform a job). The analysis of these flows was always conducted by taking into account the physical barriers that separate the job (and the workers), also found in outsourcing models (which were mainly developed in search of low-cost labor). “With the advent of an increasingly integrated global economy, a different type of migration is underway, a virtual migration”, says Aneesh. “Today, we have the possibility to transfer skills, jobs and data with no physical movement entailed. One can migrate without actual migration.”

In this context, what should make us think, according to Aneesh (whose studies have always concentrated mainly on the analysis of new organizational models and on the bureaucracy that is ‘created’ by means of technology), is the fact that the programming language (the code) seems to be in itself “the key organizational structure behind this virtual migration”. To understand exactly what Professor Aneesh means, we have to take a step back in time. “It was 1999 when, looking for a software company in India, it struck me that most of the applications we used had never been created ‘at a single location.’ Most of the software was developed (and still is to this day) in multiple places at the same time: different teams, ‘sitting’ in different continents and countries, working on the same project. I was familiar with the ‘literature’ on the advent of large central management systems (Enterprise Resource Planning, or ERP) as a crucial element for coordinating the ‘huge’ bureaucratic activities of company organizations — explains Aneesh — but in the case of global software development it has never been possible to have a sort of ‘middle managerial layer’ for centralized coordination of the teams and of the different labor ‘schemes’ across the world’s various countries”.

This scenario is even clearer if we think about the development of open-source code, where communities of developers throughout the world coordinate ‘simply’ through a self-management model.

Some time after these initial remarks, while sitting one day next to a programmer and looking at the screen she was working on, Aneesh had an insight: “it is the software itself that acts as job supervisor!”. Aneesh had noticed so many access checks integrated into the software platform the programmer was working on “that there was no need for a human manager to supervise the work,” he thought. When, in 1998, during his post-graduate studies, Aneesh wrote his first paper on the topic of hyper-bureaucracy (published one year later) to describe an “out-of-control” hyper-bureaucratic system, he had yet to fully grasp that code — according to his theory — is itself the “organization”, a vision that became clearer to him after watching the programmer in action. “In the absence of a better word, I coined the term ‘algocracy’ to identify ‘the rules of the code’ (or of an algorithm) as an organizational model capable of replacing the ‘rules of an office’ (the bureaucracy of a company or of an economic system),” Aneesh explains. “Algocracy tends to flatten out all bureaucratic hierarchies as it does not require any management level, whether intermediate or centralized.”

Algocracy is not defined by elements such as hierarchy, documentation, the dominance of certain people’s positions over others. The traditional bureaucratic rules must be ‘internalized’ by those who are required to follow and obey them, whilst “an algocratic system structures the possible field of operation without demanding from a person any effort to adapt to the rules — the algorithms built into the system itself start or block an operation without asking anyone to ‘internalize’ the policies.”
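A minimal sketch may help make this distinction concrete. In the hypothetical Python fragment below (the action names and policy are illustrative, not from Aneesh's studies), the rule is not written in a handbook for workers to internalize; it is baked into the code, which simply starts or blocks an operation:

```python
# Hypothetical sketch: in an algocratic system the policy lives in the code
# itself, so a disallowed operation is simply blocked; nobody has to "know"
# or internalize the rule for it to hold.

ALLOWED_ACTIONS = {"read", "comment"}  # assumed policy baked into the platform

def perform(action: str) -> str:
    """Run an action only if the built-in rules permit it."""
    if action not in ALLOWED_ACTIONS:
        # The code structures the field of operation: the action never starts.
        return f"blocked: '{action}' is not permitted by the platform"
    return f"ok: '{action}' executed"

print(perform("read"))    # a permitted action simply runs
print(perform("delete"))  # a disallowed action is stopped by the code itself
```

No manager observes the worker and no rule book is consulted: the range of possible actions is pre-structured by the software itself.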

The bureaucratic, panoptic and algocratic society

To explain in even more detail how our society and our organizational models — not only occupational but social and economic as well — can be modified by technology in general, and by algorithms specifically, Aneesh offers a very simple example concerning traffic control and motorists’ violations. “Control by traffic lights implies the requirement for motorists to comply with certain rules (for example, stopping at a red light), the breaches of which can be observed by the traffic police directly,” explains Aneesh. “This organizational/behavioral model works for two reasons: the internalization of rules by motorists, who adapt their actions accordingly, and the threat of a fine as the consequence of misconduct.”

This first model represents a bureaucratic organization (complying with the red, yellow or green light is the equivalent of obeying the rules of a company or of civil society) but, Aneesh points out, “how many drivers are there who do not obey stop signs or red traffic lights without getting ‘caught’ by the police?”.

Then there is a second control method, based on the use of video cameras that film the traffic and potentially record all motorists’ violations. “The rules are obviously the same as the previous model but, with this type of organization, for each observed violation a fine is sent to the offender, along with a photograph as proof of the violation,” explains Aneesh. “This type of technological system, used to its full capability, is able to record all violations, and the notification of the fine becomes the consequence of the ‘rule breach’ committed by a person.” In this case, the organizational model is identified as panoptic.

Lastly, there is the algocratic model which, in Aneesh’s example, becomes a traffic self-control system based not on rules but on how roads are built: “let’s think of a road infrastructure that, due to the way the lanes are paved, prevents drivers from turning right or left or from parking in places other than those ‘designed’ by the road engineers. In this model, one does not need to be chased by the police or receive a fine by mail, since anyone who ‘violates’ the model will crash and damage their car.”
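The difference between the three models can be sketched in code. The hypothetical Python fragment below (the function names and rules are illustrative only) shows that the bureaucratic and panoptic systems react after a violation, while the algocratic one leaves no violation to react to:

```python
# Hypothetical sketch of the three traffic-governance models: the
# bureaucratic and panoptic systems respond AFTER a violation, while the
# algocratic one makes the violation impossible to attempt.

def bureaucratic(turn: str, police_watching: bool) -> str:
    # The rule is internalized by the driver; a fine follows only if an
    # officer happens to observe the violation.
    if turn == "illegal" and police_watching:
        return "fine issued by officer"
    return "no consequence"

def panoptic(turn: str) -> str:
    # Cameras record every violation; the fine is automatic, after the fact.
    return "fine mailed with photo" if turn == "illegal" else "no consequence"

def algocratic(turn: str) -> str:
    # The road's design admits only certain maneuvers: the field of action
    # is pre-structured, so no rule needs enforcing afterwards.
    legal_turns = {"straight", "right"}  # assumed road layout
    return "turn completed" if turn in legal_turns else "turn physically impossible"
```

In the first two functions the illegal turn happens and is (perhaps) punished; in the third, the disallowed maneuver is simply not among the possible outcomes.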

Aneesh expresses no personal preference among these three governance models, simply noting that “all three have potentials and limits, and work in different ways: traffic lights can break, the number plate photographed by a video camera may be illegible, a sports car may be able to overcome the physical traffic control barriers…”.

But after this preliminary analysis of the ‘power’ of algorithms, it is the future of a possible algocratic society that captures my attention. “Whilst bureaucracies take advantage of the ‘action-guiding’ model (they guide our personalities towards specific norms), algocracies predetermine the action towards specific results,” explains Aneesh. “As a practical and current example, we do not know the algorithms used by Google or Facebook, even though they define a priori our possible range of action. One effect of algocracy can already be seen in how it is applied to identities: financial identities (credit scores, for example), shopping identities (which ‘frame’ a purchasing behavior) and even medical identities (which group people together depending on condition or treatment) are all built algorithmically by different systems, without our approval or involvement.”

On these topics, Aneesh wrote his most recent book, entitled ‘Neutral Accent’, which is highly recommended reading.

Surveillance capitalism

Shoshana Zuboff is the author of the famous book ‘In the Age of the Smart Machine: The Future of Work and Power’, through which, in 1988, she introduced the concept of ‘informating’, in other words the digitization process that translates corporate activities, events, changes and objectives into information.

The research conducted by Zuboff focused on an in-depth analysis of the changes in professions and professionals, as well as in the organizational models of offices, companies and factories, in the environments where — from the end of the 1980s onward — computers were introduced, followed by IT machines and systems. Over time, her research has shown that the relationship between IT, people and work develops along three directions:

1) technology is not neutral: it includes intrinsic features that enable specific human experiences but precludes others;

2) the new possible horizons change the scenarios: through technology, individuals and groups of people ‘build’ new ‘horizons’ and take different decisions compared to the past, thus affecting future and possible scenarios;

3) possible influences and limitations on society, the economy and politics: human choices are affected by social, political and economic interests that have an impact on both the opportunities and possible limitations of man, of work and even of technology. At the same time, nowadays technology, and even more so algorithms, can make their ‘weight’ be felt on society, the economy and politics.

It is precisely on this last aspect that the American researcher has focused in recent years, speaking in particular of the new ‘surveillance capitalism’ model, of which Google is a pioneer and which, according to some analyses by The Economist, is today led by just five corporations: Alphabet/Google, Apple, Facebook, Amazon and Microsoft.

“Google is the ‘ground zero’ of a new species of capitalism, the profits of which arise from (one-way) surveillance and from influencing and changing human behavior,” Zuboff recently wrote in an article. “Capitalism was ‘betrayed’ by a profitable surveillance project that overturns the normal evolution mechanisms of society and of the economy, completely changing the ‘rules of the game’ and the supply and demand dynamics that assured a market democracy for years.”

Zuboff’s remarks are rather critical. In her publications, she goes as far as claiming that surveillance capitalism should be looked at as “a new economic mutation generated by the clandestine mating of the great powers of the digital sector with the radical indifference and intrinsic narcissism of financial capitalism and its neo-liberal vision that have dominated Anglo-American economies for at least three decades. It is an unprecedented form of market that is blooming in a lawless space.”

It is therefore clear that the time has come for us to “get moving” and start developing, at a global level, a new social, economic and political culture around which to define proper “behavioral” rules (ethical as well as judicial).
