Image: Pexels/freestocks

In 2014, a team of psychologists and computer scientists manipulated Facebook's newsfeed to show either more positively or more negatively associated words to 689,003 users. Users responded in kind, posting positive or negative content depending on the emotively rigged words they saw. It was the first experimental evidence of emotional contagion at mass scale: Facebook had successfully, and secretly, shaped the emotions of hundreds of thousands of unaware users. When the results were published, controversy erupted.

This is just one instance of a day-to-day phenomenon. Every single time you open Facebook, Instagram, or Twitter, you are guided, shaped and led by code and algorithms designed to manage and regulate user traffic. Code is a set of instructions that shapes computer software and hardware in particular ways. Algorithms are complex sets of instructions that underlie the logic of computer software: you feed in data, and they spit out a result. Code and algorithms act on the world independently of direct human control.
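To make the input-and-output idea concrete, here is a deliberately simple sketch of a feed-ranking algorithm. This is not Facebook's actual code, which is proprietary and vastly more complex; the scoring rule and field names here are invented for illustration only.

```python
# A toy feed-ranking algorithm: posts go in, an ordered feed comes out.
# Purely illustrative -- real platform ranking systems are secret and
# far more complicated than this.

def rank_feed(posts):
    """Order posts by a simple engagement score: likes plus 2x comments."""
    return sorted(posts,
                  key=lambda p: p["likes"] + 2 * p["comments"],
                  reverse=True)

feed = rank_feed([
    {"id": "a", "likes": 10, "comments": 1},
    {"id": "b", "likes": 3, "comments": 9},
    {"id": "c", "likes": 50, "comments": 0},
])
# The algorithm, not the user, decides what appears first.
print([p["id"] for p in feed])  # prints ['c', 'b', 'a']
```

Even in this tiny example, the ordering of what you see is decided by a rule a programmer chose, not by you, which is the point the article is making.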

Code and algorithms are typically understood as being black-boxed. This means that they evade mass understanding because they are both dizzyingly complicated and closely guarded secrets of the corporate entities that build them. The world of codes and algorithms is a cryptic mess that is notoriously difficult for academics and journalists to untangle.

Even so, codes and algorithms are everywhere. They are not only the backbone of your social media, they are in smart objects, elevators, doors, cash registers, bank cards and video games. Most of the products we use have been in intimate contact with them.

They are the silent mediators that shape our daily existence. And as they shape our lives, they also generate data on everything we do. Each data point alone might be meaningless, but combined with data from other devices, those with access can construct detailed profiles. These profiles are used for targeted advertising, and by insurance firms, health companies, and intelligence agencies. They can be used to disadvantage you. Or, so the argument goes, they can be used to provide more accurate services.

Codes and algorithms are not neutral. This is an easy fallacy to fall into, because they seem like the ultimate "third party" to every social interaction you have. But such codes and algorithms were programmed by computer engineers hired by capitalist entities, and they are designed according to the logic of capital. To return to the anecdote above: Facebook, arguably the largest advertising firm in the world, has the power to shape emotions, and it can recursively feed those emotions back into its targeted advertising.

Codes and algorithms are designed to manage human traffic in profitable ways. Through social shaping, we are turned into data piggy banks for large corporations, which allow us "free" use of their connective services in exchange for metadata and content data. This is a good example of what philosopher Maurizio Lazzarato called "immaterial labour": the labour that produces informational and cultural commodities.

Facebook's algorithms facilitate connections between people on its newsfeed so that we produce content commodities, which are then fed back through the system as targeted advertising. Considering that Facebook has over a billion users, this commodity is lucrative business.

It's not all that bad, though. Sociologist José van Dijck observes that "connectivity is premised on a double logic of empowerment and exploitation." Even though we are all being exploited for our creative labour, we also have new platforms that enable us to create phenomenal content. Services like Snapchat and Instagram enable humorous and novel ways of interacting. Whisper and Jodel allow us to create vibrant anonymous communities.

However, these communities are founded on profit margins, surveillance, and corporate propaganda, and we need to weigh the creative benefits against the political dangers. It must be said that social media has proliferated so widely that we cannot simply get rid of it. We can't destroy it. We can barely even build alternative social media platforms (look at the largely failed projects Diaspora and Ello), and alternatives won't be viable unless a majority of users makes the switch.

The only answer I can think of is education. People need to know how social media platforms can invoke the power of code and algorithms to shape entire populations. It's a rough battle: even though there is science supporting these claims (such as the mass emotional contagion experiment), the phenomenon sounds like something out of dystopian science fiction. To tackle this issue, users need to become more technologically literate. We need to learn the machine in order to change it.

The philosopher Gilles Deleuze became famous for his declaration that we are becoming "societies of control," which operate through computers and code. We are shaped by the devices we use. Nations are shaped by scores of data. The global market is shaped by digitally mediated exchange rates. All of these things are largely beyond the reach of direct human influence. The role of code and algorithms in facilitating social control must be exposed, understood, and curbed.




Abigail Curlew

Abigail Curlew is a PhD student at Carleton University, a writer, and an activist. Her research is focused on the sociology of surveillance, technology, and security. She writes to bring attention to...