The Sociology of Algorithmic Power

Lauren Toulson
Apr 2, 2021 · 9 min read

Power and inequality are reproduced through our algorithms. It’s a simple fact, and one the industry as a whole is starting to become aware of. An algorithm’s design is a direct reflection of the cultural values of its designers and of the company that builds it, but, in the case of online algorithms, it also mediates and shapes the cultural practices of community engagement. Automated decision systems are trained on cultural data, such as the tastes of particular groups, and they ‘learn’ to predict and behave in ways attuned to those groups’ cultural preferences.

There are numerous examples of algorithms that are not trained on diverse cultures of taste and therefore reproduce the values of those who control them.

Photo by Charles Deluvio on Unsplash

The Social Theory

Bourdieu’s famous theory of habitus describes a way of thinking and being that distinguishes the classes and is reproduced through generations. Parents teach their children values, attitudes and skills to help them navigate the social world, but in doing so they also isolate them from socialising with other classes, whose differing sets of attitudes and tastes mark them as a distinct social class. Reproducing these class-specific tastes and values reproduces class inequality, such as unequal access to higher education and higher-paid jobs, which may require a certain level of cultural capital (knowledge of literature and the fine arts) and economic capital (the money to pay for the right classes and education) in order to break the class boundary.

Social capital is the network of social connections a person builds: where cultural capital is what you know, social capital is who you know. Bourdieu’s concept of social capital explains a hardship of inequality; those who know successful people are more likely to have life success and opportunities than those who do not know the right people.

‘Real life’ offers individuals the opportunity to diversify their social capital through engagement with many opposing interests. Online, however, the filter bubble of our newsfeeds contains the user within a community of like-minded individuals (Spohr).

Capital and AI

Many algorithms are trained to make predictions from social and behavioural data, which is to say from data representing social and cultural capital. When AI systems are trained this way, there are many cases where power and inequality are reproduced, because we are defined by our capital and it shapes our social opportunities.
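As a minimal sketch of what ‘predicting from capital’ looks like, consider a toy scoring model in Python. The feature names and weights here are hypothetical, standing in for whatever a real model would learn from historical data; the point is that every behavioural signal doubles as a proxy for someone’s capital.

```python
# A toy "capital-as-features" predictor. Every feature below is a
# behavioural trace, but each is also a proxy for social, cultural or
# economic capital. The weights are hypothetical assumptions, standing
# in for whatever a trained model would have learned from past data.

WEIGHTS = {
    "friends_count": 0.002,         # social capital: size of network
    "arts_pages_liked": 0.30,       # cultural capital: 'legitimate' taste
    "luxury_brands_followed": 0.25, # economic capital: consumption signals
    "university_listed": 0.40,      # institutionalised cultural capital
}

def predict_opportunity_score(user: dict) -> float:
    """Weighted sum of behavioural features -> one 'worthiness' score."""
    return sum(WEIGHTS[f] * user.get(f, 0) for f in WEIGHTS)

alice = {"friends_count": 800, "arts_pages_liked": 12, "university_listed": 1}
bob = {"friends_count": 90, "arts_pages_liked": 0, "university_listed": 0}

# Alice's inherited capital shows up directly in her score.
print(predict_opportunity_score(alice))  # ~5.6, higher
print(predict_opportunity_score(bob))    # ~0.2, lower
```

Nothing in the model measures merit; the score simply restates the capital the person already holds.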

Facebook

People are particularly receptive to trusting media like Facebook as a news source, with 62% of US adults getting news there, trusting the authority of Facebook and its algorithm to deliver the right ideological messages. Facebook, through an algorithm that can be tuned towards certain messages, has the power to narrow debate and create consensus around a chosen political agenda. This was evidenced by Facebook’s social experiment using social contagion to get more US citizens to vote. Some Facebook users were shown content declaring who in their social network had voted; others received no such content. Among those who saw the voting messages, an extra 340,000 voters turned out, enough to swing elections (O’Neil). This is an example of Shoshana Zuboff’s instrumentarianism: quietly eroding democracy by influencing behaviour in ways that undermine the freedom of action and information a democracy requires.

This is done using social capital, exploiting shared communities and connections to reproduce power among a few elites. The social capital of ‘who they know’, together with the cultural capital of their leisure activities, is precisely the tool the algorithm used to influence voting behaviour through Facebook. Furthermore, the personal relevancy score used in Facebook’s newsfeed algorithm computes aspects of both social and cultural capital: friend networks, community engagement, content that is liked, pages and brands followed. Precise knowledge of all of a person’s capital is used as a tool against them in a fight to shape their behaviour towards consumption that brings Facebook monetary success, through personalised news content and advertisements that serve Facebook’s ideological agenda (Zuboff). Whether someone is part of a lower-class community or a middle-class executive, their class-specific tastes are recognised by the algorithm and used to steer them towards the preferred outcome: consumption of advertised goods, or a turn towards a certain political ideology.
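Facebook’s current ranking model is proprietary, but the long-retired, publicly described EdgeRank formulation (affinity × edge weight × time decay) gives a fair sketch of how social and cultural capital enter a relevancy score. The constants and feature meanings below are assumptions, not the real model.

```python
import math

# A sketch of a newsfeed relevancy score in the style of the publicly
# described (and long-retired) EdgeRank formula. The current Facebook
# ranking system is proprietary; every constant here is an assumption.

def relevancy(affinity: float, content_weight: float, age_hours: float) -> float:
    """
    affinity:       how close the viewer is to the poster (social capital:
                    messages exchanged, mutual friends, profile visits).
    content_weight: how well the content type/topic matches the viewer's
                    tastes (cultural capital: pages liked, brands followed).
    age_hours:      freshness; older stories decay exponentially.
    """
    time_decay = math.exp(-age_hours / 24.0)  # decays to ~37% after a day (assumed rate)
    return affinity * content_weight * time_decay

# A post from a close friend on a followed topic outranks a fresher
# post from a weak tie on an unfollowed one.
print(relevancy(affinity=0.9, content_weight=0.8, age_hours=6))  # ~0.56
print(relevancy(affinity=0.2, content_weight=0.3, age_hours=1))  # ~0.06
```

The multiplication is the key design choice: a story only surfaces when both the social tie and the taste match are strong, so the feed reinforces exactly the capital the user already has.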

You can read more about the dangerous effects of Zuboff’s surveillance capitalism and the filter bubble in the sources listed at the end of this piece (Zuboff; Spohr).

Life Chances

Capital is embodied by China’s Sesame Credit social credit system as a means of exerting power, and inequality, based on a person’s network and consumption practices. Credit scores are produced by an algorithm from all aspects of a citizen’s life: aspects of their cultural capital, such as the type of products they buy and their school and educational background, and of their social capital, through their job, friends and wider network. All of this determines their life opportunities, such as their ability to take out loans, buy houses and even their chances on dating apps. The algorithm is intended to single out people with ‘bad’ credit and exclude them from society, thereby reproducing the inequalities people may have been born into, as well as reproducing the surveillance and authoritarian power of the Chinese state. While Bourdieu’s theory of cultural capital rests on the idea that parents may choose to educate their children with ‘better’ tastes to make them more socially mobile, under a social credit algorithm it may be much harder for children from a disadvantaged background to accumulate the social credit needed to be offered the same opportunities as a child born into a network of family and friends with high scores.
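The reproduction mechanism can be made concrete with a toy score that blends a citizen’s own behaviour with the average score of their network. The 60/40 weighting below is an assumption; the actual Sesame Credit formula is not public. The point is structural: identical behaviour, different inherited networks, different scores.

```python
# A toy illustration of why network-based scoring reproduces inequality:
# if part of a person's score is the average score of their network, a
# child born into a low-scoring network starts with a structural deficit
# that no individual behaviour fully offsets. The 0.6/0.4 split is an
# assumed weighting, not the real (non-public) Sesame Credit formula.

def credit_score(own_behaviour: float, network_scores: list[float]) -> float:
    network_avg = sum(network_scores) / len(network_scores)
    return 0.6 * own_behaviour + 0.4 * network_avg

# Two citizens with identical behaviour but different inherited networks.
print(credit_score(700, [780, 820, 750]))  # well-connected: ~733
print(credit_score(700, [480, 510, 530]))  # disadvantaged:  ~623
```

Raising `own_behaviour` helps, but the network term caps how far an individual can climb, which is exactly the reproduction Bourdieu describes.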

Photo by Cytonn Photography on Unsplash

Recruitment

Something similar occurs with Western recruitment algorithms. Criado-Perez raises the issue of gender bias baked into recruitment algorithms that were designed to reduce human bias but end up reproducing it. The recruitment algorithm Gild works to surface suitable candidates for programming positions by scanning not only their CV, as AI recruitment tools commonly do, but also their online data.

Prior to algorithms, some companies would seek out antisocial males with bad hygiene, as statistics showed these characteristics were common among the best programmers and hackers. AI trained on data about top programming employees can assess interview candidates by their presence, such as tone of voice, as well as their online community engagement. The issue is that this does not give female candidates an equal chance. AI trained on top employees is mostly trained on male data, such as male tones of voice or programming styles, because women hold only a small fraction of IT positions. The AI, then, looks for male characteristics as signifiers of a ‘good’ candidate, biasing it against women.

In the Gild example, the online data gathered includes an assessment of how central candidates are to the programming community, through their engagement on sites like GitHub and Stack Overflow, as well as their interest in manga sites, which is associated with good hacking skills. The issue Criado-Perez raises is that, on average, women do not have the time to engage in such online pursuits: 75% of the world’s unpaid work, such as childcare, care of elderly parents, school runs, shopping and domestic duties, is done by women. Women’s leisure time is much shorter than men’s, so an algorithm looking for someone who spends more time coding online is far more likely to pick a man as the best candidate, simply because of the difference in social capital between the sexes.

Inequality is reproduced through the candidate’s social capital, so it is likely that more men will continue to be hired to build AI that further excludes female employees, because their ideology reinforces the idea that good programmers can prioritise time online building a network presence. Gild’s programmers do not intend to favour men; the trouble lies in using social capital to assess worthiness for a job. By using only one form of social capital (online activity) as an indicator of successful programmers, they inadvertently favour males and use their power in AI design to fuel the reproduction of gender inequality in the tech industry, as the sketch below illustrates.
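To make the feedback loop concrete, here is a small, self-contained simulation. Every number in it is invented (the leisure-time distributions, the 90%-male historical workforce); it is not Gild’s method, only a sketch of the mechanism Criado-Perez describes: a threshold ‘learned’ from a male-dominated past penalises equally skilled women in the present.

```python
import random

# A toy simulation of the feedback loop described above. We "train" a
# threshold on past top employees (90% male, with more leisure time for
# visible online activity) and apply it to new, equally skilled
# applicants. Every distribution here is invented for illustration.

random.seed(42)

def visible_hours(is_woman: bool) -> float:
    # Assumption mirroring Criado-Perez's point: unpaid domestic work
    # leaves women fewer weekly hours for GitHub, forums, side projects.
    return max(0.0, random.gauss(6 if is_woman else 14, 3))

# Historical 'top performers': an overwhelmingly male workforce.
past_hires = [visible_hours(is_woman=random.random() < 0.1) for _ in range(200)]
threshold = sum(past_hires) / len(past_hires)  # the 'learned' cut-off

men = [visible_hours(False) for _ in range(1000)]
women = [visible_hours(True) for _ in range(1000)]

def pass_rate(hours: list[float]) -> float:
    return sum(h >= threshold for h in hours) / len(hours)

print(f"learned threshold: {threshold:.1f} visible hours/week")
print(f"men passing:   {pass_rate(men):.0%}")
print(f"women passing: {pass_rate(women):.0%}")
```

Nothing in the simulation measures programming skill; the gap in pass rates comes entirely from using time-dependent online visibility as a proxy for it.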

Can those under the influence of the Algorithm take back power?

Photo by Miguel Bruna on Unsplash

Gauntlett argues that within Web 2.0, an internet that allows for active engagement with and creation of the online space, consumers are the creators and have taken back some power from the traditional mainstream media through active blogging.

If the active blogger has the ability to construct online whoever they wish to be, within or outside of hailed ideologies, then that same consumer should be able to reject propaganda on their newsfeed that attempts to sway their political decisions. The difference, however, is that the blogger is offered a choice of self-transformation, whereas the Facebook newsfeed user is slowly and carefully fed a single viewpoint, one that adapts in real time to fill the gaps in their doubts and ensure only one outcome is met.

The power of surveillance capitalism is to undermine freedom by directing users towards the capitalist’s preferred end. Gauntlett views user content as a form of consumer power, but under surveillance capitalism it is precisely that content which is used to exploit users. A user may, for instance, be well aware that Facebook erodes democracy by limiting the type of content they can see. They may unlike all of their pages (or like a wide scope of opinions) so that Facebook has ‘nothing to go on’ regarding their views; they could even delete all of their data, making it incredibly difficult for Facebook to know their political preference and therefore to ‘swing’ them with injected content. But if they keep just their friends, that social capital would be enough for Cambridge Analytica, which harvested Facebook data, and for today’s algorithms to use as an indicator, since friends are assumed to share the same values. Even if they deleted all their friends, their GPS location could be used as data to undermine their freedom. With any kind of social capital, even for an active Web 2.0 user, the algorithm seems relentless in its ability to reproduce power over the masses and erode democracy and personal freedom.
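That ‘friends are enough’ claim rests on homophily: people tend to resemble their network. A minimal sketch of the inference, with an invented graph and labels; majority vote over neighbours is the basic principle such profiling relies on, not any specific company’s pipeline.

```python
from collections import Counter

# Homophily-based inference: even a user who deletes all of their own
# signals can be labelled by majority vote over their friends' known
# leanings. The graph and labels below are invented for illustration.

friend_leanings = {
    "user_with_no_data": ["left", "left", "right", "left", "left"],
}

def infer_leaning(user: str) -> str:
    votes = Counter(friend_leanings[user])
    label, _count = votes.most_common(1)[0]
    return label

print(infer_leaning("user_with_no_data"))  # -> "left"
```

The user contributed no data of their own; their social capital alone was enough to classify them.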

In the case of the algorithm that reproduces gender inequality in the workplace through its use of social capital, the perspective of the active user is difficult to apply. The woman who understands that the algorithm works against her may spend more time engaging online to earn the social capital that might get her an interview; in doing so, however, she faces even more inequality. If, as per Criado-Perez’s statistics, she is like the majority of women undertaking far more hours of unpaid domestic labour than men, then she must not only continue in that role but also take on the extra work of building credit online, competing against men who have a much easier time doing so.

This all sounds dire, and in my view there is a huge amount to be done to redesign algorithms so that their reproduction of inequality is designed out. Given feedback loops and social factors, in many cases simply being an active consumer of algorithms is not enough to tip the balance of power. Sociology is going to find a new lease of life in tackling these issues with AI, and developers need to actively seek out team members not only with backgrounds in the social sciences but with diverse backgrounds full stop.

This was a personal essay adapted from a longer paper I wrote for my MSc at LSE.

Sources:

Bourdieu, P. (1984) “The Sense of Distinction” in Bourdieu, P. (ed) Distinction: A social critique of the judgement of taste. Oxon: Routledge & Kegan Paul.

Criado-Perez, C. (2019) Invisible Women: Exposing data bias in a world designed for men. London: Penguin.

Gauntlett, D. (2011) Making is connecting: The social meaning of creativity, from DIY to knitting to YouTube to Web 2.0. Cambridge: Polity Press.

O’Neil, C. (2016) Weapons of Math Destruction: How big data increases inequality and threatens democracy. New York: Crown.

Spohr, D. (2017) Fake news and ideological polarization: Filter bubbles and selective exposure on social media. Business Information Review, 34(3), pp. 150–160.

Zuboff, S. (2019) The Age of Surveillance Capitalism: The fight for a human future at the new frontier of power. London: Profile Books.


Lauren Toulson

Studying Digital Culture, Lauren is an MSc student at LSE and writes about Big Data and AI for Digital Bucket Company. Tweet her @itslaurensdata