Human Attention has become a Marxist commodity

12 Jun 2022

Sofia, Bulgaria

by Matthew Eric Bassett

Fake news. Misinformation. Mental health crises. Election meddling. Extremism. Partisanship. Eating disorders. Body dysmorphia. Vulnerability porn.

What do all these things have in common?

Everyone from the Atlantic1 to the BBC2 to the now-defunct Last Psychiatrist3 agrees that social media is bad, that it creates bad things, or that it makes them worse. And so we need to regulate social media, to strike a balance between “connect[ing] with friends and family” and the evils that may result, right? Except the idea that today’s social media companies could be regulated or “balanced” into producing only the goods and none of the evils is completely wrong. Because of the way these companies turn a profit – because of their business models – it is impossible to have social media without it being predominantly driven by these evils. In fact, today’s social media are likely as “balanced” as they ever could be.

Ethics is about how we choose to live. And the advent of social media has altered those choices. We have entirely new concepts for how we relate to one another, like ghosting, and entirely new types of relationships, like parasocial relationships4, where a single person can make thousands (or more) feel as though they are interacting on a personal, one-on-one level. No one asked for or chose those changes. There wasn’t a town hall meeting where Mark Zuckerberg asked people if they wanted to know what their 3rd grade friends now think of Donald Trump; no one took a vote and decided “yeah, that’d be neat!”. These changes were thrust upon society by new technologies and economic forces without anyone’s consent or forethought. So it is understandable why conservative or reactionary5 elements within us want to claw back some control, to regulate and tame social media. But it’s a fool’s errand. The business model is too powerful, and these companies won’t work any other way.

At the core of Marx’s Capital is his definition of a commodity: a thing made by human labour that has both a use value (a capacity to satisfy some human want or need) and an exchange value, the latter coming from the human labour used to produce it.6 Something like gold or coal could be a Marxist commodity, as both require human labour to mine and extract, and both satisfy some human want or need: gold in manufacturing other goods (and in serving as the “universal” commodity, hence Marx’s theory of money), coal as a source of energy. What social media has done is turn human attention into such a commodity. Like coal companies hiring miners to operate heavy equipment that strip mines the mountainside, social media companies hire programmers and data scientists to operate “engaging user interfaces” that strip mine your attention. That attention is bought and sold to advertisers.

But how is this different from traditional advertising?

Traditional, pre-social media advertising could not create a market for your attention. With something like Facebook, ad buyers bid against each other not for the chance of being seen by someone walking down Oxford Street (as with an advertisement on a billboard) but for three seconds of the time of a 25- to 28-year-old male with a bachelor’s degree in a STEM subject from a state college, an interest in travel to the Baltic countries, centre-right political leanings, and a credit score between 650 and 720. Facebook can also ensure that those three seconds are sandwiched between a message from his mom and a holiday photo from a potential love interest. Agencies can become specialists in buying such attention on social media, then sell it on to small businesses and startups (like the one your author works for), major corporations, political interests, or anyone else. Traditional advertising could never get your attention itself; at best it could get a proxy for it, like that billboard on Oxford Street.

Traditional advertising also never tried to produce attention in and of itself. Before social media, advertising had to hitch a ride with content produced for some other purpose. Billboards, newspapers, television shows, and magazines all tried to provide a use value to their customers and audiences. Of course, the economic pressure of advertising was pushing things towards this commodity even before social media. Long before Facebook existed, television executives would take advertising considerations into account before green-lighting a new show. But television executives were still trying to produce an entertaining show; good ratings were incidental, and they were sometimes surprised when they got them.7 Social media companies try to produce monthly active users – it’s an important part of their financial reports.8 Any enjoyable content is merely incidental. In fact, it’s worse than that: most of the content is created by those active users.

"...you’re not the customer, you’re the product."

While social media companies might try to convince themselves and society that they are in the business of “build[ing] community and bring[ing] the world closer together”9, their profits (and thus their corporate DNA) come from mining human attention and selling it onwards as if it were any other commodity, like coal. Facebook, for instance, makes a profit not when someone “builds their community” (whatever that means) but when someone buys those three seconds of attention from our Baltic-loving, bill-paying STEM graduate. And here the forces of capitalism ruthlessly optimize for that specific goal. If capitalism means the exploitation of labour, then social media companies mean the endless mining of human attention.

Unlike a television executive, who tries to produce a show so entertaining that advertisers will pay good money for their ads to hitch a ride with it, social media’s product is your attention, and the only things these companies produce are tools to capture that attention and prevent you from leaving. Those tools are the software, the apps, and the websites they make and you interact with. There isn’t a “balance” achievable in the use of these tools, because any value the user derives from them is incidental. The goal is to capture as much of your attention as possible; any attention sent somewhere else – like to building one’s local community in face-to-face interaction – is lost profit. This is why the algorithms they use to show you content work so well: capitalism is optimizing them to keep you scrolling. It just so happens that the way to keep your attention is to keep you preoccupied with your status in your social groups, with envy of others, with fear of missing out, et cetera. Social media companies use these same emotions to get you to produce content for them, so they can better mine the attention of others.

Once the business model is set, the invisible hand pushes the rest of society in that direction. There isn’t a form of regulation that would tame this business model, as market forces would offer far greater rewards to any company or technology that could skirt the rules than to companies that obeyed them. Such regulations would likely only be used to keep new human attention miners from competing with established ones.

Can you stand the ethics of it?

Obviously, neither you nor your average social media user would say that they use social media largely because they are preoccupied with status or envious of others. Rather, they would say that they use it to keep in touch with family members and friends that they otherwise wouldn’t be able to connect with.10 You might not have asked to know what your 3rd grade friend thinks of Donald Trump, but without Facebook, you would have never known nor cared. The existence of social media companies has made two changes to society’s ethics here: first, they have convinced people that connections with people “they otherwise wouldn’t be able to connect with” are a product that can only be produced by for-profit corporations. Second, they’ve convinced people that they chose to foster many of these connections, rather than the platform deciding that for them in order to mine their attention more effectively.

This helps illustrate how there isn’t a version of social media companies that would be “ethical” or would match how people choose to live. Actually, that concept itself is a bit nebulous, as we are all participants in these market forces, not just as consumers (and raw materials of attention waiting to be mined) but also as employees of these social media companies and purchasers of the newly-mined attention. Now that the commodity is available, the rest of society relies on it to keep their current business operations afloat. The BBC might publish articles about how Instagram damages the mental health of young adults, but they still need those hits from Facebook shares. And your author is posting this piece on Twitter. It’s an example of how the economic modes of production can set ethical values: people before Facebook might have found the ethics of it abhorrent. They might have balked at the idea of sharing their meals or holiday photos on Instagram. They might have never chosen to live this way. But for us, after it’s happened, we need it to advertise our own products and services, and soon that turns into “needing” it to plan parties and social gatherings. So we decide it’s okay.

But let us pretend that we could go back to a pre-Facebook ethics and decide that it is important for human interaction, connection, and relationships to happen in some way that isn’t captured and turned into someone else’s profit. Facebook could not re-organize itself for that. It would need to develop entirely different tools for users to interact with, because its current tools are built for the business model, which is mining human interaction for attention to sell for profit. Even innocuous tools, like messaging, have heavy operating costs, and any business would struggle to pay them while it searched for a new “ethical” business model. It is not that social media does evil things. It is that social media is evil.

But this business model is not inevitable, nor is Marx correct that one and only one ethical system results from a given mode of production. If we, as a society, made a conscious decision about how we wanted to live, in particular if we decided that we valued new and more interpersonal connections within our local communities, then we could use those same market forces to encourage lots of such connections to happen. The business model would be different: it would have to make money only when such connections happen. If advertising were a part of it at all, it would have to be advertisements shown to the same people who are considering meeting up in real life, and those advertisements would need to be suggestions on where to go and what to do. It would be intention-based advertising, showing a person ads based on their intention to meet up in their local community, rather than targeted advertising, showing the user ads based on who they are.11 It is the sort of business model that Google had back when their motto was “don’t be evil”.

 

Notes