The Power of Big Tech

Reading Time: 7 minutes

Speech by Brid O’Donnell, Auditor of the College Historical Society, Trinity College Dublin, on the occasion of the 251st session of the oldest student society in the world.


The topic “The Power of Big Tech” is surprisingly poignant to me. For most of my life, I never found a passion, a cause, an area that instilled such drive in me that I desperately wanted to change the world. Not to say I wasn’t ambitious, but I cared about many things. In the battle between breadth and depth, I have always chosen breadth, and that’s part of the reason debating appeals to me. Debating is and always will be an activity for people who like talking, with authority and conviction, about many topics without being an expert in any.

Regardless, I didn’t have much passion for a single topic until roughly two years ago. I remember I was in the Hist Committee Room discussing China’s use of technology to control its population. The discussion steered towards the comparison between the West and China’s use of surveillance. I believe it was Caoimhin Hamill who first mentioned Surveillance Capitalism to me during that conversation, so please blame him if this speech is unwieldy.

Since that conversation, I have been consumed. I read The Age of Surveillance Capitalism by Shoshana Zuboff, listened to every tech policy podcast I could find, wrote a paper on cyber-insurance, learned about cybersecurity and worked as a digital marketer for a tech start-up. I experienced the power of Big Tech from both the user end and the business end, and both perspectives are terrifying, in my opinion. All of this knowledge and experience has led to the theme this evening being The Power of Big Tech.

Three policy areas concern the power of Big Tech. Each of these policy areas has its own unique problems and solutions. They all intersect, but they are different, and they impact companies differently. Many subtopics I mention deserve speeches in their own right, but we don’t have all evening. At the very least, we should understand the basics.

The first is antitrust, or rather the economic and financial dimension. Antitrust indeed concerns BIG tech, the likes of Google, Apple, Facebook and Amazon. Google has no real search engine competitors, so the quality of its services has decreased; you get served ads when ten years ago you would have gotten valuable, relevant information. Apple gatekeeps the apps allowed on iOS and charges a hefty tax to those on the App Store. Facebook bought up competitors like WhatsApp and Instagram, and if it can’t acquire a company, it clones it, as in the case of Snapchat and Instagram Stories. Amazon uses its data to undercut the prices of smaller competitors.

Antitrust is arguably the most straightforward policy area to understand; at least there is some precedent, not just with Microsoft during the 1990s but also with the railroad monopolies of the 1890s. Monopolies and anti-competitive practices have always been a feature of capitalism; the only difference in this age is the scale. These companies’ stocks have been booming in a time of unprecedented economic hardship, hitting record values. Their founders and CEOs are multi-billionaires, among the richest in history. They are the oil barons of our age.

The second area of interest is privacy. Data protection is now practically a buzzword, thanks to GDPR. However, these laws’ effectiveness is still up in the air. So long as mile-long terms and conditions exist, so long as we consent at the push of a button, our privacy is an easily transferable commodity. Cybersecurity also has a significant influence on this area. Only last week, Facebook suffered a leak of 500 million accounts. Hacks are the norm, and bad actors will continually invade your privacy and use it for financial and political gain. This is particularly dangerous on the geopolitical scale, with nations like Russia and China spying on and stealing intellectual property from the West. However, I think there is an alarming trend in this area with considerable societal impacts, and I will develop this point later in my address.

Finally, you have content moderation. I believe this is the most political of the areas, especially after the Trump era. Its impact on democracy can easily be demonstrated by the increase in polarisation, though an obvious example of content moderation failing was the Russian interference in the 2016 US election. Thanks to Section 230, better known as the 26 words that created the internet, websites are neither utility providers nor publishers; they get the best of both worlds and have little liability but much control. Their moderation decisions are incredibly political and impactful. Facebook has even skirted its moderation responsibility in the form of its Oversight Board – an “independent” body created to help Facebook answer some of the most challenging questions around content moderation: what to take down, what to leave up, and why.

But this highlights the more significant issue. By creating this quasi-supreme court, Facebook participates in state-building; it is creating its own judiciary, with thousands of moderators on zero-hours contracts as its police force. Facebook has 2 billion users, a higher population than any nation-state globally, and in this quasi-state, Mark Zuckerberg is King.

The genuinely horrifying reality of this situation is not the fact that Mark Zuckerberg is an unelected billionaire with no accountability to Facebook’s users. No, it’s that Mark Zuckerberg has no accountability even to Facebook’s stockholders. Thanks to Facebook’s dual-class share structure, Mark has an enormous amount of voting power. And many tech companies share similar structures, with founders and a small number of individuals possessing outsized control over their activities. Therefore, we cannot expect corporate governance to rein in the power and influence of Big Tech.

At this point, it probably seems like I am scaremongering. Big Tech companies have done much good in the world, and I don’t want them to disappear. I enjoy their services and admire their innovation. But we should understand the dangers. There are the obvious ones: the manipulation of democracies, the growing wealth inequalities and the cybersecurity risks. However, I want to focus on one abstract consequence – the loss of your private self.

I alluded earlier to the terrifying consequence of a slow invasion of privacy. To best explain this, I will use the metaphor of the front stage and the backstage. In this metaphor, you are a performer. When you are on the front stage, you project your public self to the world; you are aware that you are being observed and act accordingly. When you are backstage, you are by yourself, completely unobserved, your private self, and you get to relax and be free. There is nothing fundamentally wrong with your public self, but you cannot spend your entire life on the front stage; you need both aspects to thrive.

When Big Tech is constantly rendering parts of your life into data it can use for profit, you notice, even subconsciously, and act accordingly. As this trend continues, there is less space and time for us to be backstage, our private selves. As someone who has grown up with social media, my mind feels like it has been on the front stage for years. Social media and Big Tech are all-encompassing; they touch every part of my life, and there are few times when I am completely separated from them. There is a reason it is called surveillance capitalism; it is always watching.

Maybe I need a practical example to illustrate how Big Tech invades our private selves. Spotify Wrapped. This is where Spotify tells you your top songs of the year, along with many other figures as minuscule as “you listened to this particular song this many times on this day months ago”. I think this is an excellent example because while music is a public good, listening to music in the comfort of your own home is often a private action.

What Spotify is doing is rendering that private action into data it owns. They analyse it and then commodify it, both for advertisers and for the wider world. We as users are actively encouraged, practically manipulated, to share our Spotify Wrapped online, to be our public selves explicitly. And this is now normal; people share more of themselves online. This will continue because surveillance capitalism needs that data for growth. The information is permanent, and your “friend group”, the people who can see it, is far larger than you can imagine.

On an individual level, you may not think this is a big deal or that it affects you, but Big Tech is all about the power of scale. On an aggregate level, this has an impact. It is causing a cultural and sociological change – creating a society where we are always observed, where we should always share our entire lives online. This intrusion has a mental impact, as evidenced by the increasing mental health issues in younger generations, and a philosophical one.

Think of Schrödinger’s cat. When the cat is in the box, unobserved, the cat is both dead and alive. It is only when we observe it that its fate is sealed. I like to think our public and private selves operate in this way. When we are backstage as our private self, anything is possible, the future is uncertain, and that’s wonderful. It’s when we become our public selves that things become certain, which is also good. However, when forced to perform as our public selves constantly, we lose the uncertainty that makes life and the future exciting and meaningful. Shoshana Zuboff describes this as the “right to the future tense”. Yes, this is an intangible concept, but isn’t that what so many of our rights are?

To finish this address, I want to draw a comparison between climate change and surveillance capitalism. On the surface, the two may seem drastically different, but both are crises that we have to deal with in the twenty-first century, and the fundamentals of what makes each so difficult to address are the same. Climate change was caused by corporations claiming the commons, whether it’s the air, oceans or natural resources. Surveillance capitalism was created by corporations claiming our personal data. In both situations, corporations are pacifying us by selling us convenience. It’s easy and cheap to fly abroad for a holiday or to buy new clothes. It’s free to connect with friends over Facebook or to ask Google for information. It’s no surprise we as consumers take this deal; it’s excellent in the short term. But we now have long-term problems, and now we have to deal with them.

Can we fight these problems individually? Sure, but only partially. You can reduce, reuse, recycle, but corporations are still burning fossil fuels. You can delete your social media or become a hermit in the woods, but these corporations are still going to influence our democracy.

So I suggest that we fight these problems collectively.

I won’t pretend to have all the solutions at this very moment. You need innovative and novel ideas to overcome each of these issues and to build structures that can regulate Big Tech without removing the many benefits it brings to the world. To implement these solutions, governments will have to take action and stand up to the lobbyists and special interests motivated by profit. There is a long road ahead of us. However, I think the first step is education. We need to identify the dangerous incentives in the system, understand the powers these companies possess and label the very mechanisms of surveillance capitalism. Start educating yourself and then inform the people around you.

But most importantly, what you need to do is start imagining new rights. Digital rights that were barely imaginable 50 years ago but have practical use in this new age. New rights that we accept as inalienable.