Digital technologies have infiltrated nearly all aspects of our existence, covering every natural and cultural urge, from ordering food and checking our bank balances to finding a partner (short term or long term) and paying our taxes.
Our interactions with new digital and computational technologies affect how we think of ourselves and our cultural heritage, both individually and collectively; they influence the ways in which we interact socially and politically; they affect how we define public and private spaces in an increasingly connected world. Our digital legacies even outlast our lives, preserving some part of us once we are gone.
Regardless of our level of involvement, we are all human beings living in the digital era.
Until recently, two metaphors, George Orwell’s Big Brother and Jeremy Bentham’s Panopticon, dominated theoretical work on surveillance. Both tie the conceptualisation of surveillance to totalising systems and to the state. However, these metaphors are poorly equipped to deal with the creeping growth of non-state surveillance by technology companies in recent years. The rise of corporate data collection has created a system of surveillance oriented by economic power rather than by discipline and punishment.
Zuboff clarifies that surveillance capitalism is not a technology but a logic, albeit one that “imbues technology and commands it into action”. Central to Zuboff’s account is a rejection of technological determinism. Systems of power – economic and otherwise – direct technological development. She argues, following Max Weber, that technological development is largely oriented by economics and profit-making. As such, any account of technological development must consider its broader place in a system of economic relations.
How Do You Define Privacy?
- Seclusion from the attention of others,
- Preventing, or selecting, what data about you is gathered,
- Controlling what happens to that data,
- Controlling secondary use of that data by third parties.
“‘It’s private!’ kids are always yelling at their parents and siblings, which suggests that there is something primal about the need for privacy, for secrecy, for hiding places and personal space. These are things we seem to want. But do we have a right to them?” (Menand, 2018).
But is privacy a recent phenomenon? Is it only a concern of the Global North? Has privacy always been the privilege of the wealthy?
“Images of cellars in Manchester, UK. One from 1840s, second from 1897. The caption for the second was that it was advisable to sleep naked so as to avoid touching other people’s lice-ridden clothes.” Prof Paul Pickering, Australian National University.
The notion of privacy as the absence of observation (the idea that we could hide, or keep our data safe) is, in my opinion, an impossibility on the Web. Privacy as I would understand it – as my peers would understand it, as people who have grown up in the same socio-cultural contexts as I have would understand it – is gone. Or at least, huge parts of it are.
But this is not exclusively the fault of the big players like Google, like Facebook. It is not just them who are to blame.
Because even when aware of the challenges and the dangers, billions of users across the globe still opt to share their data, choosing convenience over all other considerations, or acting out of fear of missing out – a phenomenon known as the privacy paradox (Gerber et al., 2018).
Regaining Agency, Representing Diversity
New technologies of AI, deep learning, and machine inference have been applied to these and older datasets and used to determine or suggest outcomes as radically diverse as sentencing individuals to prison or targeting them for prestigious positions – often perpetuating historical biases, privileges, and prejudices based on demographic features such as level of education, socio-cultural background, age, gender, ethnicity, and sexual orientation.
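The mechanism behind this perpetuation is easy to sketch: a model that simply learns from historical outcomes will reproduce whatever disparities those outcomes encode. The following is a minimal illustration in Python, using entirely hypothetical data and a deliberately naive “majority outcome per group” predictor (not any real system):

```python
from collections import Counter

# Hypothetical historical hiring records: (demographic_group, was_hired).
# Group "A" was mostly hired; group "B" was mostly rejected.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

def train_majority(records):
    """'Learn' the most common historical outcome for each group.

    Here the historical disparity itself becomes the model's only signal.
    """
    by_group = {}
    for group, outcome in records:
        by_group.setdefault(group, []).append(outcome)
    return {g: Counter(outcomes).most_common(1)[0][0]
            for g, outcomes in by_group.items()}

model = train_majority(history)
# The "model" simply replays the past: applicants from group A are
# predicted to be hired, those from group B rejected, regardless of merit.
print(model)  # {'A': True, 'B': False}
```

Real deep-learning systems are vastly more sophisticated, but the failure mode is structurally the same: when the protected attribute correlates with the historical label, the bias is learned along with everything else.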
In a digital space where there is nowhere to hide, should we instead strive to regain our agency – that is to say, our control – over how we as a species are represented? If so, we must ensure a more accurate and truthful capture of the richness and diversity with which our species views itself: diversifying information categories, resisting traditional groupings, and opposing reductionist approaches to data. In doing so, we ensure that all the voices of our global human community are heard and equally represented.
But what are the drawbacks and problems of this? What if this data is used to identify and target particular groups? What if representing different genders leaves people who don’t comfortably fit the gender binary vulnerable? What if ethnicity data is used to facilitate genocide against minority groups? Historically, and even today, we have evidence of the monstrous behaviour that oppressive governments can engage in. What if we have the best of intentions, but once collected, the data is used for evil rather than good?
What is needed is a cluster of multifaceted solutions, addressing both the technical and the human components:
- Educating programmers and software engineers in the subtleties of ethics, and providing opportunities to learn from historical mistakes;
- Diversifying datasets to remove biases (both historical and modern) and ensuring they reflect the human species accurately;
- Teaching users of all levels to be aware of covert data collection, and advising them on how to minimise it;
- Improving digital literacy, and alerting users to fake news and targeted political marketing, as a cornerstone of pedagogical programs across all levels and nations;
- Ensuring ethical considerations form an integral part of software development – especially with artificial intelligence, deep learning, and machine inference – as a priority in the workflow rather than a simple add-on.

No single one of these is sufficient on its own; real change requires all of them together.
But even this might not be enough. Can we trust that our governments will use data responsibly, and keep it safe? Even if we can trust our current political leaders, can we guarantee that future leaders won’t have more nefarious agendas?
This isn’t a simple question with a straightforward answer. Realistically, what can we do?
References and Things to Read
Ackland, R. (2018) “We need to talk about the data we give freely of ourselves online and why it’s useful”, The Conversation.
Gerber, N., Gerber, P., & Volkamer, M. (2018). Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior. Computers & Security, 77, 226–261. https://www.sciencedirect.com/science/article/pii/S0167404818303031
Menand, L. (June 2018) “Why Do We Care So Much About Privacy?”, The New Yorker, https://www.newyorker.com/magazine/2018/06/18/why-do-we-care-so-much-about-privacy
Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.