Waking Up from the Internet: A Digital Nightmare Dressed Like a Daydream

This review of related literature was originally written in February 2018 for a subject in my Master's in Digital Communication Leadership programme and is now being published online with a few minor revisions suggested by my professor. I chose this topic because personal values versus work was a huge subject of debate in a UX Philippines Facebook thread, and I decided to make it public after hearing Mike Monteiro’s How to Build an Atomic Bomb talk at UX Copenhagen. This piece is over 4,000 words long; if you don’t feel like reading everything, I suggest at least reading Mule Design’s Code of Ethics.

And yes, I took the title from a Taylor Swift song (please don’t sue me).

Introduction

The Digital Dream of the Good Society

Left-wing academic Lawrence Lessig was optimistic about the market that digital technologies would produce (2004). The new market would be more competitive and would have a more diverse range of creators, who could earn more, on average, than they had before (Lessig, 2004). Paul Simon, the singer, described Web 2.0 as a “fire… for vigorous new growth” (Keen 2015, p. 141). Consistent with this, Robin Mansell, professor of new media, stated that “each new generation of technology is presumed, on balance, to be consistent with human well-being, democracy, and freedom” (2012, p. 16). The vision of these technologists is said to be in line with the notion of the “good society”: in time, everybody will benefit from technological progress (Mansell 2012).

The goal of automation that Negroponte predicted “was consistent with the prevailing social imaginary of a world in which ‘man’ could ‘better review his shady past and analyze more completely and objectively his present problems’, in the interests of building the good society” (Mansell 2012, p. 96). Technology proponents all believed that “the Internet was the answer… because it ‘democratized’ media, giving a voice to everyone, thereby making it more diverse” (Keen 2015, p. 140). Kevin Kelly, founding editor of Wired magazine, said that Web 1.0 would give everything away for free, and Dale Dougherty, O’Reilly Media co-founder, said that in Web 2.0 everybody could become a writer or musician (Keen 2015). In Web 2.0, people could produce content because they no longer needed the gatekeepers (Keen 2015).

Cultural anthropologist Adam Fish said that the rights that protect free speech, “an essential right to information exchange” (Halleck, as cited in Fish 2017), in person and in traditional media would also apply to new technologies (2017). If old media was “parochial, self-interested and sexist” (Keen 2015, p. 149), then Web 2.0 social networks such as Reddit and Twitter would give voice to the voiceless; even people who are not usually eloquent have a human right to participate (Fish 2017).

According to creative industries professor Axel Bruns, the Internet makes possible collective project communities in which all participants are assumed to have a worthy contribution to make (2008). Since there is no pre-filtering, it is easy to join, and hierarchy is determined by cooperation (Bruns 2008). On Slashdot, a citizen journalism platform that requires no accreditation, users feel that the broad diversity of contributors acting as critics has built greater trust in the site (Bruns 2008). Produsage communities are organized through “ad hoc forms of governance”, a development predicted in the 1970s by futurist Alvin Toffler, who wrote: “We are witnessing not the triumph, but the breakdown of bureaucracy… the arrival of a new organizational system that will increasingly challenge, and ultimately supplant bureaucracy. This is the organization of the future” (Bruns 2008, p. 26).

In order to reduce human distortions such as “desires, prejudice, distrust of outsiders” that could affect decisions, traditional industry handed its work over to the supposedly unbiased machine, as mathematician Cathy O’Neil narrates in Weapons of Math Destruction (2016). Nowadays, however, the world’s automated systems are feeding on garbage data. Only humans can identify the mistakes these machines are making, but since catching them is not the market’s top priority, and because it would cause inefficiency, humans are discouraged from interfering (O’Neil 2016).

Competing Visions

More Transparency and Loss of Individual Freedom vs. More Anonymity and Bullying

Like Bentham with his greatest happiness principle, Zuckerberg oversimplifies human beings into a quantifiable code of happiness or pain, like a “cost-benefit expert on a grand-scale” (Keen 2012, p. 61). Keen believes that Mark Zuckerberg is “wrong that this shared future makes us more human”; rather, it creates a “vicious cycle of less and less individual freedom, weaker and weaker communal ties, and more and more unhappiness” (Keen 2012, p. 66).

On the flip side, in defense of transparency: when people can hide behind a screen, they reveal the worst of humanity. Amanda Todd, a 15-year-old girl, committed suicide after three years of cyberbullying (Ess 2010). The Internet was supposed to empower people like her; instead, it has compounded hatred toward the very defenseless people it was supposed to empower (Keen 2015, p. 149–150). Despite this, Silicon Valley continues to pour funding into anonymous networks and apps such as Secret and Whisper (Keen 2015).

Discipline and Order vs. Mass Surveillance and Loss of Privacy

Deuze, citing Foucault, Deleuze, and Mattelart, argues that this discipline is no longer only in the hands of a powerful few but has become part of everyday life (2012). “Discipline, therefore, is enforced as well as (potentially) subverted by all individuals in everything they do” (Deuze 2012, p. 107). Since media and power are everywhere yet nowhere, the surveillance of new media is becoming mundane, and almost desirable as well (Deuze 2012).

Convenience and Efficiency vs. Loss of Autonomy/Privacy

Freedom of Speech vs. Misinformation and Propaganda

Problem Solving vs. Technosolutionism

Their Electronic Daydream, Our Digital Nightmare

The Effect on our Brains

Workers are not only expected to be disciplined and efficient but also to rack up achievements, philosopher Byung-Chul Han observes (2015). In the twenty-first century, everyone is an “entrepreneur of themselves” (Han 2015, p. 8). While the early cultural achievements of humanity were attained through deep contemplation, David Brooks, a New York Times writer, states that achievement has now been redefined as the ability to attract attention (Keen 2012), which can be done much faster.

Immersive reflection is replaced by hyperattention: “a rash change of focus between different tasks, sources of information, and processes characterizes this scattered mode of awareness”, which “has a low tolerance for boredom” and leaves no room for the “profound idleness that benefits the creative process” (Han 2015, p. 13). Han calls this overachieving, constantly tired and exhausted society the burnout society (2015).

Power Inequality and Erosion of Trust

The question is: who are they making it better for? Mansell notes that the knowledge and skills needed to develop and understand powerful algorithms are a privilege, and that those who lack them are excluded from the conversation about building a good society, leading to power imbalances (2012). The World Bank and the UN usually frame the digital divide in terms of Internet availability (Mansell 2012), in a market where ICT providers have no desire to work in poor areas with a low return on investment (McChesney 2013); on a deeper level, however, the digital divide is about the power gap and the distribution of information resources (Schiller 2007). Unless this is addressed, the information asymmetry is a threat to democracy (Schiller 2007).

Even peer-to-peer exchange, which was designed to democratize the Internet, is power-imbalanced, according to NYU business professor Arun Sundararajan (2016). In Uber, for example, a passenger doesn’t know the intentions of the driver, and in Airbnb, the home owner knows more about the accommodation than the traveller. The producer is the one with the power, agrees communications professor Robert McChesney (2013). “They may give the people what they want, but only within the range that is most profitable for them” (McChesney 2013, p. 74). Another example is the massive open online course (MOOC), which was intended to democratize education, but as William Deresiewicz notes, “That is just their cover story… They’re reinforcing existing hierarchies and monetizing institutional prestige” (Keen 2015, p. 145).

As power is increasingly expressed algorithmically, the digital divide grows even wider, law researcher Frank Pasquale notes (2015). Automated decisions, from optimizing search engines to recommending restaurants, apply thousands of rules in a fraction of a second; they are treated as technical problems rather than questioned for their fairness, and the values behind them stay hidden in the algorithmic black box (Pasquale 2015).

For all the talk about making the world a better place, the ability to answer these questions about social justice and technological capability lies in only a few powerful hands with access to the world’s data (Morozov 2018). No wonder Keen concludes that trust in authority has become the greatest casualty of this society, as big tech’s black-box algorithms challenge it (2015). Alongside that, “trust is coming to be regarded as relating to the trustworthiness of the software ‘system’, not the human beings who design and manage it” (Mansell 2012, p. 112).

Lucidity in Design

“Every technology is an expression of human will”

(Carr 2010, p. 44)

Relying purely on data-driven decisions made by machines, which are not neutral (O’Neil 2016, p. 171), allows tech designers to elude responsibility.

Every Design Action has a Reaction

Showing a Number or Adding a Word can Manipulate and Lead to Conformity

Behavioral economist Richard Thaler and Harvard law professor Cass Sunstein write about the nudge effect in the artificial music market study by Matthew Salganik et al. (2008). They discovered that individuals were “far more likely to download songs that had been previously downloaded in significant numbers, and far less likely to download songs that had not been as popular” (Thaler and Sunstein 2008, p. 62). The success of a song depended on whether the number of previous downloads could be seen, which leads us to believe that the music industry could manipulate us into conforming and listening to the same tracks (Thaler and Sunstein 2008).
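The feedback loop behind this finding can be illustrated with a toy simulation (my own sketch, not Salganik et al.'s actual experimental setup): when download counts are visible, each new listener's choice is weighted by previous downloads, so early leads compound; when counts are hidden, choices stay independent.

```python
import random

def simulate_market(n_songs=5, n_listeners=1000, show_counts=True, rng=None):
    """Toy cumulative-advantage model of the artificial music market.

    Illustrative only: song appeal is assumed equal, and 'social proof'
    is modelled as choice weights proportional to prior downloads.
    """
    rng = rng or random.Random()
    downloads = [0] * n_songs
    for _ in range(n_listeners):
        if show_counts:
            # Social proof: popular songs are more likely to be picked again.
            weights = [d + 1 for d in downloads]
        else:
            # Counts hidden: every song is equally likely to be chosen.
            weights = [1] * n_songs
        song = rng.choices(range(n_songs), weights=weights)[0]
        downloads[song] += 1
    return downloads
```

Running both conditions with the same listeners typically shows a far more skewed distribution when counts are visible, which is the "rich get richer" dynamic Thaler and Sunstein describe.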

User experience consultant Chris Nodder adds another choice architecture example in the design of Microsoft’s automatic updates in Windows XP (2013). “Adding a word such as ‘recommended’ or ‘preferred’ can either rely on social proof (most people do this) or authority (we say you should do this)” (Nodder 2013, p. 52).

Default Settings can Betray

Everything is Political

Wake Up, Designers!

Design, of course, is not limited to what users can see; Negroponte predicted that interfaces would be less about look and feel and more about the intelligence behind them (1995). Nor is it limited to algorithmic design: business and community models also have to be carefully thought out, like the produsage model (Bruns 2008). “Those who choose to compose and disseminate alternative value systems may be working against the current and increasingly concretised mythologies of market, church and state, but they ultimately hold the keys to the rebirth of all three institutions in an entirely new context” (Bruns 2008, p. 89).

Introducing a Code of Ethics in Design

Dutch philosopher Henk Oosterling calls this new movement relational design, which “is the overture to a creative lifestyle whose cornerstones will be ecopolitical sustainability and geopolitical responsibility… for a revaluation of some of its inherent values, such as responsibility, honour and respect, so as to limit the excesses of hyperindividualism and hyperconsumerism” (2009, p. 19). Likewise, interaction design professor Yvonne Rogers believes that Human Computer Interaction is becoming more transdisciplinary (2012).

Because of the complexity of the Internet, which involves multiple agents, both designers and users as well as machines and networks, media studies professor Charles Ess recommends that ethics, instead of being thought of as an individual duty, be thought of within a shared and distributed responsibility framework (2010). In his book Digital Media Ethics, he compares moral absolutism and relativism and concludes that pluralism and dialogical approaches form a good framework for recognizing diverse ethical views.

Business writer Nir Eyal suggests that more designers run thought experiments like the “Regret Test” (2017). Nodder’s Evil by Design catalogues many more evil interaction design patterns, and he suggests a design activity: think of a product, flip a coin (heads is “good”, tails is “evil”), randomly pick a pattern, and try to imagine the product designed in that manner (Nodder 2013).
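Nodder's coin-flip activity is mechanical enough to sketch in a few lines of code (an illustrative toy; the pattern list below is my own shorthand, not his actual catalogue):

```python
import random

# A small, illustrative subset of persuasion patterns (hypothetical list).
PATTERNS = ["social proof", "scarcity", "defaults", "framing", "anchoring"]

def design_exercise(product, rng=random):
    """Generate one round of Nodder's exercise: coin flip for intent,
    then a randomly drawn pattern to reimagine the product with."""
    intent = "good" if rng.random() < 0.5 else "evil"  # heads/tails
    pattern = rng.choice(PATTERNS)
    return f"Redesign {product!r} to be {intent} using {pattern}"

print(design_exercise("a to-do app"))
```

The point of the exercise, as Nodder frames it, is that the same pattern can serve either intent; the code just removes the excuse of not knowing where to start.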

Conclusion

Bibliography

1. Bruns, A. (2008). Blogs, Wikipedia, Second Life, and Beyond: From Production to Produsage. New York: Peter Lang.

2. Carr, N. (2011). The Shallows: What the Internet Is Doing to Our Brains. New York: W. W. Norton & Company.

3. Castells, M. (2010). The Rise of the Network Society (Second Edition). Wiley-Blackwell.

4. Deuze, M. (2012). Media Life. Cambridge: Polity.

5. Ess, C. (2010). Digital Media Ethics (Reprint). Cambridge: Polity.

6. Eyal, N. (2017). “Designers Need the Regret Test”, Words That Matter. Retrieved from https://medium.com/wordsthatmatter/designers-need-the-regret-test-86ef957e0d34

7. Fish, A. (2017). Technoliberalism and the End of Participatory Culture in the United States. Springer.

8. Han, B. (2015). The Burnout Society. Redwood City: Stanford Briefs.

9. Keen, A. (2013). Digital Vertigo: How Today’s Online Social Revolution Is Dividing, Diminishing, and Disorienting Us. New York: St. Martin’s Griffin.

— (2016). The Internet Is Not the Answer. New York: Grove Press.

10. Lessig, L. (2004). Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity. Penguin.

11. Mansell, R. (2012). Imagining the Internet: Communication, Innovation, and Governance. Oxford, UK: Oxford University Press.

12. McChesney, R. W. (2013). Digital Disconnect: How Capitalism Is Turning the Internet Against Democracy. New York: The New Press.

13. Morozov, E. (2011). The Net Delusion: The Dark Side of Internet Freedom (1st ed.). New York: Public Affairs.

— (2014). To Save Everything, Click Here: The Folly of Technological Solutionism. New York: Public Affairs.

— (2018). “Die Menschen müssen die Daten der Internet-Giganten zurückerobern” [“People must reclaim their data from the Internet giants”]. Süddeutsche Zeitung. Retrieved from http://www.sueddeutsche.de/digital/digitale-abhaengigkeit-die-menschen-muessen-die-daten-der-internet-giganten-zurueckerobern-1.3828542

14. Negroponte, N. (1996). Being Digital. Vintage.

15. Nodder, C. (2013). Evil by Design: Interaction Design to Lead Us into Temptation. New Jersey: Wiley.

16. O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books.

17. Oosterling, H. (2009). “Dasein as Design, Or: Must Design Save the World?” Melintas, 25(1), 1–22. Retrieved from http://journal.unpar.ac.id/

18. Pasquale, F. (2016). The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press.

19. Schiller, D. (2007). How to Think about Information. University of Illinois Press.

20. Sundararajan, A. (2016). The Sharing Economy: The End of Employment and the Rise of Crowd-Based Capitalism. Cambridge, Massachusetts & London, England: MIT Press.

21. Thaler, R., & Sunstein, C. (2009). Nudge: Improving Decisions About Health, Wealth, and Happiness. London: Penguin Books.

22. Wong, J. (2017). “How Big Tech Finally Awakened to the Horror of Its Own Inventions”, The Guardian. Retrieved from https://www.theguardian.com/media/2017/dec/20/facebook-twitter-mental-health-sean-parker

23. Wu, T. (2016). The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York: Knopf.
