Digital Universe – Peter B. Seel
The Tao of Digital Technology – Yin and Yang
Perhaps a Taoist view of digital technology can facilitate a greater understanding of the complex relationships between society and technology. The Taijitu symbol and the yin and yang elements associated with Taoism can be adapted for this purpose. Technology is the lighter, active yang part of the symbol, symbolized in China by fire. There is an elegant symmetry in technology representing the fire element, since the mastery of fire was one of the earliest technologies used by our ancestors. Applied to modern digital telecommunications, the fire of yang represents the pulsing light of lasers circulating information over the internet in millions of fiber-optic cables, and the glow emanating from a computer or phone display. The yin element – the darker, passive half symbolized by water – represents society. In this sense it is helpful to think of society as both active and reactive. Technology evolves to meet human needs, which naturally vary over time. Antibiotics were developed to combat the bacteria and infectious diseases that have caused millions of human deaths over the centuries. Medical science has developed treatments for viral illnesses such as influenza, only to discover to our chagrin that bacteria we thought were under control, such as those that cause tuberculosis, have evolved into drug-resistant strains.
The yin–yang relationship of technology and society is a useful model for tempering the world-view of self-admitted technological determinists such as Thomas Friedman, while allowing that technological developments do, in fact, influence society. The converse is also true: technology is part and parcel of the society created by humankind and is actively shaped by economics, politics, and a host of other human activities and beliefs. The yin–yang relationship of technology and society is a dynamic one, with evolving cybernetic feedback loops. This view of the digital universe allows that "technological capabilities do create intentions," as Friedman noted, while encouraging us to simultaneously consider the Ellulian perspective that "technique" is a much broader concept than tools and machines alone. Technology is embedded in every aspect of contemporary human life and is a response to our needs, both real and imagined. Digital technology takes the merger of society and "la technique" to the next level of embeddedness, as we create and adopt technologies that are mobile, compact, and able to understand and anticipate our needs. Artificial intelligence and agent technology are still at a nascent stage, but combined they will take the embeddedness of digital technology in society to a remarkable – and to some, alarming – new level in this century.
Technology critic Safiya Umoja Noble (Figure 3.3) has studied the effects of employing artificial intelligence (AI) in search technology, specifically in Google’s widely adopted search engine. While searching Google in 2011 for topics of interest to her stepdaughter and nieces, Noble discovered to her great surprise and shock that the first hit for the phrase “black girls” was a pornography website titled “Sugary Black Pussy.com.” Needless to say, she had the presence of mind not to do this search in front of the girls, but the experience motivated an interest in how the algorithms programmed into search engines delivered biased results related to women of color. When she searched for the phrase “why are black women so…,” the top ten results included terms such as “angry, loud, mean, attractive, lazy, annoying, confident, sassy,” and “insecure.” For the parallel search “why are white women so…,” the autocomplete function then built into Google returned these top ten adjectives: “pretty, beautiful, mean, easy, insecure, skinny, annoying, perfect, fake, rude.” While not all of these are laudatory attributes, she noted that six of the first eight adjectives for “black women” were negative, while the first two for “white women” were positive. Professor Noble’s point is that online search results are based on very complex AI algorithms created by human software engineers, many of whom are men.25 She states:
The near-ubiquitous use of algorithmically driven software, both visible and invisible to everyday people, demands a closer inspection of what values are prioritized in such automated decision-making systems … I believe that artificial intelligence will become a major human rights issue in the 21st century. We are only beginning to understand the long-term consequences of these decision-making tools in both masking and deepening social inequality. (emphasis added)26
Most search engines are now programmed to reject pornography sites unless the user specifically asks for unfiltered results. Google has updated its search algorithms so that a recent search for “black girls” returned photos of aerospace engineer Tiffany Davis, and the top two stories were for the pro-digital-literacy group Black Girls Code and a biography of the legendary NASA mathematician Katherine Johnson, the subject of the film Hidden Figures. Another recent Google search, for the phrase “why are Black women so,” returned as its first listing a YouTube (owned by Google) video titled Why Are Black Women So Angry? Produced by Buzzfeed in 2016 and running under six minutes, it features young Black women talking about the source of their anger and why they resent allegations in the media that it is misplaced. The likely reason it ranked first in the search is that the video had been viewed 2.5 million times.
The fact that search engines are providing more equitable and less biased results does not detract from Noble’s central thesis. She argues that a society cannot comfortably sit back and allow decisions about what we can or cannot see to be made by AI algorithms designed by engineers at tech companies to grab and hold our attention. One of the key attributes of YouTube is that the algorithms that automatically serve the next video are programmed to favor more controversial content so that the viewer will stay on the site longer and thus see more advertising, which economically benefits YouTube and its parent company, Google. Some critics have argued that these algorithms serve up increasingly controversial videos that divide already polarized audiences. The “black boxes” of AI-based decision-making are designed by humans, who bring all the biases of their education, class, and gender to the lab as they create the tools that define our online experiences and how we acquire and process information gathered online. In Chapter 12, on privacy and surveillance, we will return to this topic of vital importance to all who live and work in the digital universe.
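A minimal toy sketch can make the dynamic described above concrete. This is not Google’s or YouTube’s actual code; the function, field names, and numbers are all hypothetical. It simply shows how a ranking objective built purely on an engagement signal, with no notion of social consequence, will mechanically promote whatever best holds attention:

```python
# Toy illustration of engagement-first ranking (hypothetical data and names,
# not any real platform's algorithm).

def rank_by_engagement(videos):
    """Order videos by a crude engagement score, highest first.

    Score = views * average fraction of the video actually watched.
    Nothing in this objective considers accuracy, fairness, or polarization.
    """
    return sorted(
        videos,
        key=lambda v: v["views"] * v["avg_watch_fraction"],
        reverse=True,
    )

videos = [
    {"title": "Calm explainer", "views": 500_000, "avg_watch_fraction": 0.40},
    {"title": "Outrage clip", "views": 400_000, "avg_watch_fraction": 0.90},
]

ranked = rank_by_engagement(videos)
print(ranked[0]["title"])  # the clip that holds attention longer ranks first
```

Even though the "Calm explainer" has more raw views, the objective rewards watch time, so the more provocative clip rises to the top, an arithmetic version of the incentive the critics above describe.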