In an article on The Conversation titled “Algorithms have already taken over human decision making,” Dionysios Demetis says,
In the past, we humans used technology as a tool. Now, technology has advanced to the point where it is using and even controlling us.
We humans are not merely cut off from the decisions that machines are making for us but deeply affected by them in unpredictable ways. Instead of being central to the system of decisions that affect us, we are cast out into its environment. We have progressively restricted our own decision-making capacity and allowed algorithms to take over. We have become artificial humans, or human artefacts, that are created, shaped and used by the technology.
Examples abound. In law, legal analysts are gradually being replaced by artificial intelligence, meaning the successful defence or prosecution of a case can rely partly on algorithms. Software has even been allowed to predict future criminals, ultimately controlling human freedom by shaping how parole is denied or granted to prisoners. In this way, the minds of judges are being shaped by decision-making mechanisms they cannot understand because of how complex the process is and how much data it involves.
The last sentence reminded me of Georg Simmel’s concept of “the tragedy of culture.” Simmel argued that through increasing division of labor and specialization, society creates technologies and sciences that no single individual understands. Many individuals can understand small parts of what is created but no one understands the whole thing. For instance, in a highly specialized field like chemistry, no single researcher can actually understand more than a small fragment of the body of knowledge.
One danger in this context is that we might come across someone who gives the impression that he understands everything, a cult leader like Trump, and that people will follow him into the abyss.
Here is a concise explanation of the tragedy of culture:
Simmel viewed human culture as a dialectical relationship between what he termed “objective culture” and “subjective culture.” He understood “objective culture” as all of those collectively shared human products such as religion, art, literature, philosophy, rituals, etc. through which we build and transform our lives as individuals. “Subjective culture,” in turn, refers to the creative and intelligent aspects of the individual human being, aspects of ourselves that Simmel argued could only be cultivated through the agency of external or “objective” culture.
The Tragedy of Culture, Simmel theorized, occurred as societies modernized and the massive amounts of objective cultural products overshadowed (and overwhelmed) the subjective abilities of the individual. Presented with more options than one person can possibly ever hope to experience in a lifetime, the modern individual runs the risk of stunting his or her social psychological growth.

Source: http://routledgesoc.com/category/profile-tags/tragedy-culture
When “objective culture” becomes too complex for individuals to comprehend, people specialize. Scientists and scholars specialize more and more finely, which means they “know more and more about less and less until they know everything about nothing.”
The more information an AI system acquires, the better it tends to perform. The same is not true for humans: we can only handle so much information, and we rely on complexity-reduction strategies to process it. According to Ethan Bernstein,
AI can filter floods of information—from our email, apps, calendars, social media, Web browsers, news services, enterprise workflow apps, systems of record, monitoring devices, wearable sensors, video camera feeds—and make sense of it. All in real time. While we humans can only handle so much data, AI systems get smarter with more information.