I want to pass on a useful précis of two books by Yuval Harari, prepared by my colleague Terry Allard for a meeting of the Chaos and Complexity Seminar at the University of Wisconsin–Madison (where I am an emeritus professor and still maintain an office during my summers in Madison, away from Austin, TX). Here is his summary of Harari's "Sapiens: A Brief History of Humankind" and "Homo Deus: A Brief History of Tomorrow."
In these two volumes, historian Yuval Harari reviews the successive transformations of humanity
and human civilizations from small bands of hunter-gatherers, through the agrarian and
industrial revolutions to today’s scientific revolution while reflecting on what it means to be
human. Our collective belief in abstract stories like money, corporations, nations and religions
enables human cooperation on a large scale and differentiates us from all other animals.
Today’s discussion will focus on a possible transition from the humanist values of individual
freedom and “free will” to a disturbing dystopian future in which individualism is devalued and
people are managed by artificially intelligent systems. This transition is enabled by reductions
in famine, plague, and war, the scourges that have historically motivated human behavior. Further advances
in biotechnology, psychology and computer science could produce a superhuman elite having
the resources and opportunity to benefit directly from technological enhancements while
leaving the majority of humankind behind.
Allard's suggested discussion questions:
1. Do technology, social stratification, and empire enhance the human experience?
Are we happier than hunter-gatherers?
2. What is humanism?
3. Are people really just the sum of their biological algorithms?
4. When will we trust artificial intelligence? Is AI the inevitable next evolutionary step?
5. What do we (humans) really want the future to be? What are our transcendent values?
Harari quotes from an interview in The Guardian (19 March 2017):
Humanity’s biggest myth? “[that by] gaining more power over the world, over the environment, we will
be able to make ourselves happier and more satisfied with life. Looking again from a
perspective of thousands of years, we have gained enormous power over the world and it
doesn’t seem to make people significantly more satisfied than in the stone age.”
On Morality: “we are very close to really having divine powers of creation and destruction. The
future of the entire ecological system and the future of the whole of life is really now in our
hands. And what to do with it is an ethical question and also a scientific question.”
On Inequality: “With the new revolution in artificial intelligence and biotechnology, there is a
danger that again all the power and benefits will be monopolised by a very small elite, and most
people will end up worse off than before.”
On timing: “I think that Homo sapiens as we know them will probably disappear within a
century or so, not destroyed by killer robots or things like that, but changed and upgraded with
biotechnology and artificial intelligence into something else, into something different.”