Parry, M. (2010) 'The humanities go Google', Chronicle of Higher Education, June 3.
In 1959, a physicist and novelist at Cambridge University, C.P. Snow, argued that the breakdown of communication between the “two cultures” of modern society — the sciences and the humanities — was a major hindrance to solving the world’s problems.
This article looks at the modern interface of digital science with traditional literature, and judging from the way it is written, that cultural interface remains an ugly mess. The article describes attempts by computer scientists to analyse, quantitatively, all the works captured in Google Books in order to identify (here I got a little lost) trends or themes that have perhaps been overlooked in the past.
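To make concrete what this kind of quantitative analysis might look like (the article itself does not spell out the method, so this is purely an illustrative sketch with a made-up toy corpus, not Google's actual approach), imagine counting how often a term appears in digitised books by publication year to surface a "trend":

```python
from collections import defaultdict

# Hypothetical toy corpus: (year, text) pairs standing in for digitised books.
# Real corpora would be millions of volumes; the principle is the same.
corpus = [
    (1900, "the nature of truth and beauty"),
    (1950, "data and the machine age"),
    (1950, "truth in the age of the machine"),
    (2000, "big data and machine learning"),
]

def term_frequency_by_year(corpus, term):
    """Count occurrences of `term` per publication year."""
    counts = defaultdict(int)
    for year, text in corpus:
        counts[year] += text.lower().split().count(term.lower())
    return dict(counts)

# A rising count across years would be read as a "trend":
# term_frequency_by_year(corpus, "machine") -> {1900: 0, 1950: 2, 2000: 1}
```

The sketch shows exactly what worries me: the word "machine" is counted, but nothing about what "machine" *means* in each sentence is captured at all.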
One word cried out to me when reading this article: epistemology. One reason for the great divide between the cultures is epistemological. The sciences and the humanities approach the question of what is 'true' from completely different perspectives. Writing algorithms to identify common 'themes' through quantitative, statistical analysis seems to me to miss the point about meaning in literature and how it is interpreted.
For computer scientists, 'meaning' is a big black hole into which billions of dollars have been sunk. Just look at the pathetic results from many years' research into speech recognition or artificial intelligence (with respect to meaning, that is – AI has been very successful in other areas). Let's not even consider the semantic web. I really do fear that if, one day, computer scientists do crack the code of meaning, we humans will be redundant, and totally replaceable by machines.
There is so much in this article – the way it is written, the goals the computer scientists have set themselves, the attempt to dehumanize the reading of literature – that terrifies me. We do not need a single, dehumanized, reductionist, computerized culture. Be afraid, very afraid.