5. A depersonalised past?

One aspect of Renaissance thought that has long had currency is a mechanistic (and hence, at least in part, technologically deterministic) view of the world, as proposed by Descartes, Bacon and others, and subsequently rolled into a Darwinian evolutionary model. Machines drove progress: celestial machines powered the heavens and the seasons, iron machines structured and ordered humanity. Cybernetics or systems theory (for example, Bertalanffy 1950; see also Clarke 1968; Trigger 1989, 303ff) can be seen as part of this long process of development, in which the rules believed to govern the behaviour of entities, whether sociocultural systems or computer systems, were formalised. The parallels between computer technology and systems theory, with its talk of feedback mechanisms, equilibrium, goal-seeking, networks, and 'black boxes' with their inputs and outputs, are clear. Indeed, David Clarke claimed that:

'The analog computer and the digital computer can act as precisely such kinds of isomorphic models and their output may be made to represent the "behaviour" of the problem box under investigation'
(Clarke 1968, 59-60).

In other words, archaeological models could be derived, converted into variables and computer algorithms, and outputs generated. If the outputs for a given set of inputs matched the archaeologist's expectations, then the computer model could be said to be a good representation. So in the 1970s much use was made of systems theory in the New Archaeology, a theory which had its origins in a mechanistic view of the world, represented most clearly in the computers which facilitated the archaeological analyses undertaken and which, by definition, required a formalised, algorithmic definition of tasks and data in order to operate at all. While computers cannot be said to have brought this about, it seems more than likely that they contributed to, and in certain respects facilitated, these theoretical developments. In such an environment, quantification and measurement flourished, although numerical methods had been in use in archaeology for at least twenty years before the arrival of the New Archaeology (see Doran and Hodson 1975). Nevertheless, a deterministic relationship between computers and the more general uptake of quantitative methods can immediately be identified — for example:

'A major difficulty with the data analytic approach … was the need for immense numbers of repetitive, if simple calculations, requiring inordinate time and accuracy even for a relatively small problem. With the development in the 'fifties of the powerful electronic computer these difficulties suddenly receded, and data analysis has become a widespread and highly developed approach for solving problems …'
(Doran and Hodson 1975, 4).
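Clarke's notion of the computer as an isomorphic model of a 'problem box' can be sketched in a few lines of code. The following is a purely illustrative toy, not drawn from any published archaeological model: a goal-seeking system with negative feedback, exhibiting exactly the equilibrium behaviour that systems-theory vocabulary describes. All variable names and parameter values here are invented for the sketch.

```python
# A minimal systems-theory 'black box': inputs go in, a formalised
# rule iterates, and the output mimics goal-seeking behaviour.
# All parameters are hypothetical, chosen purely for illustration.

def simulate(population, capacity=1000.0, rate=0.3, steps=50):
    """Iterate a logistic growth rule and return the trajectory."""
    trajectory = [population]
    for _ in range(steps):
        # Negative feedback: growth slows as the system approaches
        # its equilibrium (the 'goal state' of systems-theory talk).
        population += rate * population * (1 - population / capacity)
        trajectory.append(population)
    return trajectory

run = simulate(100.0)
print(round(run[-1]))  # the system settles near its capacity
```

The point of such a model, in Clarke's terms, is that if the simulated trajectory matched expectations derived from the archaeological record, the formalised rule could be claimed as a good representation of the 'problem box' — which is precisely where the black-box criticism discussed below takes hold.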

Indeed, for a long time computer archaeologists have struggled to overcome the widespread perception among many archaeologists and others that computers 'meant' quantification. While Doran and Hodson were anxious to distance themselves from both the New Archaeology and systems theory, they nevertheless recognised that quantification was one of the 'chief props' of the movement (1975, 5-7). However, for some, quantification — and especially multivariate analysis — became a black box into which data were pushed and from which results were generated without any clear understanding of the intervening process (a criticism made by, amongst others, Ruggles (1986)). The demonstration of a statistically significant correlation almost came to be seen as an explanation in itself — a form of computer-based statistical determinism — in spite of the fact that the reasons for the correlation might be archaeologically spurious, and that there may at times be a difference between what is statistically significant and what is perceived to be archaeologically significant.
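The gap between statistical and archaeological significance is easy to demonstrate. In the hypothetical sketch below (the variables and numbers are invented, not taken from any archaeological dataset), two near-unrelated quantities share only a trace of common signal, yet with enough observations the correlation sails past the conventional significance threshold while remaining, on any substantive reading, negligible.

```python
import math
import random

# Illustrative only: two 'variables' sharing a tiny common component.
random.seed(42)
n = 50000
shared = [random.gauss(0, 1) for _ in range(n)]
x = [0.2 * s + random.gauss(0, 1) for s in shared]
y = [0.2 * s + random.gauss(0, 1) for s in shared]

def pearson(a, b):
    """Pearson product-moment correlation coefficient."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / math.sqrt(var_a * var_b)

r = pearson(x, y)
# t-statistic for the null hypothesis r == 0; at this sample size,
# |t| > 1.96 corresponds to 'significance' at the 5% level.
t = r * math.sqrt((n - 2) / (1 - r * r))
print(f"r = {r:.3f}, t = {t:.1f}")
```

Here the correlation explains well under one per cent of the variance, yet the test declares it highly significant: a 'result' the black-box user might mistake for an explanation.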

The black box approach to archaeology is not confined to systems theory or quantification. In general, any computer-based application capable of taking prepared data and generating output with relatively little involvement by the user in between can fall prey to it. Computers have facilitated a push-button approach to archaeology in which tools or techniques can be applied blindly in the knowledge that something will transpire: not so much Clarke's ideal isomorphic black box as one in which the intervening model is poorly understood, if at all. Although much of this relates at first sight to archaeological methodologies that belong to the past, it is clear that the hangover from those days survives in various forms. Artificial intelligence and expert systems applications are perhaps the most extreme example, purporting as they do to model archaeological knowledge and reasoning about that knowledge, albeit for the most part in very restricted areas (see Huggett and Baker (1985) for a discussion and critique, and Francfort (1992) for a response). Geographical Information Systems, combining quantitative methods, systems approaches, increasingly user-friendly interfaces, ease of data input, and (at least on the surface) push-button tools guaranteed to generate some kind of typically brightly coloured output, have the potential to share many of the same characteristics.

Although archaeological theory and methodology have moved on since the 1970s, to a not inconsiderable extent computers are still associated with an explicitly 'scientific', reductionist, processual approach to the subject. Only since the late 1990s have computers come to be seen as a means of realising the post-processual objectives of some present-day archaeologists: they are said to offer the flexibility, inter-textuality, fluidity, and so on that are increasingly in vogue (for example, Hodder 1997; 1999). Whether or not these claims in fact misrepresent the technological capability is discussed elsewhere (Huggett 2000, 16-18). A paper by Gary Lock (1995) to some extent attempts to bridge the processual/post-processual divide with reference to computers, discussing the relationship between data models, digital models, and theoretical models (and using a model to do so!). The black box has become an iterative web, loosening the ties of confirmatory hypothesis testing, and the computer provides less of the 'hammer-and-anvil procedures to beat out archaeological theory from intransigent data' (Clarke 1973, 9) and more of an exploratory tool.

Despite this, as Jayne Gidlow (2000) argues, many post-processual archaeologists still see the computer as an ahistorical tool: hardly surprising, given the way in which, for example, many GIS-based analyses revisit processual, deterministic models (for instance, see the exchanges regarding environmental determinism by Gaffney and van Leusen (1995), Kvamme (1997), and Wheatley (1998)). In the process, archaeological intuition and speculation can seem to be replaced by dispassionate, logical, mechanistic procedures, and archaeologists and their subjects of study — people in the past — become increasingly remote and alienated from each other: quite the opposite of what the various post-processual strands seek to achieve. Recent work in relation to GIS (for example, Bell and Lock 2000; Llobera 1996; 2000; Wheatley and Gillings 2000; and see Wheatley, this volume) and computer modelling (for example, Gillings and Goodrick 1996) gives cause for hope in this respect. However, none of these examples explicitly considers that, while the use of computers in archaeology may very well be predicated on the properties of the digital model (Lock 1995, 14, and discussed further below), that very digital model is itself predicated on the properties of computers, as argued here.

© Internet Archaeology URL: http://intarch.ac.uk/journal/issue15/4/jh5.html
Last updated: Wed Jan 28 2004