
4. Conclusions

The preceding discussion of relevant technological issues highlighted the contributions of informatics to archaeological prospection. The division into five stages (measurement, data acquisition, processing, visualisation and interpretation) helps to structure the range of applications for a more generic assessment.

Advances in measurements and data recording are closely intertwined with developments in data capture and storage. High-resolution imaging sensors and 'intelligent' data loggers only became possible after advances in computer memory and miniaturised processing power. Conversely, the challenges posed by the newly developed interferometric radar led to advances in computer science. Acquisition procedures of archaeological prospection methods are often governed by available resources (e.g. light aircraft) and customary practice. The latter is sometimes simply derived from archaeological field routine (e.g. geophysical surveys on small grids), and computational solutions have to respect these constraints. In this regard they are subordinate to existing practice and operators' preferences. However, advances in information technology can also stimulate changes to current field routine, as shown by the random recording system described by Sauerländer et al. (1999). Only the advances in Delaunay triangulation, developed for other applications, made measurements without a fixed grid pattern possible. Such a major change in geophysical data acquisition would not have been conceivable (or desirable) prior to a shift in the data manipulation framework.
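To illustrate the principle behind such a random recording system, the sketch below resamples scattered, irregularly positioned readings onto a regular grid using Delaunay-based linear interpolation (here via scipy's griddata, which triangulates the sample points internally). The coordinates and values are invented for demonstration and are not taken from Sauerländer et al. (1999).

```python
# A minimal sketch of gridding randomly positioned measurements via
# Delaunay-based interpolation. All numbers are invented illustrations.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
x = rng.uniform(0, 20, 500)                         # metres, random positions
y = rng.uniform(0, 20, 500)
values = np.sin(x / 3.0) + rng.normal(0, 0.1, 500)  # simulated readings

# Regular 0.25m grid onto which the scattered readings are resampled
gx, gy = np.mgrid[0:20:0.25, 0:20:0.25]

# griddata triangulates the sample points (Delaunay) and interpolates
# linearly within each triangle
grid = griddata((x, y), values, (gx, gy), method='linear')
```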

Although information technology has had a profound impact on measurements and acquisition procedures, data processing and visualisation are clearly the main applications in archaeological prospection. While the benefits of such procedures are often tremendous, they cannot be a substitute for high-quality data in the first place. Data processing only enhances what is already there, and sometimes even introduces undesirable effects (e.g. halos in high-pass filtered data). The old adage 'garbage in, garbage out' is a reminder that the imperfections in poorly collected data (e.g. staggering) are often inconsistent and hence resistant to algorithmic remedies. These limitations of 'black box' processing techniques have to be acknowledged; otherwise an over-reliance on the vast number of processing tools now available may lead to poorer data.
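The halo effect mentioned above can be demonstrated with a minimal sketch. The high-pass filter here (the data minus a smoothed copy of themselves) is a generic textbook construction, not the implementation of any particular survey package.

```python
# Sketch: why high-pass filtering creates 'halos' around strong anomalies.
import numpy as np
from scipy.ndimage import uniform_filter

data = np.zeros((50, 50))
data[24:26, 24:26] = 100.0                 # one strong, localised anomaly

lowpass = uniform_filter(data, size=11)    # smoothed background estimate
highpass = data - lowpass                  # high-pass residual

# Near the anomaly the smoothed background is raised, so the residual
# dips below zero there: a negative halo that was never in the ground.
print(highpass[24, 18:32].round(2))
```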

Processing and visualisation are crucial intermediaries, helping to unlock the information contained in archaeological prospection data. However, the most important stage is their archaeological interpretation. Through an integrated prospection strategy one hopes to advance archaeological comprehension or to answer specific archaeological questions. To this end, the 'hard' data, computationally derived from remote sensing imagery and geophysical surveys, have to be amalgamated with the 'soft' archaeological understanding of landscapes, societies and human behaviour. At this stage the mainly deterministic approaches of information technology clash with the humanities. GIS technology has shown that advances in computing allow a departure from strictly deterministic data treatment (e.g. by using perceptions of space rather than 'least cost surfaces' for predictive modelling (Witcher 1999)). Similarly, it may be expected that soft archaeological knowledge will eventually be incorporated into automated interpretation schemes for archaeological prospection data. For the time being, however, human interpreters are essential for the final analysis of data that have been greatly enhanced and simplified through information technology.
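For contrast, the following sketch computes exactly the kind of deterministic 'least cost surface' that such perception-based approaches react against, using the MCP (minimum cost path) class from scikit-image; the cost raster is invented for illustration.

```python
# Sketch: a strictly deterministic cumulative cost ('least cost') surface.
import numpy as np
from skimage.graph import MCP

cost = np.ones((100, 100))
cost[40:60, 0:80] = 10.0          # e.g. a marsh that is slow to traverse

mcp = MCP(cost)
cumulative, _ = mcp.find_costs([(0, 0)])   # accumulated cost from a site

# 'cumulative' holds, for every cell, the minimum total cost of reaching
# it from (0, 0); thresholding it yields a purely deterministic
# catchment, with no room for human perceptions of space.
```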

One particular example of such computer assistance concerns the use of classification techniques based on several different input parameters (e.g. spectral bands, different geophysical techniques). Combining all data in a multi-layer analysis for their subsequent interpretation is essential. This has always been the approach of human interpreters, comparing maps of different survey results and basing their analysis on a comprehensive understanding of spatial relationships. However, the complexity of emerging patterns rises dramatically with the number of investigated datasets and soon becomes prohibitive for human interpretation. Information technology that automatically simplifies and summarises such hyper-spectral data greatly assists any subsequent interpretation. The use of artificial neural networks may be the best way to expand the remarkable powers of the human brain.
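A minimal sketch of such a multi-layer classification is given below, stacking several co-registered survey layers into one feature vector per grid cell and training a small neural network on cells labelled by a human interpreter. The layers, labels and network size are invented assumptions, with scikit-learn's MLPClassifier as one possible implementation.

```python
# Sketch: classifying grid cells from several co-registered input layers
# with a small neural network. All data here are random stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
mag = rng.random((100, 100))      # e.g. magnetometer survey
res = rng.random((100, 100))      # e.g. earth resistance survey
band = rng.random((100, 100))     # e.g. one spectral band

# One feature vector per grid cell: (n_cells, n_layers)
features = np.stack([mag, res, band], axis=-1).reshape(-1, 3)

# Stand-in for 'archaeology / background' labels from a human interpreter
labels = (mag + res > 1.0).astype(int).reshape(-1)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(features[:5000], labels[:5000])        # train on the labelled cells
predicted = clf.predict(features).reshape(100, 100)  # classify everything
```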

There are a number of prerequisites for achieving such future improvements. First of all, data standards have to be defined so that measurements and processed results can be exchanged more easily between individual researchers and software packages. While not yet widely adopted, GeoTIFF has emerged as a useful standard for the exchange of georeferenced remote sensing data. It is supported by several modern software packages and is a great help when integrating data from different sources. So far there is no accepted standard for the storage of archaeological geophysical data (Schmidt 2002), but a new framework, the Archaeological Grid Format (AGF), is being developed by the author.
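The practical appeal of GeoTIFF is that the georeferencing travels inside the file itself. The sketch below reads a GeoTIFF with rasterio, one of several packages that support the format; the file name is a placeholder.

```python
# Sketch: reading a georeferenced image from a GeoTIFF file.
import rasterio

with rasterio.open('survey_area.tif') as src:   # placeholder file name
    image = src.read(1)          # first band as a numpy array
    transform = src.transform    # affine map: pixel -> map coordinates
    crs = src.crs                # coordinate reference system

# The embedded georeferencing is what allows data from different sources
# to be overlaid without manual registration.
east, north = transform * (0, 0)   # map coordinates of the top-left corner
print(crs, east, north)
```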

In the past, much effort has been spent on the definition of data standards and metadata, but the issue of data quality has been neglected. It is crucial for any data integration and analysis that meaningful information on the accuracy and precision of data is available. For example, if geophysical data are collected with a ground accuracy of 0.01m while satellite images, after rectification, are only reliable to within 2m, little can be made of the spatial relationship between these datasets. It is therefore essential that information on data accuracy is recorded so that automated analyses can take it into account using geostatistical methods.
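The arithmetic behind this example is simple. Assuming independent positional errors, the combined uncertainty of two co-registered datasets is the root sum of squares of the individual uncertainties, and this limits the finest spatial relationship that can meaningfully be analysed:

```python
# Sketch: combined positional uncertainty of two co-registered datasets,
# assuming independent errors (root sum of squares).
import math

geophysics_accuracy = 0.01   # metres, from the example above
satellite_accuracy = 2.0     # metres, after rectification

combined = math.hypot(geophysics_accuracy, satellite_accuracy)
print(f'combined positional uncertainty: {combined:.2f} m')
# ~2.00m: spatial relationships between the two datasets that are finer
# than this cannot be interpreted reliably.
```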

The most pressing issue, however, is access to archaeological prospection data. Geophysical data are not too expensive to commission, and many institutions and units now have their own survey equipment. In Britain, the Archaeology Data Service (ADS) has started to archive geophysical data to make them available to interested third parties. As this process continues, it will become possible to use existing surveys for new investigations. The commissioning of aerial photographs is expensive but, more importantly, results are often unpredictable and depend on climate, weather history, time of day and a very skilled operator. Fortunately, a large archive of photographs is available for inspection and many archaeological sites have already been recorded, for example in the British National Mapping Programme (Bewley 2001). The most expensive source of data is satellite imagery. While the price per covered area is often reasonable, the minimum coverage that has to be bought can make a purchase prohibitively expensive. It is hoped that these data will become more cheaply (or even freely) available; a charging policy that takes the age of images into account (e.g. half price one year after acquisition) would be highly welcome.

Overall, the future for further advances in the computer manipulation of archaeological prospection data looks bright. With new data sources, easier access, better computers and novel processing techniques, exciting new results will become possible.


