As outlined above, it has become common practice to record archaeological magnetometer and earth resistance surveys on a regular grid (e.g. 0.5m x 0.5m), since even and unbiased coverage can be achieved and gridded data can be stored and processed very efficiently. It has become obvious in recent years (Becker 1995; Neubauer and Eder-Hinterleitner 1997; Schmidt and Marshall 1997) that higher sampling resolutions bring significant benefits for the interpretation of geophysical data. Not only can smaller archaeological features be detected (e.g. Fassbinder and Irlinger (1994) were able to identify individual postholes) but the overall interpretation of data is improved as the full geophysical signature of anomalies becomes apparent. Small-scale magnetic anomalies may reveal themselves with clear dipolar signatures rather than being reduced to a single high reading at lower spatial resolution. Based on such improved resolution, Norton and Witten (1998) proposed an algorithm to remove magnetic dipole signals caused by minor ferrous contamination from magnetometer data. The advent of multi-sensor arrays (Becker 1999) made surveys at 0.25m x 0.25m feasible.
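The effect of sampling interval on an anomaly's recorded signature can be sketched numerically. The snippet below is a hypothetical illustration (a simple vertical-dipole model with an invented depth, not data from any of the cited studies): only the finer profile resolves the negative lobes of the dipolar signature, while the coarse profile reduces the anomaly to little more than a single high reading.

```python
# Hypothetical sketch: sampling interval vs. the recorded signature of a
# small dipolar magnetic anomaly (vertical dipole at an assumed 0.25 m
# depth; amplitudes in arbitrary units, not calibrated field values).

def dipole_bz(x, depth=0.25):
    """Vertical-field anomaly of a vertical dipole buried at `depth`,
    measured along a profile at horizontal offset x (metres)."""
    return (2 * depth**2 - x**2) / (x**2 + depth**2) ** 2.5

def profile(spacing, extent=2.0):
    """Sample the anomaly at regular intervals across the profile."""
    n = int(extent / spacing)
    xs = [i * spacing for i in range(-n, n + 1)]
    return [dipole_bz(x) for x in xs]

coarse = profile(1.0)   # 1 m sampling: negative lobes barely registered
fine = profile(0.25)    # 0.25 m sampling: dipolar signature resolved

# The fine profile records a markedly deeper negative lobe
print(min(coarse), min(fine))
```

With these assumed parameters the 0.25m profile records a negative lobe roughly three times deeper than the 1m profile, illustrating why the full dipolar signature only becomes apparent at higher resolution.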
An entirely different approach is the acquisition of data while walking randomly. 'Scanning' describes a method where the operator continuously assesses the readings of, for example, a magnetometer while walking over a field, dropping markers on the ground where the readings 'seem to warrant it'. The results are not strictly reproducible, are very subjective and depend strongly on the skills of the operator. The data cannot be analysed further and important, but weak, features may easily be missed (Gimson 2001, 25). It is therefore not a recommended technique! If undertaken more scientifically, the position and reading of an instrument carried over a site can be recorded continuously. This method produces data maps that are comparable to conventional survey results. However, the sampling density of such surveys is not uniform: resolution is high along the line of walking, but the distances between lines are larger and irregular. It is crucial to preserve information on this sampling regime so that the resulting maps can be assessed; simple interpolation to a regular grid may therefore be unsuitable. Sauerländer et al. (1999) suggested the use of Delaunay triangulations and their associated Voronoi diagrams for mapping, which corresponds to the use of Nearest Neighbour interpolation on a fine grid (e.g. 0.05m x 0.05m). The resulting polygons honour the original sampling regime. If smooth transitions between data values are required (i.e. interpolation), Natural Neighbour gridding can be used to remove data mismatch at polygon boundaries (Li and Götze 1999).
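This gridding step can be sketched as follows (coordinates and readings are invented, and SciPy is used purely for illustration; it is not the software of the cited authors). Nearest Neighbour assignment onto a fine grid reproduces the Voronoi polygons of the sample points, so the output still honours the irregular sampling regime:

```python
# Sketch: rasterising irregular walked-survey readings onto a fine grid.
# Positions and readings below are invented for illustration.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
points = rng.uniform(0, 10, size=(200, 2))   # irregular sample positions (m)
values = np.sin(points[:, 0]) + 0.1 * rng.standard_normal(200)  # mock readings

# Fine 0.05 m target grid
gx, gy = np.meshgrid(np.arange(0, 10, 0.05), np.arange(0, 10, 0.05))

# Nearest Neighbour: each grid cell takes the value of its closest sample,
# so cell boundaries trace the Voronoi diagram of the sampling positions.
grid_nn = griddata(points, values, (gx, gy), method='nearest')

# 'linear' instead interpolates within the Delaunay triangles, giving smooth
# transitions between values (Natural Neighbour itself is not built into
# SciPy; linear barycentric interpolation is the closest built-in analogue).
grid_lin = griddata(points, values, (gx, gy), method='linear')
```

Note that the linear method leaves cells outside the convex hull of the sample points undefined (NaN), which is itself a useful reminder of where the survey actually collected data.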
The position of the geophysical instrument can be recorded accurately with differential GPS (Sauerländer et al. 1999) but the necessary equipment may interfere with very sensitive magnetometers. This data acquisition method avoids the laying out of predefined grids (e.g. 20m x 20m) in advance of a survey, considerably reducing the overall time spent on a site. However, a drawback is the uneven sampling and the potential to miss small anomalies. Sauerländer et al. (1999) suggested evaluating results continuously and sending a surveyor back to interesting, but under-sampled, locations to acquire additional data. Clearly, this relies on subjective judgement. A prerequisite for such an approach is instantaneous data visualisation during a survey. Data can be transmitted to a base station where powerful computers continuously recalculate the Delaunay triangulation for each new measurement point.
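The base-station idea can be sketched with SciPy's incremental Delaunay mode, which updates an existing triangulation as each new measurement point arrives instead of rebuilding it from scratch (measurement positions are invented; the cited authors' actual implementation is not described in this detail):

```python
# Sketch: incrementally updating a Delaunay triangulation as new
# (GPS position, reading) pairs stream in from the field.
import numpy as np
from scipy.spatial import Delaunay

# Initial batch of measurement positions (m) — invented for illustration
positions = np.array([[0.0, 0.0], [10.0, 0.0],
                      [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
tri = Delaunay(positions, incremental=True)
n_before = len(tri.simplices)

# A new measurement point streams in from the operator's GPS
tri.add_points(np.array([[2.5, 7.5]]))
n_after = len(tri.simplices)   # triangulation refined around the new point

tri.close()  # finalise once the survey stream ends
print(n_before, n_after)
```

Each update is local to the triangles around the new point, which is what makes continuous recalculation during a survey computationally feasible.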
If measurements are recorded at large intervals only, sampling strategies should be selected carefully. For example, magnetic susceptibility surveys are often undertaken with very sparse sampling (e.g. every 20m) and the validity of this approach needs to be investigated. As with any other sampling technique, the intended use of resulting data determines the design of the strategy. In contrast to geochemical measurements, enhanced magnetic susceptibility does not normally diffuse in soil but often varies strongly even over small distances (e.g. over a fireplace). Interpolation of data can therefore only be justified if the soil has been mixed and spread, for example by ploughing. It is hence important to consider whether interpolation of sparse data is appropriate or whether a denser sampling regime is required. It is anticipated that geostatistical methods will be used to assess the validity of certain sampling regimes (Dabas 1999). While the underlying sampling resolution is of crucial importance, the choice of an interpolation algorithm for the resulting data is far less critical.
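One such geostatistical check can be sketched with an empirical semivariogram (the function and data below are illustrative only, not the method of the cited work): if the semivariance is already near its maximum at the shortest sampled lag, the readings vary over distances smaller than the sampling interval and interpolation between them is hard to justify.

```python
# Sketch: empirical semivariogram of (invented) susceptibility readings
# on a sparse 20 m grid, to judge whether interpolation is defensible.
import numpy as np

def empirical_semivariogram(coords, values, lags):
    """Mean squared half-difference of reading pairs, binned by separation."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sv = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi) & (d > 0)  # pairs in this lag bin
        gamma.append(sv[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Invented readings on a 20 m grid with smooth spatial structure
xs = np.arange(0, 101, 20.0)
gx, gy = np.meshgrid(xs, xs)
coords = np.column_stack([gx.ravel(), gy.ravel()])
values = np.sin(coords[:, 0] / 30.0)   # mock susceptibility pattern

lags = np.array([0, 25, 50, 75, 100])
gamma = empirical_semivariogram(coords, values, lags)
print(gamma)  # rising with lag: spatial correlation persists beyond 20 m
```

For this smooth mock pattern the semivariance rises steadily with lag, so interpolation of the 20m samples would be defensible; a flat variogram at all lags would instead indicate variation below the sampling interval, calling for a denser regime.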
© Internet Archaeology
Last updated: Tue Jan 27 2004