
The Mobile Phone in Late Medieval Culture¹

John Robb

Cite this as: Robb, J. 2021 The Mobile Phone in Late Medieval Culture, Internet Archaeology 56. https://doi.org/10.11141/ia.56.7

Introduction

I wrote the first version of this story in 2010. Since then, some of the trends foretold here have come true, one by one. I use my smartphone (which I first got in 2015) much more often to take pictures, read the news, play games and do research than I use it to talk to people. For more techno-savvy people than I am, phones have absorbed many more functions (GPS, exercise coach, sleep monitor, payment card, airline and train ticket, bank portal, remote control for home appliances ...). Though a lot of the developments mentioned below remain well in the future, mobile phones are well on their way to becoming universal devices. Their social centrality has not gone unnoticed; mobile phones have even spawned a large academic literature numbering (according to the Web of Science) in the hundreds of studies for sociology, anthropology, and linguistics, and in the thousands for psychology.

Given this, why write about them in fiction? One very simple reason is that I was bored with writing standard academic prose and thought it would be fun to try out a different mode. Although nobody ever admits it, and it is often crushed to invisibility under the other feelings (such as despair) that the act of writing generates, fun is a legitimate motive in academic writing.

Beyond this, fictional narrative is increasingly common in archaeology. Almost without exception, it is used to 'give the past faces', to humanise it, to make us think about what it was like to live as a particular kind of person at a particular historical moment. Done well, it can give us a way to make the past personal and compelling and to explore alternative voices (Van Helden and Witcher 2020; Elphinstone and Wickham-Jones 2012). This is not that kind of fiction. It is about ideas, not about characters. Rather strangely for a historical meditation, it is set in the future; the future provides a bridge to understanding our present and our past. It traces its lineage not to historical fiction but to science fiction, mostly to the playful, sometimes dark, thought-experiment stories of writers such as Stanislaw Lem (Imaginary Magnitude, The Cyberiad, The Star Diaries).

Fiction allows exploration in ways expository prose does not, and this goes for ideas as much as for human experience. Unlike most of the writing we do, this piece is unencumbered by the obligation to present research results; it can wander through a landscape of ideas. Indeed, in many ways, the really important stories in human life are too big to see empirically. They certainly go beyond what any research project could actually tackle. Instead, their only true literary genre may be the philosophical statement of 'the world as I see it'. Yet, inevitably, we write such essays in the present, seeing the story in medias res; by definition, any idea big enough to be worth thinking about always projects beyond us into the future; it is always unfinished and ongoing. From where we stand, nobody, however learned, however large their research grant is and however many footnotes hang off their essays, really knows how it will turn out. It is hard to read a newspaper from fifty or a hundred years ago without noticing how many things they used to think were important, and how many of the real changes they didn't see coming. Will today's end of the world turn out to be tomorrow's tempest in a teacup? What currently unthinkable place will the winds of change blow us to? I could equally well have written up the ideas in this piece as a philosophical essay, but that would merely have radiated an artificial and unwarranted sense of certainty about the future; it would have focused attention on whether various futurological visions were likely to be correct, rather than upon the general evolution of human-thing relations. It would not have fundamentally changed anything about its epistemological status, except perhaps for making it less clear. Perhaps this is why there is a long tradition of philosophical fiction, from Plato's dialogues through Thomas More's Utopia and its innumerable 17th-19th century descendants (cf. Guadalupi and Manguel 1987) up to modern thinkers such as B.F. Skinner (Walden Two) and Ursula Le Guin (The Left Hand of Darkness).

The landscape of ideas is not a featureless plain, of course, and this work fits into a number of discussions. Only the most relentlessly programmatic works of philosophical fiction – the ones where they really should have just written an essay instead – demand footnotes. But it is worth highlighting a few themes. Amidst the huge literature on the anthropology and sociology of technology, mobile phone studies have focused principally upon psychological effects such as phone addiction, and upon political and social effects, for instance, the mobile phone's role as an agent of democracy or social control, and as a vector of identity (Glotz et al. 2005; Bell et al. 2018; Bell and Kuipers 2018). Other authors consider technology's phenomenological effects (Introna 2011). A different strand considers the borderlands between humans and technology. Humans are part of hybrid networks (Latour 2005); matter can act autonomously, even in political and social roles (Bennett 2010); humans are not only part of assemblages, they are assemblages (Deleuze and Guattari 1987; DeLanda 2006). The future is cyborg (Haraway 1991) – or at least a future is. The road leads through post-humanism and post-post-humanism to postⁿ-humanism, a territory as well explored by fiction (Garcia 2018) as by futurology. And in this future, what clearer border for medieval/modern than human/thing?

Footnote

This article is dedicated to the anonymous person I saw on a British train early in 2011 who was using their state-of-the-art, pinnacle-of-technology, globally-networked-via-satellite iPhone as a hand mirror while brushing their hair. You can identify true techno-domesticates because they fail to see anything funny about this.


Address from the President, Material Culture Nexus

Reprinted from Our Thingly History: Academic Hyperpapers of the Material Culture Nexus (Historical section)

Thank you for inviting me to address you today. It is always a pleasure to speak to the Material Culture Nexus's annual plenary meeting. Though such big conferences are now rarely held, it is heartening to see scholars in our field get together once a year for an occasion which in its very form remains a homage to history, a period event, much like reperforming a Greek tragedy or a 20th-century political debate in its original form.

My subject today is the role of the mobile phone in late medieval culture, a topic which has, strangely, been overlooked by material culture theorists exploring the roots of modern civilisation. Every schoolchild nowadays is familiar with the mobile phone's role in the Digital Unification Event, but only as a waypoint in a canned, teleological history of the ascent from primordial invertebrates awash in ancient seas through dinosaurs, primitive hominids, the inseparable splendour and barbarity of the medieval world, and finally ourselves. Such comic-book narratives mask much more than they reveal. The reality is much more interesting.

The Digital Unification Event, like all historical transformations, was of course not an event, but a process. Historical consensus has recognised this, rechristening it the 'Digital Unification Horizon' (often abbreviated as 'DUH'). The DUH was based on the assumption that, ultimately, almost everything can be reduced to information, or at least, somewhat neo-Platonically, that the information in everything can be sifted out from its matter. Technologically, it involved the reduction of formerly separate media such as pictures, music, and text into digital forms that could be stored, accessed and circulated by the same universal devices. It really does not matter, for our theme today, whether it was ushered in by the famous Nokkung EveryByte, or by the ultra-portable, high-powered Sony eCrumb and Apple iCore. These phone-based devices, with their somewhat culinary names, fed users all the data they could consume. It is no surprise that it was the mobile phone which evolved into the first Universal Device. (Contemporary chroniclers and early Modern historians often referred to these objects as 'gadgets' or 'gizmos', but the pejorative and belittling nature of these terms has since been recognised; indeed, their derogatory connotations themselves reflect the social tensions of that time. 'Device' is now widely adopted as a politically neutral form of reference.) The mobile phone outstripped all other candidates, including the printed and written page; the computer; the CD player; the camera; the wristwatch; the GPS; the credit card; the cardiometer, blood-pressure meter and thermometer; those short-lived and abortive text readers such as the Amazon Kindle; the programmable vacuum cleaner; and the electronic contraceptive. The phone's portability and status as an individual's personal companion clinched the issue. The flush of rather ad hoc 'apps' that initially extended the phone into these other devices' territory was rapidly supplanted by centralised and streamlined devices which made such integration simply taken for granted. Thus, a few decades into the 21st century, the mobile phone had become the universal device.

What is really of interest to the student of material culture is the social consequences. Many of the initial effects were highly predictable. For example, as soon as mobile phones became the universal device, many national governments instituted deprivation of these devices (or 'byte-stripping') as a legal punishment equivalent to imprisonment, disenfranchisement or mutilation. And then there were the quite violent anti-device riots staged by reactionary groups. These picturesque events, with their evangelical songs, archaic costumes and bonfires of electronic gear, were ironically themselves possible only through the use of devices such as electronically wired trucks and lighters. Indeed, studies of electronic traffic archives have recently shown how anti-device demonstrations were almost always actually organised via the Internet, even if the social elements involved insisted that the Bible warranted only the use of 'email classic' and text-messaging rather than instant blogfeeds (or 'blitter' made up of 'bleats'), simulpods and electronic delegation. Some devices, it would seem, are holier than others! Counter-arguments mounted by scholars in the rabbinical tradition that the Talmud and its commentaries provide a divine sanction for hypertext are particularly fascinating.

We can skim briefly over other technical developments in the same period. The most important one, obviously, was the advent of universal connectedness via total wireless cover. This was coupled with universal locatedness through your device's GPS, tied in first to the Greenwich global clock and thence to a despatialised universal satellite time, a globally shared 'real time' entirely free of local reference. No longer were you subject to the social mortification of being 'out of touch' or 'out of range'; you were always integrated in your social network (indeed, with live streaming multi-webcams and ultimately the mid-21st century 'Vicaricorp', the first remote interchangeable body, you could gather in parties – or intimate relations – with other people without ever leaving home – but that is getting ahead of ourselves here). Information continually arrived: social contacts from cyberfriends, updates from your bank, credit society or cyber-usurer, weather reports on the micro-climate of your projected path later that day, news items, music feeds, GPS locations of your friends, your pets, your friends' pets and your pet's friends, and continuously streamed opinion polls whose results fed directly into the government's automated denial and buck-passing circuitry. And with all this vital information, no longer did you risk becoming out of date; your device was continuously and automatically updated from secure and trusted governmental sources.

What is more to the point – and here I would once again decry those so-called historians who peddle simple-minded stories of the Rise of Civilisation as an inevitable stream of Great Inventions and Their Inventors; technology is always and only social, just as society is always and only technological! – what is more to the point are the psychological effects of the DUH on humans. Such effects were by no means unpresaged. As studies of early technology have shown, the technological refashioning of consciousness began surprisingly early. Take the wristwatch. In the late High Medieval era, the wristwatch, along with the keys and the wallet, was the closest thing to a universal device; it was not a coincidence that the mobile phone soon absorbed and supplanted all three of these. To medieval people, wristwatches were faithful servants to humanity, telling you what time it was so that you didn't miss your appointments and knew when to go home from work. But, as historical phenomenologists have shown, they were more than servants; they were companions. The wearing of watches developed habits of time-sensitivity or awareness, coordinating people's separate consciousnesses and presence with ever finer precision. Moreover – and I believe this is a key point about all technologies that are evolutionarily successful – they did not merely satisfy the need which gave rise to them; they cultivated it. Because the watch made time omnipresent, wearing a watch habitually developed in you an enhanced sense of time, an almost constant awareness of it, and you needed to know the time more than ever before. Thus the paradox: you do not wear a watch because you need to know the time; rather, you need to know the time because you wear a watch.

One saw precisely the same thing with the Internet. The Internet made information available freely, abundantly, instantly, unbelievably by the standards of any previous generation. An 8-year-old child in 2010 had access to far more information than Augustus Caesar at the centre of Rome's empire ever had. What effect did this have? If the Internet had merely answered a need for information, you would have gone to the Internet to find out something specific and then gone away satisfied. Using the telephone directory, for example, never became addictive. Instead, one commonly found that, paradoxically, the more information you had, the more you needed. Take a typical Internet occasion of the early 21st century: buying an airline ticket. One went online to buy the ticket, rather than going to a travel agent or calling an airline as one's forebears would have done. But while you were at it, you checked other things. How do you get from the airport to the city? What streets are around the central train station? What is the weather like there? Can you reserve a bus seat or a hotel room? What is the current exchange rate? Previously, you would have enquired about these things as you went along. Now, you no longer had to rely on the helpful passer-by or guard for information, or on contingencies such as finding a hotel room or not missing the last bus. In most cases, the final outcome was exactly the same – you rode the same buses to the same hotel room through the same weather with the same coat on. What changed was how you felt about it, your informational expectations en route. You now routinely expected certainty in advance about things you formerly had to negotiate as you went along. So organising your travel via the Internet did not merely represent a change in functionality; it also meant a change in your attitudes and habits, your relationship to the world. One became adjusted to a new, amazingly information-rich environment, and with this to predictability, controllability and incident-free experience. If you had to go back to journeying the pre-Internet way, you would find your reflexes dulled, your patience with the unexpected frayed, your mood constantly apprehensive about things that might happen. These are the withdrawal symptoms of information addicts. Again, you don't use the technology because you need it; you need it because you use it.

And so with the Digital Unification Horizon. People at the time noted some of its effects upon their habits of thought. We do not have time tonight to go into all the consequences. Let us just take one example. For millennia, sociality was essentially face to face, and the actions of independent elements beyond the local sphere were coordinated by more diffuse networks called (rather inexactly) culture, politics and economics. Yes, I know that people wrote letters, made telephone calls, and so on, but the bottom line was that these filaments connected people who were already defined as people within their local circle. The harbinger of a new way was email. As its name suggests, it emerged from its past as 'mail' much as the first automobiles were designed by carriage-makers and looked like carriages, but it soon outstripped these humble origins. And it was soon supplemented by a host of other forms of e-sociality. The first decade after 2000 alone saw text messages, chat rooms, Facebook, webcams and Skyping, and Second Life. The psychological accommodations involved were clear. There was a new facility at relating personally, sometimes with great emotional depth, with people one rarely or sometimes never physically encountered. One's sense of time in social interactions changed as well, with communication faster, often instant and effectively continuous. The result was an evolution from serial, episodic interactions to simultaneous, continuous ones. As people developed the habit of uninterrupted real-time social connection with many people, some unbelievably distant, they developed a reflex of continuously monitoring and responding to social informational channels with some part of their fragmented attention. Among the most vociferous critics were old fogies who complained – normally in stuffy letters to The Times, before it ceased publication shortly afterwards – about their fellows' reluctance to isolate themselves from the ocean of social traffic during formerly privileged contexts such as conversations, meetings, lectures, concerts, sermons, surgery and sex. But such protests were to no avail. When the President of the United States interrupted a judicial termination at which she was directing the firing squad to answer a text message from her campaign manager, it was clear to everybody that a new era had dawned.

In the following decade these coalesced into the single Pandora (Permanently Active Networked Doorways Of Remote Avatars) system. With Pandora, your device continually maintained a live 'room' in which your cyber-contacts were remotely presenced just as you were simultaneously presenced into theirs. There was no one-to-one mapping of personas and bodies; your persona could be one of several you had, or it could be shared by several people, and the same was true for all of your contacts. Naturally, you decorated your 'room' as carefully and tastefully as you decorated your home. The channels of information continuously feeding your device included not only what people said, but their location relative to yours, their emotional state and level of activity, even their facial expression, pulse rate, legal status, and bank balance. Contacts were carefully graded as Partners, Friends, Acquaintances, Contacts, Strangers, Gate-crashers, Leeches or Nemeses, each category with designated access rights, degrees of shared information and pre-formulated responses. With these Lego pieces of the new sociality, new games of positioning, verification and courtship quickly evolved. There were even picturesque and moving ceremonies when a relationship formally changed status, for instance upgrading an Acquaintance into a Friend. Pandora rapidly evolved ever more useful features, such as the now-familiar Automated Creative Relating: the device would abstract the tenor of a relationship and generate appropriate responses – including, in advanced versions, laughter, anger, agreement, moral indignation, arousal, and prayer – while you were busy elsewhere, asleep, or, in some celebrated cases, dead.

With such developments, sociality became at once omnipresent and decentred, free from space and particular bodies, yet never absent. Because you were potentially contactable everywhere, it was expected that you would be contactable everywhere. Your regimes of attention changed too. Consciousness became multi-channel and social stimulation continuous rather than punctuated. Older people complained about fragmentation and info-bombardment, but the generation growing up under this regime negotiated the new ever-present informational ocean effortlessly, becoming unhappy only when withdrawn from it, at which point they became lonely, disorientated, jumpy and disconsolate. Moreover, just as old-style sociality meant watching someone's facial and body language, new sociality was predicated upon continual remote information flow; it was expected that you would be in possession of the (by definition complete) range of social information about the people with whom you were interacting; and you expected these things of others. Indeed, just as the Internet redefined knowledge as 'things knowable through the Internet', internet sociality redefined the social world as 'people known to the Internet'. Obviously friendships blossomed enormously worldwide; in many ways it was a very happy era for both humans and devices. But we could also present an entire lecture on the dark side of this. There was the world of the Undeviced, the excluded, the submerged, disconnected, and disenfranchised people, reduced to performing menial tasks and the most squalidly rudimentary forms of social relations – we must resist the romantic revisionism that persistently sees them as some kind of 'noble savages'. There were criminals – cyber-vandals, identity-morphs, and peddlers of info-drugs that rendered your device-life disastrously blissful. The 'problem of verification' was prominent early in the 21st century, before it was realised that cyber-sociality was fundamentally not a continuation of traditional sociality but a divergence from it, and that cyber-persons could have a legitimate existence unbound by a singular correspondence with a physical body. The underlying point, of course, was that it became increasingly impossible for a person without a device to exist legally, financially, socially and informationally as a person – a fact recognised intuitively, if not theoretically, in the froth of new etiquette, legislation and protest at the time.

The later stages of this transition have been widely discussed, and we need only mention two well-known developments here. The first was financial. By the late 20th century money was already digital, with direct deposits of earnings and withdrawal of spending, digital taxation, and so on. As your fiscal identity became increasingly digital, your device could handle more and more of it. Since you carried it everywhere, it was the obvious way to effect transactions such as spending and earning; the integrated credit sensor took care of everything instantly and without error. Archaeologists now recognise early-21st-century strata by the virtually complete disappearance of coinage. Clumsy security devices such as PINs and keys rapidly gave way to digital fingerprints and sequences of memory. Control circuitry followed that increasingly automated your financial choices, freeing you to be yourself. Such financial centralisation, while eminently sensible, added a new terror to byte-stripping. The other significant step was the inclusion of the user's complete genome. By 2025 it was cheaper and easier to sequence your entire genome than to fix a speeding ticket or replace your real or virtual living room curtains. Very soon after this, several governments – purely out of a wise and paternal desire for individually sensitive health-care and life-risk management – made genotyping via your device mandatory. Bureaucratically, such a device became the ultimate passport; you were simply sequenced at the immigration desk at the airport, the result was matched with the record on your device, and you whizzed straight through to the baggage claim! Later in the century, we can thank this development for the first truly universal health care, the increasing reliance upon the genome as a way of defining an individual's potential health, and, of course, the ability to reprogrow new tissues and organs as needed to maintain an individual's health. Ultimately you could walk into any clinic and grow a new nerve, liver, or skin graft – provided you had your device with you. Fortunately, by then, the device, for all its information storage capacity, had been reduced in size to an almost microscopic subcutaneous implant, making it impossible to lose this vital and legally necessary information.

With devices so critical to your well-being, of course, it was important to keep them on. While a few fundamentalists clung to the outmoded 'on/off' switch like someone trying to drive a powerful automobile with reins, and campaigned shrilly for the establishment of connectivity-free 'ni-fi' reserves, less hidebound members of the public were quick to enjoy the possibilities. Who cared if some functions of the device were sealed beyond user intervention and updated remotely – for the user's own security, of course; you couldn't have people reprogramming their own vital life-supports – if the consumer-modifiable zones included such exciting options as designer cosmetic DNA and the latest straight-to-the-brain total-sensorium music downloads?

Naturally, such developments forced some rethinking of medieval mentalities. There is a whole series of landmark cases in technolegal history. In the mid-21st century, one citizen was officially declared socially disabled as a consequence of having lost his device in an industrial accident (it was embedded in his arm, which had been cut off). The information it contained – his historical memory and configuration, in effect – had been amassed over several decades, and it was judged that he would be unable to recoup it in less than two further decades of life. Several well-known charities were subsequently founded to rehabilitate such persons before a way of maintaining a cyber-organic copy in a remote, centralised archive was developed, allowing instant social regeneration. Devices, too, became increasingly active. They had already been self-maintaining, quietly administering their own installations, updates, and repairs. After a celebrated case in which a device whose owner had expired in his favourite comfy chair at home contacted other devices for help and ultimately arranged a transplant into a new body, a flourishing market for body insurance developed amid the background chatter that maintained the networks among devices. By the later 21st century, the Supreme Court heard a case from a terminally ill patient who wanted to choose a voluntary death, but whose device argued that it had a right to continue, whether by forced maintenance of its existing body or by transfer of consciousness to a different body. The Court found in favour of the device.

A generation later such dilemmas were, fortunately, outdated. A more enlightened generation, reasoning it out with the help of artificial intelligence, came to realise that the question was not an age-old struggle for mastery between human and machine, but rather a simple question of symbiosis and function. Humans are creatures of habits; things are habits of creatures. Humans do not use a technology because they need it; they need a technology because they use it. To rephrase this from the thing's point of view, a successful technology constructs a need, a niche, a continuity. If we get beyond the fetishization of the organic/inorganic divide, there is nothing particularly new in this: there are plenty of beings whose life cycle involves other species – parasites and predators, certainly, but also pollinators and flowers, symbiotes within an ecosystem, and so on. Indeed, we now know that organic life itself evolved by integrating separate organisms that became the differentiated organelles, such as mitochondria, of new cellular beings. The ultimate effect is simply to remap the division of functions among elements of a system, while working for the betterment of all.

Again, the humble mobile phone provides a quintessential exemplification of this. From its very outset – shortly after its introduction – people stopped remembering telephone numbers. The phone absorbed the menial function of the address book, just as it shortly afterwards absorbed so many others – alarm clock, diary, internet portal, and all the others we have mentioned above. In effect, it became a portable external memory. Few humans saw this as increasing their dependence, except for the few unfortunates who managed to lose or break their phones! Instead, they saw the relation as prosthetic: the device extended their abilities. But in a less anthropocentric light, the relationship was really metathetic, with each element complementing the other and building a system with capabilities and intentionality different from those of its components. Humans were free to get on with those bits of the system they were particularly good at – such as emotion, consciousness and mobility. From this point on (and indeed, from the first stone tool on), the history of the technology was simply a continual remapping of the division of functions within the system. At times when one or another function – such as reproduction or emotion – was particularly cherished, the remapping provoked friction, protest, or even hysteria; but the system's logic never ceased working towards new configurations. Ironically, in spite of their name, mobile phones remained immobile; mobility remains one of the prerogatives of humanity. Not that it could not be otherwise, of course, but humans do mobility so well that it has never made sense to delegate this particular function to devices. Let humans do the walking; they are so good at it! Devices accomplish their work by sitting still, by being carried, by telecommuting, teleconsuming and tele-relating. Such remote activities make gatherings such as tonight's conference ever rarer events, and, for those of us who cherish them nostalgically, ever more special.

This takes my story to the very threshold of the modern era, and I hope you will now understand my choice of the mobile phone as the unsung protagonist, the device that ushered us into the modern era. We all know how many bytes have been wasted trying to define a precise moment when the Medieval ended and the Modern began. It is often a sterile, value-loaded debate; for every pundit who crows that some threshold has been crossed, another announces 'no, we have never been modern'. Perhaps some of them were correct when they wrote this, but they should have added 'at least, not yet!' The issue is often defined in purely local terms, as if changing religions or colonising a new continent, or a new planet, makes a qualitative difference to one's very being everywhere. A more fundamental view looks at the real heart of relations and systems. Even scholars before our era knew this. For example, medieval scholars paid what may seem to us an almost obsessive amount of attention to demarcating humans from non-humans; yet their definitions were nevertheless sometimes perceptive. To quote one classic late medieval definition, the first human was not the primate who picked up a stone or stick to use as a tool; it was the primate who did not put it down afterwards. The history I have reviewed tonight continues this theme of humans as technology-modified apes.

The Material Culture Nexus is devoted to the study of Materiality in all its relationships to Humanity. It has been widely agreed for some time now that the Modern era is best defined as a period not only of obligate symbiosis, but of new integrality between things and the people they make. Hence I have thought it worthwhile to pay my tribute to the humble star of my talk tonight, the mobile phone in late medieval culture. I hope that none of the more sensitive members of the audience have found it disturbing that I have suggested that we must displace some of the traditional heroes of history, the Great Inventions and their Inventors, from their pedestals, and replace them with a humble, indeed at times downright unappetising relationship between devices and humans. For in spite of the heroic myths I have alluded to, we did not cross the historic threshold to the Modern Era with a single heroic leap, a Digital Columbus; we traversed it painstakingly through an increasingly intimate relationship between people and things that began deep in antiquity with the first stone tool or stick, but which accelerated dramatically with the Digital Unification Horizon, itself triggered by the mobile phone in the late medieval period.

Yet where better to raise such themes than at this, our annual reunion of the Material Culture Nexus? Indeed, with modern communications, such conferences as this, with their roots deep in the technological conditions of the medieval period, are increasingly rare. They really represent a nostalgic homage to a period and subject of study we all love – they are almost an affectionate re-enactment as much as a modern academic interchange of views. It is for this that I particularly appreciate your adherence to those small but venerable gestures such conferences involved – the name tags, the receptions, the quiet listening to one speaker, the traditional forms of courteous attention. Whether or not such gestures are needed any more, they help us to revere the antiquity of the human-thing relationship we celebrate here.

Thank you for your attention; you may now turn your humans back on.
