Arch-I-Scan: Automated recording and machine learning for collating Roman ceramic tablewares and investigating eating and drinking practices

 


Ceramic finewares, mainly tablewares, are essential evidence for investigating the socio-cultural practices of eating and drinking across the Roman world, and they constitute some of the most extensive archaeological remains. However, there are currently major barriers to using this wealth of material to answer consumption-oriented questions. The sheer extent of these remains, and the time-consuming and costly specialist processes currently required to classify and collate these truly 'big archaeological data', mean they are often recorded selectively and summarily rather than comprehensively and consistently.

Prof. Penelope Allison from the School of Archaeology and Ancient History and Prof. Ivan Tyukin from the School of Mathematics have been awarded Arts and Humanities Research Council funding to develop Arch-I-Scan, software that has the potential to revolutionise how artefacts are identified and recorded. A proof-of-concept experiment was successfully carried out for the research network 'Big Data on the Roman Table', also funded by the AHRC. They are collaborating with partners from the Museum of London, Museum of London Archaeology, University of Leicester Archaeology Service (ULAS), The Vindolanda Trust and Colchester and Ipswich Museum Service to train the software and to develop its machine-learning capacity on hundreds of thousands of Roman tableware remains in these partners' extensive collections, drawn from different social and regional contexts in Roman Britain.

This project is thus developing a state-of-the-art image-recognition and machine-learning service that records both complete or near-complete vessels and more fragmentary remains. Arch-I-Scan will learn to automatically recognise and record details of pottery remains, using handheld devices (e.g. mobile phones) operated by specialists and non-specialists alike, and to digitally collate and store large quantities of data.

Roman tableware remains, often from large-scale production centres (e.g. samian ware from South Gaul), constitute some of the most easily recognisable and extensive bodies of archaeological data, with high levels of similarity in their ranges of forms and fabric types across a wide geographical area. Thus, besides being crucial evidence for Roman food- and drink-consumption practices in different social contexts in Britain, the selected material comprises an excellent body of artefacts to ensure wider application of Arch-I-Scan at other Roman sites, in Britain and beyond.

Using artificial-intelligence algorithms similar to those behind facial-recognition software, Arch-I-Scan will first compile a large dataset of images of ceramics, together with information on size, shape, design and texture. Then, through machine learning, Arch-I-Scan will become increasingly adept at identifying artefacts, the eventual goal being a system that can identify precisely which whole ceramic vessel a recorded fragment belonged to. Once Arch-I-Scan is sufficiently trained and has recorded and 'learned' to classify the artefacts from these collections, the resulting datasets will be made freely available for other archaeologists to use as comparanda in their own analyses. This can lead to more comprehensive analyses across Roman archaeology for more socially oriented questions. Arch-I-Scan can continue to 'learn' from these and other types of pottery, as well as from other archaeological artefacts. Greater knowledge of how the micro-histories of objects, and the 'human-thing entanglements' of their micro-archaeological contexts, play important roles in our understanding of socio-cultural practice in Roman history can also transform material-cultural approaches to social practice in global history.
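To give a flavour of the 'learn from labelled examples, then classify new fragments' loop described above, here is a deliberately minimal sketch, not the project's actual code: a nearest-neighbour classifier over hand-made shape descriptors. The descriptor values and the choice of Dragendorff forms are illustrative assumptions; Arch-I-Scan itself works from photographic images and far richer learned features.

```python
# Illustrative sketch only: classify a pottery fragment by comparing a
# simple shape descriptor against labelled training examples.
from math import dist

# Hypothetical training data: (rim diameter mm, wall thickness mm, curvature)
# descriptors labelled with two well-known samian forms (values invented).
TRAINING = [
    ((160.0, 4.0, 0.80), "Drag. 27 cup"),
    ((155.0, 4.5, 0.75), "Drag. 27 cup"),
    ((240.0, 6.0, 0.30), "Drag. 18 dish"),
    ((250.0, 5.5, 0.25), "Drag. 18 dish"),
]

def classify(descriptor):
    """Label a new fragment with the form of its nearest labelled neighbour."""
    _, label = min(TRAINING, key=lambda ex: dist(ex[0], descriptor))
    return label

print(classify((158.0, 4.2, 0.78)))  # a cup-like fragment
print(classify((245.0, 5.8, 0.28)))  # a dish-like fragment
```

The point of the sketch is the workflow, not the method: a real image-recognition system replaces the hand-made descriptors with features learned from many thousands of photographs, which is why the project needs such large partner collections for training.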

Like our page on Facebook and follow us for the latest project updates.

Follow us on Twitter @Arch_I_Scan

We are also on Instagram as @archiscanproject.
