
Bayesian networks for the evaluation of evidence when attributing paintings to painters


FOS - Probability and statistics in forensic science

Questions of provenance and attribution have long motivated art historical research. Current authentication studies combine traditional humanities-based methods (for example, stylistic analysis and archival research) with scientific investigation using instrumental analysis techniques such as X-ray based methods, GC-MS, spectral imaging and metal-isotope research. Maintaining an overview of the information delivered by different specialists, and establishing its relative weight, is a growing challenge.

To help clarify complex situations in which the relative weight of evidence needs to be established, the Bayesian framework for the interpretation of evidence shows great promise. Introducing this mathematical system for calculating the probability of hypotheses given various pieces of evidence will strengthen the scientific basis for (art) historical and scientific studies of art. Bayesian networks can accommodate large variation in data and can quantify the value of each piece of evidence. Their flexibility allows us to incorporate new evidence and quantify its influence.

In this presentation I will present the first results of a pilot study on the opportunities and challenges of implementing Bayesian networks to structure evidence and arguments in painting attribution questions. This research is based on the painting Sunset at Montmajour, which was attributed to Vincent van Gogh in 2013.
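As a minimal sketch of the Bayesian reasoning the abstract describes (not the speaker's actual network or data), the odds form of Bayes' theorem combines independent pieces of evidence via likelihood ratios. The hypotheses, evidence labels and numerical values below are invented for illustration only.

```python
# Illustrative sketch: Hp = "the painting is by Van Gogh", Hd = "it is not".
# Each likelihood ratio LR = P(E | Hp) / P(E | Hd) weighs one piece of
# evidence; values > 1 support Hp. All numbers here are hypothetical.

def posterior_odds(prior_odds, likelihood_ratios):
    """Multiply the prior odds by each evidence likelihood ratio
    (valid when the evidence items are conditionally independent)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_probability(odds):
    """Convert odds O = P / (1 - P) back to a probability P."""
    return odds / (1.0 + odds)

# Hypothetical evidence: stylistic analysis, pigment analysis, provenance record.
lrs = [3.0, 5.0, 2.0]
prior = 0.25 / 0.75   # prior probability of 0.25 for Hp, as odds

post = posterior_odds(prior, lrs)
print(round(odds_to_probability(post), 3))  # → 0.909
```

A full Bayesian network generalises this by dropping the independence assumption: the graph structure encodes which pieces of evidence depend on one another, so the combined weight is no longer a simple product of likelihood ratios.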



This talk is part of the Isaac Newton Institute Seminar Series.


© 2006-2024 Talks.cam, University of Cambridge.