Models are not innocent instruments that objectively represent the world as it ‘really’ is. Rather, models play an inscriptive role when it comes to understanding and dealing with risks. As various studies of modeling practices have shown, the development and use of models requires that target systems be translated into physical and/or computational models that facilitate experimentation. The use of models is therefore ‘inscriptive’, and this can be hazardous: the translation carries questionable assumptions, uncertainties, and blind spots. Modeling practices shape the understanding of target systems, and may therefore have serious repercussions for practices that draw on model output. For example, model-based experiments can be used to determine the adequate height of flood defences. However, the determined height may turn out to be insufficient due to the omission or inadequate representation of crucial parameters of the target system, such as the amount of water, the impact of wind, and the structural stability of flood defences.
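The flood-defence example can be made concrete with a toy sketch. All parameters, coefficients, and formulas below are illustrative assumptions, not an actual hydrological model: the point is only that a model which omits a parameter (here, wind-driven wave run-up) inscribes that blind spot into the ‘adequate’ height derived from its output.

```python
# Toy illustration (not a real hydrological model): how an omitted
# parameter can lead a model to underestimate a design height.

def required_height_simple(surge_m, freeboard_m=0.5):
    """Naive model: crest height = storm surge level + fixed freeboard."""
    return surge_m + freeboard_m

def required_height_with_wind(surge_m, wind_speed_ms, freeboard_m=0.5,
                              runup_coeff=0.005):
    """Extended model: adds a crude wind-driven wave run-up term.
    The run-up term (coeff * wind_speed^2) is a made-up stand-in for
    the far more complex formulas used in actual engineering practice."""
    wave_runup = runup_coeff * wind_speed_ms ** 2
    return surge_m + wave_runup + freeboard_m

surge = 3.0   # metres above normal water level (assumed value)
wind = 25.0   # storm wind in m/s (assumed value)

naive = required_height_simple(surge)
extended = required_height_with_wind(surge, wind)
print(f"naive: {naive:.2f} m, with wind: {extended:.2f} m")
# The naive model recommends a lower crest height; its blind spot
# (wind) is inscribed into the advice drawn from its output.
```

The numbers are arbitrary; what matters is the structural point that the ‘adequate’ height is relative to which parameters the model inscribes.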
Although the inscriptive aspects of models should be taken seriously, they should not necessarily be lamented either. Models give an idea of what may happen in futures that have some probability, or in extreme circumstances that occur rarely, or perhaps never. Moreover, models enable experimentation in a way that does not interfere with real-world systems, and can be much cheaper and more efficient than experimentation in such systems. By means of models, risks are imagined and reimagined in various ways, potentially opening up new vistas for fruitful intervention in real-world systems.
In sum, modeling practice involves a razor’s edge between dangerous omissions and fruitful intervention, which deserves further elaboration. Given the instrumental role of models in equipping society against risks, the extent to which model output is subjected to scrutiny warrants further attention. For example, some social groups may adopt a highly reflexive attitude towards modeling that questions how models imagine risks, while others may not. Failing to approach models and their inscriptive aspects in a reflexive manner may render society vulnerable to the potentially negative effects of models.
Rather than wallowing in a defeatist elaboration of the inscriptive role of models, these perhaps sobering remarks are meant as a call to arms: the equipment used to model risks deserves more credit. Models need to be appreciated as equipment of crucial importance, provided reliance on modeling is met with persistent attempts to comprehend how models shape one’s understanding of risks. This also leads to questions about good modeling and coding practice, in which engineers not only refine their craftsmanship, but also contribute to the responsible use of simulations and models.