variance is directly related to the number of weights in the network. High-variance
models can be adjusted using the parameters to realize a wide range of input-output
mappings, with the obvious cost of increasing the volume of search space in which
to find the optimal such mapping.
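To make that scaling concrete, the following minimal Python sketch (the layer sizes are illustrative assumptions, not figures from this chapter) counts the free parameters of a fully connected feedforward network; widening a hidden layer quickly inflates the count, and with it both the range of realizable input-output mappings and the size of the space that must be searched during calibration.

def count_parameters(layer_sizes):
    """Weights plus biases in a fully connected feedforward network
    whose layers have the given sizes (inputs first, outputs last)."""
    total = 0
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        total += n_in * n_out  # weight matrix between consecutive layers
        total += n_out         # one bias per unit in the receiving layer
    return total

# Illustrative only: a 10-input, 1-output network with hidden layers of
# increasing width.
for hidden in (5, 50, 500):
    print(hidden, count_parameters([10, hidden, 1]))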
Introducing bias just to make the modelling process feasible is arguably unsci-
entific: you are allowing your chosen modelling technique to drive your analysis of
a system, rather than allowing your knowledge of that system to determine the way
you describe it in your model. This kind of unscientific bias is one of the practices
that have led some in the agent-based modelling community to be critical of making
assumptions ‘for the sake of simplicity’ (e.g. Moss 2002; Edmonds and Moss 2005).
Although some of these criticisms concern the infeasibility of the analysis itself
were a more realistic representation, one making no simplifying assumptions, to
be used (e.g. because the computation becomes undecidable), the feasibility of an
empirical modelling process does depend on the availability of data.
Like neural networks, agent-based models potentially have large numbers of
parameters – a multiple of the number of agents and the number of links in the social
network. These parameters determine the heterogeneity and interaction dynamics
of the model. For more traditional modelling paradigms, having large numbers of
parameters is regarded with suspicion. From a practical perspective, there is a good
reason for this heuristic: a high-variance model is more challenging to calibrate.
Each dimension of parameter space adds exponentially to the scale of the search task
and to the requirement for data. Another reason is an interpretation of Ockham’s
razor in a modelling context: if I have two models with the same behaviour, I
prefer the one with fewer parameters. Ockham’s razor is often stated as entia non
sunt multiplicanda praeter necessitatem (literally, entities should not be multiplied
more than necessary, or more naturally, explanations should not use unnecessary
entities) – were it not for the qualifier, this statement would be the antithesis of
agent-based modelling! 2
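As a back-of-the-envelope illustration of why each dimension adds exponentially to the search task, the short Python sketch below counts the model runs required by a naive grid search at a fixed resolution; the resolution of ten points per parameter is an assumption chosen purely for illustration.

def grid_search_runs(n_parameters, points_per_parameter=10):
    """Model evaluations needed for an exhaustive grid search that samples
    each parameter at a fixed number of points."""
    return points_per_parameter ** n_parameters

# Illustrative only: the run count grows exponentially with dimension.
for n in (1, 2, 5, 10):
    print(f"{n:>2} parameters -> {grid_search_runs(n):,} runs")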
However, the orthogonality of the parameters in agent-based models may be
more questionable than in traditional mathematical models. Essentially, in tra-
ditional mathematical modelling, each parameter is contributing to the potential
‘wiggliness’ (to use a term from the spline literature, e.g. Wood and Augustin 2002)
of the function the model realizes. Though it is possible (e.g. Gotts and Polhill
2010), it is not necessarily the case that having another agent in the system will
mean that the dynamics of the system as a whole are hugely different; adding
another connection in a neural network, by contrast, does increase the ‘power’ of its
function to realize different shapes in the mapping from input to output by adjusting
the weights. The suspicion of traditional mathematical modellers towards agent-
based models because of the apparently large number of parameters may therefore
2 The case for agent-based modelling being that it is necessary to represent all the agents if you want to understand the emergent system-level dynamics.