
8 The Importance of Ontological Structure: Why Validation by ‘Fit-to-Data’...  161


            illustrating their work. Since they sometimes use ontologies as part of their
            methodological approach to modelling with stakeholders, the work of authors such
            as Jean-Pierre Müller, Nicolas Becu and Pascal Perez and their collaborators is
            particularly worth investigating. Some example articles include Müller (2010), Becu
            et al. (2003) and Perez et al. (2009). Companion modellers are not the only ones to
            apply knowledge elicitation to model design, however – see, for example, Bharwani
            et al. (2015).
              Validation has long been a subject of discussion in agent-based modelling, and
            this chapter has not dedicated space to reviewing the excellent thinking that has
            already been done on the topic. The interested reader wanting to access some of
            this literature is advised to look for keywords such as validation, calibration and
            verification in the Journal of Artificial Societies and Social Simulation, currently
            the principal journal for publication of agent-based social simulation work. Notable
            recent articles include Schulze et al. (2017), Drchal et al. (2016), ten Broeke et al.
            (2016) and Lovelace et al. (2015). Other older articles worth a read are Elsenbroich
            (2012), Radax and Rengs (2010) and Rossiter et al. (2010). See also some of the
            debates such as Thompson and Derr’s (2009) critique of Epstein’s (2008) article
            and Troitzsch’s (2009) response and Moss’s (2008) reflections on Windrum et al.’s
            (2007) paper. A practical article on one approach to validating agent-based models
            outwith JASSS is Moss and Edmonds (2005).




            Appendix 1: Neural Networks

            Though there are variants, typically the excitation, x_j, of a node j is given by
            the weighted sum of its inputs (8.2):

                x_j = \sum_{i \in \mathrm{inputs}} w_{ij} o_i                    (8.2)
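As a minimal sketch, the weighted sum of equation (8.2) can be computed directly; the node labels, weights and outputs below are illustrative examples, not values from the chapter:

```python
# Excitation x_j of a node j, per equation (8.2): the sum over its
# input nodes i of the connection weight w_ij times the output o_i.
outputs = {"a": 0.9, "b": 0.2, "c": 0.5}    # outputs o_i, here in [0, 1]
weights = {"a": 0.4, "b": -0.7, "c": 1.1}   # connection weights w_ij into node j

x_j = sum(weights[i] * outputs[i] for i in outputs)
print(round(x_j, 2))  # → 0.77
```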
            where o_i (usually in the range [0, 1], though some formalisms use [−1, 1]) is the
            output of a node i with a connection that inputs to node j, and w_{ij} is the strength
            (weight) of that connection. Nonlinearity of the behaviour of the node is critical
            to the power that the neural network has as an information processing system. It is
            introduced by making the output o_j of a node a nonlinear function of its excitation x_j.
            There are a number of ways this can be achieved. Since many learning algorithms
            rely on the differentiability of the output with respect to the weights, the sigmoid
            function is typically used:

                o_j = \frac{1}{1 + \exp(-x_j)}                                   (8.3)
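The sigmoid of equation (8.3) is easily sketched; the derivative noted in the comment is a standard property of the logistic function, mentioned here because the text stresses differentiability:

```python
import math

def sigmoid(x):
    """Logistic function of equation (8.3): o_j = 1 / (1 + exp(-x_j))."""
    return 1.0 / (1.0 + math.exp(-x))

# The derivative is sigmoid(x) * (1 - sigmoid(x)), which is what
# gradient-based learning algorithms exploit when adjusting weights.
print(sigmoid(0.0))  # → 0.5, the midpoint of the curve
```

Note how the output is squashed into (0, 1) regardless of how large or small the excitation x_j becomes.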
              So, a neural network essentially consists of a directed graph of nodes, where each
            of the links has a weight. If the graph is acyclic, the neural network is known as a