
19 articles matched your search for the keywords:
Agent-Based Product Diffusion Model, Individual Purchase Time, Social Network Structure, Estimation, Sensitivity Analysis

A Common Protocol for Agent-Based Social Simulation

Matteo Richiardi, Roberto Leombruni, Nicole J. Saam and Michele Sonnessa
Journal of Artificial Societies and Social Simulation 9 (1) 15

Keywords: Agent-Based, Simulations, Methodology, Calibration, Validation, Sensitivity Analysis
Abstract: Traditional (i.e. analytical) modelling practices in the social sciences rely on a very well established, although implicit, methodological protocol, both with respect to the way models are presented and to the kinds of analysis that are performed. Unfortunately, computer-simulated models often lack such a reference to an accepted methodological standard. This is one of the main reasons for the scepticism among mainstream social scientists that results in low acceptance of papers with agent-based methodology in the top journals. We identify some methodological pitfalls that, in our view, are common in papers employing agent-based simulations, and propose appropriate solutions. We discuss each issue with reference to a general characterization of dynamic micro models, which encompasses both analytical and simulation models. Along the way, we also clarify some confusing terminology. We then propose a three-stage process that could lead to the establishment of methodological standards in social and economic simulations.

An Agent-Based Competitive Product Diffusion Model for the Estimation and Sensitivity Analysis of Social Network Structure and Purchase Time Distribution

Keeheon Lee, Shintae Kim, Chang Ouk Kim and Taeho Park
Journal of Artificial Societies and Social Simulation 16 (1) 3

Keywords: Agent-Based Product Diffusion Model, Individual Purchase Time, Social Network Structure, Estimation, Sensitivity Analysis
Abstract: To maximise the possibility of success for a new product and minimise the risk and opportunity cost of a failed product, firms must understand the diffusion dynamics of competing products. The diffusion dynamics of competing products emerge from the aggregation of consumers' decisions. At the individual level, a consumer's decision consists of "which product to buy among the available products" and "when to buy a product". Individual product choices are affected by local and global social interactions among consumers. It would be helpful for firms to be able to determine the characteristics of the relevant social network for their target market and how changes in this social network influence their market shares. In addition, determining the distribution of consumers' product purchase times and how variation in this distribution affects market shares is an interesting issue for firms. In this study, therefore, we propose an agent-based simulation model that generates the market share paths (market shares over time) of competing products. We apply the model to estimate the social network and purchase time distribution of the Korean netbook market. We observe that Korean netbook consumers tend to buy a product without hesitation, and that their social network is rather regular but sparse. We also conduct sensitivity analyses with respect to the social network and the purchase time distribution.
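
The individual-level mechanism described above (local social influence plus a purchase-timing decision) can be sketched as a toy two-product diffusion model. This is an illustrative reconstruction, not the authors' model: the ring-lattice network, the seed adopters and all parameter values (`p_base`, `w_local`) are assumptions made for the example.

```python
import random

def simulate_diffusion(n_agents=100, neighbours=2, p_base=0.05,
                       w_local=0.5, steps=50, seed=42):
    # Two competing products, "A" and "B", diffuse over a ring lattice.
    # An undecided agent's purchase probability mixes an intrinsic rate
    # (p_base) with the adopter share among its ring neighbours (w_local).
    rng = random.Random(seed)
    choice = [None] * n_agents          # None = has not purchased yet
    choice[0], choice[1] = "A", "B"     # two seed adopters
    shares = []
    for _ in range(steps):
        nxt = list(choice)
        for i, c in enumerate(choice):
            if c is not None:
                continue
            nbrs = [choice[(i + d) % n_agents]
                    for d in range(-neighbours, neighbours + 1) if d != 0]
            adopters = [x for x in nbrs if x is not None]
            p_buy = p_base + w_local * len(adopters) / len(nbrs)
            if adopters and rng.random() < p_buy:
                # purchase timing decided; the product choice imitates
                # the majority product among adopting neighbours
                nxt[i] = max(set(adopters), key=adopters.count)
        choice = nxt
        shares.append((choice.count("A"), choice.count("B")))
    return shares

shares = simulate_diffusion()   # market share path over 50 periods
```

Varying `neighbours` or `w_local` here is the analogue of the paper's sensitivity analysis with respect to network structure.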

Agent-Based Simulation of the Search Behavior in China's Resale Housing Market: Evidence from Beijing

Hong Zhang and Yang Li
Journal of Artificial Societies and Social Simulation 17 (1) 18

Keywords: Resale Housing Market, Search Behavior, Search Model, Agent-Based Simulation, Sensitivity Analysis
Abstract: Within the paradigm of search theory, we establish a search model tailored to the characteristics of China's resale housing market by modeling the search behavior of buyers and sellers, respectively. Setting the parameters based on the Beijing housing market survey of August 2012, we implemented agent-based simulation to study the dynamics of search behavior as measured by search intensity and search time. A sensitivity test was also used to analyze the determinants of the search behavior of trading agents. The simulation results validate the idiosyncratic features of the agents' search behavior, which are consistent with theoretical analysis. An increase in matching efficiency promotes the agents' search intensities, whereas a higher unit search cost reduces them. The buyer's search behavior is more sensitive to changes in the market tightness ratio. Brokerage service lowers the transaction price and lessens the agents' search intensities. The sensitivity test further reveals that the matching efficiency and the market tightness ratio play a very important role in improving housing market liquidity. Changes in the search cost and the broker commission rate can reduce the agents' search intensities significantly, and there are critical turning points at which abrupt changes occur.

Pricing and Timing Strategies for New Product Using Agent-Based Simulation of Behavioural Consumers

Keeheon Lee, Hoyeop Lee and Chang Ouk Kim
Journal of Artificial Societies and Social Simulation 17 (2) 1

Keywords: Product Diffusion, Pricing and Time Strategies, Korean Mobile Phone Market, Sensitivity Analysis
Abstract: In this study, we are interested in the problem of determining the pricing and timing strategies of a new product by developing an agent-based product diffusion simulation. In the proposed simulation model, agents imitate behavioural consumers, who are reference dependent and risk averse in the evaluation of new products and whose interactions create word-of-mouth regarding new products. Pricing and timing strategies involve the timing of a new product release, the timing of providing a discount on a new product, and the relative rates of discounts. We conduct two experiments in this study. In both experiments, we consider the urban young person segment in the mobile phone market in Korea, in which three major new products - two smartphones and one convergence product - compete with one another. The first experiment is sensitivity analysis on the product life cycle and social influence. The objective is to observe how consumer agents behave as the product life cycle and the degree of sensitivity on social influence change. The second experiment is sensitivity analysis on time-to-market, time-to-discount, and amount-to-discount. The marketing strategy that maximises the sales volume (or revenue) of the new convergence product is sought from the sensitivity analysis. Based on the result, we provide pricing and timing implications for firms pursuing sales volumes (or revenue) increase.

Facilitating Parameter Estimation and Sensitivity Analysis of Agent-Based Models: A Cookbook Using NetLogo and 'R'

Jan C. Thiele, Winfried Kurth and Volker Grimm
Journal of Artificial Societies and Social Simulation 17 (3) 11

Keywords: Parameter Fitting, Sensitivity Analysis, Model Calibration, Agent-Based Model, Inverse Modeling, NetLogo
Abstract: Agent-based models are increasingly used to address questions regarding real-world phenomena and mechanisms; therefore, the calibration of model parameters to certain data sets and patterns is often needed. Furthermore, sensitivity analysis is an important part of the development and analysis of any simulation model. By exploring the sensitivity of model output to changes in parameters, we learn about the relative importance of the various mechanisms represented in the model and how robust the model output is to parameter uncertainty. These insights foster the understanding of models and their use for theory development and applications. Both steps of the model development cycle require massive repetitions of simulation runs with varying parameter values. To facilitate parameter estimation and sensitivity analysis for agent-based modellers, we show how to use a suite of important established methods. Because NetLogo and R are widely used in agent-based modelling and for statistical analyses, we use a simple model implemented in NetLogo as an example, packages in R that implement the respective methods, and the RNetLogo package, which links R and NetLogo. We briefly introduce each method and provide references for further reading. We then list the packages in R that may be used for implementing the methods, provide short code examples demonstrating how the methods can be applied in R, and present and discuss the corresponding outputs. The Supplementary Material includes full, adaptable code samples for using the presented methods with R and NetLogo. Our overall aim is to make agent-based modellers aware of existing methods and tools for parameter estimation and sensitivity analysis and to provide accessible tools for using these methods. In this way, we hope to contribute to establishing an advanced culture of relating agent-based models to data and patterns observed in real systems and to foster rigorous and structured analyses of agent-based models.
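
The calibration step the cookbook describes (massive repetitions of simulation runs with varying parameter values, compared against data) can be illustrated with a minimal grid-search fit. The paper itself works with NetLogo and R; the sketch below uses Python and a toy logistic-growth "model" with a single free parameter, all invented for illustration only.

```python
# toy "model": logistic growth with unknown growth rate r
def logistic(r, k, x0, steps):
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + r * x * (1 - x / k))
    return xs

# synthetic "observed" data generated with a known rate, then recovered
target_r = 0.3
observed = logistic(target_r, 100.0, 5.0, 30)

def sse(r):
    # sum of squared errors between simulation and observation
    sim = logistic(r, 100.0, 5.0, 30)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

# repeated simulation runs over a parameter grid; keep the best fit
candidates = [i / 100 for i in range(10, 51)]     # r in [0.10, 0.50]
best_r = min(candidates, key=sse)
```

The methods in the paper (e.g. Latin hypercube sampling, ABC) replace this naive grid with more efficient sampling of parameter space, but the loop structure is the same.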

The Complexities of Agent-Based Modeling Output Analysis

Ju-Sung Lee, Tatiana Filatova, Arika Ligmann-Zielinska, Behrooz Hassani-Mahmooei, Forrest Stonedahl, Iris Lorscheid, Alexey Voinov, J. Gareth Polhill, Zhanli Sun and Dawn C. Parker
Journal of Artificial Societies and Social Simulation 18 (4) 4

Keywords: Agent-Based Modeling, Methodologies, Statistical Test, Sensitivity Analysis, Spatio-Temporal Heterogeneity, Visualization
Abstract: The proliferation of agent-based models (ABMs) in recent decades has motivated model practitioners to improve the transparency, replicability, and trust in results derived from ABMs. The complexity of ABMs has risen in stride with advances in computing power and resources, resulting in larger models with complex interactions and learning and whose outputs are often high-dimensional and require sophisticated analytical approaches. Similarly, the increasing use of data and dynamics in ABMs has further enhanced the complexity of their outputs. In this article, we offer an overview of the state-of-the-art approaches in analyzing and reporting ABM outputs highlighting challenges and outstanding issues. In particular, we examine issues surrounding variance stability (in connection with determination of appropriate number of runs and hypothesis testing), sensitivity analysis, spatio-temporal analysis, visualization, and effective communication of all these to non-technical audiences, such as various stakeholders.

Which Sensitivity Analysis Method Should I Use for My Agent-Based Model?

Guus ten Broeke, George van Voorn and Arend Ligtenberg
Journal of Artificial Societies and Social Simulation 19 (1) 5

Keywords: Sensitivity Analysis, Emergent Properties, Harvest Decision Model, Variance Decomposition
Abstract: Existing methodologies of sensitivity analysis may be insufficient for a proper analysis of Agent-based Models (ABMs). Most ABMs consist of multiple levels, contain various nonlinear interactions, and display emergent behaviour. This limits the information content that follows from the classical sensitivity analysis methodologies that link model output to model input. In this paper we evaluate the performance of three well-known methodologies for sensitivity analysis. The three methodologies are extended OAT (one-at-a-time), and proportional assigning of output variance by means of model fitting and by means of Sobol’ decomposition. The methodologies are applied to a case study of limited complexity consisting of free-roaming and procreating agents that make harvest decisions with regard to a diffusing renewable resource. We find that each methodology has its own merits and exposes useful information, yet none of them provide a complete picture of model behaviour. We recommend extended OAT as the starting point for sensitivity analysis of an ABM, for its use in uncovering the mechanisms and patterns that the ABM produces.
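
Extended OAT, which the authors recommend as a starting point, perturbs one parameter at a time around a nominal point and records the change in output. A minimal sketch, assuming a toy harvest-yield function in place of the actual ABM (the function, parameter names and nominal values are invented for the example):

```python
def harvest_yield(params):
    # toy stand-in for the ABM output: yield from harvest effort e,
    # resource growth rate g and carrying capacity k
    g, e, k = params["g"], params["e"], params["k"]
    return e * k * max(0.0, 1.0 - e / g)

nominal = {"g": 0.8, "e": 0.3, "k": 100.0}

def oat_sensitivity(model, nominal, delta=0.1):
    # perturb each parameter by +/-delta (relative), others fixed,
    # and report a normalised central-difference sensitivity
    base = model(nominal)
    out = {}
    for name in nominal:
        lo = dict(nominal); lo[name] = nominal[name] * (1 - delta)
        hi = dict(nominal); hi[name] = nominal[name] * (1 + delta)
        out[name] = (model(hi) - model(lo)) / (2 * delta * base)
    return out

sens = oat_sensitivity(harvest_yield, nominal)
# yield is proportional to k, so its normalised sensitivity is exactly 1
```

Because only one parameter moves at a time, interactions between parameters are invisible here; that is precisely the gap the variance-decomposition methods in the paper are meant to fill.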

Responsiveness of Mining Community Acceptance Model to Key Parameter Changes

Mark Kofi Boateng and Kwame Awuah-Offei
Journal of Artificial Societies and Social Simulation 20 (3) 4

Keywords: Mining Community, Agent-Based Modeling, Diffusion, Sensitivity Analysis, Mining
Abstract: The mining industry has difficulties predicting changes in the level of community acceptance of its projects over time. These changes are due to changes in society and in individual perceptions around these mines as a result of the mines’ environmental and social impacts. Agent-based modeling can be used to facilitate better understanding of how community acceptance changes with changing mine environmental impacts. This work investigates the sensitivity to key input parameters of an agent-based model (ABM) for predicting changes in community acceptance of a mining project due to information diffusion. Specifically, this study investigates the responsiveness of the ABM to the average degree (total number of friends) of the social network, the close neighbor ratio (a measure of homophily in the social network) and the number of early adopters (“innovators”). A two-level full factorial experiment was used to investigate the sensitivity of the model to these parameters. The primary (main), secondary and tertiary effects of each parameter were estimated to assess the model’s sensitivity. The results show that the model is more responsive to the close neighbor ratio and the number of early adopters than to the average degree. Consequently, uncertainty surrounding the inferences drawn from simulation experiments using the agent-based model will be minimized by obtaining more reliable estimates of the close neighbor ratio and the number of early adopters. While it is possible to reliably estimate the level of early adopters from the literature, the degree of homophily (close neighbor ratio) has to be estimated from surveys, which can be expensive and unreliable. Further work is required to find economical ways to document relevant degrees of homophily in social networks in mining communities.
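
In a two-level full factorial design like the one used here, the main effect of a factor is the mean response at its high level minus the mean response at its low level, computed over all 2^k runs. A sketch with an invented three-factor response function (the function and its coefficients are assumptions, not the paper's model):

```python
from itertools import product

def response(a, b, c):
    # toy stand-in for the ABM response at coded factor levels -1/+1
    return 10 + 4 * a + 2 * b + 0.5 * c + 1.5 * a * b

# all 2^3 = 8 runs of the two-level design
design = list(product([-1, 1], repeat=3))
ys = [response(*run) for run in design]

def main_effect(j):
    # main effect = mean response at +1 minus mean response at -1
    hi = [y for run, y in zip(design, ys) if run[j] == 1]
    lo = [y for run, y in zip(design, ys) if run[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(j) for j in range(3)]
# the a*b interaction averages out of the main effects by design balance
```

Secondary (two-way) effects are computed the same way on the products of two factor columns, which is how the paper separates main from interaction effects.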

Direct and Indirect Economic Incentives to Mitigate Nitrogen Surpluses: A Sensitivity Analysis

Alena Schmidt, Magdalena Necpalova, Albert Zimmermann, Stefan Mann, Johan Six and Gabriele Mack
Journal of Artificial Societies and Social Simulation 20 (4) 7

Keywords: Sensitivity Analysis, Full Factorial Design, Nitrogen Input Tax, Nitrogen Surplus, Swiss Agriculture
Abstract: The reduction of nitrogen (N) surplus is an ongoing topic in the agri-environmental policies of many countries in the developed world. The introduction of N balance estimation in agricultural sector models is therefore pertinent and requires an interdisciplinary approach. We extended the agent-based agricultural sector model SWISSland with an N farm-gate balance estimation to pre-evaluate the effect of a levy on N inputs, particularly a levy on fertilizer and imported concentrates, on N surplus reduction in Swiss agriculture. The model was based on the Swiss farm accountancy data network (FADN) for 3,000 farms. The model’s ability to represent the N balance was assessed by conducting a structured full factorial sensitivity analysis. The sensitivity analysis revealed the option to switch to organic farming and the hectare-based payments for ensuring food security as the key parameters with the largest influence on the modelled N surplus. The evaluation of N input levy scenarios suggested that the introduction of a tax of 800% of the N price would reduce the N surplus by 10%, indicating a price elasticity of -0.03. The sensitivity analysis and the results from the levy scenarios suggest that indirect instruments, such as optimizing the direct payments scheme, should be considered rather than direct instruments for effective mitigation of N surpluses in Swiss agriculture.

Integrating Global Sensitivity Approaches to Deconstruct Spatial and Temporal Sensitivities of Complex Spatial Agent-Based Models

Nicholas R Magliocca, Virginia McConnell and Margaret Walls
Journal of Artificial Societies and Social Simulation 21 (1) 12

Keywords: Global Sensitivity Analysis, Variance Decomposition, Time-Varying Sensitivity Analysis, Spatial Uncertainty, Coastal Hazards
Abstract: Spatial agent-based models (ABMs) can be powerful tools for understanding individual level decision-making. However, in an attempt to represent realistic decision-making processes, spatial ABMs often become extremely complex, making it difficult to identify and quantify sources of model sensitivity. This paper implements a coastal version of the economic agent-based urban growth model, CHALMS, to investigate both space- and time-varying sensitivities of simulated coastal development dynamics. We review the current state of spatially- and temporally-explicit global sensitivity analyses (GSA) for environmental modeling in general, and build on the innovative but nascent efforts to implement these approaches with complex spatial ABMs. Combined variance- and density-based approaches to GSA were used to investigate the partitioning, magnitude, and directionality of model output variance. Time-varying GSA revealed sensitivity of multiple outputs to storm frequency and cyclical patterns of sensitivity for other input parameters. Spatially-explicit GSA showed diverging sensitivities at landscape versus (smaller-scale) zonal levels, reflecting trade-offs in residential housing consumer location decisions and spatial ‘spill-over’ interactions. More broadly, when transitioning from a conceptual to empirically parameterized model, sensitivity analysis is a helpful step to prioritize parameters for data collection, particularly when data collection is costly. These findings illustrate unique challenges of and need to perform comprehensive sensitivity analysis with dynamic, spatial ABMs.

How Policy Decisions Affect Refugee Journeys in South Sudan: A Study Using Automated Ensemble Simulations

Diana Suleimenova and Derek Groen
Journal of Artificial Societies and Social Simulation 23 (1) 2

Keywords: Refugee Modelling, Agent-Based Modelling, Automation Toolkit, Policy Decisions, Validation, Sensitivity Analysis
Abstract: Forced displacement has a huge impact on society today, as more than 68 million people are forcibly displaced worldwide. Existing methods for forecasting the arrival of migrants, especially refugees, may help us to better allocate humanitarian support and protection. However, few researchers have investigated the effects of policy decisions, such as border closures, on the movement of these refugees. Recently established simulation development approaches have made it possible to conduct such a study. In this paper, we use such an approach to investigate the effect of policy decisions on refugee arrivals for the South Sudan refugee crisis. To make such a study feasible in terms of human effort, we rely on agent-based modelling, and have automated several phases of simulation development using the FabFlee automation toolkit. We observe a decrease in the average relative difference from 0.615 to 0.499 as we improved the simulation model with additional information. Moreover, we conclude that a border closure and a reduction in camp capacity induce fewer refugee arrivals and more time spent travelling to other camps, while a border opening and an increase in camp capacity result in only a limited increase in refugee arrivals at the destination camps. To the best of our knowledge, we are the first to conduct such an investigation for this conflict.
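
The average relative difference reported above is, in essence, a normalised error between simulated and observed arrivals. A sketch of one plausible form of such a metric (per-time-step absolute camp errors normalised by total observed population, then averaged over time; not necessarily the exact definition used by FabFlee):

```python
def averaged_relative_difference(sim, obs):
    # at each time step: sum of absolute per-camp errors, normalised by
    # the total observed population, then averaged over time
    diffs = []
    for sim_t, obs_t in zip(sim, obs):
        total_obs = sum(obs_t)
        if total_obs == 0:
            continue
        err = sum(abs(s - o) for s, o in zip(sim_t, obs_t))
        diffs.append(err / total_obs)
    return sum(diffs) / len(diffs)

# two camps observed at three time steps (made-up numbers)
sim = [[90, 10], [150, 60], [300, 120]]
obs = [[100, 10], [140, 70], [280, 140]]
ard = averaged_relative_difference(sim, obs)
```

Lower values mean a better match, which is why the drop from 0.615 to 0.499 in the abstract indicates an improved model.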

‘One Size Does Not Fit All’: A Roadmap of Purpose-Driven Mixed-Method Pathways for Sensitivity Analysis of Agent-Based Models

Arika Ligmann-Zielinska, Peer-Olaf Siebers, Nicholas R Magliocca, Dawn C. Parker, Volker Grimm, Jing Du, Martin Cenek, Viktoriia Radchuk, Nazia N. Arbab, Sheng Li, Uta Berger, Rajiv Paudel, Derek T. Robinson, Piotr Jankowski, Li An and Xinyue Ye
Journal of Artificial Societies and Social Simulation 23 (1) 6

Keywords: Sensitivity Analysis, Agent-Based Model, Individual-Based Model, Review
Abstract: Designing, implementing, and applying agent-based models (ABMs) requires a structured approach, part of which is a comprehensive analysis of the output to input variability in the form of uncertainty and sensitivity analysis (SA). The objective of this paper is to assist in choosing, for a given ABM, the most appropriate methods of SA. We argue that no single SA method fits all ABMs and that different methods of SA should be used based on the overarching purpose of the model. For example, abstract exploratory models that focus on a deeper understanding of the target system and its properties are fed with only the most critical data representing patterns or stylized facts. For them, simple SA methods may be sufficient in capturing the dependencies between the output-input spaces. In contrast, applied models used in scenario and policy-analysis are usually more complex and data-rich because a higher level of realism is required. Here the choice of a more sophisticated SA may be critical in establishing the robustness of the results before the model (or its results) can be passed on to end-users. Accordingly, we present a roadmap that guides ABM developers through the process of performing SA that best fits the purpose of their ABM. This roadmap covers a wide range of ABM applications and advocates for the routine use of global methods that capture input interactions and are, therefore, mandatory if scientists want to recognize all sensitivities. As part of this roadmap, we report on frontier SA methods emerging in recent years: a) handling temporal and spatial outputs, b) using the whole output distribution of a result rather than its variance, c) looking at topological relationships between input data points rather than their values, and d) looking into the ABM black box – finding behavioral primitives and using them to study complex system characteristics like regime shifts, tipping points, and condensation versus dissipation of collective system behavior.
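
The global, variance-based methods the roadmap advocates can be illustrated with a Monte Carlo estimate of first-order Sobol' indices. The sketch below uses a Saltelli-style estimator on an invented additive test function; the function and sample sizes are assumptions for the example.

```python
import random

def model(x):
    # invented additive test function: input 0 carries most variance
    return 4.0 * x[0] + 1.0 * x[1]

def sobol_first_order(f, dim, n=20000, seed=1):
    # Saltelli-style Monte Carlo estimate of first-order Sobol' indices:
    # S_i = (1/n) * sum fB * (fAB_i - fA) / Var(f), with A and B two
    # independent sample matrices on the unit hypercube
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(a) for a in A]
    fB = [f(b) for b in B]
    mean = sum(fA + fB) / (2 * n)
    var = sum((y - mean) ** 2 for y in fA + fB) / (2 * n)
    indices = []
    for i in range(dim):
        # A with its i-th column replaced by B's i-th column
        AB = [a[:i] + [b[i]] + a[i + 1:] for a, b in zip(A, B)]
        fAB = [f(x) for x in AB]
        s = sum(yb * (yab - ya)
                for ya, yb, yab in zip(fA, fB, fAB)) / n
        indices.append(s / var)
    return indices

s1, s2 = sobol_first_order(model, 2)
# analytic values for this function: S1 = 16/17, S2 = 1/17
```

For a real ABM the function calls would be simulation runs, which is why these global methods are expensive and why the surrogate-model literature cited elsewhere in this list matters.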

Calibrating Agent-Based Models with Linear Regressions

Ernesto Carrella, Richard Bailey and Jens Koed Madsen
Journal of Artificial Societies and Social Simulation 23 (1) 7

Keywords: Agent-Based Models, Indirect Inference, Estimation, Calibration, Simulated Minimum Distance, Approximate Bayesian Computation
Abstract: In this paper, we introduce a simple way to parametrize simulation models by using regularized linear regression. Regressions bypass the three major challenges of calibrating by minimization: selecting the summary statistics, defining the distance function and minimizing it numerically. By substituting regression with classification, we can extend this approach to model selection. We present five example estimations: a statistical fit, a biological individual-based model, a simple real business cycle model, a non-linear biological simulation and heuristics selection in a fishery agent-based model. The outcome is a method that automatically chooses summary statistics, weighs them and uses them to parametrize models without running any direct minimization.
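
The core idea of the paper, replacing distance minimisation with a regression from summary statistics back to parameters, can be sketched in a few lines. Everything below (the toy simulator, noise level, observed statistic) is invented for illustration; the paper uses regularized regression over many statistics, whereas this sketch uses plain one-dimensional least squares.

```python
import random

rng = random.Random(0)

def simulate(theta):
    # toy simulator: one summary statistic, a noisy function of theta
    return 2.0 * theta + rng.gauss(0.0, 0.1)

# 1. run the simulator at many random parameter draws
thetas = [rng.uniform(0.0, 5.0) for _ in range(500)]
stats = [simulate(t) for t in thetas]

# 2. regress the parameter on the summary statistic (plain OLS here,
#    standing in for the paper's regularized regression)
mx = sum(stats) / len(stats)
my = sum(thetas) / len(thetas)
slope = (sum((x - mx) * (y - my) for x, y in zip(stats, thetas))
         / sum((x - mx) ** 2 for x in stats))
intercept = my - slope * mx

# 3. plug the statistic observed on real data into the fitted regression
observed_stat = 6.0      # pretend this was measured on real data
theta_hat = intercept + slope * observed_stat
```

No distance function is ever minimised: the regression itself weighs the statistic and inverts the simulator, which is the point the abstract makes.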

Estimating Spatio-Temporal Risks from Volcanic Eruptions Using an Agent-Based Model

J Jumadi, Nick Malleson, Steve Carver and Duncan Quincey
Journal of Artificial Societies and Social Simulation 23 (2) 2

Keywords: ABM, Volcanic Crisis, Risk Estimation, Spatio-Temporal Modeling, MCE, Merapi
Abstract: Managing disasters caused by natural events, especially volcanic crises, requires a range of approaches, including risk modelling and analysis. Risk modelling is commonly conducted at the community/regional scale using GIS. However, people and objects move in response to a crisis, so static approaches cannot capture the dynamics of the risk properly, as they do not accommodate objects’ movements within time and space. The emergence of Agent-Based Modelling makes it possible to model the risk at an individual level as it evolves over space and time. We propose a new approach, the Spatio-Temporal Dynamics Model of Risk (STDMR), which integrates multi-criteria evaluation (MCE) within a georeferenced agent-based model, using Mt. Merapi, Indonesia, as a case study. The model makes it possible to simulate the spatio-temporal dynamics of those at risk during a volcanic crisis. Importantly, individual vulnerability is heterogeneous and depends on the characteristics of the individuals concerned. The risk for the individuals is dynamic and changes along with the hazard and their location. The model is able to highlight a small number of high-risk spatio-temporal positions where, due to the behaviour of individuals who are evacuating the volcano and the dynamics of the hazard itself, the overall risk in those times and places is extremely high. These outcomes are extremely relevant for the stakeholders, and the work of coupling an ABM, MCE, and a dynamic volcanic hazard is both novel and contextually relevant.

Emergence of Small-World Networks in an Overlapping-Generations Model of Social Dynamics, Trust and Economic Performance

Katarzyna Growiec, Jakub Growiec and Bogumił Kamiński
Journal of Artificial Societies and Social Simulation 23 (2) 8

Keywords: Social Network Structure, Social Network Dynamics, Trust, Willingness to Cooperate, Economic Performance, Agent-Based Model
Abstract: We study the impact of endogenous creation and destruction of social ties in an artificial society on aggregate outcomes such as generalized trust, willingness to cooperate, social utility and economic performance. To this end we put forward a computational multi-agent model where agents of overlapping generations interact in a dynamically evolving social network. In the model, four distinct dimensions of individuals’ social capital: degree, centrality, heterophilous and homophilous interactions, determine their generalized trust and willingness to cooperate, altogether helping them achieve certain levels of social utility (i.e., utility from social contacts) and economic performance. We find that the stationary state of the simulated social network exhibits realistic small-world topology. We also observe that societies whose social networks are relatively frequently reconfigured, display relatively higher generalized trust, willingness to cooperate, and economic performance – at the cost of lower social utility. Similar outcomes are found for societies where social tie dissolution is relatively weakly linked to family closeness.

Metamodels for Evaluating, Calibrating and Applying Agent-Based Models: A Review

Bruno Pietzsch, Sebastian Fiedler, Kai G. Mertens, Markus Richter, Cédric Scherer, Kirana Widyastuti, Marie-Christin Wimmler, Liubov Zakharova and Uta Berger
Journal of Artificial Societies and Social Simulation 23 (2) 9

Keywords: Individual-Based Model, Surrogate Model, Emulator, Calibration, Sensitivity Analysis, Review
Abstract: The recent advancement of agent-based modeling is characterized by higher demands on the parameterization, evaluation and documentation of these computationally expensive models. Accordingly, there is also a growing demand for "easy to go" applications that just mimic the input-output behavior of such models. Metamodels are being increasingly used for these tasks. In this paper, we provide an overview of common metamodel types and the purposes of their usage in an agent-based modeling context. To guide modelers in the selection and application of metamodels for their own needs, we further assessed their implementation effort and performance. We performed a literature search in January 2019 using four different databases. Five different terms paraphrasing metamodels (approximation, emulator, meta-model, metamodel and surrogate) were used to capture the whole range of relevant literature in all disciplines. All metamodel applications found were then categorized into specific metamodel types and rated by different junior and senior researchers from varying disciplines (including forest sciences, landscape ecology, or economics) regarding implementation effort and performance. Specifically, we captured the metamodel performance according to (i) the consideration of uncertainties, (ii) the suitability assessment provided by the authors for the particular purpose, and (iii) the number of valuation criteria provided for suitability assessment. We selected 40 distinct metamodel applications from studies published in peer-reviewed journals from 2005 to 2019. These were used for the sensitivity analysis, calibration and upscaling of agent-based models, as well as to mimic their predictions for different scenarios. This review provides information about the most applicable metamodel types for each purpose and offers initial guidance for the implementation and validation of metamodels for agent-based models.

The Use of Surrogate Models to Analyse Agent-Based Models

Guus ten Broeke, George van Voorn, Arend Ligtenberg and Jaap Molenaar
Journal of Artificial Societies and Social Simulation 24 (2) 3

Keywords: Sensitivity Analysis, Surrogate Model, Support Vector Machine, Model Analysis
Abstract: The utility of Agent Based Models (ABMs) for decision making support as well as for scientific applications can be increased considerably by the availability and use of methodologies for thorough model behaviour analysis. In view of their intrinsic construction, ABMs have to be analysed numerically. Furthermore, ABM behaviour is often complex, featuring strong non-linearities, tipping points, and adaptation. This easily leads to high computational costs, presenting a serious practical limitation. Model developers and users alike would benefit from methodologies that can explore large parts of parameter space at limited computational costs. In this paper we present a methodology that makes this possible. The essence of our approach is to develop a cost-effective surrogate model based on ABM output using machine learning to approximate ABM simulation data. The development consists of two steps, both with iterative loops of training and cross-validation. In the first part, a Support Vector Machine (SVM) is developed to split behaviour space into regions of qualitatively different behaviour. In the second part, a Support Vector Regression (SVR) is developed to cover the quantitative behaviour within these regions. Finally, sensitivity indices are calculated to rank the importance of parameters for describing the boundaries between regions, and for the quantitative dynamics within regions. The methodology is demonstrated in three case studies, a differential equation model of predator-prey interaction, a common-pool resource ABM and an ABM representing the Philippine tuna fishery. In all cases, the model and the corresponding surrogate model show a good match. Furthermore, different parameters are shown to influence the quantitative outcomes, compared to those that influence the underlying qualitative behaviour. Thus, the method helps to distinguish which parameters determine the boundaries in parameter space between regions that are separated by tipping points, or by any criterion of interest to the user.
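
The two-step idea (first classify parameter space into qualitative regimes, then fit a quantitative surrogate within each regime) can be sketched in miniature. For simplicity the sketch below substitutes a threshold rule for the SVM and a linear fit for the SVR; the toy tipping-point "ABM" and all values are assumptions made for the example.

```python
def abm_output(p):
    # toy stand-in for an expensive ABM with a tipping point at p = 0.5:
    # the resource collapses below it, output grows linearly above it
    return 0.0 if p < 0.5 else 10.0 * (p - 0.5)

# sample the "expensive" model on a coarse grid
samples = [(i / 20, abm_output(i / 20)) for i in range(21)]

# step 1: locate the boundary between qualitative regimes
# (a threshold rule standing in for the SVM classifier)
zeros = [p for p, y in samples if y == 0.0]
nonzeros = [p for p, y in samples if y > 0.0]
boundary = (max(zeros) + min(nonzeros)) / 2

# step 2: fit the quantitative behaviour within the non-collapsed regime
# (a linear fit standing in for the SVR)
xs = [p for p, y in samples if y > 0.0]
ys = [y for p, y in samples if y > 0.0]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

def surrogate(p):
    # cheap approximation usable in place of the full model
    return 0.0 if p < boundary else my + slope * (p - mx)
```

Once trained, the cheap `surrogate` can be evaluated millions of times for sensitivity analysis at negligible cost, which is the point of the approach.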

No Free Lunch when Estimating Simulation Parameters

Ernesto Carrella
Journal of Artificial Societies and Social Simulation 24 (2) 7

Keywords: Estimation, Calibration, Approximate Bayesian Computation, Random Forest, Generalized Additive Model, Bootstrap
Abstract: In this paper, we estimate the parameters of 41 simulation models to find which of 9 estimation algorithms performs best. Unfortunately, no single algorithm was the best for all or even most of the models. Rather, five main results emerge from this research. First, each algorithm was the best estimator for at least one parameter. Second, the best estimation algorithm varied not only between models but even between parameters of the same model. Third, each estimation algorithm failed to estimate at least one identifiable parameter. Fourth, choosing the right algorithm improved estimation performance more than quadrupling the number of model runs did. Fifth, half of the agent-based models tested could not be fully identified. We therefore argue that the testing performed here should be done in other applied work, and to facilitate this we share the R package 'freelunch'.

Comparing Mechanisms of Food Choice in an Agent-Based Model of Milk Consumption and Substitution in the UK

Matthew Gibson, Raphael Slade, Joana Portugal Pereira and Joeri Rogelj
Journal of Artificial Societies and Social Simulation 24 (3) 9

Keywords: Food Choice, Milk Consumption, Consumer Behaviour, Agent-Based Modelling, Calibration Optimisation, Global Temporal Sensitivity Analysis
Abstract: Substitution of food products will be key to realising widespread adoption of sustainable diets. We present an agent-based model of decision-making and influences on food choice, and apply it to historically observed trends of British whole and skimmed (including semi) milk consumption from 1974 to 2005. We aim to give a plausible representation of milk choice substitution, and test different mechanisms of choice consideration. Agents are consumers that perceive information regarding the two milk choices, and hold values that inform their position on the health and environmental impact of those choices. Habit, social influence and post-decision evaluation are modelled. Representative survey data on human values and long-running public concerns empirically inform the model. An experiment was run to compare two model variants by how they perform in reproducing these trends. This was measured by recording mean weekly milk consumption per person. The variants differed in how agents became disposed to consider alternative milk choices. One followed a threshold approach, the other was probability based. All other model aspects remained unchanged. An optimisation exercise via an evolutionary algorithm was used to calibrate the model variants independently to observed data. Following calibration, uncertainty and global variance-based temporal sensitivity analysis were conducted. Both model variants were able to reproduce the general pattern of historical milk consumption; however, the probability-based approach gave a closer fit to the observed data, but over a wider range of uncertainty. This responds to, and further highlights, the need for research that looks at, and compares, different models of human decision-making in agent-based and simulation models. This study is the first to present an agent-based model of food choice substitution in the context of British milk consumption. It can serve as a valuable precursor to the modelling of dietary shift and sustainable product substitution to plant-based alternatives in Britain.
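
The evolutionary-algorithm calibration step mentioned above can be sketched as a simple (1+λ) evolution strategy that minimises the squared error between simulated and observed consumption. The toy consumption model, the synthetic "observed" series and all algorithm settings are assumptions for illustration, not the study's actual optimiser.

```python
import random

# synthetic "observed" weekly consumption series (generated from the
# toy model below with a = 2.0, b = 0.2)
observed = [2.0, 2.2, 2.8, 3.8, 5.2]

def model(a, b):
    # toy consumption model: level a plus a quadratic trend in time
    return [a + b * t * t for t in range(5)]

def fitness(params):
    # calibration objective: squared error against the observed series
    sim = model(*params)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

def evolve(generations=200, pop=20, sigma=0.3, seed=7):
    # (1+lambda) evolution strategy: keep the best parameter vector,
    # propose Gaussian mutations, accept improvements, shrink step size
    rng = random.Random(seed)
    best = (rng.uniform(0.0, 5.0), rng.uniform(0.0, 1.0))
    for _ in range(generations):
        children = [(best[0] + rng.gauss(0.0, sigma),
                     best[1] + rng.gauss(0.0, sigma)) for _ in range(pop)]
        best = min(children + [best], key=fitness)
        sigma *= 0.97
    return best

a_hat, b_hat = evolve()
```

In the study each fitness evaluation would be a full ABM run over the 1974-2005 series, which is why calibration by evolutionary search is computationally demanding.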