Abstract
Simulation offers management accounting research many benefits, such as the ability to model and to experiment with complex and large systems. At the same time, the acceptance of this method is hampered by a feeling of complexity often associated with simulation models and their behavior, as well as by challenges in communicating the models' results. This study shows how these challenges can be addressed via the systematic use of design of experiment (DOE) principles. The DOE process framework is applied to a simulation model of a cost accounting system that is used to quantitatively evaluate two different methods for the allocation of service costs. As a result, we not only demonstrate the potential and benefits of simulation in the field of management accounting, but also show how DOE principles can help to improve the understanding of simulation model behavior and the communication of simulation results.
Notes
Publications were selected on the basis of a keyword search, using combinations of the terms (computer) simulation, numerical experiment, or simulation methods and models with cost accounting. Publications that did not use simulation to conceptually investigate cost accounting systems were excluded (the majority of papers from the search use a cost simulation to estimate cost developments for certain processes).
The model is described using the ODD protocol by Grimm et al. (2010) as a rough guideline.
For a numerical example of the procedure, see the "Appendix".
The simulation experiment is run in MATLAB and uses the function randn to generate random numbers.
The factor levels are chosen based on interviews with firms and an empirical study by Keilus and Kramer (2006) to provide reasonable settings.
In simulation research, no standard for the analysis of effects has been established to date (here and in the following, see Lorscheid 2014). In empirical research, factorial ANOVA and regression are the traditional approaches. Both are based on the same principle, and their application to simulated data is controversial, since linearity and a normal distribution of the dependent variable are preconditions. Here, a graphical analysis was also conducted to test whether the effects hold for the whole range of results and not only for the mean values.
In the simulation experiment, each parameter setting is run 50 times.
References
Antony, J. (2003). Design of experiments for engineers and scientists. Amsterdam: Butterworth-Heinemann.
Axelrod, R. (1997). Advancing the art of simulation in the social sciences. In R. Conte, R. Hegselmann, & P. Terna (Eds.), Simulating social phenomena (pp. 21–40). Berlin: Springer.
Balakrishnan, R., Hansen, S., & Labro, E. (2011). Evaluating heuristics used when designing product costing systems. Management Science, 57, 520–541. doi:10.1287/mnsc.1100.1293.
Balakrishnan, R., & Penno, M. (2014). Causality in the context of analytical models and numerical experiments. Accounting, Organizations and Society, 39, 531–534.
Balakrishnan, N., Render, B., & Stair, R. M. (2007). Managerial decision modeling with spreadsheets (2nd ed.). Upper Saddle River: Pearson/Prentice Hall.
Balakrishnan, R., & Sivaramakrishnan, K. (2002). A critical overview of the use of full-cost data for planning and pricing. Journal of Management Accounting Research, 14, 3–31.
Barth, R., Meyer, M., & Spitzner, J. (2012). Typical pitfalls of simulation modelling: lessons learned from armed forces and business. Journal of Artificial Societies and Social Simulation, 15(2), 5.
Becker, J., Niehaves, B., & Klose, K. (2005). A framework for epistemological perspectives on simulation. Journal of Artificial Societies and Social Simulation, 8(4), 1.
Box, G. E. P., Hunter, J. S., & Hunter, W. G. (2005). Statistics for experimenters: design, innovation, and discovery. Hoboken: Wiley-Interscience.
Cooper, R., & Kaplan, R. S. (1999). The design of cost management systems: text and cases (2nd ed.). Upper Saddle River: Prentice Hall.
Davidsson, P. (2002). Agent based social simulation: a computer science view. Journal of Artificial Societies and Social Simulation, 5(1), 7.
Davis, J. P., Eisenhardt, K. M., & Bingham, C. B. (2007). Developing theory through simulation methods. Academy of Management Review, 32, 480–499.
Eason, R., Rosenberger, R., Kokalis, T., Selinger, E., & Grim, P. (2007). What kind of science is simulation? Journal of Experimental & Theoretical Artificial Intelligence, 19, 19–28.
Edmonds, T. P., Edmonds, C. T., Tsay, B.-Y., & Olds, P. R. (2014). Fundamental managerial accounting concepts (7th ed.). New York: McGraw-Hill Irwin.
Frahm, M. (2011). Beschreibung von komplexen Projektstrukturen. PM aktuell, 2, 22–27.
Garrison, R. H., Noreen, E. W., & Brewer, P. C. (2012). Managerial accounting (14th ed.). New York: McGraw-Hill/Irwin.
Gilbert, N. (2008). Agent-based models. London: SAGE.
Gilbert, N., & Troitzsch, K. G. (2005). Simulation for the social scientist (2nd ed.). Berkshire: Open University Press.
Grimm, V., et al. (2006). A standard protocol for describing individual-based and agent-based models. Ecological Modelling, 198, 115–126. doi:10.1016/j.ecolmodel.2006.04.023.
Grimm, V., Berger, U., DeAngelis, D. L., Polhill, J. G., Giske, J., & Railsback, S. F. (2010). The ODD protocol: a review and first update. Ecological Modelling, 221, 2760–2768.
Grisar, C., & Meyer, M. (forthcoming). Use of simulation in controlling research: a systematic literature review for German-speaking countries.
Hansen, D. R., & Mowen, M. M. (2006). Cost management: accounting and control (5th ed.). Mason: Thomson/South-Western.
Harrison, J. R., Carroll, G. R., & Carley, K. M. (2007). Simulation modeling in organizational and management research. Academy of Management Review, 32, 1229–1245.
Hendricks, W. A., & Robey, K. W. (1936). The sampling distribution of the coefficient of variation. The Annals of Mathematical Statistics, 7, 129–132.
Horngren, C. T., Bhimani, A., Datar, S. M., & Foster, G. (2005). Management and cost accounting (3rd ed.). Edinburgh: Pearson Education Limited.
Horngren, C. T., Datar, S. M., & Foster, G. (2006). Cost accounting: a managerial emphasis (12th ed.). Upper Saddle River: Pearson Prentice Hall.
Horngren, C. T., Datar, S. M., & Rajan, M. V. (2014). Cost accounting: a managerial emphasis (15th ed.). Upper Saddle River: Pearson/Prentice Hall.
Jacobs, F., & Marshall, R. (1999). Accuracy of service cost allocation. The Journal of Cost Analysis & Management, 1, 45–57.
Keilus, M., & Kramer, D. (2006). Rechnungswesen und Controlling im regionalen Umfeld Trier: Empirische Ergebnisse zum Stand und zu Entwicklungstendenzen. Fachhochschule Trier, Arbeitsbericht Nr. 2.
Kleijnen, J. P. C., Sanchez, S. M., Lucas, T. W., & Cioppa, T. M. (2005). A user's guide to the brave new world of designing simulation experiments. INFORMS Journal on Computing, 17, 263–289. doi:10.1287/ijoc.1050.0136.
Klingert, F. M. A., & Meyer, M. (2012). Effectively combining experimental economics and multi-agent simulation: suggestions for a procedural integration with an example from prediction markets research. Computational and Mathematical Organization Theory, 18, 63–90.
Labro, E. (2015). Using simulation methods in accounting research. Journal of Management Control, Special Issue on Simulation in Management Accounting Research (forthcoming).
Labro, E., & Vanhoucke, M. (2007). A simulation analysis of interactions among errors in costing systems. The Accounting Review, 82, 939–962.
Labro, E., & Vanhoucke, M. (2008). Diversity in resource consumption pattern and robustness of costing systems to errors. Management Science, 54, 1715–1730.
Law, A. M. (2007). Simulation modeling and analysis (4th ed.). Boston: McGraw-Hill.
Leitner, S. (2014). A simulation analysis of interactions among intended biases in costing systems and their effects on the accuracy of decision-influencing information. Central European Journal of Operations Research, 22, 113–138. doi:10.1007/s10100-012-0275-2.
Lorscheid, I. (2014). Learning agents for mechanism design analysis. Dissertation, Hamburg University of Technology, Hamburg.
Lorscheid, I., Heine, B.-O., & Meyer, M. (2012). Opening the "black box" of simulations: increased transparency and effective communication through the systematic design of experiments. Computational and Mathematical Organization Theory, 18, 22–62.
Manuj, I., Mentzer, J. T., & Bowers, M. R. (2009). Improving the rigor of discrete-event simulation in logistics and supply chain research. International Journal of Physical Distribution & Logistics Management, 39, 172–201.
Mårtensson, A., & Mårtensson, P. (2007). Extending rigor and relevance: towards credible, contributory and communicable research. In ECIS Proceedings.
Meyer, M., Lorscheid, I., & Troitzsch, K. G. (2009). The development of social simulation as reflected in the first ten years of JASSS: a citation and co-citation analysis. Journal of Artificial Societies and Social Simulation, 12.
Montgomery, D. C. (2008). Design and analysis of experiments (7th ed.). Hoboken: Wiley.
Müller, B., et al. (2013). Describing human decisions in agent-based models: ODD + D, an extension of the ODD protocol. Environmental Modelling & Software, 48, 37–48. doi:10.1016/j.envsoft.2013.06.003.
Railsback, S. F., & Grimm, V. (2012). Agent-based and individual-based modelling: a practical introduction. Princeton: Princeton University Press.
Reiss, J. (2011). A plea for (good) simulations: nudging economics towards an experimental science. Simulation & Gaming, 42, 243–264.
Richiardi, M., Leombruni, R., Saam, N. J., & Sonnessa, M. (2006). A common protocol for agent-based social simulation. Journal of Artificial Societies and Social Simulation, 9.
Robinson, S. (2004). Simulation: the practice of model development and use. Chichester: Wiley.
Rubinstein, R. Y., & Kroese, D. P. (2008). Simulation and the Monte Carlo method. Wiley series in probability and statistics (2nd ed.). Hoboken: Wiley.
Stinson, J. B. (2002). Cost allocation: from the simple to the sublime. Management Accounting Quarterly, 4, 1–10.
Wall, F. (2014). Agent-based modeling in managerial science: an illustrative survey and study. Review of Managerial Science, 1–59.
Wouters, M., Selto, F., Hilton, R., & Maher, M. (2012). Cost management: strategies for business decisions. Berkshire: McGraw-Hill Higher Education.
Appendix
1.1 Numerical example
In the following, we present a numerical example to illustrate the process of the simulation experiment. To this end, the simulation experiment is conducted once with a given parameter setting, which is summarized in Table 10. Each step of the simulation process is shown and explained, together with the calculations, to visualize how the output is generated.
Step 1: Initializing the cost vector and the consumption matrix
The variable Cost distribution on service departments (Dtr_Cost) sets the upper boundary of a uniformly distributed interval, which is 0.5; each number in this interval has the same probability of being drawn. The number of service departments is defined by the variable Number of service departments (No_SD) and is set to 3. In the first step, we draw a number from the interval [0, 0.5] for each of the three service departments. Table 11 shows the resulting numbers. In the second step, we normalize the random numbers and multiply them by the total amount of costs [a control variable, set to 1,000,000 monetary units (MU)] to generate the direct costs at each service department (see Table 12).
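This initialization step can be sketched as follows (a minimal Python illustration for exposition, not the authors' MATLAB code; the function name `init_service_costs` is ours):

```python
import random

def init_service_costs(no_sd=3, dtr_cost=0.5, total_cost=1_000_000):
    """Draw one uniform random number per service department from
    [0, dtr_cost], normalize the draws, and scale them to the total
    cost in the system (a control variable, 1,000,000 MU here)."""
    draws = [random.uniform(0, dtr_cost) for _ in range(no_sd)]
    total = sum(draws)
    return [d / total * total_cost for d in draws]

# Example: three service departments; by construction the direct
# costs sum (up to rounding) to the 1,000,000 MU control total.
costs = init_service_costs(no_sd=3)
```

Note that because the draws are normalized, only their relative sizes matter for the resulting cost split.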
Next, we create the consumption matrix. The consumption matrix represents the amount of service flow from service departments to other departments. The first row represents the amount of service that the first service department provides to the department in the respective column. It is assumed that service departments do not consume their own service, but that all service departments provide services to production departments. The number of production departments is controlled by the variable Number of production departments (No_PD) and set to 5 in this example. Here again, random numbers are drawn from a uniformly distributed interval whose upper boundary is defined by the variable Variation of supplied services (Var_Serv). The variable is set to 0.5 and thus, for each matrix entry, we draw a number from the interval [0, 0.5], except for the entries representing service flow between service departments. These entries are only filled to a certain degree, defined by the variable Degree of variety (Degr_Var). This variable is set to 0.4, which means that only 40 % of the entries are filled with numbers greater than zero. Table 13 shows the results of this step. One can see that between service departments 1 and 3, there is a reciprocal service exchange. Of the six possible service exchanges (the diagonal remains zero, as explained above), two are greater than zero. We normalize each row of the consumption matrix; in this way, for each service department, the sum of provided services adds up to 1. See Table 14 for the results of this step.
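The construction of the consumption matrix can be sketched as below (our own Python illustration; the exact filling rule for the Degr_Var share of SD-to-SD entries, here a random sample of 40 % of the off-diagonal positions, is our assumption):

```python
import random

def init_consumption_matrix(no_sd=3, no_pd=5, var_serv=0.5, degr_var=0.4):
    """Rows: service departments. Columns: the no_sd service departments
    first, then the no_pd production departments. The diagonal stays zero
    (no self-consumption); only a degr_var share of the off-diagonal
    SD-to-SD entries receives a positive draw; each row is normalized
    so that the provided services sum to 1."""
    off_diag = [(i, j) for i in range(no_sd) for j in range(no_sd) if i != j]
    filled = set(random.sample(off_diag, int(degr_var * len(off_diag))))
    matrix = []
    for i in range(no_sd):
        row = [random.uniform(0, var_serv) if (i, j) in filled else 0.0
               for j in range(no_sd)]
        # every service department serves all production departments
        row += [random.uniform(0, var_serv) for _ in range(no_pd)]
        total = sum(row)
        matrix.append([x / total for x in row])
    return matrix
```

With the example's settings, int(0.4 × 6) = 2 of the six possible SD-to-SD exchanges are filled, matching Table 13.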
Step 2: Calculating the final costs at production departments for both allocation methods
Based on the cost vector and the consumption matrix, both allocation methods calculate their results. The result is a cost vector that represents the allocated cost at each production department.

(a) Direct method
The direct method ignores any services between service departments. Let \(\hbox {x}_{\mathrm{SDi}\_\mathrm{SDn}}\) be the amount of service provided from service department i to service department n, and \(\hbox {x}_{\mathrm{SDi}\_\mathrm{PDj}}\) the amount of service provided from service department i to production department j. Let \(\hbox {cost}_{\mathrm{SDi}}\) be the total amount of costs at service department i. Then \(\hbox {cost}_{\mathrm{SDi}\_\mathrm{PDj}}\), the cost allocated from service department i to production department j, is calculated as follows, with n being the total number of service departments:
$$\begin{aligned}&\hbox {cost}_{\mathrm{SDi}\_\mathrm{PDj}} =\frac{\hbox {x}_{\mathrm{SDi}\_\mathrm{PDj}} }{1-\mathop \sum \nolimits _{k=1}^n x_{SD_i \_SD_k }}\times cost_{SD_i }\\&\hbox {cost}_{{\mathrm{SD}}1\_\mathrm{PD}1} =\frac{0.1423}{( {1-0.2130} )}\times 181{,}935\,\hbox {MU}=32{,}896\, \hbox {MU}. \end{aligned}$$
We ignored the amount of service provided to service department 3 (it is subtracted from the total amount in the denominator). The respective share of service delivered to production department 1 is multiplied by the direct costs at service department 1. This step is repeated for each service flow. Finally, for each production department, we sum up the allocated costs.
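The direct-method calculation can be sketched in Python (a hedged illustration of our own, not the authors' MATLAB implementation; the row layout, service-department columns before production-department columns, follows the consumption matrix described above):

```python
def direct_method(costs, matrix, no_sd, no_pd):
    """Allocate service-department costs directly to production
    departments: SD-to-SD flows are dropped, and each SD-to-PD share is
    rescaled by 1 minus the share delivered to other service departments."""
    allocated = [0.0] * no_pd
    for i in range(no_sd):
        sd_share = sum(matrix[i][:no_sd])     # ignored SD-to-SD service
        for j in range(no_pd):
            allocated[j] += matrix[i][no_sd + j] / (1.0 - sd_share) * costs[i]
    return allocated
```

Because each row of the consumption matrix sums to 1, the rescaling guarantees that every service department's direct costs are allocated in full, so the allocated costs always sum to the total service-department costs.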
(b) Reciprocal method
The reciprocal method acknowledges the service exchange among service departments. Therefore, a set of equations is solved simultaneously to generate transfer prices for each service department that take the service exchange into account. In this example, only two equations must be solved, since service department 2 neither receives services from nor delivers services to other service departments. Let \(\hbox {coco}_{\mathrm{SDi}}\) be the complete cost (including the reciprocal service) at service department i:
$$\begin{aligned} \hbox {coco}_{\mathrm{SD1}}&=\hbox {cost}_{\mathrm{SD1}} +x_{\mathrm{SD3}\_\mathrm{SD1}} \times \hbox {coco}_{\mathrm{SD3}}\\ \hbox {coco}_{\mathrm{SD3}}&=\hbox {cost}_{\mathrm{SD3}} +x_{\mathrm{SD1}\_\mathrm{SD3}} \times \hbox {coco}_{\mathrm{SD1}} \end{aligned}$$
Solving for \(\hbox {coco}_{\mathrm{SD1}}\) and \(\hbox {coco}_{\mathrm{SD3}}\):
$$\hbox {coco}_{\mathrm{SD1}} =\frac{\hbox {cost}_{\mathrm{SD1}} +x_{\mathrm{SD3}\_\mathrm{SD1}} \times \hbox {cost}_{\mathrm{SD3}} }{1-x_{\mathrm{SD1}\_\mathrm{SD3}} \times x_{\mathrm{SD3}\_\mathrm{SD1}} },$$
and analogously for \(\hbox {coco}_{\mathrm{SD3}}\).
The costs that are allocated to production department j are calculated as \(\hbox {cost}_{\mathrm{SDi}\_\mathrm{PDj}} =x_{\mathrm{SDi}\_\mathrm{PDj}} \times \hbox {coco}_{\mathrm{SDi}}\).
We summarize the results from the simulation experiment in TableÂ 15. For each production department, we show the allocated costs using the reciprocal method and using the direct method.
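The reciprocal allocation can be sketched for an arbitrary number of service departments (our own Python illustration; the paper solves only the two relevant equations, whereas this sketch solves the full linear system \(\hbox{coco}_i = \hbox{cost}_i + \sum_n x_{\mathrm{SDn}\_\mathrm{SDi}}\,\hbox{coco}_n\) by Gaussian elimination):

```python
def reciprocal_method(costs, matrix, no_sd, no_pd):
    """Solve coco_i = cost_i + sum_n x_{SDn_SDi} * coco_n for the complete
    costs coco, then allocate coco_i * x_{SDi_PDj} to each production
    department. Matrix layout: rows are SDs; SD columns precede PD columns."""
    # Linear system A * coco = costs with A[i][n] = delta_in - matrix[n][i]
    a = [[(1.0 if i == n else 0.0) - matrix[n][i] for n in range(no_sd)]
         for i in range(no_sd)]
    b = list(costs)
    # Gaussian elimination with partial pivoting on the small SD-by-SD system
    for col in range(no_sd):
        piv = max(range(col, no_sd), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, no_sd):
            f = a[r][col] / a[col][col]
            for c in range(col, no_sd):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coco = [0.0] * no_sd
    for i in reversed(range(no_sd)):
        s = sum(a[i][c] * coco[c] for c in range(i + 1, no_sd))
        coco[i] = (b[i] - s) / a[i][i]
    # Allocate the complete costs to production departments
    return [sum(coco[i] * matrix[i][no_sd + j] for i in range(no_sd))
            for j in range(no_pd)]
```

As with the direct method, row normalization of the consumption matrix ensures that the amounts allocated to the production departments sum to the total direct costs of the service departments.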
Step 3: Evaluating the results
To evaluate the simulation results, we use the Euclidean distance measure (EUCD), taken from Balakrishnan et al. (2011). It is defined as \(EUCD=\sqrt{\mathop \sum \nolimits _{i=1}^{Nr\_PD} ( {RM_i -DM_i } )^{2}}\), with \(RM_i\) being the result of the reciprocal method at the ith production department and \(DM_i\) the respective result of the direct method. This measure expresses, in monetary units, the misallocation of the direct method compared to the reciprocal method. Applying this definition to the per-department results in Table 15 yields the EUCD for this example.
Thus, the output of the simulation experiment for this run is 8224 MU: in total, 8224 monetary units are wrongly allocated to the production departments. Given a total amount of 1,000,000 MU in the system, this distortion is relatively low.
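The EUCD itself is a one-liner; as a sketch in the same Python style as above (the function name is ours):

```python
import math

def eucd(rm, dm):
    """Euclidean distance between the reciprocal-method and
    direct-method allocations across production departments."""
    return math.sqrt(sum((r - d) ** 2 for r, d in zip(rm, dm)))
```

An EUCD of 0 means the direct method reproduces the reciprocal benchmark exactly; larger values indicate larger total misallocation in monetary units.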
Hocke, S., Meyer, M., & Lorscheid, I. Improving simulation model analysis and communication via design of experiment principles: an example from the simulation-based design of cost accounting systems. J Manag Control 26, 131–155 (2015). https://doi.org/10.1007/s00187-015-0216-z