
Abstract

The conventional maxim that "what gets measured gets done" has motivated many companies to measure their performance systematically over the years. Whereas performance measurement previously focused solely on financial, backward-looking measures, it is now generally agreed that a performance measurement system (PMS) should align with a company’s long-term, strategic objectives. These objectives are largely dictated by the company’s production situation, and vice versa. When approached by a Norwegian engineer-to-order (ETO) company requesting a PMS, the authors could not identify any literature explicitly addressing PMS design for ETO. The authors therefore set out to design the PMS from scratch. The purpose of this paper is to illustrate how the PMS was designed in close collaboration with the case company, bearing the general characteristics and competitive priorities of ETO in mind.

Keywords

Performance measurement system, design methodology, engineer-to-order

Copyright information

© IFIP International Federation for Information Processing 2014

Authors and Affiliations

  • Børge Sjøbakk¹
  • Ottar Bakås¹
  1. SINTEF Technology and Society, Industrial Management, Trondheim, Norway
