The Evaluation of the IT Service Quality Measurement Framework in Industry


This paper has two objectives: (a) to evaluate an IT service quality measurement framework, and (b) to refine that framework for the IT service industry. We explore the notion of IT service quality from a holistic point of view: we evaluate the IT service quality measures that could help IT service organizations understand the quality of the IT services they offer and address the areas where provider-driven IT service improvement is needed. As an example of the interconnectivity between IT service quality measures, we take a closer look at how process performance relates to other IT service quality measures and to overall IT service quality. To attain our research objectives, we evaluate the IT service quality measurement framework that we proposed earlier. The evaluation is carried out through semi-structured interviews with IT service providers. The study follows the design science research paradigm, which is based on constructive research. The interviewed organizations collect and analyse data about various IT service quality measures from all the dimensions of the proposed framework without understanding the interdependencies between them. We use a systems thinking approach to interpret the results and to describe the importance of a holistic view in understanding the behaviour of a service system. Finally, we contextualize the IT service quality measurement framework that supports provider-driven IT service improvement in simple, complicated and complex contexts.


Figures 1–4




The present study is supported by the National Research Fund, Luxembourg, and co-funded under the Marie Curie Actions of the European Commission (FP7-COFUND); and by the CICYT-TIN2010-C03-03 “Simulation applied to team, process and service management, Sim4Gest” project funded by the Spanish Ministry of Science and Innovation.

Author information



Corresponding author

Correspondence to Marion Lepmets.


Appendix 1—Questionnaire used for Semi-Structured Interviews

Section A. Background of the Organization

  1. What is the core business area of your organization?

    • Software Development

    • IT Service Provider

    • Software Developer and Service Provider

    • Consultancy

    • Other, please specify:

  2. What service do you provide to your customers?

    Please describe:

  3. What is the core business area of your main customer(s)?

    Please choose only one of the following:

    • Manufacturing

    • Banking and Finance

    • Consultancy

    • Education and research

    • Other, please specify:

  4. What is the size of your organization?

    Please choose only one of the following:

    • Micro (up to 9 persons)

    • Small (10–49 persons)

    • Medium (50–249 persons)

    • Large (>250 persons)

  5. What is your role in the organization?

    Please choose only one of the following:

    • Senior/top manager

    • Project/service manager

    • Developer/Operator

    • Process manager

    • Quality manager

    • Technical lead

    • Consultant

    • Other:

Section B. Background of process improvement

  6. What do you consider to be an improvement in your processes?

    Please choose all that apply:

    • Implementation of small changes to the way you work to become more efficient

    • Transferring knowledge to future projects after project closure (post-mortem analysis)

    • Implementation of new technologies/technological tools

    • Improved definition of roles and responsibilities

    • Improved definition of processes and work instructions

    • Implementation of process improvement programme/initiative

    Please select the most relevant response(s)!

  7. When do you start to make improvements?

    Please choose all that apply:

    • At the end of the project (during post-mortem analysis) or service delivery

    • After hearing about a better method or tool

    • After completing a training course, reading a professional textbook or attending a conference

    • After receiving critical customer survey results

    • After a process assessment

    • When management is planning an organizational strategic change

    • When customer(s) requests it

    • Other:

  8. How do you initiate improvements?

    Please choose all that apply:

    • Discussing informally with colleagues about ways to improve

    • Discussing and planning at a project/operation team meeting

    • Brainstorming in a department meeting

    • After processes have been assessed against an international standard or process model

    • After an organizational audit

    • Implementing industry best practices or an enterprise architecture framework

    • Other:

  9. How do you measure process improvements?

    Please choose all that apply:

    • Improvements are not measured

    • Measuring personal performance and/or productivity

    • Evaluating the achievement of product or service quality requirements

    • Evaluating the achievement of project or service performance objectives

    • Measuring project productivity

    • Conducting model/standard based process assessments

    • Evaluating stakeholder satisfaction

    • Evaluating customer satisfaction

    • Evaluating employee satisfaction

    • Measuring organizational productivity

    • Evaluating the achievement of organizational goals

    • Calculating the return on investment to process improvement

    • Other:

  10. Which process improvement model(s)/method(s) is/are used in your organization?

    Please choose all that apply:

    • Our own experience and knowledge

    • PSP/TSP—personal/team software process

    • ISO/IEC 15504

    • CMMI

    • Six Sigma

    • ITIL

    • CoBIT

    • ValIT

    • ISO/IEC 20000

    • ISO 9000

    • Lean

    • Theory of Constraints

    • No improvement methods are used

    • Other, please name:

  11. Before improving your processes, did you do the following?

    Please choose all that apply:

    • Identify the process improvement goals

    • Identify the process improvement scope

    • Identify and communicate the organization’s business goals to the staff

    • Decide upon the change strategy for the organization

    • Get management’s support and commitment for the improvement

    • Allocate the roles and responsibilities for the improvements in the organization

    • Set the scope of change in the organization

    • Conduct a process assessment

    • Other, please describe:

  12. How did the process improvements end?

    Please choose all that apply:

    • Interested parties met to share lessons learnt from the improvement

    • Improvement implementation plans were established

    • Methods to monitor the improvements were agreed upon

    • A future re-assessment was planned to check the effects of the improvements

    • No monitoring or re-assessment was planned

    • I don’t know how the improvements were finalized

    • When changes to the processes were implemented

    • Other, please specify:

  13. Which IT service management processes have you improved?

    Please choose all that apply:

    • Incident Management

    • Change Management

    • Problem Management

    • Service Level Management

    • Others, please list:

Section C. Background of IT service quality measurement

  14. How do you improve the quality of the IT services you provide?

    • Achieving the service level requirements (SLRs) defined in service level agreements (SLAs)

    • Increasing IT Service Management process maturity

    • Increasing IT service stability

    • Increasing information system stability

    • Other, please describe:

  15. In order to know and/or improve the quality of your IT services, what do you measure?

    • Customer satisfaction

    • Service behaviour, service delivery, employee morale

    • IT Service Management process quality

    • IT Service quality/stability

    • Information system quality

    • Other, please describe:

  16. What and how do you measure customer satisfaction?

    What do you ask your customers in order to learn their level of satisfaction?

    • Appearance of physical facilities, equipment, personnel and communications material

    • IT service being provided dependably and accurately

    • IT service provider’s willingness to help customers and provide prompt service

    • IT service provider is conveying trust and confidence

    • IT service provider provides caring, individualized attention

    • Customer’s perception of Service Desk (handling of customer requests)—average call response time and the satisfaction with the incidents’ handling

    • Customer’s perception of IT service stability

    • Customer’s perception of IS quality (as a part of IT service)

    • Customer’s perception of IT service provider’s process performance

    • Other, please describe:


    • Customer feedback/survey

    • Other, please describe:

  17. What and how do you measure process quality?

    What do you measure to know the level of process quality?

    • Process compliance to standards

    • Process productivity

    • Defect containment and rework

    • Other, please describe:


    • Process audit

    • Process assessment

    • Analysis of historical vs proposed and actual data

    • Rework effort in service design

    • Please describe:

  18. What and how do you measure IT service quality/stability?

    What do you measure to know the level of quality of the IT service you offer?

    • IT service availability

    • IT service continuity

    • IT service capacity

    • IT service performance

    • IT service utilization

    • Information security

    • IT service reliability/dependability (customer support—handling of RFCs, problems, mean time to incident resolution)

    • Monetary value of IT service (cost, price, accuracy of service functions’ forecast, and competitiveness of a service)

    • Please describe:


    • Please describe the measures per each category:

  19. What and how do you measure information system quality?

    What do you measure to know the level of quality of your information system?

    • Defects

    • Security flaws and vulnerabilities

    • Standards compliance

    • Problems and errors

    • Time to restore the system (time between incidents, time between failures, number of incidents resolved daily)

    • Performance of technical components

    • Capacity of technical components

    • Incidents related to the speed of growth

    • System complexity

    • System adjustability—how easy it is to customize the system according to business requirements

    • Other, please describe:


    • Please describe:

  20. Do you measure the value of the IT service you offer?

    • No

    • Yes, through:

      • Value co-creation through revenue growth

      • Identification of non-value added activities

      • Business/IT alignment

      • Value creation and value delivery

      • Other, please specify:

      • How, please specify:

  21. Do you measure service behaviour and employee morale?

    • No

    • Yes, through:

      • Managers stressing the goals and values of the organization

      • Understanding the value the offered service brings to the customer’s business

      • Managers regularly discussing the best approach to serve the customers

      • Having a shared vision of an excellent service and the best way to deliver it

      • Being evaluated based on the contribution we make to the quality of the service offering

      • Others, please describe:

Section D. Impact of process improvement on IT service quality

  22. How do you know process improvement made a difference in your organization?

  23. Do you think process improvement positively impacts your organization?

    • No

    • Yes

      • Why?

      • Which parts of your organization?

  24. How do you justify process improvement initiatives to your organization?

    Please describe:

  25. What data do you seek to justify process improvement initiatives?

    Please describe:

  26. How do you detect that improvements actually happened after the changes were implemented?

    Please describe:

  27. After process improvement, did you see an increase in any of the following?

    • Customer satisfaction

    • Service behaviour

    • IT Service Management process quality

    • IT Service quality/stability

    • Information system quality

    • Other, please describe:

  28. Can you describe how and when you realized the impact on the previous IT service quality categories after the implementation of the process improvements?

Appendix 2

See Table 3.


Cite this article

Lepmets, M., Mesquida, A.L., Cater-Steel, A. et al. The Evaluation of the IT Service Quality Measurement Framework in Industry. Glob J Flex Syst Manag 15, 39–57 (2014).



Keywords

  • IT service quality measurement
  • Process improvement
  • Provider-driven IT service improvement
  • Systems thinking