In IT management, maturity models have proved to be an important instrument because they allow for a better positioning of the organization and help identify better solutions for change. Over the last few years, more than a hundred maturity models have been developed to support IT management. However, the procedures and methods that led to these models have only been documented sketchily. Using a scientific approach, we have developed criteria for the development of maturity models. These criteria also serve as a basis for the comparison of sparsely documented maturity approaches. The results thus obtained have been generalized and consolidated into a generally applicable model. A case study illustrates the applicability of our model. The results of this paper are meant to serve as a manual for methodically well-founded designs and evaluations of maturity models.

1 The importance of maturity models for IT management

IT support of business processes has become indispensable for many companies (Müller et al. 2006, p. 101). Moreover, innovative IT systems generally offer great opportunities to improve a company’s competitiveness (Henderson and Venkatraman 1993; McFarlan and Nolan 2003; Pößneck 2007). The responsibility for the effective and efficient design and use of IT lies with the company’s IT management. The main goal here is to continually improve IT performance with regard to its economic efficiency.

Continual improvement requires the company’s positioning with regard to its IT capabilities and the quality of its goods and services. As a rule, this positioning involves a comparison with the company’s goals, external requirements (e. g. customer demands, laws, or guidelines), or benchmarks. Achieving an objective assessment of a company’s position, however, often proves to be a difficult task. For each aspect of the company’s IT under investigation, the questions arise as to what needs to be measured, how to measure it, and what to compare it with in order to assess the as-is situation of a company and to assign it a specific quality or degree of maturity. IT management therefore needs supportive tools to assess the as-is situation of a company, to derive and prioritize improvement measures, and to subsequently control the progress of their implementation.

Maturity models are helpful tools for addressing these issues (de Bruin et al. 2005). A maturity model consists of a sequence of maturity levels for a class of objects. It represents an anticipated, desired, or typical evolution path of these objects, shaped as discrete stages. Typically, these objects are organizations or processes. The bottom stage stands for an initial state that can be characterized, for instance, by an organization having few capabilities in the domain under consideration. In contrast, the highest stage represents a conception of total maturity. Advancing on the evolution path between the two extremes involves a continuous progression of the organization’s capabilities or process performance. The maturity model serves as the scale for the appraisal of the position on the evolution path. It provides criteria and characteristics that need to be fulfilled to reach a particular maturity level. During a maturity appraisal, a snapshot of the organization regarding the given criteria is taken. The characteristics found are evaluated to identify the maturity level that matches the organization. The application of maturity models can be supported by predetermined procedures, e. g. questionnaires. Based on the results of the as-is analysis, recommendations for improvement measures can be derived and prioritized in order to reach higher maturity levels (IT Governance Institute 2007).
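
As a minimal illustration of these concepts, a maturity model can be represented as an ordered sequence of levels, each defined by criteria, with the appraisal returning the highest level whose criteria are fulfilled. The following Python sketch uses hypothetical level names and criteria; it is not taken from any of the models cited here.

```python
from dataclasses import dataclass

@dataclass
class Level:
    rank: int            # position on the evolution path (0 = initial state)
    name: str
    criteria: list[str]  # characteristics required to reach this level

# Hypothetical three-level model, for illustration only
MODEL = [
    Level(0, "initial", []),
    Level(1, "defined", ["processes documented"]),
    Level(2, "managed", ["processes documented", "performance measured"]),
]

def appraise(observed: set[str]) -> Level:
    """Return the highest level whose criteria the organization fulfills."""
    achieved = MODEL[0]
    for level in MODEL:  # levels are ordered from bottom to top
        if all(criterion in observed for criterion in level.criteria):
            achieved = level
    return achieved

# A snapshot of the organization's observed characteristics
print(appraise({"processes documented"}).name)  # -> defined
```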

Studies have shown that more than a hundred different maturity models have been proposed (de Bruin et al. 2005). The constant publication of new maturity models for often fairly similar applications, however, suggests a certain arbitrariness. The authors only rarely reveal their motivation, the development of the model, their procedural method, or the results of their evaluation.

The aim of this paper is therefore to propose a procedure model for the design of maturity models, in the hope of remedying these widespread shortcomings. In a first step, the requirements for the development process of maturity models are identified (section 2). On the basis of these requirements, the few well-documented development processes of maturity models are compared (section 3). The results from section 3 provide the groundwork for the construction of a procedure model for the development of maturity models (section 4). In section 5, the procedure model is illustrated by the development of a maturity model for the implementation of IT performance measurement supported by BI tools. Finally, an outlook on further research completes this article (section 6).

2 Requirements for the development of maturity models

In order to establish a reasonable catalogue of requirements for the design of maturity models, the seven guidelines for design science defined by Hevner et al. (2004) have been chosen as the basis for our argument. Design science aims at the improvement of problem-solving capabilities by creating innovative artifacts, such as constructs, models, methods, and instantiations (March and Smith 1995). Thus, maturity models may be understood as artifacts which serve to solve the problems of determining a company’s status quo of its capabilities and deriving measures for improvement therefrom. It can therefore be assumed that the development of maturity models falls within the application area of the guidelines developed by Hevner et al. (2004). The considerable attention that the paper by Hevner et al. (2004) has received speaks in favor of our choice of argumentative premise. Despite the absence of empirical research in this field, it is probably safe to speculate that future reviewers of publications on maturity models will, for want of well-established alternatives, increasingly rely on these assessment criteria (Zelewski 2007, pp. 111–114). Accordingly, the criteria established by Hevner et al. (2004) suggest themselves as a basis for the development of maturity models.

By adopting these criteria, we do not, however, wish to abet the confrontation of behavioral and design science introduced by Hevner et al. (2004), nor do we wish, at the exclusion of other paradigms, to assign the development of maturity models exclusively to design science. For our further line of argument, the reasonable applicability of the criteria to the development of maturity models will be sufficient. In particular, Zelewski’s criticism of some guidelines (Zelewski 2007, pp. 91–103) has in some cases motivated the slightly divergent interpretation of the criteria of Hevner et al. (2004) and their considered adaptation to the design of maturity models.

The aim of design science is to develop an innovative problem-solving artifact (“Guideline 1: Design as an Artifact”) that will contribute to current research (“Guideline 4: Research Contributions”). For the development of maturity models this means:

  • R1 (Comparison with existing maturity models): The need for the development of a new maturity model must be substantiated by a comparison with existing models. The new model may also just be an improvement of an already existing one (Zelewski 2007, pp. 93–98).

The process description of design science requires an iterative procedure for the development of a problem solution (Peffers et al. 2007). “Guideline 6: Design as a Search Process” also emphasizes that, by using the available means, solutions must be iteratively proposed, refined, evaluated, and, if necessary, enhanced. For the design of a maturity model this means:

  • R2 (Iterative Procedure): Maturity models must be developed iteratively, i. e., step by step.

  • R3 (Evaluation): All principles and premises for the development of a maturity model, as well as the usefulness, quality, and effectiveness of the artifact, must be evaluated iteratively (for the problem of delimiting the evaluation criteria, see Zelewski 2007, pp. 92–93).

The last requirement is also emphasized in “Guideline 3: Evaluation.” Hevner et al. (2004) point out that the evaluation of all (intermediary) results must be carried out with appropriate scientific methodology. Since different methods may be used in the development of the artifact and in the evaluation of the particular results, design science characteristically adopts a multi-methodological procedure (Hevner et al. 2004). As accentuated in “Guideline 5: Research Rigor,” the selected methods need to be rigorously attuned. For our purposes, this yields the following requirement:

  • R4 (Multi-methodological Procedure): The development of maturity models employs a variety of research methods, the use of which needs to be well-founded and finely attuned. (Zelewski (2007, p. 98) points out the difficulties of operationalizing research rigor and proposes an ontological approach (Frank 2004, p. 377) to address them.)

“Guideline 2: Problem Relevance” states that the problem-solving artifact must not only be innovative, but the problem to be solved must also be relevant for researchers and/or practitioners. This relevance can again be established through different scientific methods, e. g. by interviewing potential users of the maturity model in question. Establishing relevance also requires the exact definition of the problem, which in turn is a prerequisite for ensuing evaluations. Accordingly, the following requirements are considered:

  • R5 (Identification of Problem Relevance): The relevance of the problem solution proposed by the projected maturity model for researchers and/or practitioners must be demonstrated.

  • R6 (Problem Definition): The prospective application domain of the maturity model, as well as the conditions for its application and the intended benefits, must be determined prior to design.

Documentation of the research process is of vital importance for the scientific procedure. “Guideline 7: Communication of Research” emphasizes that the presentation of results must be targeted at the specific user groups. It must, however, be pointed out here that, unlike the practitioners (technicians and managers) Hevner et al. (2004) focus on, researchers make specific demands not only on the presentation of results, but also on the documentation of the research process. Correspondingly, the two following requirements need to be included in our catalogue:

  • R7 (Targeted Presentation of Results): The presentation of the maturity model must be targeted with regard to the conditions of its application and the needs of its users.

  • R8 (Scientific Documentation): The design process of the maturity model needs to be documented in detail, considering each step of the process, the parties involved, the applied methods, and the results.

In the following section, a comparison will show to what extent existing maturity models meet these requirements. On the basis of these results, a generically applicable model for the development of maturity models will be derived, enabling designers of maturity models to fulfill the requirements established above in the course of the design process.

3 Comparison of design processes of specific maturity models

An essential prerequisite for a comparison of the design processes of specific maturity models can be found in requirement R8 (Scientific Documentation). Only maturity models for which a detailed documentation is available can be effectively compared. In order to identify adequate objects for the comparison, 51 maturity models were culled from the internet and pertinent literature and then analyzed. Each of these models was checked for freely and publicly available information about its design process (as of July 2008). In order to be able to include models that offered only little information about the design process, we asked the model designers by e-mail to advise us of potential publications on the subject. This, however, led in only a few cases to an improvement of the available data. The maturity models have been described according to requirement R8 using three criteria, which may also work in combination (see Tab. 1). Our results show that the documentation quality of these models is generally fairly patchy (for more detail see Becker et al. 2009).

Tab. 1 Evaluation of maturity models on the basis of scientific documentation

For the appraisal of the remaining requirements, only those models that comply with requirement R8-III can be considered. On the basis of this appraisal, six maturity models were chosen and synoptically reviewed with regard to requirements R1 to R7 (see Tab. 2). Naturally, requirement R8 will not be reappraised here.

One of the selected models is the Analysis Capability Maturity Model (ACMM), which was developed for the US National Reconnaissance Office (NRO). It was designed to evaluate the processes of organizations that conduct state-commissioned studies (Covey and Hixon 2005). The second model is the maturity model of Rosemann et al. (2006), which investigates Business Process Management Maturity (BPMM). Its designers emphasize that their model must comply with scientific standards (de Bruin and Rosemann 2007; de Bruin et al. 2005). The third model included in our synopsis is Capability Maturity Model Integration (CMMI), which integrates several models that evolved from the context of the initially very popular Capability Maturity Model (CMM; CMMI Product Team 2006; Paulk et al. 1993). On the basis of CMM, Cook and Visconti started in 1992 to develop the Document Process Maturity Model (DPMM), which is also included in our study. This model focuses on documentation as an important supporting factor in software development (Cook and Visconti 2000). Another model included is the E-Learning Maturity Model (eMM), published by the Victoria University of Wellington and designed to help colleges and other institutions to assess their capabilities with regard to the sustained development, introduction, and use of e-learning and to compare their results with other institutions (Marshall 2007). Finally, the IS/ICT Management Capability Maturity Framework (IS/ICT CMF) represents a maturity model for IT management (Renken 2004).

In spite of their comparatively good documentation, the following models have been excluded from the synopsis: the Business Process Maturity Model (Lee et al. 2007), which is less well documented than the BPMM, a maturity model from the same domain; the Capability Maturity Model (CMM; Paulk et al. 1993), since it is the precursor of CMMI; and the Knowledge Management Capability Assessment (KMCA; Freeze and Kulkarni 2005; Kulkarni and Freeze 2004), since the main phases of its development process match those of the BPMM (de Bruin et al. 2005).

Tab. 2 Synopsis of Design Processes of Maturity Models

It is noteworthy that, for all six models, a screening of existing maturity models was conducted prior to development. Similarly, an iterative procedure can be observed in each case, in which particularly case-study evaluations of intermediate versions (ACMM, eMM) led to subsequent modifications of the model. Comprehensive literature research invariably formed the basis for the core elements of all maturity models, and was frequently complemented by the consultation of domain experts (DPMM, IS/ICT CMF). In individual cases, the problem relevance was ascertained by specific assignments, but in general it was based on more generic derivations. The problem definitions tend to foreground the evaluation and the comparison of businesses with regard to their capabilities in specific domains, notably in the domain of IT management. The manner in which the results are presented varies considerably and ranges from a single conference paper (IS/ICT CMF) to reports and procedure descriptions of several hundred pages (CMMI). Optional self-assessment questionnaires stand out favorably (DPMM, eMM).

4 Procedure model for the development of maturity models

In the following, we propose a procedure model that distinguishes eight phases in the development of maturity models (see Fig. 1). The elements of this model are informed by the requirements identified above and by corresponding procedures from the well-documented examples. In the graphic representation of the procedure model, which is based on a flow chart (see DIN 1966), notations referencing requirements R1–R7 have been attached to the individual procedure model elements. Requirement R8 has been incorporated by identifying the documents generated in the course of the maturity model design, referenced by the document symbol assigned to R8 in the caption. Moreover, the model generalizes the reviewed well-documented development processes and illustrates possible procedures for each phase by examples from the synopsis.

Fig. 1 Procedure model for developing maturity models

According to R6, the procedure model starts with the problem definition; all reviewed models begin this way. For this purpose, both the targeted domain (e. g. IT management as a whole (see IS/ICT CMF) vs. a partial discipline (see CMM, DPMM)) and the target group (e. g. intra-corporate vs. external) of the maturity model need to be determined.

At the same time, according to R5, the problem relevance, i.e. the actual demand for the maturity model, must be clearly demonstrated. The reviewed models, however, generally explain this demand only by pointing out the relevance of the targeted domain, except for ACMM and CMMI, which were initiated by state commissions.

Although varying in scope, a comparison of existing maturity models, as stipulated in R1, can be found in all reviewed examples. Shortcomings or a lack of transferability often motivate improvements of older models (ACMM, BPMM); occasionally, however, a renewed search for models that already address these problems is omitted. Apart from an active search for existing models motivated by the problem definition, the publication of a new maturity model by a third party may prompt a comparison (reactive initiation). In this case, examination of the new model may provide an incentive for modifications of one’s own maturity model.

A comprehensive comparison is a prerequisite for a reasoned determination of the design strategy, which, according to R8, needs to be documented as well. The most important basic strategies that can be discerned are: a completely new model design or the enhancement of an existing model (CMM); the combination of several models into a new one (CMMI); and the transfer of structures (DPMM, eMM) or contents (ACMM, IS/ICT CMF) from existing models to new application domains.

The central phase of the procedure model is the iterative maturity model development, which reflects requirement R2. The sub-steps of this phase (selecting the design level, selecting the approach, designing the model section, and testing the results) are iterated. The design level with the highest degree of abstraction provides the architecture, i.e. the fundamental structure of the maturity model. Besides the one-dimensional sequence of discrete steps (DPMM), a multi-dimensional maturity assessment is common (IS/ICT CMF). The different dimensions may be organized hierarchically (BPMM). After this basic structural design, the individual dimensions and their attributes must be devised to flesh out the model architecture.

According to R4, appropriate methods have to be chosen for each abstraction level. The use of literature analysis is widespread (eMM); it extracts assessment criteria for the maturity model from success factors and typical developments. Explorative research methods, such as the Delphi method (BPMM), and creativity techniques (e. g. the iterative consolidation of indicators in the IS/ICT CMF) are also suitable. In the next step, the selected model section needs to be designed in accordance with the chosen procedure. In order to comply with R3 (evaluation), the result must then be tested for comprehensiveness, consistency, and problem adequacy. The result of this evaluation determines how the design procedure continues.
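
A minimal, runnable sketch of this iteration loop is given below. The design levels, the chosen approach, and the test are simplified placeholders for the activities described above, not part of the procedure model itself.

```python
# Sketch of the iterative development phase; all concrete values are
# placeholders for the activities described in the text.
DESIGN_LEVELS = ["architecture", "dimensions", "attributes"]  # most abstract first

def design_section(level: str, approach: str) -> str:
    # Stand-in for the actual design work on one section of the model.
    return f"{level} drafted via {approach}"

def test_section(section: str) -> bool:
    # Stand-in for testing comprehensiveness, consistency, problem adequacy.
    return "drafted" in section

def develop_model() -> dict[str, str]:
    model: dict[str, str] = {}
    for level in DESIGN_LEVELS:               # select the design level
        approach = "literature analysis"      # select the approach (e.g. Delphi)
        while True:
            section = design_section(level, approach)
            if test_section(section):         # evaluation decides how to continue
                model[level] = section
                break                         # proceed to the next design level
    return model

print(develop_model())
```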

In the phase following the design of the maturity model, the conception of transfer and evaluation, the different forms of result transfer for the academic and user communities need to be determined. Requirement R4 prescribes a reasoned selection of the different forms that the targeted communication of the maturity model can take. Besides the widespread publication of document-based checklists (DPMM, eMM) and manuals (CMMI), software-supported access to the maturity model (e. g. via the internet) offers another alternative. Possibilities for evaluating the problem solution proposed by the maturity model should be incorporated into the transfer design. This guarantees users the possibility of feedback as early as the design stage of the media (e. g. questionnaires or forms for change requests in the manuals, comprehensive data collection via software tools). If the evaluation includes a differentiation between groups, the transfer concept must allow for the possibility to discriminate, e. g., between an experimental group and a control group. The reviewed design projects have made comparatively little use of these integration potentials. For the IS/ICT CMF, a quantitative analysis via a software-based assessment tool was announced, but could not be retrieved.

The purpose of the phase implementation of the transfer media is to make the maturity model accessible in the planned fashion to all previously defined user groups. At this stage, the most important point is to target the transfer media, as specified in requirement R7. In the reviewed projects, voluminous reports prevail (ACMM, CMMI). Self-assessment questionnaires are sometimes available (DPMM, eMM), but are often, for commercial reasons, not made generally accessible (e. g. when management consultants develop maturity models for their own business).

According to requirement R3, the evaluation should establish whether the maturity model provides the projected benefits and an improved solution for the defined problem. The defined goals are to be compared with real-life observations. For this purpose, the reviewed projects have carried out case studies in which a small, select group applies the new maturity model (BPMM). Alternatively, the model may be made freely accessible on the internet. This has the advantage that a large number of users, e. g. via web-based self-assessment, will generate many datasets, which can be compared with the expected distribution of maturity levels in business reality.
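
If such datasets were collected, the observed distribution of maturity levels could be compared with the expected one by a standard goodness-of-fit test. The following sketch assumes SciPy is available; all counts are invented for illustration.

```python
# Hypothetical comparison of observed maturity levels (e.g. from web-based
# self-assessments) with an expected distribution; all counts are invented.
from scipy.stats import chisquare

observed = [5, 18, 31, 24, 14, 8]   # companies appraised at levels 0..5
expected = [4, 20, 30, 25, 13, 8]   # e.g. distribution from an earlier study

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")  # low p suggests the distributions differ
```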

The outcome of the evaluation may cause a reiteration of the design process (R2). It is also possible that the maturity model may be retained unchanged, while the conception of transfer and evaluation may need to be modified. Lastly, negative results may lead to a rejection of the model, in which case the model should be purposefully, explicitly, and if possible, actively taken off the market.

Maturity models inherently become obsolete because of changing conditions, technological progress, or new scientific insights. If an unchanged maturity model is supposed to remain valid for its problem area, it needs to be validated regularly by appropriate evaluations. Modifications that become necessary over time can be accommodated in a new model version. Several existing models may become invalid if replaced by an integrated model, as was the case with CMMI. Pertinent decisions need to be communicated to potential users.

5 Application of procedure model

The procedure model is currently being used by the European Research Center for Information Systems (ERCIS) and Deloitte Consulting GmbH for the development of a maturity model designed to assess the application of IT performance measurement. The development of the IT Performance Measurement Maturity Model (ITPM3) is aimed at providing a tool for the structured enhancement of Business Intelligence (BI) applications for the control of IT-formative and IT-supported processes in a company.

5.1 Problem definition

Using ratios and ratio systems, IT Performance Measurement aims at providing a comprehensive representation of a company’s IT. In alignment with the company’s overall strategy, it tracks and monitors the implementation of the IT strategy, the realization of projects, the use of resources, process performance, and IT performance (IT Governance Institute 2007). Besides the traditionally strong focus on costs, particular attention is also given to the performance and value of IT (Bendl et al. 2004; Eul et al. 2006).

BI systems can collect internal cost and performance data as well as external market data from a variety of sources, process these data into meaningful information about IT performance, and support IT management by supplying this information for executive tasks (Chamoni and Gluchowski 2004). So far, in comparison with other business areas, IT management has been supported very little by BI systems (Chamoni and Gluchowski 2004, p. 125). The ITPM3 is thus meant to serve as an instrument that calls attention to this possible neglect of BI for the purposes of IT management and, in the course of a positioning, shows ways to improve the status quo. The relevance of the proposed model was confirmed in interviews conducted in the first half of 2007 with ten IT management representatives of German companies. All interviewed companies strive to improve their IT performance measurement. The identification of present shortcomings and future fields of action by means of a maturity model was broadly welcomed.

5.2 Comparison of existing maturity models

Maturity models that explicitly address BI-supported IT performance measurement could not be identified. However, several maturity models were found that treat IT management and BI as separate fields.

Among the maturity models that refer to IT management in a broader sense are, e. g., the Capability Maturity Model (CMM) and its further development, Capability Maturity Model Integration (CMMI Product Team 2006; Paulk et al. 1993); the maturity models designed for the COBIT processes (IT Governance Institute 2007); the IT Balanced Scorecard Maturity Model (Van Grembergen and Saull 2001); and the IS/ICT Management Capability Maturity Framework (Renken 2004).

These maturity models generally disregard computer-based support of IT management that, for instance, BI systems would afford. There are, however, separate maturity models for the latter, e. g. the two Business Intelligence Maturity Models by Chamoni and Gluchowski (2004) and by Eckerson (2006, pp. 89–95), as well as the Maturity Model for Performance Measurement Systems by Wettstein and Küng (2002).

5.3 Determination of development strategy

The comparison of existing maturity models with the problem definition suggests a design strategy that combines several models into a new one. We decided to use the identified maturity models as a starting point for the design process, since they already covered aspects of BI supported IT performance measurement. On the one hand, their graded structure and their differentiation of individual maturity levels by key areas provided basic solutions for the structuring of the maturity model. On the other hand, it was expected that parts of the contents (e. g., descriptions of individual maturity degrees) could be applied, or at least expediently transferred, to our problem area.

5.4 Iterative maturity model development

The maturity model design underwent five iterations. In the first iteration, a preliminary architecture was drafted, which allowed for four development stages: fragmented IT reporting, consolidated spreadsheet-based IT reporting, IT performance dashboard, and information portal for IT management. Following the biMM by Chamoni and Gluchowski (2004), these stages were described along the dimensions of contents, organization, and technology. Based on extensive literature research, the dimensions and their attributes were defined during the second iteration. At this stage, nine semi-structured interviews with IT managers from German companies (duration: ca. 45 to 60 min.) confirmed the plausibility of our model, whereas the strong orientation towards the technical aspects of IT performance measurement was criticized. To enable a more differentiated mapping of weakly developed IT performance measurement solutions, a refinement of the gradation was also deemed necessary. Consequently, the architecture of the model was adjusted in the third iteration and, following the maturity models in COBIT (IT Governance Institute 2007), six degrees of maturity, ranging from non-existent (0) to optimized (5), were introduced. In order to diminish the technical orientation, a process-oriented approach based on Grothe and Gentsch (2000) was adopted.

In the fourth iteration, the modified architecture led to a second, completely revised version of the maturity model. In a group discussion with IT consultants, the model received largely positive feedback; however, greater precision was desired with regard to the criteria of the dimension contents. Further adjustment of the architecture was therefore not necessary in the fifth iteration; only final modifications of the criteria for the dimension contents were carried out for the current version of the ITPM3.

Fig. 2 The ITPM3 and influences of existing maturity models

The model, which has been released for externally targeted transfer and subsequent evaluation, describes the analytical BI process that transforms fragmented internal and external data into action-oriented information about the efficiency and effectiveness of the company’s information infrastructure, on a scale ranging from (0) non-existent to (5) optimized (see, also for the following, Fig. 2). The evolution path outlined here starts from a complete lack of IT performance transparency and culminates in a networked IT performance measurement process supported by companywide integrated BI tools. To facilitate a differentiated analysis of IT performance measurement, the maturity model provides three dimensions that are characterized by five criteria. The dimension contents measures the substantive relevance of the applied IT performance measurement solution for IT management. The dimension organization looks both at the integration of the solution into the organization and operations of the IT department and at its integration into companywide concepts. The dimension technology examines the components and architectures that are employed. The criteria address more specific questions within each dimension. Apart from the gradation into maturity levels and the dimensions, single criteria, such as cost-benefit analysis, have been adapted from other maturity models. Further criteria are based on the maturity attributes of the generic COBIT maturity model (such as policies, standards, and procedures). The maturity model for the COBIT process ME1 (IT Governance Institute 2007) and the IT-BSC Maturity Model by Van Grembergen and Saull (2001) also had a major influence on the principal characterization of the individual stages.
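
To make this structure concrete, the sketch below encodes the levels and dimensions named above as plain data. The names of degrees 1 to 4 follow the generic COBIT scale that the text refers to, since only the endpoints are named here, and the criteria listed per dimension are illustrative placeholders, not the actual ITPM3 criteria.

```python
# Degrees 0..5; the names of degrees 1-4 follow the generic COBIT scale,
# since the text only names the endpoints (non-existent, optimized).
LEVELS = ["non-existent", "initial", "repeatable",
          "defined", "managed", "optimized"]

# Three dimensions of the ITPM3; the criteria shown are placeholders that
# paraphrase the dimension descriptions, not the model's actual criteria.
DIMENSIONS = {
    "contents":     ["relevance of the measured contents for IT management"],
    "organization": ["integration into the IT department's operations",
                     "integration into companywide concepts"],
    "technology":   ["components and architectures employed"],
}
```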

5.5 Conception of transfer and evaluation

Besides academic publications, the current conception of transfer and evaluation includes the development of a web page enabling companies to calculate their degree of maturity. This is meant to expand the empirical basis for the evaluation of the model, which so far consists of data from the expert interviews. When calculating their level of maturity, companies specify their values for the maturity model criteria. These specifications of the BI support for their IT performance measurement should allow a statistical survey of the distribution of maturity degrees across companies (see, e. g., the studies in Chamoni and Gluchowski (2004) and Philippi et al. (2006)). Additionally, the web page should provide a survey of the model’s acceptance, which may indicate a need for further development.
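
A sketch of such a calculation is given below. Since the paper does not specify how the criterion values are aggregated into a maturity degree, the averaging scheme shown here is a hypothetical example, as are the input values.

```python
# Hypothetical aggregation for the web-based self-assessment: criterion
# ratings (0..5) are averaged per dimension; the paper does not prescribe
# this calculation.
def maturity_degrees(ratings: dict[str, list[int]]) -> dict[str, float]:
    """Average the 0..5 criterion ratings for each dimension."""
    return {dim: sum(vals) / len(vals) for dim, vals in ratings.items()}

answers = {                        # example input from one company
    "contents":     [3, 2, 4, 3, 3],
    "organization": [2, 2, 3, 1, 2],
    "technology":   [4, 3, 3, 4, 4],
}
print(maturity_degrees(answers))   # per-dimension maturity degrees
```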

6 Outlook

Maturity models are of major importance for IT management. Given the great number of maturity models designed in the last few years, however, there is a danger of increasing arbitrariness in the development of these models. The increase in deficient documentation may serve as an indicator of this trend. By applying the guidelines that Hevner et al. (2004) have proposed for design science to the design of maturity models, eight requirements were postulated for the design process and an appropriate procedure model was developed. This provides a sound framework for the methodologically well-founded development and evaluation of maturity models. Such a reasoned approach is particularly necessary if maturity models are not to be reduced to the status of a mere marketing tool for business consultants. The main purpose of the procedure model proposed here is to raise awareness of the need for a methodologically well-founded maturity model design.

The procedure model can be used as a guideline by projects designing maturity models. An important task for future research would be to conduct empirical studies on hypotheses about the effects of this application. One such hypothesis would be that using the procedure model for the design of maturity models leads to better documentation and more useful results than an intuitive procedure without recourse to a reference manual.

Our research currently focuses on the development of the IT Performance Measurement Maturity Model (ITPM3). At the time of writing, this project is in the phase of conceiving the transfer and evaluation of the maturity model.

The procedure model was developed on the basis of a specific criteria catalog. It is of course possible that, by expanding the criteria catalog or choosing a different argumentative premise, the procedure model may be improved with regard to certain success criteria, such as documentation quality, scientific standards, development time, economic efficiency, or utility of the product. This opens up potential improvements to the procedure model as presented here, e. g. in the form of a detailed specification and operationalization of particular requirements and their further epistemological substantiation, which could not be achieved within this article (for more detail see Zelewski 2007).