
Evaluation Methods for e-Strategic Transformation

Government e-Strategic Planning and Management

Part of the book series: Public Administration and Information Technology ((PAIT,volume 3))

Abstract

Governments define strategies in order to translate their political missions into reality, address various challenges, schedule public spending, and control public investments. This occurs even at supranational levels, where governments negotiate and agree on common objectives that must reconcile competing political interests. Government strategic planning is crucial and defines the necessary strategic elements, such as vision, mission, success factors, and stakeholders. Moreover, strategic planning is a continuous process, since strategic updates follow previous plans and recognize existing successes and failures. Strategies are implemented via framework programs, which allocate funding across alternative priorities and identify methods for strategic assessment and monitoring. This chapter focuses on government e-strategies and aims to illustrate whether program evaluation can support the assessment of strategic transformation. In this context, existing program evaluation methods are compared according to their applicability to the evaluation of strategic transformation. The comparison is applied to data from Greece, where three government e-strategies have evolved since the late 1990s and respective framework programs have been implemented.



Author information

Correspondence to Leonidas Anthopoulos.


Annex I: Program Assessment Grid, Used by the European Committee

Quality of the evaluation report

(1) Meeting needs: The evaluation report adequately addresses the requests for information formulated by the commissioners and corresponds to the terms of reference

(3) Relevant scope: The rationale of the program, its outputs, results, impacts, interactions with other policies, and unexpected effects have been carefully studied

(5) Open process: The interested parties—both the partners of the program and the other stakeholders—have been involved in the design of the evaluation and in the discussion of the results in order to take into account their different points of view

(7) Defensible design: The design of the evaluation was appropriate and adequate for obtaining the results (within their limits of validity) needed to answer the main evaluative questions

(9) Reliable data: The primary and secondary data collected or selected are suitable and reliable in terms of the expected use

(11) Sound analysis: Quantitative and qualitative data were analyzed in accordance with established conventions, and in ways appropriate to answer the evaluation questions correctly

(13) Credible results: The results are logical and justified by the analysis of data and by suitable interpretations and hypotheses

(15) Impartial conclusions: The conclusions are justified and unbiased

(17) Clear report: The report describes the context and goal, as well as the organization and results of the program, in such a way that the information provided is easily understood

(18) Useful recommendations: The report provides recommendations that are useful to stakeholders and are detailed enough to be implemented

Quality of the evaluation process

(2) Coherent objectives and program: The program objectives were coherent and the program was able to be evaluated

(4) Adequate terms of reference: The terms of reference were well drawn up and proved useful and did not need to be revised

(6) Tender selection: This was well conducted and the chosen tenderer was able to undertake the evaluation to a good standard

(8) Effective dialog and feedback: An inclusive forum and process was created that provided feedback and dialog opportunities with commissioners and managers that improved the quality of the evaluation

(10) Adequate information: Required monitoring and data systems existed and were made available/accessed by administrations and partners

(12) Good management: The evaluation team was well managed and supported by program managers

(14) Effective dissemination to commissioners: The reports/outputs of the evaluation were disseminated to commissioners, including steering committee members and program managers, who responded appropriately with timely feedback/comments

(16) Effective dissemination to stakeholders: The reports/outputs of the evaluation were suitably disseminated to all stakeholders and, where necessary, targeted in ways that supported learning lessons


Copyright information

© 2014 Springer Science+Business Media New York

Cite this chapter

Anthopoulos, L., Blanas, N. (2014). Evaluation Methods for e-Strategic Transformation. In: Anthopoulos, L., Reddick, C. (eds) Government e-Strategic Planning and Management. Public Administration and Information Technology, vol 3. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8462-2_1
