Official Statistics—Public Informational Infrastructure

  • Walter J. Radermacher


This chapter is about the ‘making of’ official statistics. It presents the processes, structures and actors that are crucial for high quality. Official statistics are understood as an industry that produces information; consequently, the terms and concepts of modern management are used throughout. The first section starts with the business model of statistics and its dimensions: the processes (‘how’), the products (‘what’) and the producers (‘who’). It then deals with important overarching topics, such as quality management, national and international statistics and statistical confidentiality. With a look at the recent modernisation of the business model, the current status of Statistics 3.0 is summarised.


2.1 The Business Model of Official Statistics

2.1.1 Core Aspects

Many of the essential definitions and foundations of official statistics can be found in the statutory provisions of Regulation 223 on European statistics.1 These represent an agreement of the partners cooperating in the European Statistical System (the EU plus Switzerland and the EEA/EFTA countries, currently a total of 32 states), but they are also applied in other European countries (e.g. candidates for EU accession).2

The regulation

establishes a legal framework for the development, production and dissemination of European statistics. (Art 1)

The development, production and dissemination of European statistics shall be governed by the following statistical principles (Art 2):
  • professional independence

  • impartiality

  • objectivity

  • reliability

  • statistical confidentiality

  • cost effectiveness

The statistical principles set out in this paragraph are further elaborated in the European Statistics Code of Practice. The development, production and dissemination of European statistics shall take into account international recommendations and best practice.

The following definitions shall apply (Art 3):
  • ‘statistics’ means quantitative and qualitative, aggregated and representative information characterising a collective phenomenon in a considered population;

  • ‘development’ means the activities aiming at setting up, strengthening and improving the statistical methods, standards and procedures used for the production and dissemination of statistics as well as at designing new statistics and indicators;

  • ‘production’ means all the activities related to the collection, storage, processing, and analysis necessary for compiling statistics;

  • ‘dissemination’ means the activity of making statistics and statistical analysis accessible to users;

  • ‘data collection’ means surveys and all other methods of deriving information from different sources, including administrative sources;

  • ‘statistical unit’ means the basic observation unit, namely a natural person, a household, an economic operator and other undertakings, referred to by the data;

  • ‘confidential data’ means data which allow statistical units to be identified, either directly or indirectly, thereby disclosing individual information. To determine whether a statistical unit is identifiable, account shall be taken of all relevant means that might reasonably be used by a third party to identify the statistical unit. (European Union 2015)

The following sections build on these foundations; they are interpreted and further elaborated.

2.1.2 Knowledge Generation

A simplified circular process chart describing the interaction between users and producers of information should help us to understand the main phases in the production and the use of statistical information:

The key-processes within the production sphere in Fig. 2.1 are3
Fig. 2.1

Knowledge generation and statistical production [See earlier versions in Blanc et al. (2001) and Radermacher et al. (2004)]

D: development and design
  • Input: information requests and needs for statistical information expressed in qualitative form (language)

  • Output: a work system that contains the necessary statistical specifications (variables, methodology, standards, sampling design, etc.) and concrete prescriptions with regard to the entire work programme and individual production lines.

P: production
  • Input: specifications of the work system

  • Output: statistical data and metadata.

C: communication/dissemination
  • Input: statistical data and metadata

  • Output: statistical information.

In addition, it is essential to include explicitly the following process on the user side.

U: creation of knowledge and application
  • Input: statistical information

  • Output: quantitative response to the qualitative information request.

The ultimate goal of statistical evidence is to contribute to better-informed decisions of all kinds and for all types of users, which can only be achieved when all four processes are considered and integrated in a comprehensive conceptual approach. Each of them should contribute to excellent information quality. Each of them can of course also fail and contribute to errors, misunderstandings and underperformance.
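The four-stage chain D → P → C → U can be sketched as a minimal pipeline of composable steps. All names and values below are hypothetical illustrations of the structure, not an actual production system:

```python
# Minimal sketch of the D-P-C-U chain as composable steps.
# All names and values are invented for illustration only.

def development(information_request: str) -> dict:
    """D: turn a qualitative request into a work system (specifications)."""
    return {"variables": ["age", "employment_status"],
            "sampling_design": "stratified",
            "request": information_request}

def production(work_system: dict) -> dict:
    """P: implement the agreed methods, yielding data and metadata."""
    return {"data": [42.1, 43.0], "metadata": work_system}

def communication(output: dict) -> str:
    """C: prepare the results for dissemination as statistical information."""
    return f"Unemployment rate series: {output['data']}"

def use(information: str) -> str:
    """U: the user turns information into a quantitative answer."""
    return f"Answer to the original request: {information}"

answer = use(communication(production(development("How high is unemployment?"))))
```

The point of the sketch is the dependency structure: each phase consumes exactly the output of the previous one, so a failure in any phase propagates to the final answer.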

The process D has an external part (dialogue with users) and an internal part (development and testing of methods). Intensive cooperation with users is crucial for the adequacy of the entire process chain that follows.

During the production process P, the methods agreed in the preceding development phase are implemented. It is relatively straightforward to measure the quality of this process and its sub-processes against these predefined norms.

Communication processes C represent the other end of the user interaction. They can also be grouped into an internal part (preparing the results from the production process for different channels, access points, etc.) and an external part (interaction with users in all formats and through all channels). The internal part also belongs to the set of predefined methods and is in that way similar and closely linked to production.

The processes of application and use U are not under any kind of control or influence by statisticians. It is, however, obvious that users might not be sufficiently prepared or trained to interpret and use statistics in the best possible manner. Statistical literacy is therefore an area of interest also for statistical producers. Furthermore, statisticians should carefully observe cases of wrong interpretation and they must protect their information against misuse.

2.1.3 The Process Model, Business Architecture

The flow-model of knowledge generation and statistical production process (Sect. 2.1.1) can be further used and elaborated for the creation of a generic process model of official statistics, using the format of an input-output flowchart (Fig. 2.2).
Fig. 2.2

Main processes in official statistics

At the centre are the individual production processes of specific statistics, starting with a survey whose results are condensed in data processing into information that is analysed and published, thus finalising the process. Close to these core processes are the support processes and corresponding internal services (publication, IT, etc.). The historically grown one-to-one relationship between the producers and users of the individual statistical areas resulted in a highly branched organisation of these individual processes in isolated ‘silos’.4 Thus, the agricultural statistics unit produced as closely as possible what was desired by the Ministry of Agriculture; similarly in economics, health, energy, etc. In total, this resulted in more than 200 parallel processes: a veritable spaghetti bowl.

In this logic, statistics were tailor-made and crafted for the needs of a particular customer (or customer group). For each of these areas, therefore, more or less the entire procedure schematised in the GSBPM (UNECE 2013) was completed separately and without feedback from similar areas. In such an understanding of the manner in which statistics are produced, there are primarily individual production lines which are only weakly connected to each other. Such strands can therefore be organised, opened, closed and financed without any major impact on other areas.

Information technology has dramatically improved the possibilities of official statistics over the past four decades. However, these new possibilities have ultimately contributed to the fact that the already fragmented organisation disintegrated even more into heterogeneous and inefficient parts. While mainframe information technology was very centralised in the 1970s and 1980s, the introduction of personal computers also resulted in a wave of decentralisation in the 1990s and 2000s.

Not least because of reduced budgets and resources, this form of official statistics was no longer sustainable, at least since the beginning of the 2000s. The isolated process organisations lacked efficiency and consistency. Parallel and non-coordinated areas of production have therefore been targeted by reforms and modernisation projects (Eurostat 2009). Generally, this modernisation aims at substituting the stovepiped way5 of working with a new form, i.e. a new business model, which can be summarised as a ‘multiple source, mixed mode design’ on the data-input side, a ‘multipurpose design’ on the information-output side, and a modularisation of exchangeable process elements6 within a standardised business architecture at the centre of the statistical factory. This will be discussed in more detail in Sect. 2.6.

In this context, statistical offices considered ways of making production more uniform so as to be more efficient and effective. The result of these considerations led to a kind of industrialisation of the processes with the typical components, i.e. standardisation (of methods, IT applications, etc.), centralisation of common components (IT, auxiliary services, etc.) and, last but not least, the introduction of an overarching business architecture as an ordering system. The flowchart in Fig. 2.3 explains this architecture in a very simplified and graphical manner.
Fig. 2.3

Business architecture of official statistics (Radermacher 2011)

Such a business model of the ‘factory’ is still relatively new and does not necessarily meet with the approval and sympathy of those who work in this institution. Centralisation and standardisation are perceived as a loss of self-responsibility, and the replacement of a craft by an industrial model is sensitive to the professional self-understanding of statisticians. Nevertheless, there is no way around this approach. Too powerful and urgent are the constraints of the general political situation and the dynamics of developments in information and communication technology.

2.1.4 Modes of Data Collection

It is one of the myths about official statistics that it is exclusively based on self-collected data. While this is true for large areas of economic and social statistics, in other areas, such as demography, health care statistics or education statistics, existing data sources are evaluated instead. This is nothing new; rather, it was the standard approach in the early days, before high-quality sample surveys became methodologically feasible. Primary data collection under the control and responsibility of the statistical office makes up no more than half of the processes, while the others evaluate existing (secondary) data sources, including registers. Even if this picture refers to the situation in Germany in 2008 (see Fig. 2.4), it is representative of the fact that official statistics is far from being merely a data-collection engine.
Fig. 2.4

Primary and secondary modes of data collection—Germany 2008 (Radermacher 2008)

However, the data from different sources were generally not merged and used together to generate information. Rather, in the case of parallel statistics with different origins, it was left to the users to make the right choice for themselves. Merging data from different sources requires rules (and algorithms) that make the synthesis transparent rather than arbitrary; an arbitrary blending would violate basic principles. For a long time, it was not considered the job of official statistics to do this ‘blending’. Instead, restraint was maintained and unprocessed results were provided. Last but not least, producers and users at that time shared the conviction that survey data were superior to administrative data sources. Only in National Accounts was it considered inevitable and opportune to distil the best possible information from multiple data sources and to close data gaps with estimates in order to arrive at a complete and consistent picture.

This has changed.

The primacy of survey data over existing data sources was unsustainable for many reasons. In the end, it was a mixture of the increasing availability of data in (administrative as well as statistical) registers, the potential of new IT (online transfer of data), cost pressures and the dissatisfaction of respondents with statistical burdens that reversed the prioritisation into its opposite. According to the modern prioritisation, it is appropriate and legitimate to collect data if and only if these data cannot already be obtained from existing sources of satisfactory quality. This opens the door to a completely different business model, with fundamental changes in the tasks of a statistical office, new components in the methodological toolbox (e.g. record linkage), adaptations of statistical governance, e.g. the creation of legal-administrative conditions (access to some sensitive administrative data7), as well as changes in quality management and in communication with users.
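Record linkage, mentioned above as a new component of the methodological toolbox, can be illustrated with a deliberately simplified deterministic match on a common unit key. All field names, keys and values are invented for illustration; real linkage typically involves probabilistic matching and extensive data cleaning:

```python
# Deliberately simplified deterministic record linkage: survey records
# are enriched with register variables via a common unit identifier.
# Field names, keys and values are invented for illustration.

survey = [
    {"unit_id": "A1", "employment_status": "employed"},
    {"unit_id": "B2", "employment_status": "unemployed"},
]
register = {
    "A1": {"year_of_birth": 1980},
    "B2": {"year_of_birth": 1955},
    "C3": {"year_of_birth": 1999},  # in the register but not in the sample
}

# Keep only survey units found in the register and merge the variables.
linked = [
    {**record, **register[record["unit_id"]]}
    for record in survey
    if record["unit_id"] in register
]
```

After linkage, each matched unit carries both the survey variable (employment status) and the register variable (year of birth), which is precisely the ‘melange’ of sources discussed in the following paragraph.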

One might have the impression that the melange of survey and administrative data is nothing but the replacement of an item in a questionnaire by a similar piece of data from a register. This impression has been shown to be much too simplistic and unrealistic.8 Instead, the entire design of a statistical process has to be reviewed and (quite often) revised. It is a long way from the classical design, in which a traditional (‘knock on the door’) census every ten years alternated with yearly interpolations of the population from administrative registers, to a modern (fully integrated) design, in which a regular matching of administrative and (sample) survey data captures the high dynamics of population change on a continuous (yearly) basis. The new design delivers a completely different mix of quality features: improvements in timeliness and coherence at the expense of accuracy in the traditional census years. It is hardly possible to overestimate the difficulties of change management in the transition from the traditional to the new design. Not all users are winners in such a change, and not all producers welcome the changed production and its products.

In particular, the Nordic countries have re-engineered their statistical systems by shifting them entirely to the prioritised use of registers (see Fig. 2.5).
Fig. 2.5

Year of establishing registers in population censuses [From Register-based statistics in the Nordic countries, by UNECE Statistical Division, © 2007 United Nations. Reprinted with the permission of the United Nations. UNECE (2007, p. 5)]

This Nordic way cannot be followed in the same manner by every country. The legal conditions of access to individual data, the corresponding administrative practices, and the political as well as cultural conditions (presence of high-quality registers, trust of citizens in government institutions, etc.) are too different. Nonetheless, the fundamental approach is widely used in the reality of official statistics nowadays. The trend towards population censuses that are created entirely or partially from register data illustrates this statement.9

In this respect, the consideration and integration of ‘Big Data’ is nothing fundamentally different; the paradigm shift has already taken place. However, the task of statistics is further complicated because the possibility of influencing the nature and structure of these external data continues to diminish (more precisely: no longer exists), while at the same time the general pressure and the expectation that they must be used have increased immensely.

2.1.5 The Portfolio of Products (and Services)

Statistical Products (and Services)

As a starting point for the consideration of statistical products, the definition in EU regulation 223 is used again: ‘‘statistics’ means quantitative and qualitative, aggregated and representative information characterising a collective phenomenon in a considered population’ (European Union 2015: Art. 3).

For an understanding of the functioning and internal organisation of official statistics, it is necessary to arrange different levels and types of statistical information according to their degree of aggregation and their quality profile.

At European level,10 the following types of statistical products are distinguished:
  • Data: information collected by statistical authorities via traditional statistical activities (sample surveys, censuses, etc.), or data from other sources re-used for statistical purposes. This information is tailored to serve needs in specific policy areas, e.g. the labour market, migration or agriculture. The term also includes data collected for administrative purposes but used by statistical authorities for statistical purposes (usually referred to as data from administrative sources).

  • Accounting systems: coherent and integrated accounts, balance sheets and tables based on a set of internationally agreed rules. An accounting framework ensures a high profile of consistency and comparability; statistical data can be compiled and presented in a format that is designed for the purposes of analysis and policy-making.

  • Indicators: an indicator is a summary measure related to a key issue or phenomenon and derived from a series of observed facts. Indicators can be used to reveal relative positions or show positive or negative change. Indicators are usually a direct input into EU and global policies. In strategic policy fields they are important for setting targets and monitoring their achievement.
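The indicator idea, a summary measure derived from a series of observed facts that reveals change relative to a reference point, can be sketched with an index relative to a base year. The figures are invented for illustration:

```python
# Sketch: deriving a simple change indicator from a series of observed
# facts (an index relative to a base year). All figures are invented.

observations = {2015: 102.4, 2016: 104.0, 2017: 106.5}  # e.g. output, bn EUR
base_year = 2015

# Index: each year's value expressed as a percentage of the base year.
index = {year: round(100 * value / observations[base_year], 1)
         for year, value in observations.items()}
# index[base_year] is 100.0 by construction; values above 100 show growth.
```

The summary measure deliberately discards detail: the index says whether and how strongly the phenomenon changed, which is exactly what target-setting and monitoring require.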

This view can be condensed to an information pyramid (Fig. 2.6).
Fig. 2.6

Information pyramid of official statistics.

Source European statistical programme 2013–2017 (European Union 2011, p. 20)

Primarily, this presentation relies on a distinction of different aggregation levels, i.e. a level with many details (i.e. micro) for the basic statistics and a level with more abstract aggregates and models (i.e. macro) for accounts and indicators.

Furthermore, basic statistics and accounts are characterised as multipurpose,11 while indicators are closely tied to a specific use and determination.
  • ‘Multipurpose’ makes clear that such statistical information has the character of an infrastructure designed for wide and diverse use. Basically, this makes their design quite difficult because the different users and user groups have quite different ideas and priorities regarding what they need as information. What the quality label ‘fitness for purpose’ means in such statistics is therefore anything but trivial. How this problem is addressed is explained in Sect. 2.3.

  • In contrast, indicators are closely tied to a specific question and task. In particular, for European policies, it is typical that they provide and promote decision-making and governance based on indicators. This has the great advantage that the information requirements are usually very well known. Whether the statistics are ‘fit for purpose’ can therefore be assessed quite precisely. On the other hand, this closeness to political decisions (sometimes linked to immediate sanctions or other consequences) also carries considerable risks, which are discussed in Chap.  3.

Although this is not a strict and non-overlapping order, the approach helps to better describe the products, the processes and different quality profiles so that they can be better planned, managed and communicated.

The business model of modern statistics includes not only the products but also statistical services. This includes, above all, special services for individual users. For example, a statistical office may provide tailored analysis of its data of interest to individual users (e.g. businesses or media) or user groups (industry associations or NGOs). For research and teaching, elaborate work is carried out to allow access to micro-data without jeopardising the confidentiality of individual data.

The Product Portfolio

As one would expect from an industry, the products are grouped and managed together in one complete programme, one ‘portfolio’. With the help of such a portfolio, internal planning and decisions (priorities) are made possible and a controlling (costs, quality) can be built up. It is very important for communication with users to offer the portfolio in such a form that they can get a good overview of the available information that enables them to make their own choices.

For the sake of clarity, reference is made below to European statistics as an example. In the multiannual planning of the statistical programme, this systematic approach was used to structure the portfolio of products.

The entire portfolio of products is listed according to this logic in a ‘catalogue’ that is used for internal purposes (planning, costing and management) as well as for the structuring of the database and website and for external communication purposes. Although the result in the form of such a catalogue seems logical, even trivial, it took a remarkably long time to agree on such a standardised structure and presentation. Given the very different cultures within the professional communities in official statistics, well-trodden practices and ways of production, and similar forms of resistance had to be overcome before this standard could be introduced. However, this is not unusual; rather, it is the fate of any form of standardisation.

At this point, it is not necessary or useful to present and review this catalogue in detail. Nevertheless, it is interesting to know what information such a catalogue contains about statistical products. As can be seen from the excerpt from the European statistics catalogue in Fig. 2.7, the products are given standardised names; they are coupled with the relevant production process, the temporal and regional resolutions are given, and the main users and the legal basis are mentioned.
Fig. 2.7

Eurostat catalogue of products.

Source Eurostat (2017b) (extract)
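The kind of entry such a catalogue holds can be sketched as a simple record carrying the fields described above: a standardised product name, the linked production process, temporal and regional resolution, main users and legal basis. All field values are invented; the real Eurostat catalogue differs in structure and detail:

```python
# Sketch of a product-catalogue entry with the kinds of fields described
# in the text. All values are invented for illustration.

catalogue_entry = {
    "product_name": "Labour Force Survey — main indicators",
    "production_process": "LFS quarterly processing",   # hypothetical name
    "temporal_resolution": "quarterly",
    "regional_resolution": "NUTS 2",
    "main_users": ["European Commission", "national ministries"],
    "legal_basis": "Regulation (illustrative reference)",
}

def overview(entry: dict) -> str:
    """Render the one-line overview a user browsing the portfolio would see."""
    return (f"{entry['product_name']} "
            f"({entry['temporal_resolution']}, {entry['regional_resolution']})")
```

A shared record structure of this kind is what makes the catalogue usable both for internal controlling (costs per process, priorities) and for external navigation of the portfolio.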

In the course of modernising official statistics (especially in Europe), it is becoming increasingly important to define modules and services that can be exchanged and shared within an agreed and standardised business architecture. A common product catalogue created, shared and applied by all producers in the statistical system of Europe, therefore, represents a decisive step forward on the path to efficient value chains and close cooperation between the partner institutions. In addition to the product catalogue, a service catalogue will increasingly play a major role.12

2.2 Skills and Human Resources

The decisive factor for the quality of statistics is the staff of the statistical institution. First and foremost, of course, this means that the statistical institution must have a sufficient number of sufficiently qualified professionals. In recent decades, there have been major changes in the amount and structure of staff. While the absolute number of employees has tended to decrease, the proportion of academically trained and qualified employees has increased (Figs. 2.8, 2.9 and 2.10).
Fig. 2.8

Staff in statistical offices—example: Statistics Netherlands (For example, the annual report 2015 of the CBS of the Netherlands expresses this in the following way: “The increasing complexity and further automation of statistical processes are contributing to the decline in the amount of semi-skilled and unskilled work and the increasing need for more highly educated staff. This is reflected in the composition of the workforce by job grade. In 2015, 77% of staff were in scale 9 or above. By comparison, the proportion in 1995 was only 50%.” (CBS 2016b, p. 20). Other (unpublished) figures, provided by CBS)

Fig. 2.9

Staff in statistical offices—example: Statistics Denmark (unpublished figures, provided by Statistics Denmark)

Fig. 2.10

Staff in statistical offices—example: Statistics Denmark (unpublished figures, provided by Statistics Denmark)

In this respect, the same development has taken place in official statistics as in other industries, where quantity has been substituted by the quality of the employees. This is the move from Official Statistics 2.0 to 3.0: from a manual to an industrial production of statistical products based on an all-embracing use of information technology.

Another very important consideration is the professional composition of the staff. In the previous sections, the variety of products and processes was explained. Ideally, experts and their knowledge would be available for all products and processes. Of course, that is not possible: the diversity is too large and the supply on the job market too limited. Above all, there is a lack of specialised training programmes from which graduates could be recruited for the tasks of official statistics. In addition, recent years have shown that the dynamics of change are so great that vocational training and internal on-the-job training are in any case the more important qualification methods.

Which qualifications and professional orientations are actually needed? Which competences should be available among the team in the statistical office?

First, in the classic field of statistical production (including the development of methods, products and processes), one might assume that primarily statisticians familiar with survey techniques come into play. This is true, but only in the areas in which data are originally collected through surveys, e.g. in a wide range of social statistics or business statistics.

With the increasing importance of administrative data and registers as a source of official statistics, the profile of requirements has already changed in recent years. Of course, if work processes no longer start with collecting data, and if instead existing data needs to be analysed, filtered and aggregated to meaningful information, the job profile will change accordingly. With the omnipresence of Big Data the working conditions will undergo even further dramatic change, which again requires new personal skills, knowledge and experience.

This does not mean that training in survey methodology is or will become irrelevant. Rather, paths must be taken in which these competencies are embedded in methods of data sciences and the management of complex production processes.

Second, expertise is required in the area of accounts, be it macro-economic, social (health, education …) or environmental accounts. A solid education in economics with an empirical focus on National Accounts is absolutely necessary here.

Third, and this is still a relatively recent domain, knowledge in the field of indicator methodology is needed. Needless to say, this requires a mix of statistical methods, communication skills and sensitivity for the policy dimension of the specific indicator (being aware of the respective opportunities and threats related to the closeness to policy-making).

Fourth, statistical office staff are expected to have sufficient knowledge of the area of application for which they are responsible. This can be a more specific and narrow area, such as agriculture, energy or health, or a wider one, such as the business cycle, the labour market or Sustainable Development.

Fifth, it is of course of strategic importance to have the necessary expertise in the fields of information and communication technologies available. However, due to the very dynamic development, it is increasingly difficult or even impossible to maintain this expertise in-house. An outsourcing of such services and the corresponding personnel capacity is essential.

Sixth, today it is more important than ever to have specialists in the field of communication and media in the team.

Finally, in today’s administrations, it is not only the classical administrators but also skilled professionals of modern management (quality management, controlling, cost accounting, etc.) who play an important role.

However, the actual composition of the staff depends very much on external factors and framework conditions: Are appropriate training courses offered at the universities? Is the statistical office attractive and competitive on the (local) labour market? In order to improve conditions in this regard, European statistics has launched ‘EMOS’,13 a Master’s degree programme designed to better prepare graduates for their employment in official statistics.

The difficulties that official statistics faces in human resources change over time. For example, it has become increasingly problematic to find well-trained economists for work in National Accounts; apparently, the empirical dimension plays only a subordinate role in today’s economics curricula. In the context of digitisation, developments will take place that must also be reflected in the composition of personal and professional skills. Finally, in the future, more attention will have to be paid to the interplay and interactions of statistics and society, which also requires corresponding specialist capacities. This corresponds with the topic of the present work.

For the sections that follow (in particular, the section on quality), it is important to understand the interaction between the composition of the staff and the statistical culture that is emerging in different areas of work. As a rule, two communities are represented and these could hardly be more different in their views and ways of working.

On the one hand, there are the survey statisticians (especially in the field of social statistics), whose confidence in quality rests on the fact that the entire production process of a survey design (from data collection and processing to the generation of aggregated results) is under their control. Here, quality aspects such as reliability and punctuality are in the foreground, while complete coverage of a topic or consistency is seen as of minor importance. Such an approach is called ‘micro’.

On the other hand, in the field of accounting, the primary concern is a complete and consistent picture of a situation or a subject matter area, while accuracy in details plays a minor role. Such approaches are called ‘macro’.

From these two approaches and cultures arise, in some cases, considerable (micro-macro) differences in the statistical results on the same topic.14 Because of this, it is difficult to subsume the quality of statistical products under a single definition.

2.3 Quality in Official Statistics

2.3.1 Quality—An Old Objective—A Young Concept

In order for official statistics to function as a language, a ‘boundary object’ (Stamhuis 2008; Saetnan et al. 2011) for all kinds of societal interactions and decision-making, it is essential that the quality of statistical products and services is outstanding, an authority in itself. For Porter, ‘the language of quantification may be even more important than English in the European campaign to create a unified business and administrative environment’ (Porter 1995, p. 77). This is the brand-mark and the competitive advantage of official statistics. Once this authority is undermined, be it through real quality problems or only through perception, trust in official statistics will be replaced by suspicion and statistics will become part of political fights and games. Against this background, it is important to define quality of statistics with a much wider scope, including not only the production but also the use side of statistical information and how these two sides are interacting in a dynamic relationship.

As a consequence, the approach to quality in official statistics has changed radically over the past two decades. According to today's prevailing opinion, statistics must be suitable for a particular use: this is the criterion of ‘fitness for purpose’. However, unlike the earlier producer-oriented view and definition of quality, this new objective leads to a very complex world in which simple, one-dimensional solutions are no longer possible or appropriate.

Central to these remarks is the understanding of statistics as products: products within a larger whole (the portfolio), produced under given conditions and constraints and aimed at serving a not necessarily sharply defined group of users. For every single product, as well as for the entire portfolio, it is important to find a (‘Pareto’) optimal solution, meaning to achieve the best of all possible solutions for each statistic and for the statistical programme as a whole (Radermacher 1992, 1999a). This may sound abstract and difficult. However, it becomes plausible and practically solvable in an evolutionary process, with year-by-year changes in planning and production.
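The idea of Pareto optimality in programme planning can be made concrete with a toy sketch. All option names and scores below are invented; the code only shows the screening logic: a candidate design is kept unless some other candidate is at least as good on every criterion and strictly better on at least one.

```python
# Toy sketch (all names and scores hypothetical) of screening candidate
# statistical products for Pareto optimality.
from typing import NamedTuple

class Option(NamedTuple):
    name: str
    relevance: int   # higher is better
    accuracy: int    # higher is better
    cost: int        # lower is better

def dominates(a: Option, b: Option) -> bool:
    """True if a is at least as good as b on all criteria and better on one."""
    at_least_as_good = (a.relevance >= b.relevance
                        and a.accuracy >= b.accuracy
                        and a.cost <= b.cost)
    strictly_better = (a.relevance > b.relevance
                       or a.accuracy > b.accuracy
                       or a.cost < b.cost)
    return at_least_as_good and strictly_better

options = [
    Option("quarterly survey, large sample", relevance=8, accuracy=9, cost=9),
    Option("annual survey, small sample",    relevance=5, accuracy=6, cost=3),
    Option("admin-data estimate",            relevance=5, accuracy=5, cost=3),
]

# Keep only the non-dominated options (the Pareto front).
pareto = [o for o in options if not any(dominates(p, o) for p in options)]
print([o.name for o in pareto])
```

In this invented example the administrative estimate is dominated (same cost as the small survey, but no better on any quality criterion), while the remaining two options represent a genuine trade-off between quality and cost that only a political or managerial decision can resolve.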

In European statistics, the first systematic steps in the area of statistical quality were made at the end of the 1990s through cooperation in the ESS Leadership Group (LEG) on Quality. Initially, the LEG struggled with difficulties inherent in the convergence of two schools of thought: classical approaches from statistical methodology and approaches from industrial quality management. It was very much in the spirit of W. E. Deming’s15 view on ‘profound knowledge’, quality management and learning organisations that the LEG finally elaborated a synthesis report, including 21 recommendations for European statistics (Lyberg et al. 2001).

2.3.2 Quality Objectives and Means to Reach Them

Unfortunately, there is no unified glossary of quality terms in official statistics. A search on the corresponding page of the OECD gives a total of 131 hits.16

Therefore, a more general approach to the topic of quality in statistics will be made here, before going into the various aspects in detail.

Statistical information should, as far as possible, meet three different requirements (see Fig. 2.11).
Fig. 2.11

Dimensions of statistical quality

First, it should provide information about a phenomenon that is relevant to answering current questions. Statistics that interest nobody cannot claim a share of the tight budget of a statistical office. Here, of course, one is immediately confronted with the crucial problem of selecting the topics and issues to which this relevance relates. So who, in the end, determines the statistical programme?

Second, the statistics should be supported by a theory, i.e. they should meet scientific standards. In this regard, it has to be clarified which theory is meant here. For the National Accounts, the case is comparatively clear: they are closely linked to the macro-economy. Less clear, however, is this objective in the remaining areas, even if close links exist between empirical research and official statistics. In the more recent areas (environment, Sustainable Development), the situation is particularly difficult because of the various disciplines involved.17

Third, of course, statistics should meet the criteria of measurability: they should be reliable, punctual, comparable and accessible, to name but a few.

Apparently, different aspects and categories play a role in the (multidimensional) definition of statistical quality. Additionally, because it is not possible, under normal constraints (costs, time, staffing, willingness to provide information), to fully meet all these objectives at the same time, priorities must be set, e.g. in the statistics programme or in the selection of statistical variables.

In this respect, it is advantageous that the portfolio of statistics includes different products, which, with their particular quality profiles, embody the three objectives of relevance, consistency and measurability in different ways. While basic statistics in their great diversity are very much aligned with the goal of measurability, National Accounts focus on scientific consistency within a specific theory. Indicators, in turn, are closer to the goal of relevance. In this respect, the different types of statistical products are not only located at different levels of the information pyramid, but also complement each other.

In European statistics, the basic elements of such a quality approach are manifested in Regulation 223 (European Union 2015: Art 12):
European statistics shall be developed, produced and disseminated on the basis of uniform standards and of harmonised methods. In this respect, the following quality criteria shall apply:
  (a) ‘relevance’, which refers to the degree to which statistics meet current and potential needs of the users;

  (b) ‘accuracy’, which refers to the closeness of estimates to the unknown true values;

  (c) ‘timeliness’, which refers to the period between the availability of the information and the event or phenomenon it describes;

  (d) ‘punctuality’, which refers to the delay between the date of the release of the data and the target date (the date by which the data should have been delivered);

  (e) ‘accessibility’ and ‘clarity’, which refer to the conditions and modalities by which users can obtain, use and interpret data;

  (f) ‘comparability’, which refers to the measurement of the impact of differences in applied statistical concepts, measurement tools and procedures where statistics are compared between geographical areas, sectoral domains or over time;

  (g) ‘coherence’, which refers to the adequacy of the data to be reliably combined in different ways and for various uses.


2.3.3 Code of Practice

In European statistics, in this regard, one has agreed upon a structure—one could also say classification—further elaborated18 and manifested in the Code of Practice (Eurostat 2011):

The European Statistics Code of Practice sets out 15 key principles for the production and dissemination of European official statistics and the institutional environment under which national and Community statistical authorities operate. A set of indicators of good practice for each of the 15 principles provides a reference for reviewing the implementation of the Code.

The European Statistics Code of Practice was adopted by the Statistical Programme Committee in 2005 and was revised by the European Statistical System Committee in September 2011 and 2017 (Eurostat 2018b) (Fig. 2.12).
Fig. 2.12

European Statistics Code of Practice

ES Code of Practice:
  • Institutional environment
    • Institutional and organisational factors have a significant influence on the effectiveness and credibility of a statistical authority developing, producing and disseminating European Statistics. The relevant issues are professional independence, mandate for data collection, adequacy of resources, quality commitment, statistical confidentiality, impartiality and objectivity.

  • Statistical processes
    • European and other international standards, guidelines and good practices are fully observed in the processes used by the statistical authorities to organise, collect, process and disseminate European Statistics. The credibility of the statistics is enhanced by a reputation for good management and efficiency. The relevant aspects are sound methodology, appropriate statistical procedures, non-excessive burden on respondents and cost effectiveness.

  • Statistical output
    • Available statistics meet users’ needs. Statistics comply with the European quality standards and serve the needs of European institutions, governments, research institutions, business concerns and the public generally. The important issues concern the extent to which the statistics are relevant, accurate and reliable, timely, coherent, comparable across regions and countries, and readily accessible by users.

This short summary makes it clear that the quality concept follows the three dimensions that were already introduced at the beginning: Who? (Institutions), How? (Processes) and What? (Products). Like the Code of Practice itself, the way it is implemented is significantly inspired by the methods of Total Quality Management (TQM). This is mainly reflected in the Quality Assurance Framework (QAF) (Eurostat 2018d).

Box 2.1

Quality criteria for European statistics 19

Institutional environment
  1. Professional independence
  1bis. Coordination and cooperation
  2. Mandate for data collection
  3. Adequacy of resources
  4. Commitment to quality
  5. Statistical confidentiality
  6. Impartiality and objectivity

Statistical processes
  7. Sound methodology
  8. Appropriate statistical procedures
  9. Non-excessive burden on respondents
  10. Cost effectiveness

Statistical output
  11. Relevance
  12. Accuracy and reliability
  13. Timeliness and punctuality
  14. Coherence and comparability
  15. Accessibility and clarity

It is essential to the success of quality management that the governance (the ‘who?’) is thoroughly thought through and implemented in a timely manner. With the Code of Practice, e.g., in Europe, the committees of statistics have been reformed. Among other things, a supervisory board (the European Statistical Governance Advisory Board20) was established. In addition, peer reviews21 of the producers of European statistics are carried out at longer intervals.

For the part of European statistics produced by the ECB and the national central banks, a similar quality framework is in place (ECB 2018).

Codes of conduct in official statistics are still comparatively new, having been introduced over the past three decades. Ethical standards for the professional statistician were first adopted by the International Statistical Institute (ISI) in 1985 (ISI 2018); in 2010, the Declaration on Professional Ethics followed (ISI 2010).

Box 2.2

ISI Professional Ethics 22

“Our shared professional values are respect, professionalism, truthfulness and integrity.”

Ethical Principles

  • Pursuing Objectivity: Statisticians should pursue objectivity without fear or favor, only selecting and using methods designed to produce the most accurate results. …

  • Clarifying Obligations and Roles: … statisticians should take care to stay within their area of competence, and seek advice, as appropriate, from others with the relevant expertise.

  • Assessing Alternatives Impartially: Available methods and procedures should be considered and an impartial assessment provided to the employer, client, or funder of the respective merits and limitations of alternatives, along with the proposed method.

  • Conflicting Interests: Statisticians avoid assignments where they have a financial or personal conflict of interest in the outcome of the work. …

  • Avoiding Preempted Outcomes: Any attempt to establish a predetermined outcome from a proposed statistical inquiry should be rejected, …

  • Guarding Privileged Information: Privileged information is to be kept confidential. This prohibition is not to be extended to statistical methods and procedures utilized to conduct the inquiry or produce published data.

  • Exhibiting Professional Competence: Statisticians shall seek to upgrade their professional knowledge and skills, …

  • Maintaining Confidence in Statistics: In order to promote and preserve the confidence of the public, statisticians should ensure that they accurately and correctly describe their results, including the explanatory power of their data. …

  • Exposing and Reviewing Methods and Findings: Adequate information should be provided to the public to permit the methods, procedures, techniques, and findings to be assessed independently.

  • Communicating Ethical Principles: In collaborating with colleagues and others in the same or other disciplines, it is necessary and important to ensure that the ethical principles of all participants are clear, understood, respected, and reflected in the undertaking.

  • Bearing Responsibility for the Integrity of the Discipline: Statisticians are subject to the general moral rules of scientific and scholarly conduct: they should not deceive or knowingly misrepresent or attempt to prevent reporting of misconduct or obstruct the scientific/scholarly research of others.

  • Protecting the Interests of Subjects: Statisticians are obligated to protect subjects, individually and collectively, insofar as possible, against potentially harmful effects of participating. ….

The need for a set of principles governing official statistics became apparent at the end of the 1980s when countries in Central Europe began to change from centrally planned economies to market-oriented democracies. It was essential to ensure that national statistical systems in such countries would be able to produce appropriate and reliable data that adhered to certain professional and scientific standards. Towards this end, the Conference of European Statisticians developed and adopted the Fundamental Principles of Official Statistics in 1991 … a milestone in the history of international statistics was reached when the United Nations Statistical Commission at its Special Session of 11–15 April 1994 adopted the very same set of principles – with a revised preamble – as the United Nations Fundamental Principles of Official Statistics. (UNSD 2018)

The current version of the UN Fundamental Principles (see Box 2.3) was endorsed by the UN General Assembly in its resolution 68/261 of 29 January 2014 (United Nations 2014).

Box 2.3

UN Fundamental Principles of Official Statistics 23
  • Principle 1. Official statistics provide an indispensable element in the information system of a democratic society, serving the Government, the economy and the public with data about the economic, demographic, social and environmental situation. To this end, official statistics that meet the test of practical utility are to be compiled and made available on an impartial basis by official statistical agencies to honour citizens’ entitlement to public information.

  • Principle 2. To retain trust in official statistics, the statistical agencies need to decide according to strictly professional considerations, including scientific principles and professional ethics, on the methods and procedures for the collection, processing, storage and presentation of statistical data.

  • Principle 3. To facilitate a correct interpretation of the data, the statistical agencies are to present information according to scientific standards on the sources, methods and procedures of the statistics.

  • Principle 4. The statistical agencies are entitled to comment on erroneous interpretation and misuse of statistics.

  • Principle 5. Data for statistical purposes may be drawn from all types of sources, be they statistical surveys or administrative records. Statistical agencies are to choose the source with regard to quality, timeliness, costs and the burden on respondents.

  • Principle 6. Individual data collected by statistical agencies for statistical compilation, whether they refer to natural or legal persons, are to be strictly confidential and used exclusively for statistical purposes.

  • Principle 7. The laws, regulations and measures under which the statistical systems operate are to be made public.

  • Principle 8. Coordination among statistical agencies within countries is essential to achieve consistency and efficiency in the statistical system.

  • Principle 9. The use by statistical agencies in each country of international concepts, classifications and methods promotes the consistency and efficiency of statistical systems at all official levels.

  • Principle 10. Bilateral and multilateral cooperation in statistics contributes to the improvement of systems of official statistics in all countries.

2.3.4 Quality Management, Quality Assurance

In combination with European statistics quality standards and practices,24 individual statistical offices use methods and frameworks of modern quality management (TQM) that are also customary in industry, such as EFQM25 or Lean Six Sigma.26 Total Quality Management, in its various conceptual variants, always follows a holistic approach that captures a company in all its facets, from goal setting and strategy through inputs and processes to outputs and outcomes of various forms.

TQM models have proven themselves in practice and guarantee a systematic approach that incorporates all factors of a continuous improvement process. As an example, the EFQM model makes clear with its criteria27 that TQM is not purely about optimising and controlling production processes. Rather, TQM takes the entire company into view. The basic ideas of a comprehensive and systemic quality management are based on thinkers like Russell L. Ackoff, Peter Drucker and, above all, W. Edwards Deming. Deming, incidentally a statistician himself, emphasised that managers must not interpret their role purely technically or economically. If they want to be successful, i.e. to produce excellent quality, they must understand their company, its employees, the interrelationships and backgrounds, and much more, as ‘profound knowledge’ (see Chap. 3). When producing statistics, it is therefore also important that managers possess in-depth professional knowledge and that the quality of management is high. Management of quality means, first of all, quality of management.

Another important branch in the field of statistics quality has to do with cooperation and, above all, international comparability. In Europe, important policy decisions and significant financial flows are immediately influenced by and directly dependent on the comparability and robustness of national statistics. These include, in particular, Member States’ contributions to the EU budget and the Maastricht Treaty criteria and indicators for monitoring the (excessive) deficit of the Member States. In these political areas, quality assurance frameworks have been deeply embedded in the European legal system: the statistical methods have been agreed and fixed in detail, reporting obligations have been standardised, and audit-like supranational controls have been introduced. In addition to statistics with an average quality profile, evidence-based decision-making has given rise to a category of indicators with a special authority and very particular significance for politics and public discourse. It can be predicted without great uncertainty that this trend will continue and, with it, the political expectations vis-à-vis this premium class of statistics. The extent to which politicians’ willingness to pay will thus increase is just as uncertain, as is the answer to the question of whether this involves risky assignments to statistics and unrealistic expectations of quantifiability.

International statistics are increasingly being used in the service of the mainstream logic of ‘evidence-based decision-making’. If statistics are given more responsibility, it will result in a chain of consequences with quality assurance measures28 and, unfortunately, also in attempts to circumvent them.29

The risks arising from the closer relationship between indicators and policy decisions have been addressed by a package of measures30 that go beyond traditional quality reporting. These include, above all, so-called Peer Reviews,31 but also, in selected areas of indicators of particular political significance, surveillance and control measures and, in the case of public finance statistics, even possible sanctions in the event of non-compliance with statistical standards.

In summary, it can be said that the development of the topic of quality in statistics since the early 1990s mirrors the evolution of statistics over this entire period. It is above all the improvements in productivity through modernisation and industrialisation taking place throughout this era, but also the societal and political framework conditions that have had a lasting effect on and changed the statisticians’ relationship to ‘quality’.

2.3.5 Evolution and Continuous Adaptation

The Learning Cycle, Continuous Improvement

The great diversity and complexity of the subject ‘quality’ in official statistics have become clear in the preceding sections. Simple and quick solutions to these questions are therefore not to be expected, nor would they be appropriate. Rather, it becomes very clear that the wide range of processes and products in the statistical factory can only develop over longer periods of time, and that this feature of continuous improvement will remain inseparable from official statistics in the future.

It is therefore important to make a virtue of this fact by finding a dynamic development and balance between preserving and innovating in terms of W. E. Deming’s quality management (see Fig. 2.13):

The PDSA Cycle (Plan-Do-Study-Act) is a systematic process for gaining valuable learning and knowledge for the continual improvement of a product, process, or service. … The cycle begins with the Plan step. This involves identifying a goal or purpose, formulating a theory, defining success metrics and putting a plan into action. These activities are followed by the Do step, in which the components of the plan are implemented, such as making a product. Next comes the Study step, where outcomes are monitored to test the validity of the plan for signs of progress and success, or problems and areas for improvement. The Act step closes the cycle, integrating the learning generated by the entire process, which can be used to adjust the goal, change methods, reformulate a theory altogether, or broaden the learning – improvement cycle from a small-scale experiment to a larger implementation Plan. These four steps can be repeated over and over as part of a never-ending cycle of continual learning and improvement. (Deming Institute 2018)

Fig. 2.13

PDSA cycle
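The four steps of the PDSA cycle can be rendered as a purely schematic sketch. Everything here is a placeholder of my own construction; in reality the plan, do, study and act steps are organisational activities, not computations. The sketch only makes the loop structure explicit: each pass studies the outcome against the success metric and adjusts the plan for the next pass.

```python
# Schematic sketch of the PDSA loop (all names and numbers are placeholders;
# the real steps are organisational, not numerical).
def pdsa_cycle(goal: str, iterations: int = 3) -> tuple[float, str]:
    # Plan: identify a goal and define a success metric.
    plan = {"goal": goal, "metric_target": 1.0}
    quality = 0.0
    focus = "refine"
    for _ in range(iterations):
        # Do: implement the plan (simulated here as closing half the gap).
        quality += 0.5 * (plan["metric_target"] - quality)
        # Study: monitor the outcome against the success metric.
        shortfall = plan["metric_target"] - quality
        # Act: integrate the learning and adjust the plan for the next cycle.
        focus = "broaden" if shortfall < 0.2 else "refine"
    return quality, focus

q, focus = pdsa_cycle("timely release of survey results")
print(q, focus)
```

The deliberately simple ‘Do’ step (halving the remaining gap each cycle) mirrors the text’s point: the cycle never terminates with perfection, it only converges through repeated learning.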

In the spirit of this continuous and systematic process of learning and improving official statistics, dialogue with users and interaction with science is crucial. The planning of official statistics at all levels (variable, product and programme) has to be understood and organised as an evolutionary process, as a sequence of learning cycles and feedback loops.

Over time, changes might be started from all three angles. New demands and political issues trigger new statistical developments, as new data sources or new methodologies do. Historically, it can be observed that these driving forces are also mutually influencing each other, thus stimulating new episodes in official statistics (Desrosières 1998).

It is therefore essential to link the communication process of today with the development of statistics for tomorrow. In part, this loop can be short, if user feedback leads to quick fixes and improvements in services. In part, however, it may take time, since changes in a programme need profound preparations and even more profound developments (Fig. 2.14).
Fig. 2.14

Statistical learning process

This evolutionary development of statistics is confronted with several limiting factors, which could be practical limitations, such as:
  • Clandestine, non-observable phenomena

  • Statistical items in future and elsewhere, relevant for decisions now and here (e.g., capital goods, depreciation, trade chains, Sustainable Development)

  • Values and prices for non-market-goods (can we simulate non-existing markets?)

  • Limitations by resource or time constraints; in cases, where only a limited amount of information and data is available or where limited time is given for the decision-making process

  • Limitations could also relate to the understanding and use of data and information

  • Innumeracy, statistical and data illiteracy

  • High and too high expectations

  • (No) Appetite for high-quality information (Davies 2017).

Since the beginning of official statistics in the nineteenth century, the boundaries have been substantially stretched. Continuous improvement has opened new opportunities, so that today many subjects (e.g., quality of life) that were impossible to observe only a few years ago are fully integrated elements of the standard statistical programme. Nevertheless, it is crucial to understand that basic principles must be respected if the foundation on which trust in official statistics is built is not to be damaged. This is, for example, the reason to refrain from monetising natural resources and their services if they are not valued by market transactions.

Planning and Decision Procedures, Consultation of Users, Governance

Official statistics is a special application and form of statistics that belongs to the public infrastructure of (modern) states. Working methods in official statistics reflect both its political and administrative position and the status and development of societies (i.e. the specific relationship between state and citizens).

How is public infrastructure planned and decided? This is generally an important issue, as it is about providing the greatest possible value and return on taxpayers’ money. The way the statistics programme is decided reflects what is called governance (and what Foucault would have called gouvernementalité32) and the status of the relationship between the state and its institutions on the one side and civil society (the citizens, interest groups) on the other.

Public statistics in a modern and democratic state has to be benchmarked against principles of good governance in the public sector, of which the most important overarching ones are shown in Fig. 2.15.
Fig. 2.15

Overarching principles of good governance.

Adapted from IFAC (2014)

At least for European statistics, it can be stated that the first criterion is already taken very seriously. Both the statistical programmes and the individual statistical acts go through cumbersome legislative procedures, which place extremely high demands on ex-ante impact assessment and consensus-building prior to final decision-making. Of course, such ambitious legal procedures also have a price, namely reduced adaptability and dynamism of the statistical system. It should also be mentioned that important control mechanisms (quality reports to the European Parliament, review by the European Court of Justice, etc.) have been set up and are in place for the implementation of European statistics.

For the second criterion (stakeholder engagement), however, the status quo leaves some room for improvement: In a modern and democratic definition, official statistics is no longer a knowledge tool in the hands of the powerful and mighty. Rather, it must follow principles of neutrality and impartiality, whereby this information infrastructure becomes an important democratic pillar, equally available and accessible for everyone.

The interaction with stakeholders must be governed by principles of transparency, democratic control/supervision and public/legislative conventions. In particular, the programme of work must emerge from a democratic decision-making process, at the end of which a choice is made in favour of the ‘Pareto-optimal’ composition of statistical tasks. Priority setting in this context has an important role to play, as it must facilitate the annual adaptation of the programme following changes in user needs.

The way in which this consultation and decision-making have been organised so far relies mainly on the functioning of ‘official’ procedures concerning the preparation of legislation and political decisions. Modern societies, however, ask for more—more in terms of wider consultation (more room for all active contributions from civil societies), new forms (collection of user needs through social media) and speed (quicker adaptation of the programme).

2.4 National, International and European Statistics

As explained at the beginning of this chapter, official statistics and its history are closely linked to the nation state. This particular form and institution of governance emerged in the eighteenth and nineteenth centuries. Since then, there has been a strong need for statistics in all forms and for all kinds of political decisions.

In order to be able to describe official statistics more precisely, it is therefore necessary to understand the methodical orientation towards the information needs and the institutional integration into the administrative system of the nation state. The entire structure, methodology, design of surveys, the conception of macro-economic accounts, indeed everything in today’s official statistics, bears a ‘national stamp’.

National Accounts provide an overarching ordering system within official statistics. The latest versions of the System of National Accounts SNA 2008 and the European System of Accounts ESA 2010 are the accumulated result of more than six decades of evolution of statistical knowledge, emerging from intensive collaboration between specialists in accounting and statistical methodology, scientists and users. These are international standards of enormous importance for all of official statistics. Hundreds, even thousands, of difficult considerations, methodical decisions and conventions have flowed into them. The most recent version of the ESA Handbook contains 650 pages of definitions and guidelines of all kinds (Eurostat 2013b).

All of these conventions and standards are geared towards and optimised for one goal, namely to quantify the economic activity (production, income, consumption) of a state in a temporal period (year, quarter) as comprehensively and precisely as possible. A closer look at some of the National Accounts aggregates33 explains this:
  • Gross domestic product at market prices (GDP): Gross domestic product at market prices is the final result of the production activity of resident producer units. It can be defined in three ways:

  • production approach: GDP is the sum of gross value added of the various institutional sectors or the various industries plus taxes and less subsidies on products (which are not allocated to sectors and industries). It is also the balancing item in the total economy production account;

  • expenditure approach: GDP is the sum of final uses of goods and services by resident institutional units (final consumption and gross capital formation), plus exports and minus imports of goods and services;

  • income approach: GDP is the sum of uses in the total economy generation of income account (compensation of employees, taxes on production and imports less subsidies, gross operating surplus and mixed income of the total economy).

  • By deducting consumption of fixed capital from GDP, we obtain net domestic product at market prices (NDP).

  • National income (at market prices): Gross (or net) national income (at market prices) represents total primary income receivable by resident institutional units. It equals GDP minus primary income payable by resident institutional units to non-resident institutional units plus primary income receivable by resident institutional units from the rest of the world.

  • Current external balance: The balancing item in the external account of primary income and current transfers represents the surplus (if it is negative) or the deficit (if it is positive) of the total economy on its current transactions (trade in goods and services, primary incomes, current transfers) with the rest of the world.
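The equivalence of the three approaches, and the derivation of NDP and national income from GDP, can be made concrete with a small numerical sketch. All figures below are invented for illustration and come from no actual accounts:

```python
# Illustrative (invented) figures for a hypothetical economy, in billions.

# Production approach: gross value added plus taxes less subsidies on products.
gross_value_added = 900.0
taxes_on_products = 120.0
subsidies_on_products = 20.0
gdp_production = gross_value_added + taxes_on_products - subsidies_on_products

# Expenditure approach: final uses by residents plus exports minus imports.
final_consumption = 700.0
gross_capital_formation = 250.0
exports = 400.0
imports = 350.0
gdp_expenditure = final_consumption + gross_capital_formation + exports - imports

# Income approach: sum of uses in the generation of income account.
compensation_of_employees = 500.0
taxes_on_production_and_imports_less_subsidies = 130.0
gross_operating_surplus_and_mixed_income = 370.0
gdp_income = (compensation_of_employees
              + taxes_on_production_and_imports_less_subsidies
              + gross_operating_surplus_and_mixed_income)

# In a consistent set of accounts all three approaches give the same GDP.
assert gdp_production == gdp_expenditure == gdp_income == 1000.0

# Net domestic product: GDP minus consumption of fixed capital.
consumption_of_fixed_capital = 150.0
ndp = gdp_production - consumption_of_fixed_capital

# Gross national income: GDP minus primary income payable to non-residents
# plus primary income receivable from the rest of the world.
primary_income_payable = 40.0
primary_income_receivable = 60.0
gni = gdp_production - primary_income_payable + primary_income_receivable
```

In real accounts the three approaches are compiled from different source data and only balance after reconciliation; the identity above is the target, not the starting point.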

These definitions make it obvious that the core of National Accounts is anything but trivial. The objective of quantifying the economy of one country in one specific year can only be achieved by establishing in principle what is to be regarded as an activity, which activity belongs to that country and under what conditions it is attributed to that specific year. This, in turn, requires ancillary calculations that allow the demarcation between that country and the rest of the world (e.g., imports/exports, balance of payments) and the allocation of activities to periods (e.g., depreciation).

This brief introduction to the concepts of National Accounts should suffice to illustrate a paradigm that is fundamental to the structure and results of official statistics. The first and most important task of official statistics is to satisfy information needs at the national level. A comprehensive definition of official statistics must therefore take this paradigm into account. The three dimensions of national official statistics are:
  • Temporal dimension: a fixed time period (very often a year, but sometimes also a quarter or a month) or a fixed date (for example, for the population census)

  • Spatial dimension: a country (political delimitation) or a region (province, local unit according to administrative delimitation, i.e. NUTS, the ‘Nomenclature of Territorial Units for Statistics’)

  • Measurement object: resident population and their activities; methodologies (variables, classifications, sampling schemes, etc.) designed to address national needs and priorities (Fig. 2.16).
    Fig. 2.16

    National statistics
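These three dimensions can be read as the composite key of any national statistical figure. A minimal sketch in Python (the class and field names are invented for illustration and do not represent any official data model):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StatisticalKey:
    """Illustrative key of a national official statistic (names invented)."""
    period: str     # temporal dimension, e.g. the year "2016" or quarter "2016-Q3"
    territory: str  # spatial dimension, e.g. country "DE" or NUTS region "DE21"
    indicator: str  # measurement object, e.g. "resident_population"

# Two observations that differ only in the spatial dimension:
national = StatisticalKey(period="2016", territory="DE",
                          indicator="resident_population")
regional = StatisticalKey(period="2016", territory="DE21",
                          indicator="resident_population")
assert national != regional  # distinct cells of the national statistical system
```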

This strength at the national level, however, becomes a difficulty when the statistics of different countries are to be compared. Because comparability has grown in importance over the past decades, statistical methods have been standardised, especially nomenclatures and guidelines for general survey methods in important statistics such as the census. Nonetheless, the national shaping of official statistics limits international comparability, in particular for all statistics that are produced on the basis of administrative sources and thus depend strongly on the respective national legal bases. For instance, labour market statistics, as prepared by the national employment agencies, are geared to national labour market policy, which greatly restricts their use for international comparisons. In Europe, the need for comparability across EU Member States is particularly high, because statistical indicators are incorporated into European policies and are partly associated with considerable consequences. For this reason, standardised sample surveys that follow international definitions have been introduced in important areas (such as the labour market); for example, the Labour Force Survey (LFS).34 Another type of solution to this problem is represented by the Harmonised Index of Consumer Prices (HICP),35 which is derived from the national consumer price indices using a standardised methodology.

These two approaches, standardisation and (European) sample surveys, have brought great progress for European statistics in recent years by methodologically unifying national statistics and thus making their results much more comparable.
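Harmonisation is also what makes European aggregates meaningful: only because the national indices follow one methodology can they be combined, for instance into a euro-area figure as a weighted average. The sketch below uses invented index values and weights; Eurostat's actual HICP aggregation (consumption expenditure weights, chain-linking) is considerably more involved:

```python
# Invented national HICP index levels (same reference period, same methodology).
national_hicp = {"DE": 101.8, "FR": 101.2, "IT": 100.9}

# Invented country weights, e.g. shares of household consumption; must sum to 1.
country_weight = {"DE": 0.45, "FR": 0.35, "IT": 0.20}

# The aggregate index is a weighted arithmetic mean of the national indices;
# this is only defensible because the inputs are methodologically harmonised.
euro_area_hicp = sum(national_hicp[c] * country_weight[c] for c in national_hicp)
# roughly 101.41 with these invented figures
```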

However, this progress in international comparability has its price in other categories of quality. From the point of view of users interested only in national results, the international compromise can be unsatisfactory because it does not meet their information needs as well as purely national statistics once did. In cases where (as in the labour market) national and international statistics are produced side by side, there is regular user confusion, even conflict, where the results allow different interpretations.

Above all, these few examples make one point clear: statistics optimised for national needs have their limits for international issues and comparisons; vice versa, internationally agreed (or even internationally produced) statistics do not necessarily meet all national requirements. After all, statistical quality is not absolute and does not depend on production alone; rather, the quality concept rests on fitness for purpose. If national needs are prioritised, as has been the case in the past, this has consequences at the international level. The transition to more internationally oriented statistical systems is necessary in the face of globalisation, but it will be iterative and will not be spared difficult adjustment phases.

Official statistics produced by international organisations (most notably the UN and OECD) are based essentially on the approach of coordinating and standardising national statistics. Even in European statistics, which by the very existence of the EU have a different (supranational) claim, the basic architecture of the European Statistical System does not look any different; the ‘system’ is a network of autonomous national partners.

This means that statistical design strives for a compromise between national and European data requirements, while data collection and the computation of statistics remain within national competence. In the vast majority of cases, it has been sufficient to agree on methodological standards that are adhered to by all partners (Fig. 2.17).
Fig. 2.17

European Statistical System

In 2009, the European Statistical System (ESS) was established by the European Statistics Regulation 223 (European Union 2015: Art 4):

The European Statistical System (ESS) is the partnership between the Community statistical authority, which is the Commission (Eurostat), and the national statistical institutes (NSIs) and other national authorities responsible in each Member State for the development, production and dissemination of European statistics.

Member States collect data and compile statistics for national and EU purposes. The ESS functions as a network in which Eurostat’s role is to lead the way in the harmonisation of statistics in close cooperation with the national statistical authorities. ESS work concentrates mainly on EU policy areas – but, with the extension of EU policies, harmonisation has been extended to nearly all statistical fields. The ESS also coordinates its work with candidate countries and at European level with other Commission services, agencies and the ECB and international organisations such as OECD, the UN, the International Monetary Fund and the World Bank.36

The concept of a ‘partnership’, introduced in the 1990s in projects launched by the ESS, became in 2009 a key feature of the ESS itself. It embodies a compromise: on the one hand, a ‘system’ that unites representatives of the European Commission (Eurostat) and the Member States (national statistical authorities) in one institutional framework; on the other hand, a system that is organised not stringently or hierarchically, but as a decentralised network of equal partners.

European political developments have repeatedly called for customised statistical solutions. The introduction of the European single market created the need for another form of external trade statistics (i.e. Intrastat37), the Maastricht Treaty called for special statistical monitoring of excessive deficits (i.e. EDP statistics38), the European Central Bank requested solid and comparable price statistics (i.e. HICP39) (Radermacher 2012), and European household surveys (EU-SILC40, LFS41) have been established.

With the increasing importance of statistics for policy-making and hardening austerity measures, such statistical obligations of Member States have been more and more prescribed by law through European regulations or directives. At first, this was associated with advantages for all actors. However, the body of European legislation in the field of statistics has grown rapidly to over 300 individual regulations, resulting in a completely inadequate, inflexible, fragmented and incoherent regulatory apparatus. Parallel to the technical modernisation of the processes, it was therefore necessary to simplify and standardise this set of rules.

European statisticians were at the forefront of the international modernisation activities. With ‘Vision 404’ (COM(2009) 404; Eurostat 2009), a strategy for the following years was outlined that covered all three dimensions: process, product and governance. With the communication ‘GDP and Beyond’ (European Commission 2009), the European Commission set up a work programme aiming at substantial improvements of statistical information.

Meanwhile, these plans have been implemented:

The governance of the European Statistical System was substantially revised (Eurostat 2015; Radermacher 2014). The multiannual programme was adapted to the new business model (European Union 2013). The accounting layer (European System of Accounts 2010) has been modernised and broadened (Eurostat 2013c, 2016a). Basic statistics have been re-engineered (e.g., demographic statistics, population census, consumer price statistics, integrated social, business and agricultural statistics). Close cooperation amongst the partners in the European Statistical System was the enabling factor to forcefully implement the strategy that was outlined in the ESS Vision 2020 (Eurostat 2013a).

The governance framework set by the European Statistical System has been successfully used and filled in by the modernisation measures of recent years. However, there are increasing difficulties that point to fundamental shortcomings of the national statistics paradigm, shortcomings that cannot be fixed with the toolkit used so far. Chapter 4 will elaborate on them in more detail and explain the approaches to overcome them.

2.5 Confidentiality and Access to Confidential Data

Confidentiality of statistical data is a principle of immense strategic importance to official statistics. Only when it is completely certain that respondents’ information will be treated with complete confidentiality and used for no purpose other than statistics can truthful answers and statements in statistical interviews be expected. Any suspicion of other uses, for tax or other administrative purposes, not to mention police or judicial investigations, undermines the delicate relationship of trust between citizens, businesses and the (public) statistical institution.

In this sense, the judgment of the German Constitutional Court on the planned census of 1983 (openJur 1983) is to be interpreted. The court emphasised the right to ‘informational self-determination’ and prohibited the return of statistically collected data to the administration for the purpose of correcting errors in the registration records. Since then, statistics have been regarded as a closed-off area into which administrative data can enter but not return.

A development in the opposite direction has gained massive momentum since the 1990s. For research, especially on social science issues, the aggregated results provided in the tables and databases of official statistics are important, but the actual wealth of the micro-data remained insufficiently explored. It was therefore necessary to open up ways and means of access to individual data for researchers without endangering confidentiality. Through a combination of legal requirements with new mathematical methods, as well as through special secluded premises, such access was realised (albeit under restrictive conditions).
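One simple example of such a mathematical method is primary cell suppression in frequency tables: any cell based on fewer respondents than a minimum count is withheld before release. The sketch below is a minimal illustration with an invented threshold and invented data; real disclosure control additionally requires secondary suppression, so that a withheld cell cannot be recovered from row and column totals:

```python
def suppress_small_cells(table, threshold=3):
    """Primary suppression: replace counts below the threshold with None.
    (Threshold of 3 is an invented example value, not an official rule.)"""
    return {cell: (count if count >= threshold else None)
            for cell, count in table.items()}

# Invented frequency table: number of respondents by (region, occupation).
table = {("North", "baker"): 12, ("North", "glassblower"): 2,
         ("South", "baker"): 25, ("South", "glassblower"): 7}

safe = suppress_small_cells(table)
# Only the cell with fewer than 3 respondents is suppressed.
assert safe[("North", "glassblower")] is None
assert safe[("South", "baker")] == 25
```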

Among other things, the intensive discussion of this topic has led to closer cooperation between statisticians and researchers, as well as the establishment of new working groups and committees.42

2.6 Modernisation

A widely supported starting point concerning the strategic orientation for official statistics is:

Our output has traditionally been determined by the demands of our respective governments and other customers. The process is one of reasoning back from the output desired to survey design because often few or no pre-existing data were available. This paradigm has shaped the way official statistics are designed and produced. … In the future it will become increasingly unrealistic to expect meaningful statistics from this approach, even when results are collected and transmitted electronically. (Vale 2017)

Since the end of the 1990s, a re-engineering of the business model has been under way: single statistical production lines are bundled and integrated, common technical tools are developed and terminology is standardised, thus minimising redundancies, inefficiencies and sources of incoherence. Information is generated by (re-)using available data as far as possible, with the aim of minimising the response burden and costly surveys. In terms of the above-mentioned learning process, this means that the current ‘development loop’ is driven by changes on the production side, which will lead to substantial improvements.

Nevertheless, one must take the implications for the other actors of the learning cycle into account. With the traditional business model, for example, it was not difficult to organise functioning user-producer dialogues, since the participants shared interest in and knowledge of the same subject matter. Agricultural statistics were discussed between the specialists for agricultural policies and the technicians in a specialised branch of the statistical office; the same applied to labour market, population and health statistics and so forth: a balanced arrangement sufficient for static and narrow user needs. As long as statistical offices did not have to cope with substantial resource scarcities (and rapidly changing user needs), it was not necessary to establish overall programme planning, to decide on priorities, and so on. The programme was simply the sum of a great number of partial solutions in each separate area; both users and producers were generally satisfied, the users with their tailor-made products and the producers with their control of the entire production process. This inefficient ‘spaghetti-bowl’ business model of the past is being replaced by the new ‘industrialised-process’ model: multiple-source inputs, standardised production, multiple-purpose outputs. The new business model of production cannot be ‘administrated’ in the traditional manner. It needs to be ‘managed’, including the development of planning tools, a catalogue of products/services, marketing and cost accounting, which means nothing less than a complete overhaul of the traditional culture in official statistics.
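The industrialised model can be sketched as a pipeline of shared, reusable modules instead of one bespoke end-to-end process per statistic. All function names and data below are invented for illustration and only loosely echo standard process models such as the GSBPM:

```python
# Invented sketch of an 'industrialised' statistical production pipeline:
# shared stages reused across domains, multiple sources in, one output out.

def collect(sources):
    """Multi-source input stage: surveys, registers, administrative data."""
    return [record for source in sources for record in source]

def validate(records):
    """Shared editing/validation module, reused by every statistical domain."""
    return [r for r in records if r.get("value") is not None]

def aggregate(records, key):
    """Shared aggregation module producing multi-purpose outputs."""
    totals = {}
    for r in records:
        totals[r[key]] = totals.get(r[key], 0) + r["value"]
    return totals

# Invented micro-records from two different sources.
survey = [{"region": "N", "value": 10}, {"region": "S", "value": None}]
register = [{"region": "S", "value": 7}, {"region": "N", "value": 5}]

output = aggregate(validate(collect([survey, register])), key="region")
assert output == {"N": 15, "S": 7}
```

The point of the sketch is architectural: `validate` and `aggregate` serve every domain, whereas in the ‘spaghetti-bowl’ model each domain maintained its own versions of these steps.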

The national and international modernisation programmes are meanwhile characterised by great convergence. From a European point of view, the ‘Vision 2020’ of the European Statistical System (Eurostat 2018a) as well as the cooperation in the United Nations Economic Commission for Europe (UNECE) (2018) deserve special mention.

At this point, therefore, a summary of the essential components should suffice.43 Work in this area includes:
  • Production and IT: development and introduction of a common business architecture as reference framework for the production processes, improving the conditions for the sharing of IT services and infrastructure and the exchange of (micro)data

  • Data sources and data collection: development of methodologies for mixed-mode and multisource collection, concepts for risk management in using new tools and sources

  • Communication: development of a strategy for dissemination and communication, operational and innovative communication tools

  • Standards and metadata: development of a metadata glossary, standards for linked open data/metadata

  • Human resources and organisational frameworks: creating positive conditions and capacities for the change management needed, training and learning, performance management, building competencies, introducing cost accounting for products and modernisation projects.

In fact, compared to the world of official statistics some 25 years ago, everything has fundamentally changed. Centrally managed manufacturing based on modular components has taken the place of a highly fragmented, decentrally driven production. After a first phase in which registers and administrative sources were established and used for statistical purposes, large and expensive surveys (such as the censuses) have been replaced more and more by mixed procedures, saving costs and the burden on respondents. The only prominent case in which the change went in the opposite direction (replacing an administrative data-based statistic with a survey-based one) is Intrastat, the statistics on trade in goods between countries of the EU.

Nonetheless, this statistical modernisation, like any form of industrialisation, has its price. The tailored shoe and suit fit better than clothing from the factory. The key question, however, is what one can afford and what is ‘fit for purpose’ under these conditions. The answer to this question is clear: only with the considerable efforts and efficiency improvements of recent decades has it been possible for official statistics to meet the political and social demands of today.

2.7 Conclusion: Official Statistics—Modern, Efficient, High Quality

Before elaborating on the driving forces of science and of political conditions in the history of official statistics in the next chapter, the following can be summarised:

First, official statistics are the result of a scientific process. If this sounds too ambitious from a purely scientific standpoint, one may at least point to scientific methods as their basis. Official statistics can be seen as a sub-category of ‘scientific data and information’ that helps us better understand how societies function and develop.

Second, it is equally important that the categories and variables used in official statistics reflect and represent societal conventions (Desrosières 2010, p. 126): ‘To surpass the great divide between knowledge and politics means to take the tools of knowledge seriously politically’ (Desrosières 2010, p. 127). Official statistics are similar in this respect to the legal system: legislative rules are initially defined as a convention before they are subsequently executed by the administration (Supiot 2015; Radermacher and Bischoff 2018 forthcoming).

Third, official statistics is a factory whose task is to guarantee the regular production of standardised information and related services. This ‘industrial’ production has its strengths as well as its limitations. On the one hand, large-scale surveys or even censuses can be pursued and regularly repeated under these conditions. On the other hand, new developments and innovations must be carefully assessed as to whether they can eventually enter routine operations and meet high quality standards; budgetary restrictions require the setting of priorities in the statistics programme; and so on. As in industrial companies, the application of quality management in all its facets (such as EFQM) is now commonplace in official statistics. The biennial quality conferences in European statistics44 are part of this and give a broad overview.

Fourth, as a consequence, the statistical information produced by official statistics is first and foremost a set of ‘products’. Like all other products, they serve a purpose, and their quality should be fit for that purpose: a simple conclusion with far-reaching implications. Above all, it means that quality cannot be seen as something absolute or purely scientific, as under laboratory conditions (Hoof et al. 2014), but rather as depending on many factors of reality: ‘Quality by Design’ (Juran 1992); “What, then, should be taken as priority: data utility, or rather, data quality? Such dilemmas have no preconceived wrong or correct answers. It is best to balance both sides of a conflict in a sensible way.” (Piotrowska-Piątek and Sobieraj 2016, p. 20)

Fifth, official statistics provide a public infrastructure (a public good). They are therefore part of the state administration, which also creates special framework conditions for their work. As a public infrastructure, official statistics are basically tax-financed; their products and services are available free of charge to anyone; ‘open data’ is therefore already taken for granted in official statistics. This has hitherto mainly applied to aggregated results, while access to micro-data is, for legal reasons, subject to special regulations, just as the additional expense of providing micro-data to science requires separate funding. Here, too, solutions should be found that take the current need for research data into account in the basic budgeting of official statistics.

Sixth, official statistics have much in common with the media, in particular, the public ones. The dissemination of information is the common mission and denominator. Close cooperation with journalists and an active role in social media are therefore just as essential for official statistics as keeping pace with the latest forms and methods of communication (Eurostat 2017a, c).

Finally, the way in which official statistics (i.e. statistical offices) are institutionalised and organised reflects the understanding of the role of the state as part of a political agenda. There is no single answer to the question ‘Why Have Government Statistics? (And How to Cut their Cost)’ (Thomas 1984). It was a symbolic act that Margaret Thatcher first targeted the official statistics of the UK when implementing her neo-liberal programme of shrinking the state (that is, the public administration).45 Similar to public media such as radio or TV broadcasters, it is necessary to regularly and critically examine which part of the statistical information should be provided by public institutions and for which private providers (such as research institutes, opinion pollsters or universities) are better suited. As a rule, the answer to this question has emerged from the evolution of a country’s statistical history and depends on whether the cooperation between public and private actors is effectively organised in practice. However, this also means that if conditions change (e.g., in the course of digitisation), adaptation and rationalisation of the different activities must be considered.

Since the beginning of the 1990s, the environment around statistics has changed dramatically due to several factors, such as:
  • Pressure on the public sector; major cuts in budgets and human resources

  • Reduced willingness to respond to statistical surveys; response burden as a political target

  • Exponentially growing importance of ICT and new data sources (e.g., administrative data, GIS)

  • New political demands (e.g., environment, globalisation, migration) and crises (e.g., financial).

Official statistics have successfully met these challenges. With a radical adaptation of the business model, productivity was noticeably improved and new opportunities were created, even though budgetary conditions continued to deteriorate.

It will now be important to maintain this momentum and the change process. This will make it possible to make the transition to Official Statistics 4.0 and to successfully master the associated challenges, above all from digitisation and globalisation.


  1. See more detailed comments on the legal provisions in Radermacher/Bischoff “Article 338” (Radermacher and Bischoff 2018 forthcoming).

  2. The European Free Trade Association (EFTA) and the European Economic Area (EEA).

  3. This circular flow chart corresponds to widely accepted standards on the producer’s side, such as the Generic Statistical Business Process Model (GSBPM) (UNECE 2013) or the Generic Statistical Information Model (GSIM) (UNECE 2017).

  4. This stovepipe approach in the organisation of work is further strengthened if the financing of statistical branches is separated in different budget lines.

  5. This way of producing statistics in parallel but only weakly coordinated processes could be called a ‘vertical organisation’.

  6. This way of producing statistics with exchangeable modules could be called a ‘horizontal organisation’.

  7.

  8. See for example “Good practices when combining some selected administrative sources”.

  9. See for example Darabi (2017) or Kyi et al. (2012).

  10. See the European statistical programme 2013–2017 (European Union 2011, p. 20).

  11. This consideration and distinction are only recent. They result from the modernisation process of the last 20 years, which is presented in a later section.

  12. See for example Eurostat (2016b).

  13.

  14. See for example the case of income, consumption and wealth (Brandolini 2016).

  15. See Deming (2000).

  16.

  17. For the complexity inherent to multidisciplinarity, see Klein (2004).

  18. EU Regulation 223: “The statistical principles set out in this paragraph are further elaborated in the Code of Practice” (European Union 2015: Art 12).

  19. The 2017 edition of the CoP is based on 16 principles.

  20. See ESGAB (2018).

  21. See Eurostat (2018c).

  22. ISI (2010).

  23. Source: United Nations (2014).

  24. See for example ISTAT (2018) or INE Portugal (INE 2018).

  25. See for example Destatis (2018) or Eurostat (2018d).

  26. See for example Statistics Netherlands (CBS 2016a).

  27. See EFQM (2013, p. 9).

  28. See for example IMF (2017, 2018) or OECD (2015).

  29. See for example Eurostat on Greece (European Commission 2010) or IMF on Argentina (IMF 2016).

  30. See in particular “Towards robust quality management for European Statistics” (European Commission 2011).

  31. See Eurostat (2018c).

  32. See for example Foucault (1991).

  33. ESA 2010 (Eurostat 2013b, pp. 273–74).

  34.
  35.
  36.
  37.
  38.
  39.
  40.
  41.

  42. Such as the German Data Forum RatSWD.

  43.

  44. See the overview of the biennial quality conferences; the first one was the quality conference 2001 in Stockholm, Sweden, initiated by the LEG Quality [recommendation 14 (Lyberg et al. 2001)].

  45. See in particular “The legacy of Rayner” (Guy and Levitas 2005, p. 7).


  1. Blanc, Michel, Walter Radermacher, and Thomas Körner. 2001. Quality and Users—Chapter of the Final Report of the LEG on Quality. In Final Report of the LEG on Quality, ed. Lars Lyberg, Mats Bergdahl, Michel Blanc, Max Booleman, Werner Grünewald, Marta Haworth, Lili Japec, Tim Jones, Thomas Körner, Håkan Lindén, Gunilla Lundholm, Margarida Madaleno, Walter Radermacher, Marina Signore, Ioannis Tzougas, and Richard van Brakel. Luxembourg: Eurostat.Google Scholar
  2. Brandolini, Andrea. 2016. The Links Between Household Surveys and Macro Aggregate. In DGINS 2016, ed. Statistics Austria. Vienna.Google Scholar
  3. CBS. 2016a. Budget Cuts at Statistics Netherlands: State of Play. Statistics Netherlands. Accessed January 29, 2018.
  4. CBS. 2016b. Statistics Netherlands Annual Report for 2015. Hague: Statistics Netherlands.Google Scholar
  5. Darabi, Anoush 2017. The UK’s Next Census will be its Last—Here’s Why. In Apolitical. London: Apolitical.Google Scholar
  6. Davies. 2017. How Statistics Lost Their Power—and Why We Should Fear What Comes Next. The Guardian.Google Scholar
  7. Deming, W.E. 2000. Out of the Crisis. Massachusetts Institute of Technology, Center for Advanced Engineering Study.Google Scholar
  8. Deming Institute. 2018. PDSA Cycle. The Deming Institute, Accessed July 12, 2018.
  9. Desrosières, Alain. 1998. The Politics of Large Numbers—A History of Statistical Reasoning. Cambridge, MA: Harvard University Press.Google Scholar
  10. Desrosières, Alain. 2010. A Politics of Knowledge-tools—The Case of Statistics. In Between Enlightenment and Disaster, ed. Linda Sangolt. Brussels: P.I.E. Peter Lang.Google Scholar
  11. ECB. 2018. The ECB Statistics Quality Framework and Quality Assurance Procedures. ECB. Accessed January 29, 2018.
  12. EFQM. 2013. EFQM Excellence Model. Brussels: EFQM.Google Scholar
  13. ESGAB. 2018. European Statistical Governance Advisory Board. Eurostat, Accessed January 29, 2018.
  14. European Commission. 2009. Communication from the Commission to the European Parliament and the Council on the Production Method of EU Statistics: A Vision for the Next Decade. In COM(2009) 404.Google Scholar
  15. European Commission. 2010. Report on Greek Government Deficit and Debt Statistics. Brussels: European Commission.Google Scholar
  16. European Commission. 2011. Towards Robust Quality Management for European Statistics, ed. Eurostat, 10. Brussels: European Commission.Google Scholar
  17. European Union. 2011. European Statistical Programme 2013–2017. In COM(2011) 928 final, ed. the European Parliament and the Council. Brussels: European Commission.Google Scholar
  18. European Union. 2013. Regulation (EU) No 99/2013 of the European Parliament and of the Council of 15 January 2013 on the European Statistical Programme 2013–17, ed. European Commission, 9 February 2013. Brussels: Official Journal of the European Union.Google Scholar
  19. European Union. 2015. Regulation (EC) No 223/2009 of the European Parliament and of the Council of 11 March 2009 on European statistics and repealing Regulation (EC, Euratom) No 1101/2008 of the European Parliament and of the Council on the transmission of data subject to statistical confidentiality to the Statistical Office of the European Communities, Council Regulation (EC) No 322/97 on Community Statistics, and Council Decision 89/382/EEC, Euratom establishing a Committee on the Statistical Programmes of the European Communities. In 2009R0223—EN—08.06.2015—001.001—1, ed. European Union. Luxembourg: © European Union,, 1998–2019.
  20. Eurostat. 2009. Communication from the Commission to the European Parliament and the Council on the Production Method of EU Statistics: A Vision for the Next Decade, ed. European Commission. Brussels.Google Scholar
  21. Eurostat. 2011. European Statistics Code of Practice for the National and Community Statistical Authorities—Adopted by the European Statistical System Committee 28th September 2011, ed. Eurostat. Luxembourg: Eurostat.Google Scholar
  22. Eurostat. 2013a. ESS Vision 2020—Building the Future of European Statistics. Eurostat.
  23. Eurostat. 2013b. European System of Accounts ESA 2010. Luxembourg: Publications Office of the European Union.Google Scholar
  24. Eurostat. 2013c. Regulation (EU) No 549/2013 of the European Parliament and of the Council of 21 May 2013 on the European system of national and regional accounts in the European Union, ed. European Union. Brussels: Official Journal of the European Union.Google Scholar
  25. Eurostat. 2015. Regulation (EC) No 223/2009 of the European Parliament and of the Council. In 2009R0223—EN—08.06.2015—001.001—1. Luxembourg: Eurostat.Google Scholar
  26. Eurostat. 2016a. Report from the Commission to the European Parliament and the Council on the Implementation of Regulation (EU) No 691/2011 on European Environmental Economic Accounts. Brussels: European Commission.Google Scholar
  27. Eurostat. 2016b. Sharing Statistical Production and Dissemination Services and Solutions in the European Statistical System, ed. Dir B. Luxembourg: Eurostat.Google Scholar
  28. Eurostat. 2017a. Digicom—Users at the Forefront. Eurostat.
  29. Eurostat. 2017b. Eurostat Catalogue of Statistical Products 2017. Luxembourg: Eurostat.
  30. Eurostat. 2017c. PART 2—Communicating Through Indicators. Luxembourg: Eurostat.
  31. Eurostat. 2018a. ESS Vision 2020. Eurostat. Accessed January 30, 2018.
  32. Eurostat. 2018b. European Statistics Code of Practice—For the National Statistical Authorities and Eurostat. Luxembourg: Publications Office of the European Union.
  33. Eurostat. 2018c. Peer Reviews in the European Statistical System. Eurostat. Accessed January 29, 2018.
  34. Eurostat. 2018d. Quality Assurance Framework of the European Statistical System. Eurostat.
  35. Foucault, Michel. 1991. Governmentality. In The Foucault Effect, ed. Graham Burchell, Colin Gordon, and Peter Miller. Chicago: University of Chicago Press.
  36. Guy, W., and R. Levitas. 2005. Interpreting Official Statistics. London: Taylor & Francis.
  37. Hoof, F., E.M. Jung, and U. Salaschek. 2014. Jenseits des Labors: Transformationen von Wissen zwischen Entstehungs- und Anwendungskontext [Beyond the laboratory: transformations of knowledge between contexts of origin and application]. transcript Verlag.
  38. IFAC. 2014. International Framework: Good Governance in the Public Sector—Executive Summary. London: The International Federation of Accountants (IFAC) and the Chartered Institute of Public Finance and Accountancy (CIPFA).
  39. IMF. 2016. Statement by the IMF Executive Board on Argentina. Washington, DC: IMF.
  40. IMF. 2017. IMF Standards for Data Dissemination. IMF. Accessed January 29, 2018.
  41. IMF. 2018. Standards & Codes. IMF. Accessed January 29, 2018.
  42. INE. 2018. Quality in Statistics. Statistics Portugal. Accessed January 30, 2018.
  43. ISI. 2010. ISI Declaration on Professional Ethics. The Hague, The Netherlands: International Statistical Institute.
  44. ISI. 2018. ISI Declaration on Professional Ethics—Adopted in August 1985—Background Note. ISI. Accessed January 29, 2018.
  45. ISTAT. 2018. Quality Commitment. ISTAT. Accessed January 30, 2018.
  46. Klein, Julie Thompson. 2004. Interdisciplinarity and Complexity: An Evolving Relationship. E:CO 6 (Special Double Issue): 2–10.
  47. Juran, J.M. 1992. Juran on Quality by Design: The New Steps for Planning Quality Into Goods and Services. Free Press.
  48. Kyi, Gregor, Bettina Knauth, and Walter Radermacher. 2012. A Census is a Census is a Census? In UNECE-Eurostat Expert Group Meeting on Censuses Using Registers. Geneva: UNECE Conference of European Statisticians.
  49. Lyberg, Lars, Mats Bergdahl, Michel Blanc, Max Booleman, Werner Grünewald, Marta Haworth, Lili Japec, Tim Jones, Thomas Körner, Håkan Lindén, Gunilla Lundholm, Margarida Madaleno, Walter Radermacher, Marina Signore, Ioannis Tzougas, and Richard van Brakel. 2001. Summary Report from the Leadership Group (LEG) on Quality. Luxembourg: Eurostat.
  50. OECD. 2015. Recommendation of the OECD Council on Good Statistical Practice. Paris: OECD.
  51. openJur. 1983. BVerfG, Urteil vom 15. Dezember 1983, Az. 1 BvR 209/83, 1 BvR 484/83, 1 BvR 420/83, 1 BvR 362/83, 1 BvR 269/83, 1 BvR 440/83 (Volkszählungsurteil) [Federal Constitutional Court, judgment of 15 December 1983, the 'census judgment']. Accessed February 26, 2018.
  52. Piotrowska-Piątek, Agnieszka, and Małgorzata Sobieraj. 2016. On the Way to the Statistics’ Quality: Quality as a Challenge of Public Statistics in Poland. World Scientific News 48: 17–23.
  53. Porter, Theodore M. 1995. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.
  54. Radermacher, Walter. 1992. Methoden und Möglichkeiten der Qualitätsbeurteilung von statistischen Informationen aus der Fernerkundung/Methods and Possibilities of Assessing the Quality of Statistical Data of Remote Sensing. Jahrbücher für Nationalökonomie und Statistik: 169–179.
  55. Radermacher, Walter. 1999. Indicators, Green Accounting and Environment Statistics: Information Requirements for Sustainable Development. International Statistical Review: A Journal of the International Statistical Institute and Its Associations 67: 339–354.
  56. Radermacher, Walter. 2008. Data Quality in Multiple Source Mixed Mode Designs. In Q2008 European Conference on Quality in Official Statistics, Rome.
  57. Radermacher, Walter. 2011. ESS Vision and Ways for Cooperation. In Workshop on Strategic Developments in Business Architecture in Statistics, ed. UNECE. Geneva: UNECE.
  58. Radermacher, Walter J. 2012. Zahlen zählen—Gedanken zur Zukunft der amtlichen Statistik in Europa [Numbers count—thoughts on the future of official statistics in Europe]. AStA Wirtschafts- und Sozialstatistisches Archiv 2012: 285–298.
  59. Radermacher, Walter. 2014. The European Statistics Code of Practice as a Pillar to Strengthen Public Trust and Enhance Quality in Official Statistics. Journal of the Statistical and Social Inquiry Society of Ireland 43: 27–33.
  60. Radermacher, Walter, and Pierre Bischoff. 2018, forthcoming. Article 338. In Treaty on the Functioning of the European Union—A Commentary, ed. H.-J. Blanke and S. Mangiameli. Springer.
  61. Radermacher, Walter, Joachim Weisbrod, and Dominik Asef. 2004. Bedarf, Qualität, Belastung: Optimierung als Interessenausgleich [Need, quality, burden: optimisation as a balancing of interests]. In WiSta Wirtschaft und Statistik.
  62. Saetnan, Ann Rudinow, Heidi Mork Lomell, and Svein Hammer. 2011. The Mutual Construction of Statistics and Society. New York, NY: Routledge.
  63. Stamhuis, Ida H. 2008. Statistical Thought and Practice: A Unique Approach in the History and Development of Sciences? In The Statistical Mind in Modern Society: The Netherlands 1850–1940, ed. I.H. Stamhuis, P.M.M. Klep, and J.G.S.J. van Maarseveen. Amsterdam: Aksant.
  64. Supiot, Alain. 2015. La Gouvernance par les nombres [Governance by numbers]. Paris: Fayard.
  65. Thomas, Ray. 1984. Why Have Government Statistics? (And How to Cut their Cost). Journal of Public Policy 4: 85–102.
  66. UNECE. 2007. Register-Based Statistics in the Nordic Countries—Review of Best Practices with Focus on Population and Social Statistics. Geneva: United Nations Publication.
  67. UNECE. 2013. Generic Statistical Business Process Model (GSBPM).
  68. UNECE. 2017. The Generic Statistical Information Model (GSIM). UNECE.
  69. UNECE. 2018. Modernization of Official Statistics. UNECE. Accessed January 30, 2018.
  70. United Nations. 2014. Fundamental Principles of Official Statistics. New York.
  71. UNSD. 2018. Fundamental Principles of National Official Statistics. UNSD. Accessed January 29, 2018.
  72. Vale, Steven. 2017. Strategic Vision of the HLG-MOS. UNECE High-Level Group for the Modernisation of Official Statistics.

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Statistical Sciences, Sapienza University of Rome, Rome, Italy
