
FIFTH REVIEW OF THE FUND'S DATA STANDARDS INITIATIVES
Data Quality Assessment Framework and Data Quality Program

Prepared by the Statistics Department
(In consultation with other departments)

June 25, 2003

  I. Introduction

  II. The Data Quality Assessment Framework

  III. Refinements to the DQAF

  IV. The DQAF at the Center of the DQP

  V. Priorities for Future Work

Text Boxes

  1. Data Quality Program (DQP)
  2. Helping to Define a Road Map for Statistical Development

Appendix

I. Introduction

1. The Data Quality Assessment Framework (DQAF) was developed to address the Executive Board's interest in data quality as expressed during the December 1997 discussion of the Progress Report on the Provision of Information to the Fund for Surveillance.1 This interest was reaffirmed at the Third Review of the Fund's Data Standards Initiatives in March 2000 and during the discussion of Data Provision to the Fund for Surveillance Purposes in June of that year. A paper outlining the DQAF was presented to the Executive Board in July 2001, as background to the discussion on the Fourth Review of the Fund's Data Standards Initiatives. The Executive Board welcomed the DQAF and supported its integration into the data module of the Report on the Observance of Standards and Codes (ROSC). The Board also endorsed the integration of the various applications of the DQAF in an overall data quality assessment program.

2. This Supplement presents an overview of the DQAF and its role in the data quality program (DQP).2 Section II summarizes the DQAF, Section III briefly describes a set of refinements to it, Section IV outlines its role in the DQP, and Section V sets out the priorities for the work ahead under the DQP.

II. The Data Quality Assessment Framework

3. The DQAF provides a structure for assessing data quality by comparing country statistical practices with best practices, including internationally accepted methodologies. Rooted in the United Nations Fundamental Principles of Official Statistics,3 it is the product of an intensive consultation with national and international statistical authorities and data users inside and outside the Fund. It focuses on the quality-related features of governance of statistical systems, core statistical processes, and statistical products. Under the DQAF, assessments have a six-part structure starting with a review of the legal and institutional environment (prerequisites of quality) and followed by an analysis of five dimensions of quality. The DQAF has a cascading structure, moving from the dimensions common to all datasets, as captured in the Generic Framework, to the more detailed aspects appropriate to individual datasets in the dataset-specific DQAFs.
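To illustrate the cascading structure described above, the following minimal sketch (in Python) shows one possible way to represent the hierarchy of dimensions, elements, and indicators. It is not part of the paper or of any Fund tooling; the class and field names are hypothetical, and the entries shown are abbreviated fragments of the Generic Framework reproduced in the Appendix.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Indicator:
        code: str          # e.g., "0.1.1"
        text: str

    @dataclass
    class Element:
        code: str          # e.g., "0.1"
        name: str
        indicators: List[Indicator] = field(default_factory=list)

    @dataclass
    class Dimension:
        code: str          # "0" for prerequisites, "1" through "5" for the quality dimensions
        name: str
        elements: List[Element] = field(default_factory=list)

    # A small fragment of the Generic Framework (see the Appendix); a dataset-specific
    # DQAF would keep this skeleton and attach more detailed, dataset-level items.
    generic_framework = [
        Dimension("0", "Prerequisites of quality", [
            Element("0.1", "Legal and institutional environment", [
                Indicator("0.1.1", "The responsibility for collecting, processing, and "
                                   "disseminating the statistics is clearly specified."),
            ]),
        ]),
        Dimension("1", "Assurances of integrity", [
            Element("1.1", "Professionalism", [
                Indicator("1.1.1", "Statistics are produced on an impartial basis."),
            ]),
        ]),
    ]

A dataset-specific DQAF (e.g., for national accounts) would retain the same dimensions and elements but carry the more detailed, dataset-level indicators, which is the cascading property noted above.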

4. As intended, the DQAF has proved to be valuable for at least three groups of users:

  • the Fund: strengthening its operational work through assessments of data quality included in data ROSCs. These assessments help strengthen IMF surveillance and program design, e.g., by raising the profile of institutional weaknesses in discussions with country authorities. They also provide a sound basis for consideration of technical assistance (TA) needs and the design of TA programs;

  • country authorities: enabling the self-assessment of statistical systems. Such assessments encourage the preparation of well-targeted improvement plans that could help garner donor support as needed; and

  • private and public data users: providing an overview of the dimensions that make up data quality and equipping users to gauge data quality for their own purposes.

III. Refinements to the DQAF

5. The DQAF has been refined to reflect experience and international statistical developments since the Fourth Review of the Fund's Data Standards Initiatives. These experiences and developments include the following:

  • good statistical practices identified through data ROSCs and developments in the Special Data Dissemination Standard and the General Data Dissemination System;

  • methodological improvements made at both the international level (e.g., Government Finance Statistics Manual 2001) and the regional level (e.g., European Union guidelines); and

  • work with other international organizations on harmonizing approaches to data quality.

6. The refinements have been made to fill some gaps at the most detailed level and to address overlapping coverage in a few areas. Most refinements entailed introducing greater precision in the description of good statistical practices (e.g., separate assessment of the adequacy of staff resources, facilities, and computer resources). Drawing from the range of countries' experiences, more examples were cited. Greater attention was given to the effectiveness and efficiency of the management of statistical processes. In addition, some elements of the DQAF were reordered (e.g., "relevance" shifted from serviceability to prerequisites of data quality) or renamed ("integrity" renamed "assurance of integrity"). These refinements have been incorporated in the Generic Framework presented in the Appendix.

IV. The DQAF at the Center of the DQP

7. The DQP comprises a set of well-integrated initiatives centered on the DQAF.4 As shown in Box 1, these initiatives consist of applications of the DQAF, projects to support and promote good statistical practices identified in the DQAF, and the maintenance and development of dataset-specific DQAFs.

8. In addition to its use in data ROSCs, the DQAF has been applied in the Fund's statistical TA program and in the statistical capacity building indicators developed under the auspices of the Partnership in Statistics for Development in the 21st Century (PARIS21).5 The statistical capacity building indicators provide a snapshot view of the statistical system of a country with a set of 18 DQAF-based qualitative indicators and 16 quantitative indicators. For Fund TA, recent experience points to the key role of the DQAF in enhancing the prioritization and effectiveness of TA. In this setting, some data ROSC missions have provided diagnostics leading to TA, and others have validated statistical improvements resulting from earlier TA (Box 2). Also, data ROSC/TA missions have facilitated countries' subscription to the SDDS or participation in the GDDS.
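As a purely illustrative sketch of how the DQAF's indicator structure lends itself to checklist-style applications such as the DQAF-based qualitative indicators and countries' self-assessments mentioned above, the following builds on the hypothetical representation sketched in Section II. The rating scale and function names are assumptions for illustration and do not reproduce the actual ROSC or PARIS21 instruments.

    from typing import Dict, List

    # Hypothetical four-point qualitative scale used here for illustration; the
    # scales actually used in data ROSCs and the PARIS21 indicators may differ.
    RATINGS = ("observed", "largely observed", "largely not observed", "not observed")

    def assessment_checklist(framework) -> Dict[str, str]:
        """Flatten a framework (a list of Dimension objects from the earlier
        sketch) into an indicator-level checklist, initially unrated."""
        return {
            indicator.code: "unrated"
            for dimension in framework
            for element in dimension.elements
            for indicator in element.indicators
        }

    def rate(checklist: Dict[str, str], code: str, rating: str) -> None:
        """Record a rating for one indicator, rejecting values off the scale."""
        if rating not in RATINGS:
            raise ValueError(f"unknown rating: {rating}")
        checklist[code] = rating

    # Example: rate one indicator and group the results by quality dimension.
    checklist = assessment_checklist(generic_framework)
    rate(checklist, "0.1.1", "largely observed")
    by_dimension: Dict[str, List[str]] = {}
    for code, rating in checklist.items():
        by_dimension.setdefault(code.split(".")[0], []).append(rating)
    print(by_dimension)

Grouping ratings by dimension mirrors the snapshot character of the capacity building indicators without implying any particular scoring methodology.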

Box 1: Data Quality Program (DQP)

The DQP comprises a set of well-integrated initiatives, centered on the DQAF, that permit the identification, promotion, and assessment of good statistical practices.

Applications of the DQAF

Data ROSCs
TA programs and their evaluation
PARIS21 statistical capacity building indicators
Countries' self-assessment and peer review
Users' guide to DQAF applications (continuing)

Support for good statistical practices identified in the DQAF

Data Quality Reference Site (http://dsbb.imf.org/Applications/web/dqrs/dqrshome/)
Development of guidance on specific practices
Compendium of good statistical practices (future work)

Maintenance and development of dataset-specific DQAFs

  • Currently used
    National accounts
    Consumer price indices
    Producer price indices
    Government finance statistics
    Monetary statistics
    Balance of payments

  • Jointly prepared with the World Bank
    Consumer expenditures under poverty

  • Under development with other international organizations
    External debt statistics (with the Inter-Agency Task Force on Finance Statistics)
    Merchandise trade (with the Task Force on International Trade Statistics)
    Education statistics (with the World Bank and the United Nations Educational, Scientific and Cultural Organization)
    Labor statistics (with the International Labour Office)

Box 2: Helping to Define a Road Map for Statistical Development

Two key elements of the Data Quality Program-data quality assessments included in data ROSCs and provision of TA-reinforce each other in helping to define a road map for statistical development.

Turkey is an example of a country where a data ROSC helped direct and prioritize TA.

  • A data ROSC in October 2001 confirmed that data dissemination was in observance of the SDDS (Turkey had subscribed in 1996), but identified shortcomings. In particular, the ROSC mission saw scope for updating the methodological basis of the national accounts, improving the accuracy and reliability of the price indices, redressing the fragmentation of responsibility for fiscal accounting, and enhancing the coverage and timeliness of the fiscal statistics.

  • Following up on the ROSC recommendations, in April 2002, a TA mission assisted the State Institute of Statistics in revising the national accounts, including a rebasing of the constant price estimates through 1996. In July 2002, a TA mission reviewed progress on the implementation of new accounting and reporting arrangements for fiscal data, and also provided input on the draft law on public financial management and financial control. In September 2002, a TA mission assisted in preparation for the planned revision of the price indices.

Kazakhstan is an example of a country where extensive TA helped establish sound statistical practices, which were subsequently validated by a data ROSC mission.

  • Prior to independence (1991), the statistical system supported central planning. In this context, relatively few data expressed in monetary terms were collected. Data on the balance of payments, external debt, and national accounts were not available, and shortcomings affected external trade, price, monetary, and fiscal statistics. With TA from the Fund and others, the new authorities addressed the challenge of transitioning to a statistical system consistent with the data needs of a market economy. From the early 1990s, a sustained TA program focused on developing the institutional framework appropriate to the needs of a market economy and establishing procedures for collecting and compiling macroeconomic statistics.


  • In assessing Kazakhstan's statistical system, a data ROSC in 2002 concluded that the quality of its macroeconomic statistics had improved significantly. The ROSC mission noted that the authorities had a good track record of implementing TA recommendations. In March 2003, Kazakhstan became the first GDDS participant to graduate to the SDDS.

9. Data ROSCs and related TA have identified areas for priority work to provide the international infrastructure to support countries' efforts towards strengthening their statistical capacity. For example, ROSCs confirmed that, despite the critical importance of revisions of data, statistical agencies are often less than transparent in providing information about causes of revisions and the schedule of revisions. The May 2002 Executive Board discussion of Data Provision to the Fund for Surveillance encouraged national authorities to articulate their policies on revisions, and revision practices were also an issue in the June 2003 Board discussion on Strengthening the Effectiveness of Article VIII, Section 5. STA staff is nearing completion of a paper that, drawing on country experiences, proposes a set of good practices for revision policies. STA will invite international comment, including at an upcoming meeting of the heads of statistical units of international organizations, as a step towards an internationally agreed set of practices to guide countries.

10. Dataset-specific DQAFs are used in ROSCs for six datasets: national accounts, consumer price indices, producer price indices, government finance statistics, monetary statistics, and balance of payments. The success of the approach is evidenced by other international agencies' interest in developing additional dataset-specific DQAFs. In collaboration with the World Bank, a DQAF has been prepared for consumer expenditures under poverty. Four more are underway in collaboration with other international agencies (see Box 1).

V. Priorities for Future Work

11. The DQAF-based work within the DQP will proceed on three fronts: TA and statistical capacity building efforts, promotion of good statistical practices, and collaboration with international organizations.

TA and capacity building:

  • guiding countries' statistical plans and related TA requests, including coverage of legal and organizational matters;

  • promoting the PARIS21 statistical capacity building indicators for use by countries in self-assessment and peer review; and

  • developing a users' guide to DQAF applications.

Good statistical practices:

  • identifying and promoting good statistical practices in key areas, including completion of the work on revision policies; and

  • producing a compendium of good statistical practices drawing from data ROSCs.

Collaboration with international organizations:

  • harmonizing the DQAF and the related glossary with the quality frameworks emerging in other international organizations (e.g., Eurostat and the OECD);

  • pursuing work on methodological guidelines with supranational and regional bodies, including harmonization of guidelines, notably for the more recent guidelines such as the Government Finance Statistics Manual 2001 and the Monetary and Financial Statistics Manual (2000); and

  • extending the DQAF to socio-demographic datasets to buttress the GDDS' socio-demographic component and to support statistical development in areas relevant to countries' poverty reduction strategies.

APPENDIX

Data Quality Assessment Framework-Generic Framework
(July 2003 Framework)

Each quality dimension comprises elements, and each element is assessed through the indicators listed beneath it.

0. Prerequisites of quality

  0.1 Legal and institutional environment—The environment is supportive of statistics.
    0.1.1 The responsibility for collecting, processing, and disseminating the statistics is clearly specified.
    0.1.2 Data sharing and coordination among data-producing agencies are adequate.
    0.1.3 Individual reporters' data are to be kept confidential and used for statistical purposes only.
    0.1.4 Statistical reporting is ensured through legal mandate and/or measures to encourage response.

  0.2 Resources—Resources are commensurate with needs of statistical programs.
    0.2.1 Staff, facilities, computing resources, and financing are commensurate with statistical programs.
    0.2.2 Measures to ensure efficient use of resources are implemented.

  0.3 Relevance—Statistics cover relevant information on the subject field.
    0.3.1 The relevance and practical utility of existing statistics in meeting users' needs are monitored.

  0.4 Other quality management—Quality is a cornerstone of statistical work.
    0.4.1 Processes are in place to focus on quality.
    0.4.2 Processes are in place to monitor the quality of the statistical program.
    0.4.3 Processes are in place to deal with quality considerations in planning the statistical program.

1. Assurances of integrity
The principle of objectivity in the collection, processing, and dissemination of statistics is firmly adhered to.

  1.1 Professionalism—Statistical policies and practices are guided by professional principles.
    1.1.1 Statistics are produced on an impartial basis.
    1.1.2 Choices of sources and statistical techniques as well as decisions about dissemination are informed solely by statistical considerations.
    1.1.3 The appropriate statistical entity is entitled to comment on erroneous interpretation and misuse of statistics.

  1.2 Transparency—Statistical policies and practices are transparent.
    1.2.1 The terms and conditions under which statistics are collected, processed, and disseminated are available to the public.
    1.2.2 Internal governmental access to statistics prior to their release is publicly identified.
    1.2.3 Products of statistical agencies/units are clearly identified as such.
    1.2.4 Advance notice is given of major changes in methodology, source data, and statistical techniques.

  1.3 Ethical standards—Policies and practices are guided by ethical standards.
    1.3.1 Guidelines for staff behavior are in place and are well known to the staff.

2. Methodological soundness
The methodological basis for the statistics follows internationally accepted standards, guidelines, or good practices.

  2.1 Concepts and definitions—Concepts and definitions used are in accord with internationally accepted statistical frameworks.
    2.1.1 The overall structure in terms of concepts and definitions follows internationally accepted standards, guidelines, or good practices.

  2.2 Scope—The scope is in accord with internationally accepted standards, guidelines, or good practices.
    2.2.1 The scope is broadly consistent with internationally accepted standards, guidelines, or good practices.

  2.3 Classification/sectorization—Classification and sectorization systems are in accord with internationally accepted standards, guidelines, or good practices.
    2.3.1 Classification/sectorization systems used are broadly consistent with internationally accepted standards, guidelines, or good practices.

  2.4 Basis for recording—Flows and stocks are valued and recorded according to internationally accepted standards, guidelines, or good practices.
    2.4.1 Market prices are used to value flows and stocks.
    2.4.2 Recording is done on an accrual basis.
    2.4.3 Grossing/netting procedures are broadly consistent with internationally accepted standards, guidelines, or good practices.

3. Accuracy and reliability
Source data and statistical techniques are sound and statistical outputs sufficiently portray reality.

  3.1 Source data—Source data available provide an adequate basis to compile statistics.
    3.1.1 Source data are obtained from comprehensive data collection programs that take into account country-specific conditions.
    3.1.2 Source data reasonably approximate the definitions, scope, classifications, valuation, and time of recording required.
    3.1.3 Source data are timely.

  3.2 Assessment of source data—Source data are regularly assessed.
    3.2.1 Source data, including censuses, sample surveys, and administrative records, are routinely assessed, e.g., for coverage, sample error, response error, and non-sampling error; the results of the assessments are monitored and made available to guide statistical processes.

  3.3 Statistical techniques—Statistical techniques employed conform to sound statistical procedures.
    3.3.1 Data compilation employs sound statistical techniques to deal with data sources.
    3.3.2 Other statistical procedures (e.g., data adjustments and transformations, and statistical analysis) employ sound statistical techniques.

  3.4 Assessment and validation of intermediate data and statistical outputs—Intermediate results and statistical outputs are regularly assessed and validated.
    3.4.1 Intermediate results are validated against other information where applicable.
    3.4.2 Statistical discrepancies in intermediate data are assessed and investigated.
    3.4.3 Statistical discrepancies and other potential indicators of problems in statistical outputs are investigated.

  3.5 Revision studies—Revisions, as a gauge of reliability, are tracked and mined for the information they may provide.
    3.5.1 Studies and analyses of revisions are carried out routinely and used internally to inform statistical processes (see also 4.3.3).

4. Serviceability
Statistics, with adequate periodicity and timeliness, are consistent and follow a predictable revisions policy.

  4.1 Periodicity and timeliness—Periodicity and timeliness follow internationally accepted dissemination standards.
    4.1.1 Periodicity follows dissemination standards.
    4.1.2 Timeliness follows dissemination standards.

  4.2 Consistency—Statistics are consistent within the dataset, over time, and with major datasets.
    4.2.1 Statistics are consistent within the dataset.
    4.2.2 Statistics are consistent or reconcilable over a reasonable period of time.
    4.2.3 Statistics are consistent or reconcilable with those obtained through other data sources and/or statistical frameworks.

  4.3 Revision policy and practice—Data revisions follow a regular and publicized procedure.
    4.3.1 Revisions follow a regular and transparent schedule.
    4.3.2 Preliminary and/or revised data are clearly identified.
    4.3.3 Studies and analyses of revisions are made public (see also 3.5.1).

5. Accessibility
Data and metadata are easily available and assistance to users is adequate.

  5.1 Data accessibility—Statistics are presented in a clear and understandable manner, forms of dissemination are adequate, and statistics are made available on an impartial basis.
    5.1.1 Statistics are presented in a way that facilitates proper interpretation and meaningful comparisons (layout and clarity of text, tables, and charts).
    5.1.2 Dissemination media and format are adequate.
    5.1.3 Statistics are released on a pre-announced schedule.
    5.1.4 Statistics are made available to all users at the same time.
    5.1.5 Statistics not routinely disseminated are made available upon request.

  5.2 Metadata accessibility—Up-to-date and pertinent metadata are made available.
    5.2.1 Documentation on concepts, scope, classifications, basis of recording, data sources, and statistical techniques is available, and differences from internationally accepted standards, guidelines, or good practices are annotated.
    5.2.2 Levels of detail are adapted to the needs of the intended audience.

  5.3 Assistance to users—Prompt and knowledgeable support service is available.
    5.3.1 Contact points for each subject field are publicized.
    5.3.2 Catalogues of publications, documents, and other services, including information on any charges, are widely available.



1See Summing Up by the Acting Chairman, SUR/97/132 of December 11, 1997.
2The term "data quality program" more accurately reflects the purpose of the program than the term "data quality assessment program" introduced in the Fourth Review of the Fund's Data Standards Initiatives.
3The Fundamental Principles of Official Statistics were adopted by the United Nations Statistical Commission in a special session in 1994. The principles are intended to guide producers of official statistics in fulfilling their obligations and to inform users of statistics of what they should expect.
4The DQAF in turn evolved from the SDDS and the GDDS.
5PARIS21 was launched in November 1999 with the Fund as a founding organizer. Its purpose is to initiate statistical capacity building programs in target countries, namely those qualifying for the Initiative for Heavily Indebted Poor Countries and other countries producing Comprehensive Development Frameworks and/or United Nations Development Assistance Frameworks, with primary focus on Poverty Reduction and Growth Facility countries. The PARIS21 consortium consists of some 120 members (governments, multinational and regional agencies, and nongovernmental and private organizations).