Experts discuss trends and approaches to assessing statistical quality

Author: Candida Andrade
Position: IMF Statistics Department
Pages: 10-13

Statistical quality is an important and timely topic, increasingly recognized as key to the future of statistics. The need for a clearer view of statistical quality brought together leading experts from national and international statistical agencies at the Statistical Quality Seminar 2000, cosponsored by the Korea National Statistical Office (KNSO) and the IMF and held on Jeju Island, Republic of Korea, on December 6-8, 2000. The seminar was jointly funded by the Korean authorities and the Japan Administered Account, under which Japan contributes to the IMF's training and technical assistance activities.

The seminar covered a broad range of issues related to data quality, including trends and approaches to statistical quality assessment and national experiences in assessing and improving the quality of official statistics. In his opening remarks, Young-Dae Yoon, KNSO Commissioner, noted that there was broad recognition that statistical quality is a multidimensional concept, one that goes well beyond traditional views equating data quality with accuracy. This set the stage for the enthusiastic and thought-provoking exchange of views that followed.

Creating a framework

Presenting the lead-off paper, "Toward a Framework for Assessing Data Quality," Carol S. Carson, Director of the IMF's Statistics Department, explained that work on data quality had been under way in the IMF for some time. The subject had been tackled using a two-pronged approach: attention was given first to defining data quality and, second, to developing a structure and a common language that could be used as a framework to assess data quality. This led to two initiatives: the establishment of the IMF's Data Quality Reference Site (see box, page 12) and the development of generic and data set-specific quality assessment frameworks.

Carson explained that the emerging frameworks were the product of an extensive, iterative consultation process with statisticians from a broad range of national and international organizations. They were designed to be a flexible, comprehensive tool for assessing data quality that could be used by statisticians and nonstatisticians alike. The frameworks, which aimed to bring together best practices and internationally accepted concepts and definitions in statistics, were developed by drawing on the growing literature on data quality, the Statistics...
