Data integrity, empirical research and economic planning


Abdullah A Dewan | Published: January 08, 2020 20:43:49


Former Finance Minister AMA Muhith, speaking at a two-day research almanac organised by the Bangladesh Institute of Development Studies (BIDS) in Dhaka, applauded the studies and research pursued at BIDS while deprecating research pursued at private institutions. He said, "We need more such institutions (BIDS) in the country, as their works are very much relevant to policy formulation. We have some research organisations in the private sector, and they are 'rubbish'." (FE, December 02, 2019). I wonder whether Mr. Muhith could assure the country that such 'rubbish' research made its debut only after his 10 years of leadership in the financial sector of the economy.

In his turn as a special guest speaker, Planning Commission economist Dr. Shamsul Alam underscored the primacy of reliable data in developing economic policy that realises its desired outcomes. He said, "If we don't get accurate and quality data, the policies prepared by using such (inaccurate) data will not work. I am confused about the data, as it does not match the country's rice market and, recently, the onion market. Given the rice production data, we should have a surplus of 5.0-6.0 million tonnes of rice. But we import around 1.0 million tonnes of rice almost every year. I don't understand why this happens, if the official data is accurate. Similarly, local onion production is around 2.2 million tonnes per year, which is very close to the country's annual demand of 2.3-2.4 million tonnes. If production is ample, could we believe it is market failure due to inefficient distribution of goods and services?"

My question is: if BIDS is producing commendable work that benefits economic policy planning, as Mr. Muhith claimed, why would a Planning Commission economist complain about data inaccuracy, market failure and failed policy outcomes? Do economists at private research institutions produce 'rubbish' research (as Mr. Muhith dubbed it) because they use starkly contaminated or fabricated data, or because they rely on outdated and sloppy statistical estimation techniques? Do they lack economic and econometric literacy?

I have come across many postings on the Web about data manipulation by political parties in power. In many developing countries, governments have been alleged to pressure their data collection and analysis agencies to produce favourable statistics on economic growth, inflation, poverty alleviation, and unemployment. Even in the US -- believe it or not -- Donald Trump and many of his political hacks, including some Fox News TV hosts, accused President Obama's administration of cooking growth and unemployment data as the economy was expanding after the Great Recession of 2008. Manipulating US economic data is nearly impossible, however, given that there are some 75 government and private sources of economic data, statistics, reports, and commentaries. Add to that the omnipresent print, digital, and broadcast media.

Writing research papers for scholarly journals is quite different from producing research reports for economic policy formulation and development. Publishing a paper in a journal is a long process - often over two years from submission to acceptance - because of meticulous referee reviews and painstaking rounds of revision and resubmission, whereas policy papers must be completed under the pressure of exigencies. The results and implications of journal papers also take a long time to be adopted and implemented in future policy making. That by no means makes them useless or 'rubbish' - as though most fell into that category.

In Bangladesh and most other developing countries, a predominant share of studies is devoted to funded projects that examine and evaluate specific issues, while university faculty generally pursue academic research. Project analysis provides information to policy makers, whereas quantitative academic research seeks new knowledge and understanding of economic theories for potential future application to real-life economic policy making.

There are over 1,500 economics journals globally, and of them only the top 50 or so publish papers that contain refreshing ideas about the numerous intricate economic issues. Not surprisingly, only a fraction of these papers finds its way into Ph.D. classroom lectures. It often takes a long time - sometimes decades - for even remarkable research papers to work their way through academia and policy application and gain recognition in the economics profession. Even then, their eminence and influence are sidelined when the economy is hit by recession, deflation or stagflation. For example, when the US and the global economy were debilitated by the Great Recession of 2008, policy makers resorted to the 1950s Keynesian aggregate demand strategies of expansionary monetary and fiscal policies to resuscitate the economy. That did not discourage economists from seeking a better understanding of the economy. Without ceaseless efforts to expand new knowledge in all areas of human and nonhuman welfare, we would all be living in a zombie world.

Quantitative economics does not always produce desired outcomes, for many different reasons. The approach uses a range of complex mathematical and statistical procedures to analyse economic hypotheses implied by theory. These techniques - although by no means perfect - help analysts explain economic issues as well as predict future economic conditions. The primary analytical method of quantitative economics is regression analysis, which models economic outcomes as functions of one or more economic predictor variables. These regression techniques produce reliable results only when the data satisfy certain assumptions about the underlying data generating process.
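To make the point concrete, here is a minimal sketch of regression analysis in Python using the statsmodels library. The data are simulated and the variable names (income, consumption) and coefficients are illustrative assumptions, not estimates from any real dataset.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
# Hypothetical data: household consumption modelled as a function of income
income = rng.uniform(50, 150, size=200)
consumption = 10 + 0.8 * income + rng.normal(0, 5, size=200)

X = sm.add_constant(income)           # add an intercept term
model = sm.OLS(consumption, X).fit()  # ordinary least squares estimation
print(model.summary())                # coefficients, std. errors, R-squared
```

The estimated coefficients are meaningful only if the assumptions behind ordinary least squares (and the data themselves) hold, which is precisely why data integrity matters.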

Data analysis and the resulting economic implications are only as reliable as the quality and integrity of the data used in the estimation process. To assure data integrity - prior to conducting any statistical analysis - I, as a researcher, always plotted the time path of the data (in first differences) to see whether any suspicious outliers had intruded into the series. I also conducted formal statistical tests to assure that the data were as accurate as possible. Once suspicious data points (outliers) were detected, I checked whether they were part of the data generating process, an intrusion of policy-induced changes, or simply data imputing errors. There are statistical techniques (such as the use of dummy variables) that can correct for outliers. Omitting a zero here or adding a number not generated by the underlying economic process can compromise the accuracy of the data, and the resulting estimated parameters become rubbish.
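As a hedged illustration of that screening step, the Python sketch below builds a hypothetical series with one injected data-entry error, takes first differences, and flags any differenced value lying more than three standard deviations from the mean. The series, the error location and the threshold are all assumptions chosen for demonstration.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical monthly series; one imputing error injected at t = 60
y = pd.Series(np.cumsum(rng.normal(0.2, 1.0, 120)))
y.iloc[60] += 15.0            # simulated data-entry error (e.g., a stray digit)

dy = y.diff().dropna()        # first differences expose abrupt jumps
z = (dy - dy.mean()) / dy.std()
suspects = dy.index[np.abs(z) > 3]     # flag differences beyond 3 std. devs
print("Suspicious observations:", list(suspects))

dy.plot(title="First differences: outliers show up as spikes")
plt.show()
```

A single erroneous level typically produces two consecutive spikes in the differenced series (the jump in and the jump back out), which is exactly the signature a researcher should investigate before estimation.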

An example of an irregularity, or the presence of an outlier, in time series data would be the single-digit lending interest rate in the week or month in which it was forced on banks - instead of being set by the free market discipline of supply and demand. If researchers fail to use an appropriate statistical technique to account for the abrupt break in the data (outlier) from the 13 per cent to the 9.0 per cent lending rate, the empirical results based on the outlier-inflicted data will be misleading.
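The sketch below illustrates, under assumed numbers, how a dummy variable can absorb such a policy-induced break: a simulated lending-rate series drops from about 13 per cent to about 9.0 per cent at a known date, and the dummy's estimated coefficient captures the roughly four-point shift. Only the two rates echo the text; the sample size, break date and noise are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, break_t = 100, 60
# Simulated weekly lending rate: ~13% before the policy change, ~9% after
rate = np.where(np.arange(n) < break_t, 13.0, 9.0) + rng.normal(0, 0.1, n)

dummy = (np.arange(n) >= break_t).astype(float)  # 1 from the break onward
X = sm.add_constant(dummy)
fit = sm.OLS(rate, X).fit()
print(fit.params)  # intercept ~13; dummy coefficient ~ -4 (the policy break)
```

Without the dummy, the estimated mean rate would sit somewhere between the two regimes and describe neither - the kind of misleading result the paragraph above warns against.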

It is not uncommon that researchers under 'publish or perish' pressure, and Ph.D. students who just want to satisfy their thesis requirements, fail to apply rigorous and critical data analysis. At other times, data accuracy may be deliberately compromised by some researchers and students when the estimated results are inconsistent with what economic theory implies.

Finally, quality research is predicated upon quality data, access to journals, and state-of-the-art statistical software. Clicking the menus and buttons of the statistical estimation software loaded on computers is easy and often mechanical, but if the data (even a single number) used in such automated analysis are inaccurate, the results will be nothing more than plain rubbish - consistent with the famous adage, "Garbage in, garbage out."

Abdullah A Dewan, formerly a physicist and a nuclear engineer at Bangladesh Atomic Energy Commission (BAEC), is Professor of Economics at Eastern Michigan University, USA.

adewan@emich.edu

 
