Consensus Economics Announces G7 & Western Europe 2016 Forecast Accuracy Award Winners

London, April 27, 2017: Consensus Economics, the world’s leading economic survey organization, has announced the recipients of its 2016 Forecast Accuracy Award (FAA). The FAA program recognises the achievements of a select group of expert country economic forecasters who have most accurately predicted the performance of GDP growth and Consumer Price Inflation in their targeted economies for the year 2016.

[Chart: USA Forecast Accuracy 2016]

FAA winners vary from year to year and across variables, owing to shocks, surprises, and cyclical and structural adjustments. However, the winners of the 2016 FAA program have been recognised for their high-quality research, their commitment to regular forecasting and their ability to identify most accurately the trends and levels of key indicators over the 24-month forecasting cycle.

2016 Forecast Accuracy Award Winners

Country: Company (Economists)

United States: The Conference Board (Bart van Ark, Gad Levanon, Eliza Winger, Ken Goldstein)
Japan: Credit Suisse (Japan Economics Team)
Germany: M.M. Warburg & CO (Carsten Klude)
France: HSBC (Olivier Vigna, Chantana Sam)
United Kingdom: NatWest Markets (Ross Walker)
Italy: Bank of America – Merrill Lynch (Europe Economics – BofA Merrill Lynch Global Research)
Canada: Citigroup (Dana M. Peterson)
Euro zone: Barclays (Philippe Gudin, Antonio Garcia Pascual, Tomasz Wieladek, Fabio Fois, Francois Cabau, Apolline Menut)
Netherlands: BNP Paribas Economic Research (Raymond Van der Putten, Colin Bermingham)
Norway: Oxford Economics (Thomas Ash)
Spain: FUNCAS (Raymond Torres, María Jesús Fernández)
Sweden: National Institute – NIER (The Forecast Department of NIER)
Switzerland: HSBC (Liz Martins, Chantana Sam)

Consensus Economics collects forecasts from more than 700 economists around the world each month. Founded in 1989 to measure consensus expectations, it produces surveys that are regarded as macroeconomic forecast benchmarks by investment and planning managers, as well as by government and public sector institutions, who find the data timely, accurate and useful.

Each monthly Consensus Forecasts publication shows the average (mean) forecasts as well as individual forecaster predictions. It is distributed in hard copy, PDF and Excel formats, and is also available on the major statistical data platforms.


Forecast Accuracy Awards – Methodology Note 

The Forecast Accuracy Awards were determined using Mean Absolute Error (MAE) analysis. The errors measured were the differences between the forecasts and the actual data outturns; the forecaster with the lowest average absolute error was deemed the most accurate over the testing period.

The Forecasts Used

The forecasts we examined were our panellists’ monthly survey contributions for 2016 Real GDP growth and 2016 Consumer Price Inflation (CPI) expectations (annual average % change). We began collecting these forecasts in January 2015, and the 24-month forecasting cycle ended in December 2016, providing 24 monthly data points to test for each variable.

The Outturns Used

The outturns we used as a comparison to the forecasts were the official estimates of 2016 GDP and CPI, which were released between January and April 2017, depending on the country and variable concerned.

The Calculation

When calculating the error (the difference between the forecast and the outturn), we looked at the absolute errors, ignoring sign, as errors resulting from over-estimation and under-estimation are treated as equivalent. We then calculated the mean absolute error of each panellist’s forecasts for both GDP and CPI over the 24-month forecasting period.
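Expressed as a formula (a sketch of the calculation described above; the notation is ours and not part of the published methodology), the mean absolute error for a panellist on one variable is:

\[
\mathrm{MAE}_v = \frac{1}{24} \sum_{t=1}^{24} \bigl| f_{v,t} - a_v \bigr|
\]

where \(f_{v,t}\) is the forecast of variable \(v\) (GDP or CPI) submitted in survey month \(t\) of the 24-month cycle, and \(a_v\) is the official 2016 outturn for that variable.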

Establishing the Most Accurate Forecaster

To determine the most accurate forecaster for a given year, we averaged the mean absolute errors for GDP and Inflation to identify the panellist with the lowest overall error. Smaller errors indicate greater accuracy, and winning the award requires a panellist to have exhibited a strong forecasting performance across both the GDP and Inflation variables over the 24-month forecasting horizon.
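As an illustration only (a minimal sketch of the ranking step described above, with hypothetical function names and made-up numbers, not our production code):

```python
def mean_absolute_error(forecasts, outturn):
    """Average absolute difference between 24 monthly forecasts and the official outturn."""
    return sum(abs(f - outturn) for f in forecasts) / len(forecasts)

def combined_score(gdp_forecasts, cpi_forecasts, gdp_outturn, cpi_outturn):
    """Average of the GDP and CPI mean absolute errors; the lowest score wins the award."""
    return (mean_absolute_error(gdp_forecasts, gdp_outturn)
            + mean_absolute_error(cpi_forecasts, cpi_outturn)) / 2

# Hypothetical example: a 2.0% GDP forecast every month against a 1.6% outturn,
# and a 1.5% CPI forecast every month against a 1.3% outturn.
print(combined_score([2.0] * 24, [1.5] * 24, gdp_outturn=1.6, cpi_outturn=1.3))  # roughly 0.3
```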

Qualification

To be considered for the award, a panellist needed to have participated consistently in our monthly surveys over the 24-month forecasting period. This ensured that no panellist gained an advantage through non-participation, or by only providing forecasts toward the end of the 24-month cycle, when the steady release of official data can assist forecasters in making revisions.