New algorithm improves accuracy of interest rate forecasts
A major breakthrough in econometrics could help central banks increase the accuracy of their forecasts by up to eight per cent.
New research by Melbourne Business School Assistant Professor Jamie Cross extends and significantly improves the functionality of a well-established class of models, known as stochastic volatility in mean vector autoregression (SVMVAR) models.
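In broad terms, a stochastic-volatility-in-mean VAR lets the unobserved volatilities enter the equation for the observed variables directly, so uncertainty itself can move the forecast. A generic textbook sketch of this structure (an illustration, not necessarily the exact specification in the paper) is:

```latex
% Observed variables y_t depend on their own lags and on the
% unobserved log-volatilities h_t (the "in mean" part):
y_t = b_0 + B_1 y_{t-1} + \cdots + B_p y_{t-p} + A\,h_t + \varepsilon_t,
\qquad \varepsilon_t \sim \mathcal{N}\bigl(0,\ \Sigma_t(h_t)\bigr)

% Each log-volatility evolves as its own autoregressive process:
h_{i,t} = \phi_i\,h_{i,t-1} + \eta_{i,t},
\qquad \eta_{i,t} \sim \mathcal{N}(0,\ \sigma_i^2)
```

Because the $h_t$ are never observed, estimating them is the computationally hard step the article goes on to describe.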
The algorithm originally used to estimate these models was developed during World War II to try to predict the path of enemy rockets in the air. Today, the models are used by financial institutions and researchers to predict the impact of volatile events, such as climate change or COVID-19, on outcomes such as stock market prices or weather patterns.
While an important tool, the use of SVMVARs has been hampered in the past by their computational complexity, and the results they supply have been neither highly accurate nor quickly delivered.
Assistant Professor Cross’ algorithm addresses these shortcomings, giving SVMVAR models the potential to improve the accuracy of interest rate forecasting by 8 per cent, real GDP forecasting by 4 per cent and inflation forecasting by as much as 39 per cent, in times of uncertainty.
If those improvements were realised in the Australian context, it could mean the difference between the Reserve Bank of Australia increasing rates again, or holding, Assistant Professor Cross said.
"Macroeconomic forecasting is notoriously difficult in periods of unpredictability," Assistant Professor Cross said.
"We saw the trouble the Reserve Bank had trying to predict the cash rate throughout the pandemic, and we are now living with those effects.
"These models now have the potential to mitigate future disparities between predictions and reality by giving central banks the ability to increase the accuracy of their predictions."
The culmination of six years of work, the breakthrough came by replacing a complex step in the estimation process that limited the number of variables which could be used.
The SVMVAR models are already being used by members of the Federal Reserve System to retrospectively examine the economic effects of macroeconomic and financial uncertainties. The new algorithm will now allow them to extend their analyses to predict key variables such as inflation, economic growth, and unemployment.
A computationally complex model
"SVMVARs have been used for a long time to study and predict the economic effects of uncertainty," Assistant Professor Cross said.
"For instance, how will people’s waning confidence with the central bank’s policy behavior effect spending? Or what will happen if there are changes in the tax code?"
"Right now, there is a lot of uncertainty about what is going to happen around the conflict in the Middle East. What will be the economic consequences of it?"
While SVMVARs are specifically designed to answer these types of questions, a major challenge is that they are incredibly complicated and computationally intensive.
"The computational demands have constrained economists to specify models with only a small number of variables, three to four at the most,” he said.
"What this means, is that you are restricted when trying to investigate the relative effects of different types of uncertainty. If you just want to focus on monetary policy, or just focus on fiscal policy, it's fine. But what happens when you want to look at how each of these uncertainties impacted each other in one framework?"
For instance, throughout COVID-19 there was a need to understand how uncertainties relating to the global economy, environmental supply constraints, and the various domestic and international economic policies were going to impact macroeconomic indicators such as unemployment, inflation and real GDP.
“With our algorithm, we have been able to increase the number of variables from just three or four, up to 20, so studying relative effects is now a possibility,” he said.
A new perspective on an 80-year-old problem
The key to Assistant Professor Cross and his colleagues’ improvements was replacing a step in the estimation process that had been in use since WWII.
“Since uncertainty is unobservable, estimating it has traditionally relied on non-linear filtering methods from signal processing,” Assistant Professor Cross explained.
“Filtering is a statistical method which is used to model unobservable phenomena such as sounds, radio waves and underlying statistical trends.
“The first filters were developed as a way of tracking rockets in World War II. In that case the radar needed to filter the signal from the rocket from all the other signals being passed through the air in order to predict its actual location."
"In our case, uncertainty is the unobservable signal.”
While there have been many iterations of this method since then, it remains a complicated process.
“In the paper, we proposed an alternative estimation method which not only resulted in huge gains in accuracy but also increased the speed of the process by up to 26 per cent.”
“So not only are we providing policy makers with a more accurate picture, but we are also providing it to them in a timely manner.”
Applications beyond the fiscal scene
Assistant Professor Cross stresses that the algorithm has applications beyond economics.
“In our paper we focused a lot on monetary policy uncertainty because that’s what we’re interested in, but the algorithm can basically be applied to anything you can think of,” Assistant Professor Cross said.
“For example, this class of models is currently being used to analyse the effect of climate uncertainties. This algorithm could help establish a more accurate picture and, in doing so, improve policy decisions around climate issues.”
Assistant Professor Cross has published the algorithm and all the required programs on his website, although he stipulates that they should not be used for commercial purposes.