Monday, May 14, 2018

Monetary Policy and Global Spillovers

The Central Bank of Chile's latest Annual Conference volume, Monetary Policy and Global Spillovers: Mechanisms, Effects, and Policy Measures, is now out, here.  Beyond the research presented in the volume, I love the picture on its front cover. So peaceful.

Monday, May 7, 2018

Fourth Penn Quantitative Policy Workshop



Some years ago I blogged on the first Workshop on Quantitative Tools for Macroeconomic Policy Analysis hosted by the Penn Institute for Economic Research (PIER). We just completed the fourth! It was a great group as usual, with approximately 25 participants from around the globe, mostly economists at national central banks, the ECB, and similar institutions. Some of the happy campers, along with yours truly, appear in the photo. You can find all sorts of information on the workshop site. Information and registration for the next Workshop (May 2019) will presumably be posted in the fall. Please consider joining us, and tell your friends!






Monday, April 30, 2018

Pockets of Predictability

Some months ago I blogged on "Pockets of Predictability," here. The Farmer-Schmidt-Timmermann paper that I mentioned is now available, here.

Monday, April 23, 2018

Ghysels and Marcellino on Time-Series Forecasting

If you're teaching a forecasting course and want a good text, or if you're just looking for an informative and modern treatment, see Applied Economic Forecasting Using Time Series Methods, by Eric Ghysels and Massimiliano Marcellino. It will be published this week by Oxford University Press. It has a very nice modern awareness of Big Data, with emphasis on reduced-rank structure, regularization methods (LASSO appears as early as p. 23!), structural change, mixed-frequency data, and more. It's also very tastefully done in terms of what's included and what's excluded, emphasizing what's most important and de-emphasizing the rest. As regards non-linearity, for example, volatility dynamics and regime-switching are in, and most of the rest is out.
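To convey the flavor, here's a minimal sketch of LASSO-based forecasting with many predictors (entirely illustrative, with simulated data; it is not taken from the book):

```python
# Minimal sketch of LASSO-based forecasting with many candidate predictors
# (illustrative simulated data; nothing here is from the book).
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
T, K = 200, 50                                   # 200 periods, 50 candidate predictors
X = rng.standard_normal((T, K))                  # predictors
beta = np.zeros(K)
beta[:3] = [0.5, -0.3, 0.2]                      # sparse truth: only 3 predictors matter
y = X @ beta + 0.5 * rng.standard_normal(T)      # target

# The L1 penalty shrinks most coefficients exactly to zero,
# performing variable selection and estimation simultaneously.
model = LassoCV(cv=5).fit(X[:-1], y[:-1])        # fit, holding out the last period
print("selected predictors:", np.flatnonzero(model.coef_))
print("forecast for the held-out period:", model.predict(X[-1:])[0])
```

The point is simply that the L1 penalty delivers a sparse model automatically, which is exactly why LASSO earns such early billing in a Big Data treatment.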

Monday, April 16, 2018

The History of Forecasting Competitions

Check out Rob Hyndman's "Brief History of Time Series Forecasting Competitions". I'm not certain whether the title's parallel to Hawking's A Brief History of Time is intentional. At any rate, even if Hyndman's focus is rather narrower than the origin and fate of the universe, his post is still fascinating and informative. Thanks to Ross Askanasi for bringing it to my attention.

Monday, April 9, 2018

An Art Market Return Index

Rare and collectible goods, from fine art to fine wine, have many interesting and special aspects. Some are shared and some are idiosyncratic.

From the vantage point of alternative investments (among other things), it would be useful to have high-frequency indices for those asset markets, just as we do for traditional "financial" asset markets like equities.

Along those lines, in "Monthly Art Market Returns," Bocart, Ghysels, and Hafner develop a high-frequency measurement approach, even though individual artworks trade only infrequently. Effectively they build a mixed-frequency repeat-sales model that exploits the correlation between art prices and other, much more frequently observed, liquid asset prices. They use the model to extract a monthly art market return index, as well as sub-indices for contemporary art, impressionist art, etc.

Quite fascinating and refreshingly novel.
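To convey the basic idea, here's a toy sketch of mixed-frequency index extraction (my own stripped-down setup, not the authors' actual model): a latent monthly log art index loads on an observed equity return, noisy art prices arrive only in months with sales, and a Kalman filter interpolates. In practice the loading and variances would of course be estimated rather than assumed known.

```python
# Toy mixed-frequency index extraction (a sketch in the spirit of the paper,
# not the authors' model): a latent monthly log art index follows a random walk
# plus a loading on an observed monthly equity return, but noisy art prices
# are observed only in months with sales.
import numpy as np

rng = np.random.default_rng(1)
T = 120
r_eq = 0.01 * rng.standard_normal(T)               # observed equity returns
beta, q, h = 0.8, 0.02**2, 0.05**2                 # loading, state var, obs var

m = np.cumsum(beta * r_eq + np.sqrt(q) * rng.standard_normal(T))  # latent index
sale = rng.random(T) < 0.25                        # sales occur in ~25% of months
y = np.where(sale, m + np.sqrt(h) * rng.standard_normal(T), np.nan)

# Kalman filter: predict using the equity-return term every month,
# update only in months where an art price is actually observed.
m_f, P = 0.0, 1.0
index = np.empty(T)
for t in range(T):
    m_f, P = m_f + beta * r_eq[t], P + q           # predict
    if sale[t]:                                    # update on sale months
        K = P / (P + h)
        m_f, P = m_f + K * (y[t] - m_f), (1 - K) * P
    index[t] = m_f                                 # filtered monthly log index

print("corr(filtered, latent):", np.corrcoef(index, m)[0, 1].round(3))
```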

Monday, April 2, 2018

Econometrics, Machine Learning, and Big Data

Here's a useful slide deck by Greg Duncan at Amazon, from a recent seminar at FRB San Francisco (powerpoint, ughhh, sorry...). It's basically a superset of the keynote talk he gave at Penn's summer 2017 conference, Big Data in Predictive Dynamic Econometric Modeling. Greg understands better than most the close connection between "machine learning" and econometrics/statistics, especially between machine learning and the predictive perspective that time-series econometrics has emphasized for a century or so.

Monday, March 26, 2018

Classic Jacod (1994) Paper

The Journal of Financial Econometrics will soon publish Jean Jacod's brilliant and beautiful 1994 paper, "Limit of Random Measures Associated with the Increments of a Brownian Semimartingale," which I just had the pleasure of reading for the first time. (Ungated version here.) Along with several others, I was asked to supply some comments for the issue's introduction. What follows is adapted from those comments, providing some historical background. (Except that it's not really historical background -- keep reading...)

Jacod's paper effectively lays the foundation for the vast subsequent econometric "realized volatility" (empirical quadratic variation) literature of the past twenty years.  Reading it leads me to recall my early realized volatility work with Torben Andersen and Tim Bollerslev in the late 1990s and early 2000s. It started in the mid-1990s at a meeting of the NBER Asset Pricing Program, where I was the discussant for a paper of theirs, eventually published as Andersen and Bollerslev (1998). They were using realized volatility as the "realization" in a study of GARCH volatility forecast accuracy, and my discussion was along the lines of, "That's interesting, but I think you've struck gold without realizing it -- why not skip the GARCH and instead simply characterize, model, and forecast realized volatility directly?".
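For readers new to the area, the key object is easy to state (a standard textbook formula, in notation of my choosing): with n intraday returns r_{t,i} on day t, realized variance is just the sum of squared intraday returns, and for a continuous Brownian semimartingale with spot volatility sigma(s) it consistently estimates the day's integrated variance as sampling gets finer:

```latex
RV_t \;=\; \sum_{i=1}^{n} r_{t,i}^{2}
\;\overset{p}{\longrightarrow}\;
\int_{t-1}^{t} \sigma^{2}(s)\, ds
\qquad \text{as } n \to \infty .
```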

So we decided to explore realized volatility directly. Things really took off with Andersen et al. (2001) and Andersen et al. (2003). The research program was primarily empirical, but of course we also wanted to advance the theoretical foundations. We knew some relevant stochastic integration theory, and we made progress, culminating in Theorem 2 of Andersen et al. (2003). Around the same time, Ole Barndorff-Nielsen and Neil Shephard were also producing penetrating and closely related results (most notably Barndorff-Nielsen and Shephard, 2002). Very exciting early times.

Now let's return to Jacod's 1994 paper, and consider it against the above historical background of early econometric realized volatility papers. Doing so reveals not only its elegance and generality, but also its prescience: It was written well before the "historical background"!! One wonders how it went unknown and unpublished for so long.

References

Andersen, T.G. and T. Bollerslev (1998), "Answering the Skeptics: Yes, Standard Volatility Models do Provide Accurate Forecasts," International Economic Review, 39, 885-905.

Andersen, T.G., T. Bollerslev, F.X. Diebold, and P. Labys (2001), "The Distribution of Realized Exchange Rate Volatility," Journal of the American Statistical Association, 96, 42-55.

Andersen, T.G., T. Bollerslev, F.X. Diebold, and P. Labys (2003), "Modeling and Forecasting Realized Volatility," Econometrica, 71, 579-625.

Barndorff-Nielsen, O. and N. Shephard (2002), "Econometric Analysis of Realized Volatility and its Use in Estimating Stochastic Volatility Models," Journal of the Royal Statistical Society, Series B, 64, 253-280.

Jacod, J. (1994), "Limit of Random Measures Associated with the Increments of a Brownian Semimartingale," Manuscript, Institut de Mathématiques de Jussieu, Université Pierre et Marie Curie, Paris.

Monday, March 19, 2018

Big Data and Economic Nowcasting

Check out this informative paper from the Federal Reserve Bank of New York: "Macroeconomic Nowcasting and Forecasting with Big Data", by Brandyn Bok, Daniele Caratelli, Domenico Giannone, Argia Sbordone, and Andrea Tambalotti.

Key methods for confronting Big Data include (1) imposing restrictions, for example zero restrictions (corresponding to "sparsity") or reduced-rank restrictions (corresponding to factor structure), and (2) shrinkage, whether by formal Bayesian approaches or otherwise.

Bok et al. provide historical perspective on the use of reduced-rank restrictions, (1) above, for macroeconomic nowcasting; that is, for real-time analysis and interpretation of hundreds of business-cycle indicators using dynamic factor models. They also provide a useful description of FRBNY's implementation and use of such models in policy deliberations.

It is important to note that the Bok et al. approach nowcasts current-quarter GDP, which is different from nowcasting "the business cycle" (as done using dynamic factor models at FRB Philadelphia, for example), because GDP alone is not the business cycle. Hence the two approaches are complements, not substitutes, and both are useful.
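To give a caricature of the dynamic-factor idea (a sketch only, with simulated data; nothing like the FRBNY or FRB Philadelphia production systems): extract a common factor from a standardized panel of indicators by principal components, then regress GDP growth on the estimated factor to form the nowcast.

```python
# Caricature of factor-based nowcasting (a sketch, not a production model):
# extract a common factor from many indicators via principal components,
# then map the estimated factor to GDP growth.
import numpy as np

rng = np.random.default_rng(2)
T, N = 80, 100                                  # 80 quarters, 100 indicators
f = rng.standard_normal(T)                      # latent business-cycle factor
X = np.outer(f, rng.standard_normal(N)) + rng.standard_normal((T, N))
gdp = 0.5 * f + 0.1 * rng.standard_normal(T)    # GDP growth loads on the factor

Xs = (X - X.mean(0)) / X.std(0)                 # standardize the panel
# First principal component = first left singular vector of the panel.
fhat = np.linalg.svd(Xs, full_matrices=False)[0][:, 0] * np.sqrt(T)

# OLS of GDP growth on the estimated factor; the last fitted value
# plays the role of the current-quarter "nowcast".
b = np.polyfit(fhat[:-1], gdp[:-1], 1)
print("nowcast of current-quarter GDP growth:", np.polyval(b, fhat[-1]).round(3))
```

Real implementations handle mixed frequencies, ragged-edge missing data, and proper state-space estimation, none of which appears in this toy.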

Monday, March 12, 2018

Sims on Bayes

Here's a complementary and little-known pair of slide decks from Chris Sims, deeply insightful as always. Together they address some tensions associated with Bayesian analysis and sketch some resolutions. The titles are nice, and revealing. The first is "Why Econometrics Should Always and Everywhere Be Bayesian". The second is "Limits to Probability Modeling" (with Chris' suggested possible sub-title: "Why are There no Real Bayesians?").