Current Research Projects

I am working on several projects: a New Parametrization of Correlation Matrices with Ilya Archakov; Asset Pricing with a Time-Varying Pricing Kernel and multivariate heavy-tailed distributions (Convolution-t Distributions) with Chen Tong; Robust Correlations and related quantities with Yiyao Luo and several current students; and Admissible Tests for Factor Structures in High-Dimensional Covariance Matrices with Werner Ploberger.
See arXiv for recent working papers.
My Google Scholar Page

Publications and Working Papers

Co-author(s): Chen Tong

[ArXiv]

Abstract: We introduce a new class of multivariate heavy-tailed distributions that are convolutions of heterogeneous multivariate t-distributions. Unlike commonly used heavy-tailed distributions, the multivariate convolution-t distributions embody cluster structures with flexible nonlinear dependencies and heterogeneous marginal distributions. Importantly, convolution-t distributions have simple density functions that facilitate estimation and likelihood-based inference. The characteristic features of convolution-t distributions are found to be important in an empirical analysis of realized volatility measures and help identify their underlying factor structure.
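
As a minimal simulation sketch (the paper's density and estimation machinery are not reproduced here), a convolution-t draw is the sum of independent multivariate t components with heterogeneous degrees of freedom; the helper function below is illustrative, not from the paper:

```python
import numpy as np

def mvt_draw(df, scale_chol, size, rng):
    """Draw from a multivariate t via the normal/chi-square mixture."""
    z = rng.standard_normal((size, scale_chol.shape[0])) @ scale_chol.T
    w = rng.chisquare(df, size=(size, 1)) / df
    return z / np.sqrt(w)

rng = np.random.default_rng(0)
# Convolution-t: sum of independent t components with different tail indices
X = (mvt_draw(4.0, np.linalg.cholesky([[1.0, 0.6], [0.6, 1.0]]), 10_000, rng)
     + mvt_draw(30.0, 0.5 * np.eye(2), 10_000, rng))
```

Each component contributes its own tail index and dependence structure, which is what generates the cluster structure and heterogeneous marginals described in the abstract.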

Co-author(s): Yiyao Luo

[ArXiv]

Abstract: Time-varying volatility is an inherent feature of most economic time series, which causes standard correlation estimators to be inconsistent. The quadrant correlation estimator is consistent but very inefficient. We propose a novel subsampled quadrant estimator that improves efficiency while preserving consistency and robustness. This estimator is particularly well suited for high-frequency financial data, and we apply it to a large panel of US stocks. Our empirical analysis sheds new light on intraday fluctuations in market betas by decomposing them into time-varying correlations and relative volatility changes. Our results show that intraday variation in betas is primarily driven by intraday variation in correlations.
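
A minimal sketch of the plain quadrant correlation, assuming the Gaussian consistency transform rho = sin(pi*q/2); the subsampling refinement proposed in the paper is omitted:

```python
import numpy as np

def quadrant_corr(x, y):
    """Sign-concordance correlation; robust to time-varying volatility."""
    q = np.mean(np.sign(x - np.median(x)) * np.sign(y - np.median(y)))
    return np.sin(np.pi * q / 2.0)  # maps sign concordance back to a correlation

# sanity check on correlated Gaussian data (true correlation 0.7)
rng = np.random.default_rng(0)
z = rng.standard_normal((100_000, 2)) @ np.linalg.cholesky([[1.0, 0.7], [0.7, 1.0]]).T
rho_hat = quadrant_corr(z[:, 0], z[:, 1])
```

Because only the signs of the observations enter, rescaling either series by a time-varying volatility path leaves the estimator unchanged.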

Co-author(s): Chen Tong

[ArXiv]

Abstract: We introduce a novel pricing kernel with time-varying variance risk aversion that yields closed-form expressions for the VIX. We also obtain closed-form expressions for option prices with a novel approximation method. The model can explain the observed time-variation in the shape of the pricing kernel. We estimate the model with S&P 500 returns and option prices and find that time-variation in volatility risk aversion brings a substantial reduction in derivative pricing errors. The variance risk ratio emerges as a fundamental variable and we show that it is closely related to economic fundamentals and key measures of sentiment and uncertainty.

Co-author(s): Ilya Archakov, Asger Lunde

[ArXiv]

Abstract: We propose a novel class of multivariate GARCH models that utilize realized measures of volatilities and correlations. The central component is an unconstrained vector parametrization of the correlation matrix that facilitates modeling of the correlation structure. The parametrization is based on the matrix logarithmic transformation that retains the positive definiteness as an innate property. A factor approach offers a way to impose a parsimonious structure in high-dimensional systems, and we show that a factor framework arises naturally in some existing models. We apply the model to returns of nine assets and employ the factor structure that emerges from a block correlation specification. An auxiliary empirical finding is that the empirical distribution of parametrized realized correlations is approximately Gaussian. This observation is analogous to the well-known result for logarithmically transformed realized variances.

2024

Archakov, I. and P.R. Hansen (2024) Review of Economics and Statistics

[ArXiv]

Abstract: We obtain a canonical representation for block matrices. The representation facilitates simple computation of the determinant, the matrix inverse, and other powers of a block matrix, as well as the matrix logarithm and the matrix exponential. These results are particularly useful for block covariance and block correlation matrices, where evaluation of the Gaussian log-likelihood and estimation are greatly simplified. We illustrate this with an empirical application using a large panel of daily asset returns. Moreover, the representation paves new ways to regularize large covariance/correlation matrices, test block structures in matrices, and estimate regressions with many variables.
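
The paper's canonical representation is not reproduced here, but the flavor of such block simplifications can be illustrated with the classical Schur-complement identity det([[A, B], [C, D]]) = det(A) * det(D - C A^{-1} B), checked numerically below:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 3.0 * np.eye(3)   # diagonal shift keeps A invertible
B = rng.standard_normal((3, 2))
C = rng.standard_normal((2, 3))
D = rng.standard_normal((2, 2)) + 3.0 * np.eye(2)
M = np.block([[A, B], [C, D]])

det_direct = np.linalg.det(M)
det_schur = np.linalg.det(A) * np.linalg.det(D - C @ np.linalg.inv(A) @ B)
```

Reducing a large determinant to determinants of small blocks is precisely the kind of computation that a block representation streamlines.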

Archakov, I., P.R. Hansen, and Y. Luo (2024) Econometrics Journal

[ArXiv]

Abstract: We propose a new method for generating random correlation matrices that makes it simple to control both location and dispersion. The method is based on a vector parameterization, γ = γ(C), which maps any distribution on ℝ^{n(n-1)/2} to a distribution on the space of non-singular n×n correlation matrices. Correlation matrices with certain properties, such as being well-conditioned, having block structures, and having strictly positive elements, are simple to generate. We compare the new method with existing methods.
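
A sketch of the generation idea, assuming the fixed-point iteration of Archakov and Hansen (2021) for the inverse mapping: draw an unrestricted off-diagonal vector, then solve for the diagonal of the matrix logarithm so that the matrix exponential has a unit diagonal.

```python
import numpy as np
from scipy.linalg import expm

def corr_from_gamma(gamma, n, tol=1e-12, max_iter=200):
    """Map an unrestricted vector to an n x n correlation matrix (sketch)."""
    A = np.zeros((n, n))
    A[np.tril_indices(n, k=-1)] = gamma
    A = A + A.T
    d = np.zeros(n)
    for _ in range(max_iter):
        np.fill_diagonal(A, d)
        step = np.log(np.diag(expm(A)))   # how far the diagonal is from unit
        d = d - step
        if np.max(np.abs(step)) < tol:
            break
    np.fill_diagonal(A, d)
    return expm(A)

rng = np.random.default_rng(0)
C = corr_from_gamma(rng.normal(scale=0.5, size=6), n=4)   # 6 = 4*3/2
```

Any distribution placed on the off-diagonal vector induces a distribution on valid correlation matrices, since positive definiteness is automatic under the exponential map.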

Hansen, P.R., C. Kim, and W. Kimbrough (2024) Journal of Financial Econometrics Vol. 22, 224-251.

[ArXiv]

Abstract: We study recurrent patterns in volatility and volume for major cryptocurrencies, Bitcoin and Ether, using data from two centralized exchanges (Coinbase Pro and Binance) and a decentralized exchange (Uniswap V2). We find systematic patterns in both volatility and volume across day-of-the-week, hour-of-the-day, and within the hour. These patterns have grown stronger over the years and can be related to algorithmic trading and funding times in futures markets. We also document that price formation mainly takes place on the centralized exchanges while price adjustments on the decentralized exchanges can be sluggish.

Hansen, P.R., A. Huang, C. Tong, and T. Wang (2024) Journal of Financial Econometrics Vol. 22, 187–223.

[ArXiv]

Abstract: We show that the Realized GARCH model yields closed-form expressions for both the Volatility Index (VIX) and the volatility risk premium (VRP). The Realized GARCH model is driven by two shocks, a return shock and a volatility shock, and these are natural state variables in the stochastic discount factor (SDF). The volatility shock endows the exponentially affine SDF with a compensation for volatility risk. This leads to dissimilar dynamic properties under the physical and risk-neutral measures that can explain time-variation in the VRP. In an empirical application with the S&P 500 returns, the VIX, and the VRP, we find that the Realized GARCH model significantly outperforms conventional GARCH models.

2023

Hansen, P.R. and C. Tong (2023) Economics Letters Vol. 233, 111433.

[ArXiv]

Abstract: The Clustered Factor (CF) model induces a block structure on the correlation matrix and is commonly used to parameterize correlation matrices. Our results reveal that the CF model imposes superfluous restrictions on the correlation matrix. This can be avoided by a different parametrization, involving the logarithmic transformation of the block correlation matrix.

2022

Hansen, P.R., C. Tong, and A. Huang (2022) Journal of Futures Markets Vol. 42, 1409-1433

[ArXiv]

Abstract: We introduce a new volatility model for option pricing that combines Markov switching with the Realized GARCH framework. This leads to a novel pricing kernel with a state-dependent variance risk premium and a pricing formula for European options, which is derived with an analytical approximation method. We apply the Markov switching Realized GARCH model to S&P 500 index options from 1990 to 2019 and find that investors' aversion to volatility-specific risk is time-varying. The proposed framework outperforms competing models and reduces (in-sample and out-of-sample) option pricing errors by 15% or more.

Hansen, P.R. (2022) Econometrics Journal Vol. 25, 739-761.

[ArXiv] [Code] [YouTube] Scoop (Media)

Abstract: We propose a simple dynamic model for estimating the relative contagiousness of two virus variants. Maximum likelihood estimation and inference is conveniently invariant to variation in the total number of cases over the sample period and can be expressed as a logistic regression. We apply the model to Danish SARS-CoV-2 variant data. We estimate the reproduction numbers of Alpha and Delta to be larger than that of the ancestral variant by a factor of 1.51 [CI 95%: 1.50, 1.53] and 3.28 [CI 95%: 3.01, 3.58], respectively. In a predominantly vaccinated population, we estimate Omicron to be 3.15 [CI 95%: 2.83, 3.50] times more infectious than Delta. Forecasting the proportion of an emerging virus variant is straightforward, and we proceed to show how the effective reproduction number for a new variant can be estimated without contemporary sequencing results. This is useful for assessing the state of the pandemic in real time, as we illustrate empirically with the inferred effective reproduction number for the Alpha variant.
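
The identification idea can be illustrated compactly on synthetic data (a sketch, not the paper's estimator): if the new variant's share follows a logistic curve, the slope of its log-odds over time, scaled by an assumed mean generation time, gives the log of the relative reproduction number. The 4.7-day generation time below is an illustrative assumption.

```python
import numpy as np

days = np.arange(40)
true_slope = 0.15   # daily growth in the log-odds of the new variant's share
share = 1.0 / (1.0 + np.exp(-(-4.0 + true_slope * days)))  # synthetic, noise-free
log_odds = np.log(share / (1.0 - share))
slope = np.polyfit(days, log_odds, 1)[0]   # logistic-regression slope

gen_time = 4.7                    # assumed mean generation time in days
rel_R = np.exp(slope * gen_time)  # relative reproduction number, new vs. old
```

Because only the variant's share enters, the calculation is unaffected by variation in the total number of cases, which is the invariance property noted in the abstract.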

Hansen, P.R. and E. Dumitrescu (2022) Journal of Econometrics Vol. 230, 535-558.

[PDF]

Abstract: We study parameter estimation from a given sample when the objective is to maximize the expected value of a criterion function evaluated on a distinct sample. This is the situation that arises when a model is estimated for the purpose of describing other data than those used for estimation, such as in forecasting problems. A natural candidate for this problem is the innate estimator, which maximizes the criterion function in-sample. While the innate estimator has certain advantages, we show that the asymptotically efficient estimator is defined from a likelihood function in conjunction with the criterion function. The likelihood-based estimator is, however, fragile, as misspecification is harmful in two ways. First, the likelihood-based estimator may be inefficient under misspecification. Second, and more importantly, the likelihood approach requires a parameter transformation that depends on the true model, causing an improper mapping to be used under misspecification.

2021

Archakov, I. and P.R. Hansen (2021) Econometrica Vol. 89, 1699-1715.

[ArXiv] [YouTube]

Abstract: We introduce a novel parametrization of the correlation matrix. The reparametrization facilitates modeling of correlation and covariance matrices by an unrestricted vector, where positive definiteness is an innate property. This parametrization can be viewed as a generalization of Fisher’s Z-transformation to higher dimensions and has a wide range of potential applications. An algorithm for reconstructing the unique correlation matrix from any vector in ℝ^{n(n-1)/2} is provided, and we derive its numerical complexity.
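
A sketch of the forward direction of the parametrization, assuming SciPy's matrix logarithm: the unrestricted vector is the set of off-diagonal elements of log C (reconstructing C from an arbitrary vector requires the paper's algorithm and is not shown here).

```python
import numpy as np
from scipy.linalg import logm

C = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
G = logm(C)                                # matrix logarithm of the correlation matrix
gamma = G[np.tril_indices_from(G, k=-1)]   # the n(n-1)/2 unrestricted parameters
```

For n = 2 this reduces to Fisher's Z-transformation of the single correlation, which is the sense in which the parametrization generalizes it.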

Hansen, P.R. and M. Schmidtblaicher (2021) Journal of Business & Economic Statistics Vol. 39, 259-271.

[PDF]

Abstract: Increased vaccine hesitancy presents challenges to public health and undermines efforts to eradicate diseases such as measles, rubella, and polio. The decline is partly attributed to misconceptions that are shared on social media, such as the debunked association between vaccines and autism. Perhaps more damaging to vaccine uptake are cases where trusted mainstream media run stories that exaggerate the risks associated with vaccines. It is important to understand the underlying causes of vaccine refusal, because these may be prevented, or countered, in a timely manner by educational campaigns. In this article, we develop a dynamic model of vaccine compliance that can help pinpoint events that disrupted vaccine compliance. We apply the framework to Danish HPV vaccine data, which experienced a sharp decline in compliance following the broadcast of a controversial TV documentary, and we show that media coverage significantly predicts vaccine uptake.

Kåre Mølbak and Peter Reinhard Hansen (2021) Book chapter in Når medierne sætter dagsordenen (in Danish).

2020

Hansen, P.R., N.T. Brewer, and M. Schmidtblaicher (2020) Vaccine Vol. 38, 1842-1848.

[PDF]

Abstract: Background: Immunization programs’ resilience to shocks is central to their success, but little empirical evidence documents resilience in action. We sought to characterize the decline of HPV vaccination in Denmark after negative media coverage and recovery during a national information campaign. Methods: We conducted a population-based retrospective cohort study of all girls born in Denmark from 1997 to 2006 (N = 328,779), aged 12–15. The outcome measure was HPV vaccine uptake (first dose), as reported to the Danish national health registry from 2009 to 2019, when HPV vaccine was freely available to girls in primary care clinics in Denmark. Events that created 4 natural time periods for study were HPV vaccine reaching the uptake of other vaccines in the national program (2009), some negative media coverage of HPV vaccination (2013), extensive negative media coverage (2015), and a national information campaign about the vaccine’s safety and effectiveness (2017–2019). Results: In the period with some negative media coverage, HPV vaccine uptake fell to 83.6% (95% CI:78.0%–89.7%) of baseline uptake. In the period with extensive negative media coverage, uptake fell even further to 49.6% (95% CI:44.5%–55.2%) of baseline uptake. After the information campaign, HPV vaccine uptake recovered to its baseline level (109.2%, 95% CI:90.1%–132.4%) due in part to catch-up doses. Despite the recovery, an estimated 26,000 fewer girls initiated the vaccine than if uptake had not declined. Conclusions: The experience in Denmark offers one of the first opportunities to document how a nation grappled with negative media coverage of HPV vaccination and the steadying impact of action by national authorities.

Gørtz, M., N.T. Brewer, P.R. Hansen, and M. Ejrnæs (2020) Vaccine Vol. 38, 4432-4439.

[PDF]

Abstract: Background: Human papillomavirus (HPV) vaccine coverage was high in Denmark until it plunged following negative media coverage. We examined whether the decline in HPV vaccination undermined uptake of another adolescent vaccine, measles, mumps and rubella (MMR). Methods: The Danish national health register provided data on uptake of MMR vaccine dose 2 (at age 13) for children born from 1991 to 2003 (n = 827,716). The primary exposure variable comprised three time periods: before HPV vaccine introduction, during high HPV vaccine coverage, and after the drop in HPV vaccine coverage. To examine the effect of HPV vaccination on MMR2 uptake, we estimated MMR2 uptake by age 13 using logistic regression, controlling for gender, birth month, birth year, and maternal education. Findings: MMR2 vaccination coverage was high for both girls and boys (86% and 85%) in 2009. Following the introduction of HPV vaccine for girls in 2009, MMR2 coverage increased for girls even as it decreased for boys (gender gap 46 percentage points, 95% CI 43 to 48). Coverage with MMR2 for girls continued to be high over the following four years, and almost all girls (91%) who received MMR2 vaccination also received HPV1 vaccination within the same week. When negative media coverage led to a decline in HPV vaccination, MMR2 uptake for girls also declined. By 2015, MMR2 coverage for girls and boys had become similar again (80% and 79%). Families with the highest level of maternal education showed the strongest decline in MMR2 coverage for girls. Interpretation: Concomitant vaccine provision can increase overall vaccine uptake. However, reduced demand for one vaccine may reduce concomitant vaccination and undermine resiliency of a country’s vaccination program.

2019

Gorgi, P., P.R. Hansen, P. Janus, and S.J. Koopman (2019) Journal of Financial Econometrics Vol. 17, 1-32.

[ArXiv]


2017

Hansen, P.R., Z. Huang, and T. Wang (2017) Journal of Futures Markets Vol. 37, 328–358.

[ArXiv]

Abstract: We derive a pricing formula for European options for the Realized GARCH framework based on an analytical approximation using an Edgeworth expansion for the density of cumulative return. Existing approximations in this context are based on a Gram–Charlier expansion while the proper Edgeworth expansion is more accurate. In relation to existing discrete-time option pricing models with realized volatility, our model is log-linear, non-affine, with a flexible leverage effect. We conduct an extensive empirical analysis on S&P500 index options and the results show that our computationally fast formula outperforms competing methods in terms of pricing errors, both in-sample and out-of-sample.

2016

Hansen, P.R. and A. Huang (2016) Journal of Business & Economic Statistics Vol. 34, 269-287.

[PDF]

Abstract: We introduce the realized exponential GARCH model that can use multiple realized volatility measures for the modeling of a return series. The model specifies the dynamic properties of both returns and realized measures, and is characterized by a flexible modeling of the dependence between returns and volatility. We apply the model to 27 stocks and an exchange traded fund that tracks the S&P 500 index and find specifications with multiple realized measures that dominate those that rely on a single realized measure. The empirical analysis suggests some convenient simplifications and highlights the advantages of the new specification.

2015

Hansen, P.R. and A. Timmermann (2015) Econometrica Vol. 83, 2485-2505.

[PDF]


Hansen, P.R. (2015) Economics Letters Vol. 133, 14-18.

[PDF]

Abstract: We consider a multivariate time series whose increments are given from a homogeneous Markov chain. We show that the martingale component of this process can be extracted by a filtering method and establish the corresponding martingale decomposition in closed-form. This representation is useful for the analysis of time series that are confined to a grid, such as financial high frequency data.

Hansen, P.R., G. Horel, A. Lunde, and I. Archakov (2015) In The Fascination of Probability, Statistics and their Applications.

[PDF]

Abstract: We introduce a multivariate estimator of financial volatility that is based on the theory of Markov chains. The Markov chain framework takes advantage of the discreteness of high-frequency returns. We study the finite sample properties of the estimation in a simulation study and apply it to high-frequency commodity prices.

Hansen, P.R. and A. Timmermann (2015) Journal of Business & Economic Statistics Vol. 33, 17-21.

[PDF]

2014

Hansen, P.R., A. Lunde, and V. Voev (2014) Journal of Applied Econometrics Vol. 29, 774-799.

[PDF]

Abstract: We introduce a multivariate generalized autoregressive conditional heteroskedasticity (GARCH) model that incorporates realized measures of variances and covariances. Realized measures extract information about the current levels of volatilities and correlations from high-frequency data, which is particularly useful for modeling financial returns during periods of rapid changes in the underlying covariance structure. When applied to market returns in conjunction with returns on an individual asset, the model yields a dynamic model specification of the conditional regression coefficient that is known as the beta. We apply the model to a large set of assets and find the conditional betas to be far more variable than usually found with rolling-window regressions based exclusively on daily returns. In the empirical part of the paper, we examine the cross-sectional as well as the time variation of the conditional beta series during the financial crises.

Hansen, P.R. and A. Lunde (2014) Econometric Theory Vol. 30, 60-93.

[PDF]

Abstract: An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized volatility measures that are imperfect estimates of actual volatility. In an empirical analysis using realized measures for the Dow Jones industrial average stocks, we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our application despite the large sample. Unit root tests that are based on the IV estimator have better finite sample properties in this context.

2012

Hansen, P.R., Z. Huang, and H. Shek (2012) Journal of Applied Econometrics Vol. 27, 877-906.

[PDF]

Abstract: We introduce a new framework, Realized GARCH, for the joint modeling of returns and realized measures of volatility. A key feature is a measurement equation that relates the realized measure to the conditional variance of returns. The measurement equation facilitates a simple modeling of the dependence between returns and future volatility. Realized GARCH models with a linear or log-linear specification have many attractive features. They are parsimonious, simple to estimate, and imply an ARMA structure for the conditional variance and the realized measure. An empirical application with Dow Jones Industrial Average stocks and an exchange traded index fund shows that a simple Realized GARCH structure leads to substantial improvements in the empirical fit over standard GARCH models that only use daily returns.

2011

Hansen, P.R., A. Lunde, and J.M. Nason (2011) Econometrica Vol. 79, 453-497.

[PDF]

Abstract: This paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS acknowledges the limitations of the data, such that uninformative data yield an MCS with many models, whereas informative data yield an MCS with only a few models. The MCS procedure does not assume that a particular model is the true model; in fact, the MCS procedure can be used to compare more general objects, beyond the comparison of models. We apply the MCS procedure to two empirical problems. First, we revisit the inflation forecasting problem posed by Stock and Watson (1999), and compute the MCS for their set of inflation forecasts. Second, we compare a number of Taylor rule regressions and determine the MCS of the best regression in terms of in-sample likelihood criteria.
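
A simplified sketch of the elimination logic, assuming an i.i.d. bootstrap and a max-t statistic (the paper's implementation uses dependence-robust bootstraps and alternative statistics): models with significantly above-average loss are removed until equal predictive ability can no longer be rejected.

```python
import numpy as np

def mcs_tmax(losses, alpha=0.10, B=2000, seed=0):
    """Simplified Model Confidence Set via max-t elimination (illustration)."""
    rng = np.random.default_rng(seed)
    T, M = losses.shape
    keep = list(range(M))
    while len(keep) > 1:
        d = losses[:, keep] - losses[:, keep].mean(axis=1, keepdims=True)
        t = d.mean(axis=0) / (d.std(axis=0, ddof=1) / np.sqrt(T))
        # bootstrap the null distribution of the max t-statistic
        t_boot = np.empty(B)
        for b in range(B):
            db = d[rng.integers(0, T, T)] - d.mean(axis=0)   # recentred resample
            t_boot[b] = np.max(db.mean(axis=0) / (db.std(axis=0, ddof=1) / np.sqrt(T)))
        if np.max(t) <= np.quantile(t_boot, 1.0 - alpha):
            break                           # cannot reject: remaining set is the MCS
        keep.pop(int(np.argmax(t)))         # eliminate the worst-performing model
    return keep

rng = np.random.default_rng(1)
losses = rng.standard_normal((500, 3))
losses[:, 2] += 1.0                         # model 2 has clearly higher expected loss
mcs = mcs_tmax(losses)
```

With informative data the clearly inferior model is eliminated, while the statistically indistinguishable models survive together, mirroring the confidence-interval analogy in the abstract.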

Barndorff-Nielsen, O.E., P.R. Hansen, A. Lunde, and N. Shephard (2011) Journal of Econometrics Vol. 160, 204-219.

Abstract: In a recent paper we have introduced the class of realised kernel estimators of the increments of quadratic variation in the presence of noise. We showed that this estimator is consistent and derived its limit distribution under various assumptions on the kernel weights. In this paper we extend our analysis, looking at the class of subsampled realised kernels and we derive the limit theory for this class of estimators. We find that subsampling is highly advantageous for estimators based on discontinuous kernels, such as the truncated kernel. For kinked kernels, such as the Bartlett kernel, we show that subsampling is impotent, in the sense that subsampling has no effect on the asymptotic distribution. Perhaps surprisingly, for the efficient smooth kernels, such as the Parzen kernel, we show that subsampling is harmful as it increases the asymptotic variance. We also study the performance of subsampled realised kernels in simulations and in empirical work.

Barndorff-Nielsen, O.E., P.R. Hansen, A. Lunde, and N. Shephard (2011) Journal of Econometrics Vol. 162, 149-169.

Abstract: We propose a multivariate realised kernel to estimate the ex-post covariation of log-prices. We show this new consistent estimator is guaranteed to be positive semi-definite and is robust to measurement error of certain types and can also handle non-synchronous trading. It is the first estimator which has these three properties which are all essential for empirical work in this area. We derive the large sample asymptotics of this estimator and assess its accuracy using a Monte Carlo study. We implement the estimator on some US equity data, comparing our results to previous work which has used returns measured over 5 or 10 min intervals. We show that the new estimator is substantially more precise.

Hansen, P.R. and A. Lunde (2011) In Oxford Handbook of Economic Forecasting, Chapter 19, 525-556.

[PDF]

2009

Barndorff-Nielsen, O.E., P.R. Hansen, A. Lunde, and N. Shephard (2009) Econometrics Journal Vol. 12, C1-C32.

Abstract: Realized kernels use high-frequency data to estimate daily volatility of individual stock prices. They can be applied to either trade or quote data. Here we provide the details of how we suggest implementing them in practice. We compare the estimates based on trade and quote data for the same stock and find a remarkable level of agreement. We identify some features of the high-frequency data which are challenging for realized kernels: local trends in the data, over periods of around 10 minutes, during which the prices and quotes are driven up or down. These can be associated with high volumes. One explanation for this is that they are due to non-trivial liquidity effects.

2008

Barndorff-Nielsen, O.E., P.R. Hansen, A. Lunde, and N. Shephard (2008) Econometrica Vol. 76, 1481-1536.

[PDF]

Abstract: This paper shows how to use realized kernels to carry out efficient feasible inference on the ex post variation of underlying equity prices in the presence of simple models of market frictions. The weights can be chosen to achieve the best possible rate of convergence and to have an asymptotic variance which equals that of the maximum likelihood estimator in the parametric version of this problem. Realized kernels can also be selected to (i) be analyzed using endogenously spaced data such as that in databases on transactions, (ii) allow for market frictions which are endogenous, and (iii) allow for temporally dependent noise. The finite sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.
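
A minimal sketch of a Parzen-weight realized kernel, assuming a fixed bandwidth H (the paper's treatment of end effects, optimal bandwidth, and jittering is omitted); on noisy simulated returns it corrects the upward bias of the plain realized variance:

```python
import numpy as np

def parzen(x):
    x = abs(x)
    if x <= 0.5:
        return 1.0 - 6.0 * x**2 + 6.0 * x**3
    if x <= 1.0:
        return 2.0 * (1.0 - x)**3
    return 0.0

def realized_kernel(r, H):
    """Kernel-weighted sum of realized autocovariances (Parzen weights)."""
    rk = r @ r
    for h in range(1, H + 1):
        rk += 2.0 * parzen(h / (H + 1.0)) * (r[h:] @ r[:-h])
    return rk

rng = np.random.default_rng(0)
n = 23_400                                  # one-second returns over a trading day
u = rng.normal(0.0, 0.005, n + 1)           # i.i.d. microstructure noise in prices
r = rng.normal(0.0, np.sqrt(1.0 / n), n) + np.diff(u)   # noisy returns, IV = 1
rv = r @ r                                  # realized variance: inflated by noise
rk = realized_kernel(r, H=30)               # realized kernel: bias largely removed
```

The negative first-order autocovariance induced by the noise offsets the inflated variance term, which is the mechanism the kernel weights exploit.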

Hansen, P.R. (2008) Journal of Statistical Planning and Inference Vol. 138, 2688-2697.

Abstract: We derive an identity for the determinant of a product involving non-squared matrices. The identity can be used to derive the maximum likelihood estimator in reduced-rank regressions with Gaussian innovations. Furthermore, the identity sheds light on the structure of the estimation problem that arises when the reduced-rank parameters are subject to additional constraints.

Hansen, P.R., A. Lunde, and J. Large (2008) Econometric Reviews Vol. 27, 79-111.

Abstract: We examine moving average (MA) filters for estimating the integrated variance (IV) of a financial asset price in a framework where high-frequency price data are contaminated with market microstructure noise. We show that the sum of squared MA residuals must be scaled to enable a suitable estimator of IV. The scaled estimator is shown to be consistent, first-order efficient, and asymptotically Gaussian distributed about the integrated variance under restrictive assumptions. Under more plausible assumptions, such as time-varying volatility, the MA model is misspecified. This motivates an extensive simulation study of the merits of the MA-based estimator under misspecification. Specifically, we consider nonconstant volatility combined with rounding errors and various forms of dependence between the noise and efficient returns. We benchmark the scaled MA-based estimator to subsample and realized kernel estimators and find that the MA-based estimator performs well despite the misspecification.

Bentzen, E., P.R. Hansen, A. Lunde, and A.A. Zebedee (2008) Financial Markets and Portfolio Management Vol. 22, 3-20.

Abstract: In this paper, we provide an intraday analysis of the impact of monetary policy on the equity markets. Specifically, we study changes in prices and changes in volatility for the S&P 500 associated with Federal Open Market Committee announcements as well as real-time changes in market expectations about future policy. The analysis shows an economically and statistically significant inverse relationship between equity market returns and changes in the Fed funds rate target. The magnitude of the response depends on whether the change was expected or unexpected. An expected change in the Fed funds rate target of 25 basis points results in approximately a 30 basis point decline in the broad equity market, while an unexpected change of 25 basis points results in approximately a 125 basis point decline. The speed of these market reactions is rapid, with the equity market reaching a new equilibrium within fifteen minutes. In contrast to these results, the analysis also shows a positive relationship between equity market returns and changes in expectations about future monetary policy. Taken together, these results regarding price changes (returns) suggest that the price discovery process in the equity markets is dominated by the realization of expectations and not market expectations per se. Meanwhile, the volatility analysis suggests a volatility spike follows both FOMC announcements and real-time changes in expectations, but the duration of these spikes is relatively short-lived and dampens out within one hour.

2006

Hansen, P.R. and A. Lunde (2006) Journal of Business and Economic Statistics Vol. 24, 127-218.

Abstract: We study market microstructure noise in high-frequency data and analyze its implications for the realized variance (RV) under a general specification for the noise. We show that kernel-based estimators can unearth important characteristics of market microstructure noise and that a simple kernel-based estimator dominates the RV for the estimation of integrated variance (IV). An empirical analysis of the Dow Jones Industrial Average stocks reveals that market microstructure noise is time-dependent and correlated with increments in the efficient price. This has important implications for volatility estimation based on high-frequency data. Finally, we apply cointegration techniques to decompose transaction prices and bid–ask quotes into an estimate of the efficient price and noise. This framework enables us to study the dynamic effects on transaction prices and quotes caused by changes in the efficient price.

Hansen, P.R. and A. Lunde (2006) Journal of Econometrics Vol. 131, 97-121.

Abstract: We show that the empirical ranking of volatility models can be inconsistent for the true ranking if the evaluation is based on a proxy for the population measure of volatility. For example, the substitution of a squared return for the conditional variance in the evaluation of ARCH-type models can result in an inferior model being chosen as ‘best’ with a probability that converges to one as the sample size increases. We document the practical relevance of this problem in an empirical application and by simulation experiments. Our results provide an additional argument for using the realized variance in out-of-sample evaluations rather than the squared return. We derive the theoretical results in a general framework that is not specific to the comparison of volatility models. Similar problems can arise in comparisons of forecasting models whenever the predicted variable is a latent variable.
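One way to see the distortion: under absolute-error loss, evaluating against the squared-return proxy rewards forecasts shrunk toward the proxy's conditional median rather than the conditional variance. The simulation below is an illustration of this mechanism, not a replication of the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200_000
h = np.exp(rng.normal(0.0, 0.5, T))        # true conditional variance
proxy = h * rng.standard_normal(T) ** 2    # squared return: unbiased but noisy

f_true = h              # the true conditional variance
f_biased = 0.455 * h    # shrunk toward the median of a chi^2(1) times h

mae = lambda f, target: np.mean(np.abs(target - f))

# judged against the noisy proxy, the distorted forecast "wins" ...
mae_proxy_true = mae(f_true, proxy)
mae_proxy_biased = mae(f_biased, proxy)
# ... but judged against the true variance it is clearly inferior
mae_true_true = mae(f_true, h)
mae_true_biased = mae(f_biased, h)
```

The downward-biased forecast beats the true conditional variance under MAE-with-proxy, illustrating how a proxy-based ranking can select an inferior model; MSE-based rankings do not suffer this particular distortion.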

2005

Hansen, P.R. (2005) Journal of Business and Economic Statistics Vol. 23, 365-380.

Abstract: We propose a new test for superior predictive ability. The new test compares favorably to the reality check (RC) for data snooping, because it is more powerful and less sensitive to poor and irrelevant alternatives. The improvements are achieved by two modifications of the RC. We use a studentized test statistic that reduces the influence of erratic forecasts and invoke a sample-dependent null distribution. The advantages of the new test are confirmed by Monte Carlo experiments and an empirical exercise in which we compare a large number of regression-based forecasts of annual U.S. inflation to a simple random-walk forecast. The random-walk forecast is found to be inferior to regression-based forecasts and, interestingly, the best sample performance is achieved by models that have a Phillips curve structure.
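The two modifications can be sketched compactly. The code below is a simplified illustration of the idea (i.i.d. bootstrap and a plain recentring rather than the paper's stationary bootstrap and sample-dependent null distribution): studentize each model's relative performance, take the maximum, and bootstrap its null distribution.

```python
import numpy as np

def spa_test(benchmark_loss, model_losses, B=1000, seed=0):
    # d[t, i] > 0 means model i beats the benchmark at time t
    d = benchmark_loss[:, None] - model_losses
    T, k = d.shape
    rng = np.random.default_rng(seed)
    dbar = d.mean(axis=0)
    se = d.std(axis=0, ddof=1) / np.sqrt(T)       # studentization
    t_stat = np.max(dbar / se)
    # i.i.d. bootstrap of the recentred studentized maximum
    t_boot = np.empty(B)
    for b in range(B):
        db = d[rng.integers(0, T, T)]
        seb = db.std(axis=0, ddof=1) / np.sqrt(T)
        t_boot[b] = np.max((db.mean(axis=0) - dbar) / seb)
    p_value = np.mean(t_boot >= t_stat)
    return t_stat, p_value

# two competing "models": one genuinely better, one worse than the benchmark
rng = np.random.default_rng(2)
T = 500
e = rng.standard_normal((T, 3))
bench_loss = e[:, 0] ** 2
model_loss = np.column_stack([0.5 * e[:, 1] ** 2, 1.2 * e[:, 2] ** 2])
t_spa, p_spa = spa_test(bench_loss, model_loss)
```

The studentization keeps the erratic (high-variance) inferior alternative from dominating the statistic, so the genuinely superior model drives the rejection.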

Hansen, P.R. and A. Lunde (2005) Journal of Financial Econometrics Vol. 3, 525-554.

Abstract: We consider the problem of deriving an empirical measure of daily integrated variance (IV) in the situation where high-frequency price data are unavailable for part of the day. We study three estimators in this context and characterize the assumptions that justify their use. We show that the optimal combination of the realized variance and squared overnight return can be determined, despite the latent nature of IV, and we discuss this result in relation to the problem of combining forecasts. Finally, we apply our theoretical results and construct four years of daily volatility estimates for the 30 stocks of the Dow Jones Industrial Average.
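The combination step amounts to a variance-minimising weighting under an unbiasedness constraint, min w'Ωw subject to w'μ = 1, with closed-form solution w ∝ Ω⁻¹μ. The sketch below illustrates this; unlike the paper, which shows the weights can be determined despite IV being latent, the illustration pretends a reference IV series is observable in a calibration sample:

```python
import numpy as np

def combination_weights(X, iv_ref):
    # w minimises Var(w @ X_t - IV_t) subject to w @ mu = 1,
    # where mu_i = E[X_i] / E[IV]; solution: w proportional to inv(Omega) @ mu
    mu = X.mean(axis=0) / iv_ref.mean()
    err = X - np.outer(iv_ref, mu)      # measurement errors around mu_i * IV
    Omega = np.cov(err, rowvar=False)
    w = np.linalg.solve(Omega, mu)
    return w / (mu @ w)

rng = np.random.default_rng(3)
n = 100_000
iv = 1.0 + 0.5 * np.abs(rng.standard_normal(n))
x1 = iv + rng.normal(0.0, 0.6, n)   # e.g. squared overnight return: noisy
x2 = iv + rng.normal(0.0, 0.2, n)   # e.g. realized variance: more precise
X = np.column_stack([x1, x2])

w = combination_weights(X, iv)
combo = X @ w

mse = lambda est: np.mean((est - iv) ** 2)
mse_x1, mse_x2, mse_combo = mse(x1), mse(x2), mse(combo)
```

The weights tilt heavily toward the more precise measure, yet the combination still beats it: even a noisy overnight component carries some incremental information.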

Hansen, P.R. and A. Lunde (2005) Journal of Applied Econometrics Vol. 20, 873-889.

Abstract: We compare 330 ARCH-type models in terms of their ability to describe the conditional variance. The models are compared out-of-sample using DM–$ exchange rate data and IBM return data, where the latter is based on a new data set of realized variance. We find no evidence that a GARCH(1,1) is outperformed by more sophisticated models in our analysis of exchange rates, whereas the GARCH(1,1) is clearly inferior to models that can accommodate a leverage effect in our analysis of IBM returns. The models are compared with the test for superior predictive ability (SPA) and the reality check for data snooping (RC). Our empirical results show that the RC lacks power to an extent that makes it unable to distinguish ‘good’ and ‘bad’ models in our analysis.
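The GARCH(1,1) benchmark at the heart of this comparison is a one-line variance recursion. A minimal sketch (illustrative, not the paper's estimation code) that simulates a path and re-filters it with the same parameters:

```python
import numpy as np

def garch11_filter(returns, omega, alpha, beta):
    # h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
    # initialised at the unconditional variance omega / (1 - alpha - beta)
    r = np.asarray(returns, dtype=float)
    h = np.empty(len(r))
    h[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

# simulate a GARCH(1,1) path, then recover it with the filter
rng = np.random.default_rng(4)
omega, alpha, beta = 0.05, 0.05, 0.90
n = 1000
h_sim = np.empty(n)
r_sim = np.empty(n)
h_sim[0] = omega / (1.0 - alpha - beta)
r_sim[0] = np.sqrt(h_sim[0]) * rng.standard_normal()
for t in range(1, n):
    h_sim[t] = omega + alpha * r_sim[t - 1] ** 2 + beta * h_sim[t - 1]
    r_sim[t] = np.sqrt(h_sim[t]) * rng.standard_normal()

h_filtered = garch11_filter(r_sim, omega, alpha, beta)
```

Given the true parameters and initialisation, the filter reproduces the simulated variance path exactly; in practice the parameters are estimated by quasi-maximum likelihood.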

Hansen, P.R. (2005) Econometrics Journal Vol. 8, 23-38.

2003

Hansen, P.R. (2003) Journal of Econometrics Vol. 114, 261-295.

Abstract: This paper generalizes the cointegrated vector autoregressive model of Johansen (J. Econom. Dyn. Control 12 (1988) 231–254) to allow for structural changes. We take the time of the change points and the number of cointegration relations as given. Estimation under various hypotheses is made possible by a new estimation technique, which makes it simple to derive a number of interesting likelihood ratio tests. For example, one can test for m structural changes against m+k structural changes, or test linear parameter restrictions in the presence of structural changes. The asymptotic distribution of the likelihood ratio statistic is χ² in both cases. The model is applied to US term structure data, and structural changes in September 1979 and October 1982, which coincide with large changes in the Fed's policy, are found to be significant. After accounting for these structural changes, we cannot, contrary to previous studies, reject the long-run implication of the expectations hypothesis.

Hansen, P.R., A. Lunde, and J.M. Nason (2003) Oxford Bulletin of Economics and Statistics Vol. 65, 839-861.

Abstract: This paper applies the model confidence set (MCS) procedure of Hansen, Lunde and Nason (2003) to a set of volatility models. An MCS is analogous to the confidence interval of a parameter in the sense that it contains the best forecasting model with a certain probability. The key to the MCS is that it acknowledges the limitations of the information in the data. The empirical exercise is based on 55 volatility models and the MCS includes about a third of these when evaluated by mean square error, whereas the MCS contains only a VGARCH model when the mean absolute deviation criterion is used. We conduct a simulation study which shows that the MCS captures the superior models across a range of significance levels. When we benchmark the MCS relative to a Bonferroni bound, the latter delivers inferior performance.
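The elimination logic behind an MCS can be sketched briefly. The code below is a simplified illustration of the idea, not the published procedure (it uses an i.i.d. bootstrap and a plain range statistic, without the studentization and refinements of Hansen, Lunde and Nason): test equal predictive ability over the surviving set, and drop the worst model until the test no longer rejects.

```python
import numpy as np

def model_confidence_set(losses, alpha=0.10, B=500, seed=0):
    # losses: T x k matrix of out-of-sample losses, one column per model
    rng = np.random.default_rng(seed)
    L = np.asarray(losses, dtype=float)
    T, k = L.shape
    surviving = list(range(k))
    while len(surviving) > 1:
        Ls = L[:, surviving]
        dbar = Ls.mean(axis=0) - Ls.mean()   # deviation from the set average
        stat = np.max(np.abs(dbar))          # range statistic
        # bootstrap the recentred range statistic under equal ability
        boot = np.empty(B)
        for b in range(B):
            Lb = Ls[rng.integers(0, T, T)]
            db = Lb.mean(axis=0) - Lb.mean()
            boot[b] = np.max(np.abs(db - dbar))
        if np.mean(boot >= stat) >= alpha:
            break                            # cannot reject equal ability
        worst = surviving[int(np.argmax(Ls.mean(axis=0)))]
        surviving.remove(worst)
    return surviving

# four models, one of which is clearly inferior
rng = np.random.default_rng(5)
T = 2000
losses = np.abs(rng.standard_normal((T, 4)))
losses[:, 3] += 1.0                          # model 3 has much higher loss
mcs = model_confidence_set(losses)
```

The clearly inferior model is eliminated, while models with statistically indistinguishable performance remain in the set together, which is exactly the sense in which the MCS acknowledges the limits of the data.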