Click on the items listed below to see some recent research material.

Robustness

Risk and Valuation

Operator Methods

Generalized Method of Moments Estimation


Research on Robustness

 

Working Papers [top]

  • "Risk and Robustness in General Equilibrium," with E. W. Anderson and T. J. Sargent (March 8, 1998)

This early paper gives a continuous-time, stochastic formulation of robust control theory and a characterization of prices. Although it was substantially revised and retitled, the original manuscript remains interesting, is cited in our later work, and provided the impetus for much of it. The published version is "A Quartet of Semigroups for Model Specification, Robustness, Prices of Risk and Model Detection." View at JSTOR

Published Papers[top]

  • "Three Types of Ambiguity," with T. J. Sargent, Journal of Monetary Economics, Volume 59, Issue 5, July 2012, pp. 422-445.  View at ScienceDirect 

For each of three types of ambiguity, we compute a robust Ramsey plan and an associated worst-case probability model. Ex post, ambiguity of type I implies endogenously distorted homogeneous beliefs, while ambiguities of types II and III imply distorted heterogeneous beliefs. Martingales characterize alternative probability specifications and clarify distinctions among the three types of ambiguity. We use recursive formulations of Ramsey problems to impose local predictability of commitment multipliers directly. To reduce the dimension of the state in a recursive formulation, we transform the commitment multiplier to accommodate the heterogeneous beliefs that arise with ambiguity of types II and III. Our formulations facilitate comparisons of the consequences of these alternative types of ambiguity.

  • "Robustness and Ambiguity in Continuous Time," with T. J. Sargent, Journal of Economic Theory, Volume 146, Issue 3, May 2011, pp. 1195-1223. View at ScienceDirect 

We use statistical detection theory in a continuous-time environment to provide a new perspective on calibrating a concern about robustness or an aversion to ambiguity. A decision maker repeatedly confronts uncertainty about state transition dynamics and a prior distribution over unobserved states or parameters. Two continuous-time formulations are counterparts of two discrete-time recursive specifications of Hansen and Sargent (2007). One formulation shares features of the smooth ambiguity model of Klibanoff et al. (2005, 2009). Here our statistical detection calculations guide how to adjust contributions to entropy coming from hidden states as we take a continuous-time limit.

  • "Small Noise Methods for Risk-Sensitive/Robust Economies," with E.W. Anderson and T. J. Sargent,  Journal of Economic Dynamics and Control, Volume 36, Issue 4, April 2012, pp. 468-500. View at ScienceDirect 

We provide small noise expansions for the value function and decision rule for the recursive risk-sensitive preferences specified by Hansen and Sargent (1995), Hansen, Sargent and Tallarini (1999), and Tallarini (2000). We use the expansions (1) to provide a fast method for approximating solutions of dynamic stochastic problems, and (2) to quantify the effects on decisions of uncertainty and concerns about robustness to misspecification.

  • "Wanting Robustness in Macroeconomics," with T. J. Sargent,   editors: B. M. Friedman and M. Woodford, Handbook of Monetary Economics, Volume 3, Issue 11, 2010, pp. 1097-1157. View at ScienceDirect 

Robust control theory is a tool for assessing decision rules when a decision maker distrusts either the specification of transition laws or the distribution of hidden state variables or both. Specification doubts inspire the decision maker to want a decision rule to work well for a set of models surrounding his approximating stochastic model. We relate robust control theory to the so-called multiplier and constraint preferences that have been used to express ambiguity aversion. Detection error probabilities can be used to discipline empirically plausible amounts of robustness. We describe applications to uncertainty premia in asset pricing and to the design of robust macroeconomic policies.
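Since detection error probabilities play this disciplining role, a minimal sketch of how one is computed may be useful. This is a toy i.i.d. Gaussian setting of our own construction (the function name and parameter choices are illustrative), not the dynamic models of the chapter:

```python
import random

def detection_error_prob(mu_a, mu_b, sigma, T, n_sims=2000, seed=0):
    """Monte Carlo detection error probability for two i.i.d. Gaussian models
    that differ only in their means (a toy stand-in for the approximating and
    worst-case models)."""
    rng = random.Random(seed)

    def log_lik_ratio(sample, mu1, mu2):
        # log L(mu1) - log L(mu2) for i.i.d. N(mu, sigma^2) observations
        return sum((x - mu2) ** 2 - (x - mu1) ** 2 for x in sample) / (2.0 * sigma ** 2)

    errors_a = errors_b = 0
    for _ in range(n_sims):
        # Simulate under model A; an error occurs if the likelihood-ratio test picks B.
        sample_a = [rng.gauss(mu_a, sigma) for _ in range(T)]
        if log_lik_ratio(sample_a, mu_a, mu_b) <= 0:
            errors_a += 1
        # Simulate under model B; an error occurs if the test picks A.
        sample_b = [rng.gauss(mu_b, sigma) for _ in range(T)]
        if log_lik_ratio(sample_b, mu_b, mu_a) <= 0:
            errors_b += 1
    # Average the two mistake frequencies, weighting each model equally.
    return 0.5 * (errors_a + errors_b) / n_sims

p_short = detection_error_prob(0.0, 0.2, 1.0, T=50)
p_long = detection_error_prob(0.0, 0.2, 1.0, T=400)
```

Holding the gap between the models fixed, a longer sample makes them easier to tell apart, so the error probability falls with T; calibrations of robustness keep this probability above some plausible threshold.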

  • "The Changing History of Robustness," by S. Stigler (Keynote Address at ICORES10, Prague, June 2010; includes a discussion of Hansen and Sargent's Robustness), American Statistician, Volume 64, Number 4, November 2010, pp. 277-281. View at ASA
  • "Fragile Beliefs and the Price of Uncertainty," with T. J. Sargent, Quantitative Economics, Volume 1, Issue 1, July 2010, pp. 129-162. View at Quantitative Economics

A representative consumer uses Bayes’ law to learn about parameters of several models and to construct probabilities with which to perform ongoing model averaging. The arrival of signals induces the consumer to alter his posterior distribution over models and parameters. The consumer’s specification doubts induce him to slant probabilities pessimistically. The pessimistic probabilities tilt toward a model that puts long-run risks into consumption growth. That contributes a countercyclical history-dependent component to prices of risk.

  • "Doubts or Variability?" with F. Barillas and T. J. Sargent, Journal of Economic Theory, Volume 144, Issue 6, November 2009, pp. 2388-2418. View at ScienceDirect

Reinterpreting most of the market price of risk as a price of model uncertainty eradicates a link between asset prices and measures of the welfare costs of aggregate fluctuations that was proposed by Hansen et al., Tallarini, and Alvarez and Jermann [Lars Peter Hansen, Thomas Sargent, Thomas Tallarini, Robust permanent income and pricing, Rev. Econ. Stud. 66 (1999) 873–907; Thomas D. Tallarini, Risk-sensitive real business cycles, J. Monet. Econ. 45 (3) (2000) 507–532; Fernando Alvarez, Urban J. Jermann, Using asset prices to measure the cost of business cycles, J. Polit. Econ. 112 (6) (2004) 1223–1256]. Prices of model uncertainty contain information about the benefits of removing model uncertainty, not the consumption fluctuations that Lucas [Robert E. Lucas Jr., Models of Business Cycles, Basil Blackwell, Oxford and New York, 1987; Robert E. Lucas Jr., Macroeconomic priorities, American Economic Review, Papers and Proceedings 93 (2003) 1–14] studied. A max–min expected utility theory lets us reinterpret Tallarini's risk-aversion parameter as measuring a representative consumer's doubts about the model specification. We use model detection instead of risk-aversion experiments to calibrate that parameter. Plausible values of detection error probabilities give prices of model uncertainty that approach the Hansen and Jagannathan [Lars Peter Hansen, Ravi Jagannathan, Implications of security market data for models of dynamic economies, J. Polit. Econ. 99 (1991) 225–262] bounds. Fixed detection error probabilities give rise to virtually identical asset prices as well as virtually identical costs of model uncertainty for Tallarini's two models of consumption growth.

  • "Robustness and U.S. Monetary Policy Experimentation," (August 8, 2008) with T. Cogley, R. Colacito and T. J. Sargent, Journal of Money, Credit and Banking, Volume 40, Issue 8, December 2008, pp. 1599-1623. View at Wiley

 

We study how a concern for robustness modifies a policy maker's incentive to experiment. A policy maker has a prior over two submodels of inflation-unemployment dynamics. One submodel implies an exploitable trade-off, the other does not. Bayes' law gives the policy maker an incentive to experiment. The policy maker fears that both submodels and his prior probability distribution over them are misspecified. We compute decision rules that are robust to misspecifications of each submodel and of the prior distribution over submodels. We compare robust rules to ones that Cogley, Colacito and Sargent (2007) computed assuming that the models and the prior distribution are correctly specified. We explain how the policy maker's desires to protect against misspecifications of the submodels, on the one hand, and misspecifications of the prior over them, on the other, have different effects on the decision rule.

  • "Beliefs, Doubts and Learning: Valuing Macroeconomic Risk," Richard T. Ely Lecture, The American Economic Review, Volume 97, Number 2, May 2007, pp. 1-30. View at AEA

 

This essay examines the problem of inference within a rational expectations model from two perspectives: that of an econometrician and that of the economic agents within the model. The assumption of rational expectations has been and remains an important component of quantitative research. It endows economic decision makers with knowledge of the probability law implied by the economic model. As such, it is an equilibrium concept. Imposing rational expectations removed the need to separately specify beliefs or subjective components of uncertainty. Thus, it simplified model specification and implied an array of testable implications different from those considered previously. It reframed policy analysis by questioning the effectiveness of policy levers that induce outcomes that differ systematically from individual beliefs.


  • "Recursive Robust Estimation and Control without Commitment," with T. J. Sargent, Journal of Economic Theory, Volume 136, Issue 1, September 2007, pp. 1-27. View at ScienceDirect 

In a Markov decision problem with hidden state variables, a posterior distribution serves as a state variable and Bayes’ law under an approximating model gives its law of motion. A decision maker expresses fear that his model is misspecified by surrounding it with a set of alternatives that are nearby when measured by their expected log likelihood ratios (entropies). Martingales represent alternative models. A decision maker constructs a sequence of robust decision rules by pretending that a sequence of minimizing players choose increments to martingales and distortions to the prior over the hidden state. A risk sensitivity operator induces robustness to perturbations of the approximating model conditioned on the hidden state. Another risk sensitivity operator induces robustness to the prior distribution over the hidden state. We use these operators to extend the approach of Hansen and Sargent [Discounted linear exponential quadratic Gaussian control, IEEE Trans. Automat. Control 40(5) (1995) 968–971] to problems that contain hidden states. 
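A sketch of the kind of risk sensitivity operator involved may help fix ideas. This is the standard static multiplier problem with penalty parameter θ, rendered schematically; the hidden-state versions and precise recursions are in the paper:

```latex
% Entropy-penalized minimization over likelihood-ratio distortions m >= 0, E[m] = 1:
\min_{m \geq 0,\; E[m]=1} \; E\left[ m\,V + \theta\, m \log m \right]
% The minimizer tilts exponentially toward low continuation values,
%   m^{*} \propto \exp(-V/\theta),
% and the minimized objective is the risk-sensitive adjustment
%   \mathcal{T}(V) = -\theta \log E\left[ \exp(-V/\theta) \right].
```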

  • "Robust Control and Model Misspecification," with T. J. Sargent, G. Turmuhambetova, and N. Williams, Journal of Economic Theory, Volume 128, Issue 1, May 2006, pp. 45-90.  View at ScienceDirect  

A decision maker fears that data are generated by a statistical perturbation of an approximating model that is either a controlled diffusion or a controlled measure over continuous functions of time. A perturbation is constrained by relative entropy. Several two-player zero-sum games yield robust decision rules and are related to one another and to the max-min expected utility theory of Gilboa and Schmeidler (1989). Alternative sequential and non-sequential versions of robust control theory yield identical robust decision rules that are dynamically consistent in a useful sense.

  • "Robust Estimation and Control under Commitment," with T. J. Sargent Journal of Economic Theory, Volume 124, Issue 2, October 2005, pp. 258-301.  View at ScienceDirect  

This paper studies robust decision problems with hidden state variables. It gives the recursive implementation of the commitment solution with discounting from robust control theory. The recursive implementation shows formally how discounting and commitment are encoded in the robust decision rules. We also suggest recursive formulations of the decision problem that are attractive alternatives to the commitment solution.

  • "A Quartet of Semigroups for Model Specification, Robustness, Prices of Risk and Model Detection," with E. W. Anderson and T. J. Sargent, Journal of the European Economic Association, Volume 1, Issue 1, March 2003, pp. 68-123. View at JSTOR

A representative agent fears that his model, a continuous time Markov process with jump and diffusion components, is misspecified and therefore uses robust control theory to make decisions. Under the decision maker’s approximating model, cautious behavior puts adjustments for model misspecification into market prices for risk factors. We use a statistical theory of detection to quantify how much model misspecification the decision maker should fear, given his historical data record. A semigroup is a collection of objects connected by something like the law of iterated expectations. The law of iterated expectations defines the semigroup for a Markov process, while similar laws define other semigroups. Related semigroups describe (1) an approximating model; (2) a model misspecification adjustment to the continuation value in the decision maker’s Bellman equation; (3) asset prices; and (4) the behavior of the model detection statistics that we use to calibrate how much robustness the decision maker prefers. Semigroups 2, 3, and 4 establish a tight link between the market price of uncertainty and a bound on the error in statistically discriminating between an approximating and a worst case model.

  • "Robust Control and Model Uncertainty," with T. J. Sargent,  The American Economic Review, Volume 91, Issue 2, May 2001, pp. 60-66. View at JSTOR 

This paper describes links between the max-min expected utility theory of Itzhak Gilboa and David Schmeidler (1989) and the applications of robust-control theory proposed by Evan Anderson et al. (2000) and Paul Dupuis et al. (1998).

  • "Robust Permanent Income and Pricing," with T. J. Sargent and T. D. Tallarini, Jr., Review of Economic Studies, Volume 66, Issue 4, October 1999, pp. 873-907. View at JSTOR

 

Comments[top]

  • "Comment on Exotic Preferences for Macroeconomics by D. K. Backus, B. R. Routledge and S. E. Zin," edited by M. Gertler and K. Rogoff. NBER Macroeconomics Annual 2004. View at JSTOR

 



Research on Risk and Valuation

 

Working Papers[top]

  • "Examining Macroeconomic Models through the Lens of Asset Pricing," with J. Borovicka (December 8, 2011) View at SSRN

Dynamic stochastic equilibrium models of the macro economy are designed to match the macro time series including impulse response functions. Since these models aim to be structural, they also have implications for asset pricing. To assess these implications, we explore asset pricing counterparts to impulse response functions. We use the resulting dynamic value decomposition (DVD) methods to quantify the exposures of macroeconomic cash flows to shocks over alternative investment horizons and the corresponding prices or compensations that investors must receive because of the exposure to such shocks. We build on the continuous-time methods developed in Hansen and Scheinkman (2010), Borovicka et al. (2011) and Hansen (2011) by constructing discrete-time shock elasticities that measure the sensitivity of cash flows and their prices to economic shocks including economic shocks featured in the empirical macroeconomics literature. By design, our methods are applicable to economic models that are nonlinear, including models with stochastic volatility. We illustrate our methods by analyzing the asset pricing model of Ai et al. (2010) with tangible and intangible capital.

  • "Modeling and Measuring Systemic Risk," with M. K. Brunnermeier, A. K. Kashyap, A. Krishnamurthy, and A. W. Lo (October 15, 2010)  View at NSF

In this white paper we identify the need for innovative research to improve our ability to quantify systemic financial risk. There are at least three major components to this challenge: modeling, measurement, and data accessibility. Progress on this challenge will require extending existing research in many directions and will require collaboration between economists, statisticians, decision theorists, sociologists, psychologists, and neuroscientists. This paper was submitted as part of American Economic Association's response to the 2010 National Science Foundation call for white papers "to frame innovative research for the year 2020 and beyond" in the social and behavioral sciences.

Published Papers[top]

  • "Challenges in Identifying and Measuring Systemic Risk,"  editors: M. K. Brunnermeier and A. Krishnamurthy, forthcoming in Systemic Risk and Macro Modeling, Chapter 1, University of Chicago Press, 2012. 

Sparked by the recent "great recession" and the role of financial markets, considerable interest exists among researchers within both the academic community and the public sector in modeling and measuring systemic risk. In this essay I draw on experiences with other measurement agendas to place in perspective the challenge of quantifying systemic risk, or more generally, of providing empirical constructs that can enhance our understanding of linkages between financial markets and the macroeconomy.

  • "Recursive utility in a Markov environment with stochastic growth," with J. Scheinkman, Proceedings of the National Academy of Sciences, Volume 109, Issue 30, July 2012, pp. 11967-72. View at PNAS

Recursive utility models of the type introduced by Kreps and Porteus (1978) are used extensively in applied research in macroeconomics and asset pricing in environments with uncertainty. These models represent preferences as the solution to a nonlinear forward-looking difference equation with a terminal condition. Such preferences feature investor concerns about the intertemporal composition of risk. In this paper we study infinite horizon specifications of this difference equation in the context of a Markov environment. We establish a connection between the solution to this equation and to an arguably simpler Perron-Frobenius eigenvalue equation of the type that occurs in the study of large deviations for Markov processes. By exploiting this connection, we establish existence and uniqueness results. Moreover, we explore a substantive link between large deviation bounds for tail events for stochastic consumption growth and preferences induced by recursive utility.
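As a sketch of the objects involved, one common parameterization of the Kreps-Porteus recursion is shown below; this is our schematic rendering, and the precise Markov formulation and conditions are in the paper:

```latex
% Recursive (Epstein-Zin style) utility: a forward-looking difference equation
V_t = \left[ (1-\beta)\, C_t^{1-\rho}
      + \beta \, \mathcal{R}_t\!\left(V_{t+1}\right)^{1-\rho} \right]^{\frac{1}{1-\rho}},
\qquad
\mathcal{R}_t\!\left(V_{t+1}\right)
  = \left( E\!\left[ V_{t+1}^{\,1-\gamma} \mid \mathcal{F}_t \right] \right)^{\frac{1}{1-\gamma}}
% In a Markov environment, existence and uniqueness of a solution can be tied to a
% Perron-Frobenius eigenvalue problem of the form
%   E[ G_{t+1} \, e(X_{t+1}) \mid X_t = x ] = \exp(\eta)\, e(x)
% for a positive eigenfunction e, where G is built from stochastic consumption
% growth and the preference parameters.
```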

  • "Risk Pricing over Alternative Investment Horizons," editors: G. Constantinides, M. Harris, and R. Stulz, forthcoming in Handbook of the Economics of Finance, 2012.

I explore methods that characterize model-based valuation of stochastically growing cash flows. Following previous research, I use stochastic discount factors as a convenient device to depict asset values. I extend that literature by focusing on the impact of compounding these discount factors over alternative investment horizons. In modeling cash flows, I also incorporate stochastic growth factors. I explore dynamic value decomposition (DVD) methods that capture the concurrent compounding of stochastic growth and discount factors in determining risk-adjusted values. These methods are supported by factorizations that extract martingale components of stochastic growth and discount factors. These components reveal which ingredients of a model have long-term implications for valuation. The resulting martingales imply convenient changes in measure that are distinct from those used in mathematical finance, and they provide the foundations for analyzing model-based implications for the term structure of risk prices. As an illustration of the methods, I re-examine some recent preference-based models. I also use the martingale extraction to revisit the value implications of some benchmark models with market restrictions and heterogeneous consumers.

  • "Dynamic Valuation Decomposition Within Stochastic Economies," Econometrica, Volume 80, Number 3, May 2012, pp. 911-967. (Previously titled "Modeling the Long Run: Valuation in Dynamic Stochastic Economies.") View at Wiley

I explore the equilibrium value implications of economic models that incorporate responses to a stochastic environment with growth. I propose dynamic valuation decompositions (DVD's) designed to distinguish components of an underlying economic model that influence values over long investment horizons from components that impact only the short run. A DVD represents the values of stochastically growing claims to consumption payoffs or cash flows using a stochastic discount process that both discounts the future and adjusts for risk. It is enabled by constructing operators indexed by the elapsed time between the trading date and the date of the future realization of the payoff. Thus formulated, methods from applied mathematics permit me to characterize valuation behavior and the term structure of risk prices in a revealing manner. I apply this approach to investigate how investor beliefs and the associated uncertainty are reflected in current-period values and risk-price elasticities.

  • "Pricing Growth-Rate Risk," with J. Scheinkman, Finance and Stochastics, Volume 16, Number 1, January 2012, pp. 1-15. View at SpringerLink

We characterize the compensation demanded by investors in equilibrium for incremental exposure to growth-rate risk. Given an underlying Markov diffusion that governs the state variables in the economy, the economic model implies a stochastic discount factor process S. We also consider a reference growth process G that may represent the growth in the payoff of a single asset or of the macroeconomy. Both S and G are modeled conveniently as multiplicative functionals of a multidimensional Brownian motion. We consider the pricing implications of a parametrized family of growth processes G^ε, with G^0 = G, as ε is made small. This parametrization defines a direction of growth-rate risk exposure that is priced using the stochastic discount factor S. By changing the investment horizon, we trace a term structure of risk prices that shows how the valuation of risky cash flows depends on the investment horizon. Using methods of Hansen and Scheinkman (Econometrica 77:177–234, 2009), we characterize the limiting behavior of the risk prices as the investment horizon is made arbitrarily long.

  • "Risk Price Dynamics," with J. Scheinkman, J. Borovička, and M. Hendricks, Journal of Financial Econometrics, Volume 9, Issue 1, Winter 2011, pp. 3-65. View at Oxford Journals

We present a novel approach to depicting asset-pricing dynamics by characterizing shock exposures and prices for alternative investment horizons. We quantify the shock exposures in terms of elasticities that measure the impact of a current shock on future cash flow growth. The elasticities are designed to accommodate nonlinearities in the stochastic evolution modeled as a Markov process. Stochastic growth in the underlying macroeconomy and stochastic discounting in the representation of asset values are central ingredients in our investigation. We provide elasticity calculations in a series of examples featuring consumption externalities, recursive utility, and jump risk. (JEL: C52, E44, G12)

  • "Pricing Kernels and Stochastic Discount Factors," with E. Renault, Encyclopedia of Quantitative Finance, Chapter 19-009, Wiley Press May 2010.  View at Wiley

In this entry we characterize pricing kernels or stochastic discount factors that are used to represent valuation operators in dynamic stochastic economies. A kernel is a mathematical term commonly used to represent an operator. The term stochastic discount factor extends concepts from economics and finance to include adjustments for risk. As we will see, there is a tight connection between the two, and the terms pricing kernel and stochastic discount factor are often used interchangeably. After deriving convenient representations for prices, we provide several examples of stochastic discount factors and discuss econometric methods for estimation and testing of asset pricing models that restrict the stochastic discount factors.

  • "Long Term Risk: an Operator Approach," with J. Scheinkman, Econometrica, Econometric Society, Volume 77, Issue 1, January 2009, pp. 177-234. View at JSTOR

We build a family of valuation operators indexed by the increment of time between the payoff date and the current period value. These operators are necessarily related by what is known as the semigroup property or The Law of Iterated Values. The operator formulation we develop provides a way to link short term risk adjustments to what happens in the medium and long term. We apply this apparatus to give a precise notion of a long term risk-return tradeoff.

  • "Consumption Strikes Back?: Measuring Long Run Risk," with J.C. Heaton and N. Li, Journal of Political Economy, Volume 116, Issue 2, April 2008, pp. 260-302. View at JSTOR

We study structural models of stochastic discount factors and explore alternative methods of estimating such models using data on macroeconomic risk and asset returns. Particular attention is devoted to recursive utility models in which risk aversion can be modified without altering intertemporal substitution. We characterize the impact of changing the intertemporal substitution and risk aversion parameters on equilibrium short-run and long-run risk prices and on equilibrium wealth.

  • "Intertemporal Substitution and Risk Aversion," with J. Heaton, J. Lee, N. Roussanov, Handbook of Econometrics, Volume 6, Part 1, 2007, pp. 3967-4056. View at ScienceDirect

We characterize and measure a long-term risk-return trade-off for the valuation of cash flows exposed to fluctuations in macroeconomic growth. This trade-off features risk prices of cash flows that are realized far into the future but continue to be reflected in asset values. We apply this analysis to claims on aggregate cash flows and to cash flows from value and growth portfolios by imputing values to the long-run dynamic responses of cash flows to macroeconomic shocks. We explore the sensitivity of our results to features of the economic valuation model and of the model cash flow dynamics.

  • "Intangible Risk?" with J.C. Heaton and N. Li, Measuring Capital in the New Economy (NBER Books), Corrado, Haltiwanger and Sichel, eds., 2005, pp. 111-152. View at NBER

  • "Value in an Uncertain Economy," (Address at the 474th Convocation at the University of Chicago) 2004.

 

Comments[top]

  • "Comment on Exotic Preferences for Macroeconomics by D. K. Backus, B. R. Routledge and S. E. Zin," edited by M. Gertler and K. Rogoff, NBER Macroeconomics Annual 2004.
  • "Comments on House Price Booms and the Current Account by K. Adam, P. Kuang, and A. Marcet," NBER Macroeconomics Annual 2011, Volume 26.



Research on Operator Methods

 

Working Papers[top]

  • "Principal Components and the Long Run," with X. Chen and J. Scheinkman (November, 2005)

This early paper investigates a method for extracting nonlinear principal components. These components maximize variation subject to smoothness and orthogonality constraints; but we allow for a general class of densities and constraints, including densities without compact support and even densities with algebraic tails. We also characterize the limiting behavior of the associated eigenvalues, the objects used to quantify the incremental importance of the principal components. A major portion of this paper was published in the Annals of Statistics under the title "Nonlinear Principal Components and Long Run Implications of Multivariate Diffusions."

Published Papers[top]

  • "Dynamic Valuation Decomposition Within Stochastic Economies," (September 25, 2011) Econometrica, Volume 80, Issue 3, May 2012, pp. 911-967. (Previously titled "Modeling the Long Run: Valuation in Dynamic Stochastic Economies.") View at Wiley

I explore the equilibrium value implications of economic models that incorporate responses to a stochastic environment with growth. I propose dynamic valuation decompositions (DVD's) designed to distinguish components of an underlying economic model that influence values over long investment horizons from components that impact only the short run. A DVD represents the values of stochastically growing claims to consumption payoffs or cash flows using a stochastic discount process that both discounts the future and adjusts for risk. It is enabled by constructing operators indexed by the elapsed time between the trading date and the date of the future realization of the payoff. Thus formulated, methods from applied mathematics permit me to characterize valuation behavior and the term structure of risk prices in a revealing manner. I apply this approach to investigate how investor beliefs and the associated uncertainty are reflected in current-period values and risk-price elasticities.

  • "Nonlinearity and Temporal Dependence," with X. Chen and M. Carrasco, Journal of Econometrics, Volume 155, Issue 2, April 2010, pp. 155-169. View at ScienceDirect

Nonlinearities in the drift and diffusion coefficients influence temporal dependence in diffusion models. We study this link using three measures of temporal dependence: ρ-mixing, β-mixing and α-mixing. Stationary diffusions that are ρ-mixing have mixing coefficients that decay exponentially to zero. When they fail to be ρ-mixing, they are still β-mixing and α-mixing; but coefficient decay is slower than exponential. For such processes we find transformations of the Markov states that have finite variances but infinite spectral densities at frequency zero. The resulting spectral densities behave like those of stochastic processes with long memory. Finally, we show how state-dependent, Poisson sampling alters the temporal dependence.

  • "Operator Methods for Continuous-Time Markov Processes," with Y. Ait-Sahalia and J. Scheinkman, Handbook of Financial Econometrics, Volume 1, Chapter 1, 2010, pp. 1-66, Elsevier Press. View at ScienceDirect

This chapter surveys relevant tools, based on operator methods, for describing the evolution of continuous-time stochastic processes over different time horizons. Applications include modeling the long-run stationary distribution of the process, modeling the short- or intermediate-run transition dynamics of the process, estimating parametric models via maximum likelihood, implications of the spectral decomposition of the generator, and various observable implications and tests of the characteristics of the process.

  • "Nonlinear Principal Components and Long Run Implications of Multivariate Diffusions," with X. Chen and J. Scheinkman, Annals of Statistics, Volume 37, Number 6B, 2009, pp. 4279-4312. PDF version

We investigate a method for extracting nonlinear principal components (NPCs). These NPCs maximize variation subject to smoothness and orthogonality constraints; but we allow for a general class of constraints and multivariate probability densities, including densities without compact support and even densities with algebraic tails. We provide primitive sufficient conditions for the existence of these NPCs. By exploiting the theory of continuous-time, reversible Markov diffusion processes, we give a different interpretation of these NPCs and the smoothness constraints. When the diffusion matrix is used to enforce smoothness, the NPCs maximize long-run variation relative to the overall variation subject to orthogonality constraints. Moreover, the NPCs behave as scalar autoregressions with heteroskedastic innovations; this supports semiparametric identification and estimation of a multivariate reversible diffusion process and tests of the overidentifying restrictions implied by such a process from low-frequency data. We also explore implications for stationary, possibly nonreversible diffusion processes. Finally, we suggest a sieve method to estimate the NPCs from discretely-sampled data.
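The variational problem behind the NPCs can be sketched as follows; this is our schematic rendering, and the paper's primitive sufficient conditions are more delicate:

```latex
% The j-th NPC solves a constrained maximization: maximize variation subject to a
% smoothness (quadratic-form) constraint and orthogonality to earlier components,
\max_{\phi}\; \mathrm{Var}\left[ \phi(X) \right]
\quad \text{s.t.} \quad
E\left[ \nabla\phi(X)' \, \Sigma(X) \, \nabla\phi(X) \right] \leq 1,
\qquad
E\left[ \phi(X)\, \phi_i(X) \right] = 0 \;\; (i < j)
% When the diffusion matrix \Sigma enforces smoothness, the extracted \phi_j are
% eigenfunctions of the generator of the reversible diffusion, and the associated
% eigenvalues order the components by their long-run importance.
```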

  • "Long Term Risk: an Operator Approach," with J. Scheinkman, Econometrica, Volume 77, Number 1, January 2009, pp. 177-234. View at JSTOR

We create an analytical structure that reveals the long-run risk-return relationship for nonlinear continuous-time Markov environments. We do so by studying an eigenvalue problem associated with a positive eigenfunction for a conveniently chosen family of valuation operators. The members of this family are indexed by the elapsed time between payoff and valuation dates, and they are necessarily related via a mathematical structure called a semigroup. We represent the semigroup using a positive process with three components: an exponential term constructed from the eigenvalue, a martingale, and a transient eigenfunction term. The eigenvalue encodes the risk adjustment, the martingale alters the probability measure to capture long-run approximation, and the eigenfunction gives the long-run dependence on the Markov state. We discuss sufficient conditions for the existence and uniqueness of the relevant eigenvalue and eigenfunction. By showing how changes in the stochastic growth components of cash flows induce changes in the corresponding eigenvalues and eigenfunctions, we reveal a long-run risk-return trade-off.
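The factorization at the heart of the paper can be sketched as follows (our condensed rendering of the published result):

```latex
% Solve a Perron-Frobenius eigenvalue problem for the valuation functional M_t:
E\left[ M_t \, e(X_t) \,\middle|\, X_0 = x \right] = \exp(\eta t)\, e(x)
% Given a positive eigenfunction e, the functional factors into three components:
M_t = \exp(\eta t)\; \widehat{M}_t \; \frac{e(X_0)}{e(X_t)}
% where \exp(\eta t) encodes the long-run risk adjustment, \widehat{M}_t is a
% martingale that changes the probability measure to capture the long-run
% approximation, and the ratio in e is the transient eigenfunction term.
```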

  • "A Quartet of Semigroups for Model Specification, Robustness, Prices of Risk and Model Detection," with E. W. Anderson and T. J. Sargent, Journal of the European Economic Association, Volume 1, Issue 1, March 2003, pp. 68-123. View at JSTOR

A representative agent fears that his model, a continuous time Markov process with jump and diffusion components, is misspecified and therefore uses robust control theory to make decisions. Under the decision maker’s approximating model, cautious behavior puts adjustments for model misspecification into market prices for risk factors. We use a statistical theory of detection to quantify how much model misspecification the decision maker should fear, given his historical data record. A semigroup is a collection of objects connected by something like the law of iterated expectations. The law of iterated expectations defines the semigroup for a Markov process, while similar laws define other semigroups. Related semigroups describe (1) an approximating model; (2) a model misspecification adjustment to the continuation value in the decision maker’s Bellman equation; (3) asset prices; and (4) the behavior of the model detection statistics that we use to calibrate how much robustness the decision maker prefers. Semigroups 2, 3, and 4 establish a tight link between the market price of uncertainty and a bound on the error in statistically discriminating between an approximating and a worst-case model.
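
The detection-error idea can be illustrated with a simple simulated example (my construction, not the paper's calculation): with two candidate models for i.i.d. data, average the two mistake probabilities of a likelihood-ratio test at a given sample size. A calibration then chooses the worst-case model so that this probability is not implausibly small.

```python
import numpy as np

# Illustrative detection-error calculation (my construction): the
# approximating model says the data are i.i.d. N(0,1); a candidate
# worst-case model says N(theta,1). With a sample of length T, the
# detection-error probability averages the two ways a likelihood-
# ratio test can pick the wrong model.
rng = np.random.default_rng(1)
theta, T, sims = 0.2, 200, 4000

def error_prob(mu_true, mu_alt):
    x = rng.normal(mu_true, 1.0, size=(sims, T))
    # log likelihood ratio of the alternative vs the true model
    llr = (x*(mu_alt - mu_true) - (mu_alt**2 - mu_true**2)/2).sum(axis=1)
    return (llr > 0).mean()           # fraction choosing the wrong model

p = 0.5*(error_prob(0.0, theta) + error_prob(theta, 0.0))
# Small theta or short samples push p toward 1/2: the models are then
# statistically hard to distinguish, so fearing the worst-case model
# is not unreasonable.
```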

  • "Spectral Methods for Identifying Scalar Diffusions," with J. Scheinkman and N. Touzi, Journal of Econometrics, Volume 86, Issue 1, June 1998, pp. 1-32. View at RePEc

This paper shows how to nonparametrically identify scalar stationary diffusions from discrete-time data. The local evolution of the diffusion is characterized by a drift and a diffusion coefficient, along with a specification of boundary behavior. We recover this local evolution from two objects that can be inferred directly from discrete-time data: the stationary density and a conveniently chosen eigenvalue-eigenfunction pair of the conditional expectation operator over a unit interval of time. This construction also lends itself to a spectral characterization of the over-identifying restrictions implied by a scalar diffusion model of a discrete-time Markov process.
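
The recovery logic can be checked numerically on an Ornstein-Uhlenbeck toy case (my own example, with a particular normalization, not the paper's code): given the stationary density q and a generator eigenpair (psi, lambda), one way to express the construction is sigma^2(x) q(x) psi'(x) = -2 lambda * int_{-inf}^x psi q du, with the drift then following from mu = (sigma^2 q)' / (2 q).

```python
import numpy as np

# Numerical check on an OU toy case (my example): dX = -X dt + sqrt(2) dW
# has stationary density q = N(0,1) and generator eigenpair
# psi(x) = x with eigenvalue lambda = 1. Recover sigma^2 and the
# drift from (q, psi, lambda) alone.
x = np.linspace(-4.0, 4.0, 2001)
dx = x[1] - x[0]
q = np.exp(-x**2 / 2) / np.sqrt(2*np.pi)     # stationary density
psi, dpsi, lam = x, np.ones_like(x), 1.0     # eigenfunction, psi', eigenvalue

cum = np.cumsum(psi * q) * dx                # ~ integral of psi*q up to x
sigma2 = -2*lam*cum / (q * dpsi)             # recovered diffusion coefficient
mu = np.gradient(sigma2 * q, dx) / (2*q)     # recovered drift

# Away from the numerical boundaries, sigma2 ~ 2 and mu ~ -x,
# matching the diffusion that generated (q, psi, lambda).
```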

  • "Short-Term Interest Rates as Subordinated Diffusions," with T. G. Conley, E. G. J. Luttmer and J. Scheinkman, Review of Financial Studies, Volume 10, Number 3, Autumn 1997, pp. 525-577. View at JSTOR

In this article we characterize and estimate the process for short-term interest rates using federal funds interest rate data. We presume we are observing a discrete-time sample of a stationary scalar diffusion. We concentrate on a class of models in which the local volatility elasticity is constant and the drift has a flexible specification. To accommodate missing observations and to break the link between "economic time" and calendar time, we model the sampling scheme as an increasing process that is not directly observed. We propose and implement two methods for estimation. We find evidence for a volatility elasticity between one and one-half and two. When interest rates are high, local mean reversion is small, and the mechanism that induces stationarity is the increased volatility of the diffusion process.
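
A simulation with a toy calibration of my own (not the paper's federal funds estimates) shows the constant-elasticity idea: simulate dr = kappa*(theta - r) dt + sigma * r**gamma dW with an Euler scheme, then check that the elasticity gamma is recoverable by regressing log absolute increments on log levels.

```python
import numpy as np

# Euler simulation of a constant-volatility-elasticity diffusion
# (my toy calibration) and a quick elasticity check.
rng = np.random.default_rng(7)
kappa, theta, sigma, gamma = 0.5, 0.06, 1.0, 1.5
dt, n = 1/250, 100_000

r = np.empty(n)
r[0] = theta
z = rng.standard_normal(n - 1)
for t in range(n - 1):
    step = kappa*(theta - r[t])*dt + sigma * r[t]**gamma * np.sqrt(dt) * z[t]
    r[t+1] = max(r[t] + step, 1e-4)          # crude floor keeps r positive

# log|r_{t+1} - r_t| = const + gamma*log(r_t) + noise, so the
# regression slope estimates the volatility elasticity.
dr = np.abs(np.diff(r))
slope, intercept = np.polyfit(np.log(r[:-1]), np.log(dr + 1e-300), 1)
```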

  • "Bootstrapping the Long Run," with T. G. Conley and W. F. Liu, Macroeconomic Dynamics, Volume 1, Issue 2, 1997, pp. 279-311. View at Cambridge Journals

We develop and apply bootstrap methods for diffusion models when fitted to the long run as characterized by the stationary distribution of the data. To obtain bootstrap refinements to statistical inference, we simulate candidate diffusion processes. We use these bootstrap methods to assess measurements of local mean reversion or “pull” to the center of the distribution for short-term interest rates. We also use them to evaluate the fit of the model to the empirical density.
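
A stripped-down version of the exercise (my construction, not the paper's implementation): treat a fitted mean-reverting diffusion as the candidate model, re-simulate it many times, and build a bootstrap distribution for a long-run statistic.

```python
import numpy as np

# Parametric-bootstrap sketch: re-simulate a candidate diffusion and
# use the resulting distribution of a long-run statistic (here the
# sample mean, as a stand-in for a feature of the stationary
# distribution) to gauge sampling variability.
rng = np.random.default_rng(11)
kappa, m, sig, dt, n = 1.0, 0.05, 0.02, 1/52, 1_000

def simulate_path():
    x = np.empty(n)
    x[0] = m
    shocks = rng.standard_normal(n - 1) * sig * np.sqrt(dt)
    for t in range(n - 1):
        x[t+1] = x[t] + kappa*(m - x[t])*dt + shocks[t]
    return x

boot_means = np.array([simulate_path().mean() for _ in range(200)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])   # bootstrap interval
# The spread of boot_means shows how precisely this long-run
# statistic is pinned down by a sample of this length.
```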

  • "Back to the Future: Generating Moment Implications for Continuous Time Markov Processes," with J. Scheinkman, Econometrica, Volume 63, Number 4, July 1995, pp. 767-804. View at JSTOR

Continuous-time Markov processes can be characterized conveniently by their infinitesimal generators. For such processes there exist forward and reverse-time generators. We show how to use these generators to construct moment conditions implied by stationary Markov processes. Generalized method of moments estimators and tests can be constructed using these moment conditions. The resulting econometric methods can be applied to discrete-time data obtained by sampling continuous-time Markov processes.
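
The core moment implication can be seen in a toy example (mine, not the paper's application): for a stationary Markov process with generator A, E[A phi(X)] = 0 for smooth test functions phi. For an Ornstein-Uhlenbeck process dX = -kappa*X dt + sqrt(2) dW, the generator is A phi = -kappa*x*phi' + phi'', so phi(x) = x^2 gives the moment condition E[-2*kappa*X^2 + 2] = 0 and hence a method-of-moments estimator kappa_hat = 1 / E[X^2].

```python
import numpy as np

# Generator-based moment condition for a simulated OU process
# (toy example of my own construction).
rng = np.random.default_rng(3)
kappa, dt, n = 0.8, 0.01, 200_000

x = np.empty(n)
x[0] = 0.0
shocks = rng.standard_normal(n - 1) * np.sqrt(2*dt)
for t in range(n - 1):
    x[t+1] = x[t] - kappa*x[t]*dt + shocks[t]

# E[A phi(X)] = 0 with phi(x) = x^2 pins down the mean-reversion rate:
kappa_hat = 1.0 / np.mean(x**2)     # ~ kappa, since stationary var = 1/kappa
```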

Research on GMM

Working Papers[top]

Published Material[top]

  • "Proofs for Large Sample Properties of Generalized Method of Moments Estimators," Journal of Econometrics, Volume 170, Issue 2, October 2012, pp. 325-330. View at ScienceDirect

Description: These are the previously unpublished proofs for my paper "Large Sample Properties of Generalized Method of Moments Estimators," Econometrica, Volume 50, Number 4, July 1982, pp. 1029-1054. View at JSTOR

  • "Underidentification?" with M. Arellano and E. Sentana (October 31, 2011), Journal of Econometrics, Volume 170, Issue 2, October 2012, pp. 256-280. View at ScienceDirect

We develop methods for testing the hypothesis that an econometric model is underidentified and for inferring the nature of the failed identification. By adopting a generalized method-of-moments perspective, we work directly with the structural relations, and we allow for nonlinearity in the econometric specification. We establish the link between a test for overidentification and our proposed test for underidentification. If, after attempting to replicate the structural relation, we find substantial evidence against the overidentifying restrictions of an augmented model, this is evidence against underidentification of the original model.
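
For orientation, the overidentification side of that link can be sketched with a textbook GMM example (my own, not from the paper): a linear model with two instruments and one parameter leaves one overidentifying restriction, which the J statistic tests.

```python
import numpy as np

# Two-step GMM with an overidentification (J) test, textbook sketch:
# y = x*beta + e, moments g(b) = E[z*(y - x*b)] = 0, two instruments.
rng = np.random.default_rng(5)
n, beta0 = 5_000, 1.5
z = rng.standard_normal((n, 2))                         # instruments
x = z @ np.array([1.0, 0.5]) + rng.standard_normal(n)   # regressor correlated with z
y = x*beta0 + rng.standard_normal(n)                    # error independent of z

Zx, Zy = z.T @ x / n, z.T @ y / n
b1 = (Zx @ Zy) / (Zx @ Zx)                      # first step, identity weighting
e = y - x*b1
S = (z * e[:, None]).T @ (z * e[:, None]) / n   # moment covariance estimate
W = np.linalg.inv(S)
b2 = (Zx @ W @ Zy) / (Zx @ W @ Zx)              # efficient second-step estimate
g = Zy - Zx*b2
J = n * g @ W @ g       # chi-squared(1) under correct specification
```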

  • "Generalized Method of Moments Estimation," (June 17, 2007)

Description: GMM entry for The New Palgrave Dictionary of Economics, Second Edition, 2008.

  • "Interview with Lars Peter Hansen," E. Ghysels and A. Hall, Journal of Business and Economic Statistics, Volume 20, Number 4, October 2002, pp. 442-447. View at JSTOR
  • "Generalized Method of Moments: A Time Series Perspective," (2000)

This paper gives a perspective on the time series formulation and application of Generalized Method of Moments estimation. The file corresponds to the original paper, which later appeared in the Encyclopedia, somewhat modified and under the new title "Method of Moments." The full reference for the published version is: International Encyclopedia of the Social and Behavioral Sciences, N. J. Smelser and P. B. Baltes (editors), Pergamon: Oxford, 2000.