Featured Research

Economics

A Note on Kuhn's Theorem with Ambiguity Averse Players

Kuhn's Theorem shows that extensive games with perfect recall can equivalently be analyzed using mixed or behavioral strategies, as long as players are expected utility maximizers. This note constructs an example that illustrates the limits of Kuhn's Theorem in an environment with ambiguity-averse players who use the maxmin decision rule and full Bayesian updating.
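
A minimal Python sketch (hypothetical payoffs and priors, not the note's construction) of why randomization matters under the maxmin rule: with multiple priors, the worst-case value of a 50/50 mixture can strictly exceed the 50/50 mixture of worst-case values, the kind of non-linearity that breaks the usual mixed/behavioral equivalence.

    import numpy as np

    # Two states, two acts; payoffs are hypothetical.
    f = np.array([1.0, 0.0])   # act f pays 1 in state 1, 0 in state 2
    g = np.array([0.0, 1.0])   # act g pays 0 in state 1, 1 in state 2

    # Ambiguity: a set of priors over the two states.
    priors = [np.array([0.9, 0.1]), np.array([0.1, 0.9])]

    def maxmin_value(payoff):
        """Worst-case expected payoff over the prior set (maxmin rule)."""
        return min(p @ payoff for p in priors)

    mix = 0.5 * f + 0.5 * g    # state-wise payoff of a 50/50 mixed strategy

    print(maxmin_value(f), maxmin_value(g))               # 0.1 and 0.1
    print(maxmin_value(mix))                              # 0.5: mixing hedges ambiguity
    print(0.5 * maxmin_value(f) + 0.5 * maxmin_value(g))  # 0.1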

Read more
Economics

A Partial Solution to Continuous Blotto

This paper analyzes the structure of mixed-strategy equilibria for Colonel Blotto games in which the outcome on each battlefield is a polynomial function of the difference between the two players' allocations. The paper sharply reduces the set of strategies that needs to be searched to find a Nash equilibrium. It shows that there exists a Nash equilibrium in which both players' mixed strategies are discrete distributions, and it places an upper bound on the number of points in the supports of these discrete distributions.
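
For intuition, a discretized version of such a game can be solved directly as a zero-sum matrix game by linear programming; the sketch below (hypothetical budget, grid, and payoff polynomial) finds an equilibrium mixed strategy on a grid and reports how few allocation points it actually uses.

    import numpy as np
    from scipy.optimize import linprog

    # Discretized three-battlefield Blotto: each player splits budget B across
    # three battlefields; the battlefield payoff h(d) = d**3 is a polynomial in
    # the allocation difference d (all choices here are illustrative).
    B = 6
    allocs = [(i, j, B - i - j) for i in range(B + 1) for j in range(B + 1 - i)]

    def payoff(a, b):
        return sum((ai - bi) ** 3 for ai, bi in zip(a, b))

    A = np.array([[payoff(a, b) for b in allocs] for a in allocs], dtype=float)
    m = len(allocs)

    # Max v subject to (A^T x)_j >= v for every opposing column j, x a distribution.
    c = np.zeros(m + 1); c[-1] = -1.0                 # minimize -v
    G = np.hstack([-A.T, np.ones((m, 1))])            # v - (A^T x)_j <= 0
    res = linprog(c, A_ub=G, b_ub=np.zeros(m),
                  A_eq=np.hstack([np.ones((1, m)), np.zeros((1, 1))]), b_eq=[1.0],
                  bounds=[(0, None)] * m + [(None, None)])
    x = res.x[:m]
    print("support size:", int(np.sum(x > 1e-6)), "of", m, "grid points")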

Read more
Economics

A Proposal to Extend Expected Utility in a Quantum Probabilistic Framework

Expected utility theory (EUT) is widely used in economic theory. However, its subjective probability formulation, first elaborated by Savage, is linked to Ellsberg-like paradoxes and ambiguity aversion. This has led various scholars to work out non-Bayesian extensions of EUT that cope with its paradoxes and incorporate attitudes toward ambiguity. A variant of the Ellsberg paradox, recently proposed by Mark Machina and confirmed experimentally, challenges existing non-Bayesian models of decision-making under uncertainty. Relying on a decade of research that has successfully applied the formalism of quantum theory to model cognitive entities and fallacies of human reasoning, we put forward a non-Bayesian extension of EUT in which subjective probabilities are represented by quantum probabilities, while the preference relation between acts depends on the state of the situation that is the object of the decision. We show that the quantum theoretical framework enables the modeling of the Ellsberg and Machina paradoxes, as well as the representation of ambiguity and behavioral attitudes toward it. The theoretical framework presented here is a first step toward the development of a 'state-dependent non-Bayesian extension of EUT' and has potential applications in economic modeling.
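
To fix ideas about the quantum-probabilistic ingredient (a minimal sketch, not the authors' full state-dependent model), subjective probabilities can be computed by the Born rule: a cognitive state is a unit vector, an event is an orthogonal projector, and non-commuting projectors produce the order effects used to model ambiguity.

    import numpy as np

    psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # state of the decision situation

    def projector(v):
        v = v / np.linalg.norm(v)
        return np.outer(v, v)

    E = projector(np.array([1.0, 0.0]))            # one event
    F = projector(np.array([1.0, 1.0]))            # an incompatible event

    def prob(P, state):
        """Born rule: probability of event P in the given state."""
        return float(np.linalg.norm(P @ state) ** 2)

    print(prob(E, psi))                            # 0.3
    # Because E and F do not commute, the sequential probabilities differ:
    print(np.linalg.norm(F @ E @ psi) ** 2)        # "E then F" ~ 0.15
    print(np.linalg.norm(E @ F @ psi) ** 2)        # "F then E" ~ 0.48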

Read more
Economics

A Quantum-like Model of Selection Behavior

In this paper, we introduce a new model of selection behavior under risk that describes an essential cognitive process for comparing the values of objects and making a selection decision. The model is constructed with a quantum-like approach that employs the state representation specific to quantum theory, whose mathematical framework goes beyond classical probability theory. We show that this quantum approach can clearly explain famous anomalies for expected utility theory: the Ellsberg paradox, the Machina paradox, and the disparity between willingness to accept (WTA) and willingness to pay (WTP). Further, we point out that the model mathematically specifies the characteristics of the probability weighting function and the value function, which are basic concepts in prospect theory.
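
For reference, the two prospect-theory objects the paper characterizes have standard parametric forms (Tversky and Kahneman, 1992); the sketch below shows those standard forms only to fix ideas, since the quantum-like model derives their qualitative shape rather than assuming it.

    import numpy as np

    def weight(p, gamma=0.61):
        """Inverse-S probability weighting: overweights small p, underweights large p."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    def value(x, alpha=0.88, lam=2.25):
        """Reference-dependent value: concave for gains, steeper for losses."""
        x = np.asarray(x, dtype=float)
        return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** alpha)

    print(weight(0.01), weight(0.99))   # ~0.055 and ~0.91: inverse-S distortion
    print(value(100.0), value(-100.0))  # losses loom larger than gains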

Read more
Economics

A Rank-Based Approach to Zipf's Law

An Atlas model is a rank-based system of continuous semimartingales for which the steady-state values of the processes follow a power law, or Pareto distribution. For a power law, the log-log plot of these steady-state values versus rank is a straight line. Zipf's law is a power law for which the slope of this line is -1. In this note, rank-based conditions are found under which an Atlas model will follow Zipf's law. An advantage of this rank-based approach is that it provides information about the dynamics of systems that result in Zipf's law.
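
A toy simulation makes the rank-based mechanism concrete (illustrative parameters, not the note's general conditions): in a simple Atlas model where only the bottom-ranked particle receives drift, the steady-state slope of log size versus log rank is approximately -sigma^2/(2g), so choosing sigma^2 = 2g targets Zipf's law.

    import numpy as np

    rng = np.random.default_rng(0)
    n, g, dt, steps = 100, 1.0, 1e-3, 200_000
    sigma = np.sqrt(2.0 * g)                 # chosen so the steady-state slope is ~ -1

    x = np.zeros(n)                          # log sizes
    for _ in range(steps):
        drift = np.zeros(n)
        drift[np.argmin(x)] = n * g          # Atlas drift on the bottom particle only
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)

    ranked = np.sort(x)[::-1]                # log size by rank
    slope = np.polyfit(np.log(np.arange(1, n + 1)), ranked, 1)[0]
    print(f"log-log slope ~ {slope:.2f}")    # should approach -1 in steady state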

Read more
Economics

A Simple Algorithm for Solving Ramsey Optimal Policy with Exogenous Forcing Variables

This algorithm extends the Ljungqvist and Sargent (2012) algorithm for the Stackelberg dynamic game to dynamic stochastic general equilibrium models that include exogenous forcing variables. It is based on the Anderson, Hansen, McGrattan, and Sargent (1996) discounted augmented linear quadratic regulator, and it adds an intermediate step that solves a Sylvester equation. Forward-looking variables are also optimally anchored on the forcing variables. This simple algorithm relies only on pre-programmed routines for the Riccati equation, the Sylvester equation, and matrix inversion in Matlab and Scilab. A final step, using a change of basis, computes a vector autoregressive representation that includes the Ramsey optimal policy rule as a function of lagged observable variables when the exogenous forcing variables are not observable.
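
The routines the abstract lists have direct SciPy analogues; the sketch below (hypothetical matrices, not the paper's model) shows the two solver calls involved: a discrete algebraic Riccati solve for the discounted linear quadratic regulator, and a Sylvester solve of the kind used for the exogenous forcing block.

    import numpy as np
    from scipy.linalg import solve_discrete_are, solve_sylvester, inv

    # Hypothetical LQ problem: minimize sum of beta^t (x'Qx + u'Ru), x+ = Ax + Bu.
    beta = 0.99
    A = np.array([[1.0, 0.1], [0.0, 0.9]])
    B = np.array([[0.0], [1.0]])
    Q = np.eye(2)
    R = np.array([[0.5]])

    # Discounting is absorbed by scaling the transition matrices by sqrt(beta).
    As, Bs = np.sqrt(beta) * A, np.sqrt(beta) * B
    P = solve_discrete_are(As, Bs, Q, R)              # Riccati routine
    F = inv(R + Bs.T @ P @ Bs) @ (Bs.T @ P @ As)      # feedback rule u = -F x

    # Sylvester routine, M'X + XN = C; in the paper this kind of step anchors
    # forward-looking variables on the forcing variables (N and C hypothetical).
    M = As - Bs @ F
    N = np.array([[0.8]])
    C = np.ones((2, 1))
    X = solve_sylvester(M.T, N, C)
    print(F, X, sep="\n")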

Read more
Economics

A Simple Extension of Dematerialization Theory: Incorporation of Technical Progress and the Rebound Effect

Dematerialization is the reduction over time in the quantity of materials needed to produce something useful. Dematerialization fundamentally derives from ongoing increases in technical performance, but it can be counteracted by demand rebound - increases in usage because of the increased value (or decreased cost) that also results from increasing technical performance. A major question, then, is to what extent technical performance improvement can offset, and is offsetting, continuously increasing economic consumption. This paper contributes to answering this question by offering some simple quantitative extensions to the theory of dematerialization. The paper then empirically examines materials consumption trends, as well as cost trends, for a large set of materials and a few modern artifacts over the past decades. In each of the 57 cases examined, the particular combination of demand elasticity and the rate of technical performance improvement is not consistent with dematerialization. Overall, the theory extension and empirical examination indicate that no dematerialization is occurring, even in cases of information technology with rapid technical progress. Thus, a fully passive policy stance that relies on unfettered technological change is not supported by our results.
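
In the simplest rebound accounting consistent with this description (a stylized reconstruction, not the paper's exact equations), if materials per unit of service fall at the technical rate g while demand grows with elasticity eps as cost falls, total materials use grows at roughly (eps - 1) * g, so dematerialization requires eps < 1.

    def materials_growth_rate(g, eps):
        """Approximate net growth rate of total materials use: (eps - 1) * g."""
        return (eps - 1.0) * g

    # With elasticity above 1, faster technical progress increases materials use
    # (Jevons-style rebound); dematerialization requires eps < 1.
    for eps in (0.5, 1.0, 1.5):
        print(eps, materials_growth_rate(0.05, eps))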

Read more
Economics

A Statistical Model of Inequality

This paper develops a nonparametric statistical model of wealth distribution that imposes little structure on the fluctuations of household wealth. In this setting, we use new techniques to obtain a closed-form household-by-household characterization of the stable distribution of wealth and show that this distribution is shaped entirely by two factors - the reversion rates (a measure of cross-sectional mean reversion) and idiosyncratic volatilities of wealth across different ranked households. By estimating these factors, our model can exactly match the U.S. wealth distribution. This provides information about the current trajectory of inequality as well as estimates of the distributional effects of progressive capital taxes. We find evidence that the U.S. wealth distribution might be on a temporarily unstable trajectory, thus suggesting that further increases in top wealth shares are likely in the near future. For capital taxes, we find that a small tax levied on just 1% of households substantially reshapes the distribution of wealth and reduces inequality.
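
A hedged reconstruction of the rank-based logic (hypothetical parameters and a simplified gap formula, not the paper's estimates or exact characterization): in first-order rank-based models, the stable gap in log wealth between adjacent ranks is governed by the ratio of idiosyncratic variance to the rank's reversion rate, which produces Pareto-like tails.

    import numpy as np

    n = 1_000
    ranks = np.arange(1, n)
    alpha = 0.02 * np.ones(n - 1)          # reversion rates by rank (assumed)
    sigma = 0.20 * np.ones(n - 1)          # idiosyncratic volatilities (assumed)

    # Simplified stable-gap formula, up to constants the paper makes precise:
    gaps = sigma ** 2 / (2 * ranks * alpha)
    log_w = np.concatenate([[0.0], -np.cumsum(gaps)])   # log wealth by rank
    shares = np.exp(log_w) / np.exp(log_w).sum()
    print("top 1% wealth share:", shares[: n // 100].sum())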

Read more
Economics

A Stochastic Electricity Market Clearing Formulation with Consistent Pricing Properties

We argue that deterministic market clearing formulations introduce arbitrary distortions between day-ahead and expected real-time prices that bias economic incentives and block diversification. We extend and analyze the stochastic clearing formulation proposed by Pritchard et al. (2010), in which the social surplus function induces penalties between day-ahead and real-time quantities. We prove that the formulation yields price distortions that are bounded by the bid prices, and we show that adding a similar penalty term to transmission flows and phase angles ensures boundedness throughout the network. We prove that when the price distortions are zero, day-ahead quantities converge to a quantile of their real-time counterparts. The undesired effects of price distortions suggest that stochastic settings provide significant benefits over deterministic ones that go beyond social surplus improvements. We propose additional metrics to evaluate these benefits.
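
The quantile property has a compact newsvendor-style illustration (stylized, not the full network clearing model): choosing a day-ahead quantity to minimize the expected asymmetric penalty against the random real-time quantity lands on a quantile determined by the penalty rates.

    import numpy as np

    rng = np.random.default_rng(1)
    Q = rng.lognormal(mean=3.0, sigma=0.5, size=20_000)   # real-time scenarios
    c_over, c_under = 1.0, 3.0                            # penalty rates (assumed)

    def expected_penalty(q):
        return np.mean(c_over * np.maximum(q - Q, 0.0)
                       + c_under * np.maximum(Q - q, 0.0))

    grid = np.linspace(Q.min(), Q.max(), 500)
    q_star = grid[np.argmin([expected_penalty(q) for q in grid])]

    # The minimizer is the c_under / (c_over + c_under) quantile of Q.
    print(q_star, np.quantile(Q, c_under / (c_over + c_under)))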

Read more
Economics

A Theory of Market Efficiency

We introduce a mathematical theory called market connectivity that gives concrete ways to both measure the efficiency of markets and find inefficiencies in large markets. The theory leads to new methods for testing the famous efficient markets hypothesis that do not suffer from the joint-hypothesis problem that has plagued past work. Our theory suggests metrics that can be used to compare the efficiency of one market with another, to find inefficiencies that may be profitable to exploit, and to evaluate the impact of policy and regulations on market efficiency. A market's efficiency is tied to its ability to communicate information relevant to market participants. Market connectivity calculates the speed and reliability with which this communication is carried out via trade in the market. We model the market by a network called the trade network, which can be computed by recording transactions in the market over a fixed interval of time. The nodes of the network correspond to participants in the market. Every pair of nodes that trades in the market is connected by an edge that is weighted by the rate of trade, and associated with a vector that represents the type of item that is bought or sold. We evaluate the ability of the market to communicate by considering how it deals with shocks. A shock is a change in the beliefs of market participants about the value of the products that they trade. We compute the effect of every potential significant shock on trade in the market. We give mathematical definitions for a few concepts that measure the ability of the market to effectively dissipate shocks.
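
The trade network described above is straightforward to build from transaction records; the sketch below (hypothetical transactions, using networkx for the graph) weights each edge by the rate of trade over the observation window, attaches an item-type vector, and uses inverse trade rates as distances for a first-pass connectivity diagnostic.

    import numpy as np
    import networkx as nx

    transactions = [                    # (buyer, seller, item-type vector)
        ("alice", "bob",   [1.0, 0.0]),
        ("alice", "bob",   [1.0, 0.0]),
        ("bob",   "carol", [0.0, 1.0]),
    ]
    T = 1.0                             # length of the observation window

    G = nx.Graph()
    for buyer, seller, item in transactions:
        if G.has_edge(buyer, seller):
            G[buyer][seller]["rate"] += 1.0 / T
        else:
            G.add_edge(buyer, seller, rate=1.0 / T, item=np.array(item))

    # Faster trade = shorter distance: a crude proxy for how quickly a shock
    # to one participant's beliefs can propagate through the market.
    dist = dict(nx.all_pairs_dijkstra_path_length(
        G, weight=lambda u, v, d: 1.0 / d["rate"]))
    print(nx.is_connected(G), dist["alice"]["carol"])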

Read more
