Featured Researches

Econometrics

Identifying Different Definitions of Future in the Assessment of Future Economic Conditions: Application of PU Learning and Text Mining

The Economy Watcher Survey, a market survey published by the Japanese government, contains assessments of current and future economic conditions by people from various fields. Although the survey offers policymakers insights for economic policy, it does not provide a clear definition of the word "future" in "future economic conditions." Hence, the assessments respondents provide are based solely on their own interpretations of the meaning of "future." This motivated us to reveal the different interpretations of the future underlying judgments of future economic conditions by applying weakly supervised learning and text mining. We separate the assessments of future economic conditions into those concerning the near future and those concerning the distant future using learning from positive and unlabeled data (PU learning). Because the dataset includes data from several periods, we devised a new architecture, based on the idea of multi-task learning, that enables neural networks to conduct PU learning and to learn a classifier efficiently. Our empirical analysis confirmed that the proposed method can separate the future economic conditions, and we interpreted the classification results to obtain insights for policymaking.
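As a concrete reference point, here is a minimal sketch of PU learning with the non-negative risk estimator of Kiryo et al. (2017), fit with a plain logistic model on synthetic data. The class prior, the data-generating process, and all hyperparameters are illustrative assumptions, and the paper's multi-task architecture for multiple periods is not reproduced here.

```python
# A minimal non-negative PU learning (nnPU) sketch with a logistic model.
# Everything below (class prior pi, synthetic data, learning rate) is an
# illustrative assumption, not the paper's multi-task architecture.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: positives ~ N(+1, 1), negatives ~ N(-1, 1) in 2D.
pi = 0.4                                   # assumed class prior P(y = +1)
n_p, n_u = 200, 1000                       # labeled-positive / unlabeled sizes
X_p = rng.normal(+1.0, 1.0, size=(n_p, 2))
y_u = rng.random(n_u) < pi                 # latent labels of unlabeled points
X_u = np.where(y_u[:, None], rng.normal(+1.0, 1.0, (n_u, 2)),
                             rng.normal(-1.0, 1.0, (n_u, 2)))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = np.zeros(2), 0.0, 0.1
for step in range(2000):
    gp, gu = X_p @ w + b, X_u @ w + b
    # Logistic surrogate losses: l(g,+1) = log(1+e^-g), l(g,-1) = log(1+e^g).
    Rp_neg = pi * np.mean(np.logaddexp(0.0, gp))   # pi * E_p[l(g,-1)]
    Ru_neg = np.mean(np.logaddexp(0.0, gu))        # E_u[l(g,-1)]
    neg_risk = Ru_neg - Rp_neg                     # estimated negative-class risk
    if neg_risk >= 0:
        # Descend on pi*E_p[l(g,+1)] + neg_risk.
        gw = (pi * (-sigmoid(-gp))[:, None] * X_p).mean(0) \
           + (sigmoid(gu)[:, None] * X_u).mean(0) \
           - (pi * sigmoid(gp)[:, None] * X_p).mean(0)
        gb = pi * (-sigmoid(-gp)).mean() + sigmoid(gu).mean() \
           - pi * sigmoid(gp).mean()
    else:
        # Non-negative correction: descend on -neg_risk, as in nnPU.
        gw = -((sigmoid(gu)[:, None] * X_u).mean(0)
               - (pi * sigmoid(gp)[:, None] * X_p).mean(0))
        gb = -(sigmoid(gu).mean() - pi * sigmoid(gp).mean())
    w -= lr * gw
    b -= lr * gb

acc = np.mean((sigmoid(X_u @ w + b) > 0.5) == y_u)
print(f"accuracy on the unlabeled pool: {acc:.3f}")
```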

Read more
Econometrics

Identifying Preferences when Markets are Incomplete

The paper shows that preferences in heterogeneous-agent economies are identifiable without taking a stance on the underlying source of market incompleteness and heterogeneity. While the proposed restrictions deliver bounds for these parameters, distributional information, such as the extensive margin of financially constrained households, can sharpen identification. The paper provides estimates using Spanish data and shows that accounting for the extensive margin can help bridge the gap between macro and micro estimates of relative risk aversion and the Frisch elasticity of labor supply. This has implications for measuring distortions in bond and equity markets and for predicting the equity premium.

Read more
Econometrics

Identifying and Estimating Perceived Returns to Binary Investments

I describe a method for estimating agents' perceived returns to investments that relies on cross-sectional data containing binary choices and prices, where prices may be imperfectly known to agents. This method identifies the scale of perceived returns by assuming agent knowledge of an identity that relates profits, revenues, and costs rather than by eliciting or assuming agent beliefs about structural parameters that are estimated by researchers. With this assumption, modest adjustments to standard binary choice estimators enable consistent estimation of perceived returns when using price instruments that are uncorrelated with unobserved determinants of agents' price misperceptions as well as other unobserved determinants of their perceived returns. I demonstrate the method, and the importance of using price variation that is known to agents, in a series of data simulations.
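To see why price variation known to agents matters, here is a stylized simulation (not the paper's estimator): agents respond to a perceived price sharing only its systematic component z with the recorded price, so a probit on recorded prices is attenuated while a probit on the z-driven variation recovers the perceived-price response. The DGP and every parameter value are made-up assumptions.

```python
# Stylized simulation: estimating responses to *perceived* prices requires
# price variation that agents actually know. All values are assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50_000

z = rng.normal(size=n)             # price shifter publicly known to agents
v = rng.normal(size=n)             # posted-price component agents do not track
p = z + v                          # price recorded in the researcher's data
e = rng.normal(scale=0.6, size=n)  # idiosyncratic misperception
p_star = z + e                     # the price agents believe they face

# Invest if perceived return exceeds zero; u is scaled so the composite
# error (u - e) has unit variance, making probit coefficients comparable.
u = rng.normal(scale=0.8, size=n)
invest = (0.5 - p_star + u > 0).astype(float)

# Naive probit on the recorded price: attenuated, since agents never
# responded to the component v.
naive = sm.Probit(invest, sm.add_constant(p)).fit(disp=0)

# Probit on the known price variation z: recovers the perceived-price
# response of -1 (up to the usual probit scale normalization).
known = sm.Probit(invest, sm.add_constant(z)).fit(disp=0)

print("naive price coefficient:     ", round(naive.params[1], 3))
print("known-variation coefficient: ", round(known.params[1], 3))
```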

Read more
Econometrics

Identifying causal channels of policy reforms with multiple treatments and different types of selection

We study the identification of channels of policy reforms with multiple treatments and different types of selection for each treatment. We disentangle reform effects into policy effects, selection effects, and time effects under the assumptions of conditional independence, common trends, and an additional exclusion restriction on the non-treated. Furthermore, we show the identification of direct and indirect policy effects after imposing additional sequential conditional independence assumptions on mediating variables. We illustrate the approach using the German reform of the allocation system of vocational training for unemployed persons. The reform changed the allocation of training from a mandatory system to a voluntary voucher system. Simultaneously, the selection criteria for participants changed, and the reform altered the composition of course types. We consider the course composition a mediator of the policy reform and show that the empirical evidence from previous studies reverses once the course composition is taken into account. This has important implications for policy conclusions.
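The channel decomposition can be made concrete with a toy g-computation exercise under sequential conditional independence. The data-generating process below is invented purely for illustration and has nothing to do with the German training data.

```python
# Toy g-computation split of a reform effect into a direct effect and an
# indirect effect via a mediator (the "course composition"). Illustrative DGP.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

def mediator(d, noise):             # P(course type = 1) depends on the policy d
    return (0.3 + 0.4 * d + noise > 0.5).astype(float)

def outcome(d, m, u):               # employment outcome
    return 0.2 * d + 0.5 * m + u

noise = rng.normal(scale=0.1, size=n)
u = rng.normal(scale=1.0, size=n)

# Potential outcomes Y(d, M(d')) simulated directly.
y1_m1 = outcome(1, mediator(1, noise), u)   # Y(1, M(1))
y1_m0 = outcome(1, mediator(0, noise), u)   # Y(1, M(0))
y0_m0 = outcome(0, mediator(0, noise), u)   # Y(0, M(0))

total    = np.mean(y1_m1 - y0_m0)
direct   = np.mean(y1_m0 - y0_m0)           # policy effect, courses held fixed
indirect = np.mean(y1_m1 - y1_m0)           # effect via the changed course mix

print(f"total {total:.3f} = direct {direct:.3f} + indirect {indirect:.3f}")
```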

Read more
Econometrics

Identifying the effect of a mis-classified, binary, endogenous regressor

This paper studies identification of the effect of a mis-classified, binary, endogenous regressor when a discrete-valued instrumental variable is available. We begin by showing that the only existing point identification result for this model is incorrect. We go on to derive the sharp identified set under mean independence assumptions for the instrument and measurement error. The resulting bounds are novel and informative but fail to point identify the effect of interest. This motivates us to consider alternative, slightly stronger assumptions: we show that adding second- and third-moment independence assumptions suffices to point identify the model.
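For intuition on why misclassification is a problem here, the following textbook-style simulation (with made-up misclassification rates a0, a1) shows that the standard Wald/IV estimand is inflated to beta / (1 - a0 - a1) when the binary regressor is mis-classified. The paper's sharp bounds themselves are not implemented.

```python
# Why a mis-classified, binary, endogenous regressor breaks standard IV:
# the Wald estimand becomes beta / (1 - a0 - a1). Illustrative DGP only.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
beta, a0, a1 = 1.0, 0.1, 0.15

z = rng.integers(0, 2, n)                       # binary instrument
u = rng.normal(size=n)
t_true = (0.2 + 0.4 * z + 0.5 * u + rng.normal(size=n) > 0.5).astype(int)
y = beta * t_true + u                           # endogenous: u enters both

# Misclassification independent of everything else given t_true.
flip0 = rng.random(n) < a0                      # false positives
flip1 = rng.random(n) < a1                      # false negatives
t_obs = np.where(t_true == 1, 1 - flip1.astype(int), flip0.astype(int))

wald = (y[z == 1].mean() - y[z == 0].mean()) / \
       (t_obs[z == 1].mean() - t_obs[z == 0].mean())

print(f"true beta: {beta}")
print(f"Wald with mis-classified T: {wald:.3f} "
      f"(theory: beta/(1-a0-a1) = {beta / (1 - a0 - a1):.3f})")
print(f"corrected when (a0, a1) known: {wald * (1 - a0 - a1):.3f}")
```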

Read more
Econometrics

Impact of Congestion Charge and Minimum Wage on TNCs: A Case Study for San Francisco

This paper describes the impact on transportation network companies (TNCs) of imposing a congestion charge and a driver minimum wage. The impact is assessed using a market equilibrium model to calculate the changes in the number of passenger trips, trip fares, the number of drivers employed, platform profit, the number of TNC vehicles, and city revenue. Two charges are considered: (a) a charge per TNC trip, similar to an excise tax, and (b) a charge per vehicle operating hour (whether or not the vehicle carries a passenger), similar to a road tax. Both charges reduce the number of TNC trips, but this reduction is limited by the wage floor, and the resulting reduction in the number of TNC vehicles is not significant. The time-based charge is preferable to the trip-based charge since, by penalizing idle vehicle time, it increases vehicle occupancy. In a case study for San Francisco, the time-based charge is found to be Pareto superior to the trip-based charge, as it yields higher passenger surplus, higher platform profits, and higher tax revenue for the city.
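A toy version of the occupancy mechanism can be computed directly. In the stylized model below, the iso-elastic demand curve, the waiting-time penalty that shrinks demand as occupancy rises, and every parameter value are invented for illustration; it is not the paper's calibrated San Francisco model, only a sketch of how a time-based charge penalizes idle vehicle-hours.

```python
# Toy comparison of a per-trip charge vs. a per-vehicle-hour charge for a
# ride-hail platform. All functional forms and parameters are assumptions.
import numpy as np

A, eps, tau, w = 100.0, 1.5, 0.25, 30.0  # demand scale/elasticity, hrs/trip, wage floor

def platform_optimum(t_trip=0.0, h_hour=0.0):
    """Grid-search the platform's fare f and target occupancy rho."""
    best = None
    for f in np.linspace(5.0, 60.0, 400):
        for rho in np.linspace(0.05, 0.95, 91):
            q = A * f ** (-eps) * (1.0 - rho)  # trips/hr; high rho -> long waits
            hours = q * tau / rho              # vehicle-hours needed
            profit = f * q - (w + h_hour) * hours - t_trip * q
            if best is None or profit > best[0]:
                city = t_trip * q + h_hour * hours
                best = (profit, f, rho, q, city)
    return best

for label, kw in [("no charge", {}),
                  ("per-trip charge $3", {"t_trip": 3.0}),
                  ("per-hour charge $6", {"h_hour": 6.0})]:
    profit, f, rho, q, city = platform_optimum(**kw)
    print(f"{label:20s} fare={f:5.1f} occupancy={rho:.2f} trips/hr={q:5.1f} "
          f"city rev/hr={city:5.1f} profit/hr={profit:6.1f}")
```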

Read more
Econometrics

Incentive-Compatible Critical Values

Statistical hypothesis tests are a cornerstone of scientific research. The tests are informative when their size is properly controlled, so that the frequency of rejecting true null hypotheses (type I error) stays below a prespecified nominal level. Publication bias, however, distorts test sizes. Since scientists can typically only publish results that reject the null hypothesis, they have an incentive to continue conducting studies until attaining rejection. Such p-hacking takes many forms, from collecting additional data to examining multiple regression specifications, all in search of statistical significance. The process inflates test sizes above their nominal levels because the critical values used to determine rejection assume that test statistics are constructed from a single study, abstracting from p-hacking. This paper addresses the problem by constructing critical values that are compatible with scientists' behavior given their incentives. We assume that researchers conduct studies until finding a test statistic that exceeds the critical value, or until the benefit from conducting an extra study falls below the cost. We then solve for the incentive-compatible critical value (ICCV). When the ICCV is used to determine rejection, readers can be confident that size is controlled at the desired significance level and that the researcher's response to the incentives delineated by the critical value is accounted for. Since they allow researchers to search for significance across multiple studies, ICCVs are larger than classical critical values. Yet, for a broad range of researcher behaviors and beliefs, ICCVs lie in a fairly narrow range.
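For a sense of magnitudes, here is a deliberately simplified calculation: if a researcher runs at most K independent studies and publishes the first rejection, controlling overall size at alpha pins down a per-study critical value. The fixed-K, independent-studies setup is an assumption of this sketch, not the paper's optimal-stopping model.

```python
# Simplified incentive-adjusted critical values: with up to K independent
# studies and report-the-first-rejection behavior, overall size is
# 1 - (1 - q)^K, so solve for the per-study rejection probability q.
from scipy.stats import norm

alpha = 0.05
for K in (1, 2, 5, 10):
    q = 1.0 - (1.0 - alpha) ** (1.0 / K)  # per-study size controlling overall size
    c = norm.ppf(1.0 - q / 2.0)           # two-sided z critical value
    print(f"K={K:2d}: per-study size {q:.4f}, critical value {c:.3f}")
```

Consistent with the abstract, the adjusted critical values exceed the classical 1.96 but stay in a fairly narrow range (roughly 2.2 to 2.8 for K up to 10).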

Read more
Econometrics

Inference by Stochastic Optimization: A Free-Lunch Bootstrap

Assessing sampling uncertainty in extremum estimation can be challenging when the asymptotic variance is not analytically tractable. Bootstrap inference offers a feasible solution but can be computationally costly, especially when the model is complex. This paper uses the iterates of a specially designed stochastic optimization algorithm as draws from which both point estimates and bootstrap standard errors can be computed in a single run. The draws are generated by the gradient and Hessian computed from batches of data that are resampled at each iteration. We show that these draws yield consistent estimates and asymptotically valid frequentist inference for a large class of regular problems. The algorithm provides accurate standard errors at low computational cost in simulation examples and empirical applications. The draws from the algorithm also provide a convenient way to detect data irregularities.
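The mechanics are easiest to see in a model where a single Newton step is exact. For OLS the objective is quadratic, so a full Newton step on a resampled batch jumps straight to that resample's estimate, and the iterates below are literally bootstrap draws. The DGP and tuning choices are illustrative only, not the paper's general algorithm.

```python
# Resampled Newton iterates as bootstrap draws, in the OLS special case
# where one Newton step is exact. Illustrative DGP and settings.
import numpy as np

rng = np.random.default_rng(4)
n, beta = 500, np.array([1.0, -2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ beta + rng.normal(size=n)

theta, draws = np.zeros(2), []
for k in range(1_000):
    idx = rng.integers(0, n, n)             # resample the batch each iteration
    Xb, yb = X[idx], y[idx]
    grad = Xb.T @ (Xb @ theta - yb) / n     # gradient of the resampled objective
    hess = Xb.T @ Xb / n                    # Hessian of the resampled objective
    theta = theta - np.linalg.solve(hess, grad)
    if k >= 100:                            # burn-in; not needed here since the
        draws.append(theta)                 # step is exact, kept for the recipe

draws = np.array(draws)
print("point estimate:", draws.mean(axis=0).round(3))
print("bootstrap SEs: ", draws.std(axis=0).round(3))
# Analytic SEs for comparison (error variance is 1 in this DGP).
print("analytic SEs:  ", np.sqrt(np.diag(np.linalg.inv(X.T @ X))).round(3))
```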

Read more
Econometrics

Inference for Large-Scale Linear Systems with Known Coefficients

This paper considers the problem of testing whether there exists a non-negative solution to a possibly under-determined system of linear equations with known coefficients. This hypothesis testing problem arises naturally in a number of settings, including random coefficient, treatment effect, and discrete choice models, as well as a class of linear programming problems. As a first contribution, we obtain a novel geometric characterization of the null hypothesis in terms of identified parameters satisfying an infinite set of inequality restrictions. Using this characterization, we devise a test that requires solving only linear programs for its implementation and thus remains computationally feasible in the high-dimensional applications that motivate our analysis. The asymptotic size of the proposed test is shown to be at most the nominal level, uniformly over a large class of distributions that permits the number of linear equations to grow with the sample size.
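The computational core of the null hypothesis, whether Ax = b admits a solution with x >= 0, is a plain linear-programming feasibility problem. The sketch below checks it with scipy on a toy system; the paper's test statistic and critical values are not implemented.

```python
# Feasibility of Ax = b with x >= 0, checked by linear programming.
# Toy matrices; the paper's inference procedure is not implemented.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])    # under-determined: 2 equations, 3 unknowns
b = np.array([4.0, 3.0])

# Minimize 0 subject to Ax = b, x >= 0: the LP succeeds if and only if a
# non-negative solution exists.
res = linprog(c=np.zeros(A.shape[1]), A_eq=A, b_eq=b,
              bounds=[(0, None)] * A.shape[1], method="highs")
print("non-negative solution exists:", res.status == 0)
if res.status == 0:
    print("one such solution:", res.x.round(3))
```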

Read more
Econometrics

Inference for Linear Conditional Moment Inequalities

We consider inference based on linear conditional moment inequalities, which arise in a wide variety of economic applications, including many structural models. We show that linear conditional structure greatly simplifies confidence set construction, allowing for computationally tractable projection inference in settings with nuisance parameters. Next, we derive least favorable critical values that avoid conservativeness due to projection. Finally, we introduce a conditional inference approach which ensures a strong form of insensitivity to slack moments, as well as a hybrid technique which combines the least favorable and conditional methods. Our conditional and hybrid approaches are new even in settings without nuisance parameters. We find good performance in simulations based on Wollmann (2018), especially for the hybrid approach.
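As a reference point for one ingredient, least-favorable critical values for a max statistic over correlated moments can be simulated directly: under the least favorable null all moments bind, so the critical value is a quantile of the max of a multivariate normal vector. The correlation matrix below is arbitrary, and the paper's projection, conditional, and hybrid refinements are not implemented.

```python
# Monte Carlo least-favorable critical value for max-of-moments statistics.
# Omega is an arbitrary example correlation matrix of studentized moments.
import numpy as np

rng = np.random.default_rng(5)
alpha = 0.05

Omega = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.5],
                  [0.2, 0.5, 1.0]])

Z = rng.multivariate_normal(np.zeros(3), Omega, size=200_000)
stat = Z.max(axis=1)                 # least favorable: all moments bind
crit = np.quantile(stat, 1.0 - alpha)
print(f"least-favorable critical value: {crit:.3f}")
# Reject if max_j sqrt(n) * m_bar_j / sigma_j exceeds `crit`.
```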

Read more
