Featured Research

Economics

Cooperation under Incomplete Information on the Discount Factors

In repeated games, cooperation is possible in equilibrium only if players are sufficiently patient, so that long-term gains from cooperation outweigh short-term gains from deviation. What happens if the players have incomplete information regarding each other's discount factors? In this paper we look at repeated games in which each player has incomplete information regarding the other player's discount factor, and ask when full cooperation can arise in equilibrium. We provide necessary and sufficient conditions for full cooperation in an equilibrium composed of grim trigger strategies, and characterize the states of the world in which full cooperation occurs. We then ask whether these "cooperation events" are close to those of the complete information case when the information on the other player's discount factor is "almost" complete.
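As a minimal numerical illustration of the patience condition referred to above (a standard prisoner's dilemma with assumed payoffs, not the paper's incomplete-information model), grim trigger strategies sustain cooperation exactly when the discount factor is at least (T - R)/(T - P):

```python
# Minimal sketch (assumed payoffs, not the paper's model): grim-trigger
# cooperation threshold in a standard prisoner's dilemma.
R, T, P = 3.0, 5.0, 1.0  # hypothetical payoffs: cooperation, temptation, punishment

def cooperation_sustainable(delta, R=R, T=T, P=P):
    """Grim trigger sustains cooperation iff the discounted value of
    cooperating forever, R / (1 - delta), is at least the value of a
    one-shot deviation followed by permanent punishment,
    T + delta * P / (1 - delta)."""
    return R / (1 - delta) >= T + delta * P / (1 - delta)

threshold = (T - R) / (T - P)  # algebraic rearrangement of the inequality
print(f"cooperation requires delta >= {threshold:.2f}")
print(cooperation_sustainable(0.4), cooperation_sustainable(0.6))  # False, True
```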

Read more
Economics

Corruption-free scheme of entering into contract: mathematical model

The main purpose of this paper is to formalize the modelling process, analysis and mathematical definition of corruption when entering into a contract between a principal agent and producers. The formulation of the problem and the definition of concepts are given for the general case. For definiteness, all calculations and formulas are presented for the case of three producers, one principal agent and one intermediary. An economic analysis of corruption allows us to build a mathematical model of the interaction between agents. The problem of distributing financial resources in a contract with a corrupt intermediary is considered. We then propose conditions for the emergence of corruption and its possible consequences. Optimal corruption-free schemes for distributing financial resources in a contract are derived, where the principal agent's choice is limited first only by asymmetric information and then also by external influences. Numerical examples suggesting optimal corruption-free behaviour of the agents are presented.

Read more
Economics

Cross Ranking of Cities and Regions: Population vs. Income

This paper explores the relationship between the inner economic structure of communities and their population distribution through a rank-rank analysis of official data on Italian cities, using two techniques drawn from statistical physics. The analysis is performed both at a global (national) and at a more local (regional) level in order to distinguish "macro" and "micro" aspects. First, the rank-size rule is found not to be a standard power law, as in many other studies, but a doubly decreasing power law. Next, the Kendall and Spearman rank correlation coefficients, which measure pair concordance and the correlation between fluctuations in two rankings, respectively, much as a correlation function does in thermodynamics, are calculated to detect any rank correlation between demography and wealth. Results show not only global disparities for the whole (country) set, but also (regional) disparities, when comparing the number of cities in regions, the number of inhabitants in cities and in regions, as well as the aggregated tax income of cities and of regions. Different outliers are pointed out and justified. Interestingly, two classes of cities and two classes of regions in the country are found. "Common sense" social, political, and economic considerations sustain the findings. More importantly, the methods are shown to distinguish communities very clearly when specific criteria are numerically sound. A specific model for the findings, i.e. for the doubly decreasing power law and the two-phase system, is presented, based on statistical theory, e.g., urn filling. The model ideas can be expected to hold when similar rank-relationship features are observed in other fields. It is emphasized that this analysis is more informative than a Pearson value-value correlation analysis.
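As a schematic illustration of the rank-rank step (with invented figures, not the official Italian data used in the paper), the Kendall and Spearman coefficients can be computed directly from a city-level population series and an aggregated-income series:

```python
# Sketch with hypothetical city data (not the Italian official data used in
# the paper): rank cities by population and by aggregated tax income, then
# measure pair concordance (Kendall tau) and rank correlation (Spearman rho).
from scipy.stats import kendalltau, spearmanr

population = [870000, 420000, 390000, 250000, 96000, 61000]  # assumed values
income     = [31000, 12500, 15800, 7200, 4100, 2300]         # assumed aggregate tax income

tau, tau_p = kendalltau(population, income)
rho, rho_p = spearmanr(population, income)
print(f"Kendall tau  = {tau:.2f} (p = {tau_p:.3f})")
print(f"Spearman rho = {rho:.2f} (p = {rho_p:.3f})")
```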

Read more
Economics

Data-based Automatic Discretization of Nonparametric Distributions

Although using non-Gaussian distributions in economic models has become increasingly popular, currently there is no systematic way for calibrating a discrete distribution from the data without imposing parametric assumptions. This paper proposes a simple nonparametric calibration method based on the Golub-Welsch algorithm for Gaussian quadrature. Application to an optimal portfolio problem suggests that assuming Gaussian instead of nonparametric shocks leads to up to 17% overweighting in the stock portfolio because the investor underestimates the probability of crashes.
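The following is a rough sketch of the kind of procedure described, under our own assumptions rather than the paper's exact implementation: the recurrence coefficients of the data's empirical measure are obtained with a discrete Stieltjes procedure, and the Golub-Welsch step (eigendecomposition of the Jacobi matrix) then yields discretization nodes and weights whose low-order moments match the data.

```python
# Sketch (our reading of a Golub-Welsch-based discretization, not the paper's
# exact method): build an n-point discrete approximation of an empirical
# distribution whose first 2n moments match the sample.
import numpy as np

def discretize(data, n):
    x = np.asarray(data, dtype=float)
    w = np.full(x.size, 1.0 / x.size)            # empirical measure weights

    # Discrete Stieltjes procedure: three-term recurrence coefficients
    # alpha_k, beta_k of the polynomials orthogonal w.r.t. the data.
    alpha, beta = np.zeros(n), np.zeros(n)
    p_prev, p_curr = np.zeros_like(x), np.ones_like(x)
    beta[0] = w.sum()                             # = 1 for an empirical measure
    for k in range(n):
        norm = np.sum(w * p_curr**2)
        alpha[k] = np.sum(w * x * p_curr**2) / norm
        if k > 0:
            beta[k] = norm / norm_prev
        p_next = (x - alpha[k]) * p_curr - (beta[k] if k > 0 else 0.0) * p_prev
        p_prev, p_curr, norm_prev = p_curr, p_next, norm

    # Golub-Welsch: eigenvalues of the Jacobi matrix are the nodes,
    # squared first eigenvector components (times beta_0) are the weights.
    J = np.diag(alpha) + np.diag(np.sqrt(beta[1:]), 1) + np.diag(np.sqrt(beta[1:]), -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = beta[0] * vecs[0, :] ** 2
    return nodes, weights

# Usage: a 5-point discretization of simulated fat-tailed (non-Gaussian) shocks.
rng = np.random.default_rng(0)
sample = rng.standard_t(df=4, size=10_000) * 0.15
nodes, weights = discretize(sample, 5)
print(nodes, weights, weights.sum())              # weights sum to ~1
```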

Read more
Economics

Decision structure of risky choice

As is well known, there is a controversy between economists and psychologists about decision making under risk. We discuss how to build a unified theory of risky choice that could explain both compensatory and non-compensatory theories. For risky choice, we argue that, given limited cognitive ability, people do not build a continuous and accurate subjective probability scale, but only a few ordered concepts, such as small, middle and large probability. People make decisions based on information, experience, imagination and other inputs. Because these inputs are so vast, people have to prepare strategies in advance; that is, people use different strategies when facing different situations. The distributions of these inputs correspond to different decision structures, and decision making, more precisely, is a process of simplifying the decision structure. However, this simplification process is not fixed: it can follow different paths when the same problem is faced repeatedly, which is why preference reversals so often occur. The most efficient ways to simplify the decision structure are calculating expected value or deciding on the basis of one or two dimensions. We also argue that deliberation time has at least four components: substitution time, first-order time, second-order time and calculation time. The decision structure can also simply explain well-known paradoxes and anomalies. JEL Codes: C10, D03, D81
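A toy contrast between the two simplification routes mentioned above, full expected-value calculation versus a one-dimension heuristic, can be made concrete with two hypothetical gambles (our construction, not the paper's formalism):

```python
# Toy illustration (our own example): full expected-value calculation versus a
# one-dimension heuristic that compares only the best outcomes.
gamble_a = [(0.80, 40.0), (0.20, 0.0)]    # (probability, payoff) pairs
gamble_b = [(0.25, 100.0), (0.75, 0.0)]

def expected_value(gamble):
    return sum(p * x for p, x in gamble)

def best_outcome(gamble):
    return max(x for _, x in gamble)

print("EV rule picks:          ", "A" if expected_value(gamble_a) > expected_value(gamble_b) else "B")
print("best-outcome rule picks:", "A" if best_outcome(gamble_a) > best_outcome(gamble_b) else "B")
```

The two rules rank the gambles differently, which is the kind of strategy-dependent reversal the abstract alludes to.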

Read more
Economics

Deep Learning in (and of) Agent-Based Models: A Prospectus

A very timely issue for economic agent-based models (ABMs) is their empirical estimation. This paper describes a line of research that could resolve the issue by using machine learning techniques, in particular multi-layer artificial neural networks (ANNs), or so-called Deep Nets. The seminal contribution by Hinton et al. (2006) introduced a fast and efficient training algorithm called Deep Learning, and there have been major breakthroughs in machine learning ever since. Economics has not yet benefited from these developments, and we therefore believe that now is the right time to apply Deep Learning and multi-layered neural networks to agent-based models in economics.
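One way to read the proposal, sketched here under our own assumptions rather than as the authors' method, is to train a multi-layer network as a surrogate estimator: simulate the ABM over many parameter draws, summarize each run, and let the network learn the mapping from summaries back to parameters.

```python
# Sketch of a neural-network surrogate for ABM estimation (our own toy setup,
# not the authors' implementation): simulate a trivial one-parameter "ABM",
# summarize each run, and train a multi-layer net to recover the parameter.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def toy_abm(herding, n_agents=100, n_steps=200):
    """Hypothetical agent model: agents imitate the average action with
    intensity `herding`, plus idiosyncratic noise."""
    actions = rng.normal(size=n_agents)
    path = []
    for _ in range(n_steps):
        actions = herding * actions.mean() + (1 - herding) * rng.normal(size=n_agents)
        path.append(actions.mean())
    return np.array(path)

def summarize(path):
    # Simple summary statistics of the simulated aggregate series.
    return [path.std(), np.abs(np.diff(path)).mean(),
            np.corrcoef(path[:-1], path[1:])[0, 1]]

# Training data: parameter draws and the summaries of their simulated runs.
params = rng.uniform(0.0, 0.9, size=400)
X = np.array([summarize(toy_abm(p)) for p in params])

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X, params)

# "Estimation": feed the summary of an observed (here: held-out simulated) run.
observed = summarize(toy_abm(0.6))
print("estimated herding parameter:", round(net.predict([observed])[0], 2))
```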

Read more
Economics

Demonetization and Its Impact on Employment in India

On November 08, the sudden announcement demonetizing high-denomination currency notes sent tremors all across the country. Given the timing and the socioeconomic and political repercussions of the decision, many termed it a financial emergency. Given the high proportion of these notes in circulation, over 86 percent, most economic activities, particularly employment, were affected in a big way. Political parties, however, seemed divided on the issue: those in favor of the decision feel it will help curb the galloping size of black money, fake currency, cross-border terrorism, etc. In sharp contrast, the others believe it is a purely misleading decision, based on little or no understanding of the black economy, and hence is politically motivated in the wake of the assembly elections due in a couple of states.

Read more
Economics

Dependence of technological improvement on artifact interactions

Empirical research has shown that performance improvement in many different technological domains occurs exponentially, but with widely varying improvement rates. What causes some technologies to improve faster than others? Previous quantitative modeling research has identified artifact interactions, where a design change in one component influences others, as an important determinant of improvement rates. The models predict that the improvement rate for a domain is proportional to the inverse of the domain interaction parameter. However, no empirical research has previously tested the dependence of improvement rates on artifact interactions. A challenge to testing this dependence is that any method for measuring interactions has to be applicable to a wide variety of technologies. Here we propose a patent-based method that is both technology-domain-agnostic and less costly than alternative methods. We use textual content from patent sets in 27 domains to find the influence of interactions on improvement rates. Qualitative analysis identified six specific keywords that signal artifact interactions. Patent sets from each domain were then examined to determine the total count of these six keywords in each domain, giving an estimate of artifact interactions in each domain. Improvement rates are found to be positively correlated with the inverse of the total keyword count, with a correlation coefficient of +0.56 and a p-value of 0.002. The empirical results agree with the model predictions and support the suggestion that domains with a higher number of artifact interactions (higher complexity) improve at a slower pace.
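A schematic version of the measurement step, with made-up numbers and without the paper's actual domains or keywords, is shown below:

```python
# Sketch with invented numbers (the paper's 27 domains and actual keyword
# counts are not reproduced here): correlate annual improvement rates with
# the inverse of interaction-keyword counts extracted from patent text.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-domain data: total count of interaction keywords in the
# domain's patent set, and the domain's estimated annual improvement rate.
keyword_counts   = np.array([1200, 450, 300, 2200, 800, 150, 950, 600])
improvement_rate = np.array([0.05, 0.18, 0.25, 0.03, 0.09, 0.35, 0.07, 0.12])

r, p = pearsonr(1.0 / keyword_counts, improvement_rate)
print(f"correlation(improvement rate, 1/keyword count) = {r:.2f}, p = {p:.3f}")
```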

Read more
Economics

Dis-embedded Openness: Inequalities in European Economic Integration at the Sectoral Level

The process of European integration resulted in a marked increase in transnational economic flows, yet regional inequalities along many developmental indicators remain. We analyze the unevenness of European economies with respect to the embedding of export sectors in upstream domestic flows, and their dependency on dominant export partners. We use the WIOD data set of sectoral flows for the period 1995-2011 for 24 European countries. We find that East European economies were significantly more likely to experience increasing unevenness and dependency as their openness increased, whereas the core countries of Europe managed to decrease their unevenness while increasing their openness. Nevertheless, the trajectories of change for each country show that East European countries are also reaching a turning point, switching either to a path similar to the core or to a retrograde path with decreasing openness. We analyze our data using pooled time series models and case studies of country trajectories.
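A minimal sketch of a pooled time-series specification in this spirit (hypothetical panel data and variable names, not the authors' WIOD-based specification) could look as follows:

```python
# Sketch (not the authors' specification): a pooled time-series regression of
# export-sector unevenness on openness, with an East/West interaction, using
# hypothetical panel data in place of the WIOD-derived variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
countries = {"AT": 0, "DE": 0, "CZ": 1, "HU": 1, "PL": 1}   # placeholder subset; 1 = East
rows = []
for country, east in countries.items():
    for year in range(1995, 2012):
        openness = rng.uniform(0.2, 0.8)
        unevenness = 0.3 + (0.4 if east else -0.2) * openness + rng.normal(0, 0.05)
        rows.append({"country": country, "year": year, "east": east,
                     "openness": openness, "unevenness": unevenness})
panel = pd.DataFrame(rows)

# Pooled OLS: the openness x East interaction captures the divergent pattern.
model = smf.ols("unevenness ~ openness * east", data=panel).fit()
print(model.params[["openness", "openness:east"]])
```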

Read more
Economics

Disruptive firms

This study proposes the concept of disruptive firms: firms with market leadership that deliberately introduce new and improved generations of durable goods that destroy, directly or indirectly, similar products present in markets in order to support their competitive advantage and/or market leadership. These disruptive firms support technological and industrial change and induce consumers to buy new products to adapt to the new socioeconomic environment. In particular, disruptive firms generate and spread path-breaking innovations in order to achieve and sustain the goal of a (temporary) profit monopoly. This organizational behaviour and strategy of disruptive firms supports technological change. The study can be useful for bringing a new perspective to explaining and generalizing one of the determinants of technological and industrial change. Overall, then, this study suggests that one of the general sources of technological change is disruptive firms (subjects), rather than disruptive technologies (objects), which generate market shifts in a Schumpeterian world of innovation-based competition.

Read more
