Measuring non-exchangeable tail dependence using tail copulas
A new class of tail dependence measures and their maximization
Takaaki Koike*, Shogo Kato† and Marius Hofert‡
February 1, 2021
Abstract
A new class of measures of bivariate tail dependence is proposed, which is defined as a limit of a measure of concordance of the underlying copula restricted to the tail region of interest. The proposed tail dependence measures include tail dependence coefficients as special cases, but capture the extremal relationship between random variables not only along the diagonal but along all angles weighted by the so-called tail generating measure. As a result, the proposed tail dependence measures overcome the issue that tail dependence coefficients underestimate the extent of extreme co-movements. We also consider the so-called maximal and minimal tail dependence measures, defined as the maximum and minimum of the tail dependence measures over all tail generating measures for a given copula. It turns out that the minimal tail dependence measure coincides with the tail dependence coefficient, and the maximal tail dependence measure overestimates the degree of extreme co-movements. We investigate properties, representations and examples of the proposed tail dependence measures, and their performance is demonstrated in a series of numerical experiments. For a fair assessment of tail dependence and stability of estimation under small sample sizes, we support the use of tail dependence measures weighted over all angles rather than the maximal and minimal ones.
MSC classification:
Keywords: Copula; Measure of concordance; Tail copula; Tail dependence; Tail dependence coefficient
The dependence between two continuous random variables X and Y is summarized by the copula C of (X, Y), that is, the distribution function of (F_X(X), F_Y(Y)), where F_X and F_Y are the distribution functions of X and Y, respectively. A particular interest in extreme value analysis and its applications in, for example, finance, insurance and risk management is to quantify dependence in tail regions, namely, to summarize the tendency of X and Y to jointly take on extremely small (or large) values by a single number. One popular measure of such tail dependence is the tail dependence coefficient (TDC)

λ(C) = lim_{p↓0} C(p, p)/p

(Sibuya, 1960).

* Risk Analysis Research Center, The Institute of Statistical Mathematics, Tachikawa, Tokyo, Japan. E-mail: [email protected]
† Risk Analysis Research Center, The Institute of Statistical Mathematics, Tachikawa, Tokyo, Japan. E-mail: [email protected]
‡ Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, ON, Canada. E-mail: [email protected]
[Figure 1: scatter plots of samples from the survival Marshall–Olkin copula, annotated with the quantities

λ(C) = lim_{p↓0} C(p, p)/p,
λ_{µ,p}(C) = ∫_{(0,∞)²} C(pu, pv) dµ(u, v) / (p ∫_{(0,∞)²} M(u, v) dµ(u, v)),
λ_µ(C) = lim_{p↓0} λ_{µ,p}(C).]

Figure 1: Scatter plot of the survival Marshall–Olkin copula Ĉ^{MO}_{a,b}. The TDC λ evaluates the tail dependence along the diagonal line (red, dotted). The µ-tail dependence measure is a limit of the measure of concordance restricted to the tail region (blue) with generating measure µ. Finally, the maximal tail dependence measure λ̄ is the maximum of λ_µ, which, in this case, is attained by a generating measure concentrated on the red solid line from (0, 0) to (u_0, 1) with u_0 = a/b.

Despite its popularity, the TDC evaluates the copula only along the diagonal and is known to underestimate the degree of tail dependence of C; see Furman et al. (2015) and references therein.

Various measures of tail dependence have been proposed to overcome this issue of underestimation. Furman et al. (2015) proposed variants of tail dependence coefficients, which evaluate the underlying copula C in the tail region not only along the main diagonal but along all possible paths toward the corner of the tail region. The main difficulty with their proposed indices is their computation and estimation, since the path of maximal dependence for a given C is not analytically available in general. An alternative construction of measures of tail dependence is based on dependence measures, such as Spearman's rho and Kendall's tau, of the conditional, or unconditional, distribution of C in the tail part of interest; see, for example, Charpentier (2003) for the former approach and Schmid and Schmidt (2007) and Asimit et al. (2016) for the latter. However, there is no unified framework for comparing these measures. In addition, the relationships of these measures to path-based measures are still unclear.

For a conservative evaluation of tail dependence, we propose a new class of tail dependence measures λ_µ called the µ-tail dependence measures (µ-TDM, see Definition 2.6), which arise as a tail counterpart of the representation of linear measures of concordance in terms of the so-called D-invariant generating measure µ : B([0, ∞)²) → [0, ∞] (Edwards and Taylor, 2009).
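As a quick numerical illustration of the TDC defined above (a minimal sketch, not code from the paper), λ(C) can be estimated by the empirical ratio C_n(p, p)/p at a small threshold p. The Clayton copula is used here only because its lower TDC has the well-known closed form 2^{-1/θ}; the sampler, function names and parameter choices are our own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def rclayton(n, theta):
    """Sample n points from the bivariate Clayton copula via the
    Gamma frailty construction: U_i = (1 + E_i / V)^(-1/theta),
    with V ~ Gamma(1/theta, 1) and E_1, E_2 ~ Exp(1)."""
    v = rng.gamma(1.0 / theta, size=n)
    e = rng.exponential(size=(n, 2))
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)

def tdc_hat(u, p):
    """Empirical lower TDC at threshold p: C_n(p, p) / p."""
    return np.mean((u[:, 0] <= p) & (u[:, 1] <= p)) / p

theta = 2.0
u = rclayton(10**6, theta)
est = tdc_hat(u, p=0.01)        # finite-threshold estimate
true = 2.0 ** (-1.0 / theta)    # closed-form lower TDC of Clayton
```

With a million samples and p = 0.01 the estimate is typically within about 0.01 of the limit, illustrating that the diagonal ratio converges quickly for this copula.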
Considering the intractability of the conditional distribution of the copula C, our proposed µ-TDMs depend on the unconditional copula C in the tail region of interest, and thus on the so-called tail dependence function (TDF, also called tail copula)

Λ(u, v) = lim_{p↓0} C(pu, pv)/p,  u, v ≥ 0.

Since µ controls the weights of the angles along which the extremal relationship between random variables is evaluated, it turns out that tail dependence coefficients and the tail Spearman's rho considered in Schmid and Schmidt (2007) are special cases of λ_µ, and new tail dependence measures, such as the generalized TDC and tail Gini's gamma, are induced; see Examples 2.8 and 2.9. Utilizing the homogeneity property of the TDF (see Proposition A.5), we study axiomatic properties and radial representations of µ-TDMs. In particular, an intuitive interpretation of the monotonicity axiom of µ-TDMs in terms of tail probabilities is derived from the concordance order among TDFs. We also show that the TDC is the minimal µ-TDM over all measures µ, and thus the TDC always underestimates the degree of tail dependence regardless of the choice of µ.

To address this issue of underestimation, we also propose the so-called maximal tail dependence measure (MTDM) λ̄, which is the supremum of λ_µ over all generating measures µ for a given copula C. An illustration of the TDC, µ-TDM and MTDM can be found in Figure 1. We prove that the MTDM can in general be derived as a limit of the normalized TDF, but is not always attainable as a µ-TDM for some generating measure µ. We also exhibit a relationship of the MTDM with the path-based tail dependence coefficient (which we call the maximal tail dependence coefficient, MTDC) considered in Furman et al. (2015). We show that the MTDM arises as the path-based maximum of the normalized tail dependence coefficient. We also find that the MTDC can be computed by maximizing the TDF over the derivatives of the paths at the origin.
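The TDF can be estimated empirically at any angle, not just on the diagonal, by Λ̂(u, v) = C_n(pu, pv)/p. The sketch below (illustrative code under our own naming assumptions, not from the paper) compares this estimator against the standard closed-form lower TDF of the Clayton copula, Λ(u, v) = (u^{-θ} + v^{-θ})^{-1/θ}, at several angles; note the closed form is homogeneous of order 1, the property exploited in Proposition A.5.

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, p = 2.0, 10**6, 0.01

# Clayton sample via the Gamma frailty construction (standard sampler).
v = rng.gamma(1.0 / theta, size=n)
e = rng.exponential(size=(n, 2))
sample = (1.0 + e / v[:, None]) ** (-1.0 / theta)

def tdf_hat(u, w):
    """Empirical TDF at angle (u, w): C_n(pu, pw) / p."""
    return np.mean((sample[:, 0] <= p * u) & (sample[:, 1] <= p * w)) / p

def tdf_clayton(u, w):
    """Closed-form lower TDF of the Clayton copula."""
    return (u ** -theta + w ** -theta) ** (-1.0 / theta)

# Absolute errors at three angles, including off-diagonal ones.
errors = [abs(tdf_hat(u, w) - tdf_clayton(u, w))
          for (u, w) in [(1.0, 1.0), (0.5, 1.0), (2.0, 0.5)]]
```

The diagonal angle (1, 1) recovers the TDC, while the off-diagonal angles carry exactly the extra information that the µ-TDM weights by the generating measure µ.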
Our analysis unifies the path-based framework of tail indices and that of measures of concordance in tail regions. Through examples of the maximal-type tail dependence measures for various parametric copulas, we show that the MTDM may overestimate, and the MTDC may still underestimate, the degree of tail dependence. Finally, the performance of the proposed tail dependence measures is demonstrated in simulation and empirical studies. We reveal that stable estimation of the MTDM is challenging, in particular when the sample size is small, and thus, together with the issues of over- and underestimation of tail dependence, we recommend the use of µ-TDMs with µ supported on all the angles for comparing degrees of tail dependence.

The present paper is organized as follows. In Section 2, we introduce the proposed µ-tail dependence measures with various examples such as tail Spearman's rho and tail Gini's gamma. Their properties, representations and relationships with tail dependence coefficients are studied in Section 3. Section 4 concerns maximal-type tail dependence measures and their interpretations in the maximization framework of path-based tail indices. Numerical experiments are conducted in Section 5, and Section 6 concludes with potential future research. A brief review and examples of the tail dependence function are provided in Appendix A, and statistical inference of the proposed tail dependence measures can be found in Appendix B.

2 µ-tail dependence measures

In this section we present a new class of tail dependence measures constructed based on measures of concordance. A brief review of measures of concordance is provided in Section 2.1, and a new class of tail dependence measures is proposed in Section 2.2. We end this section with examples given in Section 2.3.

2.1 Measures of concordance
For an integer d ≥ 2, a d-dimensional copula (d-copula) C : I^d → I, I = [0, 1], is a d-dimensional distribution function with standard uniform univariate marginal distributions. Denote by C_d the set of all d-copulas. We call C' ∈ C_d more concordant than C ∈ C_d, denoted by C ⪯ C', if C(u) ≤ C'(u) for all u = (u_1, ..., u_d) ∈ I^d. The d-dimensional comonotonicity and independence copulas are defined by M_d(u) = min(u_j; j = 1, ..., d) and Π_d(u) = u_1 ··· u_d, respectively. We omit the subscript d when d = 2, that is, M = M_2 and Π = Π_2. The 2-dimensional counter-monotonicity copula is defined by W(u, v) = max(u + v − 1, 0). By the Fréchet–Hoeffding inequalities, it holds that W ⪯ C ⪯ M for all C ∈ C_2.

Throughout the paper, we fix an atomless probability space (Ω, A, P) on which all appearing random variables are defined. Moreover, we identify κ(C), C ∈ C_d, for a map κ : C_d → R with κ(U), where U ∼ C denotes a d-dimensional random vector following the distribution C.

For d = 2, the dependence between two random variables can be summarized by so-called measures of concordance of the underlying copula C ∈ C_2. The following definition of a measure of concordance is a modified version of those found in Scarsini (1984) and Schweizer and Wolff (1981).

Definition 2.1 (Measures of concordance). A map κ : C_2 → R is called a measure of concordance if it satisfies the following conditions.

1. (Normalization) κ(M) = 1.
2. (Permutation invariance) κ(V, U) = κ(U, V) for any (U, V) ∼ C ∈ C_2.
3. (Reflection symmetry) κ(U, 1 − V) = −κ(U, V) for any (U, V) ∼ C ∈ C_2.
4. (Monotonicity) κ(C) ≤ κ(C') if C ⪯ C' for C, C' ∈ C_2.
5. (Continuity) lim_{n→∞} κ(C_n) = κ(C) if C_n → C pointwise for C_n ∈ C_2, n ∈ N, and C ∈ C_2.

Property 1 and Property 3 imply that κ(W) = −1, and thus that −1 ≤ κ(C) ≤ 1 for all C ∈ C_2 by Property 4 and the Fréchet–Hoeffding inequalities. Moreover, Property 3 implies that κ(Π) = 0 since (U, V) =_d (U, 1 − V) for (U, V) ∼ Π. Therefore, all the axioms of measures of concordance stated in Scarsini (1984) are recovered from those in Definition 2.1.

We now introduce a representation of measures of concordance, which will be a building block for constructing tail dependence measures in Section 2.2. Let D = {ι, τ, σ_1, σ_2, τσ_1, τσ_2, σ_1σ_2, τσ_1σ_2} be the group of symmetries on I^2, where

ι(u, v) = (u, v), σ_1(u, v) = (1 − u, v), σ_2(u, v) = (u, 1 − v), τ(u, v) = (v, u).

The following characterization of a measure of concordance is provided in Theorem 1 of Edwards and Taylor (2009).
Theorem 2.2 (Characterization of a linear measure of concordance). A map κ : C_2 → R is a measure of concordance satisfying the linearity condition

κ(tC₁ + (1 − t)C₂) = tκ(C₁) + (1 − t)κ(C₂) for every C₁, C₂ ∈ C_2 and t ∈ I,

if and only if κ admits the representation

κ(C) = ∫_{I²} (C − Π) dµ / ∫_{I²} (M − Π) dµ, (1)

where µ is a Borel measure on (I², B(I²)) satisfying the following conditions:
1. (D-invariance) µ(ξ(A)) = µ(A) for all ξ ∈ D and A ∈ B(I²),
2. (Finiteness) 0 < ∫_{I²} (M − Π) dµ < ∞.

We denote a measure of concordance of Form (1) by κ_µ, and call µ the generating measure of κ_µ. D-invariance is required for κ_µ to satisfy permutation invariance (Property 2) and reflection symmetry (Property 3), and finiteness is required for κ_µ to be well-defined. Note that Edwards and Taylor (2009) require µ to have no mass on the boundary of I², and to satisfy ∫_{I²} (M − Π) dµ = 1 so that µ is uniquely determined. These conditions do not affect the characterization since masses on the boundary of I² do not change the values of the numerator and denominator in Representation (1), and µ can always be normalized to µ̃ = µ / ∫_{I²} (M − Π) dµ by finiteness of µ. Therefore, we do not require these conditions and omit them for later convenience.

Example 2.3 (Examples of linear measures of concordance).
1. Blomqvist's beta: The Dirac measure µ = δ_{(1/2,1/2)} generates Blomqvist's beta (also known as median correlation) β(C) = 4C(1/2, 1/2) − 1; see Blomqvist (1950).
2. Spearman's rho: If µ is the uniformly distributed measure on I², then it generates Spearman's rho ρ_S(C) = 12 ∫_{I²} C dΠ − 3; see Spearman (1904).
3.
Gini's gamma: If µ is the probability measure of (M + W)/2, that is, if µ is the uniformly distributed measure on the two diagonals v = u and v = 1 − u, then it generates Gini's gamma

γ(C) = 8 ∫_{I²} C d((M + W)/2) − 2.

Note that Kendall's tau τ(C) = 4 ∫_{I²} C(u, v) dC(u, v) − 1 is also a well-known measure of concordance, but it does not admit Representation (1) since it is not linear in the underlying copula C.

2.2 µ-tail dependence measures

For C ∈ C_2, let us start with measuring local dependence in the tail region [0, p]² for some small p ∈ (0, 1), that is, around the corner (0, 0) of I². Tail dependence properties around the other three corners (1, 0), (0, 1) and (1, 1) can be equivalently studied by replacing C with the rotated copulas σ₁*C, σ₂*C and σ₁*σ₂*C, respectively, where ξ*C, ξ ∈ D, is the distribution function of ξ(U, V) for (U, V) ∼ C.

Motivated by Representation (1) of a linear measure of concordance, we consider the map

λ_{µ,p}(C) = ∫_{I²} (C − Π)(pu, pv) dµ(u, v) / ∫_{I²} (M − Π)(pu, pv) dµ(u, v) (2)

for some Borel measure µ on I². The measure λ_{µ,p} can be interpreted as the normalized average difference between C and Π on the region of interest [0, p]², weighted by the generating measure µ. Conditions different from those in Theorem 2.2 are required in order to construct a meaningful measure of tail dependence. First, we do not require λ_{µ,p} to satisfy permutation invariance (Property 2) or reflection symmetry (Property 3) in Definition 2.1, since permutation and reflection of C correspond to measuring other tail parts of C. Therefore, we do not require µ to be D-invariant. Next, boundary conditions on µ become relevant. Denote the slices of I² by ∂I²_{x=u₀} = {(u, v) ∈ I² : u = u₀} and ∂I²_{y=v₀} = {(u, v) ∈ I² : v = v₀}. Although the integrand (C − Π)(pu, pv) vanishes if u or v is zero, it does not vanish if u or v is one. Therefore, the values of µ on the boundary ∂I²_{x=0} ∪ ∂I²_{y=0} do not affect the value of λ_{µ,p}, whereas those on ∂I²_{x=1} ∪ ∂I²_{y=1} do. Finally, finiteness of µ as in Theorem 2.2 is still required so that λ_{µ,p} is well-defined. Taking these conditions into account, we assume that µ is a probability measure on I² whose probability mass is not concentrated on ∂I²_{x=0} ∪ ∂I²_{y=0}. Such a µ satisfies

∫_{I²} (M − Π)(pu, pv) dµ(u, v) ≤ p ∫_{I²} M(u, v) dµ(u, v) ≤ p ∫_{I²} u dµ(u, v) ≤ p.

Together with (M − Π)(pu, pv) > 0 for (u, v) ∈ (0, 1]², we see that such a µ satisfies the finiteness condition. The resulting measure is defined as follows.

Definition 2.4 (µ-tail dependence measure at level p). Let p ∈ (0, 1) and ℳ be the set of all Borel probability measures on I² whose probability mass is not concentrated on ∂I²_{x=0} ∪ ∂I²_{y=0}. For µ ∈ ℳ, the map λ_{µ,p} : C_2 → R defined by (2) is called the µ-tail dependence measure (µ-TDM) at level p. The measure µ is called the tail generating (probability) measure.

Remark 2.5 (Probability mass of µ on ∂I²_{x=0} ∪ ∂I²_{y=0}). For µ ∈ ℳ, let (U, V) ∼ µ, w = P(U = 0 or V = 0) ∈ [0, 1) and w̄ = 1 − w. Then

λ_{µ,p}(C) = E[(C − Π)(pU, pV)] / E[(M − Π)(pU, pV)] = w̄ E[(C − Π)(pU, pV) | U ≠ 0 and V ≠ 0] / (w̄ E[(M − Π)(pU, pV) | U ≠ 0 and V ≠ 0]) = E[(C − Π)(pU, pV) | U ≠ 0 and V ≠ 0] / E[(M − Π)(pU, pV) | U ≠ 0 and V ≠ 0] = λ_{µ̃,p}(C),

where µ̃ is the probability measure of (U, V) | {U ≠ 0 and V ≠ 0}, which has no probability mass on ∂I²_{x=0} ∪ ∂I²_{y=0}. Therefore, we can assume without loss of generality that w = 0, that is, µ ∈ ℳ has no probability mass on ∂I²_{x=0} ∪ ∂I²_{y=0}.

Next, we consider the limiting case when p ↓ 0. For p ∈ (0, 1] and (U, V) ∼ C, the p-tail dependence function

Λ_p(u, v) = P(U ≤ pu, V ≤ pv | U ≤ p) = P(U ≤ pu, V ≤ pv | V ≤ p) = C(pu, pv)/p, (u, v) ∈ I²,

calculates the probability for two variables to jointly take on small values given that one of them takes on a small value. Provided it exists, we call the limit

Λ(u, v) = Λ(u, v; C) = lim_{p↓0} C(pu, pv)/p

the tail dependence function (TDF) of C, which is also called a tail copula; see Jaworski (2004), Schmidt and Stadtmüller (2006), Klüppelberg et al. (2007), Nikoloulopoulos et al. (2009) and Joe et al. (2010). The existence of the TDF for a given C ∈ C_2 can equivalently be stated in terms of the so-called tail expansion of C; see Jaworski (2004), Jaworski (2006) and Jaworski (2010) for details. Basic properties and examples of TDFs are summarized in Appendix A. Let

C_L2 = {C ∈ C_2 : lim_{p↓0} C(pu, pv)/p exists for all (u, v) ∈ [0, ∞)²}

be the set of all copulas admitting TDFs, and let

L = {Λ : [0, ∞)² → [0, ∞) : there exists C ∈ C_2 such that Λ(u, v) = lim_{p↓0} C(pu, pv)/p}

be the set of all TDFs. By their constructions, C_L2 and L are convex sets. Moreover, the inclusion C_L2 ⊆ C_2 is strict; see Corollary 8.3.2 of Jaworski (2010). Let C ∈ C_L2. Since C(pu, pv)/p ≤ C(p, p)/p ≤ M(p, p)/p = 1 for every (u, v) ∈ I², the bounded convergence theorem implies that

lim_{p↓0} λ_{µ,p}(C) = lim_{p↓0} [∫_{I²} C(pu, pv) dµ(u, v) − p² ∫_{I²} uv dµ(u, v)] / [p ∫_{I²} M(u, v) dµ(u, v) − p² ∫_{I²} uv dµ(u, v)] = lim_{p↓0} ∫_{I²} (C(pu, pv)/p) dµ(u, v) / ∫_{I²} M(u, v) dµ(u, v) = ∫_{I²} lim_{p↓0} (C(pu, pv)/p) dµ(u, v) / ∫_{I²} M(u, v) dµ(u, v).

Therefore, the limiting measure

λ_µ(C) = lim_{p↓0} λ_{µ,p}(C) = ∫_{I²} Λ(u, v; C) dµ(u, v) / ∫_{I²} M(u, v) dµ(u, v), C ∈ C_L2, (3)

is well-defined. Since λ_µ can be regarded as a map on L, we also write it as λ_µ(Λ) for Λ ∈ L if there is no confusion.
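To make Formula (3) concrete, here is a small numerical sketch (ours, not part of the original text). It assumes the well-known TDF of the Clayton copula, Λ(u, v) = (u^{−θ} + v^{−θ})^{−1/θ}, and evaluates λ_µ for µ uniform on I² (called tail Spearman's rho in Section 2.3), for which ∫_{I²} M dµ = 1/3; the function names are ours.

```python
def clayton_tdf(u, v, theta=2.0):
    """Tail dependence function of the Clayton copula:
    Lambda(u, v) = (u**-theta + v**-theta)**(-1/theta), with Lambda = 0 on the axes."""
    if u <= 0.0 or v <= 0.0:
        return 0.0
    return (u ** -theta + v ** -theta) ** (-1.0 / theta)

def tail_spearman_rho(tdf, n=400):
    """lambda_mu for mu uniform on the unit square, i.e.
    3 * (double integral of Lambda over [0,1]^2), by the midpoint rule on an n x n grid."""
    h = 1.0 / n
    s = sum(tdf((i + 0.5) * h, (j + 0.5) * h) for i in range(n) for j in range(n))
    return 3.0 * s * h * h

tdc = clayton_tdf(1.0, 1.0)            # TDC = Lambda(1, 1) = 2**(-1/theta)
lam_s = tail_spearman_rho(clayton_tdf)  # tail Spearman's rho
```

For θ = 2 this gives TDC = 2^{−1/2} ≈ 0.707 and tail Spearman's rho 2(√2 − 1) ≈ 0.828, illustrating that a µ-TDM supported on all angles dominates the TDC (Part 1) of Proposition 3.3).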
Moreover, we have that 0 ≤ λ_µ(Λ) ≤ 1 for every Λ ∈ L since

0 = Λ(u, v; Π) ≤ Λ(u, v) ≤ Λ(u, v; M) = M(u, v)

by definition of Λ. Note that the domain of M is extended from I² to [0, ∞)². We thus obtain the following class of tail dependence measures.

Definition 2.6 (µ-tail dependence measure). Let ℳ be the set of all Borel probability measures on I² whose probability mass is not concentrated on ∂I²_{x=0} ∪ ∂I²_{y=0}. For µ ∈ ℳ, the map λ_µ : C_L2 → I defined by (3) is called the µ-tail dependence measure (µ-TDM). The measure µ is called the tail generating (probability) measure.

Although the µ-TDM at a fixed level p can be used in practice to quantify local dependence, we hereafter focus on the limiting case when p ↓ 0 since λ_{µ,p} does not capture tail dependence in the sense that one can always find a copula C ∈ C_L2 such that λ_µ(C) = 1 but λ_{µ,p}(C) is arbitrarily close to zero; such a copula can be constructed by a shuffle of M approximating Π with comonotone dependence on the grid containing (0, 0). Taking the limit also allows us to study µ-TDMs in relation with the (lower) tail dependence coefficient, which will be investigated in Section 3.1. Besides, we are interested in the minimal and maximal tail dependence measures

λ̲(C) = inf_{µ∈ℳ} λ_µ(C) and λ̄(C) = sup_{µ∈ℳ} λ_µ(C), C ∈ C_L2,

which will be addressed in Sections 3.1 and 4, respectively.

2.3 Examples of µ-tail dependence measures

In this section we present examples of µ-TDMs. We first show that the tail dependence coefficient is a special case of µ-TDM.

Example 2.7 (Tail dependence coefficient). By Part 4) of Proposition A.5, the Dirac measure µ = δ_{(1/2,1/2)} generates

λ_{δ_{(1/2,1/2)}}(Λ) = Λ(1/2, 1/2)/M(1/2, 1/2) = Λ(1, 1),

which is known as the (lower) tail dependence coefficient (TDC), denoted by λ(C) (Sibuya, 1960). Since δ_{(1/2,1/2)} is the generating measure of Blomqvist's beta, as seen in Example 2.3, the TDC can be seen as the tail counterpart of Blomqvist's beta. More generally, if µ is a probability measure whose probability mass is concentrated on the main diagonal {(u, v) ∈ (0, 1]² : u = v}, then the resulting µ-tail dependence measure is the TDC. Since any Dirac measure δ_{(u₀,v₀)}, (u₀, v₀) ∈ (0, 1]², can be used as a tail generating measure, the TDC can be generalized as follows.

Example 2.8 (Generalized tail dependence coefficient). For a fixed (u₀, v₀) ∈ (0, 1]², consider the Dirac measure µ = δ_{(u₀,v₀)}. The resulting µ-TDM is given by

λ_{δ_{(u₀,v₀)}}(Λ) = Λ(u₀, v₀)/M(u₀, v₀) = (u₀/v₀) Λ(1, v₀/u₀) if u₀ ≥ v₀, and (v₀/u₀) Λ(u₀/v₀, 1) if u₀ < v₀,

which we call the generalized tail dependence coefficient (GTDC) or tail generalized Blomqvist's beta. More generally, let µ_{l(a)}, a ∈ (0, ∞), be any probability measure concentrated on the line segment {(u, au) : 0 < u ≤ min(1, 1/a)}. Then the resulting µ-tail dependence measure is given by

λ_{µ_{l(a)}}(Λ) = Λ(1, a)/M(1, a) = Λ(1, a)/a if 0 < a ≤ 1, and a Λ(1/a, 1) if 1 < a,

which coincides with λ_{δ_{(u₀,v₀)}}(Λ) for a = v₀/u₀. The GTDC will play an important role in analyzing maximal TDMs; see Section 4.1. Next we consider tail counterparts of Spearman's rho and Gini's gamma.

Example 2.9 (Tail Spearman's rho and tail Gini's gamma).
1. Tail Spearman's rho: Let µ be the generating measure of Spearman's rho and denote the corresponding µ-TDM by λ_S. Since ∫_{I²} M dΠ = 1/3, we have that

λ_S(C) = 3 ∫_{I²} Λ(u, v) du dv,

which is a measure introduced in Schmid and Schmidt (2007), and we call it tail Spearman's rho.
2. Tail Gini's gamma: Let µ be the generating measure of Gini's gamma and denote the corresponding µ-TDM by λ_G. Since ∫_{I²} M dM = 1/2 and ∫_{I²} M dW = 1/4, we have that

λ_G(C) = (∫_{I²} Λ dM + ∫_{I²} Λ dW)/(1/2 + 1/4) = (2/3)(Λ(1, 1) + 2 ∫₀¹ Λ(u, 1 − u) du),

which we call tail Gini's gamma. More generally, denote by λ_{G,w}, w ∈ I, the µ-TDM where µ is a probability measure which puts probability mass w ∈ I on the main diagonal {(u, v) ∈ (0, 1]² : u = v} and the remaining probability mass 1 − w on the sub-diagonal {(u, v) ∈ (0, 1]² : u = 1 − v}. Then we have that

λ_{G,w}(Λ) = [w Λ(1, 1) ∫_I u dµ₁(u) + (1 − w) ∫_I Λ(u, 1 − u) dµ₂(u)] / [w ∫_I u dµ₁(u) + (1 − w) ∫_I M(u, 1 − u) dµ₂(u)]

for some probability measures µ₁ and µ₂ on I.

Further examples of µ-TDMs will be provided in Section 3.3. We end this section with a remark on tail Kendall's tau.

Remark 2.10 (Tail Kendall's tau). For Λ ∈ L such that Λ(1, 1) > 0, the function Λ(u, v)/Λ(1, 1), (u, v) ∈ I², defines a distribution function on I² by Proposition A.5. For µ_Λ being its probability measure, the measure λ_{µ_Λ}(Λ) can be regarded as a tail counterpart of Kendall's tau; see Asimit et al. (2016) for a related tail dependence measure. This measure, however, is not a µ-TDM since µ ∈ ℳ cannot be chosen dependent on Λ.

3 Properties of µ-tail dependence measures

In this section we show that the µ-tail dependence measures introduced in Section 2.2 satisfy desirable properties for quantifying extremal co-movements between random variables. Axiomatic properties and relationships of µ-TDMs with tail dependence coefficients are studied in Section 3.1. The monotonicity axiom of µ-TDM, as an analog of Property 4 in Definition 2.1, is particularly investigated in Section 3.2 since concordance order among tail dependence functions can be interpreted differently from that among copulas. Finally, based on the homogeneity property of TDFs found in Part 4) of Proposition A.5, we derive various representations of µ-TDM in Section 3.3, which will be useful to interpret µ-TDMs and to investigate the maximal TDM considered in Section 4.

3.1 Axiomatic properties of µ-tail dependence measures

We begin with proving axiomatic properties of µ-TDM as analogs of those in Definition 2.1 for measures of concordance. Since we focus only on the tail part of an underlying copula, we define the concordance order in the tail part as follows.

Definition 3.1 (Tail concordance order). For
C₁, C₂ ∈ C_2, we say that C₂ is more concordant in the tail than C₁, denoted by C₁ ⪯_L C₂, if there exists ε > 0 such that C₁(u, v) ≤ C₂(u, v) for all (u, v) ∈ (0, ε)².

Note that C₁ ⪯ C₂ implies C₁ ⪯_L C₂, and C₁ ⪯_L C₂ implies Λ(·; C₁) ⪯ Λ(·; C₂), that is, Λ(u, v; C₁) ≤ Λ(u, v; C₂) for all (u, v) ∈ [0, ∞)². The following axiomatic properties are immediate consequences of the definition of µ-TDM.

Proposition 3.2 (Basic properties of µ-tail dependence measures). For µ ∈ ℳ, the µ-tail dependence measure λ_µ satisfies the following properties.
1. (Range): 0 ≤ λ_µ(C) ≤ 1 for all C ∈ C_L2.
2. (Bounds): λ_µ(Π) = 0 and λ_µ(M) = 1.
3. (Asymptotic independence): λ_µ(C) = 0 if and only if Λ(1, 1; C) = 0.
4. (Linearity): λ_µ(tC₁ + (1 − t)C₂) = tλ_µ(C₁) + (1 − t)λ_µ(C₂) for every C₁, C₂ ∈ C_L2 and t ∈ I.
5. (Monotonicity): If C₁ ⪯_L C₂ for C₁, C₂ ∈ C_L2, then λ_µ(C₁) ≤ λ_µ(C₂).
6. (Continuity): If C_n ∈ C_L2, n = 1, 2, . . . , converges to C ∈ C_L2 pointwise as n → ∞, then lim_{n→∞} λ_µ(C_n) = λ_µ(C).

Proof. Parts 1), 2), 4) and 5) are immediate from the definition of λ_µ. 3) will be shown in Part 2) of Proposition 3.3. Therefore, it remains to show 6). By Theorem 1.7.6 of Durante and Sempi (2015), C_n converges to C uniformly. Therefore, we have that lim_{n→∞} C_n(pu, pv)/p = C(pu, pv)/p uniformly for p > 0 and (u, v) ∈ [0, ∞)², and hence

lim_{n→∞} Λ_n(u, v) = lim_{n→∞} lim_{p↓0} C_n(pu, pv)/p = lim_{p↓0} lim_{n→∞} C_n(pu, pv)/p = lim_{p↓0} C(pu, pv)/p = Λ(u, v)

by the Moore-Osgood theorem (Taylor, 1985). Since Λ_n and Λ are bounded by Part 7) of Proposition A.5, we have that lim_{n→∞} λ_µ(C_n) = λ_µ(C) by the bounded convergence theorem.

Regarding Part 5) of Proposition 3.2, the inequality λ_µ(Λ₁) ≤ λ_µ(Λ₂) holds under the assumption Λ₁ ⪯ Λ₂, which is weaker than C₁ ⪯_L C₂.
In fact, the relationship Λ₁ ⪯ Λ₂ can be simplified since Part 4) of Proposition A.5 implies that

Λ(u, v) = ||(u, v)|| Λ(u/||(u, v)||, v/||(u, v)||), Λ ∈ L, (4)

where ||·|| denotes any norm on R², and thus Λ₁ ⪯ Λ₂ is equivalent to

Λ₁(u, v) ≤ Λ₂(u, v) for all (u, v) ∈ [0, ∞)² such that ||(u, v)|| = 1.

See Li (2013) for tail dependence orders implied by Λ₁ ⪯ Λ₂. The relationship Λ₁ ⪯ Λ₂ will be further investigated in Section 3.2. Next we investigate relationships between µ-TDM and the (lower) TDC, which is a special case of µ-TDM as seen in Example 2.7. Recall that the minimal tail dependence measure is defined by λ̲(C) = inf_{µ∈ℳ} λ_µ(C), C ∈ C_L2.

Proposition 3.3 (Relationships with the tail dependence coefficient). Let λ(C) = Λ(1, 1; C) be the tail dependence coefficient and λ_µ be the µ-tail dependence measure for some µ ∈ ℳ. Then λ_µ and λ satisfy the following properties.
1. λ ≤ λ_µ ≤ min(1, aλ), where a = a(µ) = ∫_{I²} max(u, v) dµ(u, v) / ∫_{I²} M(u, v) dµ(u, v) ≥ 1, and thus λ̲(C) = λ(C).
2. λ(C) = 0 if and only if λ_µ(C) = 0.
3. If λ(C) = 1, then λ_µ(C) = 1.
4. For any ε ∈ (0, 1), there exist µ ∈ ℳ and C_ε ∈ C_L2 such that λ_µ(C_ε) = 1 but λ(C_ε) = ε. Therefore, λ_µ(C) = 1 does not imply λ(C) = 1 in general.
5. λ_µ(C) = 1 implies λ(C) = 1 if µ satisfies the following condition:

there exists u₀ ∈ (0, 1) such that µ(B_ε(u₀, u₀) ∩ I²) > 0 for all ε > 0, (5)

where B_ε(u, v) is the open ball around (u, v) with radius ε > 0. Therefore, λ_µ(C) = 1 and λ(C) = 1 are equivalent under (5).

Proof. 1) comes directly from Part 8) of Proposition A.5. 2) and 3) follow from 1). 4) An example will be provided in Example 3.4. Therefore, it remains to show 5). Suppose, by way of contradiction, that λ = λ(C) ≤ 1 − δ < 1 for some δ > 0. Let u₀ ∈ (0, 1) be the point described in Condition (5). Then at least one of µ(B_ε(u₀, u₀) ∩ {(u, v) : v ≥ u}) > 0 for all ε > 0 and µ(B_ε(u₀, u₀) ∩ {(u, v) : v ≤ u}) > 0 for all ε > 0 holds; we consider the former case since the latter can be handled similarly by exchanging the roles of u and v. Let c(δ) = (1 − δ)/(1 − δ/2) ∈ (0, 1) and R_δ = {(u, v) ∈ (0, 1]² : c(δ)v ≤ u ≤ v}. Then, for (u, v) ∈ R_δ, we have that v ≤ u/c(δ) and Λ(1, 1) = λ(C) ≤ 1 − δ, and thus

Λ(u, v) ≤ Λ(v, v) = v Λ(1, 1) ≤ (u/c(δ))(1 − δ) = (1 − δ/2) u = (1 − δ/2) M(u, v).

Since Λ(u, v) ≤ M(u, v) when (u, v) ∈ (0, 1]² \ R_δ, we have that

λ_µ(Λ) = ∫_{(0,1]²} Λ dµ / ∫_{(0,1]²} M dµ ≤ [∫_{(0,1]²} M dµ − (δ/2) ∫_{R_δ} M dµ] / ∫_{(0,1]²} M dµ = 1 − (δ/2) ∫_{R_δ} M dµ / ∫_{(0,1]²} M dµ.

Moreover, we have that

∫_{R_δ} M dµ ≥ ∫_{R_δ ∩ B_{(1−c(δ))u₀}(u₀,u₀)} M dµ = ∫_{R_δ ∩ B_{(1−c(δ))u₀}(u₀,u₀)} u dµ(u, v) ≥ c(δ) u₀ µ(R_δ ∩ B_{(1−c(δ))u₀}(u₀, u₀)) > 0,

and hence λ_µ(Λ) < 1, which contradicts the assumption that λ_µ(Λ) = 1.

Example 3.4 (λ_µ(C) = 1 does not imply λ(C) = 1 in general). For a ∈ (0, ∞), let µ_{l(a)} be the tail generating measure considered in Example 2.8. For a fixed ε ∈ (0, 1), let C_ε be the singular copula considered in Example A.3. Then we have that

λ_{µ_{l(a)}}(C_ε) = a Λ(1/a, 1) = 1 if a ≥ 1/ε; a Λ(1/a, 1) = aε ∈ (ε, 1) if 1 < a < 1/ε; and Λ(1, a)/a = εa/a = ε if a ≤ 1.

Therefore, it holds that λ_{µ_{l(a)}}(C_ε) = 1 when a = 1/ε, but λ(C_ε) = λ_{µ_{l(1)}}(C_ε) = ε < 1.
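The dependence of λ_{µ_{l(a)}} on the angle a in Example 3.4 can be traced numerically. Since the singular copula C_ε of Example A.3 is not reproduced in this excerpt, the sketch below (ours) assumes the TDF Λ(u, v) = min(u, εv), which matches the values displayed above; the function names are ours.

```python
def tdf_eps(u, v, eps=0.1):
    # Assumed TDF reproducing the values of Example 3.4: Lambda(u, v) = min(u, eps * v).
    return min(u, eps * v)

def gtdc(tdf, a):
    """Generalized TDC of Example 2.8 along the ray v = a * u:
    lambda_{mu_l(a)}(Lambda) = Lambda(1, a) / M(1, a), with M(1, a) = min(1, a)."""
    return tdf(1.0, a) / min(1.0, a)

# a <= 1 gives eps (a = 1 is the TDC); 1 < a < 1/eps gives a * eps;
# a >= 1/eps attains the maximal value 1.
values = [gtdc(tdf_eps, a) for a in (0.5, 1.0, 2.0, 10.0)]  # eps = 0.1
```

With ε = 0.1 the four values are 0.1, 0.1, 0.2 and 1, so the GTDC ranges from the TDC value ε up to 1 depending on the angle, which is precisely the under- and overestimation gap discussed in this section.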
1) = 0) or TDM exhibits maximal tail dependence(Λ(1 ,
1) = 1). However, as seen in Part 4), λ and λ µ may quantify asymptotic dependence quite differently. Infact, in Example 3.4, the coefficient a ( µ l ( (cid:15) )) = (cid:15) is extremely large for a small (cid:15) >
0, which yields the large gapbetween λ and λ µ l ( (cid:15) ). Using the identity u + v = max( u, v ) + min( u, v ), we have that a ( δ ( , )) = a ( M ) = 1, a (Π) = 2, a ( W ) = 3 and a ( wM + (1 − w ) W ) = − w w ∈ [1 ,
3] for w ∈ I . In general, when µ = B for a copula B ∈ C , we obtain the formula a ( B ) = 1 R I M d B − ∈ [1 , , and its lower and upper bounds are attained when B = M and B = W , respectively. Finally, Part 5) showsthat maximal tail dependence is equivalently measured by λ and λ µ if µ has a positive probability aroundsome point on the main diagonal. This condition is fulfilled when, for example, µ is the uniform measure on I , and the same equivalence result in this particular case is shown in Proposition 3 of Schmid and Schmidt(2007). This equivalence is, however, not true in general; in fact, the measure µ l ( (cid:15) ) considered in Example 3.4violates Condition (5). µ -tail dependence measures As we mentioned in Section 3.1, the concordance order Λ (cid:22) Λ , Λ , Λ ∈ L , implies that λ µ (Λ) ≤ λ µ (Λ ).However, ordering two functions with respect to (cid:22) should be considered with care when these functions havedifferent margins; for example, if H M is a distribution function whose probability mass is uniformly distributedon the line segment from (1 ,
1) to (2, 2) and H_Π = Π, then H_M ⪯ H_Π although the copula M of H_M is more concordant than the copula Π of H_Π. This section is devoted to providing a more intuitive interpretation of the order Λ₁ ⪯ Λ₂ and to showing that the concordance order between TDFs is consistent with our intention of comparing degrees of tail dependence.

For (U, V) ∼ C, consider the conditional probabilities

Λ₁(u) = Λ₁(u; C) = Λ(u, 1) = lim_{p↓0} P(U ≤ pu | V ≤ p), u ∈ I,
Λ₂(v) = Λ₂(v; C) = Λ(1, v) = lim_{p↓0} P(V ≤ pv | U ≤ p), v ∈ I.

The limits Λ₁ and Λ₂ exist if C ∈ C_L2. For C₁, C₂ ∈ C_L2, since Λ₁(u; C₁) ≤ Λ₁(u; C₂) and Λ₂(v; C₁) ≤ Λ₂(v; C₂) for all u, v ∈ I mean that the conditional tail probabilities are larger under C₂ than under C₁, we define the following order to compare degrees of tail dependence.

Definition 3.5 (Marginal tail order). For C₁, C₂ ∈ C_L2, we say that C₂ is greater than C₁ in marginal tail order, denoted by C₁ ⪯_M C₂, if Λ₁(u; C₁) ≤ Λ₁(u; C₂) and Λ₂(v; C₁) ≤ Λ₂(v; C₂) for all u, v ∈ I.

The next proposition shows that the marginal tail order between C₁ and C₂ is equivalent to the concordance order between Λ(·; C₁) and Λ(·; C₂).

Proposition 3.6 (Marginal tail order and concordance order).
1. Let Λ₁(u) = Λ(u, 1) and Λ₂(v) = Λ(1, v) for u, v ∈ I. Then we have that

Λ(u, v) = v Λ₁(u/v) 1{u ≤ v} + u Λ₂(v/u) 1{v < u}.

2. For C₁, C₂ ∈ C_L2, we have that C₁ ⪯_M C₂ if and only if Λ(·; C₁) ⪯ Λ(·; C₂).

Proof. 1) immediately comes from Part 4) of Proposition A.5, and 2) is directly implied by 1).

Proposition 3.6 shows that the concordance order Λ₁ ⪯ Λ₂ between TDFs is consistent with our intention of comparing degrees of tail dependence in the sense of Definition 3.5. The following corollary restates monotonicity of µ-TDM to show that λ_µ quantifies the degree of tail dependence consistently with the marginal tail order.

Corollary 3.7 (Monotonicity of λ_µ with respect to marginal tail order). Let µ ∈ ℳ. For C₁, C₂ ∈ C_L2, if C₁ ⪯_M C₂, then λ_µ(C₁) ≤ λ_µ(C₂).

Remark 3.8 (Concordance order between Pickands dependence functions). Jaworski (2019) considered dependence measures among extreme value copulas, and required them to be monotone with respect to the pointwise order A₂(w) ≤ A₁(w), for all w ∈ I, between the Pickands dependence functions A₁, A₂ ∈ A. This order between A₁ and A₂ is equivalent to the marginal tail order C_{A₁} ⪯_M C_{A₂} between the corresponding EV copulas, which can be seen from Equation (20) and Part 2) of Proposition 3.6.

3.3 Radial representations of µ-tail dependence measures

As implied by Equation (4), the concordance order of Λ ∈ L is completely determined by the values of Λ(u, v) on the unit circle {(u, v) ∈ [0, ∞)² : ||(u, v)|| = 1}, where ||·|| is any norm on R². This property for the case when ||·|| is the L_∞-norm can also be deduced from Part 1) of Proposition 3.6. Since the information of Λ(u, v) on the entire domain (u, v) ∈ [0, ∞)² is redundant for comparing degrees of tail dependence, we expect µ-TDMs to be simplified so that they depend only on more essential information of Λ. With this motivation in mind, this section is dedicated to deriving two radial representations of µ-TDM in terms of the L₂- and L_∞-norms.

To this end, let us write a point (u, v) ∈ I² in the polar coordinate system

(u, v) = r(cos φ, sin φ), where 0 ≤ φ ≤ π/2, 0 ≤ r ≤ r̄(φ) and r̄(φ) = (cos φ)^{−1} if φ ∈ [0, π/4], and (sin φ)^{−1} if φ ∈ (π/4, π/2].
Then it is straightforward to see that the µ-TDM, µ ∈ ℳ, can be expressed as

λ_µ(Λ) = ∫_{(0,π/2)×(0,r̄(φ)]} r Λ(cos φ, sin φ) dµ_{φ,r}(φ, r) / ∫_{(0,π/2)×(0,r̄(φ)]} r M(cos φ, sin φ) dµ_{φ,r}(φ, r),

where µ_{φ,r}(A) = µ({(r cos φ, r sin φ) : (φ, r) ∈ A}) for A ∈ B({(φ, r) ∈ (0, π/2) × (0, r̄(φ)]}). Therefore, by defining the new probability measure

µ_φ(B) = ∫_{B×(0,r̄(φ)]} r dµ_{φ,r}(φ, r) / ∫_{(0,π/2)×(0,r̄(φ)]} r dµ_{φ,r}(φ, r), B ∈ B((0, π/2)), (6)

we obtain the following representation of µ-TDM.

Proposition 3.9 (L₂-radial representation of µ-tail dependence measure). Let µ ∈ ℳ. Then the µ-tail dependence measure can be expressed as

λ_µ(Λ) = ∫_{(0,π/2)} Λ(cos φ, sin φ) dµ_φ(φ) / [∫_{(0,π/4]} sin φ dµ_φ(φ) + ∫_{(π/4,π/2)} cos φ dµ_φ(φ)], (7)

where µ_φ is the Borel probability measure on (0, π/2) defined in (6).

A similar representation can be derived by writing a point (u, v) ∈ I² as (u, v) = r(1 ∧ cot φ, 1 ∧ tan φ), where r = max(u, v) ∈ [0, 1], cot φ = u/v, tan φ = v/u and 0 ≤ φ ≤ π/2. The resulting representation is provided in the following proposition.

Proposition 3.10 (L_∞-radial representation of µ-tail dependence measure). Let µ ∈ ℳ. Then

λ_µ(Λ) = ∫_{(0,π/2)} Λ(1 ∧ cot φ, 1 ∧ tan φ) dµ_φ(φ) / [∫_{(0,π/4]} tan φ dµ_φ(φ) + ∫_{(π/4,π/2)} cot φ dµ_φ(φ)] (8)
= [w₀ Λ(1, 1) + w₁ ∫_{(0,1)} Λ(t, 1) dµ₁(t) + w₂ ∫_{(0,1)} Λ(1, t) dµ₂(t)] / [w₀ + w₁ ∫_{(0,1)} t dµ₁(t) + w₂ ∫_{(0,1)} t dµ₂(t)] (9)

for some Borel probability measure µ_φ on (0, π/2), some w₀, w₁, w₂ ≥ 0 with w₀ + w₁ + w₂ = 1 and some Borel probability measures µ₁ and µ₂ on (0, 1).

Proof. Representation (8) is shown analogously to Representation (7) in Proposition 3.9. We derive Representation (9) by means of random variables for later use. Let µ ∈ ℳ and (U, V) ∼ µ. As discussed in Remark 2.5, we can assume without loss of generality that P(U = 0) = P(V = 0) = 0. Hence we can write

R₁ = V, T₁ = U/V, R₂ = U, T₂ = V/U. (10)

Let µ̃₁ and µ̃₂ be the probability measures of (R₁, T₁) | {T₁ < 1} and (R₂, T₂) | {T₂ < 1}, respectively. By defining the probability measures on (0, 1) as

µ₁(A) = ∫_{(0,1]×A} r dµ̃₁(r, t) / ∫_{(0,1]×(0,1)} r dµ̃₁(r, t) and µ₂(A) = ∫_{(0,1]×A} r dµ̃₂(r, t) / ∫_{(0,1]×(0,1)} r dµ̃₂(r, t), A ∈ B((0, 1)),

and the nonnegative real numbers

w̃₀ = P(T₁ = 1) E[R₁ | T₁ = 1], w̃₁ = P(T₁ < 1) E[R₁ | T₁ < 1], w̃₂ = P(T₂ < 1) E[R₂ | T₂ < 1],

it holds that

∫_{I²} Λ dµ = E_µ[Λ(U, V)]
= P(U = V) E[Λ(U, V) | U = V] + P(U < V) E[Λ(U, V) | U < V] + P(U > V) E[Λ(U, V) | U > V]
= P(T₁ = 1) E[Λ(R₁, R₁) | T₁ = 1] + P(T₁ < 1) E[Λ(R₁T₁, R₁) | T₁ < 1] + P(T₂ < 1) E[Λ(R₂, R₂T₂) | T₂ < 1]
= P(T₁ = 1) Λ(1, 1) E[R₁ | T₁ = 1] + P(T₁ < 1) E[R₁ Λ(T₁, 1) | T₁ < 1] + P(T₂ < 1) E[R₂ Λ(1, T₂) | T₂ < 1]
= w̃₀ Λ(1, 1) + w̃₁ ∫_{(0,1)} Λ(t, 1) dµ₁(t) + w̃₂ ∫_{(0,1)} Λ(1, t) dµ₂(t).

Therefore, we obtain the desired representation (9) by defining (w₀, w₁, w₂) = (w̃₀, w̃₁, w̃₂)/(w̃₀ + w̃₁ + w̃₂), where w̃₀ + w̃₁ + w̃₂ = E[max(U, V)] ∈ (0, 1].

Representation (9) takes a simpler form when R₁ and T₁, and R₂ and T₂, are independent random variables as pairs. In what follows, the I-valued random variable having the monomial cumulative distribution function G_m(x) = x^m, x ∈ I, m >
0, is denoted by X ∼ Mo(m). By calculation, we have that E[X] = m/(m + 1) for X ∼ Mo(m).

Proposition 3.11 (Polynomially weighted tail dependence measures). For m₁, m₂ > 0, let U ∼ Mo(m₁) and V ∼ Mo(m₂) be independent random variables and let µ_{m₁,m₂} be the probability measure of (U, V). Then

λ_{µ_{m₁,m₂}}(Λ) = ((m₁ + 1)(m₂ + 1)/(m₁ + m₂ + 2)) ∫₀¹ {Λ(t, 1) t^{m₁−1} + Λ(1, t) t^{m₂−1}} dt. (11)

Proof. Denote by F̃₁ and F̃₂ the cumulative distribution functions of (R̃₁, T̃₁) = (R₁, T₁) | {T₁ < 1} and (R̃₂, T̃₂) = (R₂, T₂) | {T₂ < 1}, respectively, where R₁, T₁, R₂, T₂ are as specified in (10). Then P(T₁ = 1) = 0, P(T₁ < 1) = m₂/(m₁ + m₂) and P(T₂ < 1) = m₁/(m₁ + m₂) by calculation. Moreover, we have that

F̃₁(r, t) = P(R₁ ≤ r, T₁ ≤ t | T₁ < 1) = ((m₁ + m₂)/m₂) ∫₀^r G_{m₁}(st) dG_{m₂}(s) = t^{m₁} r^{m₁+m₂},

and that F̃₂(r, t) = t^{m₂} r^{m₁+m₂}. Hence R̃₁ ∼ Mo(m₁ + m₂) and T̃₁ ∼ Mo(m₁), and R̃₂ ∼ Mo(m₁ + m₂) and T̃₂ ∼ Mo(m₂), are independent random variables as pairs. Therefore, it holds that

E[R₁ Λ(T₁, 1) | T₁ < 1] = E[R̃₁ Λ(T̃₁, 1)] = E[R̃₁] E[Λ(T̃₁, 1)] = (m₁(m₁ + m₂)/(m₁ + m₂ + 1)) ∫₀¹ Λ(t, 1) t^{m₁−1} dt,
E[R₂ Λ(1, T₂) | T₂ < 1] = E[R̃₂ Λ(1, T̃₂)] = (m₂(m₁ + m₂)/(m₁ + m₂ + 1)) ∫₀¹ Λ(1, t) t^{m₂−1} dt,

and thus

λ_{µ_{m₁,m₂}}(Λ) = [P(T₁ < 1) E[R̃₁ Λ(T̃₁, 1)] + P(T₂ < 1) E[R̃₂ Λ(1, T̃₂)]] / [P(T₁ < 1) E[R̃₁ T̃₁] + P(T₂ < 1) E[R̃₂ T̃₂]] = ∫₀¹ {Λ(t, 1) t^{m₁−1} + Λ(1, t) t^{m₂−1}} dt / ∫₀¹ (t^{m₁} + t^{m₂}) dt = ((m₁ + 1)(m₂ + 1)/(m₁ + m₂ + 2)) ∫₀¹ {Λ(t, 1) t^{m₁−1} + Λ(1, t) t^{m₂−1}} dt.

Example 3.12 (L_∞-radial representation of tail Spearman's rho). Let m₁ = m₂ = 1. Then µ_{m₁,m₂} is the uniform measure on I², which is the tail generating measure of tail Spearman's rho considered in Part 1) of Example 2.9. Therefore, Equation (11) in Proposition 3.11 yields the L_∞-radial representation of tail Spearman's rho given by

λ_S(Λ) = ∫₀¹ (Λ(t, 1) + Λ(1, t)) dt. (12)

The radial representations presented in Propositions 3.9 and 3.10 enable us to evaluate µ-TDMs via single integrations. Moreover, the representations give us insight into the choice of the tail generating measure µ ∈ ℳ. Concerning the L_∞-radial representation (9), µ-TDMs whose µ₁ (or µ₂) measure is supported on a strict subset of I may not be appropriate since such µ-TDMs do not take into account all the conditional probabilities Λ₁(u) and Λ₂(v), u, v ∈ I, considered in Section 3.2. Accordingly, the TDC may not be a suitable tail dependence measure since it ignores all the conditional probabilities Λ₁(u) and Λ₂(v), u, v ∈ (0, 1), except at u = v = 1. Finally, although there is no essential difference among these representations, Representation (9) may be the most tractable one since the marginal TDFs Λ(t,
1) and Λ(1 , t ) are typically more accessible thanΛ(cos φ, sin φ ); see various examples in Section 4.4. As deduced from Example 3.4, the choice of the tail generating measure µ ∈ M is important to overcomethe issue of underestimation of tail dependence. The maximal tail dependence measure λ ( C ) = sup µ ∈M λ µ ( C )naturally arises if µ ∈ M is chosen dependent on the underlying copula C ∈ C so that the resulting µ -TDM λ µ evaluates tail dependence most conservatively. In this section we study maximal TDM and related taildependence measures constructed by maximizing existing tail dependence measures. A useful formula ofmaximal TDM is first derived in Section 4.1. Section 4.2 concerns maximization of TDC and its normalizedversion over all the admissible paths heading to the origin. This maximization approach was first consideredin Furman et al. (2015), and we show that maximal TDM arises as the maximum of the normalized TDC overall admissible paths. Related maximal type tail dependence measures, such as the maximal tail dependencecoefficient (MTDC) , are also introduced. Axiomatic properties of tail dependence measures of maximal typeare shown in Section 4.3, and their examples are provided in Section 4.4. Through the examples we revealthat maximal TDM may overestimate, and MTDC may still underestimate degree of tail dependence. Given a tail dependence function Λ ∈ L , we are interested in choosing the tail generating measure µ ∈ M so that λ µ (Λ) is maximized. The class of maximal TDMs is then defined as follows. Definition 4.1 (Maximal tail dependence measures) . Let M ⊆ M . A measure λ M : L → I defined by λ M (Λ) = sup µ ∈M λ µ (Λ)is called the M - maximal tail dependence measure (MTDM) . When M = M , we call λ M the maximal taildependence measure , and denote it by λ .To study MTDM, let us define the normalized tail dependence functions byΛ ? ( t ) = Λ( t, t , Λ ? ( t ) = Λ(1 , t ) t and Λ ? ( t ) = Λ ? ( t ) ∨ Λ ? ( t ) , t ∈ I , where Λ ? (0), Λ ? 
(0) and Λ⋆(0) can be defined as limits provided they exist. The normalized TDFs are related to µ-TDMs via

Λ⋆1(t) = λ_{µ_{l(√t)}}(Λ) and Λ⋆2(t) = λ_{µ_{l(1/√t)}}(Λ), t ∈ (0, 1], (13)

where µ_{l(a)}, 0 < a < ∞, is as defined in Example 2.8. The collection of all such measures is denoted by M_l = {µ_{l(a)} ∈ M : 0 < a < ∞}.

Lemma 4.2 (Properties of the normalized tail dependence functions). Let Λ ∈ L. Then Λ⋆1, Λ⋆2 and Λ⋆ are I-valued, Lipschitz continuous and decreasing, and thus the limits Λ⋆1(0) = lim_{t↓0} Λ⋆1(t), Λ⋆2(0) = lim_{t↓0} Λ⋆2(t) and Λ⋆(0) = lim_{t↓0} Λ⋆(t) exist.

Proof.
We prove the statements for Λ⋆1; the cases of Λ⋆2 and Λ⋆ can be shown similarly. By Part 7) of Proposition A.5, we have that 0 ≤ Λ⋆1(t) = Λ(t, 1)/t ≤ M(t, 1)/t = 1, and thus Λ⋆1 is I-valued. It is also Lipschitz continuous by Part 1) of Proposition A.6. Finally, Λ⋆1 is decreasing since it is absolutely continuous and

(Λ⋆1)′(t) = (t ∂1Λ(t, 1) − Λ(t, 1))/t² = (t ∂1Λ(t, 1) − t ∂1Λ(t, 1) − ∂2Λ(t, 1))/t² = −∂2Λ(t, 1)/t² ≤ 0, t ∈ (0, 1],

by Parts 3) and 4) of Proposition A.6.

Using Lemma 4.2 we derive a formula for the maximal TDM as follows.

Theorem 4.3 (Formula for the maximal tail dependence measure). Let Λ ∈ L. Then it holds that

λ̄(Λ) = λ_{M_l}(Λ) = Λ⋆(0) = max( lim_{t↓0} Λ(t, 1)/t, lim_{t↓0} Λ(1, t)/t ). (14)

Proof.
By Lemma 4.2, we have that Λ⋆(0) ≥ Λ⋆(t) for all t ∈ (0, 1], that is,

Λ⋆(0) ≥ Λ(t, 1)/t and Λ⋆(0) ≥ Λ(1, t)/t for all t ∈ (0, 1].

For any µ ∈ M, the L∞-radial representation (9) implies that

λ_µ(Λ) = [w0 Λ(1, 1) + w1 ∫_(0,1) Λ(t, 1) dµ1(t) + w2 ∫_(0,1) Λ(1, t) dµ2(t)] / [w0 + w1 ∫_(0,1) t dµ1(t) + w2 ∫_(0,1) t dµ2(t)]
≤ [w0 Λ⋆(0) + w1 ∫_(0,1) t Λ⋆(0) dµ1(t) + w2 ∫_(0,1) t Λ⋆(0) dµ2(t)] / [w0 + w1 ∫_(0,1) t dµ1(t) + w2 ∫_(0,1) t dµ2(t)] = Λ⋆(0)

for w0, w1, w2 ≥ 0 with w0 + w1 + w2 = 1 and some Borel probability measures µ1 and µ2 on (0, 1); here Λ(1, 1) = Λ⋆1(1) ≤ Λ⋆(0) since Λ⋆1 is decreasing. Since Λ⋆ is decreasing and continuous, for any ε > 0 there exists t ∈ (0, 1] such that

Λ⋆(0) − ε < Λ⋆(t) = { λ_{µ_{l(√t)}}(Λ), if Λ⋆(t) = Λ⋆1(t); λ_{µ_{l(1/√t)}}(Λ), if Λ⋆(t) = Λ⋆2(t) }.

Since µ_{l(√t)}, µ_{l(1/√t)} ∈ M_l ⊂ M, we obtain the desired equations (14).

Note that, although the formula in Theorem 4.3 relies on the L∞-representation of µ-TDMs, a similar formula can be derived based on the L1-representation of µ-TDMs. We end this section with a remark on the attainability of the suprema in λ̄ and λ_{M_l}.

Remark 4.4 (Attainability of λ̄ and λ_{M_l}). Equation (14) implies that the suprema in λ̄ and λ_{M_l} are in general not attainable. As seen in the proof of Theorem 4.3, the maximal TDMs λ̄ and λ_{M_l} are attained if and only if there exists t ∈ (0,
1] such that Λ⋆(t) = Λ⋆(0). By Lemma 4.2 and Equation (13), this is also equivalent to the existence of t ∈ (0, 1] such that

∂2Λ(s, 1) = 0, 0 ≤ s ≤ t, if Λ⋆(0) = Λ⋆1(0), and ∂1Λ(1, s) = 0, 0 ≤ s ≤ t, if Λ⋆(0) = Λ⋆2(0),

under which the measures µ_{l(√t)} and µ_{l(1/√t)} attain the maximum, respectively. An example of such an attainable case can be found in Example 4.17.

For a conservative evaluation of tail dependence, Furman et al. (2015) considered various indices constructed from the supremum of the probability C(ϕ(p), ψ(p)) = P(U ≤ ϕ(p), V ≤ ψ(p)) over all paths ϕ, ψ : I → I such that ϕ(0) = ψ(0) = 0. In this section we show that the maximal TDM proposed in Section 4.1 can also be derived in their framework of path-based maximization. We also compare the maximal TDM with other indices of tail dependence. To this end, we begin with the definition of admissible paths.

Definition 4.5 (Admissible paths). A function ϕ : I → I is called an admissible path if it satisfies the following properties:
1. p² ≤ ϕ(p) ≤ 1 for all p ∈ I, and
2. ϕ(p) is continuous in a neighborhood of p = 0, and admits the derivative ϕ′(0+) = lim_{p↓0} ϕ(p)/p ∈ (0, ∞).

Let P be the set of all admissible paths. For ϕ ∈ P, the path p ↦ (ϕ(p), ψ(p)) = (ϕ(p), p²/ϕ(p)), p ∈ I, lies in I² by Part 1), and lim_{p↓0} ϕ(p) = lim_{p↓0} p²/ϕ(p) = 0 by Part 2) of Definition 4.5, respectively. Note that ϕ(p) = p leads to the diagonal path (ϕ(p), ψ(p)) = (p, p), whereas ϕ(p) = 1 and ϕ(p) = p² are not admissible since they violate Part 2) of Definition 4.5. In addition, the independence copula evaluated along the path (ϕ(p), ψ(p)) is free from the choice of ϕ since Π(ϕ(p), p²/ϕ(p)) = p². By replacing the diagonal path with (ϕ(p), ψ(p)), we define the path-based tail dependence coefficients as follows.

Definition 4.6 (ϕ-tail dependence coefficients). Let C ∈ C and ϕ ∈ P.
Then the ϕ-tail dependence coefficient (ϕ-TDC) and ϕ-normalized tail dependence coefficient (ϕ-NTDC) are defined, respectively, by

χ_ϕ(C) = lim_{p↓0} C(ϕ(p), p²/ϕ(p))/p and κ_ϕ(C) = lim_{p↓0} C(ϕ(p), p²/ϕ(p))/M(ϕ(p), p²/ϕ(p)),

provided the limits exist.

Note that both the ϕ-TDC and the ϕ-NTDC yield the TDC when ϕ is the diagonal path ϕ(p) = p. Moreover, the ϕ-NTDC can be interpreted as a normalized ϕ-TDC since κ_ϕ(C) = χ_ϕ(C)/χ_ϕ(M). The next proposition shows that χ_ϕ(C) and κ_ϕ(C) are well-defined for C ∈ C_L2, and admit representations in terms of the TDF.

Proposition 4.7 (Existence and representation of ϕ-tail dependence coefficients). Suppose C ∈ C_L2 and ϕ ∈ P. Then
1. χ_ϕ(C) and κ_ϕ(C) are well-defined;
2. χ_ϕ(C) and κ_ϕ(C) can be written as

χ_ϕ(C) = Λ(b_ϕ, 1/b_ϕ; C) and κ_ϕ(C) = Λ(b_ϕ, 1/b_ϕ; C)/M(b_ϕ, 1/b_ϕ), where b_ϕ = ϕ′(0+) ∈ (0, ∞); (15)

3. 0 ≤ χ_ϕ(C) ≤ κ_ϕ(C) ≤ 1.

Proof.
Let C ∈ C_L2 and b_ϕ = ϕ′(0+) ∈ (0, ∞). Since

| C(ϕ(p), p²/ϕ(p))/p − C(b_ϕ p, p/b_ϕ)/p | ≤ | ϕ(p)/p − b_ϕ | + | p/ϕ(p) − 1/b_ϕ | → 0 as p ↓ 0,

we have that

χ_ϕ(C) = lim_{p↓0} C(ϕ(p), p²/ϕ(p))/p = lim_{p↓0} C(b_ϕ p, p/b_ϕ)/p = Λ(b_ϕ, 1/b_ϕ; C),

and that

κ_ϕ(C) = lim_{p↓0} C(ϕ(p), p²/ϕ(p))/M(ϕ(p), p²/ϕ(p)) = lim_{p↓0} [C(ϕ(p), p²/ϕ(p))/p] · [p/M(ϕ(p), p²/ϕ(p))] = Λ(b_ϕ, 1/b_ϕ; C)/M(b_ϕ, 1/b_ϕ),

which yield 1) and 2). Finally, 3) follows since 0 ≤ Λ(b_ϕ, 1/b_ϕ; C) ≤ M(b_ϕ, 1/b_ϕ) ≤ 1.

A conservative assessment of tail dependence is obtained by maximizing the ϕ-TDC and ϕ-NTDC over all admissible paths. This motivates us to define the following coefficients.

Definition 4.8 (Maximal tail dependence coefficients). For C ∈ C_L2, the maximal tail dependence coefficient (MTDC) and maximal normalized tail dependence coefficient (MNTDC) are defined, respectively, by

χ̄(C) = sup_{ϕ ∈ P} χ_ϕ(C) and κ̄(C) = sup_{ϕ ∈ P} κ_ϕ(C).

The coefficients χ̄(C) and κ̄(C) are well-defined by Part 1) of Proposition 4.7. Since χ_ϕ and κ_ϕ admit Representations (15), χ̄(C) and κ̄(C) can also be expressed in terms of the TDF as follows.

Proposition 4.9 (Representations of maximal tail dependence coefficients). Let C ∈ C_L2. Then we have that

χ̄(C) = sup_{b ∈ R+} Λ(b, 1/b; C) and κ̄(C) = sup_{b ∈ R+} Λ(b, 1/b; C)/M(b, 1/b). (16)

By Proposition 4.9, the maximal TDCs χ̄(C) and κ̄(C) can be seen as functions of Λ ∈ L through (16), and thus we also write them as χ̄(Λ) and κ̄(Λ).
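As a quick numerical illustration of (16), the following sketch grid-searches both suprema for the TDF Λ(u, v) = min(au, bv) of a survival Marshall-Olkin copula (treated in Example 4.17, where the closed forms χ̄ = √(ab) and κ̄ = max(a, b) are available); the parameter values a = 0.3 and b = 0.6 are arbitrary choices for illustration.

```python
import numpy as np

# TDF of a survival Marshall-Olkin copula (cf. Example 4.17): Lambda(u, v) = min(a*u, b*v)
a, b = 0.3, 0.6  # illustrative parameter values

bs = np.linspace(0.05, 20.0, 200001)           # grid over b in R+
vals = np.minimum(a * bs, b / bs)              # Lambda(b, 1/b) along the grid
chi = vals.max()                               # approximates sup_b Lambda(b, 1/b), cf. (16)
kap = (vals / np.minimum(bs, 1.0 / bs)).max()  # approximates sup_b Lambda(b, 1/b)/M(b, 1/b), cf. (16)
```

Up to grid error, `chi` should match √(ab) and `kap` should match max(a, b), illustrating that the normalized supremum κ̄ dominates χ̄.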
Note that the supremum in χ̄(C) is attained at some b ∈ R+ since lim_{b↓0} Λ(b, 1/b) = lim_{b→∞} Λ(b, 1/b) = 0 and the map b ↦ Λ(b, 1/b) ∈ [0, 1] is continuous and bounded. Therefore, we can normalize χ̄(Λ) by

m(Λ) = sup{ M(a, 1/a) : a ∈ argmax_{b ∈ R+} Λ(b, 1/b) } ∈ I (17)

to see the relative degree of tail dependence of Λ against M. Suppose that m(Λ) = 0. Then we have that a = 0 or a = ∞ for a ∈ argmax_{b ∈ R+} Λ(b, 1/b), and thus Λ(a, 1/a) = 0. However, this implies that Λ(b, 1/b) = 0 for all b ∈ R+, so that the argmax is all of R+ and thus m(Λ) = 1, which is a contradiction. Therefore, we have that m(Λ) > 0.

Definition 4.10 (Maximal intermediate tail dependence coefficient). For Λ ∈ L, the maximal intermediate tail dependence coefficient (MITDC) is defined by χ⋆(Λ) = χ̄(Λ)/m(Λ), where m(Λ) is as defined in (17).

Remark 4.11 (Difference from the constructions in Furman et al. (2015)). Our constructions of maximal tail dependence coefficients in Definition 4.8 differ from those in Furman et al. (2015), who first define the path of maximal dependence ϕ* ∈ P such that

C(ϕ*(p), p²/ϕ*(p)) = max_{ϕ ∈ P} C(ϕ(p), p²/ϕ(p)) for all p ∈ I, (18)

and then construct a tail index as the limit lim_{p↓0} C(ϕ*(p), p²/ϕ*(p))/p. We adopt different definitions since our interest lies in the maximum of C(ϕ(p), p²/ϕ(p)) only in the neighborhood of p = 0. Moreover, as mentioned in Furman et al. (2015), uniqueness of the path ϕ* in (18) needs to be assumed so that ϕ* is continuous; otherwise the path can be pathological, for example fluctuating between two continuous paths infinitely often.

The coefficients defined in Definitions 4.8 and 4.10 are related to the TDC and the maximal TDM by the following theorem.

Theorem 4.12 (Inequalities for maximal tail dependence coefficients). For Λ ∈ L, we have that

λ(Λ) ≤ χ̄(Λ) ≤ χ⋆(Λ) ≤ κ̄(Λ) = λ̄(Λ).

Proof.
Let Λ ∈ L. The inequalities λ(Λ) ≤ χ̄(Λ) ≤ χ⋆(Λ) follow since

λ(Λ) = Λ(1, 1) ≤ sup_{b ∈ R+} Λ(b, 1/b) = χ̄(Λ) ≤ χ̄(Λ)/m(Λ) = χ⋆(Λ).

The inequality χ⋆(Λ) ≤ κ̄(Λ) holds since

κ̄(Λ) = sup_{b ∈ R+} Λ(b, 1/b)/M(b, 1/b) ≥ Λ(a, 1/a)/M(a, 1/a) ≥ χ̄(Λ)/m(Λ) = χ⋆(Λ), a ∈ argmax_{b ∈ R+} Λ(b, 1/b).

Finally, since Λ(b, 1/b)/M(b, 1/b) = λ_{µ_{l(b)}}(Λ), b ∈ R+, we have that κ̄(Λ) = λ̄(Λ) by Theorem 4.3.

We end this section with an example such that λ(Λ) < χ̄(Λ) < χ⋆(Λ) < κ̄(Λ), Λ ∈ L.

Example 4.13 (A mixture of the singular copulas in Example A.3). For w1, w2 ≥ 0 with w1 + w2 = 1 and θ1, θ2 ∈ (0, 1), consider the mixture C_{w1,w2,θ1,θ2} = w1 C_{θ1} + w2 τ*C_{θ2}, where C_θ is the singular copula considered in Example A.3 and (V, U) ∼ τ*C_θ for (U, V) ∼ C_θ. Since

Λ(u, v; C_θ) = { u, if θv ≥ u; θv, if θv < u } and Λ(u, v; τ*C_θ) = { v, if θu ≥ v; θu, if θu < v },

we have that

Λ(u, v; C_{w1,w2,θ1,θ2}) = { w1 u + w2 θ2 u, if θ1 v ≥ u; w1 θ1 v + w2 θ2 u, if θ1 v < u and θ2 u < v; w1 θ1 v + w2 v, if θ2 u ≥ v },

and thus λ(C_{w1,w2,θ1,θ2}) = Λ(1, 1; C_{w1,w2,θ1,θ2}) = w1 θ1 + w2 θ2. Since the function

b ↦ Λ(b, 1/b; C_{w1,w2,θ1,θ2}) = { (w1 + w2 θ2) b, if b ≤ √θ1; w1 θ1/b + w2 θ2 b, if √θ1 < b < 1/√θ2; (w1 θ1 + w2)/b, if 1/√θ2 ≤ b }

is maximized at b = √θ1 if w1 √θ1 ≥ w2 √θ2, and at b = 1/√θ2 if w1 √θ1 < w2 √θ2, we have that

χ̄(C_{w1,w2,θ1,θ2}) = { (w1 + w2 θ2) √θ1, if w1 √θ1 ≥ w2 √θ2; (w1 θ1 + w2) √θ2, if w1 √θ1 < w2 √θ2 },

and that

χ⋆(C_{w1,w2,θ1,θ2}) = { w1 + w2 θ2, if w1 √θ1 ≥ w2 √θ2; w1 θ1 + w2, if w1 √θ1 < w2 √θ2 }.

Next, the function

b ↦ Λ(b, 1/b; C_{w1,w2,θ1,θ2})/M(b, 1/b) = { w1 + w2 θ2, if b ≤ √θ1; w1 θ1/b² + w2 θ2, if √θ1 < b ≤ 1; w1 θ1 + w2 θ2 b², if 1 < b < 1/√θ2; w1 θ1 + w2, if 1/√θ2 ≤ b }

attains its maximum w1 + w2 θ2 on b ≤ √θ1 if w1(1 − θ1) ≥ w2(1 − θ2), and its maximum w1 θ1 + w2 on b ≥ 1/√θ2 if w1(1 − θ1) < w2(1 − θ2). Therefore, we have that

κ̄(C_{w1,w2,θ1,θ2}) = { w1 + w2 θ2, if w1(1 − θ1) ≥ w2(1 − θ2); w1 θ1 + w2, if w1(1 − θ1) < w2(1 − θ2) }.

For instance, taking (w1, w2, θ1, θ2) = (0.
…, 0.…, 0.…, 0.…) yields w1(1 − θ1) = 0.…, w2(1 − θ2) = 0.…, w1 √θ1 = 0.… and w2 √θ2 = 0.…, and hence

λ(C_{w1,w2,θ1,θ2}) = 0.… < χ̄(C_{w1,w2,θ1,θ2}) = 0.… < χ⋆(C_{w1,w2,θ1,θ2}) = 0.… < κ̄(C_{w1,w2,θ1,θ2}) = 0.….

As a particular case when w1 = 1 and w2 = 0, we have that

λ(C_θ) = θ < χ̄(C_θ) = √θ < χ⋆(C_θ) = κ̄(C_θ) = 1.

Regarding the attainer of χ̄(C_{w1,w2,θ1,θ2}) and χ⋆(C_{w1,w2,θ1,θ2}), if w1 = w2, that is, if C_{θ1} and τ*C_{θ2} are equally weighted, then the line closer to the main diagonal attains the maximum of χ_ϕ(C). If θ1 = θ2, then the line having the higher probability weight is chosen. For the attainer of κ̄(C_{w1,w2,θ1,θ2}), if w1 = w2 then the sharper line (the one further from the main diagonal) becomes dominant in determining κ̄(C_{w1,w2,θ1,θ2}). If θ1 = θ2, the line having the higher probability weight becomes dominant.

Various tail dependence measures χ̄, χ⋆ and κ̄ = λ̄ were introduced in Sections 4.1 and 4.2. In this section we study their properties as considered in Section 3.1. We first investigate axiomatic properties in the following proposition.

Proposition 4.14 (Basic properties of tail dependence measures of maximal type). Consider the following properties for ξ : L → R+.
1. (Range): 0 ≤ ξ(Λ) ≤ 1 for all Λ ∈ L.
2. (Bound): ξ(M) = 1.
3. (Asymptotic independence): ξ(Λ) = 0 if and only if Λ(1,
1) = 0.
4. (Convexity): ξ(tΛ1 + (1 − t)Λ2) ≤ tξ(Λ1) + (1 − t)ξ(Λ2) for every Λ1, Λ2 ∈ L and t ∈ I.
5. (Monotonicity): If Λ1 ⪯ Λ2 for Λ1, Λ2 ∈ L, then ξ(Λ1) ≤ ξ(Λ2).
6. (Continuity): If C_n ∈ C_L2, n = 1, 2, . . . , converges to C ∈ C_L2 pointwise as n → ∞, then lim_{n→∞} ξ(Λ(·; C_n)) = ξ(Λ(·; C)).

Then χ̄ and λ̄ satisfy 1)–6), and χ⋆ satisfies 1), 2) and 3). Moreover, χ⋆ satisfies 5) if m(Λ1) = m(Λ2), and χ⋆ satisfies 6) if the maximizers of Λ(b, 1/b; C_n), n = 1, 2, . . . , and of Λ(b, 1/b; C) are all unique.

Proof.
For χ̄, χ⋆ and λ̄, 1) follows from Theorem 4.12, 2) is straightforward to check, and 3) will be shown in Parts 2) and 3) of Proposition 4.16. Properties 4) and 5) for χ̄ and λ̄ are straightforward to check, and 5) for χ⋆ is also straightforward under the assumption that m(Λ1) = m(Λ2). Therefore, it remains to show 6) for χ̄, χ⋆ and λ̄. To this end, suppose that C_n ∈ C_L2, n = 1, 2, . . . , converges to C ∈ C_L2 pointwise as n → ∞. As seen in the proof of Part 6) of Proposition 3.2, the convergence of Λ_n(u, v) = Λ(u, v; C_n) to Λ(u, v) = Λ(u, v; C) is uniform in (u, v). Therefore, we have that

lim_{n→∞} χ̄(C_n) = lim_{n→∞} max_{b ∈ R+} Λ_n(b, 1/b) = max_{b ∈ R+} Λ(b, 1/b) = χ̄(C),

and that

lim_{n→∞} λ̄(C_n) = lim_{n→∞} max_{b ∈ R+} Λ_n(b, 1/b)/M(b, 1/b) = lim_{n→∞} max_{b ∈ R+} Λ_n(b/M(b, 1/b), (1/b)/M(b, 1/b)) = max_{b ∈ R+} Λ(b/M(b, 1/b), (1/b)/M(b, 1/b)) = max_{b ∈ R+} Λ(b, 1/b)/M(b, 1/b) = λ̄(C),

where the homogeneity of Λ is used in the second and fourth equalities. Finally, if the maximizers of Λ(b, 1/b; C_n), n = 1, 2, . . . , and of Λ(b, 1/b; C) are all unique, then lim_{n→∞} m(Λ_n) = m(Λ), and thus we conclude that lim_{n→∞} χ⋆(C_n) = χ⋆(C).

The assumptions for χ⋆ to satisfy Parts 5) and 6) typically hold if Λ1 and Λ2 in Part 5), and C_n, n = 1, 2, . . . , and C in Part 6), all belong to the same class of copulas; see the examples in Section 4.4. On the other hand, χ⋆ is in general neither convex nor concave, as shown in the following example.

Remark 4.15 (Convexity for the maximal intermediate tail dependence coefficient). Consider two mixtures of singular copulas C1 = C_{w1,1, w2,1, θ1,1, θ2,1} and C2 = C_{w1,2, w2,2, θ1,2, θ2,2} as studied in Example 4.13.

Case 1: Let t = 0.…, w1,1 = 0.…, w2,1 = 0.…, w1,2 = 0.…, w2,2 = 0.…, θ1,1 = 0.…, θ2,1 = 0.…, θ1,2 = 0.
422 and θ2,2 = 0.…. Since w1,1 √θ1,1 ≥ w2,1 √θ2,1 and w1,2 √θ1,2 ≤ w2,2 √θ2,2, the maxima of Λ1(b, 1/b) and Λ2(b, 1/b) are attained at b1* = √θ1,1 and b2* = 1/√θ2,2, respectively. Moreover, the maximum of t Λ1(b, 1/b) + (1 − t) Λ2(b, 1/b) is attained at b_t* = 1/√θ2,2. Therefore, by calculation, we have that

χ⋆(tΛ1 + (1 − t)Λ2) = [t Λ1(b_t*, 1/b_t*) + (1 − t) Λ2(b_t*, 1/b_t*)] / M(b_t*, 1/b_t*) = 0.…,

and that

t χ⋆(Λ1) + (1 − t) χ⋆(Λ2) = t Λ1(b1*, 1/b1*)/M(b1*, 1/b1*) + (1 − t) Λ2(b2*, 1/b2*)/M(b2*, 1/b2*) = 0.….

Case 2: Let t = 0.…, w1,1 = 0.…, w2,1 = 0.…, w1,2 = 0.…, w2,2 = 0.…, θ1,1 = 0.…, θ2,1 = 0.…, θ1,2 = 0.120 and θ2,2 = 0.…. In this case b1* = √θ1,1, b2* = 1/√θ2,2 and b_t* = √θ1,1. Therefore, by calculation, we have that

χ⋆(tΛ1 + (1 − t)Λ2) = 0.… > t χ⋆(Λ1) + (1 − t) χ⋆(Λ2) = 0.….

By Case 1 and Case 2, χ⋆ is neither convex nor concave.

Finally, relationships of χ̄, χ⋆ and λ̄ with the TDC are summarized in the following proposition.

Proposition 4.16 (Relationships with the tail dependence coefficient). Let Λ ∈ L. Then we have the following properties.
1. λ(Λ) ≤ χ̄(Λ) ≤ χ⋆(Λ) ≤ λ̄(Λ) ≤ 1.
2. If λ(Λ) = 0, then χ̄(Λ) = χ⋆(Λ) = λ̄(Λ) = 0.
3. If at least one of χ̄(Λ), χ⋆(Λ) and λ̄(Λ) is zero, then λ(Λ) = 0.
4. If λ(Λ) = 1, then χ̄(Λ) = χ⋆(Λ) = λ̄(Λ) = 1.
5. If χ̄(Λ) = 1, then λ(Λ) = 1. Moreover, χ⋆(Λ) = 1 or λ̄(Λ) = 1 do not imply λ(Λ) = 1.

Proof.
1) follows from Theorem 4.12 together with λ̄(Λ) ≤ 1. 2) is implied by Part 5) of Proposition A.5. 3) and 4) are direct consequences of 1). For 5), we have that χ⋆(C_θ) = λ̄(C_θ) = 1 but λ(C_θ) = θ < 1 for θ ∈ (0, 1), where C_θ is the singular copula considered in Example A.3; see Example 4.13. Therefore, it remains to show that χ̄(Λ) = 1 implies λ(Λ) = 1. Suppose χ̄(Λ) = 1. Then Λ(b*, 1/b*) = χ̄(Λ) = 1, where b* ∈ argmax_{b ∈ R+} Λ(b, 1/b). We have that 0 < b* < ∞ since Λ(b*, 1/b*) = 0 if b* = 0 or ∞. Suppose that 0 < b* < 1. Then, by Part 7) of Proposition A.5, we have that

1 = Λ(b*, 1/b*) ≤ M(b*, 1/b*) = b*,

and thus 1 ≤ b*, which is a contradiction. Similarly, if we assume that 1 < b*, then we have that 1 = Λ(b*, 1/b*) ≤ M(b*, 1/b*) = 1/b*, and thus b* ≤ 1, which is again a contradiction. Therefore, we have that b* = 1, and thus λ(Λ) = Λ(1, 1) = Λ(b*, 1/b*) = χ̄(Λ) = 1.

In this section we derive the maximal type tail dependence measures χ̄, χ⋆ and λ̄ for various copulas. Through the examples, we reveal that the maximal TDM may overestimate, and the MTDC and MITDC may underestimate, the degree of tail dependence.

Example 4.17 (Survival Marshall-Olkin copula). Let us consider the
Marshall-Olkin copula C^MO_{a,b} defined by

C^MO_{a,b}(u, v) = min(u^{1−a} v, u v^{1−b}), a, b ∈ (0, 1), (u, v) ∈ I².

By calculation, the TDF of the survival Marshall-Olkin copula is given by

Λ(u, v; Ĉ_{a,b}) = u + v − max(v + (1 − a)u, u + (1 − b)v) = min(au, bv).

Therefore, the TDC is given by λ(Ĉ_{a,b}) = Λ(1, 1; Ĉ_{a,b}) = min(a, b). Moreover, since the function t ↦ Λ(t, 1/t) = min(at, b/t) is maximized at t = √(b/a), we have that χ̄(Ĉ_{a,b}) = √(ab) and

χ⋆(Ĉ_{a,b}) = √(ab)/M(√(b/a), √(a/b)) = max(a, b).

Finally, since Λ(t,
1) = min(at, b) and Λ(1, t) = min(a, bt), t ∈ I, we have that

λ_S(Ĉ_{a,b}) = ∫_0^1 (Λ(t, 1; Ĉ_{a,b}) + Λ(1, t; Ĉ_{a,b})) dt = { 3a/2 − a²/(2b), if a < b; 3b/2 − b²/(2a), if a ≥ b }

by (12), and that λ̄(Ĉ_{a,b}) = max( lim_{t↓0} Λ(t, 1)/t, lim_{t↓0} Λ(1, t)/t ) = max(a, b), which is attained by any measure µ_{l(t)} ∈ M_l such that Λ(t, 1/t)/M(t, 1/t) = max(a, b), that is, t ∈ (0, √(b/a)] if a ≥ b and t ∈ [√(b/a), ∞) if a < b.

Figure 2: The tail dependence functions b ↦ Λ(b, 1/b) and b ↦ Λ(1/b, b) for (a) the Clayton copula C^Cl_θ, (b) the t copula C^t_{ν,ρ} with ν = 5, (c)(d) the survival asymmetric Gumbel copula C^Gu_{α,β,θ} with α = 0.75 and β = 0.35, and (e)(f) the survival asymmetric Galambos copula C^Ga_{α,β,θ} with α = 0.35 and β = 0.75. The parameters of the copulas vary from (a) θ = 0.075 to 68.…, (b) ρ = −0.99 to 1, (c)(d) θ = 1.00 to 69.…, and (e)(f) θ = 0.… to ….

Figure 3: The normalized tail dependence functions t ↦ Λ(t, 1)/t and t ↦ Λ(1, t)/t for (a) the Clayton copula C^Cl_θ, (b) the t copula C^t_{ν,ρ} with ν = 5, (c)(d) the survival asymmetric Gumbel copula C^Gu_{α,β,θ} with α = 0.75 and β = 0.35, and (e)(f) the survival asymmetric Galambos copula C^Ga_{α,β,θ} with α = 0.35 and β = 0.75. The parameters of the copulas vary from (a) θ = 0.075 to 68.…, (b) ρ = −0.99 to 1, (c)(d) θ = 1.00 to 69.…, and (e)(f) θ = 0.… to ….

Example 4.18 (Survival EV copulas). For A ∈ A, let C_A be the EV copula defined in Equation (19). By Equation (20), we have that

Λ(t, 1)/t = (1 − A(w))/w and Λ(1, t)/t = (1 − A(1 − w))/w, where w = w(t) = t/(t + 1) ∈ I,

for Λ(·) = Λ(·; Ĉ_A). Therefore, we have that λ(Ĉ_A) = 2(1 − A(1/2)), and that λ̄(Ĉ_A) = max(−A′(0), A′(1)). In particular, for the asymmetric Gumbel and Galambos copulas, we have that

Λ^Gu_{α,β,θ}(b, 1/b) = αb + β/b − ((αb)^θ + (β/b)^θ)^{1/θ}, 0 ≤ α, β ≤ 1, θ ≥ 1,
Λ^Ga_{α,β,θ}(b, 1/b) = ((αb)^{−θ} + (β/b)^{−θ})^{−1/θ}, 0 ≤ α, β ≤ 1, θ > 0,

where b ∈ R+; see Figure 2 (c)(d)(e)(f) for examples of the curves. By tedious calculation, both functions are maximized at b* = √(β/α). Therefore, we have that χ̄(Λ^Gu_{α,β,θ}) = (2 − 2^{1/θ})√(αβ), χ⋆(Λ^Gu_{α,β,θ}) = (2 − 2^{1/θ}) max(α, β), χ̄(Λ^Ga_{α,β,θ}) = 2^{−1/θ} √(αβ) and χ⋆(Λ^Ga_{α,β,θ}) = 2^{−1/θ} max(α, β). In addition, the normalized TDFs are given by

Λ^Gu_{α,β,θ}(t, 1)/t = [αw + β(1 − w) − ((αw)^θ + (β(1 − w))^θ)^{1/θ}]/w = α + β/t − (α^θ + β^θ t^{−θ})^{1/θ},
Λ^Gu_{α,β,θ}(1, t)/t = [α(1 − w) + βw − ((α(1 − w))^θ + (βw)^θ)^{1/θ}]/w = α/t + β − (α^θ t^{−θ} + β^θ)^{1/θ},

and

Λ^Ga_{α,β,θ}(t, 1)/t = ((αw)^{−θ} + (β(1 − w))^{−θ})^{−1/θ}/w = (α^{−θ} + β^{−θ} t^θ)^{−1/θ},
Λ^Ga_{α,β,θ}(1, t)/t = ((α(1 − w))^{−θ} + (βw)^{−θ})^{−1/θ}/w = (α^{−θ} t^θ + β^{−θ})^{−1/θ},

for w = w(t) = t/(t + 1) and t ∈ I. Therefore, we obtain the formulas λ(Λ^Gu_{α,β,θ}) = α + β − (α^θ + β^θ)^{1/θ}, λ(Λ^Ga_{α,β,θ}) = (α^{−θ} + β^{−θ})^{−1/θ} and λ̄(Λ^Gu_{α,β,θ}) = λ̄(Λ^Ga_{α,β,θ}) = max(α, β).
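The closed form χ̄(Λ^Gu_{α,β,θ}) = (2 − 2^{1/θ})√(αβ) with maximizer b* = √(β/α) can be checked numerically. The sketch below grid-searches sup_b Λ(b, 1/b) under the survival asymmetric Gumbel TDF Λ(x, y) = αx + βy − ((αx)^θ + (βy)^θ)^{1/θ}, and approximates the limits defining λ̄ = max(α, β) at a small t; the parameter values (α, β, θ) = (0.75, 0.35, 2) are illustrative assumptions.

```python
import numpy as np

alpha, beta, theta = 0.75, 0.35, 2.0  # illustrative parameter values

def Lam_gu(x, y):
    # TDF of a survival asymmetric Gumbel copula (cf. Example 4.18)
    return alpha * x + beta * y - ((alpha * x) ** theta + (beta * y) ** theta) ** (1.0 / theta)

bs = np.linspace(0.01, 10.0, 100001)
chi_num = Lam_gu(bs, 1.0 / bs).max()                         # grid approximation of sup_b Lam(b, 1/b)
chi_closed = (2.0 - 2.0 ** (1.0 / theta)) * np.sqrt(alpha * beta)
b_star = np.sqrt(beta / alpha)                               # claimed maximizer b* = sqrt(beta/alpha)

# normalized TDFs at a small t approximate the limits defining lambda_bar = max(alpha, beta)
t = 1e-6
lam_bar_est = max(Lam_gu(t, 1.0) / t, Lam_gu(1.0, t) / t)
```

The grid maximum should agree with the closed form up to grid error, and `lam_bar_est` should be close to max(α, β) = 0.75.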
Examples of the curves of the normalized TDFs are provided in Figure 3 (c)(d)(e)(f). Note that, in Figure 2, we plot the two functions b ↦ Λ(b, 1/b) and b ↦ Λ(1/b, b) on b ∈ I in order to visualize the single function b ↦ Λ(b, 1/b) on b ∈ R+ so that the two cases b ≤ 1 and b ≥ 1 are treated on the same scale. The variable b ∈ I in Λ(b, 1/b) and Λ(1/b, b) is related to the variable t ∈ I in the normalized TDFs Λ⋆1(t) and Λ⋆2(t), and thus to µ-TDMs, via

Λ(b, 1/b)/M(b, 1/b) = Λ⋆1(b²) = λ_{µ_{l(b)}}(Λ) and Λ(1/b, b)/M(1/b, b) = Λ⋆2(b²) = λ_{µ_{l(1/b)}}(Λ).

Example 4.19 (Symmetric copulas). In this example we consider the survival Gumbel, Clayton and t copulas. First, since the asymmetric Gumbel copula with α = β = 1 yields the Gumbel copula, we have, by Example 4.18, that

λ(Ĉ^Gu_θ) = χ̄(Ĉ^Gu_θ) = χ⋆(Ĉ^Gu_θ) = 2 − 2^{1/θ} and λ̄(Ĉ^Gu_θ) = 1, θ ≥ 1,

for the survival Gumbel copula Ĉ^Gu_θ, where the maximum of Λ(b, 1/b; Ĉ^Gu_θ) is attained at b = 1. For the Clayton copula, we have that Λ^Cl_θ(b, 1/b) = (b^{−θ} + b^θ)^{−1/θ} for b ∈ R+ and θ ∈ (0, ∞), which is maximized at b* = 1; see also Figure 2 (a). Therefore, we have that λ(C^Cl_θ) = χ̄(C^Cl_θ) = χ⋆(C^Cl_θ) = 2^{−1/θ}. In addition, the normalized TDFs are given by Λ(t, 1)/t = Λ(1, t)/t = (1 + t^{−θ})^{−1/θ}/t, t ∈ I, and thus λ̄(C^Cl_θ) = lim_{t↓0} (1 + t^{−θ})^{−1/θ}/t = 1; see Example A.2 and Figure 3 (a). Finally, for t copulas, their TDFs are given in Example A.1, and we have, by calculation, that

λ(C^t_{ν,ρ}) = χ̄(C^t_{ν,ρ}) = χ⋆(C^t_{ν,ρ}) = 2 T_{ν+1}(−√((ν + 1)(1 − ρ)/(1 + ρ))) and λ̄(C^t_{ν,ρ}) = 1;

see also Figure 2 (b) and Figure 3 (b) for examples.

We end this section with a discussion on the choice of tail dependence measures based on the examples above.
First, the MTDMs λ̄ of the survival asymmetric Gumbel and Galambos, Clayton and t copulas are independent of the parameter θ for the first three copulas, and of ρ for t copulas. Since these parameters control the degree of tail dependence, it turns out that λ̄ may not be an appropriate tail dependence measure. In fact, the MTDM may overestimate the degree of tail dependence since λ̄(Ĉ^Gu_θ) = λ̄(C^Cl_θ) = λ̄(C^t_{ν,ρ}) = 1 independently of the parameters. Second, the maximal TDCs χ̄ and χ⋆ may still underestimate tail dependence since they coincide with the TDC for the survival Gumbel, Clayton and t copulas. From the viewpoint of the L1- and L∞-radial representations of µ-TDMs, all the maximal type tail dependence measures χ̄, χ⋆ and λ̄ quantify tail dependence only along a specific angle. As discussed in Section 3.3, such measures may not be appropriate since they do not take into account all tail conditional probabilities. Therefore, we conclude that tail dependence among copulas should be compared not solely by the maximal type measures χ̄, χ⋆ and λ̄ but also by µ-TDMs, such as tail Spearman's rho, so that all tail conditional probabilities are accounted for in the comparison.

In this section we conduct numerical experiments to demonstrate the performance of the proposed tail dependence measures for various copulas. The proposed measures are computed based on the estimated TDF constructed from (pseudo-)samples from an underlying copula; see Appendix B for statistical inference of the proposed measures. A simulation study is first conducted in Section 5.1, and a real data analysis is provided in Section 5.2.
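The overestimation by λ̄ noted above can be seen numerically: under the survival Gumbel TDF Λ(x, y) = x + y − (x^θ + y^θ)^{1/θ} (Example 4.19 with α = β = 1), the TDC λ = 2 − 2^{1/θ} moves with θ, while the limit defining λ̄ stays at 1 for every θ. A small sketch, with arbitrary θ values:

```python
def lam_gu(theta):
    # TDC of the survival Gumbel copula: lambda = Lambda(1, 1) = 2 - 2^(1/theta)
    return 2.0 - 2.0 ** (1.0 / theta)

def lam_bar_gu(theta, t=1e-6):
    # lambda_bar = lim_{t -> 0} Lambda(t, 1)/t, approximated at a small t
    lam_t = t + 1.0 - (t ** theta + 1.0) ** (1.0 / theta)  # Lambda(t, 1)
    return lam_t / t

pairs = [(lam_gu(th), lam_bar_gu(th)) for th in (1.5, 2.0, 4.0)]
# lambda increases with theta, while the lambda_bar approximation stays near 1 for every theta
```

This illustrates why λ̄ alone cannot rank these copulas by their strength of tail dependence.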
We first consider the survival Marshall-Olkin copula Ĉ^MO_{a,b} and the skew t copula C^ST_{ν,δ1,δ2,γ}, where the Marshall-Olkin copula is as given in Example 4.17 and the skew t copula is as specified in Smith et al. (2012). We set (a, b) = (0.…, 0.75) in the survival Marshall-Olkin copula and (ν, δ1, δ2, γ) = (5, 0.…, −0.…, 0.95) in the skew t copula. The scatter plots of these copulas are provided in Figure 4.

We first simulate n = 10^… samples from Ĉ^MO_{a,b} and C^ST_{ν,δ1,δ2,γ}, and then estimate their TDFs by

Λ_n(u, v) = (n/k) C_n((k/n)u, (k/n)v) for (u, v) ∈ {1/L, . . . , 1}²,

where L = 100, k ∈ N will be specified below, and C_n is the empirical copula constructed from the samples. We conduct a plateau-finding algorithm to find k in the estimator. For each k ∈ [0.…n, 0.…n], we estimate the tail dependence coefficient by λ_n = Λ_n(1, 1), and choose k* from an interval on which the estimated TDCs are almost unchanged. The resulting plots are provided in Figure 5, based on which we choose k_SMO = 400 and k_ST = 150. Finally, based on the estimated TDFs, we estimate the tail dependence coefficient λ, tail Spearman's rho λ_S, and tail Gini's gamma.
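The TDF estimator above can be sketched as follows; the plateau-finding step is omitted, and the comonotone toy sample and the values of n and k are illustrative assumptions only (for comonotone data the TDF is M, so the estimated TDC should be close to 1).

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 10000, 200  # illustrative sample size and tuning parameter

# toy data: comonotone pairs, for which Lambda(u, v) = min(u, v) and lambda(C) = 1
X = rng.normal(size=n)
Y = X.copy()
U = (np.argsort(np.argsort(X)) + 1.0) / (n + 1.0)  # pseudo-observations (ranks / (n + 1))
V = (np.argsort(np.argsort(Y)) + 1.0) / (n + 1.0)

def C_n(u, v):
    # empirical copula of the pseudo-observations
    return np.mean((U <= u) & (V <= v))

def Lambda_n(u, v):
    # TDF estimator: Lambda_n(u, v) = (n/k) * C_n(k u / n, k v / n)
    return (n / k) * C_n(k * u / n, k * v / n)

tdc_n = Lambda_n(1.0, 1.0)  # estimate of lambda(C) = Lambda(1, 1)
```

Evaluating Λ_n on a grid of (u, v) values then yields plug-in estimates of the µ-TDMs discussed in the text.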
l ll lll ll l llll lll l ll ll ll ll ll ll lll lll l ll lll ll ll ll lll l lll l ll ll ll ll l lll l ll ll l lll ll ll lll ll ll ll ll l lll llllll ll l lll ll l ll l l l llll l llll lll lll l ll lll llll ll lll lll ll llll ll l l ll l lll ll ll l l ll ll ll ll ll l l lll ll ll l lll lll l ll l ll lll lll ll l lll l ll lll llll llll lll l ll lll lll l ll l l l ll l ll l ll ll ll llll l ll lll ll l ll l ll ll ll ll ll ll l ll lll ll l lll l ll ll l lll lll l lll ll l ll ll ll ll l ll lll l ll ll ll llll ll lll l l ll ll llll lll lll ll l ll l l lll lll ll ll ll lll l lll lll ll ll ll lll lll ll ll lll l lll l ll lll l lll ll ll lll ll ll ll l ll ll ll ll ll lll llll l lll lll lll l lll llll ll l ll lll ll l llll l ll ll ll ll ll l ll l lll l l ll l ll ll l lll lll ll ll ll ll ll l lll l ll llllll ll lll l ll llll lll ll lll ll ll ll ll l ll l ll l llll l llll l l ll l lll l ll llll l ll lll ll ll ll ll ll l l ll ll lll l l lll lll l l l ll l lllll l l ll ll l ll lll ll ll l lll ll l l lll lll ll lll ll ll ll l llll ll ll l lll ll ll lll llll lll ll ll lll lll l lll l ll l ll ll lll l lll lll ll lll ll ll l lll lllll llll ll ll l l lllll ll l lllll lllll ll lll l ll l ll lll ll lll l l ll ll l ll l llll l ll l l llll ll l lll lll ll ll ll ll l ll l lll ll lll lll ll l l ll ll l l l lll l l ll ll l l l ll lll ll ll l lll l l l ll ll lll l l lll l ll ll ll ll l lll ll ll l l ll ll l ll lll l lll l llll l l ll lll lll l l ll ll ll lll ll l llll l lll l ll ll lll lllll l lll lll llll lll llll llll lll lll ll ll l lll ll lll l lll l ll l ll lll ll ll ll ll ll lllll l lll ll llll l l ll ll ll l ll ll ll ll ll ll lll lll l llll l ll ll ll ll llll ll l ll ll lll lll l lll lll ll l l lll ll ll lll ll l ll ll l ll ll lll lll l ll ll ll lll ll ll ll l lllll llll l ll l ll l l ll ll ll l l ll lll llll l ll lll lll l l l ll llll lll llll l lll ll l ll l ll l lll l ll ll lll l ll ll lll lll l lll ll ll l l ll ll ll lll ll l l ll ll llll ll ll ll lll ll ll l ll ll ll l llll ll lll 
l ll ll ll ll lll lll l l lll l ll l lll lll lllll l ll l ll ll l ll ll ll l l ll llll lll l lll l lll lll lll llll l ll lll l lll ll l ll ll l l llll lll lll ll ll llll ll ll l lll l lll l llll l ll l l ll l ll ll l ll l ll lllllllll l lll llll llll l llll l ll l l lll l lll l ll ll llll l lll ll lll l lll l l ll lll ll lll ll lll l lll lll ll l l ll l l lll ll ll l l ll ll lll lll lll ll ll ll lll ll l l lllll lll l lll ll l lll l ll ll l l lll ll ll ll l ll l llll ll lll l l ll l ll ll ll l l lll ll lll l lll l ll lll l l l lll llll ll lll l l lll ll lll l llll ll ll l lll l ll ll l l l ll l ll ll ll ll ll l l ll llll l ll ll ll l ll l ll ll lllll lll ll lll l lll l ll ll l ll ll ll ll lll l ll ll ll l lll ll lll lll l ll ll ll l l lll lll ll l lllll ll lll lll ll ll lll lll llll ll l ll l ll ll ll llll lll l l ll llll llll l ll ll lll l ll l ll ll ll ll lll ll l l ll ll ll ll ll l lll llll lll l llll l lll l lll l ll ll l l ll ll ll ll ll ll lll l ll l lll l l l ll l ll l ll lll llll l ll lll ll ll lll lll ll ll ll l llll ll lll lll l ll ll lll l lll lll ll ll ll lll ll lll l lllll l lll ll ll lll l llll llll ll l l l ll lll lll l l l lll l ll ll lll ll ll ll ll l ll ll l lll ll l lll ll l ll ll ll ll lll ll l ll llll ll l lll lll l ll ll l ll l l l ll lll ll ll ll l l ll ll ll ll ll ll l ll ll llll l ll lll lll ll l ll ll l l llll ll lllll ll lll l lll lll l ll l ll ll ll llll llll lll lll llll llll lllll lll ll ll ll l lll ll lllll ll llll lll ll ll l ll ll ll ll ll ll l ll l ll l l lllll l l ll l ll lll lll lll lll l ll ll l ll ll ll lll lll ll llll lll l lll l ll lll ll ll lll l ll l ll l llllll ll l ll l ll l llll l lllll l ll l llllll ll ll l ll l ll l ll lll l l ll l ll ll ll lll ll ll lll ll lll ll lll ll l ll lll ll ll l ll ll l l lll ll ll l ll lll l l lll ll l l l l ll l l ll llll ll llll ll l l llll ll l lll ll ll lll lll ll l ll l ll l ll l l ll lll l lll ll ll ll l lll l ll ll ll ll l ll l ll l ll l ll ll l ll ll l l ll l l lll l ll lll ll l llll 
ll l ll ll ll ll llll lll l ll l ll ll lll llll l llll l ll lll l l ll ll llll ll l ll l ll llll l ll ll l ll l lll l lll l ll ll ll ll l lll l lll ll ll ll ll ll ll ll ll l l ll lll ll l ll llll l l lll l ll l ll l lll lll l l ll lll l ll l ll ll ll l llllll l ll llll lll lll ll ll ll l ll l lll l ll ll ll ll lll l ll ll l lll lll lll l ll ll l ll ll l l l ll ll l l lll l ll lll lll l l llll ll l ll lll ll l lll l ll l llll ll ll ll l ll l lll l lll lll ll l ll lll l l l lll l lll l ll ll ll ll l l ll l l llll ll ll lll l ll ll llll ll l ll ll ll ll l ll ll lll l ll l lll llll l ll lll l ll l lll llllll ll l lll ll lll l lll ll l ll l llll l ll ll ll ll ll l llll lll l ll l l ll ll l ll lll lll l ll ll ll l ll ll ll l ll ll ll llll ll ll l lll ll ll llll l ll ll l ll l ll ll ll l l l llll ll l ll ll l ll lll ll ll ll lll ll llll ll lll ll ll ll l llll ll lll lll lll ll l lll ll lll l l llll lll lll l lll l ll ll ll ll ll ll lllll lll ll ll ll lll ll l ll ll ll l ll lll l ll ll l ll l llll l lll lll lll ll l l lllllllll lll ll l lllllll l l l l lll lll llll ll ll l ll ll ll lll ll l ll lll ll l ll l l lll l lllll l ll l ll l lll ll lll lll ll lll l l llll lll ll l l ll l ll l ll l llll l llll l ll l l ll l l ll l l lll lll l l ll ll lll l ll l l lll ll ll lll l l l l ll l l ll l ll l lll l llll l ll l ll l ll ll lllll ll ll llll ll l l ll l ll lll l lll l ll l ll l llll llll l ll ll ll l lll llll ll l llll ll ll l ll l l lll l ll ll l l l l ll l ll llll l ll lll ll l lll l ll l l ll ll lll l ll ll ll ll ll ll lll lll llll ll l ll l l ll l l ll ll ll lllll lll ll lll llll ll ll ll llll l l lll ll l l llll lll l ll l ll ll l ll ll lll lll l ll l l llll l lllll ll ll l ll l l ll l . . . . . . 
Survival Marshall−Olkin copula U V ll l l l lll ll llll ll l ll lll ll ll llll llll l ll l lll l l ll ll lll ll ll lll ll l lll lll l l l ll l ll l ll lll ll lll ll l ll ll llll lll l llll l ll lll ll ll lll l l lll llll llll l ll ll ll l ll lll l lll ll lll ll ll l l ll l l l ll lll lll lll lll llll lll l lll lll ll l l ll l ll ll ll l lll ll ll l l ll ll lll lll l ll lll l ll ll ll ll l l ll ll ll llll l ll ll l ll ll l lll ll l l llll l l lll lll ll lll lll llll llll l lll lll ll ll l ll ll ll ll lllll ll l ll ll ll ll llll ll ll ll ll ll lll ll l ll ll l ll ll ll ll ll l lll llll ll ll ll llll ll ll l l ll ll l ll ll l ll l l ll lll lll l ll llll l lll l ll lll ll ll l l llll l l ll l ll llll l l lll ll l ll llll lll ll lll ll l lll l ll llll lll lll l ll ll lll ll l lll lll ll lll ll l ll l ll l ll ll ll l l lll ll ll llll lll lll l ll ll ll l ll lll ll ll l lll ll ll llll lll ll l l lll lll l ll l lll lll ll ll l ll lll llll lll ll ll ll l l l ll ll ll lll l l ll ll l ll ll l llll l lll l ll llll l lll lll llll l llllll ll ll l lll l lll l ll l ll ll ll l l lllll l ll llll ll llll ll lll ll ll ll ll l ll lll lll l ll lll ll l l ll llll ll l ll l ll ll l lll ll lll ll lll l ll l ll ll ll l ll l ll ll lll l ll l llll l ll ll lll lll lll lll ll lll ll lll l ll ll lll ll ll l ll ll ll l l lll ll l l ll l ll lll l l ll l lll ll l llll l lllll l lll ll ll ll ll l ll ll lll l ll l lll lll ll l l ll l ll l l ll ll lll ll l lll ll l lll lll lll lll l ll ll lll ll lll ll lll ll ll llll l ll ll ll llll l llll lll l ll lll ll ll lll l ll ll l ll lllll ll ll ll lll ll ll lll ll ll lll ll l lll ll l ll ll lll l llll l llllll llll ll ll ll l ll l ll ll lll l ll ll ll l llll ll ll l l ll l l l ll llll l lll l ll l l ll llll l lll ll ll l lll lll ll ll l lll lllll l ll l lll ll ll l l lll ll lll ll ll ll ll ll l lll lll ll ll l lll lll llll llll l ll l l ll llll lll ll lll ll l ll ll ll ll l l ll lll ll ll lllll ll l lll lll llll ll l l llll l l ll l l l ll l ll l lll ll ll 
lll l llll l lll ll l ll l ll lll l l l lll l llll ll ll ll l ll lll l l lll ll lll llll ll l lll ll l lll l l lll l llll lll lll l ll l l ll l ll llll ll ll l ll ll l lll ll ll l ll ll ll ll l lll lll l l lll ll lll l ll ll lll l ll ll ll l lll l ll lll ll l l ll l l l ll ll l ll lll lll lllll lll lllll lll l llll l ll ll lll llll ll ll ll ll ll ll lll ll ll l lll l llll lll l l l ll lll l lll lll ll lll l lllll lll ll l ll ll ll l l ll ll l ll l l ll lll l l llll l l l lll ll l ll ll l ll lll lllll l l lll ll lll l llll ll l ll ll ll lll llll l lll l llll l ll lll ll l lll l llll lll l ll llll ll ll l l ll lll l lll ll ll lll l ll lll l ll ll l llll llll l l l lll ll l lll l lll ll ll l ll ll l l ll lll l lll lll l ll l lll ll lll l ll ll l lll ll ll ll lll l l ll ll lll l ll lll lll lll l l ll lll ll ll l lll ll ll ll ll ll ll ll l l ll llll ll ll lll ll lll l lll l l ll ll l lll ll lll lll ll ll lllllll l l ll ll ll lll ll lll lll ll l l llllll ll l lllll llll llll ll ll l ll lll l l ll lll l ll lll l l ll ll ll l l l l lll l lll lll ll ll l ll lll lll ll lllll llll lll lll l ll llll ll l ll l l llll ll llll l l lll l l ll lllll lll ll l lll ll ll l ll lll l l l ll l ll lll l l ll lll llll l lll ll ll ll l ll l llll l l ll l llll l lll ll ll lll l llll ll ll lll l ll lll ll ll ll ll l llll l ll ll l lll llll l l lll lll ll l lll l ll ll ll ll ll ll llll l lll l lll ll llll ll ll ll ll l llll l lll ll ll l l ll ll ll l ll ll l llll l llll ll l ll ll ll l ll l ll lll l lll l ll lll l lll llll l l llllll ll ll ll ll l lll ll ll ll ll l lll ll ll lll l ll l l lll l ll ll ll l ll ll l l llll ll l l ll l l lll ll l ll llll ll l l ll ll l lll l ll l l llll ll l l ll lllll l l llll lll ll ll ll l ll ll ll lll lllll ll ll ll ll ll lll lll l ll llll l ll ll l ll l ll l ll l lllll ll ll llll llll ll ll l lllll ll l lll l lll l ll ll ll ll ll lll l ll llll lll ll l ll lll ll lllll lll l l l ll l ll lll l ll lll ll lll l l ll ll ll llllll l l l lll lll l ll ll ll lll ll l 
ll lll l l ll lll l lll l ll l llll ll ll ll ll lll l ll l lll lll ll l l ll ll ll l ll lllll ll llll ll lll l ll lll l l lll l ll l llll ll llll llll l ll lll ll ll llll ll ll lll l ll lll ll ll ll ll lll l ll lll lll llll ll ll lll lll ll l lll ll l l ll ll llll lllll ll l ll l l lll ll ll ll llll lll llll ll lll ll lll l l l ll ll l ll llll lll l lll l lll l lll l lll l ll l ll ll l l ll l l ll l l lll l lll l l lll ll l l ll ll l ll l ll ll ll lll lll llll l ll ll l lll ll lll ll ll l ll ll l l llll l ll l ll lll ll ll l ll l l l lll ll llll l l llll l ll ll ll ll lll lll ll ll ll l l ll ll l ll llll lll lll l ll lll lll ll l llll ll lll lll lllll l l ll ll ll lll l lll l lll ll lll l lll ll ll ll ll ll ll ll ll ll ll ll lll lll ll l lll llll ll ll ll l l lll ll lll ll llll l l ll lll ll lll llllll l l lll ll l ll l l ll lll lll l l ll lll ll ll l ll ll ll ll l llll lllll ll l ll lll lll ll ll l lll l l l ll ll ll lll l ll lll l l ll ll lll ll ll l ll l ll lll l ll ll ll lll l lll ll lll ll lllll l ll ll ll llll ll lll l lll lll ll lll l l lll l lll l ll ll ll ll l lll l llll l lll ll l lll ll l l ll ll lllll l l ll lll l ll ll l lllll l ll l ll lllll ll l ll l l ll l ll lll l ll lll l ll ll ll l ll llll l ll ll ll ll ll ll l ll ll l ll l ll llll lll l ll ll l ll l ll ll ll ll ll llll lll l lll ll ll lll l lll l ll l lll lll l lll l ll lll lll ll ll ll l ll ll lllll lll lll l lll ll ll lll ll ll llll lll l l lll llll ll llll l l ll ll lll l lll l ll lll lll ll l lll l ll l lll ll lll l l ll l ll ll ll l lll ll ll l l llll lll l ll lll l ll ll lll l llll l ll ll ll llll l ll l llll ll ll ll lll ll lll l ll l ll ll lll l lll l ll lll l ll ll ll l lll l ll ll ll l ll ll llll l ll lll l ll llll ll ll ll l ll llll l l ll l lll ll l llll l l lll l lll ll l ll ll lll l l l ll ll lll l ll lll ll l ll l l ll ll llll l l ll ll l l llllll l lllll ll l lll ll ll ll ll lll ll lll ll ll ll l l l ll l l ll llll ll lll l ll ll lll lll l lll lll l ll ll lll ll ll ll l ll ll ll 
ll lll lll ll lll lll l l lll llll ll l l ll l ll l ll l ll ll ll lllll l l ll l ll lll ll l ll ll l lll l llll ll l l ll lll ll ll ll l ll ll llll llll ll l lll l l ll l ll l lll ll ll l ll llllll l l ll ll lll l ll llll ll l ll ll ll l ll ll l llll ll ll l lll lll l l ll l ll lll ll l l lll ll ll l ll ll ll ll ll ll lll lll lll ll ll l llll lll l ll llll ll l ll ll llll l l ll l lll llll l l lll ll l lll lll l ll ll l l ll llll ll ll lll l l ll ll l llll l lll llll l ll l l llll lll ll l lll lll lll ll ll ll lll ll llll ll lll l l l l lll l lll ll llll l ll l l ll l ll ll ll llll llll l ll ll lll lll ll l lll ll ll lll ll lllll l ll ll lll ll l ll l lll lll llll l lll lll ll l l ll l ll l llll l l ll ll ll l ll lll ll ll ll lll l llll l l lll l ll l ll l ll ll ll l ll lllll l lll lll lll ll ll ll ll ll l ll l ll lll l lll llll l l lll ll l lll ll l ll lll ll ll llll ll lll l ll ll ll ll ll ll ll ll l ll ll l lll lll l ll lll l ll l ll llll l lll lll llll l ll ll l ll ll lllll lll lll ll l lllll ll lll l lll ll ll llll l ll ll lll lll ll ll lll lll l l llll ll llll ll lll l lll l llll l ll l ll l l ll ll ll lll l ll l ll ll l ll ll ll ll lll ll ll llll ll ll ll ll lll ll l ll l ll l ll lll lll l ll ll ll ll lll ll l ll l llll ll lll l llll llll ll lll l l l lll ll l ll llllll l lll l l lll ll ll l lll llll l ll ll ll ll ll l l lll lll l lll ll l ll l ll lll l ll ll l ll ll ll l lll lll l l ll l ll ll l llll ll lll l ll l ll l ll l ll l ll l ll ll l lll ll l lll ll ll ll ll l ll l lll lll l ll ll l ll lll ll ll ll l ll ll lll ll lll l lll lll l lllll l ll ll ll l lll lll lll ll l lll ll l ll l ll lll ll ll ll ll l ll l lll lll lll ll l ll llll llll l lll ll l l l ll ll ll l lll l l ll ll ll l l l ll l ll lll l lll llll l ll l l lll lll ll l l lll lll llll l ll lll l llll l l ll lll lll lll ll l lllll l ll l lll lll llll ll ll ll ll lll ll ll ll l l l ll ll lll ll l l lll ll l ll ll ll ll ll ll lll ll l ll ll l lllll ll llll l ll lll ll l llll l l lll ll l lll ll ll 
lll ll ll l l llll l ll l ll ll l ll llll l lll lll ll lll ll ll llll l l ll l ll ll l lll ll ll lll l ll ll l ll ll l lll ll ll l l lll l ll ll l lll lll ll llll ll ll l llll l ll l ll lll llll ll llll lll l ll l ll lll ll l l ll l ll llll lll ll ll l llll ll ll l ll l ll ll llll ll l ll ll l ll ll l lll lll llll ll ll ll ll ll lll llll l ll ll l lll l ll l ll ll lllll l ll l lll l l l ll l llll ll ll lllll l lll lll llll l ll ll lll ll l ll llll llll l ll l ll l l lll ll lll lll l lll ll l llll lll l llll ll ll ll lll ll lll ll lll ll l l ll l lll l l llllllll ll lll ll l lll l ll lll l ll lll ll lll lll ll lll l l ll ll l ll lll lllll l l lll ll l lll l lllll ll ll ll ll ll ll lll l l llll l l lll lll lll lll ll l lll l l llll l l lll llll lll llll ll l ll ll ll ll lll ll ll ll lll l llll l ll ll lll l lll l ll lll ll ll l ll l ll l ll llll ll ll l l lll ll llll lll ll l ll ll ll lll l ll ll l ll l l l ll l ll ll lll ll ll lll l ll l l lll ll l lll ll l llll l ll ll ll l lll ll ll l ll l ll ll ll ll lll l lll ll l l l lll ll ll ll ll ll l lll l l ll ll lll ll l ll lll l ll llll l ll llll ll lll l l lllll lll lll ll l llll ll l ll l lll ll lll lllll ll l l ll lll l llll lll lll lll lll ll ll ll ll lll l lll ll ll ll llll ll l ll lll l ll ll l ll ll lll l ll lll ll l l lll llll lll ll l lll ll lll ll ll ll ll lll l ll ll ll l l ll l ll ll l l lll llll ll ll l l ll ll ll l ll llll lll l llll ll ll llll l ll ll ll ll lllll l ll lllll l ll ll lll ll l lll l l ll ll lll l ll l l ll ll l l ll ll ll ll l ll ll llll l lll l lll lll ll ll ll l lll ll ll l lll l ll l lll ll ll l ll l ll l l ll l llll l ll ll lll ll ll ll l ll ll l l ll ll ll lll lll ll l llll ll l llll lll l ll ll ll ll lll lllll l ll ll ll l ll lll ll l l ll llll l l l llll ll lll ll ll lll l ll ll l lll ll l l ll ll lll lll lll ll ll ll ll l l ll l l ll llll llll lll l ll l ll l l ll l lll ll ll lllll l l lll l ll lll lll l lll ll lll lll l ll lll ll ll ll ll l lll l l ll lll lll ll lll lll lll ll l 
llllll lll ll ll ll ll l ll lll lll lll ll ll ll l ll lll lllll ll ll lll ll ll llllll ll ll lll ll l ll l l ll ll lllll l llll lll lll lll l l ll ll ll l llllll lll lll llll ll l ll ll ll ll ll llll ll ll lll l lll l ll lll l llll l ll l lll ll ll lll l l lll ll ll ll lll lll ll l ll l ll l ll ll lll l l lllll ll ll lll lllll l lllll ll lllll l ll l lll l ll l lll ll lll ll ll lll ll lllll l llll l ll llll l llll ll ll l lll ll l l lll ll lll ll l ll ll l ll lll l llllll ll lll l l ll l l lll lll ll l lll l ll l l l ll l ll ll l lllll ll l ll ll ll ll ll l ll ll lll ll llll ll l ll ll lll lll l l lll ll lllll lll lll ll lll ll l llll ll lllll lll ll ll l llll l lllll lll l ll lll lll llll ll llll lll ll l l ll l ll lll lll l ll l llll ll l ll lll ll l ll l l l ll l l lll lll l lll ll l ll ll lll lll ll lll ll l lll l lll lll l l llll ll ll lll l ll ll l l l ll l lll ll lll l l ll lll ll ll l l lll l ll lll ll ll l ll ll lll ll l llll ll ll ll llll l lll ll ll lll l lll ll l l lll l l l ll llll ll l l lll l llll l ll ll lll l ll lll ll ll lll ll l lll ll ll l l ll l ll l ll lll ll ll lll l ll l ll lll ll ll ll l lll l lll ll ll ll ll l l ll lll l llll ll ll ll ll l ll ll ll l lll ll lll lll l l llll l l lll ll ll l ll lll l ll ll l l lll ll lll l ll ll llll ll ll lll ll lll llll ll lll llllll ll l ll lllll l lll lll l llll l lll l lll ll lll ll ll ll lll l lllllll ll lll ll ll l ll ll l ll lll ll ll ll l l lll ll l llll ll ll ll lll lll l ll ll ll ll l lll l ll ll ll l ll l lll lll ll ll l l lll lll ll lll l llll ll lll llllll l llll ll l l lll lll l ll llll l ll l lll l l ll lll l llll llll lll l ll lll ll ll l lll ll ll llll l lll lll l ll l ll lll lll ll lll ll l lll l l llll ll l l lll l llll l lll l l ll lll ll lll llll ll ll lll l l lll l l lll l lll l lll l llll lll ll ll ll ll llll l ll ll l lll lll l lll ll ll ll ll ll ll llll l lll l ll lll ll l ll ll ll ll ll ll ll lll ll lll llll lll l lll lll ll l lll ll ll ll ll llll l ll l llll llll ll lll lll ll l 
ll ll ll ll lll l lllll l l ll l ll l ll l ll l lll l l ll llll lll lll ll l l lll llll ll ll llll ll l llll lll l lll ll l ll ll lll l ll ll ll ll l lll ll lll ll ll ll lll l lll lll ll ll lll lll ll lll ll llll lll lll llll ll ll ll ll l ll lll ll llllll lll ll l ll lll ll ll ll ll ll ll l ll ll ll ll ll l ll lll l ll l lll ll ll l llll ll ll lll ll lll l lll ll llll ll lllll lll l ll lll ll ll lll l ll ll lll ll lll l l ll l lllll ll ll ll l ll ll l l ll l lll ll lll ll l ll ll lll l l lll ll llll lll l ll lll ll l ll lll l l lll l ll ll ll l ll l l ll ll l ll lll ll lll l lll llll ll l lll ll ll ll ll l ll l lll lll lll l l ll l l lll ll l lllll l lllll l lll lllll ll lll lll llll l lll lll l l l lll l l lll l l ll lll lll l ll l lll l l lll l lll l ll lllll ll l ll ll ll l ll lll ll ll ll ll ll l ll llll ll l l ll l llll l lll l llll lll lll l lll ll ll ll lll l lll lll lll l lll lll l ll llll l l l ll l ll lll lll ll l ll ll l ll lll l lll llll ll lllll l l ll ll ll lll ll lllll ll l llll l ll ll ll ll ll ll ll ll lll llll ll l llll l llll lll lll ll llll lll ll llll ll ll ll ll l ll lll lll l ll ll lllll ll ll l ll ll ll l lll lll l ll ll l ll lllll ll lll l lll l llll ll ll l llll l ll l ll l l ll lll l lll ll ll ll ll ll ll ll ll ll ll l l llll ll lll ll ll ll l ll lll ll lll ll ll ll ll l lll l ll l ll ll ll lll ll l l l lll l ll ll l ll ll ll ll lll l ll ll l l ll llll lll l lll l l ll l ll l l l l ll ll lll ll l ll ll ll ll lll l ll ll l l l ll ll l ll lll ll lll llll lll l l lll ll ll ll lll ll l lllll ll l lll ll ll ll ll ll l l llll l ll l ll lll llll l ll ll ll l l l l ll ll llllll l lll l lll l lll l ll l lll ll l l l ll llll ll ll lll l l ll l ll llllll lll lll ll llllll ll ll ll l lll ll ll l ll l llll ll lll l lll ll ll ll ll l lll l lll ll ll ll ll ll l lll l l lll ll ll ll lll lll ll l l ll ll l ll llll l ll ll ll lll lll l l llllll l ll lllll l lll ll llllll ll ll l llll l lll ll ll ll lllll l ll l l l ll l ll lll lll ll ll lll l ll l ll ll 
Figure 4: Scatter plots of the survival Marshall-Olkin copula Ĉ^MO_{a,b} and the skew t copula C^ST_{ν,δ₁,δ₂,γ} used in the simulation study; the axes show the pseudo-sample coordinates U and V. The red lines indicate the optimal lines attaining the maximal tail dependence coefficient.

We estimate the TDC λ, tail Gini's gamma λ_G, tail Spearman's rho λ_S, the maximal tail dependence coefficient χ, the maximal intermediate tail dependence coefficient χ⋆ and the maximal tail dependence measure λ̄ with the estimators provided in Appendix B. Note that the true values of the tail dependence measures for survival Marshall-Olkin copulas are available in Example 4.17.

For the bootstrap analysis, we repeat the whole procedure above B = 100 times based on samples drawn with replacement. We report the estimated TDFs, normalized TDFs and tail dependence measures with 95% bootstrap confidence intervals in Figure 6, Figure 7 and Table 1, respectively.

From Figures 6 and 7, one can observe that the TDF can be estimated more stably than the normalized TDF. In particular, normalized TDFs for small t around 0 are exposed to large variance since few samples contribute to their estimation. Due to this high variability, the decreasingness of the normalized TDFs can be violated in the estimates, although it holds theoretically. This issue becomes critical when estimating the MTDM since its estimator is constructed from normalized TDFs for t sufficiently close to 0. Therefore, too small a t leads to an unstable estimator of λ̄, although, at least theoretically, the normalized TDFs converge to the MTDM as t goes to 0. Taking this trade-off into account, we maximized the TDF over a finite grid of b values to estimate χ and χ⋆, and maximized the normalized TDFs over a finite grid of t values bounded away from 0 to estimate λ̄. From Table 1, one can observe a large gap between the TDC and the MTDM, which evaluate tail dependence at the angles along which the normalized TDF becomes minimal and maximal, respectively. The confidence intervals of the maximal-type tail dependence measures, in particular χ⋆ and λ̄, are typically wider than those of the other measures λ, λ_G and λ_S.
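To make the estimated objects concrete, the following minimal sketch computes the empirical (lower) tail dependence function and its normalized version from pseudo-samples. It assumes the standard empirical tail copula estimator Λ̂_n(x, y) = (n/k) C_n(kx/n, ky/n), where C_n is the empirical copula; the paper's own estimators in Appendix B are not reproduced here, and the function names are illustrative.

```python
import numpy as np

def empirical_tdf(u, v, x, y, k):
    """Empirical (lower) tail dependence function
    Lambda_hat_n(x, y) = (n/k) * C_n(k*x/n, k*y/n),
    where (u, v) are pseudo-samples in (0, 1) and k is the number
    of observations treated as lying in the tail."""
    u, v = np.asarray(u), np.asarray(v)
    n = len(u)
    return np.mean((u <= k * x / n) & (v <= k * y / n)) * n / k

def normalized_tdf(u, v, t, k):
    """Normalized TDFs Lambda_hat_n(t, 1)/t and Lambda_hat_n(1, t)/t,
    whose limits as t -> 0 are discussed in the text."""
    return (empirical_tdf(u, v, t, 1.0, k) / t,
            empirical_tdf(u, v, 1.0, t, k) / t)
```

For comonotone pseudo-samples (u = v), Λ̂_n(1, 1) equals 1, while for countermonotone ones it vanishes, matching the behavior of the TDC at the diagonal angle.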
In light of these observations, the use of non-extremal µ-TDMs, such as λ_G and λ_S, is supported for comparing the degree of tail dependence.

To investigate the financial relationships among countries in a stressed economy, we compare the tail dependence measures of the returns of the stock indices DJ, NASDAQ, FTSE, HSI and NIKKEI.

Figure 5: Plots of the estimated tail dependence coefficients against k for the survival Marshall-Olkin copula Ĉ^MO_{a,b} and the skew t copula C^ST_{ν,δ₁,δ₂,γ} (n = 5626).

A long period is taken to ensure a sufficient sample size, and thus the resulting measures can be interpreted as quantifying the degree of average tail dependence over different stress events. In the analysis, we particularly focus on the relationships of DJ to the other indices in order to compare the international relationships of the above four countries to the US. To this end, we first filter the marginal return series by a GARCH(1,1) model with skew t white noise. The residuals are then transformed by rank to obtain pseudo-samples from the underlying copula.

Based on the pseudo-samples, we conduct the plateau-finding algorithm to find a suitable k. The plots of the TDCs against various k are provided in Figure 8. From the plots we choose k_{DJ,NASDAQ} = 250, k_{DJ,FTSE} = 250, k_{DJ,HSI} = 350 and k_{DJ,NIKKEI} = 300. With the selected k, we conduct the same analysis as in the simulation study in Section 5.1. The results are summarized in Figure 9, Figure 10 and Table 1.

From Figures 9 and 10, estimates of the normalized TDFs, in particular those at small t, are more volatile than those of the TDFs. From Table 1, one can observe that the proposed tail dependence measures capture the strong tail dependence of (DJ, NASDAQ) and of (DJ, FTSE), and also quantify the weak tail dependence of (DJ, HSI) and of (DJ, NIKKEI). These differences in the strength of tail dependence are most vividly exhibited by the MTDM compared with the other tail dependence measures. Although the sample size in the real-data analysis is much smaller than that of the simulation study in Section 5.1, the bootstrap confidence intervals of λ, λ_G, λ_S and χ are not extremely wider than those in the simulation study. On the other hand, the confidence intervals of χ⋆ and λ̄ seem to be more affected by the smaller sample size. Therefore, estimates of the maximal-type tail dependence measures may be more sensitive to sample size due to the variability of the normalized TDFs for small t.
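The rank transform to pseudo-observations and the plateau-style scan of the empirical TDC over k described above can be sketched as follows. The GARCH(1,1) filtering step requires a dedicated package and is omitted; `pseudo_observations` and `tdc_vs_k` are illustrative names, not the paper's code, and ties in the data are assumed absent.

```python
import numpy as np

def pseudo_observations(x):
    """Rank-transform a univariate sample to pseudo-observations
    U_i = R_i / (n + 1), where R_i is the rank of x_i among x_1..x_n."""
    x = np.asarray(x)
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1  # 1-based ranks (no ties assumed)
    return ranks / (n + 1)

def tdc_vs_k(u, v, ks):
    """Empirical lower TDC C_n(k/n, k/n) / (k/n) for a range of
    threshold ranks k; a plateau of stable values over neighboring
    k suggests a suitable choice."""
    n = len(u)
    return np.array([np.mean((u <= k / n) & (v <= k / n)) * n / k
                     for k in ks])
```

Plotting `tdc_vs_k` against k and choosing k inside the first stable plateau is the heuristic behind the selected values k_{DJ,NASDAQ} = 250, etc.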
To avoid unstable estimation of maximal-type tail dependence measures, it is important to plot the normalized TDFs and heuristically find a suitable small t such that the normalized TDFs are stably estimated. For ease of stable estimation, non-extremal µ-TDMs, such as λ_G and λ_S, are again recommended for comparing the degree of tail dependence.

Figure 6: Plots of the estimated tail dependence functions Λ̂_n(b, 1) and Λ̂_n(1, b) for (a), (b) the survival Marshall-Olkin copula and (c), (d) the skew t copula, based on the simulated samples. The black solid lines indicate bootstrap means of the estimated TDFs and the dotted lines their 95% bootstrap confidence intervals. The bootstrap sample size is set to B = 100. The colored horizontal lines represent the maximal tail dependence coefficient (Maximal TDC, red) and the tail dependence coefficient (TDC, blue).

Based on a representation of linear measures of concordance, we constructed a new class of tail dependence measures called µ-tail dependence measures (µ-TDMs) for a probability measure µ ∈ M on the unit square. The proposed measure is determined by the tail dependence function (TDF) of the underlying copula, and includes the tail dependence coefficient (TDC) and tail Spearman's rho as special cases. Various subclasses of µ-TDMs, such as the generalized TDC and the generalized tail Gini's gamma, were also provided. We then investigated axiomatic properties of µ-TDMs. In particular, an intuitive interpretation of the monotonicity axiom of µ-TDMs in terms of marginal tail probabilities was derived from the concordance order among TDFs. We also showed that the TDC is the minimal µ-TDM over all measures µ ∈ M, and thus the TDC always underestimates the degree of tail dependence.
Useful representations of µ-TDMs were also derived, which extract the essential information of the TDF evaluated only at angles. Maximal-type tail dependence measures were then studied. We proved that the maximal tail dependence measure (MTDM), the maximum of µ-TDMs over all measures µ ∈ M, is the limit of the normalized TDF. We also exhibited relationships of the MTDM with the path-based tail dependence coefficients considered in Furman et al. (2015), and showed that the MTDM arises as the path-based maximum of the normalized tail dependence coefficient. We found that the path-based maximum of the tail dependence coefficient (the maximal tail dependence coefficient, MTDC) can be computed by maximizing the TDF over the derivatives of the paths at the origin. Properties and examples of the maximal-type tail dependence measures were also provided. The examples for various parametric copulas showed that the MTDM may overestimate, and the MTDC may still underestimate, the degree of tail dependence.

Figure 7: Plots of the estimated normalized tail dependence functions Λ̂_n(t, 1)/t and Λ̂_n(1, t)/t for (a), (b) the survival Marshall-Olkin copula and (c), (d) the skew t copula, based on the simulated samples. The black solid lines indicate bootstrap means of the estimated normalized TDFs and the dotted lines their 95% bootstrap confidence intervals. The bootstrap sample size is set to B = 100. The colored horizontal lines represent the maximal tail dependence measure (MTDM, red), the maximal intermediate tail dependence coefficient (MITDC, blue), tail Spearman's rho (Tail Spearman, orange), tail Gini's gamma (Tail Gini, green) and the tail dependence coefficient (TDC, yellow).
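The bootstrap confidence intervals shown in Figures 6 and 7 and in Table 1 can be reproduced in outline as below. This is a sketch assuming the percentile method, with resampled pairs re-ranked before each evaluation; the authors' exact resampling scheme is not detailed here, and tie-breaking in the re-ranking of resampled (hence duplicated) values is arbitrary.

```python
import numpy as np

def percentile_bootstrap_ci(x, y, stat, B=100, level=0.95, seed=42):
    """Percentile bootstrap CI for a copula-based statistic.
    stat maps pseudo-observations (u, v) -> float.  Pairs (x_i, y_i)
    are resampled with replacement and re-ranked before each call."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)

    def ranks01(z):
        # ranks scaled to (0, 1); ties from resampling broken arbitrarily
        return (np.argsort(np.argsort(z)) + 1) / (n + 1)

    reps = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)
        reps.append(stat(ranks01(x[idx]), ranks01(y[idx])))
    alpha = 1 - level
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

With B = 100, as in the text, the resulting intervals for maximal-type measures can be noticeably wider than for the weighted µ-TDMs, mirroring Table 1.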
Finally, performance ofthe proposed tail dependence measures was demonstrated in simulation and empirical studies. Through thenumerical experiments, we revealed that stable estimation of MTDM is challenging in particular when samplesize is small. Together with the issues of over- and underestimations of tail dependence, we recommend theuse of µ -TDMs with µ supported on all the angles for comparing degree of tail dependence. More detailed3 λ λ G λ S χ χ ? λ (1) Survival Marshal-Olkin copulaEstimate 0.355 0.396 0.454 0.518 0.759 0.83195% CI (0.311, 0.395) (0.344, 0.446) (0.397, 0.507) (0.469, 0.563) (0.691, 0.817) (0.725, 0.919)(2) Skew t copulaEstimate 0.390 0.445 0.500 0.473 0.865 0.99995% CI (0.317, 0.467) (0.356, 0.522) (0.405, 0.586) (0.400, 0.545) (0.495, 1.000) (0.987, 1.000)(3) Stock returns of DJ and NASDAQEstimate 0.449 0.469 0.531 0.488 0.717 0.98095% CI (0.400, 0.496) (0.415, 0.526) (0.471, 0.593) (0.440, 0.529) (0.496, 0.922) (0.913, 1.000)(4) Stock returns of DJ and FTSEEstimate 0.363 0.424 0.478 0.412 0.585 0.81895% CI (0.292, 0.438) (0.346, 0.492) (0.396, 0.553) (0.343, 0.489) (0.435, 0.724) (0.682, 0.955)(5) Stock returns of DJ and HSIEstimate 0.186 0.198 0.225 0.200 0.282 0.44695% CI (0.144, 0.224) (0.149, 0.243) (0.171, 0.275) (0.151, 0.243) (0.177, 0.552) (0.321, 0.562)(6) Stock returns of DJ and NIKKEIEstimate 0.154 0.160 0.180 0.167 0.286 0.39395% CI (0.118, 0.192) (0.116, 0.209) (0.132, 0.236) (0.130, 0.213) (0.150, 0.742) (0.282, 0.522) Table 1: Bootstrap estimates and 95% confidence intervals (CI) of the tail dependence coefficient ( λ ), tailGini’s gamma ( λ G ), tail Spearman’s rho ( λ S ), maximal tail dependence coefficient ( χ ), maximal intermediatetail dependence coefficient ( χ ? ) and maximal tail dependence measure ( λ ) of the survival Marshall Olkincopula, skew t copula, and the copulas of the filtered returns of (DJ, NASDAQ), (DJ, FTSE), (DJ, HSI) and(DJ, NIKKEI). 
The bootstrap sample size is set to be B = 100.

comparison of tail dependence measures is left for future research. Other interesting future directions may be to explore multivariate extensions and compatibility problems; see Embrechts et al. (2016) and Hofert and Koike (2019).

References
Asimit, A. V., Gerrard, R., Hou, Y., and Peng, L. (2016). Tail dependence measure for examining financial extreme co-movements. Journal of Econometrics, 194(2):330–348.

Blomqvist, N. (1950). On a measure of dependence between two random variables. The Annals of Mathematical Statistics, pages 593–600.

Bücher, A. (2014). A note on nonparametric estimation of bivariate tail dependence. Statistics & Risk Modeling, 31(2):151–162.

Charpentier, A. (2003). Tail distribution and dependence measures. In Proceedings of the 34th ASTIN Conference, pages 1–25.

Durante, F. and Sempi, C. (2015). Principles of copula theory. CRC Press.

Edwards, H. H. and Taylor, M. D. (2009). Characterizations of degree one bivariate measures of concordance. Journal of Multivariate Analysis, 100(8):1777–1791.

Embrechts, P., Hofert, M., and Wang, R. (2016). Bernoulli and tail-dependence compatibility. The Annals of Applied Probability, 26(3):1636–1658.

Fredricks, G. A. and Nelsen, R. B. (1997). Copulas constructed from diagonal sections. In Distributions with given marginals and moment problems, pages 129–136. Springer.

Furman, E., Su, J., and Zitikis, R. (2015). Paths and indices of maximal tail dependence. ASTIN Bulletin, 45(3):661–678.

Gini, C. (1914). L'ammontare e la composizione della ricchezza delle nazioni, volume 62. Fratelli Bocca.

Hofert, M. and Koike, T. (2019). Compatibility and attainability of matrices of correlation-based measures of concordance. ASTIN Bulletin: The Journal of the IAA, 49(3):885–918.

Jaworski, P. (2004). On uniform tail expansions of bivariate copulas. Applicationes Mathematicae, 31(4):397–415.

Jaworski, P. (2006). On uniform tail expansions of multivariate copulas and wide convergence of measures. Applicationes Mathematicae, 33:159–184.

Jaworski, P. (2010). Tail behaviour of copulas. In Copula theory and its applications, pages 161–186. Springer.

Jaworski, P. (2019). On extreme value copulas with given concordance measures. In International Summer School on Aggregation Operators, pages 29–46. Springer.

Joe, H. and Li, H. (2019). Tail densities of skew-elliptical distributions. Journal of Multivariate Analysis, 171:421–435.

Joe, H., Li, H., and Nikoloulopoulos, A. K. (2010). Tail dependence functions and vine copulas. Journal of Multivariate Analysis, 101(1):252–270.

Kendall, M. G. (1938). A new measure of rank correlation. Biometrika, 30(1/2):81–93.

Klüppelberg, C., Kuhn, G., and Peng, L. (2007). Estimating the tail dependence function of an elliptical distribution. Bernoulli, 13(1):229–251.

Kortschak, D. and Albrecher, H. (2009). Asymptotic results for the sum of dependent non-identically distributed random variables. Methodology and Computing in Applied Probability, 11(3):279–306.

Li, H. (2013). Dependence comparison of multivariate extremes via stochastic tail orders. In Stochastic orders in reliability and risk, pages 363–387. Springer.

Li, H. and Wu, P. (2013). Extremal dependence of copulas: A tail density approach. Journal of Multivariate Analysis, 114:99–111.

Nelsen, R. B. (2006). An introduction to copulas. Springer, New York.

Nelsen, R. B. and Fredricks, G. A. (1997). Diagonal copulas. In Distributions with given marginals and moment problems, pages 121–128. Springer.

Nikoloulopoulos, A. K., Joe, H., and Li, H. (2009). Extreme value properties of multivariate t copulas. Extremes, 12(2):129–148.

Scarsini, M. (1984). On measures of concordance. Stochastica, 8(3):201–218.

Schmid, F. and Schmidt, R. (2007). Multivariate conditional versions of Spearman's rho and related measures of tail dependence. Journal of Multivariate Analysis, 98(6):1123–1140.

Schmidt, R. and Stadtmüller, U. (2006). Non-parametric estimation of tail dependence. Scandinavian Journal of Statistics, 33(2):307–335.

Schweizer, B. and Wolff, E. F. (1981). On nonparametric measures of dependence for random variables. The Annals of Statistics, 9(4):879–885.

Sibuya, M. (1960). Bivariate extreme statistics, I. Annals of the Institute of Statistical Mathematics, 11(3):195–210.

Smith, M. S., Gan, Q., and Kohn, R. J. (2012). Modelling dependence using skew t copulas: Bayesian inference and applications. Journal of Applied Econometrics, 27(3):500–522.

Spearman, C. (1904). "General intelligence," objectively determined and measured. The American Journal of Psychology, 15(2):201–292.

Taylor, A. E. (1985). General theory of functions and integration. Courier Corporation.

Figure 8: Plots of the estimated tail dependence coefficients against k for the copulas of the filtered return series of (DJ, NASDAQ), (DJ, FTSE), (DJ, HSI) and (DJ, NIKKEI).

Figure 9: Plots of the estimated tail dependence functions for the filtered stock returns of (a)(b) (DJ, NASDAQ), (c)(d) (DJ, FTSE), (e)(f) (DJ, HSI) and (g)(h) (DJ, NIKKEI). The black solid lines indicate bootstrap means of the estimated TDFs and the dotted lines their 95% bootstrap confidence intervals. The bootstrap sample size is set to be B = 100. The colored horizontal lines represent the maximal tail dependence coefficient (Maximal TDC, red) and the tail dependence coefficient (TDC, blue).

Figure 10: Plots of the estimated normalized tail dependence functions Λ_n(t, t)/t and Λ_n(1, t)/t for the filtered stock returns of (a)(b) (DJ, NASDAQ), (c)(d) (DJ, FTSE), (e)(f) (DJ, HSI) and (g)(h) (DJ, NIKKEI). The black solid lines indicate bootstrap means of the estimated normalized TDFs and the dotted lines their 95% bootstrap confidence intervals. The bootstrap sample size is set to be B = 100.
Appendices

A Examples and properties of tail dependence function
The tail dependence function (TDF) plays an important role in quantifying extremal co-movements between random variables. In this appendix we review examples and properties of the TDF.
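The TDF can also be estimated from data through the empirical copula, with the TDC Λ(1, 1; C) approximated by C_n(p, p)/p for a small threshold p. The following simulation sketch uses the Python standard library only; the parameter values θ = 2, n = 20000 and p = 0.02, as well as the function names, are illustrative choices and not taken from the paper.

```python
import random

# Simulate from a Clayton copula via the frailty (Marshall-Olkin) algorithm:
# V ~ Gamma(1/theta, 1), E1, E2 ~ Exp(1), U = (1 + E1/V)^(-1/theta), etc.
def rclayton(n, theta, seed=0):
    rng = random.Random(seed)
    sample = []
    for _ in range(n):
        g = rng.gammavariate(1.0 / theta, 1.0)   # frailty variable
        e1 = rng.expovariate(1.0)
        e2 = rng.expovariate(1.0)
        u = (1.0 + e1 / g) ** (-1.0 / theta)
        v = (1.0 + e2 / g) ** (-1.0 / theta)
        sample.append((u, v))
    return sample

# Empirical copula C_n(u, v) = (1/n) * #{i : U_i <= u, V_i <= v}
def emp_copula(sample, u, v):
    return sum(1 for (a, b) in sample if a <= u and b <= v) / len(sample)

theta, n, p = 2.0, 20000, 0.02
sample = rclayton(n, theta)
tdc_hat = emp_copula(sample, p, p) / p       # empirical C_n(p, p)/p
tdc_true = 2.0 ** (-1.0 / theta)             # Clayton lower TDC 2^(-1/theta)
assert abs(tdc_hat - tdc_true) < 0.15
```

The loose tolerance reflects the sampling noise at the threshold p; sharper estimates require larger n or the bootstrap schemes used in the numerical experiments above.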
A.1 Examples of tail dependence function
In this section we provide examples of tail dependence functions for various copulas.
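The defining limit Λ(u, v; C) = lim_{p↓0} C(pu, pv)/p behind the examples below can be verified numerically. A minimal sketch for the Clayton copula C(u, v) = (u^{−θ} + v^{−θ} − 1)^{−1/θ} (standard parametrization, θ > 0), whose closed-form TDF appears in Example A.2:

```python
# Clayton copula in the standard parametrization
def clayton(u, v, theta):
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

# Numerical approximation of Lambda(u, v) = lim_{p -> 0} C(pu, pv)/p
def tdf_numeric(copula, u, v, p=1e-6):
    return copula(p * u, p * v) / p

# Closed-form Clayton TDF (cf. Example A.2)
def tdf_clayton(u, v, theta):
    return (u ** (-theta) + v ** (-theta)) ** (-1.0 / theta)

theta = 2.0
for (u, v) in [(1.0, 1.0), (0.5, 1.5), (2.0, 0.3)]:
    approx = tdf_numeric(lambda a, b: clayton(a, b, theta), u, v)
    exact = tdf_clayton(u, v, theta)
    assert abs(approx - exact) < 1e-4
# At u = v = 1 the TDF reduces to the TDC 2^(-1/theta).
```

The same numerical check applies to any copula with an explicit formula, which is a convenient way to cross-validate the closed forms listed below.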
Example A.1 (Fréchet, Gaussian and t copulas).

1. Fundamental copulas: It is straightforward to verify that Λ(u, v; M) = M(u, v) and Λ(u, v; Π) = Λ(u, v; W) ≡ 0.

2. Fréchet copulas: Let C^F_{α,β} = αM + βW + (1 − α − β)Π, 0 ≤ α, β ≤ 1, α + β ≤ 1, be the family of Fréchet copulas. Then Λ(u, v; C^F_{α,β}) = αM(u, v).

3. Gaussian copulas: By Proposition 8 of Jaworski (2006), we have that Λ(u, v; ξ ∗ C^Ga_ρ) ≡ 0 for ρ ∈ (−1, 1) and ξ ∈ {ι, σ₁, σ₂, σ₁ ∘ σ₂}.

4. t copulas: Denote by C^t_{ν,ρ} a t copula with ν > 0 degrees of freedom and correlation parameter ρ ∈ (−1, 1). The t copula has the tail dependence function

Λ(u, v; C^t_{ν,ρ}) = u T_{ν+1}(√((ν + 1)/(1 − ρ²)) (ρ − (v/u)^{−1/ν})) + v T_{ν+1}(√((ν + 1)/(1 − ρ²)) (ρ − (u/v)^{−1/ν})),

where T_ν is the cumulative distribution function of the Student t distribution with ν degrees of freedom. Since C^t_{ν,ρ} is increasing in ρ with respect to the concordance order, the tail dependence function Λ( · ; C^t_{ν,ρ}) inherits this monotonicity property. On the other hand, Nikoloulopoulos et al. (2009) observed that ν ↦ Λ(u, v; C^t_{ν,ρ}) is not monotone in general for fixed u, v and ρ, although the monotonicity holds when u = v = 1.

A formula for the tail dependence function of an elliptical copula can be found in Klüppelberg et al. (2007).

Example A.2 (Archimedean copulas). Consider an Archimedean copula C_ϕ(u, v) = ϕ(ϕ⁻¹(u) + ϕ⁻¹(v)), where ϕ is the so-called Archimedean generator. For ψ = ϕ⁻¹, let

E₀(ψ) = lim_{x↓0} xψ′(x)/ψ(x) and E₁(ψ) = lim_{x↓0} (−x)ψ′(1 − x)/ψ(1 − x),

assuming that the limits exist. Then Jaworski (2004) and Jaworski (2006) showed that C_ϕ has the tail dependence function

Λ(u, v) = (u^{−θ} + v^{−θ})^{−1/θ} if E₀(ψ) = −θ, 0 < θ < ∞; Λ(u, v) = M(u, v) if E₀(ψ) = −∞; and Λ(u, v) ≡ 0 if E₀(ψ) = 0,

and the survival Archimedean copula Ĉ_ϕ = σ₁ ∗ σ₂ ∗ C_ϕ has the tail dependence function

Λ(u, v) = u + v − (u^θ + v^θ)^{1/θ} if E₁(ψ) = θ, 1 < θ < ∞; Λ(u, v) = M(u, v) if E₁(ψ) = ∞; and Λ(u, v) ≡ 0 if E₁(ψ) = 1.

In particular, Clayton copulas C^Cl_θ with ϕ_θ(x) = (1 + x)^{−1/θ}, θ > 0, satisfy E₀(ψ_θ) = −θ and E₁(ψ_θ) = 1, and thus Λ(u, v; C^Cl_θ) = (u^{−θ} + v^{−θ})^{−1/θ} and Λ(u, v; Ĉ^Cl_θ) ≡ 0. More generally, it holds that Λ(u, v; C_ϕ) = (u^{−1/θ} + v^{−1/θ})^{−θ} if ϕ is an Archimedean generator which is regularly varying at ∞ with tail index θ > 0; see Proposition 2.5 of Joe et al. (2010). Next, Gumbel copulas C^Gu_θ with ψ_θ(x) = (−log x)^θ, θ ≥ 1, satisfy E₀(ψ_θ) = 0 and E₁(ψ_θ) = θ, and thus Λ(u, v; C^Gu_θ) ≡ 0 and Λ(u, v; Ĉ^Gu_θ) = u + v − (u^θ + v^θ)^{1/θ}. In general, we have that Λ(u, v; Ĉ_ϕ) = u + v − (u^θ + v^θ)^{1/θ} if ψ is regularly varying at 1 with tail index θ > 1.

Example A.3 (A singular copula from Section 3.2.1 of Nelsen (2006)). For θ ∈ I, consider a distribution function C_θ on I² for which the probability mass θ is uniformly distributed on the line segment from (0, 0) to (θ, 1), and the probability mass 1 − θ is uniformly distributed on the line segment from (θ, 1) to (1, 0). Then C_θ is a copula described by

C_θ(u, v) = u if θv ≥ u; C_θ(u, v) = θv if θv < u and (1 − θ)v < 1 − u; and C_θ(u, v) = u + v − 1 if (1 − θ)v ≥ 1 − u,

so that C₁ = M and C₀ = W. Hence we have that

Λ_θ(u, v) = u if θv ≥ u, and Λ_θ(u, v) = θv if θv < u,

that is, Λ_θ(u, v) = min(u, θv).

Example A.4 (Survival extreme value copulas). Consider the bivariate extreme value (EV) copula

C_A(u, v) = exp{(log u + log v) A(log u/(log u + log v))},   (19)

where A ∈ A is the so-called Pickands dependence function (PDF) with

A = {A : I → [1/2, 1] : A is convex and max(w, 1 − w) ≤ A(w) ≤ 1 for all w ∈ I}.

For instance, A ≡ 1 leads to C_A = Π and A(w) = max(w, 1 − w) leads to C_A = M. Further examples include the asymmetric Gumbel and the Galambos copulas, which are implied, respectively, by

A^Gu_{α,β,θ}(w) = (1 − α)w + (1 − β)(1 − w) + {(αw)^θ + (β(1 − w))^θ}^{1/θ}, 1 ≤ θ < ∞ and 0 < α, β ≤ 1,

and

A^Ga_{α,β,θ}(w) = 1 − {(αw)^{−θ} + (β(1 − w))^{−θ}}^{−1/θ}, 0 < θ < ∞ and 0 < α, β ≤ 1.

For f, g : R^d → R, denote the relationship lim_{x→y} f(x)/g(x) = 1 for y ∈ R^d by f ≃ g (x → y). Using the relationships log(1 − x) ≃ −x and 1 − x ≃ e^{−x} (x ↓ 0), the survival EV copula Ĉ_A has the tail dependence function

Λ(u, v; Ĉ_A) = u + v − (u + v) A(u/(u + v)).   (20)

Other approaches to specify tail dependence functions based on tail densities and copula densities can be found in Li and Wu (2013), Joe and Li (2019) and Proposition 8.3.2 of Jaworski (2010).

A.2 Basic properties of tail dependence function
In this section we provide a brief review of the properties of the TDF and of the set of all TDFs. We begin with the basic properties summarized in the next two propositions.
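The properties collected in the next two propositions can be sanity-checked numerically for a concrete TDF. A minimal sketch (plain Python with finite differences; a numerical illustration, not a proof) using the Clayton tail dependence function Λ(u, v) = (u^{−θ} + v^{−θ})^{−1/θ} from Example A.2:

```python
# Clayton TDF Lambda(u, v) = (u^-t + v^-t)^(-1/t); check 1-homogeneity,
# the bounds 0 <= Lambda <= M, superadditivity and Euler's relation.
def lam(u, v, t=2.0):
    return (u ** (-t) + v ** (-t)) ** (-1.0 / t)

def d1(u, v, t=2.0, h=1e-7):   # partial derivative in u (central difference)
    return (lam(u + h, v, t) - lam(u - h, v, t)) / (2 * h)

def d2(u, v, t=2.0, h=1e-7):   # partial derivative in v
    return (lam(u, v + h, t) - lam(u, v - h, t)) / (2 * h)

u, v, s, c = 0.7, 1.3, 0.4, 2.5
# homogeneity: Lambda(c u, c v) = c Lambda(u, v)
assert abs(lam(c * u, c * v) - c * lam(u, v)) < 1e-10
# bounds: 0 <= Lambda(u, v) <= M(u, v) = min(u, v)
assert 0.0 <= lam(u, v) <= min(u, v)
# superadditivity: Lambda(u1+u2, v1+v2) >= Lambda(u1, v1) + Lambda(u2, v2)
assert lam(u + s, v + s) >= lam(u, v) + lam(s, s) - 1e-12
# Euler's theorem: Lambda = u * d1(Lambda) + v * d2(Lambda)
assert abs(lam(u, v) - (u * d1(u, v) + v * d2(u, v))) < 1e-5
```

Such checks are useful when implementing new parametric TDFs, since violations of homogeneity or of the bounds typically indicate an algebraic mistake.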
Proposition A.5 (Basic properties of tail dependence function). Let Λ ∈ L.

1. (2-increasingness): Λ is 2-increasing, that is, Λ(u₂, v₂) − Λ(u₂, v₁) − Λ(u₁, v₂) + Λ(u₁, v₁) ≥ 0 for 0 ≤ u₁ ≤ u₂ and 0 ≤ v₁ ≤ v₂.
2. (Monotonicity): Λ(u₁, v₁) ≤ Λ(u₂, v₂) for 0 ≤ u₁ ≤ u₂ and 0 ≤ v₁ ≤ v₂. If Λ ≢ 0, then Λ(u₁, v₁) < Λ(u₂, v₂) for 0 < u₁ < u₂ and 0 < v₁ < v₂.
3. (Groundedness): Λ is grounded, that is, Λ(u, v) = 0 if u or v is zero.
4. (Homogeneity): Λ(tu, tv) = tΛ(u, v) for every t ≥ 0 and (u, v) ∈ R₊².
5. (Degeneracy): Λ(u, v) = 0 for all (u, v) ∈ R₊² if and only if Λ(u₀, v₀) = 0 for some u₀, v₀ > 0.
6. (Coherence): If C₁ ≼ C₂ for C₁, C₂ ∈ C_{L2}, then Λ(u, v; C₁) ≤ Λ(u, v; C₂) for all (u, v) ∈ R₊².
7. (Bounds): 0 ≤ Λ(u, v) ≤ M(u, v) for all (u, v) ∈ R₊² and the bounds are attainable.
8. (Max-min inequalities): (s ∧ t)Λ(u, v) ≤ Λ(su, tv) ≤ (s ∨ t)Λ(u, v) for every s, t ≥ 0 and (u, v) ∈ R₊².
9. (Superadditivity): Λ(u₁ + u₂, v₁ + v₂) ≥ Λ(u₁, v₁) + Λ(u₂, v₂) for every (u₁, v₁), (u₂, v₂) ∈ R₊².
10. (Concavity): Λ(t(u₁, v₁) + (1 − t)(u₂, v₂)) ≥ tΛ(u₁, v₁) + (1 − t)Λ(u₂, v₂) for every t ∈ I and (u₁, v₁), (u₂, v₂) ∈ R₊².

Proof.
Proposition A.6 (Continuity and derivatives of Λ). Let Λ ∈ L.

1. (Continuity): |Λ(u₁, v₁) − Λ(u₂, v₂)| ≤ |u₁ − u₂| + |v₁ − v₂| for every (u₁, v₁), (u₂, v₂) ∈ R₊², and thus Λ is Lipschitz continuous.
2. (Continuity along the subdiagonal): The function t ↦ Λ(t, 1 − t) on I is concave and continuous.
3. (Partial derivatives): The partial derivatives ∂₁Λ(u, v) and ∂₂Λ(u, v) exist almost everywhere on (u, v) ∈ R₊². Moreover, 0 ≤ ∂₁Λ(u, v), ∂₂Λ(u, v) ≤ 1, and v ↦ ∂₁Λ(u, v) and u ↦ ∂₂Λ(u, v) are increasing almost everywhere on R₊².
4. (Euler's theorem): Λ(u, v) = u∂₁Λ(u, v) + v∂₂Λ(u, v) for (u, v) ∈ R₊².
5. (Conditional tail probability function): Suppose that C ∈ C has continuous second-order partial derivatives. Then, for any (u, v) ∈ R₊², it holds that

∂₁Λ(u, v) = lim_{p↓0} P(V ≤ pv | U = pu) = lim_{p↓0} ∂₁C(pu, pv),
∂₂Λ(u, v) = lim_{p↓0} P(U ≤ pu | V = pv) = lim_{p↓0} ∂₂C(pu, pv).

Proof. 1) and 3) are shown in Theorem 1 and Theorem 3 of Schmidt and Stadtmüller (2006), respectively. 2) can be found in Corollary 1 of Jaworski (2004). 4) is the well-known Euler's homogeneous function theorem. 5) can be found in Nikoloulopoulos et al. (2009).

Next we introduce the characterization of the set L.

Proposition A.7 (Characterizations of L).
1. The set of all tail dependence functions can be written as

L = {Λ : R₊² → R₊ : Λ is 1-homogeneous, 2-increasing and 0 ≤ Λ ≤ M}.   (21)

2. There is a one-to-one correspondence between L and A, the set of all Pickands dependence functions.

Proof.
1) Denote by L̃ the right-hand side of (21). Then the inclusion L ⊆ L̃ holds by Proposition A.5. To show that L̃ ⊆ L, define the function

C_Λ(u, v) = max(Λ(u, v), u + v − 1), (u, v) ∈ I²,   (22)

for Λ ∈ L̃. Then C_Λ is a copula by Theorem 2 of Jaworski (2004). Moreover, for (u, v) ∈ R₊², we have that

lim_{p↓0} C_Λ(pu, pv)/p = lim_{p↓0} Λ(pu, pv)/p = Λ(u, v)

since Λ(pu, pv) ≥ 0 > pu + pv − 1 for all sufficiently small p ∈ (0, 1). Hence Λ(u, v; C_Λ) = Λ(u, v) and thus Λ ∈ L.

2) Let Λ ∈ L. Then the function defined by A(w) = 1 − Λ(w, 1 − w) is a Pickands dependence function by Part 7) of Proposition A.5 and Part 2) of Proposition A.6. Next, let A ∈ A. Then the function defined by (20) is the tail dependence function of the survival EV copula (19), and thus the desired result follows.

Remark A.8 (Implications of Proposition A.7).

1. A copula with prescribed tail dependence function: As shown in the proof of Proposition A.7, one can construct a copula with prescribed tail dependence function Λ ∈ L by Construction (22).
2. Restriction to EV copulas: The mappings specified in the proof of Part 2) of Proposition A.7 determine a bijection between Pickands dependence functions and tail dependence functions of survival EV copulas. Therefore, one can construct tail dependence measures of survival EV copulas based on their Pickands dependence functions; see Section 3 of Jaworski (2019) for this approach.

3. PDF-based tail dependence measures: As in Jaworski (2019), one can construct tail dependence measures for a given Λ ∈ L based on the induced Pickands dependence function determined by A(w) = 1 − Λ(w, 1 − w). By the construction, the resulting measures depend only on the values of Λ on the subdiagonal.

We end this section with an example showing that closedness of C_{L2} does not hold in general.

Example A.9 (C_{L2} is not closed in general). In this example we provide a sequence of copulas (C_n) ⊂ C such that C_n ∈ C_{L2}, n = 1, 2, …, which converges pointwise to a copula C ∈ C with C ∉ C_{L2}. A function δ : I → I is called a diagonal section if it satisfies the following conditions:

(1) δ(1) = 1,
(2) δ(u) ≤ u for all u ∈ I,
(3) δ is increasing, and
(4) |δ(v) − δ(u)| ≤ 2|v − u| for all (u, v) ∈ I².

For a given diagonal section δ, the function

C_δ(u, v) = min(u, v, (δ(u) + δ(v))/2), (u, v) ∈ I²,

is a copula, called the diagonal copula, and it satisfies C_δ(u, u) = δ(u), u ∈ I; see Fredricks and Nelsen (1997) and Nelsen and Fredricks (1997). Let δ₁, δ₂ be two strictly increasing diagonal sections such that 2u − 1 < δ₁(u) < δ₂(u) < u for all u ∈ (0, 1). Define

h(u) = min{t > 0 : δ₁(u) + 2t = δ₂(u + t)},

and breakpoints u₀ = 1/2, u_{2i−1} = δ₁⁻¹(δ₂(u_{2i−2})) and u_{2i} = u_{2i−1} + h(u_{2i−1}), i ∈ N. For n ∈ N, define the function δ^(n) : I → I by

δ^(n)(u) = δ₂(u), if 0 ≤ u ≤ 1/2,
           δ₂(u_{j−1}), if u_{j−1} ≤ u < u_j for odd j ≤ n,
           δ₁(u_{j−1}) + 2(u − u_{j−1}), if u_{j−1} ≤ u < u_j for even j ≤ n,
           δ₁(u) 1{n odd} + δ₂(u) 1{n even}, if u_n ≤ u ≤ 1.

Then δ^(n) and its limit

δ(u) = δ₂(u), if 0 ≤ u ≤ 1/2,
       δ₂(u_{j−1}), if u_{j−1} ≤ u < u_j for odd j ∈ N,
       δ₁(u_{j−1}) + 2(u − u_{j−1}), if u_{j−1} ≤ u < u_j for even j ∈ N,

are diagonal sections, which can be checked similarly as in Lemma 4.3 of Kortschak and Albrecher (2009). Moreover, Ĉ_{δ^(n)} → Ĉ_δ pointwise by their constructions. However, it holds that

Λ(1, 1; Ĉ_{δ^(n)}) = Λ(1, 1; Ĉ_{δ₁}) if n is odd, and Λ(1, 1; Ĉ_{δ^(n)}) = Λ(1, 1; Ĉ_{δ₂}) if n is even,

and thus lim_{n→∞} Λ(1, 1; Ĉ_{δ^(n)}) does not exist when, for example, δ₁(u) = u² and δ₂(u) = u^{3/2}.

B Statistical inference on tail dependence measures
In this section we briefly develop statistical inference for the proposed tail dependence measures under the assumption of known or unknown marginal distributions. To this end, let (X_i, Y_i), i = 1, …, n, be an i.i.d. sample of (X, Y) ∼ H with continuous marginal distributions X ∼ F, Y ∼ G and copula C. If F and G are known, then (U_i, V_i) = (F(X_i), G(Y_i)), i = 1, …, n, is an i.i.d. sample from C. If F and G are unknown, then (Û_i, V̂_i) = (F̂_n(X_i), Ĝ_n(Y_i)), where

F̂_n(x) = (1/n) ∑_{i=1}^n 1{X_i ≤ x} and Ĝ_n(y) = (1/n) ∑_{i=1}^n 1{Y_i ≤ y},

is called the pseudo sample from C. Denote by

C_n(u, v) = (1/n) ∑_{i=1}^n 1{U_i ≤ u, V_i ≤ v} and Ĉ_n(u, v) = (1/n) ∑_{i=1}^n 1{Û_i ≤ u, V̂_i ≤ v}

the empirical copulas based on the samples (U_i, V_i) and (Û_i, V̂_i), respectively.

Let µ ∈ M and denote by H_µ the joint cumulative distribution function (cdf) of µ with marginal cdfs F_µ and G_µ. For p ∈ (0, 1], observe that

∫_{I²} C_n(pu, pv) dµ(u, v) = (1/n) ∑_{i=1}^n ∫_{I²} 1{U_i ≤ pu, V_i ≤ pv} dµ(u, v)
= (1/n) ∑_{i=1}^n µ({(u, v) ∈ I² : U_i/p ≤ u ≤ 1, V_i/p ≤ v ≤ 1})
= 1 − (1/n) ∑_{i=1}^n F_µ((U_i/p)⁻) − (1/n) ∑_{i=1}^n G_µ((V_i/p)⁻) + (1/n) ∑_{i=1}^n H_µ((U_i/p)⁻, (V_i/p)⁻),

and that

∫_{I²} Ĉ_n(pu, pv) dµ(u, v) = 1 − (1/n) ∑_{i=1}^n F_µ((Û_i/p)⁻) − (1/n) ∑_{i=1}^n G_µ((V̂_i/p)⁻) + (1/n) ∑_{i=1}^n H_µ((Û_i/p)⁻, (V̂_i/p)⁻).

Therefore, provided that H_µ, F_µ, G_µ and ∫_{I²} M dµ ∈ (0,