arXiv [math.ST] RESEARCH PAPER
Multi-Gaussian random variables
O. Korotkova
Department of Physics, University of Miami, 1320 Campo Sano Dr., Coral Gables, FL 33146
ARTICLE HISTORY
Compiled September 22, 2020
ABSTRACT
A generalization of the classic Gaussian random variable to the family of Multi-Gaussian (MG) random variables, characterized by a shape parameter M > 0 in addition to the mean and the standard deviation, is introduced. The probability density function of the MG family members is an alternating series of Gaussian functions with suitably chosen heights and widths. In particular, for integer values of M the series has a finite number of terms and leads to flattened profiles, while reducing to the classic Gaussian density for M = 1. For non-integer, positive values of M a convergent infinite series of Gaussian functions is obtained that can be truncated in practical problems. While for all M > 1 the density profiles are flattened, for 0 < M < 1 they acquire cusped tops.
KEYWORDS
Normal; Gaussian; Log-normal; Probability density function; Multivariate; Bivariate
1. Introduction
The most famous Probability Density Function (PDF) of a continuous random variable, the Gaussian, stemming from the early works of de Moivre [1] and Gauss [2], can be generalized in a number of ways to include desired shape details such as flattening, skewing, splitting, etc. One well-known generalization was proposed by Subbotin [3] (see also [4]), who extended the PDF curve to flatter or sharper versions by varying the power law of the exponential function; hence the family is sometimes termed exponential-power or super-Gaussian. Subbotin's PDF was later rescaled by Lunetta [5], and the resulting family has become well explored (cf. [6]). While originally used for the analysis of astrophysical data, super-Gaussian random variables are currently employed for the characterization of a wide range of statistical phenomena: in big data analysis in general [7] and, in particular, in finance [8], [9], genetics [10] and scientific impact assessment [11], to name a few. However, this seemingly transparent generalization often relies on the use of special functions, such as the hypergeometric function, as is the case in the evaluation of its characteristic function [12] (see also [13]). In this paper we introduce a novel family of continuous random variables that serves a similar purpose to the super-Gaussian family, i.e., it reshapes the Gaussian distribution into flat-top or cusp-top versions, depending on the value of the shape parameter.
CONTACT O. Korotkova. Email: [email protected]

However, the main advantage of our family over the super-Gaussians stems from the fact that the statistical properties of its members can be expressed as series of those for Gaussian random variables, with very simple expressions defining their heights and widths. Moreover, when the shape parameter is an integer, the flat-top distributions are formed by a finite number of terms in the series. As we show, the ability to represent the PDF of the new random variable as a linear combination of Gaussian contributions leads to unprecedented tractability in the derivation of a number of its characteristics. For any value of the shape parameter we will term our new family of random variables
Multi-Gaussian (MG), not to be confused with the well-known multivariate Gaussian. Multi-Gaussian functions of various dimensions have been previously used in optics for modeling beam intensity profiles [14], aperture shapes [15], scattering potentials [16], [17] and various correlation and coherence functions [18]-[21].

Starting from the finite-series case (integer values of the shape parameter) we first build some intuition for the new random variable. In particular, after introducing the PDF, we provide calculations of its Cumulative Distribution Function (CDF), Moment Generating Function (MGF), Characteristic Function (CF) and Cumulant Generating Function (CGF). Then we derive the general expressions for the moments and the cumulants of any order and find explicit expressions for the first four members of each sequence.

Next, we introduce the Log-Multi-Gaussian (LMG) random variable on assuming that its logarithm is MG-distributed. Such a derived distribution can be regarded as an extension of the classic Log-Normal distribution [23] to flat-topped profiles. The LMG distribution has an analog derived from the super-Gaussian family [24], [25]; however, as we show, the major statistical characteristics of the LMG random variables can be obtained almost effortlessly, as linear combinations of the well-known results for Log-Normal variables.

We also discuss the natural extension of the univariate MG random variable to the multivariate domain and, in particular, the bivariate domain. Such variables reduce to classic multivariate/bivariate Gaussians if the shape parameter M = 1.

Finally, we show how to generalize all the aforementioned results to MG distributions with shape parameter M taking on any positive value, not necessarily integer, while discussing numerical examples for the special case when M is a reciprocal of an integer, leading to the formation of various cusped distributions.
The features of the LMG and bivariate MG random variables with any positive M are also briefly outlined and the corresponding numerical results are provided.
2. Multi-Gaussian distribution with integer shape index M

Let us begin by recalling that the Gaussian PDF of a continuous real random variable X with mean µ (location parameter) and standard deviation σ (scale parameter), i.e. X ∼ N(µ, σ), has the form

p_X^{(G)}(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]. (1)

Consider now the function

f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \left\{ 1 - \left(1 - \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]\right)^M \right\}, (2)

where M is a positive real number. For M = 1, f(x) reduces to the Gaussian function in Eq. (1), while for M > 1 it becomes flat-topped and for 0 < M < 1 cusp-topped; we will refer to M as a shape parameter.

Let us first discuss the case when M is an integer, M = 1, 2, 3, .... On using the binomial theorem

(u + v)^M = \sum_{m=0}^{M} \binom{M}{m} u^{M-m} v^m, \qquad \binom{M}{m} = \frac{M!}{m!\,(M-m)!}, (3)

with u = 1, v = -\exp\left[-(x-\mu)^2/(2\sigma^2)\right], we arrive at the finite series

f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \sum_{m=1}^{M} \binom{M}{m} (-1)^{m-1} \exp\left[-\frac{(x-\mu)^2}{2\sigma_m^2}\right], (4)

with the standard deviation of the m-th term in the series of the form

\sigma_m = \sigma/\sqrt{m}, (5)

and the same mean µ for all terms. Let us now introduce a PDF as

p_X^{(MG)}(x) = f(x) \Big/ \int_{-\infty}^{\infty} f(x)\,dx, (6)

ensuring that \int_{-\infty}^{\infty} p_X^{(MG)}(x)\,dx = 1. On finding that

C(M) = \int_{-\infty}^{\infty} f(x)\,dx = \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}}, (7)

we finally obtain the Multi-Gaussian
PDF:

p_X^{(MG)}(x) = \frac{1}{C(M)\sqrt{2\pi}\,\sigma} \sum_{m=1}^{M} \binom{M}{m} (-1)^{m-1} \exp\left[-\frac{(x-\mu)^2}{2\sigma_m^2}\right]. (8)

We may also say that X ∼ N(µ, σ, M) as a generalization of the normal variable to any value of the shape parameter M.

The Cumulative Distribution Function (CDF) of the Multi-Gaussian random variable in Eq. (8) can then be readily calculated from its definition, on changing the order of summation and integration:

P_X^{(MG)}(x) = \int_{-\infty}^{x} p_X^{(MG)}(s)\,ds = \frac{1}{2C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \left[1 + \mathrm{Erf}\left(\frac{x-\mu}{\sqrt{2}\,\sigma_m}\right)\right], (9)

where Erf stands for the error function. Figures 1 (A) and (B) show the PDF and the CDF of a MG random variable for M = 1, 2,
10 and 40, calculated from Eqs. (8) and (9), respectively. Figure 2 shows the PDF for M = 10 and different values of µ and σ.

Figure 1.
Multi-Gaussian PDF (A) and CDF (B) with µ = 0, σ = 1.
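Since Eqs. (2), (7)-(9) involve only elementary functions, they are straightforward to check numerically. The following Python sketch (illustrative helper names, not part of the paper) evaluates the MG PDF both as the finite series of Eq. (8) and from the normalized closed form of Eq. (2), together with the CDF of Eq. (9):

```python
import numpy as np
from math import comb, erf, sqrt, pi

def C(M: int) -> float:
    """Normalization constant of Eq. (7)."""
    return sum(comb(M, m) * (-1) ** (m - 1) / sqrt(m) for m in range(1, M + 1))

def mg_pdf(x, mu=0.0, sigma=1.0, M=10):
    """Multi-Gaussian PDF of Eq. (8): a finite alternating series of Gaussians
    with common mean mu and widths sigma_m = sigma / sqrt(m), Eq. (5)."""
    x = np.asarray(x, dtype=float)
    s = np.zeros_like(x)
    for m in range(1, M + 1):
        s += comb(M, m) * (-1) ** (m - 1) * np.exp(-m * (x - mu) ** 2 / (2 * sigma ** 2))
    return s / (C(M) * sqrt(2 * pi) * sigma)

def mg_pdf_closed_form(x, mu=0.0, sigma=1.0, M=10):
    """Same density from the normalized closed form of Eq. (2)."""
    x = np.asarray(x, dtype=float)
    g = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
    return (1.0 - (1.0 - g) ** M) / (C(M) * sqrt(2 * pi) * sigma)

def mg_cdf(x, mu=0.0, sigma=1.0, M=10):
    """Multi-Gaussian CDF of Eq. (9): a weighted sum of Gaussian CDFs."""
    def scalar(xx):
        tot = sum(comb(M, m) * (-1) ** (m - 1) / sqrt(m)
                  * (1.0 + erf((xx - mu) * sqrt(m) / (sqrt(2) * sigma)))
                  for m in range(1, M + 1))
        return tot / (2.0 * C(M))
    return np.vectorize(scalar)(x)
```

For M = 1 both density routines collapse to the classic Gaussian, and the series and closed-form evaluations agree to machine precision.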
Figure 2.
Multi-Gaussian PDF with M = 10 and various values of µ and σ.

2.2. Moment Generating Function, Characteristic Function and Moments

The Moment Generating Function (MGF) of the MG random variable in Eq. (8) can also be directly evaluated from its definition:

M_X^{(MG)}(t) = \int_{-\infty}^{\infty} \exp[xt]\, p_X^{(MG)}(x)\,dx = \frac{\exp[\mu t]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[\frac{\sigma_m^2 t^2}{2}\right]. (10)

The Characteristic Function (CF) of the MG distribution can be readily obtained either from the definition or from its relation with the MGF:

\phi_X^{(MG)}(\omega) = \int_{-\infty}^{\infty} \exp[i\omega x]\, p_X^{(MG)}(x)\,dx = M_X^{(MG)}(i\omega) = \frac{\exp[i\omega\mu]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[-\frac{\sigma_m^2 \omega^2}{2}\right]. (11)

The statistical moment of order k can be either evaluated from the MGF in Eq. (10) via the expression

\mu_k^{(MG)} = \frac{d^k}{dt^k}\left[M_X^{(MG)}(t)\right]_{t=0}, (12)

or directly via the PDF in Eq. (8). Following the latter path we find that

\mu_k^{(MG)} = \int_{-\infty}^{\infty} x^k p_X^{(MG)}(x)\,dx = \frac{1}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \int_{-\infty}^{\infty} \frac{x^k}{\sqrt{2\pi}\,\sigma_m} \exp\left[-\frac{(x-\mu)^2}{2\sigma_m^2}\right] dx, (13)

where we recognize that each integral term is the k-th moment of the Gaussian distribution [see Eq. (1)] with mean µ and standard deviation σ_m:

\mu_k^{(G)} = \sigma_m^k \left(-i\sqrt{2}\right)^k U\left(-\frac{k}{2}, \frac{1}{2}, -\frac{\mu^2}{2\sigma_m^2}\right), (14)

where U(a, b, z) is the confluent hypergeometric function. In particular, on substituting the first four moments of the Gaussian distribution from Eq. (14) into Eq. (13) we find at once that

\mu_1^{(MG)} = \mu,
\mu_2^{(MG)} = \mu^2 + \sigma^2 \xi_1(M),
\mu_3^{(MG)} = \mu^3 + 3\mu\sigma^2 \xi_1(M),
\mu_4^{(MG)} = \mu^4 + 6\mu^2\sigma^2 \xi_1(M) + 3\sigma^4 \xi_2(M). (15)

Here the parameters ξ_n(M) are defined via the ratios

\xi_n(M) = \frac{C_n(M)}{C(M)}, \qquad n = 0, 1, 2, \dots, (16)

where

C_n(M) = \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{m^n \sqrt{m}}, (17)

so that, in particular, C_0(M) coincides with C(M) defined in Eq. (7). Also, ξ_n(1) = 1 for any n = 1, 2, 3, ..., hence the sequence (15) reduces to that in Eq. (14).
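The moment formulas of Eqs. (15)-(17) are easy to validate against direct quadrature of the PDF; the sketch below (with hypothetical helper names `C_n`, `xi`, `mg_moment_numeric`) does so for the first four raw moments:

```python
import numpy as np
from math import comb, sqrt, pi

def C_n(M: int, n: int) -> float:
    """C_n(M) of Eq. (17); C_0(M) coincides with C(M) of Eq. (7)."""
    return sum(comb(M, m) * (-1) ** (m - 1) / (m ** n * sqrt(m)) for m in range(1, M + 1))

def xi(M: int, n: int) -> float:
    """xi_n(M) = C_n(M) / C(M), Eq. (16)."""
    return C_n(M, n) / C_n(M, 0)

def mg_moment_numeric(k: int, mu: float, sigma: float, M: int) -> float:
    """k-th raw moment of the MG distribution by quadrature of Eq. (8)."""
    x = np.linspace(mu - 15 * sigma, mu + 15 * sigma, 200001)
    s = np.zeros_like(x)
    for m in range(1, M + 1):
        s += comb(M, m) * (-1) ** (m - 1) * np.exp(-m * (x - mu) ** 2 / (2 * sigma ** 2))
    pdf = s / (C_n(M, 0) * sqrt(2 * pi) * sigma)
    return float(np.trapz(x ** k * pdf, x))
```

The numerically integrated moments reproduce Eq. (15) to quadrature accuracy for any tested (µ, σ, M).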
The Cumulant Generating Function (CGF) of the MG random variable has the form

K_X^{(MG)}(h) = \ln\left[M_X^{(MG)}(h)\right] = h\mu + \ln\left[\frac{1}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left(\frac{h^2\sigma_m^2}{2}\right)\right], (18)

as implied by Eq. (10). On expanding the exponential function in Eq. (18) in a Taylor series, interchanging the order of the two summations and recognizing the coefficients ξ_n(M) in the inner sum, we get

K_X^{(MG)}(h) = h\mu + \ln\left[\sum_{n=0}^{\infty} \frac{h^{2n}\sigma^{2n}}{2^n\, n!}\, \xi_n(M)\right]. (19)

Further, expanding the logarithmic function in Eq. (19) in a Taylor series we arrive at the double power series

K_X^{(MG)}(h) = \mu h + \sum_{p=1}^{\infty} \frac{(-1)^{p+1}}{p} \left[\sum_{n=1}^{\infty} \frac{h^{2n}\sigma^{2n}}{2^n\, n!}\, \xi_n(M)\right]^p. (20)

The cumulants can be found as the coefficients κ_p^{(MG)} of the powers of h in the expansion (20):

K_X^{(MG)}(h) = \sum_{p=1}^{\infty} \kappa_p^{(MG)} \frac{h^p}{p!}. (21)

In particular, the first four cumulants are:

\kappa_1^{(MG)} = \mu,
\kappa_2^{(MG)} = \sigma^2 \xi_1(M),
\kappa_3^{(MG)} = 0,
\kappa_4^{(MG)} = 3\sigma^4\left[\xi_2(M) - \xi_1^2(M)\right], (22)

and all cumulants of odd orders higher than three also vanish.
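A quick numerical cross-check of Eq. (22): the variance and fourth cumulant computed from ξ_1 and ξ_2 should match the central moments obtained by quadrature. (The script below is illustrative; the observation that κ_4 < 0 for M > 1, i.e. the flat-topped profiles are platykurtic, is an inference consistent with the figures rather than a statement from the text.)

```python
import numpy as np
from math import comb, sqrt

def xi(M: int, n: int) -> float:
    """xi_n(M) = C_n(M) / C_0(M), Eqs. (16)-(17)."""
    def c(nn):
        return sum(comb(M, m) * (-1) ** (m - 1) / (m ** nn * sqrt(m)) for m in range(1, M + 1))
    return c(n) / c(0)

# Standardized case mu = 0, sigma = 1, shape index M = 8
M = 8
kappa2 = xi(M, 1)                          # variance, Eq. (22)
kappa4 = 3.0 * (xi(M, 2) - xi(M, 1) ** 2)  # fourth cumulant, Eq. (22)

# Central moments by quadrature of the PDF of Eq. (8)
x = np.linspace(-15, 15, 200001)
pdf = np.zeros_like(x)
for m in range(1, M + 1):
    pdf += comb(M, m) * (-1) ** (m - 1) * np.exp(-m * x ** 2 / 2)
pdf /= np.trapz(pdf, x)
var_num = np.trapz(x ** 2 * pdf, x)
m4_num = np.trapz(x ** 4 * pdf, x)
```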
3. Log-Multi-Gaussian (LMG) distribution
Let the random variable X be MG-distributed and let Y be such that ln Y = X. Then, on letting Y = exp[X] = g(X) and X = ln[Y] = g^{-1}(Y), the PDF of the variable Y takes the form

p_Y(y) = p_X[g^{-1}(y)] \left|\frac{dg^{-1}(y)}{dy}\right| = p_X[\ln y]\, \frac{1}{|y|}, \qquad y > 0. (23)

Substitution of the MG PDF from Eq. (8) into Eq. (23) leads to the PDF

p_Y^{(LMG)}(y) = \frac{1}{C(M)\,\sigma\sqrt{2\pi}\,|y|} \sum_{m=1}^{M} \binom{M}{m} (-1)^{m-1} \exp\left[-\frac{(\ln y - \mu)^2}{2\sigma_m^2}\right], (24)

which we will term Log-Multi-Gaussian (LMG). We may say that Y ∼ LN(µ, σ, M). For M = 1 it reduces to the classic Log-Normal distribution [23]. The CDF of the LMG distribution can be readily found from the relation

P_Y^{(LMG)}(y) = P_X^{(MG)}(\ln y) = \frac{1}{2C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \left[1 + \mathrm{Erf}\left(\frac{\ln y - \mu}{\sqrt{2}\,\sigma_m}\right)\right], \qquad y > 0. (25)

The MGF of the LMG distribution diverges, but the moments can be found from the definition:

\mu_k^{(LMG)} = \int_{0}^{\infty} y^k p_Y^{(LMG)}(y)\,dy = \frac{1}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \left\{\frac{1}{\sqrt{2\pi}\,\sigma_m} \int_{0}^{\infty} y^{k-1} \exp\left[-\frac{(\ln y - \mu)^2}{2\sigma_m^2}\right] dy\right\}. (26)

The expression in the curly brackets is the k-th moment of the Log-Normal distribution with parameters µ and σ_m:

\mu_k^{(LN)} = \frac{1}{\sqrt{2\pi}\,\sigma_m} \int_{0}^{\infty} y^{k-1} \exp\left[-\frac{(\ln y - \mu)^2}{2\sigma_m^2}\right] dy = \exp\left[\frac{k(2\mu + k\sigma_m^2)}{2}\right]. (27)

Thus,

\mu_k^{(LMG)} = \frac{1}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[\frac{k(2\mu + k\sigma_m^2)}{2}\right]. (28)

In particular, the first four moments are:

\mu_1^{(LMG)} = \frac{\exp[\mu]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[\frac{\sigma_m^2}{2}\right],
\mu_2^{(LMG)} = \frac{\exp[2\mu]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[2\sigma_m^2\right],
\mu_3^{(LMG)} = \frac{\exp[3\mu]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[\frac{9\sigma_m^2}{2}\right],
\mu_4^{(LMG)} = \frac{\exp[4\mu]}{C(M)} \sum_{m=1}^{M} \binom{M}{m} \frac{(-1)^{m-1}}{\sqrt{m}} \exp\left[8\sigma_m^2\right].
(29)

Figure 3 shows the PDF and the CDF of a LMG random variable with µ = 0 and σ = 1, for several values of the index M. As M increases, the PDF profiles become sharper with maxima occurring at smaller values of y. Figure 4 illustrates the PDF of a LMG random variable with M = 10 but different values of σ and µ.
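Equation (28) can be verified by comparing it with direct numerical integration, noting that the k-th LMG moment equals E[exp(kX)] for the underlying MG variable X. The helper names below are illustrative:

```python
import numpy as np
from math import comb, sqrt

def lmg_moment(k, mu, sigma, M):
    """k-th moment of the LMG variable, Eq. (28), with sigma_m^2 = sigma^2 / m."""
    CM = sum(comb(M, m) * (-1) ** (m - 1) / sqrt(m) for m in range(1, M + 1))
    tot = sum(comb(M, m) * (-1) ** (m - 1) / sqrt(m)
              * np.exp(k * (2 * mu + k * sigma ** 2 / m) / 2)
              for m in range(1, M + 1))
    return tot / CM

def lmg_moment_numeric(k, mu, sigma, M):
    """Same moment as E[exp(kX)] over the MG density of Eq. (8), by quadrature."""
    x = np.linspace(mu - 20 * sigma, mu + 20 * sigma, 400001)
    s = np.zeros_like(x)
    for m in range(1, M + 1):
        s += comb(M, m) * (-1) ** (m - 1) * np.exp(-m * (x - mu) ** 2 / (2 * sigma ** 2))
    pdf = s / np.trapz(s, x)
    return np.trapz(np.exp(k * x) * pdf, x)
```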
4. Multivariate Multi-Gaussian distribution
Let X = (X_1, X_2, ..., X_N)^T ∈ R^N be an N-dimensional vector of real random variables, T standing for transpose. Also let the N-dimensional vector of their mean values be µ = (µ_1, µ_2, ..., µ_N)^T and their N × N covariance matrix be Σ_m = (Cov[X_i, X_j], 1 ≤ i, j ≤ N)/m. Then the PDF of the multivariate MG random vector can be defined by a straightforward generalization of the multivariate Gaussian PDF as

P_{\mathbf{X}}^{(MG)}(x_1, x_2, ..., x_N) = \frac{\sum_{m=1}^{M} \binom{M}{m} (-1)^{m-1} \exp\left[-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^T \boldsymbol{\Sigma}_m^{-1} (\mathbf{x} - \boldsymbol{\mu})\right]}{C(M)\sqrt{(2\pi)^N \det[\boldsymbol{\Sigma}_m]}}, (30)

where det stands for the determinant of a matrix and the power -1 for its inverse.
Figure 3.
Log-Multi-Gaussian PDF (A), (C) and CDF (B), (D) with µ = 0, σ = 1. (C) and (D) are the same as (A) and (B) but on a log-linear scale.

Figure 4.
Log-Multi-Gaussian PDF with different values of σ and µ.

In particular, in the bivariate MG case, N = 2, one has

P_{X_1 X_2}^{(MG)}(x_1, x_2) = \frac{\sum_{m=1}^{M} \binom{M}{m} (-1)^{m-1} \exp\left[-\frac{zm}{2(1-\rho^2)}\right]}{C(M)\, 2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}, (31)

with

z = \frac{(x_1-\mu_1)^2}{\sigma_1^2} - \frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2} + \frac{(x_2-\mu_2)^2}{\sigma_2^2}, (32)

where we have used

\boldsymbol{\mu} = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \qquad \boldsymbol{\Sigma}_m = \frac{1}{m}\begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}. (33)

Figure 5 presents the bivariate MG PDF for several values of ρ and M.

Figure 5.
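The bivariate series of Eqs. (31)-(33) is equally simple to evaluate. In the sketch below the density is normalized numerically on a grid, a practical shortcut that sidesteps the m-dependent determinant factors of the analytic normalization; the helper name is illustrative:

```python
import numpy as np
from math import comb

def bivariate_mg(x1, x2, mu1=0.0, mu2=0.0, s1=1.0, s2=1.0, rho=0.0, M=10):
    """Unnormalized bivariate Multi-Gaussian series of Eq. (31), z as in Eq. (32)."""
    z = ((x1 - mu1) ** 2 / s1 ** 2
         - 2 * rho * (x1 - mu1) * (x2 - mu2) / (s1 * s2)
         + (x2 - mu2) ** 2 / s2 ** 2)
    out = 0.0
    for m in range(1, M + 1):
        out = out + comb(M, m) * (-1) ** (m - 1) * np.exp(-z * m / (2 * (1 - rho ** 2)))
    return out

# Grid evaluation with numerical normalization
x = np.linspace(-8.0, 8.0, 801)
X1, X2 = np.meshgrid(x, x, indexing="ij")
P = bivariate_mg(X1, X2, rho=0.5, M=10)
P /= np.trapz(np.trapz(P, x, axis=1), x)
```

For M = 1 and ρ = 0 the normalized result reduces, as expected, to the product of two standard normal densities.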
Bivariate Multi-Gaussian random variable with µ_1 = µ_2 = 0, σ_1 = σ_2 = 1. (A) M = 1, ρ = 0; (B) M = 1, ρ = 0.7; (C) M = 40, ρ = 0; (D) M = 40, ρ = 0.7.
5. Generalization to any M > 0

Let us now return to Eq. (2), assuming that M is any positive number, not necessarily an integer. Then the fractional binomial theorem can be used instead of the usual one, hence Eq. (3) becomes

(u + v)^M = \sum_{m=0}^{\infty} \binom{M}{m} u^{M-m} v^m, \qquad \binom{M}{m} = \frac{(M)_m}{m!}, (34)

where

(M)_m = M(M-1)\cdots(M-m+1) (35)

is the Pochhammer symbol. Let us then set u = 1, v = -\exp\left[-(x-\mu)^2/(2\sigma^2)\right] and express the function f(x) in Eq. (2) via the generalized binomial series:

f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \sum_{m=1}^{\infty} \frac{(M)_m}{m!} (-1)^{m-1} \exp\left[-\frac{(x-\mu)^2}{2\sigma_m^2}\right]. (36)

The MG PDF can be obtained as the normalized version of f(x) in Eq. (36) [see Eq. (6)]:

p_X^{(MG)}(x) = \frac{1}{C(M)\sqrt{2\pi}\,\sigma} \sum_{m=1}^{\infty} \frac{(M)_m}{m!} (-1)^{m-1} \exp\left[-\frac{(x-\mu)^2}{2\sigma_m^2}\right], (37)

with normalization

C(M) = \sum_{m=1}^{\infty} \frac{(M)_m}{m!} \frac{(-1)^{m-1}}{\sqrt{m}}. (38)

The infinite series (37) converges and hence can be suitably truncated in practical applications. All the calculations relating to the CDF, MGF, CF, CGF, moments, cumulants, etc. can be carried out in the same manner as for the integer values of M, but with the binomial coefficients of the form given in Eqs. (34) and (35). For instance, the CDF takes the form

P_X^{(MG)}(x) = \frac{1}{2C(M)} \sum_{m=1}^{\infty} \frac{(M)_m}{m!} \frac{(-1)^{m-1}}{\sqrt{m}} \left[1 + \mathrm{Erf}\left(\frac{x-\mu}{\sqrt{2}\,\sigma_m}\right)\right]. (39)

Figure 6 presents the PDF given by Eq. (37) and the CDF given by Eq. (39) of the Multi-Gaussian random variable for several rational values of M. Likewise, the coefficients ξ_n(M) appearing in the calculations of the moments and the cumulants must use the more general expression for C_n(M):

C_n(M) = \sum_{m=1}^{\infty} \frac{(M)_m}{m!} \frac{(-1)^{m-1}}{m^n \sqrt{m}}, (40)

instead of that given in Eq. (17).

Figure 6.
Multi-Gaussian PDF (A) and CDF (B) for fractional M, with truncation index of 2000; µ = 0, σ = 1.

In particular, the expression for the LMG PDF now takes the form

p_Y^{(LMG)}(y) = \frac{1}{C(M)\,\sigma\sqrt{2\pi}\,|y|} \sum_{m=1}^{\infty} \frac{(M)_m}{m!} (-1)^{m-1} \exp\left[-\frac{(\ln y - \mu)^2}{2\sigma_m^2}\right], (41)

where C(M) is given in Eq. (38). Also, the bivariate MG PDF becomes

P_{X_1 X_2}^{(MG)}(x_1, x_2) = \frac{1}{C(M)\, 2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \sum_{m=1}^{\infty} \frac{(M)_m}{m!} (-1)^{m-1} \exp\left[-\frac{zm}{2(1-\rho^2)}\right], (42)

where C(M) is given in Eq. (38) and z is defined as before in Eq. (32). Figure 7 presents examples of the LMG PDF from Eq. (41) and Fig. 8 shows a typical bivariate MG distribution from Eq. (42), both for rational values of the index M.
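For fractional M the weights (M)_m/m! are best generated by the stable ratio recurrence a_{m+1} = a_m (m - M)/(m + 1), which avoids overflowing factorials at truncation indices of order 2000. The sketch below (illustrative helper names, with the truncation index as a parameter) implements Eqs. (37)-(38):

```python
import numpy as np
from math import sqrt, pi

def frac_weights(M: float, terms: int) -> np.ndarray:
    """Coefficients a_m = (-1)**(m-1) * (M)_m / m! of the generalized binomial
    series, Eqs. (34)-(36), built by the overflow-free ratio recurrence
    a_{m+1} = a_m * (m - M) / (m + 1), starting from a_1 = M."""
    a = [M]
    for m in range(1, terms):
        a.append(a[-1] * (m - M) / (m + 1))
    return np.array(a)

def frac_mg_pdf(x, mu=0.0, sigma=1.0, M=0.5, terms=2000):
    """Truncated fractional Multi-Gaussian PDF of Eq. (37)."""
    a = frac_weights(M, terms)
    m = np.arange(1, terms + 1)
    CM = float(np.sum(a / np.sqrt(m)))          # truncated C(M), Eq. (38)
    xx = np.asarray(x, dtype=float)[:, None]
    s = np.sum(a * np.exp(-m * (xx - mu) ** 2 / (2 * sigma ** 2)), axis=1)
    return s / (CM * sqrt(2 * pi) * sigma)
```

Because the same truncated C(M) is used for normalization, the truncated density integrates to unity, and for M = 1 the recurrence zeroes out every term beyond the first, recovering the single Gaussian.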
6. Summary
We have introduced a new family of continuous, real random variables whose PDFs represent flattened and cusped deviations from a Gaussian random variable, depending on a single shape parameter taking positive values. Gaussian random variables are a particular case of our family when the shape parameter takes on the value one. While the general analytic form of the new family can itself be used, it appears possible to express it as a linear combination of Gaussian functions with alternating signs, the same mean and monotonically decreasing standard deviations. Hence the suggested name of the random variable: Multi-Gaussian. It was shown that for integer values of the shape parameter the series of Gaussian functions has a finite number of terms, while for non-integer positive values it becomes infinite but remains convergent. Due to this feature the calculations of the statistical properties of the Multi-Gaussian random variables essentially reduce to algebraic operations over the well-known properties of a Gaussian random variable.

We have also introduced the Log-Multi-Gaussian random variable and carried out the extension from the univariate Multi-Gaussian random variable to its multivariate counterpart. Other straightforward extensions can be readily made, for instance, relating to complex Multi-Gaussian random variables. The Multi-Gaussian family of random variables is envisioned to be useful in the same applications as Subbotin's family (the exponential-power or super-Gaussian family) but in situations where simple, closed-form analytical results are of importance.

Figure 7.
Bivariate LMG random variable with µ_1 = µ_2 = 0, σ_1 = σ_2 = 1. (A) M = 1, ρ = 0; (B) M = 1, ρ = 0.7; (C) M = 1/…, ρ = 0; (D) M = 1/…, ρ = 0.7.

Figure 8.
LMG PDF for rational M, with truncation index of 2000, for various values of M, µ, and σ.

References

[1] A. de Moivre, Approximatio ad summam terminorum binomii (a + b)^n in seriem expansi (printed for private circulation, 1733; reproduced in Archibald, 1926).
[2] C.F. Gauss, Theoria Motus Corporum Coelestium (Perthes et Besser, Hamburg, 1809).
[3] M.T. Subbotin, On the law of frequency of error, Mat. Sb. 31 (1923), pp. 296–301.
[4] P. Lévy, Calcul des Probabilités (Gauthier-Villars, Paris, France, 1925).
[5] G. Lunetta, Di una generalizzazione dello schema della curva normale, Annali della Facoltà di Economia e Commercio di Palermo 17 (1963), pp. 237–244.
[6] S. Nadarajah, A generalized normal distribution, J. Appl. Stat. 32 (2005), pp. 685–694.
[7] J. Laherrère and D. Sornette, Stretched exponential distributions in nature and economy: fat tails with characteristic scales, Eur. Phys. J. B 2 (1998), pp. 525–539.
[8] R. Mantegna and H. Stanley, An Introduction to Econophysics: Correlations and Complexity in Finance (Cambridge University Press, Cambridge, UK, 2000).
[9] J. McCauley, Dynamics of Markets: Econophysics and Finance (Cambridge University Press, New York, USA, 2007).
[10] F. Liang, C. Liu and N. Wang, A robust sequential Bayesian method for identification of differentially expressed genes, Statistica Sinica 17 (2007), pp. 571–597.
[11] S. Redner, How popular is your paper? An empirical study of the citation distribution, Eur. Phys. J. B 4 (1998), pp. 131–134.
[12] T.K. Pogány and S. Nadarajah, Sur la fonction caractéristique de la distribution normale généralisée, Comptes Rendus Mathématique 348 (2010), pp. 203–206.
[13] T.A. Maturi and A. Elsayigh, The correlation between variate-values and ranks in samples from complete fourth power exponential distribution, J. Math. Res. 1 (2009), pp. 14–18.
[14] Y. Li, Light beams with flat-topped profiles, Opt. Lett. 27 (2002), pp. 1007–1009.
[15] Y. Li, H. Lee and E. Wolf, Effect of edge rounding and sloping of sidewalls on the readout signal of the information pits on optical disks, Opt. Eng. 42 (2003), pp. 2707–2720.
[16] S. Sahin, O. Korotkova and G. Gbur, Scattering of light from particles with semi-soft boundaries, Opt. Lett. 36 (2011), pp. 3957–3959.
[17] O. Korotkova, S. Sahin and E. Shchepakina, Light scattering by three-dimensional objects with semi-hard boundaries, J. Opt. Soc. Am. A 31 (2014), pp. 1782–1787.
[18] S. Sahin and O. Korotkova, Light sources generating far fields with tunable flat profiles, Opt. Lett. 37 (2012), pp. 2970–2972.
[19] O. Korotkova, Random sources for rectangular far fields, Opt. Lett. 39 (2014), pp. 64–67.
[20] O. Korotkova, Design of weak scattering media for controllable light scattering, Opt. Lett. 40 (2015), pp. 284–287.
[21] O. Korotkova, Can a sphere scatter light producing rectangular intensity patterns?, Opt. Lett. 40 (2015), pp. 1709–1712.
[22] J. Li, F. Wang and O. Korotkova, Random sources for cusped beams, Opt. Express 24 (2016), pp. 17779–17791.
[23] D. McAlister, The law of the geometric mean, Proc. Roy. Soc. 29 (1879), pp. 367–376.
[24] S. Vianelli, Sulle curve lognormali di ordine r quali famiglie di distribuzioni di errori di proporzione, Statistica 42 (1982), pp. 155–176.
[25] S. Vianelli, The family of normal and lognormal distributions of order r, Metron 41 (1983), pp. 3–10.