A Short Proof of the Symmetric Determinantal Representation of Polynomials
arXiv preprint [math.CV]
Anthony Stefan* and Aaron Welters†
Florida Institute of Technology
Department of Mathematical Sciences
Melbourne, FL, USA
January 12, 2021
Abstract
We provide a short proof of the theorem that every real multivariate polynomial has a symmetric determinantal representation, which was first proved in J. W. Helton, S. A. McCullough, and V. Vinnikov, Noncommutative convexity arises from linear matrix inequalities, J. Funct. Anal. 240 (2006), 105-191. We then provide an example using our approach and extend our results from the real field $\mathbb{R}$ to an arbitrary field $F$ of characteristic different from 2. The new approach we take is based only on elementary results from the theory of determinants, the theory of Schur complements, and basic properties of polynomials.

*Email: astefan2015@my.fit.edu
†Email: awelters@fit.edu

1 Introduction

As was first proven in 2006 by J. W. Helton, S. McCullough, and V. Vinnikov [06HMV], any real polynomial $p(z) \in \mathbb{R}[z]$ in $n$ variables $z = (z_1, \ldots, z_n)$ has a symmetric determinantal representation, i.e., there exists an affine linear matrix pencil $A_0 + \sum_{i=1}^{n} z_i A_i$ with symmetric matrices $A_0, \ldots, A_n \in \mathbb{R}^{m \times m}$ such that
$$p(z) = \det\left(A_0 + \sum_{i=1}^{n} z_i A_i\right). \tag{1}$$
We give a new proof of this theorem, which we will refer to as the HMV Theorem. One of the merits of our proof that makes it short and elementary is that it requires no prior knowledge of multidimensional systems theory (as compared to [06HMV] and [12RQ]) or advanced representation theory for multivariate polynomials (as compared to [11BG]). For instance, the proof of the HMV Theorem [...]. As such, we provide a discussion on this and then extend our results to arbitrary fields of characteristic different from 2.

The rest of the paper will proceed as follows. In Sec. 2 we establish the notation, definitions, and preliminary results used in the paper. In Sec. 3 we prove the HMV Theorem. In Sec. 4 we provide an example using our approach. In Sec. 5 we discuss the extension of the HMV Theorem to arbitrary fields of characteristic different from 2. Finally, in Sec. 6 we provide an Appendix with auxiliary results.
2 Preliminaries

We will denote any matrix $A \in \mathbb{R}^{m \times m}$ that is partitioned in $2 \times 2$ block matrix form by
$$A = [A_{ij}]_{i,j=1,2} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix},$$
where the matrix $A_{ij} \in \mathbb{R}^{m_i \times m_j}$ is called the $(i,j)$-block of $A$.

The direct sum $A \oplus B$ of two square matrices $A \in \mathbb{R}^{k \times k}$ and $B \in \mathbb{R}^{p \times p}$ is defined to be the matrix $A \oplus B \in \mathbb{R}^{(k+p) \times (k+p)}$ with the $2 \times 2$ block matrix form
$$A \oplus B = \begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}.$$
The following result is well known (and is true for matrices with entries in an arbitrary field $F$, not just $\mathbb{R}$; see [02FIS, Sec. 4.3, Exercise 21]).

Lemma 1 (Determinant of a direct sum). If $A$ and $B$ are real symmetric matrices then the direct sum $A \oplus B$ is a real symmetric matrix and
$$\det(A \oplus B) = (\det A)(\det B). \tag{2}$$

The Schur complement of a matrix $A = [A_{ij}]_{i,j=1,2}$ with respect to $A_{22}$ [i.e., with respect to its $(2,2)$-block $A_{22}$] will be denoted by $A/A_{22}$ and defined by
$$A/A_{22} = A_{11} - A_{12}A_{22}^{-1}A_{21},$$
whenever the matrix $A_{22}$ is invertible. The following result is also well known (and is true for matrices with entries in an arbitrary field $F$, not just $\mathbb{R}$; see [05FZ, p. 19, Theorem 1.1]).

Lemma 2 (Schur's determinant formula). If $A = [A_{ij}]_{i,j=1,2} \in \mathbb{R}^{N \times N}$ and $A_{22}$ is invertible then
$$\det A = \det(A_{22})\det(A/A_{22}). \tag{3}$$

The next result is also well known (see, for instance, [20SW, p. 16, Lemma 6]; the statement and proof are valid for any field $F$, including $\mathbb{R}$).

Lemma 3 (Sum of a Schur complement with a matrix). If $B \in \mathbb{R}^{m \times m}$, $A = [A_{ij}]_{i,j=1,2} \in \mathbb{R}^{N \times N}$, and $A_{22}$ is invertible with $A/A_{22} \in \mathbb{R}^{m \times m}$ then
$$C/C_{22} = A/A_{22} + B, \tag{4}$$
where $C = [C_{ij}]_{i,j=1,2} \in \mathbb{R}^{N \times N}$ is partitioned in $2 \times 2$ block matrix form as
$$C = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix} = \begin{bmatrix} A_{11} + B & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \tag{5}$$
and $C_{22} = A_{22}$ is invertible. Moreover, if both matrices $A$ and $B$ are symmetric then the matrix $C$ is symmetric.

The next lemma is proved in the Appendix (in fact, the lemma is valid for any field $F$ of characteristic different from 2, but the proof that we give for $\mathbb{R}$ is much shorter; for more details on the proof for such fields $F$ see Sec. 5).

Lemma 4 (Realization of simple products). The matrix polynomial $uvB$ in any two variables $u$ and $v$, with symmetric matrix $B \in \mathbb{R}^{m \times m}$, has a Bessmertnyĭ realization
$$A(u,v)/A_{22}(u,v) = uvB, \tag{6}$$
with an affine linear matrix pencil
$$A(u,v) = A_0 + uA_1 + vA_2 = \begin{bmatrix} A_{11}(u,v) & A_{12}(u,v) \\ A_{21}(u,v) & A_{22}(u,v) \end{bmatrix}, \tag{7}$$
such that, for some $N \in \mathbb{N}$, the matrices $A_j \in \mathbb{R}^{N \times N}$ are symmetric for each $j = 0, 1, 2$ and $A_{22}(u,v)$ is a real constant invertible matrix.

The next result is elementary (and valid for any field $F$, not just for $\mathbb{R}$). Before we state it, we need the following definition.

Definition 5 (Simple product substitution). For any real polynomial $q = q(z) \in \mathbb{R}[z]$ in $n$ variables $z = (z_1, \ldots, z_n)$, any $k \in \{1, \ldots, n\}$, and any pair of variables $u$ and $v$, the real polynomial $p = q(z)|_{z_k = uv}$ obtained from $q(z)$ by making the substitution $z_k = uv$ is called a simple product substitution on $q$.
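The Schur-complement identities in Lemmas 2 and 3 above are easy to sanity-check symbolically. Below is a minimal sketch using the SymPy library; the concrete block matrices are hypothetical data chosen here for illustration, not taken from the paper.

```python
import sympy as sp

# Concrete symmetric blocks (hypothetical data); A22 must be invertible.
A11 = sp.Matrix([[1, 2], [2, 3]])
A12 = sp.Matrix([[0, 1], [1, 1]])
A22 = sp.Matrix([[2, 0], [0, 5]])
A = sp.Matrix(sp.BlockMatrix([[A11, A12], [A12.T, A22]]))

# The Schur complement A / A22 = A11 - A12 * A22^{-1} * A21.
schur = A11 - A12 * A22.inv() * A12.T
# Lemma 2 (Schur's determinant formula): det A = det(A22) * det(A / A22).
assert sp.simplify(A.det() - A22.det() * schur.det()) == 0

# Lemma 3: adding B to the (1,1)-block adds B to the Schur complement.
B = sp.Matrix([[4, 1], [1, 4]])
C = sp.Matrix(sp.BlockMatrix([[A11 + B, A12], [A12.T, A22]]))
schur_C = (A11 + B) - A12 * A22.inv() * A12.T
assert schur_C == schur + B
```

Any other choice of blocks with $A_{22}$ invertible works the same way.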
Lemma 6 (Polynomial realization). Each real polynomial $p$ can be constructed from an affine linear real polynomial $q$ by applying a finite number of simple product substitutions to $q$.

3 Proof of the HMV Theorem

In this section we will prove the HMV Theorem over the field $\mathbb{R}$ (i.e., Theorem 9). To do this we will need the following two lemmas (the first lemma and its proof are valid over any field $F$, not just $\mathbb{R}$; the second lemma and its proof are valid over any field $F$ of characteristic different from 2).

Lemma 7 (Scalar multiplication). If $c$ is a real number and $p(z) \in \mathbb{R}[z]$ has a symmetric determinantal representation then $cp(z)$ also has a symmetric determinantal representation.

Proof. Suppose $c \in \mathbb{R}$ and $p(z) = \det[A(z)]$, where $A(z) = A_0 + \sum_{i=1}^{n} z_i A_i$ is an affine linear matrix pencil with symmetric matrices $A_0, \ldots, A_n \in \mathbb{R}^{m \times m}$. Then it follows from Lemma 1 that
$$B(z) = A(z) \oplus [c] = (A_0 \oplus [c]) + \sum_{i=1}^{n} z_i (A_i \oplus [0])$$
is an affine linear matrix pencil with symmetric matrices $A_0 \oplus [c], A_i \oplus [0] \in \mathbb{R}^{(m+1) \times (m+1)}$ such that $cp(z) = \det[B(z)]$. This proves that $cp(z)$ has a symmetric determinantal representation.

Lemma 8 (Substitution). If $q = q(z) \in \mathbb{R}[z]$ has a symmetric determinantal representation and $p$ is obtained from $q$ by a simple product substitution then $p$ also has a symmetric determinantal representation.

Proof. Suppose $q = q(z)$ is a real polynomial in $n$ variables $z = (z_1, \ldots, z_n)$ with a symmetric determinantal representation
$$q(z) = \det[A(z)],$$
where $A(z) = A_0 + \sum_{i=1}^{n} z_i A_i$ is an affine linear matrix pencil with symmetric matrices $A_0, \ldots, A_n \in \mathbb{R}^{m \times m}$. Let $p$ be a polynomial obtained from $q$ by a simple product substitution. Then, by definition, there is a $k \in \{1, \ldots, n\}$ and a pair of variables $u$ and $v$ such that $p$ is the real polynomial $p = q(z)|_{z_k = uv}$ in the variables $z, u, v$. Hence,
$$p = q(z)|_{z_k = uv} = \det[A(z)]|_{z_k = uv} = \det[A(z)|_{z_k = uv}].$$
It follows immediately from Lemma 3 and Lemma 4 that
$$A(z)|_{z_k = uv} = uvA_k + A_0 + \sum_{i=1,\, i \neq k}^{n} z_i A_i = B(z,u,v)/B_{22}(z,u,v),$$
where $B(z,u,v)$ is an affine linear matrix pencil of the form
$$B(z,u,v) = B_0 + \sum_{i=1,\, i \neq k}^{n} z_i B_i + uB_{n} + vB_{n+1} = \begin{bmatrix} B_{11}(z,u,v) & B_{12}(z,u,v) \\ B_{21}(z,u,v) & B_{22}(z,u,v) \end{bmatrix}, \tag{8}$$
such that, for some $N \in \mathbb{N}$, the matrices $B_j \in \mathbb{R}^{N \times N}$ are symmetric for each $j = 0, \ldots, n+1$ and $B_{22}(z,u,v)$ is a constant invertible matrix. In particular,
$$B_{22}(z,u,v) \equiv B_{22}(0,0,0) \quad \text{and} \quad \det B_{22}(0,0,0) = d \in \mathbb{R} \setminus \{0\}. \tag{9}$$
Therefore, it follows from this, Lemma 1, and Lemma 2 that
$$p = \det[B(z,u,v)/B_{22}(z,u,v)] = \frac{1}{d}\det[B(z,u,v)] = \det\left\{B(z,u,v) \oplus [d^{-1}]\right\},$$
which is a symmetric determinantal representation for $p$. This completes the proof.

We are now ready to prove the HMV Theorem as an immediate corollary of Lemma 6 and Lemma 8 (the statement and its proof are valid for any field $F$ of characteristic different from 2; for more details on this see Sec. 5).
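The direct-sum bordering trick used in the proofs of Lemmas 7 and 8 can be checked on a toy pencil. The SymPy sketch below (with a hypothetical $2 \times 2$ pencil chosen purely for illustration) verifies that appending the $1 \times 1$ block $[c]$ multiplies the determinant by $c$, which is exactly the content of Lemma 7.

```python
import sympy as sp

z1, z2 = sp.symbols('z1 z2')
# A toy symmetric pencil A(z) = A0 + z1*A1 + z2*A2 (hypothetical data).
A0 = sp.Matrix([[1, 0], [0, 2]])
A1 = sp.Matrix([[0, 1], [1, 0]])
A2 = sp.Matrix([[3, 0], [0, 0]])
Az = A0 + z1 * A1 + z2 * A2
p = Az.det()

c = sp.Rational(7)
# Lemma 7's construction: border the pencil with the 1x1 block [c].
Bz = sp.Matrix(sp.BlockDiagMatrix(Az, sp.Matrix([[c]])))
assert sp.expand(Bz.det() - c * p) == 0
```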
Theorem 9 (Symmetric determinantal representation). Any real polynomial $p(z) \in \mathbb{R}[z]$ in $n$ variables $z = (z_1, \ldots, z_n)$ has a symmetric determinantal representation, i.e., there exists an affine linear matrix pencil $A_0 + \sum_{i=1}^{n} z_i A_i$ with real symmetric matrices $A_0, \ldots, A_n \in \mathbb{R}^{m \times m}$ such that
$$p(z) = \det\left(A_0 + \sum_{i=1}^{n} z_i A_i\right). \tag{10}$$

Proof. Let $p = p(z) \in \mathbb{R}[z]$ be a real polynomial in $n$ variables $z = (z_1, \ldots, z_n)$. If $p$ is an affine linear real polynomial then $p(z) = \det[p(z)]$ is a symmetric determinantal representation. Suppose that $p$ is not an affine linear real polynomial. Then by Lemma 6, there exists an affine linear real polynomial $q$ such that $p = S_l \cdots S_1 q$, where $S_k$ is an operation of simple product substitution for each $k = 1, \ldots, l$, for some $l \in \mathbb{N}$. By Lemma 8, $S_1 q$ has a symmetric determinantal representation. If $l = 1$ then we are done. Thus, assume $l \geq 2$. If $S_k \cdots S_1 q$ has a symmetric determinantal representation for some integer $1 \leq k < l$ then by Lemma 8, $S_{k+1} S_k \cdots S_1 q = S_{k+1}(S_k \cdots S_1 q)$ has a symmetric determinantal representation. Therefore, by induction, $p = S_l \cdots S_1 q$ has a symmetric determinantal representation. This proves the theorem.

4 Example

Here we will use the results in this paper to show how to produce a symmetric determinantal representation for the real polynomial
$$p(z_1, z_2, z_3) = z_1 + z_2 z_3. \tag{11}$$
First, (11) can be obtained from the affine linear real polynomial $q(z_1, w) = z_1 + w$ by making the simple product substitution $w = z_2 z_3$, since $p(z_1, z_2, z_3) = z_1 + z_2 z_3 = q(z_1, w)|_{w = z_2 z_3}$. Next, $q(z_1, w)$ has the symmetric determinantal representation
$$q(z_1, w) = \det[q(z_1, w)] = \det(z_1[1] + w[1])$$
and hence,
$$p(z_1, z_2, z_3) = z_1 + z_2 z_3 = \det[q(z_1, w)]|_{w = z_2 z_3} = \det(z_1[1] + z_2 z_3[1]).$$
Following the proofs of Lemma 4 and Lemma 8 (with $B = [1]$), we have
$$z_1 + z_2 z_3 = \begin{bmatrix} z_1 & \tfrac{1}{2}(z_2+z_3) & \tfrac{1}{2}(z_2-z_3) \\ \tfrac{1}{2}(z_2+z_3) & -1 & 0 \\ \tfrac{1}{2}(z_2-z_3) & 0 & 1 \end{bmatrix} \Bigg/ \begin{bmatrix} -1 & 0 \\ 0 & 1 \end{bmatrix},$$
and hence, by Lemma 2 and Lemma 1,
$$z_1 + z_2 z_3 = \left(\det\begin{bmatrix} -1 & 0 \\ 0 & 1 \end{bmatrix}\right)^{-1}\det\begin{bmatrix} z_1 & \tfrac{1}{2}(z_2+z_3) & \tfrac{1}{2}(z_2-z_3) \\ \tfrac{1}{2}(z_2+z_3) & -1 & 0 \\ \tfrac{1}{2}(z_2-z_3) & 0 & 1 \end{bmatrix}$$
$$= (-1)\det\begin{bmatrix} z_1 & \tfrac{1}{2}(z_2+z_3) & \tfrac{1}{2}(z_2-z_3) \\ \tfrac{1}{2}(z_2+z_3) & -1 & 0 \\ \tfrac{1}{2}(z_2-z_3) & 0 & 1 \end{bmatrix} = \det\begin{bmatrix} z_1 & \tfrac{1}{2}(z_2+z_3) & \tfrac{1}{2}(z_2-z_3) & 0 \\ \tfrac{1}{2}(z_2+z_3) & -1 & 0 & 0 \\ \tfrac{1}{2}(z_2-z_3) & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \end{bmatrix}.$$
Therefore, the real polynomial $p(z_1, z_2, z_3) = z_1 + z_2 z_3$ has the symmetric determinantal representation
$$p(z_1, z_2, z_3) = z_1 + z_2 z_3 = \det(A_0 + z_1 A_1 + z_2 A_2 + z_3 A_3), \tag{12}$$
with the affine linear matrix pencil
$$A_0 + z_1 A_1 + z_2 A_2 + z_3 A_3 = \begin{bmatrix} z_1 & \tfrac{1}{2}(z_2+z_3) & \tfrac{1}{2}(z_2-z_3) & 0 \\ \tfrac{1}{2}(z_2+z_3) & -1 & 0 & 0 \\ \tfrac{1}{2}(z_2-z_3) & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \end{bmatrix} \tag{13}$$
and symmetric matrices $A_0, A_1, A_2, A_3 \in \mathbb{R}^{4 \times 4}$ given by
$$A_0 = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -1 \end{bmatrix}, \quad A_1 = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}, \tag{14}$$
$$A_2 = \begin{bmatrix} 0 & \tfrac{1}{2} & \tfrac{1}{2} & 0 \\ \tfrac{1}{2} & 0 & 0 & 0 \\ \tfrac{1}{2} & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}, \quad A_3 = \begin{bmatrix} 0 & \tfrac{1}{2} & -\tfrac{1}{2} & 0 \\ \tfrac{1}{2} & 0 & 0 & 0 \\ -\tfrac{1}{2} & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}. \tag{15}$$
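The representation (12)-(13) can be verified directly with a computer algebra system; the SymPy sketch below checks that the determinant of the $4 \times 4$ pencil above equals $z_1 + z_2 z_3$.

```python
import sympy as sp

z1, z2, z3 = sp.symbols('z1 z2 z3')
a = sp.Rational(1, 2) * (z2 + z3)
b = sp.Rational(1, 2) * (z2 - z3)
# The 4x4 affine linear matrix pencil from (13).
M = sp.Matrix([
    [z1, a,  b,  0],
    [a, -1,  0,  0],
    [b,  0,  1,  0],
    [0,  0,  0, -1],
])
assert sp.expand(M.det()) == z1 + z2 * z3
```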
5 Extensions

In this section we will discuss how to extend our proof of the HMV Theorem (i.e., Theorem 9) on symmetric determinantal representations of real polynomials to polynomials with coefficients in an arbitrary field $F$ of characteristic different from 2.

We will begin in Subsection 5.1 with a discussion of some previous results in this regard and compare them to ours. Furthermore, we include a brief discussion of how we modify our proof of the HMV Theorem from the field $\mathbb{R}$ to an arbitrary field $F$ of characteristic different from 2. Moreover, we explain why the proof of our theorem would fail for fields of characteristic 2 by pointing out which step in our proof causes the problem, and we illustrate the issue with the example from Section 4. Finally, in Subsection 5.2 we give our short proof of the extension of the HMV Theorem to such fields.

5.1 Comparison with previous results

In [12RQ, Sec. 5.1], R. Quarez extends the HMV Theorem to polynomials with coefficients in an arbitrary ring $R$ of characteristic different from 2 (see [12RQ, Theorem 5.1]). To do this, he has to adjust the steps in his construction to work not only over $\mathbb{R}$, but also over the ring $R$. The main issue he has to resolve is adjusting his proof (mainly of [12RQ, Theorem 4.4]) to avoid issues with inversion and with using the diagonalization theorem for real symmetric matrices; the matrices involved could contain, for instance, square roots and hence not be allowable over a general ring (cf. [12RQ, Theorem 5.1] and its proof).

In comparison with our paper, although we treat only a field $F$ of characteristic different from 2 instead of a ring $R$ (for the sake of simplicity), we have to adjust just one step in our proof of the HMV Theorem. The adjustment we have to make is in the proof of Lemma 4, namely, for the case when the symmetric matrix $B \in F^{m \times m}$ is not invertible. This is due to the issue that it is not always possible to find a scalar $\lambda \in F \setminus \{0\}$ such that $B - \lambda I_m$ is invertible. This issue would not occur if $F$ is an infinite field (e.g., a field of characteristic 0), as $B$ can then have only a finite number of eigenvalues [02FIS, p. 249, Theorem 5.3], but it is an issue when $F$ is a finite field (a field of characteristic $p$ for some prime number $p$). To see this, let $F$ be a finite field. Then it has $p^k$ elements for some prime number $p$ (so that $F$ has characteristic $p$) and some integer $k \geq 1$. Let $0, \lambda_1, \ldots, \lambda_{p^k-1}$ be all the distinct elements of $F$ and consider the diagonal matrix $B = \operatorname{diag}\{0, \lambda_1, \ldots, \lambda_{p^k-1}\} \in F^{p^k \times p^k}$. This is a symmetric matrix in $F^{p^k \times p^k}$ that is not invertible and such that $B - \lambda I_{p^k}$ is not invertible for every scalar $\lambda \in F \setminus \{0\}$.

In [11BG], B. Grenet et al. give a proof of the HMV Theorem using a construction that, from the very beginning, was meant to be valid for any field of characteristic different from 2. As they point out (see [11BG, Sec. 5]), their constructions are not valid for fields of characteristic 2 because they need to use the scalar 1/2. The reason our proof fails for a field $F$ of characteristic 2 is similar to that in [11BG]. More precisely, the only issue in our proof of the HMV Theorem is in the proof of Lemma 4, where, in order to realize the simple product $uvB$ for a symmetric matrix $B \in F^{m \times m}$, we need to use the scalar 1/2. In [13GMT], the authors consider the polynomial $xy + z$ in the three variables $x, y, z$ over the field $\mathbb{F}_2$, the field with two elements, and prove it has no symmetric determinantal representation. More specifically, they prove, by their result [13GMT, Theorem 4.2], that the polynomial $p(x,y,z) = xy + z$ cannot be represented as the determinant of a symmetric matrix with entries in $\mathbb{F}_2 \cup \{x, y, z\}$. Therefore, if you compare this to our example from Section 4: since we need to use the scalar 1/2 there (e.g., in the matrices $A_2$ and $A_3$ in (15)), that representation is likewise not valid over a field of characteristic 2.

5.2 Proof of the extension

In this subsection we will prove an extension of the HMV Theorem from the field $\mathbb{R}$ (i.e., Theorem 9) to an arbitrary field $F$ of characteristic different from 2 (i.e., Theorem 13). We will need the following three elementary lemmas (which are true over any field $F$ of any characteristic).

The following lemma is well known (see, for instance, [72PP, p. 123, Theorem 3], as proved in [66KP, p. 57, Theorem 1.1′]; cf. [66KP, pp. 48-49, Theorem 1.1] and its proof).
Lemma 10 (Rank factorization). If $F$ is a field and $A \in F^{m \times m}$ is a symmetric matrix with rank $r \geq 1$ then there exist an invertible matrix $Y \in F^{m \times m}$ and an invertible symmetric matrix $B \in F^{r \times r}$ such that
$$A = Y^T \begin{bmatrix} B & 0 \\ 0 & 0_{m-r} \end{bmatrix} Y, \tag{16}$$
where the zero matrices bordering $B$ are absent if $r = m$.

The next two lemmas are well known (see, for instance, [20SW, p. 17, Lemma 8] and [20SW, p. 21, Proposition 12] over the field $\mathbb{C}$, although the proofs are the same for any field $F$).

Lemma 11 (Shorted matrices are Schur complements). If $F$ is a field and $B = [B_{ij}]_{i,j=1,2} \in F^{(r+k) \times (r+k)}$ with $B_{22} \in F^{k \times k}$ invertible and $B/B_{22} \in F^{r \times r}$ then the direct sum $B/B_{22} \oplus 0_l \in F^{(r+l) \times (r+l)}$ is a Schur complement
$$C/C_{22} = B/B_{22} \oplus 0_l = \begin{bmatrix} B/B_{22} & 0 \\ 0 & 0_l \end{bmatrix}, \tag{17}$$
where $C \in F^{(r+l+k) \times (r+l+k)}$ is a $2 \times 2$ block matrix with the following block partitioned structure $C = [C_{ij}]_{i,j=1,2}$:
$$C = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix} = \begin{bmatrix} B_{11} & 0 & B_{12} \\ 0 & 0_l & 0 \\ B_{21} & 0 & B_{22} \end{bmatrix}, \tag{18}$$
and $C_{22} = B_{22}$ is invertible. Moreover, if the matrix $B$ is symmetric then the matrix $C$ is symmetric.

Lemma 12 (Matrix multiplication of a Schur complement). If $F$ is a field and $C = [C_{ij}]_{i,j=1,2} \in F^{(m+k) \times (m+k)}$ with $C_{22} \in F^{k \times k}$ invertible and $C/C_{22} \in F^{m \times m}$ then, for any matrices $X \in F^{s \times m}$ and $Y \in F^{m \times s}$,
$$D/D_{22} = X(C/C_{22})Y, \tag{19}$$
where $D \in F^{(s+k) \times (s+k)}$ is the $2 \times 2$ block matrix
$$D = \begin{bmatrix} D_{11} & D_{12} \\ D_{21} & D_{22} \end{bmatrix} = \begin{bmatrix} XC_{11}Y & XC_{12} \\ C_{21}Y & C_{22} \end{bmatrix} = \begin{bmatrix} X & 0 \\ 0 & I_k \end{bmatrix} \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix} \begin{bmatrix} Y & 0 \\ 0 & I_k \end{bmatrix} \tag{20}$$
and $D_{22} = C_{22}$ is invertible. Moreover, if $C$ is symmetric and $X = Y^T$ then $D$ is symmetric.
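Lemma 12 can be sanity-checked on small concrete matrices (hypothetical data chosen for illustration), including the symmetry claim for $X = Y^T$:

```python
import sympy as sp

# A concrete symmetric C with invertible C22 (m = 2, k = 1).
C11 = sp.Matrix([[1, 2], [2, 0]])
C12 = sp.Matrix([[1], [0]])
C22 = sp.Matrix([[3]])
schur_C = C11 - C12 * C22.inv() * C12.T  # C / C22

Y = sp.Matrix([[1, 4], [0, 2]])  # s = m = 2
X = Y.T
# The matrix D from (20), built block by block.
D = sp.Matrix(sp.BlockMatrix([[X * C11 * Y, X * C12], [C12.T * Y, C22]]))
schur_D = D[:2, :2] - D[:2, 2:] * D[2:, 2:].inv() * D[2:, :2]

# Lemma 12: D / D22 = X (C / C22) Y, and D is symmetric since X = Y^T.
assert sp.simplify(schur_D - X * schur_C * Y) == sp.zeros(2, 2)
assert D.T == D
```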
Theorem 13 (Symmetric determinantal representation). Let $F$ be a field of characteristic different from 2. Then any polynomial $p(z) \in F[z]$ in $n$ variables $z = (z_1, \ldots, z_n)$ has a symmetric determinantal representation, i.e., there exists an affine linear matrix pencil $A_0 + \sum_{i=1}^{n} z_i A_i$ with symmetric matrices $A_0, \ldots, A_n \in F^{m \times m}$ such that
$$p(z) = \det\left(A_0 + \sum_{i=1}^{n} z_i A_i\right). \tag{21}$$

Proof. The proof of this theorem is the same as in the case when the field is $\mathbb{R}$ (i.e., the same proof as we gave for Theorem 9, but with the field $\mathbb{R}$ in that proof replaced by the field $F$ of characteristic different from 2), except that we have to make one change to the proof of Lemma 4 in the Appendix, namely, for the case when the symmetric matrix $B \in F^{m \times m}$ is not invertible. If $B = 0$ then the proof is immediate. Thus, assume $B \neq 0$ and let $r \geq 1$ be its rank. In this case, we prove Lemma 4 by first applying Lemma 10 to the matrix $B$ to get the factorization
$$uvB = Y^T \begin{bmatrix} uvB_1 & 0 \\ 0 & 0_{m-r} \end{bmatrix} Y, \tag{22}$$
where $Y \in F^{m \times m}$ and $B_1 \in F^{r \times r}$ are invertible matrices and $B_1$ is symmetric. Now we can appeal to the first part of the proof of Lemma 4 applied to the matrix polynomial $uvB_1$, and then use Lemma 11 followed by Lemma 12 to prove Lemma 4 for the matrix polynomial $uvB$. This proves the theorem.

6 Appendix

The following lemma is well known (see, for instance, [20SW, p. 16, Proposition 7]; the statement and proof are valid for any field $F$, including $\mathbb{R}$).

Lemma 14 (Sum of two Schur complements). If $A \in \mathbb{R}^{m \times m}$ and $B \in \mathbb{R}^{n \times n}$ are $2 \times 2$ block matrices
$$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \quad B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}$$
such that $A_{22} \in \mathbb{R}^{p \times p}$, $B_{22} \in \mathbb{R}^{q \times q}$ are invertible and $A/A_{22}, B/B_{22} \in \mathbb{R}^{k \times k}$ then
$$C/C_{22} = A/A_{22} + B/B_{22}, \tag{23}$$
where $C \in \mathbb{R}^{(k+p+q) \times (k+p+q)}$ is the $2 \times 2$ block matrix with the following block partitioned structure $C = [C_{ij}]_{i,j=1,2}$:
$$C = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix} = \begin{bmatrix} A_{11} + B_{11} & A_{12} & B_{12} \\ A_{21} & A_{22} & 0 \\ B_{21} & 0 & B_{22} \end{bmatrix}, \tag{24}$$
and
$$C_{22} = \begin{bmatrix} A_{22} & 0 \\ 0 & B_{22} \end{bmatrix} \tag{25}$$
is invertible. Moreover, if both matrices $A$ and $B$ are symmetric then the matrix $C$ is also symmetric.
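A quick symbolic check of Lemma 14 in the smallest nontrivial case $k = p = q = 1$, with hypothetical concrete entries:

```python
import sympy as sp

# Concrete symmetric 2x2 matrices with invertible (2,2)-blocks.
A = sp.Matrix([[1, 2], [2, 5]])   # A22 = [5]
B = sp.Matrix([[3, 1], [1, 2]])   # B22 = [2]
schur_A = A[:1, :1] - A[:1, 1:] * A[1:, 1:].inv() * A[1:, :1]
schur_B = B[:1, :1] - B[:1, 1:] * B[1:, 1:].inv() * B[1:, :1]

# The 3x3 matrix C from (24); its (2,2)-block is C22 = diag(A22, B22).
C = sp.Matrix([
    [A[0, 0] + B[0, 0], A[0, 1], B[0, 1]],
    [A[1, 0],           A[1, 1], 0],
    [B[1, 0],           0,       B[1, 1]],
])
schur_C = C[:1, :1] - C[:1, 1:] * C[1:, 1:].inv() * C[1:, :1]
assert schur_C == schur_A + schur_B  # (23)
assert C.T == C                      # symmetry is inherited
```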
Proof of Lemma 4. Let $B \in \mathbb{R}^{m \times m}$ be a symmetric matrix and consider the matrix polynomial $uvB$ in the two variables $u$ and $v$. Consider first the case that $B$ is invertible. If $u = v$ then we have
$$u^2 B = \begin{bmatrix} 0 & uI_m \\ uI_m & -B^{-1} \end{bmatrix} \Big/ \left[-B^{-1}\right], \tag{26}$$
where $I_m$ is the $m \times m$ identity matrix in $\mathbb{R}^{m \times m}$. It follows from this that if $u$ and $v$ are independent variables then
$$\left(\tfrac{1}{2}(u \pm v)\right)^2 B = \begin{bmatrix} 0 & \tfrac{1}{2}(u \pm v)I_m \\ \tfrac{1}{2}(u \pm v)I_m & -B^{-1} \end{bmatrix} \Big/ \left[-B^{-1}\right] \tag{27}$$
and hence by Lemma 14,
$$uvB = \left(\tfrac{1}{2}(u+v)\right)^2 B + \left(\tfrac{1}{2}(u-v)\right)^2 (-B) \tag{28}$$
$$= \begin{bmatrix} 0 & \tfrac{1}{2}(u+v)I_m & \tfrac{1}{2}(u-v)I_m \\ \tfrac{1}{2}(u+v)I_m & -B^{-1} & 0 \\ \tfrac{1}{2}(u-v)I_m & 0 & B^{-1} \end{bmatrix} \Bigg/ \begin{bmatrix} -B^{-1} & 0 \\ 0 & B^{-1} \end{bmatrix}. \tag{29}$$
This proves the lemma in the case that $B$ is invertible. Now suppose $B$ is not invertible. Then there exists a $\lambda \in \mathbb{R} \setminus \{0\}$ which is not an eigenvalue of $B$. Let $B_1 = B - \lambda I_m$ and $B_2 = \lambda I_m$. Since both $B_1$ and $B_2$ are invertible symmetric matrices in $\mathbb{R}^{m \times m}$ and $uvB = uvB_1 + uvB_2$, the proof then follows immediately from Lemma 14.
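The realization (29) can be verified symbolically for a concrete invertible symmetric $B$ (hypothetical data chosen for illustration); the Schur complement of the $3 \times 3$-block matrix indeed reduces to $uvB$:

```python
import sympy as sp

u, v = sp.symbols('u v')
B = sp.Matrix([[2, 1], [1, 3]])  # invertible symmetric, m = 2
I2 = sp.eye(2)
Z = sp.zeros(2, 2)
Binv = B.inv()

a = sp.Rational(1, 2) * (u + v)
b = sp.Rational(1, 2) * (u - v)
# The block matrix from (29); its (2,2)-block is diag(-B^{-1}, B^{-1}).
C = sp.Matrix(sp.BlockMatrix([
    [Z,      a * I2, b * I2],
    [a * I2, -Binv,  Z],
    [b * I2, Z,      Binv],
]))
C11, C12 = C[:2, :2], C[:2, 2:]
C21, C22 = C[2:, :2], C[2:, 2:]
schur = sp.expand(C11 - C12 * C22.inv() * C21)
assert schur == sp.expand(u * v * B)
```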
Proof of Lemma 6. First, Lemma 6 is obviously true for any affine linear real polynomial, i.e., for any real polynomial of degree less than or equal to one. We will now prove it is true for any real polynomial of degree 2. Let $n \in \mathbb{N}$, let $a_0, \ldots, a_n, b_1, \ldots, b_{\frac{n(n+1)}{2}}$ be real scalars, and let $z_1, \ldots, z_n, w_1, \ldots, w_{\frac{n(n+1)}{2}}$ be independent variables. Consider the affine linear real polynomial
$$q(z,w) = a_0 + \sum_{l=1}^{n} a_l z_l + \sum_{k=1}^{\frac{n(n+1)}{2}} b_k w_k.$$
Let $S_k = \,\cdot\,|_{w_k = z_{i_k} z_{j_k}}$ denote the operation of simple product substitution of $w_k = z_{i_k} z_{j_k}$, for all integer pairs $i_k, j_k$ with $1 \leq i_k \leq j_k \leq n$, with the integers $k = 1, \ldots, \frac{n(n+1)}{2}$ ordered by the lexicographical order on the pairs $(i_k, j_k)$. Applying the operations $S_1, \ldots, S_{\frac{n(n+1)}{2}}$ consecutively to $q$ we get the polynomial
$$S_{\frac{n(n+1)}{2}} \cdots S_1 q = a_0 + \sum_{l=1}^{n} a_l z_l + \sum_{k=1}^{\frac{n(n+1)}{2}} b_k z_{i_k} z_{j_k}.$$
This proves the lemma for any real polynomial of degree 2.

Suppose the lemma is true for all real polynomials of degree less than or equal to $d$ for some natural number $d \geq 2$. We will now prove the lemma is true for any real polynomial of degree $d+1$. Let $n \in \mathbb{N}$, let $b_1, \ldots, b_{M_{n,d+1}}$ be real scalars, and let $z_1, \ldots, z_n, w_1, \ldots, w_{M_{n,d+1}}$ be independent variables, where $M_{n,d+1}$ is the number of monomials of degree $d+1$ in $n$ independent variables. To be explicit, it is a well-known result that this number is given by the formula
$$M_{n,d+1} = \binom{n+d}{d+1} = \frac{(n+d)!}{(d+1)!\,(n-1)!}.$$
Let $q_d(z)$ be a real polynomial of degree less than or equal to $d$ in the $n$ independent variables $z = (z_1, \ldots, z_n)$. Consider the real polynomial $r(z,w)$ of degree less than or equal to $d$ in the $n + M_{n,d+1}$ independent variables $z_1, \ldots, z_n, w_1, \ldots, w_{M_{n,d+1}}$ defined by
$$r(z,w) = q_d(z) + \sum_{k=1}^{M_{n,d+1}} b_k z^{\alpha_k} w_k,$$
where the set $\{z^{\alpha_k} : k = 1, \ldots, M_{n,d+1}\}$ consists of monomials in the $n$ variables $z_1, \ldots, z_n$ of degree $d-1$ chosen so that
$$z^{\alpha_k} z_{i_k} z_{j_k}, \quad k = 1, \ldots, M_{n,d+1}, \quad 1 \leq i_k \leq j_k \leq n$$
is a list, with no repeats, of all the monomials in the $n$ variables $z_1, \ldots, z_n$ of degree $d+1$. By the induction hypothesis, there exist an affine linear real polynomial $q$ and a finite number $T_l$, $l = 1, \ldots, M$, of simple product substitution operations such that
$$r(z,w) = T_M \cdots T_1 q.$$
Let $S_k = \,\cdot\,|_{w_k = z_{i_k} z_{j_k}}$ denote the operation of simple product substitution of $w_k = z_{i_k} z_{j_k}$ for $k = 1, \ldots, M_{n,d+1}$. Applying the operations $S_1, \ldots, S_{M_{n,d+1}}$ consecutively to $r(z,w)$ we get the polynomial
$$S_{M_{n,d+1}} \cdots S_1 r = S_{M_{n,d+1}} \cdots S_1 T_M \cdots T_1 q = q_d(z) + \sum_{k=1}^{M_{n,d+1}} b_k z^{\alpha_k} z_{i_k} z_{j_k} = p(z).$$
This proves the lemma for any real polynomial of degree $d+1$. Therefore, by induction, the lemma is true for any real polynomial of any degree. This proves the lemma.

References

[02FIS] Stephen H. Friedberg, Arnold J. Insel, and Lawrence E. Spence. Linear Algebra, 4th ed., Pearson, 2002.

[11BG] B. Grenet, E. Kaltofen, P. Koiran, and N. Portier. Symmetric determinantal representation of formulas and weakly skew circuits, Contemporary Mathematics, Vol. 556, 2011. doi:10.1090/conm/556/11008.

[13GMT] B. Grenet, T. Monteil, and S. Thomassé. Symmetric determinantal representations in characteristic 2, Linear Algebra and its Applications 439 (2013), 1364-1381. doi:10.1016/j.laa.2013.04.022.

[06HMV] J. W. Helton, S. A. McCullough, and V. Vinnikov. Noncommutative convexity arises from linear matrix inequalities, J. Funct. Anal. 240 (2006), 105-191. doi:10.1016/j.jfa.2006.03.018.

[66KP] I. J. Katz and M. H. Pearl. On EPr and normal EPr matrices, J. Res. Nat. Bur. Stand., Series B, 70B (1966), 47-75. doi:10.6028/jres.070B.004.

[72PP] Martin H. Pearl and Alan I. Penn. Normal matrices with entries from an arbitrary field of characteristic [...]