How to construct an upper triangular matrix that satisfies a quadratic polynomial equation with different roots
IVAN GARGATE AND MICHAEL GARGATE
Abstract.
Let R be an associative ring with identity 1. We describe all matrices in T_n(R), the ring of n × n upper triangular matrices over R (n ∈ N), and in T_∞(R), the ring of infinite upper triangular matrices over R, satisfying the quadratic polynomial equation x^2 − rx + s = 0. For this purpose we assume that the above polynomial has two different roots in R. Moreover, in the case that R is finite, we compute the number of all matrices that solve the matrix equation A^2 − rA + sI = 0, where I is the identity matrix.

1. Introduction
Let R be an associative ring with identity 1. Denote by T_n(R) the ring of n × n upper triangular matrices with entries in R and by T_∞(R) the ring of infinite upper triangular matrices over R. Several authors have worked on these spaces: for instance, Słowik [2] shows how to construct an involution matrix over these spaces, Hou [1] proves similar results for idempotent matrices, and Gargate in [4] computes the number of all involutions over the incidence algebras I(X, K), where X is a finite poset and K is a finite field. Recently, Gargate [5] computed the number of coninvolution matrices over two special rings: the Gaussian integers modulo p and the quaternion integers modulo p, with p an odd prime number. Recall that various special matrices satisfy polynomial equations; for instance, idempotent matrices satisfy x^2 − x = 0 and involution matrices satisfy x^2 − 1 = 0. In this paper we study the quadratic equation x^2 − rx + s = 0, with the condition that the polynomial has two different roots in R. We investigate how to construct these special matrices and compute the total number of such matrices when R is a finite ring.

Our main results are the following theorems.

Theorem 1.1.
(Key words and phrases: triangular matrix, infinite triangular matrix.)

Assume that R is an associative ring with identity 1. Let M be either the ring T_n(R) for some n ∈ N or T_∞(R), and denote by I the identity matrix of M. Consider the quadratic polynomial equation x^2 − rx + s = 0 and assume that this equation has two different roots a, b ∈ R such that a − b is not a right zero divisor. Then a matrix A ∈ M satisfies the quadratic equation

    A^2 − rA + sI = 0,     (1)

if and only if A is described by the following statements: (i)
For all 1 ≤ i ≤ n, we have a_{ii} ∈ {a, b}, with a, b the different roots of the quadratic equation x^2 − rx + s = 0, r = a + b and s = ab.

(ii) For all pairs of indices 1 ≤ i < j ≤ n such that a_{ii} = a_{jj}, the entry a_{ij} equals

    a_{ij} = 0,                                                            if j = i + 1,
    a_{ij} = −(a_{ii} − (another root))^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj},   if j > i + 1.     (2)

(iii) For i < j such that a_{ii} ≠ a_{jj}, the entry a_{ij} can be chosen arbitrarily.

Next, using the above theorem, we will prove the following result.
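Before turning to it, the construction of Theorem 1.1 can be sanity-checked numerically. The sketch below is illustrative only: the ring R = Z/7Z, the roots a = 2, b = 3 (so r = 5, s = 6), the size n = 6 and the randomly chosen free entries are assumptions, not data from the paper. It fills the diagonal with roots, the free entries of (iii) at random, and the forced entries via the recurrence (2), then verifies equation (1).

```python
# Illustrative sketch, assuming the commutative ring R = Z/7Z and the
# polynomial x^2 - 5x + 6 = (x - 2)(x - 3), i.e. roots a = 2, b = 3.
import random

q = 7                      # R = Z/7Z
a, b = 2, 3                # two different roots; a - b = -1 is invertible
r, s = (a + b) % q, (a * b) % q
n = 6

random.seed(1)
diag = [random.choice((a, b)) for _ in range(n)]
A = [[0] * n for _ in range(n)]
for i in range(n):
    A[i][i] = diag[i]      # (i): diagonal entries are roots

# Fill the superdiagonals in order of increasing j - i, as in (ii)/(iii).
for d in range(1, n):
    for i in range(n - d):
        j = i + d
        if diag[i] != diag[j]:
            A[i][j] = random.randrange(q)      # (iii): entry is free
        elif d == 1:
            A[i][j] = 0                        # (ii): first superdiagonal
        else:                                  # (ii): forced by recurrence (2)
            other = b if diag[i] == a else a
            inv = pow((diag[i] - other) % q, -1, q)
            A[i][j] = (-inv * sum(A[i][p] * A[p][j] for p in range(i + 1, j))) % q

def quad(M):
    """Entries of M^2 - r*M + s*I, reduced mod q."""
    k = len(M)
    M2 = [[sum(M[i][t] * M[t][j] for t in range(k)) for j in range(k)] for i in range(k)]
    return [[(M2[i][j] - r * M[i][j] + s * (i == j)) % q for j in range(k)] for i in range(k)]

assert all(v == 0 for row in quad(A) for v in row)
print("A satisfies A^2 - rA + sI = 0 over Z/7Z")
```

The check passes for every choice of diagonal and of free entries, which is the "if" direction of Theorem 1.1 in this commutative example.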
Theorem 1.2.
Let R be an associative ring with identity 1 and |R| = q the number of elements in R. Consider the quadratic polynomial equation x^2 − rx + s = 0 and assume that this equation has two different roots a, b ∈ R such that a − b is not a right zero divisor. Then the total number of n × n upper triangular matrices that satisfy the quadratic equation A^2 − rA + sI = 0 is equal to

    Σ_{n1+n2=n, 0 ≤ n_i} C(n; n1, n2) · q^{n1 n2},

where C(n; n1, n2) = n!/(n1! n2!) and n1, n2 are the numbers of times that a, b appear on the diagonal, respectively.

2. Matrix solutions of the equation A^2 − rA + sI = 0

To start our considerations, we notice the following property.

Remark 2.1.
Assume that R is an associative ring with identity 1, and let M = T_∞(R) or M = T_n(R) for some n ∈ N. If A ∈ M is a block matrix of the form

    A = [ B_{11}   B_{12}   B_{13}   ⋯ ]
        [   0      B_{22}   B_{23}   ⋯ ]
        [   0        0      B_{33}   ⋯ ]
        [   ⋮        ⋮        ⋮      ⋱ ]

where the B_{ii} are square matrices, and A satisfies the quadratic equation A^2 − rA + sI = 0, then for all i the matrices B_{ii} satisfy the quadratic equation as well.

Proof.
Since A satisfies the quadratic equation A^2 − rA + sI = 0, we have

    A^2 − rA + sI = [ B_{11}^2 − rB_{11} + sI            *                        *              ⋯ ]
                    [          0             B_{22}^2 − rB_{22} + sI              *              ⋯ ]
                    [          0                        0             B_{33}^2 − rB_{33} + sI    ⋯ ]
                    [          ⋮                        ⋮                        ⋮               ⋱ ] = 0,

and we obtain B_{ii}^2 − rB_{ii} + sI = 0 for all i by comparing the diagonal blocks in the matrix equality above. □

Now, we can prove our first main result.
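Before the proof, here is a minimal numerical illustration of Remark 2.1, assuming the special case r = 1, s = 0 (roots 0 and 1, so solutions of equation (1) are idempotents); the concrete 4 × 4 matrix below is an assumed example, not taken from the paper.

```python
# Illustrative sketch: with r = 1, s = 0, the equation A^2 - rA + sI = 0
# says A is idempotent. The entries of A below are an assumption chosen so
# that A^2 = A; we then check that both diagonal blocks of its 2x2 block
# partition inherit the equation, as Remark 2.1 states.

def quad(M, r, s):
    """Entries of M^2 - r*M + s*I."""
    k = len(M)
    M2 = [[sum(M[i][t] * M[t][j] for t in range(k)) for j in range(k)] for i in range(k)]
    return [[M2[i][j] - r * M[i][j] + s * (i == j) for j in range(k)] for i in range(k)]

A = [[1, 0, 4, 2],      # upper triangular, diagonal entries in {0, 1}
     [0, 1, 3, 5],
     [0, 0, 0, 0],
     [0, 0, 0, 0]]
r, s = 1, 0

assert all(v == 0 for row in quad(A, r, s) for v in row)       # A^2 = A

B11 = [row[:2] for row in A[:2]]   # upper-left diagonal block
B22 = [row[2:] for row in A[2:]]   # lower-right diagonal block
for B in (B11, B22):
    assert all(v == 0 for row in quad(B, r, s) for v in row)   # B^2 = B
print("both diagonal blocks satisfy B^2 - rB + sI = 0")
```

Both 2 × 2 diagonal blocks of the block partition inherit the equation, exactly as the remark asserts.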
Proof of Theorem 1.1.
Let A = Σ_{i,j} a_{ij} E_{ij} ∈ M be a matrix that satisfies equation (1). Since a and b are different roots of the quadratic equation, we have r = a + b and s = ab.

Since A^2 − rA + sI = 0, the coefficients of A must satisfy the equations

    a_{ii}^2 − r a_{ii} + s = 0,
    a_{ii} a_{i,i+1} + a_{i,i+1} a_{i+1,i+1} − r a_{i,i+1} = 0,
    a_{ii} a_{i,i+2} + a_{i,i+1} a_{i+1,i+2} + a_{i,i+2} a_{i+2,i+2} − r a_{i,i+2} = 0,
        ⋮
    Σ_{p=0}^{m} a_{i,i+p} a_{i+p,i+m} − r a_{i,i+m} = 0,
        ⋮                                                     (3)

Since A satisfies equation (1) and a_{ii}^2 − r a_{ii} + s = 0, we get a_{ii} ∈ {a, b}.

We need to prove that (ii) and (iii) of Theorem 1.1 hold. We use induction on j − i.

Assume that j − i = 1. From the family of equations (3) we have

    a_{ii} a_{i,i+1} + a_{i,i+1} a_{i+1,i+1} − r a_{i,i+1} = 0.     (4)

One can see that:

• If a_{ii} = a_{i+1,i+1}, equation (4) gives 2 a_{ii} a_{i,i+1} − r a_{i,i+1} = 0, that is, a_{i,i+1}(2 a_{ii} − r) = 0; hence a_{i,i+1} = 0, since a_{ii} ∈ {a, b} and r = a + b with a ≠ b.

• If a_{ii} ≠ a_{i+1,i+1}, then from equation (4) we obtain a_{i,i+1}(a_{ii} + a_{i+1,i+1} − r) = 0; thus a_{i,i+1} can be chosen arbitrarily, since r = a + b = a_{ii} + a_{i+1,i+1}.

So the first superdiagonal entries of the matrix A fulfill (ii) and (iii).

Now suppose that j − i = m > 1. Consider the (i, i+m) entries of equation (1): we have the (m+1)-st family of equations (3),

    Σ_{p=0}^{m} a_{i,i+p} a_{i+p,i+m} − r a_{i,i+m} = 0,

or

    a_{i,i+m}(a_{ii} + a_{i+m,i+m} − r) + Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m} = 0.     (5)

• If a_{ii} = a_{i+m,i+m}, then (a_{ii} + a_{i+m,i+m} − r) ≠ 0 and we obtain

    a_{i,i+m} = −(a_{ii} + a_{i+m,i+m} − r)^{−1} Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m} = −(a_{ii} − (another root))^{−1} Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m},     (6)

where r = a + b and

    (a_{ii} + a_{i+m,i+m} − r) = (a_{ii} − another root) = a − b,   if a_{ii} = a_{i+m,i+m} = a,
                                                        = b − a,   if a_{ii} = a_{i+m,i+m} = b.

So (ii) of Theorem 1.1 holds.
• If a_{ii} ≠ a_{i+m,i+m}, then we must have a_{ii} + a_{i+m,i+m} − r = 0, since a_{ii}, a_{i+m,i+m} ∈ {a, b} and r = a + b. So from equation (5) we get

    Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m} = 0.     (7)

Now, consider the submatrix A(m, i) of A defined as

    A(m, i) = [ a_{ii}   a_{i,i+1}     ⋯   a_{i,i+m}   ]
              [   0      a_{i+1,i+1}   ⋯   a_{i+1,i+m} ]
              [   ⋮                    ⋱       ⋮       ]
              [   0          ⋯             a_{i+m,i+m} ]

From Remark 2.1 one can see that A satisfies the quadratic equation (1) if and only if A(m, i) satisfies equation (1) for all m and i. We write this matrix as a block matrix

    A(m, i) = [ a_{ii}   α   a_{i,i+m}   ]
              [   0      β   γ           ]
              [   0      0   a_{i+m,i+m} ],     (8)

where α is a row, γ a column and β a square block. Since A satisfies equation (1), by Remark 2.1 the matrices

    A(m−1, i) = [ a_{ii}  α ]     and     A(m−1, i+1) = [ β   γ           ]
                [   0     β ]                           [ 0   a_{i+m,i+m} ]

also satisfy equation (1). So we obtain

    a_{ii} α + α β − r α = 0     and     β γ + γ a_{i+m,i+m} − r γ = 0.

Thus the (1, m+1) entry of A(m, i)^2 − r A(m, i) + sI is

    a_{ii} a_{i,i+m} + α γ + a_{i,i+m} a_{i+m,i+m} − r a_{i,i+m},

since a_{ii}, a_{i+m,i+m} ∈ {a, b} are roots of the equation x^2 − rx + s = 0. As a_{ii} ≠ a_{i+m,i+m}, from equations (5) and (7) we have

    α γ = Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m} = 0.

Hence

    a_{ii} a_{i,i+m} + α γ + a_{i,i+m} a_{i+m,i+m} − r a_{i,i+m} = a_{i,i+m}(a_{ii} + a_{i+m,i+m} − r) + α γ = α γ = 0,

since r = a + b = a_{ii} + a_{i+m,i+m}. Therefore, A(m, i) satisfies equation (1) regardless of the value of the entry a_{i,i+m}. Thus (iii) of Theorem 1.1 holds.

Assume now that the entries of A fulfill (i), (ii) and (iii) of Theorem 1.1. We shall prove that A satisfies the quadratic equation A^2 − rA + sI = 0. Since equation (2) involves only the coefficients with indices p such that i ≤ p ≤ j, it suffices to prove the claim for A(m, i). For m = 1 and m = 2 one can easily check that all submatrices A(1, i) and A(2, i) satisfy the quadratic equation A^2 − rA + sI = 0. Suppose that the claim holds for all 1 ≤ t ≤ m −
1, i.e., A(2, i), A(3, i), . . . , A(m−1, i) satisfy the quadratic equation (1) for all i; we need only prove that A(m, i) also satisfies the quadratic equation (1).

Consider A(m, i) as a block matrix given in the form of equation (8). Then the expression A(m, i)^2 − r A(m, i) + sI equals

    [ a_{ii}^2 − r a_{ii} + s   a_{ii} α + α β − r α   a_{ii} a_{i,i+m} + α γ + a_{i,i+m} a_{i+m,i+m} − r a_{i,i+m} ]
    [          0                β^2 − r β + sI         β γ + γ a_{i+m,i+m} − r γ                                    ]
    [          0                        0              a_{i+m,i+m}^2 − r a_{i+m,i+m} + s                            ]     (9)

By assumption, A(m−1, i) satisfies equation (1) for all i. So both A(m−1, i) and A(m−1, i+1) satisfy equation (1). Thus

    A(m−1, i)^2 − r A(m−1, i) + sI = [ a_{ii}^2 − r a_{ii} + s   a_{ii} α + α β − r α ] = 0,     (10)
                                     [          0                β^2 − r β + sI       ]

and

    A(m−1, i+1)^2 − r A(m−1, i+1) + sI = [ β^2 − r β + sI   β γ + γ a_{i+m,i+m} − r γ         ] = 0.     (11)
                                         [       0          a_{i+m,i+m}^2 − r a_{i+m,i+m} + s ]

From equations (10) and (11) above, we have a_{ii}^2 − r a_{ii} + s = 0 and a_{i+m,i+m}^2 − r a_{i+m,i+m} + s = 0, since a_{ii}, a_{i+m,i+m} ∈ {a, b} are roots of the equation x^2 − rx + s = 0; β^2 − r β + sI = 0 by Remark 2.1; and

    a_{ii} α + α β − r α = 0,     (12)
    β γ + γ a_{i+m,i+m} − r γ = 0.     (13)

Then, multiplying equation (12) by γ and equation (13) by α, we obtain

    a_{ii} α γ + α β γ − r α γ = 0,
    α β γ + a_{i+m,i+m} α γ − r α γ = 0.

Hence

    α β γ = (r − a_{ii}) α γ,
    α β γ = (r − a_{i+m,i+m}) α γ.

• If a_{ii} ≠ a_{i+m,i+m}, we have

    α β γ = (a_{i+m,i+m}) α γ = (a_{ii}) α γ,

since r = a_{ii} + a_{i+m,i+m}, which implies that α γ = 0. So the (1, m+1) entry of equation (9) is

    a_{i,i+m}(a_{ii} + a_{i+m,i+m} − r) + α γ = α γ = 0.

Therefore, A(m, i) satisfies the quadratic equation A^2 − rA + sI = 0.
• On the other hand, if a_{ii} = a_{i+m,i+m}, then from (ii) of Theorem 1.1, or from equations (5) and (6), we have

    a_{i,i+m} = −(a_{ii} + a_{i+m,i+m} − r)^{−1} Σ_{p=1}^{m−1} a_{i,i+p} a_{i+p,i+m} = −(a_{ii} + a_{i+m,i+m} − r)^{−1} α γ,

so in this case the (1, m+1) entry of equation (9) is

    a_{i,i+m}(a_{ii} + a_{i+m,i+m} − r) + α γ = (−(a_{ii} + a_{i+m,i+m} − r)^{−1} α γ)(a_{ii} + a_{i+m,i+m} − r) + α γ = 0.

Therefore, A(m, i) also satisfies the quadratic equation (1).

Thus we have proved that a matrix A in the upper triangular matrix ring M, with a_{ii} ∈ {a, b} where a, b are different roots of the equation x^2 − rx + s = 0, satisfies the quadratic equation (1) if and only if A is described as in (i), (ii) and (iii) of Theorem 1.1. □

The results of Hou [1] and Słowik [2] follow immediately from Theorem 1.1:
Corollary 2.2 (Hou [1]). We can construct any n × n idempotent upper triangular matrix over R that has only zeros and ones on its diagonal as follows:

(i) For all i, the entries in the main diagonal satisfy a_{ii} ∈ {0, 1}.

(ii) For i < j, if a_{ii} = a_{jj}, then a_{ij} equals

    a_{ij} = 0,                                              if j = i + 1,
    a_{ij} = (1 − 2 a_{ii}) Σ_{p=i+1}^{j−1} a_{ip} a_{pj},   if j > i + 1.     (14)

(iii) For i < j, if a_{ii} ≠ a_{jj}, then a_{ij} can be chosen arbitrarily.
Proof.
For a_{ii} ∈ {0, 1} the quadratic equation (1) reduces to A^2 = A, so A is an idempotent matrix. We need to verify that equation (14) yields the same possibilities for a_{ij} as Theorem 1.1. The equation (2) becomes

    a_{ij} = −(a_{ii} − (another root))^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj} = (1 − 2 a_{ii}) Σ_{p=i+1}^{j−1} a_{ip} a_{pj}

for a_{ii} ∈ {0, 1}. □

Corollary 2.3 (Słowik [2]). We can construct any n × n involution upper triangular matrix over R with a_{ii} ∈ {1, −1} on its diagonal as follows:

(i) For all i, the entries in the main diagonal satisfy a_{ii} ∈ {−1, 1}.

(ii) For i < j, if a_{ii} = a_{jj}, then a_{ij} equals

    a_{ij} = 0,                                                if j = i + 1,
    a_{ij} = −(2 a_{ii})^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj},   if j > i + 1.     (15)

(iii) For i < j, if a_{ii} = −a_{jj}, then a_{ij} can be chosen arbitrarily.

Proof. For a_{ii} ∈ {−1, 1} the quadratic equation (1) reduces to A^2 = I, so A is an involution matrix. We need to verify that equation (15) yields the same possibilities for a_{ij} as Theorem 1.1. The equation (2) becomes

    a_{ij} = −(a_{ii} − (another root))^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj} = −(a_{ii} − (−a_{ii}))^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj} = −(2 a_{ii})^{−1} Σ_{p=i+1}^{j−1} a_{ip} a_{pj}

for a_{ii} ∈ {−1, 1}. □

3. The number of all solutions of the quadratic polynomial equation
Theorem 3.1.
Let R be an associative ring with identity 1 and |R| = q the number of elements in R. Then the total number of n × n upper triangular matrices that satisfy the quadratic equation A^2 − rA + sI = 0 with a_{ii} ∈ {a, b} on the diagonal, where a, b are different roots of the quadratic equation x^2 − rx + s = 0, is equal to

    Σ_{n1+n2=n, 0 ≤ n_i} C(n; n1, n2) · q^{n1 n2},

where C(n; n1, n2) = n!/(n1! n2!), n1, n2 are the numbers of times that a, b appear on the diagonal, respectively, r = a + b and s = ab.

Proof of Theorem 1.2. By Theorem 1.1, the number of possible upper triangular matrices that satisfy the quadratic equation A^2 − rA + sI = 0 with the set D = {a, b}, a ≠ b, on the diagonal depends entirely on which pairs of diagonal entries have a_{ii} ≠ a_{jj}. To enumerate those possibilities, consider a column vector d = (d_1, d_2, . . . , d_n), the respective diagonal, having each d_i ∈ D, and denote by n1, n2 the numbers of times that a, b appear on the diagonal, respectively, so that n1 + n2 = n with 0 ≤ n_i for i = 1,
2. By ∆ we denote the number of pairs (d_i, d_j) with i < j and d_i ≠ d_j. Notice that ∆ = n1 · n2. In particular, ∆ is independent of the order in which the elements of the set D appear on d. Consequently, each such diagonal yields q^∆ = q^{n1 n2} possible upper triangular matrices that satisfy the quadratic equation A^2 − rA + sI = 0. Finally, the d_i's can be placed on the main diagonal in

    C(n, n1) · C(n − n1, n2) = C(n; n1, n2)

ways, where C(n; n1, n2) = n!/(n1! n2!). Therefore, the total number of n × n upper triangular matrices that satisfy the quadratic equation A^2 − rA + sI = 0 with elements of the set {a, b}, a ≠ b, on the diagonal is

    Σ_{n1+n2=n, 0 ≤ n_i} C(n; n1, n2) · q^{n1 n2}. □

References

[1] X. Hou. Idempotents in triangular matrix ring. Linear and Multilinear Algebra. In press. doi: 10.1080/03081087.2019.1596223.
[2] R. Słowik. Involutions in triangular groups. Linear and Multilinear Algebra. 2013; 61(7): 909-916.
[3] R. Słowik. How to construct a triangular matrix of a given order. Linear and Multilinear Algebra. 2014; 62(1): 28-38.
[4] I. Gargate, M. Gargate. Involutions on Incidence Algebras of Finite Posets. (2019). arXiv:1907.06805.
[5] I. Gargate, M. Gargate. Coninvolutions on Upper Triangular Matrix Group over the Ring of Gaussian Integers and Quaternions Integers modulo p. (2020). arXiv:2008.00575.

UTFPR, Campus Pato Branco, Rua Via do Conhecimento km 01, 85503-390 Pato Branco, PR, Brazil
E-mail address: [email protected]

UTFPR, Campus Pato Branco, Rua Via do Conhecimento km 01, 85503-390 Pato Branco, PR, Brazil
E-mail address: