Lecture 24 Math221
Schur's Theorem
It is time to see what can be done if we cannot make a matrix similar to a diagonal matrix, i.e. when the algebraic multiplicity is not equal to the geometric multiplicity for one of the eigenvalues of a given matrix.
We will state Schur's Theorem in the case of real matrices at this time. At the end of this lecture we will give a brief discussion of what happens when we have complex eigenvalues.
Schur's Theorem. Let A be an n × n matrix, where A has only real eigenvalues. Then there is an n × n orthogonal matrix Q such that
Q^T A Q = T,
where T is an n × n upper-triangular matrix.
Notice: Since A and T are similar, the diagonal elements of T must be the eigenvalues of A as well as those of T.
We will first consider a matrix that we have not been able to diagonalize. Next we will look at a matrix that can be made similar to a diagonal matrix but is not symmetric, and see what happens if we try to find a Q. The third example will be a little more complicated, one where we could not make the matrix similar to a diagonal matrix. We finish this lecture by looking at the case where the matrix has complex eigenvalues but cannot be made similar to a diagonal matrix.
Example 1
> A:=<<3,1,1>|<0,2,1>|<0,1,2>>;
> EA:=Eigenvectors(A,output='list');
As you can see, the eigenvalue 3 has algebraic multiplicity 2 but geometric multiplicity only 1. Our procedure for this type of problem goes as follows: add a linearly independent vector to the set of eigenvectors and then put them through the Gram-Schmidt process.
> GramSchmidt([p1,p2,e1],normalized);
This was a waste of my time. Why?
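Maple is not the only way to run this procedure. Below is a minimal NumPy sketch of Example 1, using the eigenvectors found above (p1 for the repeated eigenvalue 3, p2 for the eigenvalue 1) and `np.linalg.qr`, whose Q factor is exactly the result of Gram-Schmidt on the columns. The vector names and the use of QR in place of Maple's GramSchmidt are this sketch's choices, not the lecture's.

```python
import numpy as np

# Example 1 matrix: Maple's <<3,1,1>|<0,2,1>|<0,1,2>> lists columns.
A = np.array([[3.0, 0.0, 0.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

p1 = np.array([0.0, 1.0, 1.0])   # eigenvector for the eigenvalue 3
p2 = np.array([0.0, 1.0, -1.0])  # eigenvector for the eigenvalue 1
e1 = np.array([1.0, 0.0, 0.0])   # extra linearly independent vector

# QR factorization performs Gram-Schmidt on the columns [p1, p2, e1].
Q, _ = np.linalg.qr(np.column_stack([p1, p2, e1]))
T = Q.T @ A @ Q   # should be upper triangular with 3, 1, 3 on the diagonal

print(np.round(T, 10))
```

The signs of Q's columns may differ from Maple's, but T stays upper triangular either way.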
Example 2
We will now look at a non-symmetric matrix that can be made similar to a diagonal matrix and see if we can find an orthogonal matrix so that the resulting matrix is upper triangular.
> B:=<<1,0,-2>|<1,3,1>|<1,3,1>>;
> EB:=Eigenvectors(B,output='list');
> BG:=GramSchmidt([p1,p2,p3],normalized);
Schur's Theorem holds, but this is not as good a result as making the matrix similar to a diagonal matrix, and by the time you go through the whole Gram-Schmidt process to obtain only an upper-triangular matrix, you could have found the inverse of the eigenvector matrix P.
Example 3
The following matrix has an eigenvalue with algebraic multiplicity 3 and geometric multiplicity 1. This situation causes us a little more work, but if we follow through we will obtain the orthogonal matrix that makes it similar to an upper-triangular matrix.
> C:=<<1,0,0,0,1>|<0,2,1,0,0>|<0,1,2,0,0>|<0,0,0,3,2>|<0,1,0,2,3>>;
> EC:=Eigenvectors(C,output='list');
> GC:=GramSchmidt([p1,p2,p3,e1,e2],normalized);
> Qa:=<GC[1]|GC[2]|GC[3]|GC[4]|GC[5]>;
We did not obtain the upper-triangular matrix, but things were not too bad. We have everything we want except for the 2 × 2 matrix in the lower right corner. We will look at this matrix separately and then expand it to a full 5 × 5 matrix.
> T1a:=SubMatrix(T1,4..5,4..5);
> ET1a:=Eigenvectors(T1a,output='list');
As you can see, the two-by-two matrix has the two missing eigenvalues, and it also has a very nice set of eigenvectors. The matrix built from them will make the 5 × 5 matrix upper triangular.
We need to generate a 5 × 5 matrix. We will use the first three columns of the identity matrix as its first three columns, and then make our two vectors 5-dimensional by adding 0's in the first three places. Notice: we should not expect every time that these first columns will be the other standard basis elements.
We did this in two steps, but is there an orthogonal matrix Q that will accomplish this in one step?
We should take a closer look at the product of Qa and Q2.
This should not be a surprise, since Qa and Q2 were both orthogonal matrices, and the product of two orthogonal matrices is orthogonal.
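A quick NumPy check of that fact, using two orthogonal matrices generated from QR factorizations of random matrices (illustrative stand-ins, not the Qa and Q2 above):

```python
import numpy as np

rng = np.random.default_rng(0)
# QR of a random matrix yields an orthogonal Q factor.
Qa, _ = np.linalg.qr(rng.standard_normal((5, 5)))
Q2, _ = np.linalg.qr(rng.standard_normal((5, 5)))

P = Qa @ Q2
# P^T P = Q2^T (Qa^T Qa) Q2 = Q2^T Q2 = I, so P is orthogonal.
print(np.allclose(P.T @ P, np.eye(5)))  # prints True
```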
Example 4
We will take a look at a matrix that has complex eigenvalues and is not similar to a diagonal matrix.
> E:=<<1,0,0,1>|<0,-2,-1,1>|<0,-2,1,1>|<0,-9,-3,4>>;
> EE:=Eigenvectors(E,output='list');
> EG:=GramSchmidt([p1,p2,p3,p4],normalized);
> u1c:=<conjugate(u1[1]),conjugate(u1[2]),conjugate(u1[3]),conjugate(u1[4])>;
> u2c:=<conjugate(u2[1]),conjugate(u2[2]),conjugate(u2[3]),conjugate(u2[4])>;
> u3c:=<conjugate(u3[1]),conjugate(u3[2]),conjugate(u3[3]),conjugate(u3[4])>;
The matrix multiplication uses the conjugate transpose of the matrix U, which I cannot write in Maple text; it should look like U^* A U = T. This shows that there is a problem when we try to use Maple's notation. The matrix U is said to be unitary.
When the entries are complex, I have found that the command simplify becomes very necessary. The output was not readable when I left off the command. Also, in checking that a unitary matrix times its conjugate transpose is the identity, Maple did not multiply two square roots together without simplify.
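A small NumPy sketch of the unitary check, where `.conj().T` plays the role of the conjugate transpose U^*. The matrix U here is just an illustrative example, not the U from Example 4.

```python
import numpy as np

# A 2-by-2 unitary matrix: its columns are orthonormal under the
# complex inner product (conjugate one factor before multiplying).
U = np.array([[1.0,  1.0],
              [1.0j, -1.0j]]) / np.sqrt(2)

# U^* U should be the identity, which is what "unitary" means.
print(np.allclose(U.conj().T @ U, np.eye(2)))  # prints True
```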
Schur's Theorem (general form). Let A be a square n × n matrix. Then there is a unitary matrix U such that
U^* A U = T,
where T is an n × n upper-triangular matrix with the eigenvalues of A on the diagonal.
As you can see, the general form of Schur's Theorem replaces the orthogonal matrix with a more general unitary matrix and removes the restriction that the eigenvalues be real.
Solution to Differential Equations
We can use Schur's Theorem to solve systems of differential equations when the coefficient matrix has repeated eigenvalues but not a complete set of eigenvectors.
Example 5.
Let X(t) = <x1(t), x2(t)>; then X'(t) = A X(t). We need to look for the eigenvalues and eigenvectors of the coefficient matrix.
[Maple output (6.1): the coefficient matrix A]
> Eigenvectors(A,output='list');
[Maple output (6.2): the eigenvalues and eigenvectors of A]
As you can see, there is only one eigenvector for the eigenvalue 3, which has algebraic multiplicity 2. This means that we cannot diagonalize the matrix A. We will do the next best thing and make A similar to an upper-triangular matrix. Here the vector to add to the eigenvector to form an orthogonal set is easy and can be found by inspection: it is [1,1].
[Maple outputs (6.3)-(6.8): Gram-Schmidt applied to the eigenvector and [1,1], the resulting orthogonal matrix Q, and the upper-triangular matrix T = Q^T A Q]
We want to make a change of variables so that we can use the upper-triangular matrix above.
Let Y(t) = Q^T X(t), and then look at the original system again with the change of variables in mind. Since X(t) = Q Y(t),
Y'(t) = Q^T X'(t) = Q^T A X(t) = Q^T A Q Y(t) = T Y(t).
To obtain this formula we just multiplied the equation X'(t) = A X(t) by Q^T, which is the inverse of Q.
We now have a system we can solve by starting at the bottom and working up. We will solve the second equation to find y2(t), and then put its general solution into the first equation and solve for y1(t). If you have forgotten how, or do not know how, to solve differential equations, Maple will help.
> eq1:=diff(y2(t),t) -3*y2(t)=0;
[Maple outputs (6.9)-(6.10): eq1 and its solution y2(t) = k*exp(3*t)]
> eq2:=diff(y1(t),t) -3*y1(t) =2*k*exp(3*t);
[Maple outputs (6.11)-(6.12): eq2 and its solution y1(t) = (2*k*t+k1)*exp(3*t)]
> Y:=matrix([[(2*k*t+k1)*exp(3*t)],[k*exp(3*t)]]);
[Maple output (6.14): X(t) = Q . Y(t), the solution of the original system]
This is the solution we were looking for.
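As a sanity check, the general solution y1(t) = (2kt + k1)e^(3t), y2(t) = k e^(3t) can be verified against the triangular system y1' = 3 y1 + 2 y2, y2' = 3 y2 with a centered finite difference. A sketch with arbitrary sample constants k and k1:

```python
import numpy as np

k, k1 = 1.3, -0.7  # arbitrary sample values for the constants of integration
y1 = lambda t: (2*k*t + k1) * np.exp(3*t)
y2 = lambda t: k * np.exp(3*t)

# Check y1' = 3*y1 + 2*y2 and y2' = 3*y2 numerically at a few points.
h = 1e-6
for t in [0.0, 0.5, 1.0]:
    d1 = (y1(t + h) - y1(t - h)) / (2*h)  # approximate y1'(t)
    d2 = (y2(t + h) - y2(t - h)) / (2*h)  # approximate y2'(t)
    assert abs(d1 - (3*y1(t) + 2*y2(t))) < 1e-4
    assert abs(d2 - 3*y2(t)) < 1e-4
print("solution satisfies the triangular system")
```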
Exercises
Exercises 3.6: Exercise 3.6(a),(c),(d) and Exercise 3.6.2. Exercises are due November 24, 2008; e-mail them to me.