Test 2 Practice Exam Key
1.
> with(LinearAlgebra):
> A:=<<1,-1,2>|<1,0,4>|<1,2,8>>;
a.
> A1:=GaussianElimination(A);
> A2:=ReducedRowEchelonForm(A1);
rowspace = span{r1, r2}: the two nonzero rows of A2 (equivalently, the nonzero rows of A1) span the row space of A.
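The same elimination can be checked outside Maple; a SymPy sketch (not part of the original worksheet), using the matrix A entered above:

```python
from sympy import Matrix

# Rows of A as entered in the worksheet (the <...>|<...> pieces are columns)
A = Matrix([[1, 1, 1],
            [-1, 0, 2],
            [2, 4, 8]])

R, pivots = A.rref()  # reduced row echelon form and pivot columns
print(R)       # → Matrix([[1, 0, -2], [0, 1, 3], [0, 0, 0]])
print(pivots)  # → (0, 1): two pivots, so the row space has dimension 2
```

The two nonzero rows of the reduced form span the row space.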
b.
> AT:=Transpose(A);
> AT1:=GaussianElimination(AT);
> AT2:=ReducedRowEchelonForm(AT1);
colspace = span of the nonzero rows of AT2, written as column vectors.
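As a cross-check (a SymPy sketch, not the worksheet's own code), reducing the transpose recovers a column-space basis:

```python
from sympy import Matrix

A = Matrix([[1, 1, 1],
            [-1, 0, 2],
            [2, 4, 8]])

# Nonzero rows of rref(A^T) span the column space of A
RT, _ = A.T.rref()
print(RT)  # → Matrix([[1, 0, 4], [0, 1, 2], [0, 0, 0]])

# SymPy can also return a column-space basis directly (the pivot columns of A)
basis = A.columnspace()
print(len(basis))  # → 2, so the column space has dimension 2
```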
c.
> NT:=BackwardSubstitute(GaussianElimination(Az));
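The definition of Az was lost in the export (presumably A augmented with the zero vector). In SymPy (a sketch, not the worksheet's code) the null space comes straight from A:

```python
from sympy import Matrix

A = Matrix([[1, 1, 1],
            [-1, 0, 2],
            [2, 4, 8]])

N = A.nullspace()  # basis for the solutions of A x = 0
print(N)  # one basis vector: (2, -3, 1)

# Every basis vector really does satisfy A v = 0
assert all(A * v == Matrix([0, 0, 0]) for v in N)
```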
d.
> GaussianElimination(RN);
The matrix is of rank 3, so the vectors are linearly independent.
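The worksheet's RN was lost in the export; assuming it stacks the two row-space basis vectors from part a with the null-space vector from part c (a guess at the original construction), the rank check looks like:

```python
from sympy import Matrix

# Rows: the two nonzero rows of rref(A), then the null-space vector.
# This RN is a reconstruction, not the worksheet's lost matrix.
RN = Matrix([[1, 0, -2],
             [0, 1, 3],
             [2, -3, 1]])

print(RN.rank())  # → 3, so the three vectors are linearly independent
```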
2.
a.
> GaussianElimination(V);
Three pivots, so they are linearly independent.
b.
> alpha:=DotProduct(v1,v2)/DotProduct(v1,v1);
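The coefficient alpha is the Gram-Schmidt projection coefficient of v2 onto v1. A SymPy sketch with stand-in vectors (the worksheet's v1 and v2 were lost in the export):

```python
from sympy import Matrix

# Hypothetical stand-ins for the worksheet's v1 and v2
v1 = Matrix([1, 2, 2])
v2 = Matrix([3, 1, 0])

alpha = v1.dot(v2) / v1.dot(v1)  # component of v2 along v1
print(alpha)  # → 5/9 for these stand-in vectors (exact, since SymPy is symbolic)
```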
c.
> p3:=v3-DotProduct(v3,p2)/DotProduct(p2,p2)*p2-DotProduct(v3,p1)/DotProduct(p1,p1)*p1;
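The p3 line is the third Gram-Schmidt step: subtract from v3 its projections onto the two earlier orthogonal vectors. A self-contained SymPy sketch with stand-in vectors (the worksheet's were lost in the export):

```python
from sympy import Matrix

# Hypothetical stand-ins for the worksheet's v1, v2, v3
v1 = Matrix([1, 1, 0])
v2 = Matrix([1, 0, 1])
v3 = Matrix([0, 1, 1])

p1 = v1
p2 = v2 - v1.dot(v2) / v1.dot(v1) * p1
p3 = v3 - p2.dot(v3) / p2.dot(p2) * p2 - p1.dot(v3) / p1.dot(p1) * p1

# The resulting vectors are pairwise orthogonal
assert p1.dot(p2) == 0 and p1.dot(p3) == 0 and p2.dot(p3) == 0
```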
d.
> q1:=Normalize(p1,Euclidean);
> q2:=Normalize(p2,Euclidean);
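Maple's Normalize(p, Euclidean) divides a vector by its 2-norm. A SymPy sketch (stand-in vector; the worksheet's p1 was lost in the export):

```python
from sympy import Matrix

p1 = Matrix([1, 1, 0])   # hypothetical stand-in for the worksheet's p1
q1 = p1 / p1.norm()      # same effect as Normalize(p1, Euclidean)
print(q1.norm())  # → 1
```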
e.
> x:=BackwardSubstitute(<R|QTe>);
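With A = QR, the least-squares problem reduces to the triangular system R x = Q^T b, which is what BackwardSubstitute(<R|QTe>) solves. A SymPy sketch with hypothetical data (the worksheet's matrices were lost in the export):

```python
from sympy import Matrix, simplify

# Hypothetical overdetermined system (3 equations, 2 unknowns)
A = Matrix([[1, 0],
            [1, 1],
            [1, 2]])
b = Matrix([1, 2, 4])

Q, R = A.QRdecomposition()
# Solve the triangular system R x = Q^T b, then clean up the radicals
x = (R.solve(Q.T * b)).applyfunc(simplify)
print(x)  # least-squares solution: x = (5/6, 3/2)
```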
3.
> s:=BackwardSubstitute(<ETE|ETb>);
y=1.033333333+2.05x.
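Problem 3's data points and the E matrix were lost in the export; the method is the normal equations E^T E s = E^T b for a best-fit line. A SymPy sketch with toy data (not the worksheet's):

```python
from sympy import Matrix

# Toy data points, NOT the worksheet's lost data
xs = [0, 1, 2, 3]
ys = [1, 3, 5, 8]

E = Matrix([[1, x] for x in xs])  # columns: intercept term, slope term
b = Matrix(ys)

s = (E.T * E).solve(E.T * b)  # least-squares intercept and slope
print(s)  # → intercept 4/5, slope 23/10, i.e. y = 0.8 + 2.3x for this toy data
```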
4.
a.
> <x+y,x-y,x,y> = x*<1,1,1,0>+y*<1,-1,0,1>;
= span{<1,1,1,0>, <1,-1,0,1>} = span{a, b}, and a span is always a subspace, so this is a vector space.
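The identity above is easy to verify symbolically (a SymPy sketch, not part of the worksheet):

```python
from sympy import Matrix, symbols

x, y = symbols('x y')

a = Matrix([1, 1, 1, 0])
b = Matrix([1, -1, 0, 1])

# Every vector of the form (x+y, x-y, x, y) is x*a + y*b
v = x * a + y * b
print(v.T)  # → Matrix([[x + y, x - y, x, y]])
```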
b.
Suppose Y is in U, so YA = 0, and Z is in U, so ZA = 0. Look at (a*Y + b*Z)A = a*YA + b*ZA = a*0 + b*0 = 0, so U is closed under linear combinations, which tells us we have a subspace.
5. Notice that the following is true.
Also, B1 and B3 are linearly independent, so a basis is {B1, B3}. The dimension is 2. Alternatively, you could set a linear combination equal to zero:
We then have a1 + b1 = 0 and b1 + c1 = 0. The other two equations, -b1 - c1 = 0 and -a1 - b1 = 0, are the same as the first two. This tells us that a1 = c1 and b1 = -a1 gives a nontrivial solution, so they are dependent. But the first and the last are definitely independent, so the dimension is two.
6.
a.
> GaussianElimination(CE);
b.
We must have z + y = 0, or z = -y, which tells us the vectors look like <x, y, -y>. Try adding <0,0,1>:
> GaussianElimination(<w1|w2|w3>);
They are linearly independent, so they form a basis for three-space.
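A concrete check (a SymPy sketch; w1 and w2 here are one plausible reading of the lost worksheet vectors, spanning the plane z = -y, with w3 = <0,0,1> as suggested):

```python
from sympy import Matrix

# Hypothetical w1, w2 spanning the plane z = -y; w3 is the suggested third vector
w1 = Matrix([1, 0, 0])
w2 = Matrix([0, 1, -1])
w3 = Matrix([0, 0, 1])

M = Matrix.hstack(w1, w2, w3)
print(M.rank())  # → 3: three pivots, so {w1, w2, w3} is a basis for R^3
```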
7. Suppose 0 = c1*v1 + c2*v2 + ... + ck*vk.
We want to show that ci = 0 for every i.
Multiply both sides by vi^T. This gives
vi^T.0 = c1*vi^T.v1 + ... + ck*vi^T.vk, but every term on the right is zero by orthogonality except the i-th, so we have
ci*vi^T.vi = 0. Since vi^T.vi is the square of the length of vi, which is nonzero, ci = 0 and the vectors are linearly independent.
8. a. T
b. F. It must be linearly independent.
c. T. Your text defines an orthogonal matrix as a square matrix, so the answer must be true.
d. T. This is a theorem in your book.
e. T. If not, the dimension would be bigger than n, which is impossible.