Eigenvalues, Eigenvectors, and Eigenspaces: Part 2: Attempt review | eClass
University of Alberta, Mathematics 125
Started on: Monday, 25 March 2024, 9:11 PM
State: Finished
Completed on: Monday, 25 March 2024, 10:06 PM
Time taken: 55 mins 38 secs
Grade: 66.00 out of 66.00 (100%)
Question 1
Correct
Mark 5.00 out of 5.00
The Relationship between the Algebraic and Geometric Multiplicities of a Matrix
Recall the following definitions for an \(n \times n\) matrix \(A\). When considering the characteristic polynomial \(C_A(\lambda)\) of \(A\), we view \(\lambda\) as a variable.
Definition:
The algebraic multiplicity of an eigenvalue \(\lambda_1\) of \(A\), denoted \(\operatorname{alg}(\lambda_1)\), is the number of times \(\lambda_1\) appears as a root of the characteristic polynomial of \(A\). Equivalently, it is the number of times \(\lambda - \lambda_1\) appears as a factor of \(C_A(\lambda)\).
Definition:
Let \(\lambda_1\) be an eigenvalue of \(A\) (so here \(\lambda_1\) is viewed as a fixed number). The geometric multiplicity of \(\lambda_1\), denoted \(\operatorname{geo}(\lambda_1)\), is the dimension of the eigenspace corresponding to \(\lambda_1\).
If \(\lambda_1\) is an eigenvalue of a square matrix \(A\), then we have the following inequalities involving the algebraic multiplicity \(\operatorname{alg}(\lambda_1)\) and the geometric multiplicity \(\operatorname{geo}(\lambda_1)\):
Theorem: \(1 \leq \operatorname{geo}(\lambda_1) \leq \operatorname{alg}(\lambda_1)\).
When \(\lambda_1\) is an eigenvalue, there must be at least one non-zero eigenvector, hence the eigenspace is non-zero and its dimension is at least one: this explains the first inequality. The explanation of why the second inequality is true is more complicated and will not be provided in this course.
It is possible for \(\operatorname{geo}(\lambda_1)\) to be strictly less than \(\operatorname{alg}(\lambda_1)\), as the next example shows.
Example
Suppose the characteristic polynomial of a \(3 \times 3\) matrix \(A\) factors as \(C_A(\lambda) = -(\lambda - \lambda_1)^2(\lambda - \lambda_2)\) with \(\lambda_1 \neq \lambda_2\), so \(\lambda_1\) is an eigenvalue with algebraic multiplicity 2, and \(\lambda_2\) is an eigenvalue with algebraic multiplicity 1. Without further calculations, we may determine from the inequalities \(1 \leq \operatorname{geo}(\lambda_2) \leq \operatorname{alg}(\lambda_2) = 1\) that the geometric multiplicity of \(\lambda_2\) is 1. For the other eigenvalue, \(\lambda_1\), all that the inequalities tell us is that \(1 \leq \operatorname{geo}(\lambda_1) \leq 2\). However, if after some row operations we find that the matrix \(A - \lambda_1 I\) has rank 2, then the dimension of its null space is \(3 - 2 = 1\), showing that the geometric multiplicity of \(\lambda_1\) is 1, strictly less than its algebraic multiplicity.
By computing reduced row-echelon forms, we can then find a single spanning vector for the eigenspace associated to \(\lambda_1\) and a single spanning vector for the eigenspace associated to \(\lambda_2\).
This example illustrates the following important case of the theorem above.
Important special case:
If the algebraic multiplicity of an eigenvalue \(\lambda_1\) is 1, so \(\operatorname{alg}(\lambda_1) = 1\), then the inequality in the theorem above reduces to \(1 \leq \operatorname{geo}(\lambda_1) \leq 1\). This can only be true when \(\operatorname{geo}(\lambda_1) = 1\).
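As an optional numerical illustration of the gap between the two multiplicities, here is a minimal NumPy sketch; the \(2 \times 2\) matrix used is a standard illustrative choice, not the matrix from the activity.

```python
import numpy as np

# Illustrative matrix (not the one from the activity): its characteristic
# polynomial is (lambda - 2)^2, so alg(2) = 2, but the eigenspace for
# lambda = 2 turns out to be only 1-dimensional.
M = np.array([[2.0, 1.0], [0.0, 2.0]])

print(np.linalg.eigvals(M))                            # [2. 2.] -> algebraic multiplicity 2
nullity = 2 - np.linalg.matrix_rank(M - 2 * np.eye(2))
print(nullity)                                         # 1 -> geometric multiplicity 1
```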
Correct
Marks for this submission: 5.00/5.00.
Question 2
Correct
Mark 6.00 out of 6.00
The Eigenvalue \(\lambda = 0\)
Let \(A\) be an \(n \times n\) matrix.
Although it is not possible for the zero vector to be an eigenvector of \(A\), it is possible for \(\lambda = 0\) to be an eigenvalue of \(A\).
Whether or not \(0\) is an eigenvalue of \(A\) gives us an important piece of information about \(A\):
Theorem:
An \(n \times n\) matrix \(A\) is invertible if and only if \(0\) is not an eigenvalue of \(A\).
Proof:
\(A\) is invertible
\(\iff \det(A) \neq 0\) (By the Invertible Matrix Theorem)
\(\iff C_A(0) \neq 0\) (since \(C_A(0) = \det(A - 0I) = \det(A)\))
\(\iff 0\) is not a root of the characteristic polynomial of \(A\)
\(\iff 0\) is not an eigenvalue of \(A\).
Note:
If \(\lambda = 0\) is an eigenvalue of \(A\), then the eigenspace of \(A\) corresponding to \(\lambda = 0\) is the null space of \(A\).
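As an optional check of the theorem and the note above, here is a small NumPy sketch with a hypothetical singular matrix (not one from the activity): \(0\) appears among its eigenvalues, its determinant is \(0\), and the eigenvector for \(\lambda = 0\) lies in the null space.

```python
import numpy as np

# Hypothetical singular matrix (not from the activity).
A = np.array([[1.0, 2.0], [2.0, 4.0]])

print(np.linalg.eigvals(A))          # approximately [0, 5], so 0 is an eigenvalue
print(np.linalg.det(A))              # approximately 0, so A is not invertible
vals, vecs = np.linalg.eig(A)
print(vecs[:, np.isclose(vals, 0)])  # eigenvector for lambda = 0: it spans the null space of A
```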
Correct
Marks for this submission: 6.00/6.00.
Question 3
Correct
Mark 10.00 out of 10.00
Example:
Let \(A = {\left[\begin{array}{ccc} 3 & 0 & 0 \\ 5 & 8 & 0 \\ 6 & 7 & -12 \end{array}\right]}\).
The eigenvalues of \(A\) are 3, 8, and -12.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Since \(\lambda = 0\) is not an eigenvalue of \(A\), the matrix \(A\) is invertible.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Let's verify this conclusion by computing the determinant of \(A\).
Since \(\det(A) = -288 \neq 0\), the matrix \(A\) is invertible.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Example:
Let \(B = {\left[\begin{array}{ccc} -1 & 0 & 1 \\ 2 & 6 & -14 \\ 1 & 0 & -1 \end{array}\right]}\).
The eigenvalues of \(B\) are -2, 6, and 0.
Correct answer, well done.
Marks for this submission: 2.00/2.00.
Since \(\lambda = 0\) is an eigenvalue of \(B\), the matrix \(B\) is not invertible.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
A correct answer is \( 3 \), which can be typed in as follows: 3
A correct answer is \( 8 \), which can be typed in as follows: 8
A correct answer is \( -12 \), which can be typed in as follows: -12
A correct answer is: "is not"
A correct answer is: "is"
A correct answer is \( -288 \), which can be typed in as follows: -288
A correct answer is: "≠ 0"
A correct answer is: "is"
A correct answer is \( -2 \), which can be typed in as follows: -2
A correct answer is \( 0 \), which can be typed in as follows: 0
A correct answer is \( 6 \), which can be typed in as follows: 6
A correct answer is: "is"
A correct answer is: "is not"
Question 4
Correct
Mark 2.00 out of 2.00
The Invertible Matrix Theorem revisited
Based upon the material covered in Block 5, we may expand the Invertible Matrix Theorem to the following:
The Invertible Matrix Theorem
The following statements are equivalent for an \(n \times n\) matrix \(A\). This means that if one is true, then all the others are true; if one is
false, then all the others are false.
\(\bullet\) \(A\) is invertible.
\(\bullet\) The equation \(A\mathbf{x} = \mathbf{0}\) has only the trivial solution.
\(\bullet\) The equation \(A\mathbf{x} = \mathbf{b}\) has a solution for every \(\mathbf{b} \in \mathbb{R}^n\).
\(\bullet\) The equation \(A\mathbf{x} = \mathbf{b}\) has a unique solution for every \(\mathbf{b} \in \mathbb{R}^n\).
\(\bullet\) The reduced row-echelon form of \(A\) is \(I_n\).
\(\bullet\) A row echelon form of \( A\) has \( n \) non-zero rows.
\(\bullet\) The nullity of \(A\) is \(0\).
\(\bullet\) The rank of \(A\) is \(n\).
\(\bullet\) The columns of \(A\) are linearly independent.
\(\bullet\) The columns of \(A\) span \(\mathbb{R}^n\).
\(\bullet\) The columns of \(A\) form a basis for \(\mathbb{R}^n\).
\(\bullet\) The rows of \(A\) are linearly independent.
\(\bullet\) The rows of \(A\) span \(\mathbb{R}^n\).
\(\bullet\) The rows of \(A\) form a basis for \(\mathbb{R}^n\).
\(\bullet\) \(\text{det}(A) \neq 0\).
\(\bullet\) \(0\) is not an eigenvalue of \(A\).
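As an optional spot-check of a few of these equivalent conditions, here is a minimal NumPy sketch; the matrix is a hypothetical invertible matrix chosen for illustration, not one from the activity.

```python
import numpy as np

# Hypothetical invertible 3x3 matrix, chosen for illustration.
A = np.array([[2.0, 1.0, 0.0], [0.0, 1.0, 3.0], [1.0, 0.0, 1.0]])
n = A.shape[0]

print(np.linalg.matrix_rank(A) == n)                    # True: the rank of A is n
print(n - np.linalg.matrix_rank(A) == 0)                # True: the nullity of A is 0
print(not np.isclose(np.linalg.det(A), 0))              # True: det(A) != 0
print(not np.any(np.isclose(np.linalg.eigvals(A), 0)))  # True: 0 is not an eigenvalue of A
```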
Correct
Marks for this submission: 2.00/2.00.
Question 5
Correct
Mark 6.00 out of 6.00
Linear Independence of Eigenvectors with Different Eigenvalues
We will see later that having a basis of \(\mathbb{R}^n\) consisting of eigenvectors of a matrix \( A\) can greatly simplify working with \( A\). Recall that a basis of \(\mathbb{R}^n\) is a set of vectors that span \(\mathbb{R}^n\) and are linearly independent. It is thus useful to know a condition that ensures the linear independence of a set of eigenvectors.
Theorem Let \(A\) be a square matrix, and suppose \(\lambda_1,\ldots,\lambda_k\) are distinct eigenvalues of \(A\). For each \(i \in \{1,\ldots,k\}\), let \(\mathbf{v}_i\) be an eigenvector of \(A\) with eigenvalue \(\lambda_i\). Then the vectors \(\mathbf{v}_1,\ldots,\mathbf{v}_k\) are linearly independent.
The general proof of this theorem falls outside the scope of this course. Let us prove it at least in the special case of two distinct eigenvalues \(
\lambda_1, \lambda_2 \).
Suppose \(c_1\mathbf{v}_1 + c_2\mathbf{v}_2 = \mathbf{0} \;\; (1)\).
Applying \( A\) to both sides of this equation gives \( c_1A \mathbf{v}_1 + c_2A\mathbf{v}_2 = \mathbf{0} ,\) hence
\( \quad c_1 \lambda_1\mathbf{v}_1 + c_2 \lambda_2 \mathbf{v}_2 = \mathbf{0} \;\; (2) ,\)
since \(A\mathbf{v}_i = \lambda_i\mathbf{v}_i\) for \(i = 1,2\). Furthermore, multiplying (1) by \(\lambda_1\) yields
\(\quad c_1 \lambda_1\mathbf{v}_1 + c_2 \lambda_1\mathbf{v}_2 = \mathbf{0} \;\; (3).\)
Subtracting equation (3) from (2) gives
\(\quad c_2(\lambda_2 - \lambda_1)\mathbf{v}_2 = \mathbf{0} .\)
Hence, because \(\mathbf{v}_2\) is non-zero, it can be concluded that \(c_2(\lambda_2 - \lambda_1) = 0\), and therefore \(c_2 = 0\) because we have assumed that \(\lambda_1 \not= \lambda_2\). We are now left with the equation \(c_1\mathbf{v}_1 = \mathbf{0}\), from which we conclude that \(c_1 = 0\) as well, because \(\mathbf{v}_1\) is non-zero. Thus, \(\mathbf{v}_1,\mathbf{v}_2\) are linearly independent because we started with \(c_1\mathbf{v}_1 + c_2\mathbf{v}_2 = \mathbf{0}\) and concluded that \( c_1=0\) and \( c_2=0\).
The general case of \(k\) distinct eigenvalues can be proven in a similar way.
Remark:
In fact, we may even prove the following: If \(\lambda_1,\ldots,\lambda_k\) are distinct eigenvalues of \(A\), and if, for each \(i\), \(S_i\) is a linearly independent set of eigenvectors with eigenvalue \(\lambda_i\), then the union \(S_1 \cup \cdots \cup S_k\) is also a linearly independent set.
Example: Let \(A = \begin{bmatrix}1&2&0\\0&2&0\\-1&0&0\end{bmatrix}\).
Recall from Question 10 of the learning activity "Eigenvalues, Eigenvectors, and Eigenspaces: Part 1" that the eigenvalues of \(A\) are \(\lambda
= 0\), \(\lambda = 1\), and \(\lambda = 2\), and the corresponding eigenspaces are \[ E_0 = \operatorname{Span}\left( \mathbf{u} =
\begin{bmatrix}0\\0\\1\end{bmatrix} \right), \; \; E_1 = \operatorname{Span}\left( \mathbf{v} = \begin{bmatrix}-1\\0\\1\end{bmatrix} \right), \]
\[ \text{and} \; \; E_2 = \operatorname{Span}\left( \mathbf{w} = \begin{bmatrix}-2\\-1\\1\end{bmatrix} \right) \]
Therefore, since \(\mathbf{u} \in E_0\), \(\mathbf{v} \in E_1\), and \(\mathbf{w} \in E_2\) are eigenvectors of \(A\) corresponding to different
eigenvalues, the set
\[ \left\{ \begin{bmatrix}0\\0\\1\end{bmatrix},
\begin{bmatrix}-1\\0\\1\end{bmatrix}, \begin{bmatrix}-2\\-1\\1\end{bmatrix}\right\}\]
is linearly independent.
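As an optional numerical check (not part of the activity), stacking the three eigenvectors above as columns gives a rank-3 matrix, which confirms their linear independence:

```python
import numpy as np

u = np.array([0, 0, 1], dtype=float)    # eigenvector for eigenvalue 0
v = np.array([-1, 0, 1], dtype=float)   # eigenvector for eigenvalue 1
w = np.array([-2, -1, 1], dtype=float)  # eigenvector for eigenvalue 2

P = np.column_stack([u, v, w])
print(np.linalg.matrix_rank(P))  # 3 -> the three vectors are linearly independent
```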
Correct
Marks for this submission: 6.00/6.00.
Question 6
Correct
Mark 7.00 out of 7.00
Matrices with no Repeated Eigenvalues
Suppose \(A\) is an \(n \times n\) matrix. Since its characteristic polynomial \( C_A(\lambda) \) has degree \( n\), a basic fact from algebra tells
us that \( C_A(\lambda) \) has at most \(n \) roots. At one extreme, it may not have any root in the set of real numbers \( \mathbb{R}\). This is
the case, for instance, of the matrix \( \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} \) whose characteristic polynomial is \( \lambda^2 +1 \). It
can also have just one root of multiplicity \( n\): for instance, if \( A \) is a triangular matrix with all its diagonal entries equal to \( \lambda_1 \),
then \( C_A(\lambda) = (-1)^n (\lambda-\lambda_1)^n.\)
Another extreme possibility is when \( A\) has \(n\) distinct eigenvalues \(\lambda_1,\ldots,\lambda_n\). Because \(A\) has size \(n \times n\), the characteristic polynomial \(C_A(\lambda)\) has degree \(n\). On the other hand,
\(\quad C_A(\lambda) = (-1)^n(\lambda - \lambda_1)^{m_1}(\lambda - \lambda_2)^{m_2} \cdots (\lambda - \lambda_n)^{m_n} ,\)
where \(m_i \geq 1\) is the algebraic multiplicity of \(\lambda_i\) (that is, \(m_i = \operatorname{alg}(\lambda_i)\)). Therefore, since \(C_A(\lambda)\) has degree \(n\), and the polynomial \( (-1)^n(\lambda - \lambda_1)(\lambda - \lambda_2)\cdots (\lambda - \lambda_n) \) has the same degree, it follows that \( m_i = 1\) for all \(i \in \{1,\ldots,n\}\). Hence, by the inequalities \(1 \leq \operatorname{geo}(\lambda_i) \leq m_i\), we have \(\operatorname{geo}(\lambda_i)= 1\). This means that every eigenspace of \(A\) has dimension 1.
Suppose that \(\mathbf{v}_i\) is an eigenvector of \(A\) with eigenvalue \(\lambda_i\), for \(i \in \{1,\ldots,n\}\). Because these eigenvectors all
correspond to different eigenvalues, they are linearly independent by Question 5. Hence, they form \(n\) linearly independent vectors in \
(\mathbb{R}^n\), so in fact they form a basis for \(\mathbb{R}^n\). This proves the following theorem.
Theorem If \(A\) is an \(n \times n\) matrix with \(n\) distinct eigenvalues, then
\(\bullet\) \(A\) has \(n\) eigenspaces, each of dimension \(1\),
\(\bullet\) \(\mathbb{R}^n\) has a basis consisting of eigenvectors, one basis vector from each eigenspace.
The \( 3\times 3 \) matrix in Question 10 in Part 1 had 3 distinct eigenvalues. Here is another example of such a matrix.
Example
The \(3 \times 3\) matrix
\(\quad A = \begin{bmatrix}6 & 13 & -7 \\ -1 & -2 & 1 \\ 1 & 1 & -2\end{bmatrix} \)
has characteristic polynomial \( C_A(\lambda) = -(\lambda + 1)(\lambda - 1)(\lambda - 2)\) and therefore has 3 distinct eigenvalues, namely, -1,1,2 (enter your answer as a list of numbers in increasing numerical order, separated by commas but no spaces).
By row reducing \(A - \lambda_i I\), where \(\lambda_1,\lambda_2,\lambda_3\) are the eigenvalues of \(A\) in increasing numerical order, we
find that eigenvectors for the three eigenvalues are, respectively,
\(\quad \mathbf{v}_1 = \begin{bmatrix}1 \\ 0 \\ 1\end{bmatrix} ,\quad \mathbf{v}_2 = \begin{bmatrix}4 \\ -1 \\ 1\end{bmatrix} ,\quad
\mathbf{v}_3 = \begin{bmatrix}5 \\ -1 \\ 1\end{bmatrix} .\)
These three vectors therefore form a basis of \(\mathbb{R}^3\) consisting of eigenvectors of \(A\).
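Here is an optional NumPy check (not part of the activity) that each listed vector is indeed an eigenvector of \(A\) for the corresponding eigenvalue:

```python
import numpy as np

A = np.array([[6, 13, -7], [-1, -2, 1], [1, 1, -2]], dtype=float)

# Check A v = lambda v for each (eigenvalue, eigenvector) pair given above.
for lam, v in [(-1, [1, 0, 1]), (1, [4, -1, 1]), (2, [5, -1, 1])]:
    v = np.array(v, dtype=float)
    print(np.allclose(A @ v, lam * v))  # True for each pair
```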
Correct
Marks for this submission: 7.00/7.00.
Question 7
Correct
Mark 1.00 out of 1.00
Powers of a Matrix acting on an Eigenvector
Theorem:
Let \(A\) be a square matrix and let \(\mathbf{x}\) be an eigenvector of \(A\) with corresponding eigenvalue \(\lambda\).
(a) For any non-negative integer \(k\), \(\mathbf{x}\) is an eigenvector of \(A^k\) with corresponding eigenvalue \(\lambda^k\). Therefore,\
[A^k\mathbf{x}=\lambda^k\mathbf{x}\]
(b) If \(A\) is invertible, then \(\lambda \neq 0\) and \(\mathbf{x}\) is an eigenvector of \(A^{-1}\) with corresponding eigenvalue \(\dfrac{1}{\lambda}\). Therefore,\[A^{-1}\mathbf{x}=\dfrac{1}{\lambda}\mathbf{x}\]
(c) More generally, if \(A\) is invertible, then for any positive integer \(k\), \(\mathbf{x}\) is an eigenvector of \(A^{-k}\) with corresponding eigenvalue \(\dfrac{1}{\lambda^k}\). Therefore,\[A^{-k}\mathbf{x}=\dfrac{1}{\lambda^k}\mathbf{x}\]
Let's prove part (a) for \(k = 2\):
\[\begin{aligned}A^2\mathbf{x} &= A(A\mathbf{x})\\ &= A(\lambda\mathbf{x})\\ &= \lambda(A\mathbf{x})\\ &= \lambda(\lambda\mathbf{x})\\ &= \lambda^2\mathbf{x}\end{aligned}\]Therefore, \(A^2\mathbf{x} = \lambda^2\mathbf{x}\).
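As an optional illustration of parts (a)-(c), here is a small NumPy sketch using a hypothetical matrix (not from the activity) for which \([1, 1]\) is an eigenvector with eigenvalue \(4\):

```python
import numpy as np

# Hypothetical example: x = [1, 1] is an eigenvector of A with eigenvalue 4.
A = np.array([[3.0, 1.0], [1.0, 3.0]])
x = np.array([1.0, 1.0])

print(np.allclose(np.linalg.matrix_power(A, 2) @ x, 4**2 * x))    # A^2 x = 16 x
print(np.allclose(np.linalg.inv(A) @ x, (1 / 4) * x))             # A^-1 x = (1/4) x
print(np.allclose(np.linalg.matrix_power(A, -3) @ x, 4**-3 * x))  # A^-3 x = (1/64) x
```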
Correct
Marks for this submission: 1.00/1.00.
Question 8
Correct
Mark 7.00 out of 7.00
Example:
Let \(A = {\left[\begin{array}{ccc} -1 & -1 & 1 \\ 0 & -2 & 0 \\ 2 & -2 & 0 \end{array}\right]}\).
Recall from Question 9 of the learning activity "Eigenvalues, Eigenvectors, and Eigenspaces: Part 1" that \(\lambda = -2\) is an eigenvalue of \
(A\) and the corresponding eigenspace is \[E_{-2} = \operatorname{Span}\left( {\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]}\; ,\;
{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\right)\]
Note:
The eigenvalues of \(A\) are \(\lambda = 1\) and \(\lambda = -2\). Since \(\lambda = 0\) is not an eigenvalue of \(A\), we know that \(A\) is invertible.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
(a) Compute \(A^7{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]}\).
\(A^7{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} = (-2)^7{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} = {\left[\begin{array}{c} -128 \\ -128 \\ 0 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
(b) Compute \(A^{-4}{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\).
\(A^{-4} {\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]} = \dfrac{1}{(-2)^{4}} {\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]} = {\left[\begin{array}{c} -1/16 \\ 0 \\ 1/16 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
(c) Compute \(A^3{\left[\begin{array}{c} -3 \\ 2 \\ 5 \end{array}\right]}\).
Since \({\left[\begin{array}{c} -3 \\ 2 \\ 5 \end{array}\right]} = 2{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} + 5{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]} \in E_{-2}\), the vector \({\left[\begin{array}{c} -3 \\ 2 \\ 5 \end{array}\right]}\) is an eigenvector of \(A\) with eigenvalue -2.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Thus, \(A^3{\left[\begin{array}{c} -3 \\ 2 \\ 5 \end{array}\right]} = {\left[\begin{array}{c} 24 \\ -16 \\ -40 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
A correct answer is \( 0 \), which can be typed in as follows: 0
A correct answer is \( \left[\begin{array}{c} -128 \\ -128 \\ 0 \end{array}\right] \).
A correct answer is \( \left[\begin{array}{c} -\frac{1}{16} \\ 0 \\ \frac{1}{16} \end{array}\right] \).
A correct answer is \( 2 \), which can be typed in as follows: 2
A correct answer is \( 5 \), which can be typed in as follows: 5
A correct answer is \( -2 \), which can be typed in as follows: -2
A correct answer is \( \left[\begin{array}{c} 24 \\ -16 \\ -40 \end{array}\right] \).
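These three computations can be spot-checked numerically (optional, not part of the quiz):

```python
import numpy as np

A = np.array([[-1, -1, 1], [0, -2, 0], [2, -2, 0]], dtype=float)

print(np.linalg.matrix_power(A, 7) @ np.array([1, 1, 0]))    # [-128, -128, 0]
print(np.linalg.matrix_power(A, -4) @ np.array([-1, 0, 1]))  # [-1/16, 0, 1/16]
print(np.linalg.matrix_power(A, 3) @ np.array([-3, 2, 5]))   # [24, -16, -40]
```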
Question 9
Correct
Mark 10.00 out of 10.00
Exercise:
Let \(A\) be an invertible \(n \times n\) matrix. Suppose that \(\mathbf{u} \in \mathbb{R}^n\) is an eigenvector of \(A\) with
corresponding eigenvalue \(\lambda = 2\) and \(\mathbf{v} \in \mathbb{R}^n\) is an eigenvector of \(A\) with corresponding eigenvalue \(\lambda = -7\). Fill in the following blanks. (Enter \( \frac{a}{b}\) as a/b, where applicable. Work out any powers before inputting your answer.)
1. \(A\mathbf{u} = 2\mathbf{u}\)
2. \(A\mathbf{v} = -7\mathbf{v}\)
3. \(A^5\mathbf{u} = 32\mathbf{u}\)
4. \(A^3\mathbf{v} = -343\mathbf{v}\)
5. \(A^{-6}\mathbf{u} = \frac{1}{64}\mathbf{u}\)
6. \(A^{-2}\mathbf{v} = \frac{1}{49}\mathbf{v}\)
Example:
Suppose that \(A\) is an invertible \(n \times n\) matrix and that \(\mathbf{x} \in \mathbb{R}^n\) is an eigenvector of \(A\) with corresponding
eigenvalue \(\lambda = -3\).
Since \(\mathbf{x}\) is an eigenvector of \(A\) with corresponding eigenvalue \(\lambda = -3\), we have that \(\mathbf{x} \neq \mathbf{0}\) and \(A\mathbf{x} = -3\mathbf{x}\).
(a) Show that \(\mathbf{x}\) is also an eigenvector of \(B = 2A^2 -6A^{-1} +5I\) and find the corresponding eigenvalue (of \(B\)).
\(B\mathbf{x}\)\(= (2A^2 -6A^{-1} +5I) \mathbf{x}\)
\(=2A^2\mathbf{x} - 6A^{-1}\mathbf{x} +5\mathbf{x}\)
\(=2(-3)^2\mathbf{x} - 6(\frac{1}{-3})\mathbf{x} + 5\mathbf{x}\)
\(=18\mathbf{x} + 2\mathbf{x} +5\mathbf{x}\)
\(=25\mathbf{x}\)
Therefore, the nonzero vector \(\mathbf{x}\) is an eigenvector of \(B\) with corresponding eigenvalue \(\lambda = 25\).
(b) Verify (to yourself) that \(\mathbf{x}\) is also an eigenvector of \(C = -8I + 18A^{-2} + A^3 - 4A\) and find the corresponding eigenvalue (of \(C\)).
The vector \(\mathbf{x}\) is an eigenvector of \(C\) with corresponding eigenvalue \(\lambda = -21\).
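As an optional check of the eigenvalue arithmetic in this exercise, here is a short sketch; the concrete \(2 \times 2\) matrix at the end is hypothetical, chosen only so that \(\lambda = -3\) is one of its eigenvalues:

```python
import numpy as np

# Scalar check of the eigenvalues found above (lambda = -3).
lam = -3
print(2 * lam**2 - 6 / lam + 5)             # 25.0  (eigenvalue of B)
print(-8 + 18 / lam**2 + lam**3 - 4 * lam)  # -21.0 (eigenvalue of C)

# Hypothetical concrete check (A chosen for illustration, not from the exercise):
# x = [1, 0] is an eigenvector of A with eigenvalue -3.
A = np.array([[-3.0, 0.0], [0.0, 1.0]])
x = np.array([1.0, 0.0])
B = 2 * np.linalg.matrix_power(A, 2) - 6 * np.linalg.inv(A) + 5 * np.eye(2)
print(B @ x)                                # [25, 0] = 25 x
```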
Correct
Marks for this submission: 10.00/10.00.
Question 10
Correct
Mark 12.00 out of 12.00
Linear Combinations of Eigenvectors
Working with eigenvectors can greatly simplify computations, since matrix multiplication with an eigenvector is just scalar multiplication. As we
will see in the following theorem, there are also computational advantages to working with linear combinations of eigenvectors.
Theorem:
Let \(A\) be an \(n \times n\) matrix and let \(\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m\) be eigenvectors of \(A\)
corresponding to the eigenvalues \(\lambda_1, \lambda_2, \ldots, \lambda_m\), respectively. If \(\mathbf{x}\) is a linear combination of these
eigenvectors, say \[\mathbf{x} = c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_m \mathbf{v}_m,\] then for any non-negative integer \(k\) (or
any integer if \(A\) is invertible),
\[A^k \mathbf{x} = c_1\lambda_1^k\mathbf{v}_1 + c_2\lambda_2^k\mathbf{v}_2 + \cdots + c_m \lambda_m^k\mathbf{v}_m\]
Let's illustrate how to use this theorem through an example.
Example:
Let \(A = {\left[\begin{array}{ccc} -1 & -1 & 1 \\ 0 & -2 & 0 \\ 2 & -2 & 0 \end{array}\right]}\).
Recall from Question 9 of the learning activity "Eigenvalues, Eigenvectors, and Eigenspaces: Part 1" that the eigenvalues of \(A\) are \(\lambda
= 1\) and \(\lambda = -2\) and the corresponding eigenspaces are \[ E_1 = \operatorname{Span}\left( \mathbf{u} = {\left[\begin{array}{c} 1 \\
0 \\ 2 \end{array}\right]}\right) \; \; \text{and} \; \; \; \; E_{-2} = \operatorname{Span}\left(\mathbf{v}= {\left[\begin{array}{c} 1 \\ 1 \\ 0
\end{array}\right]}\; ,\;\mathbf{w} = {\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\right)\]
Note: For any integer \(k\), since \(\mathbf{u} \in E_1\), we have \(A^k\mathbf{u} = (1)^k\mathbf{u} = \mathbf{u}\) and since \(\mathbf{v},
\mathbf{w} \in E_{-2}\), we have \(A^k\mathbf{v} = (-2)^k\mathbf{v}\) and \(A^k\mathbf{w} = (-2)^k\mathbf{w}\).
(a) Compute \(A^3{\left[\begin{array}{c} -2 \\ -4 \\ 13 \end{array}\right]}\).
We first note that \({\left[\begin{array}{c} -2 \\ -4 \\ 13 \end{array}\right]}\) can be written as a linear combination of the eigenvectors \
(\mathbf{u}\), \(\mathbf{v}\), and \(\mathbf{w}\). Indeed,
\[{\left[\begin{array}{c} -2 \\ -4 \\ 13 \end{array}\right]} = 5 {\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} - 4{\left[\begin{array}{c} 1 \\ 1 \\
0 \end{array}\right]} + 3{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\]
Thus, \(A^3{\left[\begin{array}{c} -2 \\ -4 \\ 13 \end{array}\right]}\)
\(= A^3\left(5 {\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} - 4{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} + 3{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\right)\)
\(= A^3\left(5 {\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]}\right) - A^3\left(4{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]}\right) + A^3\left(3{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\right)\)
\(= 5A^3{\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} - 4A^3{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} + 3A^3{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\)
\(= 5(1)^3{\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} - 4(-2)^3{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} + 3(-2)^3{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\), since \(\mathbf{u} \in E_1\) and \(\mathbf{v}, \mathbf{w} \in E_{-2}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
\(= 5{\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} + 32{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} - 24{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
\(= {\left[\begin{array}{c} 61 \\ 32 \\ -14 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
(b) Compute \(A^5{\left[\begin{array}{c} 1 \\ -2 \\ 6 \end{array}\right]}\).
We first note that \({\left[\begin{array}{c} 1 \\ -2 \\ 6 \end{array}\right]}\) can be written as a linear combination of the eigenvectors \
(\mathbf{u}\), \(\mathbf{v}\), and \(\mathbf{w}\). Indeed,
\({\left[\begin{array}{c} 1 \\ -2 \\ 6 \end{array}\right]} = 3{\left[\begin{array}{c} 1 \\ 0 \\ 2 \end{array}\right]} - 2{\left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right]} + 0{\left[\begin{array}{c} -1 \\ 0 \\ 1 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Correct answer, well done.
Marks for this submission: 1.00/1.00.
Thus, \( A^5{\left[\begin{array}{c} 1 \\ -2 \\ 6 \end{array}\right]} = {\left[\begin{array}{c} 67 \\ 64 \\ 6 \end{array}\right]}\)
Correct answer, well done.
Marks for this submission: 2.00/2.00.
A correct answer is \( 1 \), which can be typed in as follows: 1
A correct answer is \( -2 \), which can be typed in as follows: -2
A correct answer is \( -2 \), which can be typed in as follows: -2
A correct answer is \( 5 \), which can be typed in as follows: 5
A correct answer is \( 32 \), which can be typed in as follows: 32
A correct answer is \( 24 \), which can be typed in as follows: 24
A correct answer is \( \left[\begin{array}{c} 61 \\ 32 \\ -14 \end{array}\right] \).
A correct answer is \( 3 \), which can be typed in as follows: 3
A correct answer is \( 2 \), which can be typed in as follows: 2
A correct answer is \( 0 \), which can be typed in as follows: 0
A correct answer is \( \left[\begin{array}{c} 67 \\ 64 \\ 6 \end{array}\right] \).
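Finally, parts (a) and (b) can be spot-checked numerically (optional, not part of the quiz):

```python
import numpy as np

A = np.array([[-1, -1, 1], [0, -2, 0], [2, -2, 0]], dtype=float)

print(np.linalg.matrix_power(A, 3) @ np.array([-2, -4, 13]))  # [61, 32, -14]
print(np.linalg.matrix_power(A, 5) @ np.array([1, -2, 6]))    # [67, 64, 6]
```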