As your lecturer, I've prepared these 49 multiple-choice questions to help you master the concepts of direct sums, orthogonal complements, and projections. Please attempt each question carefully.
1. Let \(U\) and \(W\) be subspaces of a vector space \(V\). What is the definition of the sum of \(U\) and \(W\), denoted \(U+W\)?
Explanation: The sum of two subspaces \(U\) and \(W\) is the set of all possible vectors that can be formed by adding a vector from \(U\) to a vector from \(W\). This is a fundamental definition.
Source: Anthony & Harvey, Definition 12.1.
2. A sum of two subspaces \(U+W\) is a direct sum, denoted \(U \oplus W\), if and only if which of the following conditions holds?
Explanation: The sum is direct if the intersection of the subspaces is only the zero vector. This is the primary definition of a direct sum.
Source: Subject Guide, Definition 5.2.
3. Let \(V = \mathbb{R}^3\), \(U = \text{span}\{(1,0,0), (0,1,0)\}\) and \(W = \text{span}\{(0,0,1)\}\). Which statement is true?
Explanation: \(U\) is the xy-plane and \(W\) is the z-axis. Their intersection is only the zero vector, so the sum is direct. Together, they span all of \(\mathbb{R}^3\).
Source: Anthony & Harvey, Chapter 12.1.
4. An equivalent condition for the sum \(U+W\) to be a direct sum is:
Explanation: A key theorem states that a sum is direct if and only if every vector in the sum has a unique representation as a sum of vectors from the constituent subspaces.
Source: Subject Guide, Theorem 5.1.
5. Let \(S\) be a subspace of an inner product space \(V\). What is the definition of the orthogonal complement, \(S^\perp\)?
Explanation: The orthogonal complement of \(S\) is the set of all vectors in \(V\) that are orthogonal to every vector in \(S\).
Source: Subject Guide, Definition 5.3.
6. If \(S\) is a subspace of a finite-dimensional inner product space \(V\), which of the following is always true?
Explanation: A fundamental result in linear algebra is that any finite-dimensional inner product space \(V\) can be expressed as the direct sum of a subspace \(S\) and its orthogonal complement \(S^\perp\).
Source: Subject Guide, Theorem 5.3.
7. Let \(S = \text{span}\{(1, 1, 1)\}\) in \(\mathbb{R}^3\) with the standard inner product. What is the orthogonal complement \(S^\perp\)?
Explanation: \(S^\perp\) consists of all vectors \(\mathbf{v}=(x,y,z)\) such that \(\langle \mathbf{v}, (1,1,1) \rangle = 0\). This gives the equation \(x+y+z=0\), which is the equation of a plane through the origin.
Source: Anthony & Harvey, Chapter 12.2.
8. For any matrix \(A\), which of the following relationships is correct?
Explanation: This is one of the four fundamental subspaces relationships. The orthogonal complement of the range (or column space) of a matrix \(A\) is the null space of its transpose \(A^T\).
Source: Subject Guide, Theorem 5.5.
9. Let \(A\) be an \(m \times n\) matrix. The subspace \(N(A)^\perp\) is equal to which other fundamental subspace?
Explanation: This is another of the four fundamental subspaces relationships. The orthogonal complement of the null space of \(A\) is the range (or column space) of \(A^T\). The range of \(A^T\) is also known as the row space of \(A\).
Source: Subject Guide, Theorem 5.5.
10. A linear transformation \(P: V \to V\) is a projection if and only if:
Explanation: A linear transformation is a projection if and only if it is idempotent, meaning applying the transformation twice is the same as applying it once. This is written as \(P^2 = P\).
Source: Subject Guide, Theorem 5.7.
11. A matrix \(P\) represents an orthogonal projection if and only if:
Explanation: An orthogonal projection is a projection (idempotent, \(P^2=P\)) that projects onto its range along the orthogonal complement of that range. The first condition is idempotency; the second is captured by the matrix being symmetric (\(P=P^T\)).
Source: Subject Guide, Theorem 5.8.
12. Let \(P\) be a projection matrix. Which of the following is a property of any projection?
Explanation: By definition, a projection is a linear transformation from a vector space to itself. Not all projections are symmetric (only orthogonal ones), and they are generally not invertible (unless they are the identity projection on the whole space).
Source: Anthony & Harvey, Chapter 12.3.
13. If \(P\) is an idempotent matrix (\(P^2=P\)), what are its possible eigenvalues?
Explanation: Let \(\lambda\) be an eigenvalue of \(P\) with eigenvector \(\mathbf{v}\). Then \(P\mathbf{v} = \lambda\mathbf{v}\). Applying \(P\) again, \(P^2\mathbf{v} = P(\lambda\mathbf{v}) = \lambda(P\mathbf{v}) = \lambda(\lambda\mathbf{v}) = \lambda^2\mathbf{v}\). Since \(P^2=P\), we have \(P\mathbf{v} = \lambda^2\mathbf{v}\). Thus, \(\lambda\mathbf{v} = \lambda^2\mathbf{v}\), which implies \((\lambda^2 - \lambda)\mathbf{v} = \mathbf{0}\). Since \(\mathbf{v} \neq \mathbf{0}\), we must have \(\lambda(\lambda-1)=0\), so \(\lambda=0\) or \(\lambda=1\).
Source: Subject Guide, Activity 5.6.
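This result is easy to check numerically. Here is a minimal NumPy sketch (NumPy is my tool of choice here, not part of the syllabus), using the projection matrix from question 29 below:

```python
import numpy as np

# The non-orthogonal projection matrix from question 29 below:
# idempotent (P^2 = P) but not symmetric.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])

assert np.allclose(P @ P, P)            # confirm idempotency
print(np.sort(np.linalg.eigvals(P)))    # [0. 1.] -- only 0 and 1, as derived
```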
14. Let \(V = \mathbb{R}^2\), \(U = \text{span}\{(1,1)\}\) and \(W = \text{span}\{(-1,1)\}\). Is the sum \(U+W\) a direct sum?
Explanation: The vectors \((1,1)\) and \((-1,1)\) are linearly independent, so the only vector in the intersection \(U \cap W\) is the zero vector, and the sum is direct. These two particular vectors also happen to be orthogonal, but note that orthogonality is a sufficient condition for a sum to be direct, not a necessary one.
Source: Subject Guide, Example 5.1.
15. Which of the following properties of the orthogonal complement is correct for any subspace \(S\) of a finite-dimensional inner product space?
Explanation: The orthogonal complement of the orthogonal complement of a subspace is the original subspace itself: \((S^\perp)^\perp = S\). This is a fundamental property.
Source: Subject Guide, Theorem 5.4.
16. Let \(P\) be the matrix for a projection onto a subspace \(U\) parallel to \(W\). The null space of \(P\), \(N(P)\), is:
Explanation: The projection \(P\) maps any vector \(\mathbf{v} = \mathbf{u} + \mathbf{w}\) to \(\mathbf{u}\). The null space consists of all vectors that are mapped to the zero vector. This happens when \(\mathbf{u}=\mathbf{0}\), so the vectors in the null space are precisely the vectors in \(W\).
Source: Anthony & Harvey, Chapter 12.4.
17. Let \(P\) be the matrix for a projection onto a subspace \(U\) parallel to \(W\). The range of \(P\), \(R(P)\), is:
Explanation: The projection \(P\) maps any vector \(\mathbf{v} = \mathbf{u} + \mathbf{w}\) to \(\mathbf{u}\). The range is the set of all possible outputs. Since \(\mathbf{u}\) can be any vector in \(U\), the range of \(P\) is exactly the subspace \(U\).
Source: Anthony & Harvey, Chapter 12.4.
18. Consider the matrix \(A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\). This matrix represents:
Explanation: The matrix is idempotent (\(A^2=A\)) and symmetric (\(A^T=A\)), so it represents an orthogonal projection. Applying it to a vector \((x,y)\) gives \((x,0)\), which is the orthogonal projection onto the x-axis.
Source: Anthony & Harvey, Chapter 12.3.
19. To prove that a sum of two subspaces \(U+W\) is direct, one can show that \(U \cap W = \{\mathbf{0}\}\). How would you start a proof for this?
Explanation: To prove that a set (in this case, the intersection) contains only the zero vector, the standard method is to take an arbitrary element from that set and prove that it must be the zero vector.
Source: Subject Guide, Chapter 5.1.2.
20. Let \(S\) be a subspace of \(V\). Which statement about \(S^\perp\) is incorrect?
Explanation: The orthogonal complement \(S^\perp\) is not simply the complement of \(S\). For example, in \(\mathbb{R}^2\), if \(S\) is the x-axis, \(S^\perp\) is the y-axis. A vector like \((1,1)\) is in neither \(S\) nor \(S^\perp\).
Source: Subject Guide, Chapter 5.2.1.
21. If \(P\) is a projection matrix, then \(I-P\) is also a projection matrix. If \(P\) projects onto \(U\) parallel to \(W\), what does \(I-P\) project onto?
Explanation: First, check if \(I-P\) is idempotent: \((I-P)^2 = I - 2P + P^2 = I - 2P + P = I - P\). So it is a projection. For any \(\mathbf{v} = \mathbf{u} + \mathbf{w}\), \((I-P)\mathbf{v} = \mathbf{v} - P\mathbf{v} = (\mathbf{u}+\mathbf{w}) - \mathbf{u} = \mathbf{w}\). So \(I-P\) projects onto \(W\) parallel to \(U\).
Source: Anthony & Harvey, Chapter 12.4.
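The following NumPy sketch illustrates this with a hypothetical oblique projection in \(\mathbb{R}^2\) (the particular matrix is my own example, chosen so that \(U = \text{span}\{(1,0)\}\) and \(W = \text{span}\{(1,1)\}\)):

```python
import numpy as np

# Oblique projection onto U = span{(1,0)} parallel to W = span{(1,1)}.
P = np.array([[1.0, -1.0],
              [0.0,  0.0]])
I = np.eye(2)

assert np.allclose(P @ P, P)                    # P is a projection
assert np.allclose((I - P) @ (I - P), I - P)    # so is I - P

v = np.array([3.0, 2.0])
print(P @ v)        # [1. 0.] -> the component of v in U
print((I - P) @ v)  # [2. 2.] -> the component in W: I-P projects onto W
```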
22. Let \(A = \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix}\). This matrix is:
Explanation: We check for idempotency: \(A^2 = \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix} \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix} = \begin{pmatrix} 0.25+0.25 & 0.25+0.25 \\ 0.25+0.25 & 0.25+0.25 \end{pmatrix} = A\). It is idempotent. We check for symmetry: \(A^T = A\). Since it is both idempotent and symmetric, it is an orthogonal projection matrix.
Source: Subject Guide, Theorem 5.8.
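Both checks take one line each in NumPy (a minimal sketch, not examinable material):

```python
import numpy as np

A = np.array([[0.5, 0.5],
              [0.5, 0.5]])

print(np.allclose(A @ A, A))   # True: idempotent
print(np.allclose(A, A.T))     # True: symmetric
# Both hold, so A is an orthogonal projection matrix.
```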
23. The matrix \(P = \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix}\) projects vectors in \(\mathbb{R}^2\) onto which subspace?
Explanation: The columns of a projection matrix span its range (the subspace it projects onto). Both columns here equal \((0.5, 0.5)\), which lies on the line \(y=x\). Indeed, any vector \((x,y)\) is mapped to \((0.5x+0.5y, 0.5x+0.5y)\), a point on the line \(y=x\).
Source: Anthony & Harvey, Chapter 12.5.
24. Let \(A\) be an \(m \times n\) matrix. The row space of \(A\), \(R(A^T)\), and the null space of \(A\), \(N(A)\), are orthogonal complements. This means their direct sum is:
Explanation: The row space \(R(A^T)\) and the null space \(N(A)\) are both subspaces of \(\mathbb{R}^n\). Since they are orthogonal complements, their direct sum spans the entire space they reside in, which is \(\mathbb{R}^n\).
Source: Subject Guide, Theorem 5.5.
25. Let \(P\) be a projection. The transformation \(T = 2P - I\) represents:
Explanation: A reflection \(R\) has the property \(R^2=I\). Let's check \(T^2\): \(T^2 = (2P-I)^2 = 4P^2 - 4P + I = 4P - 4P + I = I\). Since \(T^2=I\), \(T\) is a reflection about the subspace \(U\) (the range of \(P\)) along \(W\) (the null space of \(P\)).
Source: This is a common extension of the properties of projections. It can be derived from the definitions in Anthony & Harvey, Chapter 12.3.
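You can confirm \(T^2 = I\) numerically; here is a short NumPy sketch using the orthogonal projection from question 22:

```python
import numpy as np

# Orthogonal projection onto the line y = x (the matrix from question 22).
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])
T = 2 * P - np.eye(2)

assert np.allclose(T @ T, np.eye(2))   # T^2 = I, so T is a reflection
print(T @ np.array([1.0, 0.0]))        # [0. 1.]: (1,0) reflected in y = x
```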
26. Let \(U\) and \(W\) be subspaces of \(V\). If \(\dim(U)=3\), \(\dim(W)=4\), and \(\dim(U \cap W)=1\), what is \(\dim(U+W)\)?
Explanation: The dimension formula for the sum of two subspaces is \(\dim(U+W) = \dim(U) + \dim(W) - \dim(U \cap W)\). Plugging in the values, we get \(3 + 4 - 1 = 6\).
Source: This is a standard theorem related to sums of subspaces, covered in Anthony & Harvey, Chapter 6.
27. If \(P\) is the orthogonal projection onto the line spanned by the vector \(\mathbf{a} = (1, 2, 2)\) in \(\mathbb{R}^3\), what is the matrix \(P\)?
Explanation: The formula for orthogonal projection onto the line spanned by a vector \(\mathbf{a}\) is \(P = \frac{1}{\mathbf{a}^T\mathbf{a}} \mathbf{a}\mathbf{a}^T\). Here, \(\mathbf{a}^T\mathbf{a} = 1^2+2^2+2^2 = 9\). The outer product \(\mathbf{a}\mathbf{a}^T\) is \(\begin{pmatrix} 1 \\ 2 \\ 2 \end{pmatrix} \begin{pmatrix} 1 & 2 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 2 & 2 \\ 2 & 4 & 4 \\ 2 & 4 & 4 \end{pmatrix}\). So, \(P = \frac{1}{9} \begin{pmatrix} 1 & 2 & 2 \\ 2 & 4 & 4 \\ 2 & 4 & 4 \end{pmatrix}\).
Source: Anthony & Harvey, Chapter 12.5.
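The formula \(P = \frac{1}{\mathbf{a}^T\mathbf{a}} \mathbf{a}\mathbf{a}^T\) translates directly into NumPy (a minimal sketch for checking your hand computation):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
P = np.outer(a, a) / (a @ a)     # P = a a^T / (a^T a), with a^T a = 9

print(P * 9)                     # recovers the integer matrix above
assert np.allclose(P @ P, P)     # idempotent
assert np.allclose(P, P.T)       # symmetric: an orthogonal projection
print(np.allclose(P @ a, a))     # True: vectors on the line are fixed
```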
28. True or False: If \(U\) and \(W\) are orthogonal subspaces, their sum \(U+W\) is a direct sum.
Explanation: If \(U\) and \(W\) are orthogonal, it means every vector in \(U\) is orthogonal to every vector in \(W\). If a vector \(\mathbf{v}\) is in their intersection, \(\mathbf{v} \in U \cap W\), then \(\mathbf{v}\) must be orthogonal to itself, i.e., \(\langle \mathbf{v}, \mathbf{v} \rangle = 0\). This implies \(\mathbf{v} = \mathbf{0}\). Therefore, \(U \cap W = \{\mathbf{0}\}\), which is the condition for a direct sum.
Source: Anthony & Harvey, Chapter 12.2.
29. Let \(P = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}\). Is this a projection matrix?
Explanation: We check for idempotency: \(P^2 = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} = P\). So it is a projection. However, it is not symmetric (\(P^T \neq P\)), so it is not an orthogonal projection.
Source: Subject Guide, Theorems 5.7 and 5.8.
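A short NumPy sketch makes the distinction concrete, and also shows the geometric consequence of non-orthogonality (the residual \(\mathbf{v} - P\mathbf{v}\) need not be orthogonal to the range):

```python
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 0.0]])

print(np.allclose(P @ P, P))   # True: P is a projection
print(np.allclose(P, P.T))     # False: not symmetric, so not orthogonal

# Consequence: the residual v - Pv is not orthogonal to the range of P.
v = np.array([0.0, 1.0])
print((v - P @ v) @ (P @ v))   # -1.0, a nonzero inner product
```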
30. Let \(A\) be a matrix. The statement \(N(A)^\perp = R(A^T)\) is a key part of the Fundamental Theorem of Linear Algebra. What does \(R(A^T)\) represent?
Explanation: The range of \(A^T\), \(R(A^T)\), is the space spanned by the columns of \(A^T\). The columns of \(A^T\) are the rows of \(A\). Therefore, \(R(A^T)\) is the row space of \(A\).
Source: Anthony & Harvey, Chapter 12.2.2.
31. If \(P\) is an orthogonal projection matrix, what can be said about \(I-P\)?
Explanation: We know \(I-P\) is a projection. To check if it's orthogonal, we check for symmetry. \((I-P)^T = I^T - P^T = I - P\), since \(P\) is symmetric. As \(I-P\) is both idempotent and symmetric, it is an orthogonal projection.
Source: Derived from properties in Subject Guide, Chapter 5.4.
32. Let \(U = \text{span}\{(1,0,1), (0,1,1)\}\) in \(\mathbb{R}^3\). Which vector is in \(U^\perp\)?
Explanation: A vector \(\mathbf{v}=(x,y,z)\) is in \(U^\perp\) if it is orthogonal to both basis vectors of \(U\). This gives two equations: \(x+z=0\) and \(y+z=0\). From these, \(x=-z\) and \(y=-z\). A vector satisfying this is \((1,1,-1)\) (by setting \(z=-1\)).
Source: Subject Guide, Example 5.3.
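If you want to check such computations, one standard numerical approach (my choice, not prescribed by the course) is to read a null-space basis off the SVD:

```python
import numpy as np

# U^perp is the null space of the matrix M whose rows span U.
M = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# The rows of Vt beyond the rank of M form an orthonormal basis of N(M).
_, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))
print(Vt[rank:])                        # a unit-length multiple of (1, 1, -1)
print(M @ np.array([1.0, 1.0, -1.0]))   # [0. 0.]: (1,1,-1) lies in U^perp
```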
33. If \(P\) is a projection onto \(U\) parallel to \(W\), and \(\mathbf{v} = \mathbf{u} + \mathbf{w}\) with \(\mathbf{u} \in U, \mathbf{w} \in W\), then \(P\mathbf{v}\) is:
Explanation: This is the definition of a projection. The projection \(P\) onto \(U\) parallel to \(W\) maps a vector \(\mathbf{v}\) to its component \(\mathbf{u}\) in \(U\).
Source: Subject Guide, Definition 5.4.
34. Let \(V = P_2\), the space of polynomials of degree at most 2. Let \(U\) be the subspace of even polynomials (\(p(-x)=p(x)\)) and \(W\) be the subspace of odd polynomials (\(p(-x)=-p(x)\)). Is \(V = U \oplus W\)?
Explanation: Any polynomial \(p(x)\) can be uniquely written as the sum of an even part \(\frac{p(x)+p(-x)}{2}\) and an odd part \(\frac{p(x)-p(-x)}{2}\). The only polynomial that is both even and odd is the zero polynomial. Thus, \(U \cap W = \{0\}\) and \(U+W=V\), so the sum is direct.
Source: This is a classic example of a direct sum decomposition, applying the principles from Anthony & Harvey, Chapter 12.1.
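A tiny Python sketch of this decomposition, using the hypothetical example \(p(x) = 1 + 2x + 3x^2\):

```python
# Decompose p(x) = 1 + 2x + 3x^2 into its even and odd parts, pointwise.
p = lambda x: 1 + 2 * x + 3 * x**2

p_even = lambda x: (p(x) + p(-x)) / 2   # equals 1 + 3x^2, which is even
p_odd = lambda x: (p(x) - p(-x)) / 2    # equals 2x, which is odd

for x in (0.0, 1.0, -2.5):
    assert abs(p_even(x) - (1 + 3 * x**2)) < 1e-12
    assert abs(p_odd(x) - 2 * x) < 1e-12
    assert abs(p(x) - (p_even(x) + p_odd(x))) < 1e-12   # p = even + odd
```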
35. Which of the following matrices is idempotent?
Explanation: A matrix \(P\) is idempotent if \(P^2=P\). Let's check option (b): \(\begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}\). This is idempotent. The other matrices are not.
Source: Subject Guide, Definition 5.7.
36. Let \(A\) be an \(m \times n\) matrix. The orthogonal complement of the column space, \(R(A)^\perp\), is also known as:
Explanation: The orthogonal complement of the column space of \(A\) is the null space of \(A^T\), which is \(N(A^T)\). This is also called the left null space of \(A\) because it consists of all vectors \(\mathbf{y}\) such that \(\mathbf{y}^T A = \mathbf{0}^T\).
Source: Subject Guide, Theorem 5.5.
37. If \(V = U \oplus W\), what is the value of \(\dim(V)\)?
Explanation: The general formula is \(\dim(U+W) = \dim(U) + \dim(W) - \dim(U \cap W)\). For a direct sum, \(U \cap W = \{\mathbf{0}\}\), which has dimension 0. Therefore, the formula simplifies to \(\dim(U \oplus W) = \dim(U) + \dim(W)\).
Source: Anthony & Harvey, Chapter 12.1.
38. Let \(P\) be a projection matrix. What is \(P(I-P)\)?
Explanation: Using the distributive property of matrix multiplication, \(P(I-P) = PI - P^2\). Since \(P\) is a projection, \(P^2=P\). So, \(P(I-P) = P - P = 0\).
Source: Derived from properties in Subject Guide, Chapter 5.4.
39. Let \(A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\). What is the orthogonal complement of the column space \(R(A)\)?
Explanation: We need to find \(R(A)^\perp = N(A^T)\). The transpose is \(A^T = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\). We solve \(A^T\mathbf{x} = \mathbf{0}\); both equations reduce to \(x+2y=0\). A basis for this null space is the vector \((-2,1)\). So \(R(A)^\perp = \text{span}\{(-2,1)\}\).
Source: Subject Guide, Theorem 5.5.
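A quick NumPy check that \((-2,1)\) is indeed orthogonal to the column space:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
n = np.array([-2.0, 1.0])        # claimed basis vector of R(A)^perp

print(A.T @ n)                   # [0. 0.]: n is in N(A^T)
print(A[:, 0] @ n, A[:, 1] @ n)  # 0.0 0.0: n is orthogonal to both columns
```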
40. True or False: Any symmetric matrix is a projection matrix.
Explanation: A projection matrix must be idempotent (\(P^2=P\)). A symmetric matrix is not necessarily idempotent. For example, \(A = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}\) is symmetric, but \(A^2 = \begin{pmatrix} 4 & 0 \\ 0 & 4 \end{pmatrix} \neq A\).
Source: Subject Guide, Chapter 5.4.
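In NumPy, the counterexample looks like this (a minimal sketch):

```python
import numpy as np

A = 2 * np.eye(2)              # symmetric, but not idempotent
print(np.allclose(A, A.T))     # True: symmetric
print(np.allclose(A @ A, A))   # False: A^2 = 4I != A, so not a projection
```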
41. Let \(A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}\). Which statement is false?
Explanation: Let's check for idempotency: \(A^2 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} = 2A \neq A\). Therefore, the matrix is not idempotent.
Source: Subject Guide, Definition 5.7.
42. If \(S\) is a subspace of \(V\), then \(S^\perp\) is also a subspace of \(V\). To prove this, one must show that \(S^\perp\) is non-empty and closed under...
Explanation: The definition of a subspace requires it to be closed under the two vector space operations: vector addition and scalar multiplication.
Source: Subject Guide, Theorem 5.2.
43. Let \(A\) be an \(m \times n\) matrix. The column space \(R(A)\) and the left null space \(N(A^T)\) are orthogonal complements. This means their direct sum is:
Explanation: The column space \(R(A)\) is a subspace of \(\mathbb{R}^m\). The left null space \(N(A^T)\) is also a subspace of \(\mathbb{R}^m\). Since they are orthogonal complements, their direct sum spans the entire space they reside in, which is \(\mathbb{R}^m\).
Source: Subject Guide, Theorem 5.5.
44. If \(P\) is a projection matrix, which of the following is NOT necessarily true?
Explanation: A projection matrix is only required to be idempotent. It is symmetric if and only if the projection is orthogonal. A non-orthogonal projection is still a projection but will not have a symmetric matrix.
Source: Subject Guide, Chapter 5.4.
45. Let \(U = \text{span}\{(1,1)\}\) and \(W = \text{span}\{(2,2)\}\) in \(\mathbb{R}^2\). Is the sum \(U+W\) a direct sum?
Explanation: The vector \((2,2)\) is a scalar multiple of \((1,1)\), so the two subspaces are identical. Their intersection is the entire subspace, not just the zero vector. Therefore, the sum is not direct.
Source: Subject Guide, Definition 5.2.
46. To prove that a matrix \(P\) represents a projection, you must prove that:
Explanation: The defining characteristic of a projection matrix is that it is idempotent. Applying the projection twice has the same effect as applying it once.
Source: Subject Guide, Theorem 5.7.
47. If \(P\) is an orthogonal projection onto a subspace \(S\), then for any vector \(\mathbf{v}\), the vector \(\mathbf{v} - P\mathbf{v}\) is...
Explanation: The vector \(P\mathbf{v}\) is the component of \(\mathbf{v}\) in \(S\). The vector \(\mathbf{v} - P\mathbf{v}\) is the remaining component, which is, by definition of orthogonal projection, the component in the orthogonal complement \(S^\perp\).
Source: Subject Guide, Definition 5.5.
48. Let \(A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}\). The matrix \(P = \frac{1}{2}A\) is:
Explanation: Let \(P = \begin{pmatrix} 1.5 & 0.5 \\ 0.5 & 1.5 \end{pmatrix}\). We check for idempotency: \(P^2 = \begin{pmatrix} 2.5 & 1.5 \\ 1.5 & 2.5 \end{pmatrix} \neq P\). Since it is not idempotent, it is not a projection matrix.
Source: Subject Guide, Definition 5.7.
49. Suppose \(V = U \oplus W\), and write \(\mathbf{v}_1 = \mathbf{u}_1 + \mathbf{w}_1\) and \(\mathbf{v}_2 = \mathbf{u}_2 + \mathbf{w}_2\) with \(\mathbf{u}_i \in U\) and \(\mathbf{w}_i \in W\). To prove that the projection \(P_U\) is a linear transformation, one must show that \(P_U(\alpha \mathbf{v}_1 + \beta \mathbf{v}_2)\) equals:
Explanation: This is the definition of a linear transformation. We need to show that the projection of a linear combination of vectors is the same as the linear combination of their projections. \(P_U(\alpha \mathbf{v}_1 + \beta \mathbf{v}_2) = P_U((\alpha\mathbf{u}_1+\beta\mathbf{u}_2) + (\alpha\mathbf{w}_1+\beta\mathbf{w}_2)) = \alpha\mathbf{u}_1+\beta\mathbf{u}_2 = \alpha P_U(\mathbf{v}_1) + \beta P_U(\mathbf{v}_2)\).
Source: Subject Guide, Chapter 5.3.1.