MT2175: Inner Products and Orthogonality Quiz

1. Which of the following is NOT a required property for an operation $\langle \cdot, \cdot \rangle$ to be an inner product on a real vector space $V$? ($\mathbf{u}, \mathbf{v}, \mathbf{w}$ are vectors in $V$; $k$ is a scalar)
The norm $||\mathbf{u}||$ is defined from the inner product as $||\mathbf{u}|| = \sqrt{\langle \mathbf{u}, \mathbf{u} \rangle}$, not the other way around. The three fundamental properties for a real inner product are Symmetry, Linearity (which includes Additivity and Homogeneity), and Positive-definiteness ($\langle \mathbf{u}, \mathbf{u} \rangle \ge 0$ and $\langle \mathbf{u}, \mathbf{u} \rangle = 0 \iff \mathbf{u} = \mathbf{0}$).
Source: Anthony & Harvey, Definition 10.1, p. 313.
2. Let $\mathbf{u} = (u_1, u_2)$ and $\mathbf{v} = (v_1, v_2)$ be vectors in $\mathbb{R}^2$. Which of the following is a valid inner product on $\mathbb{R}^2$?
This is a weighted Euclidean inner product. Option (a) fails positive-definiteness (e.g., for $\mathbf{u}=(1,2)$, $\langle \mathbf{u}, \mathbf{u} \rangle = 1-4 = -3 < 0$). Option (b) is not linear. Option (d) is not positive-definite (e.g., for $\mathbf{u}=(1,-1)$, $\langle \mathbf{u}, \mathbf{u} \rangle = -1-1 = -2 < 0$). Option (c) satisfies all axioms: it's symmetric, linear, and $\langle \mathbf{u}, \mathbf{u} \rangle = 2u_1^2 + 3u_2^2 \ge 0$, with equality only if $u_1=u_2=0$.
Source: Anthony & Harvey, Example 10.5, p. 314.
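As a quick numerical sanity check of option (c) (a minimal NumPy sketch; the helper name `weighted_ip` is illustrative, and only the formula $2u_1v_1 + 3u_2v_2$ comes from the question):

```python
import numpy as np

# Check option (c), <u, v> = 2*u1*v1 + 3*u2*v2, against the axioms on random vectors.
def weighted_ip(u, v):
    return 2 * u[0] * v[0] + 3 * u[1] * v[1]

rng = np.random.default_rng(0)
for _ in range(1000):
    u, v, w = rng.normal(size=(3, 2))
    k = rng.normal()
    assert np.isclose(weighted_ip(u, v), weighted_ip(v, u))        # symmetry
    assert np.isclose(weighted_ip(k * u + w, v),
                      k * weighted_ip(u, v) + weighted_ip(w, v))   # linearity
    assert weighted_ip(u, u) >= 0                                  # positivity
```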
3. In an inner product space, the norm of a vector $\mathbf{v}$ is defined as:
The norm (or length) of a vector is defined as the square root of the inner product of the vector with itself. This ensures the norm is a non-negative real number.
Source: Anthony & Harvey, Definition 10.6, p. 315.
4. The Cauchy-Schwarz inequality states that for any two vectors $\mathbf{u}$ and $\mathbf{v}$ in an inner product space:
The Cauchy-Schwarz inequality states that the absolute value of the inner product of two vectors is less than or equal to the product of their norms. This is a fundamental inequality in linear algebra.
Source: Anthony & Harvey, Theorem 10.7, p. 315.
5. If $\mathbf{u}$ and $\mathbf{v}$ are orthogonal vectors in an inner product space, the Generalised Pythagoras' Theorem states:
The Generalised Pythagoras' Theorem is a direct consequence of the properties of the inner product. Since $||\mathbf{u}+\mathbf{v}||^2 = \langle \mathbf{u}+\mathbf{v}, \mathbf{u}+\mathbf{v} \rangle = ||\mathbf{u}||^2 + 2\langle \mathbf{u}, \mathbf{v} \rangle + ||\mathbf{v}||^2$ and $\langle \mathbf{u}, \mathbf{v} \rangle = 0$ for orthogonal vectors, the result follows.
Source: Anthony & Harvey, Theorem 10.12, p. 317.
6. A set of non-zero, pairwise orthogonal vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is:
A key theorem states that a set of non-zero orthogonal vectors is always linearly independent. To prove this, assume $c_1\mathbf{v}_1 + \dots + c_k\mathbf{v}_k = \mathbf{0}$. Taking the inner product with any $\mathbf{v}_i$ shows that $c_i||\mathbf{v}_i||^2 = 0$, which implies $c_i=0$ since $\mathbf{v}_i \neq \mathbf{0}$.
Source: Anthony & Harvey, Theorem 10.14, p. 318.
7. An $n \times n$ matrix $P$ is called an orthogonal matrix if:
The definition of an orthogonal matrix $P$ is that its inverse is equal to its transpose, i.e., $P^T P = I$. While its columns must be orthonormal (not just orthogonal) and its determinant must be $\pm 1$, the defining property is $P^{-1} = P^T$.
Source: Anthony & Harvey, Definition 10.15, p. 319.
8. What is an orthonormal set of vectors?
An orthonormal set combines two properties: "ortho" for orthogonal (the inner product of any two distinct vectors is zero) and "normal" for normalized (each vector has a norm, or length, of 1).
Source: Anthony & Harvey, Definition 10.19, p. 320.
9. An $n \times n$ matrix $P$ is orthogonal if and only if its columns form:
This is a fundamental theorem connecting the algebraic definition of an orthogonal matrix ($P^T P = I$) to the geometric properties of its column vectors. The entry $(i, j)$ of $P^T P$ is the dot product of the $i$-th column of $P$ with the $j$-th column. For this to be the identity matrix, the dot products must be 0 for $i \neq j$ (orthogonal) and 1 for $i=j$ (unit norm).
Source: Anthony & Harvey, Theorem 10.21, p. 321.
10. The Gram-Schmidt process is a method for:
The Gram-Schmidt process takes a basis and produces an orthonormal basis for the same space by iteratively constructing orthogonal vectors and then normalizing them.
Source: Anthony & Harvey, Section 10.4, p. 321.
11. Using the standard Euclidean inner product, what is the norm of the vector $\mathbf{v} = (1, -2, 2)$ in $\mathbb{R}^3$?
The norm is calculated as $||\mathbf{v}|| = \sqrt{1^2 + (-2)^2 + 2^2} = \sqrt{1 + 4 + 4} = \sqrt{9} = 3$.
Source: Anthony & Harvey, Definition 10.6, p. 315.
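The same calculation in NumPy, for checking:

```python
import numpy as np

v = np.array([1, -2, 2])
print(np.sqrt(v @ v))       # 3.0
print(np.linalg.norm(v))    # 3.0, via the built-in Euclidean norm
```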
12. Which property of an inner product ensures that $\langle \mathbf{u}, k\mathbf{v} \rangle = k\langle \mathbf{u}, \mathbf{v} \rangle$ for a real inner product?
This is derived from the basic axioms. $\langle \mathbf{u}, k\mathbf{v} \rangle = \langle k\mathbf{v}, \mathbf{u} \rangle$ by symmetry. Then $\langle k\mathbf{v}, \mathbf{u} \rangle = k\langle \mathbf{v}, \mathbf{u} \rangle$ by homogeneity. Finally, $k\langle \mathbf{v}, \mathbf{u} \rangle = k\langle \mathbf{u}, \mathbf{v} \rangle$ by symmetry again.
Source: Anthony & Harvey, Theorem 1.36, p. 25.
13. If $||\mathbf{u}|| = 3$, $||\mathbf{v}|| = 4$, and the angle $\theta$ between them is $\pi/3$, what is $\langle \mathbf{u}, \mathbf{v} \rangle$? (Assume standard Euclidean inner product).
Using the formula $\langle \mathbf{u}, \mathbf{v} \rangle = ||\mathbf{u}|| ||\mathbf{v}|| \cos\theta$, we get $3 \times 4 \times \cos(\pi/3) = 12 \times (1/2) = 6$.
Source: Anthony & Harvey, Theorem 1.43, p. 31.
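A one-line numerical check (floating point, so expect rounding):

```python
import numpy as np

print(3 * 4 * np.cos(np.pi / 3))   # ~6.0 (exactly 6 up to floating-point rounding)
```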
14. The triangle inequality for norms states that:
The triangle inequality states that the length of one side of a triangle (the vector sum $\mathbf{u}+\mathbf{v}$) is less than or equal to the sum of the lengths of the other two sides ($||\mathbf{u}|| + ||\mathbf{v}||$).
Source: Anthony & Harvey, Theorem 10.13, p. 318.
15. If you apply the Gram-Schmidt process to the vectors $\mathbf{v}_1 = (1, 0)$ and $\mathbf{v}_2 = (1, 1)$, what is the resulting orthonormal basis $\{\mathbf{u}_1, \mathbf{u}_2\}$?
Step 1: $\mathbf{u}_1 = \frac{\mathbf{v}_1}{||\mathbf{v}_1||} = \frac{(1,0)}{1} = (1,0)$. Step 2: Find $\mathbf{w}_2 = \mathbf{v}_2 - \langle \mathbf{v}_2, \mathbf{u}_1 \rangle \mathbf{u}_1 = (1,1) - (1)(1,0) = (0,1)$. Step 3: Normalize $\mathbf{w}_2$ to get $\mathbf{u}_2 = \frac{(0,1)}{1} = (0,1)$. The resulting basis is the standard basis.
Source: Anthony & Harvey, Section 10.4, p. 321.
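A minimal NumPy sketch of the process (the function name `gram_schmidt` is illustrative, and it assumes linearly independent input):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt; returns an orthonormal list.
    Assumes the input vectors are linearly independent."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w -= (v @ u) * u               # subtract the projection onto u
        basis.append(w / np.linalg.norm(w))
    return basis

u1, u2 = gram_schmidt([np.array([1, 0]), np.array([1, 1])])
print(u1, u2)   # [1. 0.] [0. 1.]
```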
16. Which of the following matrices is orthogonal?
A rotation matrix is always orthogonal. Its columns, $(\cos\theta, \sin\theta)$ and $(-\sin\theta, \cos\theta)$, are orthogonal to each other and both have a norm of 1. The other matrices do not have orthonormal columns.
Source: Anton & Rorres, Section 6.6, Example 2.
17. If $P$ is an orthogonal matrix, what is the value of $\det(P)$?
Since $P^T P = I$, we have $\det(P^T P) = \det(I) = 1$. Using the property $\det(AB) = \det(A)\det(B)$ and $\det(P^T) = \det(P)$, we get $\det(P)^2 = 1$, which implies $\det(P) = \pm 1$.
Source: Anthony & Harvey, Exercise 10.4, p. 325.
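A numerical illustration of both signs (a rotation and a reflection, chosen as examples):

```python
import numpy as np

theta = 0.7
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: orthogonal, det +1
R = np.diag([1.0, -1.0])                          # reflection: orthogonal, det -1

print(np.allclose(P.T @ P, np.eye(2)), np.linalg.det(P))   # True, ~1.0
print(np.allclose(R.T @ R, np.eye(2)), np.linalg.det(R))   # True, -1.0
```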
18. The set of all vectors in $\mathbb{R}^3$ orthogonal to $\mathbf{v} = (1, 1, 1)$ forms a:
The set of all vectors $\mathbf{x}=(x,y,z)$ such that $\langle \mathbf{x}, \mathbf{v} \rangle = 0$ is given by the equation $x+y+z=0$. This is the equation of a plane through the origin with normal vector $\mathbf{v}$. This set is the orthogonal complement of the line spanned by $\mathbf{v}$.
Source: Anthony & Harvey, Section 10.2.1.
19. Let $V = P_2$, the space of polynomials of degree at most 2, with inner product $\langle p, q \rangle = \int_{-1}^{1} p(x)q(x)dx$. Are the vectors $p(x)=x$ and $q(x)=x^2$ orthogonal?
We compute the inner product: $\langle p, q \rangle = \int_{-1}^{1} x \cdot x^2 dx = \int_{-1}^{1} x^3 dx = \left[\frac{x^4}{4}\right]_{-1}^{1} = \frac{1}{4} - \frac{1}{4} = 0$. Since the inner product is zero, the vectors are orthogonal.
Source: Anton & Rorres, Example 4, p. 203.
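The integral can be checked symbolically, e.g. with SymPy:

```python
import sympy as sp

x = sp.symbols('x')
print(sp.integrate(x * x**2, (x, -1, 1)))   # 0, so p(x) = x and q(x) = x^2 are orthogonal
```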
20. Normalizing the vector $\mathbf{v} = (3, 4)$ in $\mathbb{R}^2$ with the standard Euclidean inner product results in:
First, find the norm: $||\mathbf{v}|| = \sqrt{3^2 + 4^2} = \sqrt{9+16} = \sqrt{25} = 5$. Then, divide the vector by its norm: $\frac{\mathbf{v}}{||\mathbf{v}||} = \frac{1}{5}(3, 4) = (\frac{3}{5}, \frac{4}{5})$.
Source: Anthony & Harvey, p. 315.
21. The distance $d(\mathbf{u}, \mathbf{v})$ between two vectors in an inner product space is defined as:
The distance between two vectors is defined as the norm of their difference. This generalizes the standard distance formula in Euclidean space.
Source: Anton & Rorres, p. 185.
22. If $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is an orthonormal basis for $\mathbb{R}^n$ and $\mathbf{x} = c_1\mathbf{v}_1 + \dots + c_n\mathbf{v}_n$, how is the coefficient $c_i$ found?
For an orthonormal basis, the coordinates (coefficients) of a vector are simply its inner products with the basis vectors. This is a major advantage of using orthonormal bases. $\langle \mathbf{x}, \mathbf{v}_i \rangle = \langle c_1\mathbf{v}_1 + \dots + c_n\mathbf{v}_n, \mathbf{v}_i \rangle = c_i\langle \mathbf{v}_i, \mathbf{v}_i \rangle = c_i$.
Source: Anthony & Harvey, Theorem 10.20, p. 320.
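A small NumPy illustration (the 45-degree basis is chosen for the example):

```python
import numpy as np

# An orthonormal basis of R^2 (the standard basis rotated by 45 degrees).
v1 = np.array([1, 1]) / np.sqrt(2)
v2 = np.array([1, -1]) / np.sqrt(2)

x = np.array([3.0, 5.0])
c1, c2 = x @ v1, x @ v2                    # each coefficient is an inner product
print(np.allclose(c1 * v1 + c2 * v2, x))   # True: x is recovered exactly
```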
23. Which statement is false?
The columns (and rows) of an orthogonal matrix must form an orthonormal set. However, there is no restriction on the signs of the diagonal entries. For example, a reflection matrix is orthogonal but can have negative entries on the diagonal.
Source: Anthony & Harvey, Theorem 10.18, p. 319.
24. In the Gram-Schmidt process, when constructing an orthogonal vector $\mathbf{w}_2$ from $\mathbf{v}_1$ and $\mathbf{v}_2$ (where $\mathbf{u}_1$ is the normalized $\mathbf{v}_1$), the formula is:
The process works by taking the next vector ($\mathbf{v}_2$) and subtracting its projection onto the subspace spanned by the previously found orthogonal vectors (in this case, just $\mathbf{u}_1$). The result is a vector orthogonal to the previous ones.
Source: Anthony & Harvey, p. 321.
25. If $\langle \mathbf{u}, \mathbf{v} \rangle = 3u_1v_1 + u_2v_2$ is an inner product on $\mathbb{R}^2$, what is the norm of $\mathbf{u}=(1, -1)$?
The norm is $||\mathbf{u}|| = \sqrt{\langle \mathbf{u}, \mathbf{u} \rangle} = \sqrt{3(1)^2 + (-1)^2} = \sqrt{3+1} = \sqrt{4} = 2$. Note that the standard Euclidean norm would be $\sqrt{2}$.
Source: Anton & Rorres, Example 2, p. 184.
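Checked in NumPy (the helper name `ip` is illustrative):

```python
import numpy as np

def ip(u, v):
    # the weighted inner product from the question
    return 3 * u[0] * v[0] + u[1] * v[1]

u = np.array([1, -1])
print(np.sqrt(ip(u, u)))    # 2.0, the norm under the weighted inner product
print(np.linalg.norm(u))    # ~1.414, the standard Euclidean norm for comparison
```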
26. The property $\langle \mathbf{u}, \mathbf{v}+\mathbf{w} \rangle = \langle \mathbf{u}, \mathbf{v} \rangle + \langle \mathbf{u}, \mathbf{w} \rangle$ is a direct consequence of which two fundamental inner product axioms?
We have $\langle \mathbf{u}, \mathbf{v}+\mathbf{w} \rangle = \langle \mathbf{v}+\mathbf{w}, \mathbf{u} \rangle$ by symmetry. Then, by additivity, this is $\langle \mathbf{v}, \mathbf{u} \rangle + \langle \mathbf{w}, \mathbf{u} \rangle$. Applying symmetry again to each term gives $\langle \mathbf{u}, \mathbf{v} \rangle + \langle \mathbf{u}, \mathbf{w} \rangle$.
Source: Anthony & Harvey, Theorem 1.36, p. 25.
27. If a set of vectors is orthonormal, it is:
An orthonormal set consists of non-zero (norm 1) orthogonal vectors. A set of non-zero orthogonal vectors is always linearly independent. It only becomes a basis (a spanning set) if it contains the "right number" of vectors (equal to the dimension of the space).
Source: Anthony & Harvey, Theorem 10.14, p. 318.
28. The orthogonal complement of a line through the origin in $\mathbb{R}^3$ is:
The orthogonal complement of a subspace $W$, denoted $W^\perp$, contains all vectors orthogonal to every vector in $W$. In $\mathbb{R}^3$, all vectors orthogonal to a given line (a 1D subspace) form a plane (a 2D subspace) perpendicular to that line.
Source: Anton & Rorres, Section 6.2.
29. If $P$ is an orthogonal matrix, then $P^T$ is:
If $P$ is orthogonal, $P^T P = P P^T = I$. We need to check if $(P^T)^T (P^T) = I$. This simplifies to $P P^T = I$, which is true by the definition of an orthogonal matrix. Therefore, $P^T$ is also orthogonal.
Source: Anthony & Harvey, Activity 10.22, p. 321.
30. The Cauchy-Schwarz inequality is an equality if and only if:
Equality holds in the Cauchy-Schwarz inequality, $|\langle \mathbf{u}, \mathbf{v} \rangle| = ||\mathbf{u}|| ||\mathbf{v}||$, if and only if one vector is a scalar multiple of the other, meaning they are linearly dependent.
Source: Anton & Rorres, Section 6.1.
31. What is the result of normalizing the vector $\mathbf{v}=(0, 5, 0)$ in $\mathbb{R}^3$?
The norm is $||\mathbf{v}|| = \sqrt{0^2+5^2+0^2} = 5$. Normalizing gives $\frac{1}{5}(0, 5, 0) = (0, 1, 0)$.
Source: Anthony & Harvey, p. 315.
32. Two vectors $\mathbf{u}$ and $\mathbf{v}$ are orthogonal if $\langle \mathbf{u}, \mathbf{v} \rangle = 0$. This is a:
This is the formal definition of orthogonality in a general inner product space. It is motivated by the geometric property in $\mathbb{R}^2$ and $\mathbb{R}^3$ but is itself a definition.
Source: Anthony & Harvey, Definition 10.9, p. 317.
33. If $P$ is an $n \times n$ orthogonal matrix, then its rows form:
A key property of orthogonal matrices is that both their columns and their rows form an orthonormal basis for $\mathbb{R}^n$. This is because if $P$ is orthogonal, so is $P^T$, and the columns of $P^T$ are the rows of $P$.
Source: Anthony & Harvey, p. 321.
34. In the Gram-Schmidt process, if $\mathbf{v}_2$ is a scalar multiple of $\mathbf{v}_1$, what happens?
The Gram-Schmidt process requires a linearly independent set of vectors. If $\mathbf{v}_2 = k\mathbf{v}_1$, then $\mathbf{v}_2$ is already in the span of $\mathbf{v}_1$. Its projection onto the span of $\mathbf{v}_1$ is $\mathbf{v}_2$ itself, so $\mathbf{w}_2 = \mathbf{v}_2 - \text{proj}_{\mathbf{v}_1}(\mathbf{v}_2) = \mathbf{v}_2 - \mathbf{v}_2 = \mathbf{0}$.
Source: Anthony & Harvey, Section 10.4, p. 321.
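A short NumPy illustration of this collapse:

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = 3 * v1                          # deliberately a scalar multiple of v1

u1 = v1 / np.linalg.norm(v1)
w2 = v2 - (v2 @ u1) * u1             # subtract the projection onto u1
print(w2, np.allclose(w2, 0))        # essentially the zero vector: True
```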
35. The standard inner product on $\mathbb{C}^n$ is defined as $\langle \mathbf{u}, \mathbf{v} \rangle = \mathbf{u} \cdot \overline{\mathbf{v}} = u_1\overline{v_1} + \dots + u_n\overline{v_n}$. Why is the conjugate used?
The conjugate ensures that $\langle \mathbf{v}, \mathbf{v} \rangle = \sum v_i \overline{v_i} = \sum |v_i|^2$, which is a non-negative real number. Without the conjugate, $\langle \mathbf{v}, \mathbf{v} \rangle$ could be a complex number or even zero for a non-zero vector (e.g., $\langle (1, i), (1, i) \rangle = 1^2 + i^2 = 0$).
Source: Anthony & Harvey, Section 13.4.1, p. 401.
36. For vectors $\mathbf{u}, \mathbf{v}$ in an inner product space, which of the following is always true?
$||\mathbf{u}+\mathbf{v}|| = ||\mathbf{v}+\mathbf{u}||$ by commutativity of vector addition. $||\mathbf{u}-\mathbf{v}|| = ||-(\mathbf{v}-\mathbf{u})|| = |-1| \cdot ||\mathbf{v}-\mathbf{u}|| = ||\mathbf{v}-\mathbf{u}||$. The norm is defined as a square root, so it is always non-negative.
Source: Anton & Rorres, Theorem 4.1.4, p. 4.
37. If $\langle \mathbf{u}, \mathbf{v} \rangle = 0$ and $\langle \mathbf{v}, \mathbf{w} \rangle = 0$, does it imply $\langle \mathbf{u}, \mathbf{w} \rangle = 0$?
Orthogonality is not transitive. Consider $\mathbf{u}=(1,0,0), \mathbf{v}=(0,1,0), \mathbf{w}=(1,0,1)$ in $\mathbb{R}^3$. $\mathbf{u}$ is orthogonal to $\mathbf{v}$, and $\mathbf{v}$ is orthogonal to $\mathbf{w}$, but $\mathbf{u}$ is not orthogonal to $\mathbf{w}$.
Source: Conceptual understanding of orthogonality.
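The counterexample, verified with dot products:

```python
import numpy as np

u, v, w = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 0, 1])
print(u @ v, v @ w, u @ w)   # 0 0 1: u is orthogonal to v, and v to w, but u is not orthogonal to w
```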
38. The set of all vectors orthogonal to a subspace $W$ is called the...
This is the definition of the orthogonal complement, denoted $W^\perp$. It is itself a subspace.
Source: Anthony & Harvey, Definition 12.7, p. 367.
39. If $S = \{\mathbf{v}_1, \mathbf{v}_2\}$ is an orthogonal basis for a subspace $W$, the orthogonal projection of $\mathbf{u}$ onto $W$ is given by:
When the basis is orthogonal but not necessarily orthonormal, you must divide by the square of the norm of each basis vector when calculating the projection coefficients. If the basis were orthonormal, the denominators would be 1.
Source: Anton & Rorres, Theorem 6.3.5, p. 223.
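A minimal NumPy sketch of this formula (the helper name `proj` is illustrative):

```python
import numpy as np

def proj(u, basis):
    """Projection onto span(basis); assumes the basis vectors are
    mutually orthogonal (not necessarily unit length)."""
    return sum((u @ v) / (v @ v) * v for v in basis)

v1 = np.array([1.0, 0.0, 0.0])     # an orthogonal basis of the xy-plane
v2 = np.array([0.0, 1.0, 0.0])
u = np.array([2.0, 3.0, 4.0])
print(proj(u, [v1, v2]))           # [2. 3. 0.]
```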
40. The distance from a point $\mathbf{u}$ to a subspace $W$ is given by:
The distance from a point to a subspace is the length of the component of the vector that is orthogonal to the subspace. This vector is $\mathbf{u} - \text{proj}_W \mathbf{u}$.
Source: Anthony & Harvey, Theorem 12.30, p. 379.
41. If $A$ is an orthogonal matrix, then $A^T$ is...
By the definition of an orthogonal matrix, $A^{-1} = A^T$, so $A^T$ is the inverse of $A$ (and, as Question 29 shows, $A^T$ is itself orthogonal).
Source: Anthony & Harvey, Definition 10.15, p. 319.
42. If $\mathbf{u}$ and $\mathbf{v}$ are orthogonal, then $||\mathbf{u}-\mathbf{v}||^2$ equals:
$||\mathbf{u}-\mathbf{v}||^2 = \langle \mathbf{u}-\mathbf{v}, \mathbf{u}-\mathbf{v} \rangle = \langle \mathbf{u}, \mathbf{u} \rangle - 2\langle \mathbf{u}, \mathbf{v} \rangle + \langle \mathbf{v}, \mathbf{v} \rangle$. Since $\langle \mathbf{u}, \mathbf{v} \rangle = 0$, this simplifies to $||\mathbf{u}||^2 + ||\mathbf{v}||^2$. This is another form of Pythagoras' Theorem.
Source: Anthony & Harvey, Theorem 10.12, p. 317.
43. The process of creating a unit vector from a non-zero vector $\mathbf{v}$ is called:
Normalizing a vector means scaling it so that its length (norm) becomes 1, without changing its direction. This is done by dividing the vector by its own norm.
Source: Anthony & Harvey, p. 315.
44. If $A$ is an $m \times n$ matrix, its row space and nullspace are orthogonal complements in...
The row vectors and the vectors in the nullspace (solutions to $A\mathbf{x}=\mathbf{0}$) are both vectors with $n$ components, so they are subspaces of $\mathbb{R}^n$. The Fundamental Theorem of Linear Algebra states they are orthogonal complements.
Source: Anton & Rorres, Theorem 6.2.6, p. 206.
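A numerical check of this orthogonality (using SciPy's `null_space`; the matrix $A$ is an arbitrary example):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
N = null_space(A)               # columns span the nullspace of A
print(np.allclose(A @ N, 0))    # True: every row of A is orthogonal to the nullspace
```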
45. Let $\mathbf{u}=(1,1), \mathbf{v}=(1,-1)$ in $\mathbb{R}^2$. Are they orthogonal with respect to the standard inner product?
The inner product is $\langle \mathbf{u}, \mathbf{v} \rangle = (1)(1) + (1)(-1) = 1 - 1 = 0$. Since the inner product is zero, they are orthogonal.
Source: Anthony & Harvey, p. 317.
46. The zero vector is orthogonal to every vector in a vector space.
This is true because $\langle \mathbf{0}, \mathbf{v} \rangle = \langle 0\mathbf{v}, \mathbf{v} \rangle = 0\langle \mathbf{v}, \mathbf{v} \rangle = 0$ for any vector $\mathbf{v}$.
Source: Anthony & Harvey, Theorem 6.1.1, p. 191.
47. If $Q$ is an orthogonal matrix, then multiplying a vector $\mathbf{x}$ by $Q$ (i.e., $Q\mathbf{x}$) preserves its...
Multiplication by an orthogonal matrix is an isometry, which means it preserves lengths and angles. $||Q\mathbf{x}||^2 = \langle Q\mathbf{x}, Q\mathbf{x} \rangle = (Q\mathbf{x})^T(Q\mathbf{x}) = \mathbf{x}^T Q^T Q \mathbf{x} = \mathbf{x}^T I \mathbf{x} = \mathbf{x}^T\mathbf{x} = ||\mathbf{x}||^2$.
Source: Anton & Rorres, Section 6.6.
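A quick numerical check with a rotation matrix:

```python
import numpy as np

theta = 1.2
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation, hence orthogonal

x = np.array([3.0, -4.0])
print(np.linalg.norm(x), np.linalg.norm(Q @ x))   # both 5.0: the length is preserved
```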
48. The first step of the Gram-Schmidt process on a set $\{\mathbf{v}_1, \mathbf{v}_2, \dots\}$ is to set $\mathbf{u}_1 = \dots$
The process begins by creating the first vector of the new orthonormal basis. This is done by taking the first vector from the original set and normalizing it.
Source: Anthony & Harvey, p. 321.
49. The set of all vectors orthogonal to every vector in a subspace $W$ is denoted by:
This is the standard notation for the orthogonal complement of a subspace $W$. It is read as "W perp".
Source: Anthony & Harvey, Definition 12.7, p. 367.
50. If $A$ is an $m \times n$ matrix, the orthogonal complement of the row space of $A$ is the...
This is a statement of the Fundamental Theorem of Linear Algebra. The row space and nullspace are orthogonal complements in $\mathbb{R}^n$.
Source: Anton & Rorres, Theorem 6.2.6, p. 206.