MT2175: Complex Matrices and Vector Spaces

1
What are the three main forms of a complex number \(z\)?
A complex number \(z\) can be expressed in three forms:
  1. Cartesian Form: \(z = a + ib\), where \(a = \text{Re}(z)\) is the real part and \(b = \text{Im}(z)\) is the imaginary part.
  2. Polar Form: \(z = r(\cos\theta + i\sin\theta)\), where \(r = |z|\) is the modulus (or absolute value) and \(\theta = \arg(z)\) is the argument (or angle).
  3. Exponential Form: \(z = re^{i\theta}\), which is derived from the polar form using Euler's formula, \(e^{i\theta} = \cos\theta + i\sin\theta\).
Source: Subject Guide, 7.1; Anthony & Harvey, 13.1
2
Define the complex conjugate, modulus, and argument of a complex number \(z = a + ib\).
  • Complex Conjugate (\(\bar{z}\)): The conjugate of \(z = a + ib\) is \(\bar{z} = a - ib\). Geometrically, it is the reflection of \(z\) across the real axis.
  • Modulus (\(|z|\)): The modulus of \(z\) is a non-negative real number given by \(|z| = \sqrt{a^2 + b^2}\). It represents the distance from the origin to the point \((a, b)\) in the complex plane. Note that \(|z|^2 = z\bar{z}\).
  • Argument (\(\arg(z)\)): The argument \(\theta\) is the angle that the vector from the origin to \((a, b)\) makes with the positive real axis. It is found using \(\tan\theta = b/a\), adjusting the quadrant based on the signs of \(a\) and \(b\). The principal argument, \(\text{Arg}(z)\), is the unique angle in the interval \((-\pi, \pi]\).
Source: Subject Guide, 7.1.5; Anthony & Harvey, 13.1.1
3
State De Moivre's Formula.
For any real number \(\theta\) and any integer \(n\), De Moivre's Formula states: \[ (\cos\theta + i\sin\theta)^n = \cos(n\theta) + i\sin(n\theta) \] In exponential form, this is equivalent to \((e^{i\theta})^n = e^{in\theta}\). This formula is extremely useful for finding powers and roots of complex numbers.
Source: Subject Guide, 7.1.6; Anthony & Harvey, 13.1.4
4
How do you find the \(n\)-th roots of a complex number \(z = re^{i\theta}\)?
A complex number \(z\) has \(n\) distinct \(n\)-th roots. If \(z = r(\cos\theta + i\sin\theta)\), its \(n\)-th roots are given by: \[ w_k = \sqrt[n]{r} \left( \cos\left(\frac{\theta + 2k\pi}{n}\right) + i\sin\left(\frac{\theta + 2k\pi}{n}\right) \right) \] for \(k = 0, 1, 2, \dots, n-1\). Geometrically, the roots lie on a circle of radius \(\sqrt[n]{r}\) and are equally spaced by an angle of \(2\pi/n\).
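A quick numerical check of this formula (a minimal numpy sketch; the helper name nth_roots is ours, not from the guide):
```python
import numpy as np

def nth_roots(z, n):
    """All n distinct n-th roots of a non-zero complex number z."""
    r, theta = abs(z), np.angle(z)   # modulus and principal argument
    return [r**(1.0/n) * np.exp(1j*(theta + 2*np.pi*k)/n) for k in range(n)]

# Cube roots of -8i (compare with card 27): each w satisfies w**3 = -8i.
for w in nth_roots(-8j, 3):
    print(np.round(w, 6), np.round(w**3, 6))
```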
Source: Subject Guide, 7.1.3; Anthony & Harvey, 13.1.5
5
What is a complex vector space?
A complex vector space is a vector space where the scalars are complex numbers (from \(\mathbb{C}\)) instead of real numbers. It must satisfy the same ten vector space axioms as a real vector space, including closure under addition and scalar multiplication, associativity, commutativity, existence of a zero vector, additive inverses, and distributive properties. The space \(\mathbb{C}^n\) is a key example of an \(n\)-dimensional complex vector space.
Source: Subject Guide, 7.2; Anthony & Harvey, 13.2
6
What is a complex matrix?
A complex matrix is simply a matrix whose entries are complex numbers. All the standard matrix operations like addition, scalar multiplication (with complex scalars), and matrix multiplication apply to complex matrices in the same way they do to real matrices.
Source: Subject Guide, 7.3; Anthony & Harvey, 13.3
7
How is the standard complex inner product on \(\mathbb{C}^n\) defined?
For two vectors \(\mathbf{x} = (x_1, \dots, x_n)\) and \(\mathbf{y} = (y_1, \dots, y_n)\) in \(\mathbb{C}^n\), the standard complex inner product is defined as: \[ \langle \mathbf{x}, \mathbf{y} \rangle = \sum_{i=1}^n x_i \bar{y}_i = x_1\bar{y}_1 + x_2\bar{y}_2 + \dots + x_n\bar{y}_n \] This can also be written in matrix form as \(\mathbf{y}^*\mathbf{x}\), where \(\mathbf{y}^*\) is the conjugate transpose (Hermitian conjugate) of \(\mathbf{y}\).
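This convention needs care in numpy: np.vdot conjugates its first argument, so the guide's \(\langle \mathbf{x}, \mathbf{y} \rangle = \mathbf{y}^*\mathbf{x}\) corresponds to np.vdot(y, x). A small sketch:
```python
import numpy as np

x = np.array([1+1j, 2j])
y = np.array([3+0j, 1-1j])

# <x, y> = sum_i x_i * conj(y_i) = y* x; np.vdot conjugates its FIRST argument.
print(np.vdot(y, x))        # <x, y>  (here 1+5j)
print(np.vdot(x, y))        # <y, x>, the complex conjugate of the above
print(np.vdot(x, x).real)   # ||x||^2 = 6.0, a non-negative real number
```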
Source: Subject Guide, 7.4.1; Anthony & Harvey, 13.4.1
8
What are the defining properties of a complex inner product?
A complex inner product \(\langle \cdot, \cdot \rangle\) on a vector space \(V\) must satisfy the following properties for all \(\mathbf{x}, \mathbf{y}, \mathbf{z} \in V\) and \(\alpha \in \mathbb{C}\):
  1. Conjugate Symmetry: \(\langle \mathbf{x}, \mathbf{y} \rangle = \overline{\langle \mathbf{y}, \mathbf{x} \rangle}\)
  2. Linearity in the first argument: \(\langle \alpha\mathbf{x} + \mathbf{y}, \mathbf{z} \rangle = \alpha\langle \mathbf{x}, \mathbf{z} \rangle + \langle \mathbf{y}, \mathbf{z} \rangle\)
  3. Positive-definiteness: \(\langle \mathbf{x}, \mathbf{x} \rangle\) is a non-negative real number, and \(\langle \mathbf{x}, \mathbf{x} \rangle = 0 \iff \mathbf{x} = \mathbf{0}\).
Note that linearity is only in the first argument. For the second argument, it is conjugate-linear: \(\langle \mathbf{x}, \alpha\mathbf{y} \rangle = \bar{\alpha}\langle \mathbf{x}, \mathbf{y} \rangle\).
Source: Subject Guide, 7.4.2; Anthony & Harvey, 13.4
9
What is the Hermitian conjugate of a complex matrix \(A\)?
The Hermitian conjugate (or conjugate transpose) of a complex matrix \(A\), denoted \(A^*\), is obtained by taking the transpose of the matrix and then taking the complex conjugate of each entry. \[ A^* = \bar{A}^T \] For example, if \(A = \begin{pmatrix} 1+i & 2 \\ 3i & 4-i \end{pmatrix}\), then \(A^* = \begin{pmatrix} 1-i & -3i \\ 2 & 4+i \end{pmatrix}\).
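In numpy the Hermitian conjugate is simply .conj().T; a sketch reproducing the example above:
```python
import numpy as np

A = np.array([[1+1j, 2+0j],
              [3j,   4-1j]])
A_star = A.conj().T          # conjugate each entry, then transpose
print(A_star)                # [[1-i, -3i], [2, 4+i]], as computed above
```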
Source: Subject Guide, 7.5.1; Anthony & Harvey, 13.5
10
Define a Hermitian matrix and a Unitary matrix.
  • A square complex matrix \(A\) is Hermitian if it is equal to its own Hermitian conjugate, i.e., \(A = A^*\). The diagonal entries of a Hermitian matrix must be real.
  • A square complex matrix \(U\) is Unitary if its inverse is its Hermitian conjugate, i.e., \(U^*U = UU^* = I\). Unitary matrices are the complex analogue of real orthogonal matrices.
Source: Subject Guide, 7.5.2, 7.5.3; Anthony & Harvey, 13.5
11
Prove that the eigenvalues of a Hermitian matrix are real.
Let \(A\) be a Hermitian matrix (\(A = A^*\)) with eigenvalue \(\lambda\) and corresponding eigenvector \(\mathbf{v} \neq \mathbf{0}\). So, \(A\mathbf{v} = \lambda\mathbf{v}\).
Consider the inner product \(\langle A\mathbf{v}, \mathbf{v} \rangle\). \[ \langle A\mathbf{v}, \mathbf{v} \rangle = \langle \lambda\mathbf{v}, \mathbf{v} \rangle = \lambda \langle \mathbf{v}, \mathbf{v} \rangle = \lambda ||\mathbf{v}||^2 \] Also, using the property \(\langle A\mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{x}, A^*\mathbf{y} \rangle\), we have: \[ \langle A\mathbf{v}, \mathbf{v} \rangle = \langle \mathbf{v}, A^*\mathbf{v} \rangle = \langle \mathbf{v}, A\mathbf{v} \rangle = \langle \mathbf{v}, \lambda\mathbf{v} \rangle = \bar{\lambda} \langle \mathbf{v}, \mathbf{v} \rangle = \bar{\lambda} ||\mathbf{v}||^2 \] Equating the two expressions gives \(\lambda ||\mathbf{v}||^2 = \bar{\lambda} ||\mathbf{v}||^2\). Since \(\mathbf{v} \neq \mathbf{0}\), \(||\mathbf{v}||^2 \neq 0\), so we can divide by it to get \(\lambda = \bar{\lambda}\). This implies that \(\lambda\) must be a real number.
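A numerical illustration of the result (assuming numpy; eigvalsh is numpy's eigensolver specialised to Hermitian matrices, and it returns real eigenvalues):
```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])               # Hermitian: A equals A.conj().T
assert np.allclose(A, A.conj().T)
print(np.linalg.eigvalsh(A))           # [1. 3.] -- real, as the proof predicts
```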
Source: Subject Guide, 7.5.2; Anthony & Harvey, 13.5.1
12
Prove that eigenvectors of a Hermitian matrix corresponding to distinct eigenvalues are orthogonal.
Let \(A\) be a Hermitian matrix. Let \(\lambda_1\) and \(\lambda_2\) be two distinct eigenvalues (\(\lambda_1 \neq \lambda_2\)) with corresponding eigenvectors \(\mathbf{v}_1\) and \(\mathbf{v}_2\). We know \(A\mathbf{v}_1 = \lambda_1\mathbf{v}_1\) and \(A\mathbf{v}_2 = \lambda_2\mathbf{v}_2\). Consider \(\langle A\mathbf{v}_1, \mathbf{v}_2 \rangle\): \[ \langle A\mathbf{v}_1, \mathbf{v}_2 \rangle = \langle \lambda_1\mathbf{v}_1, \mathbf{v}_2 \rangle = \lambda_1 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle \] Also, \(\langle A\mathbf{v}_1, \mathbf{v}_2 \rangle = \langle \mathbf{v}_1, A^*\mathbf{v}_2 \rangle = \langle \mathbf{v}_1, A\mathbf{v}_2 \rangle = \langle \mathbf{v}_1, \lambda_2\mathbf{v}_2 \rangle = \bar{\lambda}_2 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle\). Since eigenvalues of a Hermitian matrix are real, \(\bar{\lambda}_2 = \lambda_2\). So, \(\lambda_1 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle = \lambda_2 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle\). \[ (\lambda_1 - \lambda_2) \langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0 \] Since \(\lambda_1 \neq \lambda_2\), we must have \(\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0\). Thus, the eigenvectors are orthogonal.
Source: Subject Guide, 7.5.2; Anthony & Harvey, 13.5.1
13
Prove that a complex matrix \(U\) is unitary if and only if its columns form an orthonormal basis of \(\mathbb{C}^n\).
Let \(U\) be an \(n \times n\) matrix with columns \(\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n\). The matrix \(U^*\) has rows \(\mathbf{u}_1^*, \mathbf{u}_2^*, \dots, \mathbf{u}_n^*\). The product \(U^*U\) is an \(n \times n\) matrix where the entry in the \(i\)-th row and \(j\)-th column is \((\mathbf{u}_i^* \mathbf{u}_j) = \langle \mathbf{u}_j, \mathbf{u}_i \rangle\). \[ (U^*U)_{ij} = \langle \mathbf{u}_j, \mathbf{u}_i \rangle \] The matrix \(U\) is unitary if and only if \(U^*U = I\). This is equivalent to the condition that the entries of the product matrix are \((U^*U)_{ij} = \delta_{ij}\) (the Kronecker delta). So, \(\langle \mathbf{u}_j, \mathbf{u}_i \rangle = 1\) if \(i=j\) and \(\langle \mathbf{u}_j, \mathbf{u}_i \rangle = 0\) if \(i \neq j\). This is precisely the definition of the set of vectors \(\{\mathbf{u}_1, \dots, \mathbf{u}_n\}\) being an orthonormal set. Since there are \(n\) such vectors in \(\mathbb{C}^n\), they form an orthonormal basis.
Source: Subject Guide, 7.5.3; Anthony & Harvey, 13.5.2
14
What does it mean to unitarily diagonalise a matrix?
A square complex matrix \(A\) is said to be unitarily diagonalisable if there exists a unitary matrix \(U\) such that \(U^*AU = D\), where \(D\) is a diagonal matrix.
The columns of the unitary matrix \(U\) are an orthonormal basis of eigenvectors for \(A\), and the diagonal entries of \(D\) are the corresponding eigenvalues.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
15
Define a normal matrix.
An \(n \times n\) complex matrix \(A\) is called normal if it commutes with its Hermitian conjugate. That is, \[ AA^* = A^*A \] This class of matrices is important because a matrix is unitarily diagonalisable if and only if it is normal.
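A minimal numerical test of the definition (assuming numpy; the helper name is_normal is ours):
```python
import numpy as np

def is_normal(A, tol=1e-12):
    """Does A commute with its Hermitian conjugate, up to rounding?"""
    A_star = A.conj().T
    return np.allclose(A @ A_star, A_star @ A, atol=tol)

print(is_normal(np.array([[1, 1j], [1j, 1]])))   # True  (see card 58)
print(is_normal(np.array([[1, 1], [0, 1]])))     # False (not normal)
```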
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
16
Prove that any Hermitian matrix is normal.
Let \(A\) be a Hermitian matrix. By definition, \(A = A^*\). We need to show that \(AA^* = A^*A\).
Starting with the left side: \[ AA^* = A(A) = A^2 \] Starting with the right side: \[ A^*A = (A)A = A^2 \] Since both sides are equal to \(A^2\), we have \(AA^* = A^*A\). Therefore, any Hermitian matrix is normal.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
17
Prove that any unitary matrix is normal.
Let \(U\) be a unitary matrix. By definition, \(UU^* = U^*U = I\). We need to show that \(UU^* = U^*U\).
This is true by the very definition of a unitary matrix. Both \(UU^*\) and \(U^*U\) are equal to the identity matrix \(I\), so they are equal to each other. Therefore, any unitary matrix is normal.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
18
State the Spectral Theorem for normal matrices.
A square complex matrix \(A\) is unitarily diagonalisable if and only if it is a normal matrix.
This is a fundamental result in linear algebra. It means that the set of matrices that can be diagonalised using a unitary transformation (which preserves lengths and angles) is precisely the set of normal matrices.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
19
What is the spectral decomposition of a normal matrix?
If \(A\) is a normal matrix, it is unitarily diagonalisable. Let \(\lambda_1, \dots, \lambda_n\) be its eigenvalues and \(\mathbf{u}_1, \dots, \mathbf{u}_n\) be a corresponding orthonormal basis of eigenvectors. The spectral decomposition of \(A\) is the expression: \[ A = \lambda_1 \mathbf{u}_1\mathbf{u}_1^* + \lambda_2 \mathbf{u}_2\mathbf{u}_2^* + \dots + \lambda_n \mathbf{u}_n\mathbf{u}_n^* \] This can be written as \(A = \sum_{i=1}^n \lambda_i E_i\), where \(E_i = \mathbf{u}_i\mathbf{u}_i^*\) is the matrix that represents the orthogonal projection onto the subspace spanned by the eigenvector \(\mathbf{u}_i\).
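A sketch of this decomposition in numpy, for a Hermitian (hence normal) example; eigh returns an orthonormal eigenbasis:
```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])
lam, U = np.linalg.eigh(A)       # eigenvalues and orthonormal eigenvectors

# A = sum_i lambda_i * u_i u_i^*, where E_i = u_i u_i^* is an outer product.
A_rebuilt = sum(lam[i] * np.outer(U[:, i], U[:, i].conj())
                for i in range(len(lam)))
print(np.allclose(A, A_rebuilt))   # True
```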
Source: Subject Guide, 7.7; Anthony & Harvey, 13.7
20
What are the properties of the projection matrices \(E_i\) in a spectral decomposition?
The matrices \(E_i = \mathbf{u}_i\mathbf{u}_i^*\) in the spectral decomposition of a normal matrix have the following properties:
  1. They are Hermitian: \(E_i^* = (\mathbf{u}_i\mathbf{u}_i^*)^* = (\mathbf{u}_i^*)^*\mathbf{u}_i^* = \mathbf{u}_i\mathbf{u}_i^* = E_i\).
  2. They are idempotent (they are projections): \(E_i^2 = E_i\).
  3. They are mutually orthogonal: \(E_i E_j = 0\) for \(i \neq j\).
  4. They sum to the identity: \(\sum_{i=1}^n E_i = I\).
Source: Subject Guide, 7.7; Anthony & Harvey, 13.7
21
How can you use the spectral decomposition of a normal matrix \(A\) to find a matrix \(B\) such that \(B^2 = A\)?
If \(A = \sum \lambda_i E_i\) is the spectral decomposition of \(A\), then a square root \(B\) can be found by taking square roots of the eigenvalues: \[ B = \sum \sqrt{\lambda_i} E_i \] where \(\sqrt{\lambda_i}\) can be either of the two complex square roots of \(\lambda_i\). When the eigenvalues are distinct and non-zero, this gives \(2^n\) possible square roots. Then, \(B^2 = (\sum \sqrt{\lambda_i} E_i)^2 = \sum (\sqrt{\lambda_i})^2 E_i^2 = \sum \lambda_i E_i = A\), using the properties \(E_i E_j = 0\) for \(i \neq j\) and \(E_i^2 = E_i\).
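A numerical sketch of one such square root (assuming numpy; np.sqrt on a complex array picks one of the two roots of each eigenvalue):
```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])                 # Hermitian, eigenvalues 1 and 3
lam, U = np.linalg.eigh(A)

roots = np.sqrt(lam.astype(complex))     # one choice of sqrt per eigenvalue
B = sum(roots[i] * np.outer(U[:, i], U[:, i].conj()) for i in range(len(lam)))
print(np.allclose(B @ B, A))             # True: B is a square root of A
```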
Source: Subject Guide, 7.7; Anthony & Harvey, 13.7
22
Outline the Gram-Schmidt process for a set of vectors \(\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}\) in a complex inner product space.
The Gram-Schmidt process transforms a linearly independent set \(\{\mathbf{v}_i\}\) into an orthonormal set \(\{\mathbf{u}_i\}\) that spans the same subspace; a short numerical sketch follows the steps below.
  1. Set \(\mathbf{w}_1 = \mathbf{v}_1\). Normalize to get \(\mathbf{u}_1 = \frac{\mathbf{w}_1}{||\mathbf{w}_1||}\).
  2. Project \(\mathbf{v}_2\) onto \(\mathbf{u}_1\) and subtract it: \(\mathbf{w}_2 = \mathbf{v}_2 - \langle \mathbf{v}_2, \mathbf{u}_1 \rangle \mathbf{u}_1\). Normalize to get \(\mathbf{u}_2 = \frac{\mathbf{w}_2}{||\mathbf{w}_2||}\).
  3. For \(\mathbf{v}_k\), subtract its projections onto all previous orthonormal vectors: \(\mathbf{w}_k = \mathbf{v}_k - \sum_{j=1}^{k-1} \langle \mathbf{v}_k, \mathbf{u}_j \rangle \mathbf{u}_j\). Normalize to get \(\mathbf{u}_k = \frac{\mathbf{w}_k}{||\mathbf{w}_k||}\).
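A minimal sketch of the process in numpy (the helper name gram_schmidt is ours; note that np.vdot(u, v) computes \(\langle \mathbf{v}, \mathbf{u} \rangle\) in the guide's convention):
```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise a linearly independent list of complex vectors."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for u in basis:
            w = w - np.vdot(u, v) * u          # subtract <v, u> u
        basis.append(w / np.linalg.norm(w))    # normalize
    return basis

# The worked example of card 67: span{(1, i, 0), (1, 2, 1+i)} in C^3.
u1, u2 = gram_schmidt([np.array([1, 1j, 0]), np.array([1, 2, 1+1j])])
print(np.round(np.vdot(u1, u2), 12))           # ~0: orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))  # 1.0 1.0
```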
Source: Subject Guide, 7.4; Anthony & Harvey, 13.4.2
23
What is the relationship between the determinant of a matrix and its eigenvalues?
The determinant of a square matrix \(A\) is the product of its eigenvalues (counting multiplicities). \[ \det(A) = \lambda_1 \lambda_2 \cdots \lambda_n \] This is true for both real and complex matrices. It can be seen by considering the characteristic polynomial \(p(\lambda) = \det(A - \lambda I)\). Setting \(\lambda=0\) gives \(p(0) = \det(A)\). Also, from the factored form \(p(\lambda) = (-1)^n(\lambda - \lambda_1)\cdots(\lambda - \lambda_n)\), setting \(\lambda=0\) gives \(p(0) = (-1)^n(-\lambda_1)\cdots(-\lambda_n) = \lambda_1\cdots\lambda_n\).
Source: Anthony & Harvey, 8.1.4
24
What is the relationship between the trace of a matrix and its eigenvalues?
The trace of a square matrix \(A\), denoted \(\text{tr}(A)\), is the sum of its diagonal entries. It is also equal to the sum of its eigenvalues (counting multiplicities). \[ \text{tr}(A) = \sum_{i=1}^n a_{ii} = \sum_{i=1}^n \lambda_i \] This can be shown by examining the coefficient of the \(\lambda^{n-1}\) term in the characteristic polynomial.
Source: Anthony & Harvey, 8.1.4
25
Prove that a matrix \(U\) is unitary if and only if it preserves the standard complex inner product.
We want to prove that \(U\) is unitary \(\iff \langle U\mathbf{x}, U\mathbf{y} \rangle = \langle \mathbf{x}, \mathbf{y} \rangle\) for all \(\mathbf{x}, \mathbf{y} \in \mathbb{C}^n\).
(\(\Rightarrow\)) Assume \(U\) is unitary, so \(U^*U = I\). Then \[ \langle U\mathbf{x}, U\mathbf{y} \rangle = (U\mathbf{y})^*(U\mathbf{x}) = (\mathbf{y}^*U^*) (U\mathbf{x}) = \mathbf{y}^*(U^*U)\mathbf{x} = \mathbf{y}^*I\mathbf{x} = \mathbf{y}^*\mathbf{x} = \langle \mathbf{x}, \mathbf{y} \rangle \] (\(\Leftarrow\)) Assume \(\langle U\mathbf{x}, U\mathbf{y} \rangle = \langle \mathbf{x}, \mathbf{y} \rangle\) for all \(\mathbf{x}, \mathbf{y}\). This means \(\mathbf{y}^*U^*U\mathbf{x} = \mathbf{y}^*\mathbf{x}\). Let \(\mathbf{y} = \mathbf{e}_i\) and \(\mathbf{x} = \mathbf{e}_j\) (standard basis vectors). Then \(\mathbf{e}_i^* (U^*U) \mathbf{e}_j = (U^*U)_{ij}\) and \(\mathbf{e}_i^*\mathbf{e}_j = \delta_{ij}\). Therefore, \((U^*U)_{ij} = \delta_{ij}\), which means \(U^*U = I\). So \(U\) is unitary.
Source: Subject Guide, 7.5.3; Anthony & Harvey, 13.5.2
26
If a matrix is unitarily diagonalisable, is it normal? Prove it.
Yes. If \(A\) is unitarily diagonalisable, there exists a unitary matrix \(U\) and a diagonal matrix \(D\) such that \(A = UDU^*\). We need to show \(AA^* = A^*A\). \[ A^* = (UDU^*)^* = (U^*)^*D^*U^* = UD^*U^* \] (Note: \(D\) is diagonal, so \(D^* = \bar{D}^T = \bar{D}\).) \[ AA^* = (UDU^*)(UD^*U^*) = UDD^*U^* \] \[ A^*A = (UDU^*)^*(UDU^*) = (UD^*U^*)(UDU^*) = UD^*DU^* \] Since diagonal matrices commute (\(DD^* = D^*D\)), we have \(AA^* = A^*A\). Thus, \(A\) is normal.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
27
Find the roots of the polynomial \(z^3 = -8i\).
First, write \(-8i\) in exponential form. The modulus is \(r=8\). The angle is \(\theta = -\pi/2\). So, \(-8i = 8e^{i(-\pi/2 + 2k\pi)}\). We want to solve \(z^3 = 8e^{i(-\pi/2 + 2k\pi)}\). The roots are: \[ z_k = \sqrt[3]{8} e^{i\frac{-\pi/2 + 2k\pi}{3}} = 2e^{i(\frac{-\pi}{6} + \frac{2k\pi}{3})} \] For \(k=0, 1, 2\):
  • \(k=0: z_0 = 2e^{-i\pi/6} = 2(\cos(-\pi/6) + i\sin(-\pi/6)) = 2(\frac{\sqrt{3}}{2} - \frac{1}{2}i) = \sqrt{3} - i\)
  • \(k=1: z_1 = 2e^{i(-\pi/6 + 2\pi/3)} = 2e^{i\pi/2} = 2(0+i) = 2i\)
  • \(k=2: z_2 = 2e^{i(-\pi/6 + 4\pi/3)} = 2e^{i7\pi/6} = 2(\cos(7\pi/6) + i\sin(7\pi/6)) = 2(-\frac{\sqrt{3}}{2} - \frac{1}{2}i) = -\sqrt{3} - i\)
Source: Subject Guide, 7.1.3; Anthony & Harvey, 13.1.5
28
Is the matrix \(A = \begin{pmatrix} 1 & 1+i \\ 1-i & 2 \end{pmatrix}\) Hermitian, unitary, or normal?
1. Hermitian? We check if \(A=A^*\). \[ A^* = \begin{pmatrix} 1 & \overline{1-i} \\ \overline{1+i} & 2 \end{pmatrix} = \begin{pmatrix} 1 & 1+i \\ 1-i & 2 \end{pmatrix} = A \] Yes, \(A\) is Hermitian.
2. Normal? Since all Hermitian matrices are normal, \(A\) is normal.
3. Unitary? We check if \(AA^* = I\). Since \(A=A^*\), we check \(A^2=I\). \[ A^2 = \begin{pmatrix} 1 & 1+i \\ 1-i & 2 \end{pmatrix} \begin{pmatrix} 1 & 1+i \\ 1-i & 2 \end{pmatrix} = \begin{pmatrix} 1+(1+i)(1-i) & 1+i+2(1+i) \\ 1-i+2(1-i) & (1-i)(1+i)+4 \end{pmatrix} = \begin{pmatrix} 3 & \dots \\ \dots & \dots \end{pmatrix} \neq I \] The \((1,1)\) entry is already \(3 \neq 1\), so \(A^2 \neq I\). No, \(A\) is not unitary.
Source: Subject Guide, 7.5, 7.6; Anthony & Harvey, 13.5, 13.6
29
What is the norm of the vector \(\mathbf{v} = (1, i, 1-i)\) in \(\mathbb{C}^3\)?
The norm \(||\mathbf{v}||\) is calculated as \(\sqrt{\langle \mathbf{v}, \mathbf{v} \rangle}\). \[ \langle \mathbf{v}, \mathbf{v} \rangle = v_1\bar{v}_1 + v_2\bar{v}_2 + v_3\bar{v}_3 \] \[ = (1)(\overline{1}) + (i)(\overline{i}) + (1-i)(\overline{1-i}) \] \[ = (1)(1) + (i)(-i) + (1-i)(1+i) \] \[ = 1 - i^2 + (1 - i^2) = 1 + 1 + (1+1) = 4 \] Therefore, the norm is \(||\mathbf{v}|| = \sqrt{4} = 2\).
Source: Subject Guide, 7.4.1; Anthony & Harvey, 13.4.1
30
Find an orthonormal basis for the subspace of \(\mathbb{C}^2\) spanned by \(\mathbf{v}_1 = (1, i)\).
The subspace is a line, so the orthonormal basis consists of a single unit vector in the direction of \(\mathbf{v}_1\). First, find the norm of \(\mathbf{v}_1\): \[ ||\mathbf{v}_1|| = \sqrt{\langle \mathbf{v}_1, \mathbf{v}_1 \rangle} = \sqrt{1\cdot\bar{1} + i\cdot\bar{i}} = \sqrt{1(1) + i(-i)} = \sqrt{1+1} = \sqrt{2} \] The orthonormal basis is given by the single vector \(\mathbf{u}_1\): \[ \mathbf{u}_1 = \frac{\mathbf{v}_1}{||\mathbf{v}_1||} = \frac{1}{\sqrt{2}}(1, i) = \left(\frac{1}{\sqrt{2}}, \frac{i}{\sqrt{2}}\right) \] The basis is \(\left\{ \left(\frac{1}{\sqrt{2}}, \frac{i}{\sqrt{2}}\right) \right\}\).
Source: Subject Guide, 7.4; Anthony & Harvey, 13.4.2
31
What are the eigenvalues of a unitary matrix \(U\)?
The eigenvalues \(\lambda\) of a unitary matrix \(U\) all have a modulus of 1, i.e., \(|\lambda| = 1\). They lie on the unit circle in the complex plane. Proof: Let \(\mathbf{v}\) be an eigenvector for \(\lambda\). Then \(U\mathbf{v} = \lambda\mathbf{v}\). Since unitary transformations preserve the norm, we have \(||U\mathbf{v}|| = ||\mathbf{v}||\). So, \(||\lambda\mathbf{v}|| = |\lambda| \cdot ||\mathbf{v}|| = ||\mathbf{v}||\). Since \(\mathbf{v} \neq \mathbf{0}\), we can divide by \(||\mathbf{v}||\) to get \(|\lambda| = 1\).
Source: Anthony & Harvey, 13.5.2
32
Unitarily diagonalise the matrix \(A = \begin{pmatrix} 2 & i \\ -i & 2 \end{pmatrix}\).
1. Check if normal: \(A^* = \begin{pmatrix} 2 & i \\ -i & 2 \end{pmatrix} = A\). Since A is Hermitian, it is normal.
2. Find eigenvalues: \(\det(A - \lambda I) = (2-\lambda)^2 - (-i)(i) = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda-3)(\lambda-1) = 0\). So \(\lambda_1=3, \lambda_2=1\).
3. Find eigenvectors: For \(\lambda_1=3\): \(A-3I = \begin{pmatrix} -1 & i \\ -i & -1 \end{pmatrix} \to \begin{pmatrix} 1 & -i \\ 0 & 0 \end{pmatrix}\). Eigenvector \(\mathbf{v}_1 = (i, 1)\). For \(\lambda_2=1\): \(A-I = \begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix} \to \begin{pmatrix} 1 & i \\ 0 & 0 \end{pmatrix}\). Eigenvector \(\mathbf{v}_2 = (-i, 1)\).
4. Orthonormalize: \(||\mathbf{v}_1|| = \sqrt{i(-i)+1(1)} = \sqrt{2}\). \(||\mathbf{v}_2|| = \sqrt{(-i)i+1(1)} = \sqrt{2}\). \(\mathbf{u}_1 = \frac{1}{\sqrt{2}}(i, 1)\), \(\mathbf{u}_2 = \frac{1}{\sqrt{2}}(-i, 1)\).
5. Form U and D: \(U = \frac{1}{\sqrt{2}}\begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix}\), \(D = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}\).
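A quick numerical check of steps 1-5 (assuming numpy):
```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])
U = np.array([[1j, -1j],
              [1,   1]]) / np.sqrt(2)

print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: U is unitary
print(np.round(U.conj().T @ A @ U, 10))         # diag(3, 1), matching D
```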
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
33
What is the main difference between diagonalising a real symmetric matrix and a complex Hermitian matrix?
The process is almost identical, but the type of transformation matrix used is different.
  • A real symmetric matrix \(A\) is diagonalised by a real orthogonal matrix \(P\) (where \(P^T = P^{-1}\)), such that \(P^TAP = D\).
  • A complex Hermitian matrix \(A\) is diagonalised by a unitary matrix \(U\) (where \(U^* = U^{-1}\)), such that \(U^*AU = D\).
The concept of an orthogonal matrix is the real-valued special case of a unitary matrix. The process in both cases involves finding an orthonormal basis of eigenvectors.
Source: Subject Guide, 7.6; Anthony & Harvey, 11.1, 13.6
34
If a normal matrix has real eigenvalues, what can you conclude?
If a normal matrix has real eigenvalues, then it must be a Hermitian matrix.
Proof Sketch: Since the matrix \(A\) is normal, it is unitarily diagonalisable, so \(A = UDU^*\) for some unitary \(U\) and diagonal \(D\). The diagonal entries of \(D\) are the eigenvalues of \(A\). If the eigenvalues are real, then \(D\) is a real matrix, which means \(D^* = D^T = D\). Now we check if \(A\) is Hermitian: \[ A^* = (UDU^*)^* = (U^*)^*D^*U^* = UDU^* = A \] Thus, \(A\) is Hermitian.
Source: Subject Guide, 7.6; Anthony & Harvey, 13.6
35
Find the spectral decomposition of \(A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}\).
1. Check Normal: \(A^* = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\). \(AA^* = I\), \(A^*A = I\). So A is normal (it is unitary).
2. Eigen-problem: Eigenvalues are \(\lambda_1=i, \lambda_2=-i\). Corresponding eigenvectors are \(\mathbf{v}_1=(1, i), \mathbf{v}_2=(1, -i)\).
3. Orthonormalize: \(||\mathbf{v}_1|| = \sqrt{2}, ||\mathbf{v}_2|| = \sqrt{2}\). So \(\mathbf{u}_1 = \frac{1}{\sqrt{2}}(1, i), \mathbf{u}_2 = \frac{1}{\sqrt{2}}(1, -i)\).
4. Find Projection Matrices \(E_i = \mathbf{u}_i\mathbf{u}_i^*\): \[ E_1 = \frac{1}{2}\begin{pmatrix} 1 \\ i \end{pmatrix}\begin{pmatrix} 1 & -i \end{pmatrix} = \frac{1}{2}\begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix} \] \[ E_2 = \frac{1}{2}\begin{pmatrix} 1 \\ -i \end{pmatrix}\begin{pmatrix} 1 & i \end{pmatrix} = \frac{1}{2}\begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix} \] 5. Decomposition: \(A = \lambda_1 E_1 + \lambda_2 E_2 = i E_1 - i E_2\). \[ A = \frac{i}{2}\begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix} - \frac{i}{2}\begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix} = \frac{1}{2}\begin{pmatrix} i & 1 \\ -1 & i \end{pmatrix} - \frac{1}{2}\begin{pmatrix} i & -1 \\ 1 & i \end{pmatrix} = \frac{1}{2}\begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \]
Source: Subject Guide, 7.7; Anthony & Harvey, 13.7
36
Can a non-symmetric real matrix be diagonalised?
Yes. A real matrix is diagonalisable if it has \(n\) linearly independent eigenvectors. It does not need to be symmetric.
However, a real matrix is orthogonally diagonalisable (i.e., with an orthogonal matrix \(P\) such that \(P^TAP=D\)) if and only if it is symmetric. This is a much stronger condition.
Source: Anthony & Harvey, 8.2, 11.1
37
What is the fundamental difference between a real inner product and a complex inner product?
The key difference lies in the symmetry property.
  • A real inner product is symmetric: \(\langle \mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{y}, \mathbf{x} \rangle\).
  • A complex inner product is conjugate symmetric (or Hermitian): \(\langle \mathbf{x}, \mathbf{y} \rangle = \overline{\langle \mathbf{y}, \mathbf{x} \rangle}\).
This change is necessary to ensure that the norm-squared, \(\langle \mathbf{x}, \mathbf{x} \rangle\), is a non-negative real number for complex vectors. If the inner product were symmetric and linear in both arguments, we would have \(\langle i\mathbf{v}, i\mathbf{v} \rangle = i^2\langle \mathbf{v}, \mathbf{v} \rangle = -||\mathbf{v}||^2\), which would violate the positive-definiteness property.
Source: Subject Guide, 7.4.2; Anthony & Harvey, 13.4
38
If \(A\) is a real matrix with a complex eigenvalue \(\lambda\) and eigenvector \(\mathbf{v}\), what can be said about \(\bar{\lambda}\) and \(\bar{\mathbf{v}}\)?
If \(A\) is a real matrix (\(A = \bar{A}\)) and \(A\mathbf{v} = \lambda\mathbf{v}\), then by taking the complex conjugate of the entire equation, we get: \[ \overline{A\mathbf{v}} = \overline{\lambda\mathbf{v}} \] \[ \bar{A}\bar{\mathbf{v}} = \bar{\lambda}\bar{\mathbf{v}} \] Since \(A\) is real, \(\bar{A} = A\), so: \[ A\bar{\mathbf{v}} = \bar{\lambda}\bar{\mathbf{v}} \] This shows that \(\bar{\lambda}\) is also an eigenvalue of \(A\), with corresponding eigenvector \(\bar{\mathbf{v}}\). Complex eigenvalues of real matrices always come in conjugate pairs.
Source: Anthony & Harvey, 13.3
39
What is the geometric interpretation of multiplying a complex number by \(i\)?
Multiplying a complex number \(z\) by \(i\) corresponds to rotating the vector representing \(z\) in the complex plane by \(90^\circ\) (or \(\pi/2\) radians) counter-clockwise.
This can be seen using the exponential form. Let \(z = re^{i\theta}\). Since \(i = e^{i\pi/2}\), the product is: \[ iz = (e^{i\pi/2})(re^{i\theta}) = re^{i(\theta + \pi/2)} \] The new vector has the same modulus \(r\) but its argument is increased by \(\pi/2\).
Source: Subject Guide, 7.1.4
40
Is the set of all \(2 \times 2\) Hermitian matrices a complex vector space?
No. The set of Hermitian matrices is not closed under multiplication by a general complex scalar.
Let \(A\) be a non-zero Hermitian matrix (\(A=A^*\)). Let \(\alpha = i\). We check if \(iA\) is Hermitian: \[ (iA)^* = \bar{i}A^* = (-i)A = -A \] For \(iA\) to be Hermitian, we would need \(iA = (iA)^* = -A\), which implies \((i+1)A = 0\). Since \(A\) is non-zero, this is not true.
However, the set of Hermitian matrices does form a real vector space, since it is closed under multiplication by real scalars.
Source: Anthony & Harvey, 13.5.1
41
What is the relationship between a unitary matrix and an orthogonal matrix?
A unitary matrix is the complex analogue of a real orthogonal matrix. A real matrix \(Q\) is orthogonal if \(Q^T Q = I\). A complex matrix \(U\) is unitary if \(U^* U = I\).
If a unitary matrix \(U\) happens to have only real entries, then its Hermitian conjugate \(U^*\) is the same as its transpose \(U^T\). In this case, the condition \(U^*U=I\) becomes \(U^TU=I\), which is the definition of an orthogonal matrix. Therefore, a real unitary matrix is an orthogonal matrix.
Source: Subject Guide, 7.5.3
42
If \(E\) is a projection matrix from a spectral decomposition, what is \(E\mathbf{v}\) if \(\mathbf{v}\) is in the eigenspace corresponding to \(E\)?
If \(E_i = \mathbf{u}_i\mathbf{u}_i^*\) is the projection matrix onto the eigenspace for \(\lambda_i\) (spanned by \(\mathbf{u}_i\)), and \(\mathbf{v}\) is in that eigenspace, then \(\mathbf{v} = c\mathbf{u}_i\) for some scalar \(c\). Then: \[ E_i\mathbf{v} = E_i(c\mathbf{u}_i) = c(E_i\mathbf{u}_i) = c(\mathbf{u}_i\mathbf{u}_i^*) \mathbf{u}_i \] Since \(\mathbf{u}_i^*\mathbf{u}_i = ||\mathbf{u}_i||^2 = 1\), this becomes: \[ c(\mathbf{u}_i)(1) = c\mathbf{u}_i = \mathbf{v} \] So, \(E_i\mathbf{v} = \mathbf{v}\). The projection of a vector already in the subspace is the vector itself.
Source: Subject Guide, 7.7
43
If \(E_i\) and \(E_j\) are projection matrices from a spectral decomposition with \(i \neq j\), what is \(E_i\mathbf{v}\) if \(\mathbf{v}\) is in the eigenspace for \(E_j\)?
If \(\mathbf{v}\) is in the eigenspace for \(E_j\), then \(\mathbf{v} = c\mathbf{u}_j\) for some scalar \(c\). Then: \[ E_i\mathbf{v} = E_i(c\mathbf{u}_j) = c(E_i\mathbf{u}_j) = c(\mathbf{u}_i\mathbf{u}_i^*) \mathbf{u}_j \] Since the eigenvectors \(\{\mathbf{u}_k\}\) form an orthonormal set, \(\mathbf{u}_i^*\mathbf{u}_j = \langle \mathbf{u}_j, \mathbf{u}_i \rangle = 0\) for \(i \neq j\). Therefore: \[ c(\mathbf{u}_i)(0) = \mathbf{0} \] So, \(E_i\mathbf{v} = \mathbf{0}\). The projection of a vector onto an orthogonal subspace is the zero vector.
Source: Subject Guide, 7.7
45
What is the algebraic and geometric multiplicity of the eigenvalues of \(A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}\)?
The characteristic equation is \(\det(A - \lambda I) = (2-\lambda)^2 = 0\). The only eigenvalue is \(\lambda=2\).
  • Algebraic Multiplicity: The eigenvalue \(\lambda=2\) is a root of multiplicity 2 in the characteristic polynomial. So, the algebraic multiplicity is 2.
  • Geometric Multiplicity: We find the dimension of the eigenspace \(N(A-2I)\). \(A-2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\). The null space is spanned by the vector \((1, 0)\). The dimension of the eigenspace is 1. So, the geometric multiplicity is 1.
Since the geometric multiplicity (1) is less than the algebraic multiplicity (2), the matrix is not diagonalisable.
Source: Anthony & Harvey, 8.3.3
46
If \(A\) is a normal matrix, are its eigenvectors corresponding to distinct eigenvalues orthogonal?
Yes. The proof is very similar to the Hermitian case. Let \(A\mathbf{v}_1 = \lambda_1\mathbf{v}_1\) and \(A\mathbf{v}_2 = \lambda_2\mathbf{v}_2\) with \(\lambda_1 \neq \lambda_2\). Consider \(\langle A\mathbf{v}_1, \mathbf{v}_2 \rangle = \lambda_1 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle\). Also, \(\langle A\mathbf{v}_1, \mathbf{v}_2 \rangle = \langle \mathbf{v}_1, A^*\mathbf{v}_2 \rangle\). We know \(A^*\mathbf{v}_2 = \bar{\lambda}_2\mathbf{v}_2\) (an eigenvector of \(A^*\)). So \(\langle \mathbf{v}_1, A^*\mathbf{v}_2 \rangle = \langle \mathbf{v}_1, \bar{\lambda}_2\mathbf{v}_2 \rangle = \lambda_2 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle\). Thus, \(\lambda_1 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle = \lambda_2 \langle \mathbf{v}_1, \mathbf{v}_2 \rangle\), which means \((\lambda_1 - \lambda_2) \langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0\). Since \(\lambda_1 \neq \lambda_2\), we must have \(\langle \mathbf{v}_1, \mathbf{v}_2 \rangle = 0\).
Source: Generalization of Hermitian proof
47
Express \(z = 1+i\) in polar and exponential forms.
Cartesian: \(z = 1+i\).
1. Modulus: \(r = |z| = \sqrt{1^2 + 1^2} = \sqrt{2}\).
2. Argument: The point \((1,1)\) is in the first quadrant. \(\tan\theta = 1/1 = 1\), so \(\theta = \pi/4\).
Polar Form: \[ z = \sqrt{2}(\cos(\pi/4) + i\sin(\pi/4)) \] Exponential Form: \[ z = \sqrt{2}e^{i\pi/4} \]
Source: Subject Guide, 7.1.5
48
Use De Moivre's formula to compute \((1+i)^8\).
First, convert \(1+i\) to exponential or polar form. From the previous card, \(1+i = \sqrt{2}e^{i\pi/4}\).
Now, apply De Moivre's formula: \[ (1+i)^8 = (\sqrt{2}e^{i\pi/4})^8 = (\sqrt{2})^8 (e^{i\pi/4})^8 \] \[ = 2^4 e^{i(8\pi/4)} = 16 e^{i2\pi} \] Since \(e^{i2\pi} = \cos(2\pi) + i\sin(2\pi) = 1 + 0i = 1\), the result is: \[ (1+i)^8 = 16(1) = 16 \]
Source: Subject Guide, 7.1.6
50
If \(A\) is a real and symmetric matrix, is it also Hermitian?
Yes.
A matrix is Hermitian if \(A = A^*\). The Hermitian conjugate \(A^*\) is defined as \(\bar{A}^T\).
If \(A\) is a real matrix, then all its entries are real, so taking the complex conjugate has no effect: \(\bar{A} = A\). Therefore, for a real matrix, \(A^* = A^T\).
If \(A\) is also symmetric, then \(A = A^T\).
Combining these, if \(A\) is real and symmetric, then \(A = A^T = A^*\). Thus, \(A\) is Hermitian.
Source: Subject Guide, 7.5.2
51
Is the product of two Hermitian matrices always Hermitian?
No, not always. Let \(A\) and \(B\) be Hermitian, so \(A=A^*\) and \(B=B^*\). We check the product \(AB\): \[ (AB)^* = B^*A^* = BA \] For \(AB\) to be Hermitian, we would need \((AB)^* = AB\). This means we need \(BA = AB\). The product of two Hermitian matrices is Hermitian if and only if the matrices commute.
Source: Anthony & Harvey, 13.5
52
If \(A\) is a normal matrix, is \(A^2\) also normal?
Yes. If \(A\) is normal, then \(AA^* = A^*A\). We need to check that \((A^2)(A^2)^* = (A^2)^*(A^2)\).
LHS: \((A^2)(A^2)^* = A^2(A^*)^2 = AAA^*A^*\). Using normality to swap adjacent factors \(AA^*\) and \(A^*A\): \[ AAA^*A^* = A(A^*A)A^* = (AA^*)(AA^*) = (A^*A)(A^*A) = A^*(AA^*)A = A^*(A^*A)A = (A^*)^2A^2 = (A^2)^*A^2 \] So \((A^2)(A^2)^* = (A^2)^*(A^2)\), and \(A^2\) is normal.
Source: Generalization of properties
53
What is the geometric meaning of the projection matrix \(E_i = \mathbf{u}_i\mathbf{u}_i^*\) from a spectral decomposition?
The matrix \(E_i\) is the orthogonal projection from \(\mathbb{C}^n\) onto the one-dimensional subspace spanned by the eigenvector \(\mathbf{u}_i\).
For any vector \(\mathbf{v} \in \mathbb{C}^n\), the vector \(E_i\mathbf{v}\) is the projection of \(\mathbf{v}\) onto the line defined by \(\mathbf{u}_i\). \[ E_i\mathbf{v} = (\mathbf{u}_i\mathbf{u}_i^*)\mathbf{v} = \mathbf{u}_i(\mathbf{u}_i^*\mathbf{v}) = \mathbf{u}_i \langle \mathbf{v}, \mathbf{u}_i \rangle = \langle \mathbf{v}, \mathbf{u}_i \rangle \mathbf{u}_i \] This is the standard formula for orthogonal projection of \(\mathbf{v}\) onto the unit vector \(\mathbf{u}_i\).
Source: Subject Guide, 7.7
54
Can a real matrix have complex eigenvectors?
Yes. A real matrix can have complex eigenvalues, and the eigenvectors corresponding to these complex eigenvalues will themselves be complex.
For example, the rotation matrix \(A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\) is real, but its eigenvalues are \(\pm i\) and its eigenvectors are \((1, -i)\) and \((1, i)\), which are complex.
Source: Subject Guide, 7.3; Anthony & Harvey, 13.3
55
If \(A\) is a real \(n \times n\) matrix, can \(\mathbb{C}^n\) have a basis of eigenvectors of \(A\) even if \(\mathbb{R}^n\) does not?
Yes. This happens when the matrix is not diagonalisable over \(\mathbb{R}\) but is diagonalisable over \(\mathbb{C}\).
For a matrix to be diagonalisable over a field, it must have a full set of \(n\) linearly independent eigenvectors in the vector space over that field.
A real matrix like \(A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\) has no real eigenvalues, so it has no eigenvectors in \(\mathbb{R}^2\) and cannot form a basis. However, it has two distinct complex eigenvalues \(\pm i\) and thus two linearly independent eigenvectors in \(\mathbb{C}^2\), which form a basis for \(\mathbb{C}^2\).
Source: Subject Guide, 7.3; Anthony & Harvey, 13.3
56
What is the result of \(\sum_{i=1}^n E_i\), where \(E_i\) are the projection matrices from a spectral decomposition?
The sum of the projection matrices from a spectral decomposition is the identity matrix: \[ \sum_{i=1}^n E_i = E_1 + E_2 + \dots + E_n = I \] This is because \(I = UU^*\). If \(U = [\mathbf{u}_1 | \dots | \mathbf{u}_n]\), then \(UU^* = \sum_{i=1}^n \mathbf{u}_i\mathbf{u}_i^* = \sum_{i=1}^n E_i\). This reflects the fact that projecting a vector onto every basis vector of an orthonormal basis and summing the results reconstructs the original vector.
Source: Subject Guide, 7.7
57
If \(A\) is a normal matrix with spectral decomposition \(A = \sum \lambda_i E_i\), what is the spectral decomposition of \(A^k\)?
Using the properties of the projection matrices \(E_i\), we have: \[ A^k = (\sum \lambda_i E_i)^k = \sum \lambda_i^k E_i \] This works because when expanding the power, all cross terms \(E_i E_j\) for \(i \neq j\) are zero, and \(E_i^k = E_i\) for \(k \ge 1\).
This provides a very efficient way to compute powers of a unitarily diagonalisable matrix.
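A sketch comparing the two ways of computing \(A^k\) (assuming numpy):
```python
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])
lam, U = np.linalg.eigh(A)
E = [np.outer(U[:, i], U[:, i].conj()) for i in range(len(lam))]

k = 5
A_k = sum(lam[i]**k * E[i] for i in range(len(lam)))      # sum_i lambda_i^k E_i
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))     # True
```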
Source: Subject Guide, 7.7
58
Is the matrix \(A = \begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}\) normal?
We check if \(AA^*=A^*A\). \[ A^* = \begin{pmatrix} 1 & -i \\ -i & 1 \end{pmatrix} \] \[ AA^* = \begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix} \begin{pmatrix} 1 & -i \\ -i & 1 \end{pmatrix} = \begin{pmatrix} 1-i^2 & -i+i \\ i-i & -i^2+1 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} \] \[ A^*A = \begin{pmatrix} 1 & -i \\ -i & 1 \end{pmatrix} \begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix} = \begin{pmatrix} 1-i^2 & i-i \\ -i+i & -i^2+1 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} \] Since \(AA^* = A^*A\), the matrix is normal.
Source: Subject Guide, 7.6
59
How can \(\mathbb{C}^n\) be viewed as a real vector space? What is its dimension in this case?
We can view \(\mathbb{C}^n\) as a real vector space by restricting the scalars to be real numbers (\(\mathbb{R}\)).
A vector \(\mathbf{v} \in \mathbb{C}^n\) can be written as \(\mathbf{v} = \mathbf{a} + i\mathbf{b}\) where \(\mathbf{a}, \mathbf{b} \in \mathbb{R}^n\). A basis for \(\mathbb{C}^n\) as a real vector space can be formed by taking the standard basis vectors of \(\mathbb{R}^n\) and their multiples by \(i\). For example, a basis for \(\mathbb{C}^2\) as a real vector space is \(\{(1,0), (i,0), (0,1), (0,i)\}\).
The dimension of \(\mathbb{C}^n\) as a real vector space is \(2n\).
Source: Subject Guide, 7.2
60
If \(A\) is an invertible complex matrix, what is \((A^*)^{-1}\) in terms of \((A^{-1})^*\)?
They are equal: \((A^*)^{-1} = (A^{-1})^*\).
Proof: We know \(AA^{-1} = I\). Taking the Hermitian conjugate of both sides: \[ (AA^{-1})^* = I^* \] \[ (A^{-1})^* A^* = I \] This equation shows, by definition of an inverse, that \((A^{-1})^*\) is the inverse of \(A^*\). Therefore, \((A^*)^{-1} = (A^{-1})^*\).
Source: Generalization of matrix properties
61
Find the cube roots of unity.
We want to solve \(z^3 = 1\). In exponential form, \(1 = 1 \cdot e^{i(0 + 2k\pi)}\). The roots are given by: \[ z_k = \sqrt[3]{1} e^{i\frac{2k\pi}{3}} \quad \text{for } k=0, 1, 2 \]
  • \(k=0: z_0 = e^{i0} = 1\)
  • \(k=1: z_1 = e^{i2\pi/3} = \cos(2\pi/3) + i\sin(2\pi/3) = -\frac{1}{2} + i\frac{\sqrt{3}}{2}\)
  • \(k=2: z_2 = e^{i4\pi/3} = \cos(4\pi/3) + i\sin(4\pi/3) = -\frac{1}{2} - i\frac{\sqrt{3}}{2}\)
These are often denoted \(1, \omega, \omega^2\).
Source: Subject Guide, 7.1.3
62
Is the set of \(n \times n\) unitary matrices a subspace of \(\mathbb{C}^{n \times n}\)?
No. It fails on multiple conditions.
  1. Zero Vector: The zero matrix is not unitary, since \(0^*0 = 0 \neq I\). A subspace must contain the zero vector.
  2. Closure under Addition: If \(U_1\) and \(U_2\) are unitary, \(U_1+U_2\) is generally not unitary. For example, \(I\) is unitary, but \(I+I=2I\) is not, since \((2I)^*(2I) = 4I \neq I\).
Therefore, the set of unitary matrices is not a subspace.
Source: Subject Guide, 7.5.3
63
If \(A\) is a normal matrix, show that \(||A\mathbf{x}|| = ||A^*\mathbf{x}||\) for any vector \(\mathbf{x}\).
We want to show \(||A\mathbf{x}||^2 = ||A^*\mathbf{x}||^2\). \[ ||A\mathbf{x}||^2 = \langle A\mathbf{x}, A\mathbf{x} \rangle = \langle \mathbf{x}, A^*A\mathbf{x} \rangle \] \[ ||A^*\mathbf{x}||^2 = \langle A^*\mathbf{x}, A^*\mathbf{x} \rangle = \langle \mathbf{x}, (A^*)^*A^*\mathbf{x} \rangle = \langle \mathbf{x}, AA^*\mathbf{x} \rangle \] Since \(A\) is normal, \(A^*A = AA^*\). Therefore, \[ \langle \mathbf{x}, A^*A\mathbf{x} \rangle = \langle \mathbf{x}, AA^*\mathbf{x} \rangle \] This implies \(||A\mathbf{x}||^2 = ||A^*\mathbf{x}||^2\), and since norms are non-negative, \(||A\mathbf{x}|| = ||A^*\mathbf{x}||\).
Source: Anthony & Harvey, 13.6
64
If \(A\) is a normal matrix and \(\mathbf{v}\) is an eigenvector of \(A\) with eigenvalue \(\lambda\), show that \(\mathbf{v}\) is also an eigenvector of \(A^*\).
We know \(A\mathbf{v} = \lambda\mathbf{v}\). We want to show \(A^*\mathbf{v} = \bar{\lambda}\mathbf{v}\). Consider the matrix \(B = A - \lambda I\). Since \(A\) is normal, \(B\) is also normal: \[ B^*B = (A^* - \bar{\lambda}I)(A - \lambda I) = A^*A - \lambda A^* - \bar{\lambda}A + |\lambda|^2 I \] \[ BB^* = (A - \lambda I)(A^* - \bar{\lambda}I) = AA^* - \bar{\lambda}A - \lambda A^* + |\lambda|^2 I \] Since \(AA^*=A^*A\), we have \(BB^*=B^*B\). From the previous card, we know \(||B\mathbf{v}|| = ||B^*\mathbf{v}||\). Since \(\mathbf{v}\) is an eigenvector of \(A\) for \(\lambda\), \(B\mathbf{v} = (A-\lambda I)\mathbf{v} = \mathbf{0}\). So \(||B\mathbf{v}||=0\). This means \(||B^*\mathbf{v}||=0\), which implies \(B^*\mathbf{v}=\mathbf{0}\). \[ (A^* - \bar{\lambda}I)\mathbf{v} = \mathbf{0} \implies A^*\mathbf{v} = \bar{\lambda}\mathbf{v} \] Thus, \(\mathbf{v}\) is an eigenvector of \(A^*\) with eigenvalue \(\bar{\lambda}\).
Source: Anthony & Harvey, 13.6
65
What is a skew-Hermitian matrix? What can you say about its eigenvalues?
A matrix \(A\) is skew-Hermitian if \(A^* = -A\).
The eigenvalues of a skew-Hermitian matrix are purely imaginary (or zero).
Proof: Let \(A\mathbf{v} = \lambda\mathbf{v}\). Then \(\langle A\mathbf{v}, \mathbf{v} \rangle = \lambda ||\mathbf{v}||^2\). Also, \(\langle A\mathbf{v}, \mathbf{v} \rangle = \langle \mathbf{v}, A^*\mathbf{v} \rangle = \langle \mathbf{v}, -A\mathbf{v} \rangle = \langle \mathbf{v}, -\lambda\mathbf{v} \rangle = -\bar{\lambda} \langle \mathbf{v}, \mathbf{v} \rangle = -\bar{\lambda} ||\mathbf{v}||^2\). So, \(\lambda ||\mathbf{v}||^2 = -\bar{\lambda} ||\mathbf{v}||^2\), which means \(\lambda = -\bar{\lambda}\). If \(\lambda = a+ib\), then \(a+ib = -(a-ib) = -a+ib\), which implies \(a = -a\), so \(a=0\). Thus \(\lambda = ib\) is purely imaginary.
Source: Anthony & Harvey, Problem 13.15
66
If \(A\) is a normal matrix, is \(A+I\) also normal?
Yes. We need to check if \((A+I)(A+I)^* = (A+I)^*(A+I)\). \[ (A+I)^* = A^* + I^* = A^* + I \] LHS: \((A+I)(A^*+I) = AA^* + A + A^* + I\).
RHS: \((A^*+I)(A+I) = A^*A + A^* + A + I\).
Since \(A\) is normal, \(AA^* = A^*A\). Therefore, the LHS and RHS are equal, and \(A+I\) is normal.
Source: Generalization of properties
67
Use the Gram-Schmidt process to find an orthonormal basis for the subspace of \(\mathbb{C}^3\) spanned by \(\mathbf{v}_1=(1,i,0)\) and \(\mathbf{v}_2=(1,2,1+i)\).
1. Normalize \(\mathbf{v}_1\): \(||\mathbf{v}_1||^2 = 1^2 + |i|^2 + 0^2 = 1+1=2 \implies ||\mathbf{v}_1|| = \sqrt{2}\). \(\mathbf{u}_1 = \frac{1}{\sqrt{2}}(1, i, 0)\).
2. Find \(\mathbf{w}_2\): \(\langle \mathbf{v}_2, \mathbf{u}_1 \rangle = \frac{1}{\sqrt{2}}(1\cdot\bar{1} + 2\cdot\bar{i} + (1+i)\cdot\bar{0}) = \frac{1-2i}{\sqrt{2}}\). \(\mathbf{w}_2 = \mathbf{v}_2 - \langle \mathbf{v}_2, \mathbf{u}_1 \rangle \mathbf{u}_1 = (1,2,1+i) - \frac{1-2i}{\sqrt{2}} \frac{1}{\sqrt{2}}(1,i,0) = (1,2,1+i) - \frac{1-2i}{2}(1,i,0)\) \(= (1,2,1+i) - (\frac{1-2i}{2}, \frac{i+2}{2}, 0) = (\frac{1+2i}{2}, \frac{2-i}{2}, 1+i)\).
3. Normalize \(\mathbf{w}_2\): \(||\mathbf{w}_2||^2 = \frac{5}{4} + \frac{5}{4} + 2 = \frac{18}{4} = \frac{9}{2} \implies ||\mathbf{w}_2|| = \frac{3}{\sqrt{2}}\). \(\mathbf{u}_2 = \frac{\sqrt{2}}{3} (\frac{1+2i}{2}, \frac{2-i}{2}, 1+i) = \frac{1}{3\sqrt{2}}(1+2i, 2-i, 2+2i)\). The basis is \(\{\mathbf{u}_1, \mathbf{u}_2\}\).
Source: Subject Guide, 7.4
68
If \(A\) is a real \(2 \times 2\) matrix with eigenvalues \(a \pm ib\), what is \(\det(A)\) and \(\text{tr}(A)\)?
The determinant is the product of the eigenvalues and the trace is the sum of the eigenvalues.
  • \(\det(A) = (a+ib)(a-ib) = a^2 - (ib)^2 = a^2 + b^2\). The determinant is a real number.
  • \(\text{tr}(A) = (a+ib) + (a-ib) = 2a\). The trace is a real number.
This is consistent with the fact that for a real matrix, both the determinant and trace must be real numbers.
Source: Anthony & Harvey, 8.1.4
69
Show that the product of two unitary matrices is unitary.
Let \(U\) and \(V\) be two \(n \times n\) unitary matrices. By definition, \(U^*U = I\) and \(V^*V = I\). We want to show that their product, \(UV\), is also unitary. We check if \((UV)^*(UV) = I\). \[ (UV)^*(UV) = (V^*U^*)(UV) \] Using associativity of matrix multiplication: \[ = V^*(U^*U)V = V^*(I)V = V^*V = I \] Thus, the product \(UV\) is unitary.
Source: Subject Guide, 7.5.3
70
If a matrix is diagonal, is it normal?
Yes. Let \(D\) be a diagonal matrix. Its entries are \(d_{ij} = 0\) for \(i \neq j\). The Hermitian conjugate \(D^*\) is the conjugate transpose. The transpose of a diagonal matrix is itself. So \(D^* = \bar{D}\), which is also a diagonal matrix with entries \(\bar{d}_{ii}\) on the diagonal. Since diagonal matrices commute, we have: \[ DD^* = D\bar{D} = \bar{D}D = D^*D \] The entry \((DD^*)_{ii} = d_{ii}\bar{d}_{ii} = |d_{ii}|^2\). The entry \((D^*D)_{ii} = \bar{d}_{ii}d_{ii} = |d_{ii}|^2\). All off-diagonal entries are zero. Therefore, any diagonal matrix is normal.
Source: Subject Guide, 7.6
71
Find a \(2 \times 2\) matrix that is normal but not Hermitian, unitary, or diagonal.
Consider \(A = \begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}\). We checked in a previous card that it is normal.
  • It is not diagonal.
  • It is not Hermitian, since \(A^* = \begin{pmatrix} 1 & -i \\ -i & 1 \end{pmatrix} \neq A\).
  • It is not unitary, since \(AA^* = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} \neq I\).
This matrix is normal but does not fall into the other categories.
Source: Subject Guide, 7.6
72
If \(A\) is a normal matrix, show that \(A\) and \(A^*\) have the same eigenvectors.
We have already proven this. Let \(\mathbf{v}\) be an eigenvector of a normal matrix \(A\) with eigenvalue \(\lambda\). We showed that \(||(A-\lambda I)\mathbf{v}|| = ||(A-\lambda I)^*\mathbf{v}||\). Since \((A-\lambda I)\mathbf{v} = \mathbf{0}\), it follows that \(||(A^*-\bar{\lambda}I)\mathbf{v}|| = 0\), which means \((A^*-\bar{\lambda}I)\mathbf{v} = \mathbf{0}\). This shows that \(A^*\mathbf{v} = \bar{\lambda}\mathbf{v}\). Therefore, \(\mathbf{v}\) is an eigenvector of \(A^*\) (with eigenvalue \(\bar{\lambda}\)). The set of eigenvectors is the same.
Source: Anthony & Harvey, 13.6
73
What is the main diagonalisability difference between a general real matrix and a general complex matrix?
The key difference is related to eigenvalues. By the Fundamental Theorem of Algebra, any \(n \times n\) complex matrix has \(n\) complex eigenvalues (counting multiplicities). A real matrix may not have any real eigenvalues.
This means a complex matrix is more 'likely' to be diagonalisable. A complex matrix is diagonalisable if and only if for every eigenvalue, the geometric multiplicity equals the algebraic multiplicity.
A real matrix can fail to be diagonalisable over \(\mathbb{R}\) simply by not having enough real eigenvalues, even if it would be diagonalisable over \(\mathbb{C}\).
Source: Subject Guide, 7.3
74
If \(A\) is a normal matrix and \(U\) is a unitary matrix, is \(U^*AU\) also normal?
Yes. Let \(B = U^*AU\). We check if \(BB^*=B^*B\). \[ B^* = (U^*AU)^* = U^*A^*(U^*)^* = U^*A^*U \] LHS: \(BB^* = (U^*AU)(U^*A^*U) = U^*A(UU^*)A^*U = U^*AIA^*U = U^*AA^*U\).
RHS: \(B^*B = (U^*A^*U)(U^*AU) = U^*A^*(UU^*)AU = U^*A^*IAU = U^*A^*AU\).
Since \(A\) is normal, \(AA^*=A^*A\), so the LHS and RHS are equal. Thus, \(B\) is normal: normality is invariant under unitary similarity.
Source: Generalization of properties
75
Find a \(2 \times 2\) matrix \(B\) such that \(B^3 = A = \begin{pmatrix} 8 & 0 \\ 0 & -i \end{pmatrix}\).
Since \(A\) is diagonal, it is normal. Its spectral decomposition is simple. The eigenvalues are \(\lambda_1=8, \lambda_2=-i\). The projection matrices are \(E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\) and \(E_2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}\). So \(A = 8E_1 - iE_2\). We want \(B = \lambda_1^{1/3}E_1 + \lambda_2^{1/3}E_2\). The cube roots of 8 are \(2, 2e^{i2\pi/3}, 2e^{i4\pi/3}\). Let's pick \(\sqrt[3]{8}=2\). The cube roots of \(-i = e^{-i\pi/2}\) are \(e^{-i\pi/6}, e^{i\pi/2}, e^{i7\pi/6}\). Let's pick \(\sqrt[3]{-i} = e^{i\pi/2} = i\). One possible matrix \(B\) is: \[ B = 2\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + i\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & i \end{pmatrix} \] Checking: \(B^3 = \begin{pmatrix} 2^3 & 0 \\ 0 & i^3 \end{pmatrix} = \begin{pmatrix} 8 & 0 \\ 0 & -i \end{pmatrix} = A\).
Source: Subject Guide, 7.7
76
What is the key takeaway from the Spectral Theorem?
The Spectral Theorem provides a complete classification of which matrices can be unitarily diagonalised. It states that this is possible if and only if the matrix is normal (\(AA^*=A^*A\)).
This is a powerful theoretical tool because it tells us exactly which class of matrices has the 'nice' property of possessing an orthonormal basis of eigenvectors. This includes Hermitian, skew-Hermitian, and unitary matrices as special cases.
Source: Subject Guide, 7.6
77
If \(z = 3e^{i\pi/3}\), what is \(z\) in Cartesian form?
We use Euler's formula: \(e^{i\theta} = \cos\theta + i\sin\theta\). \[ z = 3(\cos(\pi/3) + i\sin(\pi/3)) \] We know that \(\cos(\pi/3) = 1/2\) and \(\sin(\pi/3) = \sqrt{3}/2\). \[ z = 3\left(\frac{1}{2} + i\frac{\sqrt{3}}{2}\right) = \frac{3}{2} + i\frac{3\sqrt{3}}{2} \] So, the Cartesian form is \(\frac{3}{2} + i\frac{3\sqrt{3}}{2}\).
Source: Subject Guide, 7.1.6
78
If \(A\) is a real matrix, can it be normal without being symmetric?
Yes. For a real matrix \(A\), the normal condition \(AA^*=A^*A\) becomes \(AA^T=A^TA\).
A symmetric matrix (\(A=A^T\)) is always normal.
However, a real orthogonal matrix (\(A^T=A^{-1}\)) is also normal: \(AA^T = A A^{-1} = I\) and \(A^TA = A^{-1}A = I\).
The rotation matrix \(A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}\) for \(\theta \neq k\pi\) is orthogonal but not symmetric, and it is normal.
Source: Anthony & Harvey, 13.6
79
Is the diagonalisation of a complex matrix unique?
No, it is not unique.
  1. The order of eigenvalues in the diagonal matrix \(D\) can be changed.
  2. Changing the order of eigenvalues in \(D\) corresponds to changing the order of the corresponding eigenvector columns in the transformation matrix \(P\) (or \(U\)).
  3. Eigenvectors are not unique; any non-zero scalar multiple of an eigenvector is also an eigenvector. While this scalar is often chosen to normalize the vector in unitary diagonalisation, one could still multiply by a scalar of modulus 1 (like \(i\) or \(-1\)) to get a different orthonormal eigenvector.
Source: Anthony & Harvey, 8.2
80
What is the main challenge when diagonalising a matrix with repeated eigenvalues?
The main challenge is determining if there are enough linearly independent eigenvectors.
For a matrix to be diagonalisable, the geometric multiplicity (the dimension of the eigenspace) of each eigenvalue must be equal to its algebraic multiplicity (the multiplicity of the root in the characteristic polynomial).
If an eigenvalue \(\lambda\) is repeated \(k\) times, you must be able to find \(k\) linearly independent eigenvectors for that \(\lambda\). If you cannot, the matrix is not diagonalisable. For symmetric/Hermitian/normal matrices, this is guaranteed to be possible.
Source: Anthony & Harvey, 8.3
81
If \(A\) is a \(3 \times 3\) matrix with eigenvalues 1, 1, 5, what can you say about \(\det(A)\) and \(\text{tr}(A)\)?
The determinant is the product of the eigenvalues and the trace is the sum of the eigenvalues.
  • \(\det(A) = 1 \cdot 1 \cdot 5 = 5\)
  • \(\text{tr}(A) = 1 + 1 + 5 = 7\)
This holds regardless of whether the matrix is diagonalisable.
Source: Anthony & Harvey, 8.1.4
82
Let \(\mathbf{u}, \mathbf{v} \in \mathbb{C}^n\). Show that \(||\mathbf{u}+\mathbf{v}||^2 = ||\mathbf{u}||^2 + ||\mathbf{v}||^2\) if \(\mathbf{u}\) and \(\mathbf{v}\) are orthogonal. (Pythagoras' Theorem)
We expand \(||\mathbf{u}+\mathbf{v}||^2\) using the definition of the norm: \[ ||\mathbf{u}+\mathbf{v}||^2 = \langle \mathbf{u}+\mathbf{v}, \mathbf{u}+\mathbf{v} \rangle \] Using the properties of the inner product: \[ = \langle \mathbf{u}, \mathbf{u} \rangle + \langle \mathbf{u}, \mathbf{v} \rangle + \langle \mathbf{v}, \mathbf{u} \rangle + \langle \mathbf{v}, \mathbf{v} \rangle \] \[ = ||\mathbf{u}||^2 + \langle \mathbf{u}, \mathbf{v} \rangle + \overline{\langle \mathbf{u}, \mathbf{v} \rangle} + ||\mathbf{v}||^2 \] Since \(\mathbf{u}\) and \(\mathbf{v}\) are orthogonal, \(\langle \mathbf{u}, \mathbf{v} \rangle = 0\). Therefore, the middle two terms are zero. \[ ||\mathbf{u}+\mathbf{v}||^2 = ||\mathbf{u}||^2 + ||\mathbf{v}||^2 \]
Source: Anthony & Harvey, 13.4.2
83
If \(A\) is an \(n \times n\) matrix, how do you find its complex eigenvalues and eigenvectors?
The procedure is the same as for real eigenvalues and eigenvectors, but the arithmetic is done in \(\mathbb{C}\); a short numerical illustration follows the steps below.
  1. Find Eigenvalues: Solve the characteristic equation \(\det(A - \lambda I) = 0\) for \(\lambda\). By the Fundamental Theorem of Algebra, there will be \(n\) roots in \(\mathbb{C}\) (counting multiplicities).
  2. Find Eigenvectors: For each eigenvalue \(\lambda_i\), find the null space of the matrix \((A - \lambda_i I)\). Any non-zero vector in \(N(A - \lambda_i I)\) is an eigenvector corresponding to \(\lambda_i\). This involves solving a system of linear equations with complex coefficients.
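In practice, numpy's eig solves exactly this problem over \(\mathbb{C}\); a sketch using the rotation matrix of card 54:
```python
import numpy as np

A = np.array([[0., -1.],
              [1.,  0.]])          # real matrix with no real eigenvalues
lam, V = np.linalg.eig(A)          # numpy solves the eigenproblem over C
print(lam)                         # e.g. [0.+1.j 0.-1.j]
print(V[:, 0])                     # a complex eigenvector for the first lambda
```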
Source: Subject Guide, 7.3
84
What is the key difference between the Gram-Schmidt process in \(\mathbb{R}^n\) and \(\mathbb{C}^n\)?
The process is structurally identical, but the inner product used is different.
In \(\mathbb{R}^n\), the projection formula uses the real dot product: \(\text{proj}_{\mathbf{u}}(\mathbf{v}) = \frac{\mathbf{v} \cdot \mathbf{u}}{||\mathbf{u}||^2}\mathbf{u}\).
In \(\mathbb{C}^n\), the projection formula must use the complex inner product: \(\text{proj}_{\mathbf{u}}(\mathbf{v}) = \frac{\langle \mathbf{v}, \mathbf{u} \rangle}{||\mathbf{u}||^2}\mathbf{u}\).
Because the complex inner product is conjugate symmetric, the order matters: \(\langle \mathbf{v}, \mathbf{u} \rangle\) is not the same as \(\langle \mathbf{u}, \mathbf{v} \rangle\). The formula \(\mathbf{w}_k = \mathbf{v}_k - \sum_{j=1}^{k-1} \langle \mathbf{v}_k, \mathbf{u}_j \rangle \mathbf{u}_j\) must be used with care regarding the order of the vectors in the inner product.
Source: Subject Guide, 7.4
85
If \(A\) is a matrix that is both unitary and Hermitian, what can you say about its eigenvalues?
If \(A\) is Hermitian, its eigenvalues are real.
If \(A\) is unitary, its eigenvalues have a modulus of 1.
The only real numbers with a modulus of 1 are \(1\) and \(-1\).
Therefore, if a matrix is both unitary and Hermitian, its eigenvalues must be either 1 or -1.
Source: Subject Guide, 7.5
86
Is the sum of two normal matrices always normal?
No. Let \(A\) and \(B\) be normal matrices. We check whether \(A+B\) is normal: \[ (A+B)(A+B)^* = (A+B)(A^*+B^*) = AA^* + AB^* + BA^* + BB^* \] \[ (A+B)^*(A+B) = (A^*+B^*)(A+B) = A^*A + A^*B + B^*A + B^*B \] Since \(AA^*=A^*A\) and \(BB^*=B^*B\), the sum is normal only if \(AB^* + BA^* = A^*B + B^*A\). This is not true in general.
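A concrete counterexample (our own choice): let \(A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\) (Hermitian, hence normal) and \(B = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}\) (skew-Hermitian, hence normal). With \(M = A+B = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}\), \[ MM^* = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix} \neq \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} = M^*M, \] so \(A+B\) is not normal.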
Source: Generalization of properties
87
How do you find the orthogonal projection from \(\mathbb{C}^n\) onto a subspace spanned by an eigenvector \(\mathbf{u}_i\)?
The matrix that performs this projection is the \(i\)-th projection matrix \(E_i\) from the spectral decomposition. It is given by the outer product: \[ E_i = \mathbf{u}_i \mathbf{u}_i^* \] where \(\mathbf{u}_i\) is the normalized eigenvector (i.e. \(||\mathbf{u}_i||=1\)).
If the eigenvector \(\mathbf{v}_i\) is not normalized, the projection matrix is instead \(\frac{1}{||\mathbf{v}_i||^2} \mathbf{v}_i \mathbf{v}_i^*\).
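For instance, with the unit vector \(\mathbf{u} = \frac{1}{\sqrt{2}}(1, i)^T\), \[ E = \mathbf{u}\mathbf{u}^* = \frac{1}{2}\begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}, \] and one can check that \(E^2 = E\) and \(E^* = E\), as required of an orthogonal projection matrix.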
Source: Subject Guide, 7.7
88
If a matrix has a complete set of orthonormal eigenvectors, what kind of matrix must it be?
It must be a normal matrix.
The Spectral Theorem states that a matrix is unitarily diagonalisable if and only if it is normal. A matrix is unitarily diagonalisable if and only if it has an orthonormal basis of eigenvectors. Therefore, having a complete set of orthonormal eigenvectors is equivalent to being a normal matrix.
Source: Subject Guide, 7.6
89
Find \(i^{2025}\).
The powers of \(i\) cycle with a period of 4: \(i^1=i\), \(i^2=-1\), \(i^3=-i\), \(i^4=1\). To find \(i^{2025}\), we find the remainder of 2025 on division by 4: \[ 2025 = 4 \times 506 + 1 \] So \(2025 \equiv 1 \pmod{4}\). Therefore, \[ i^{2025} = i^1 = i \]
Source: Subject Guide, 7.1
90
If \(A\) is a normal matrix, is its inverse \(A^{-1}\) also normal (assuming it exists)?
Yes. If \(A\) is normal, \(AA^*=A^*A\). We want to check that \(A^{-1}(A^{-1})^* = (A^{-1})^*A^{-1}\). Using \((A^{-1})^* = (A^*)^{-1}\) and the identity \((BC)^{-1} = C^{-1}B^{-1}\): \[ A^{-1}(A^{-1})^* = A^{-1}(A^*)^{-1} = (A^*A)^{-1} = (AA^*)^{-1} = (A^*)^{-1}A^{-1} = (A^{-1})^*A^{-1} \] where the middle equality uses the normality of \(A\). This shows \(A^{-1}(A^{-1})^* = (A^{-1})^*A^{-1}\), so \(A^{-1}\) is normal.
Source: Generalization of properties
91
What is the main procedural difference between diagonalising a general complex matrix and unitarily diagonalising a normal matrix?
The main difference is the treatment of eigenvectors within each eigenspace.
  • For a general diagonalisation, you just need to find a basis of eigenvectors for each eigenspace. The collection of all these basis vectors must form a set of \(n\) linearly independent vectors.
  • For a unitary diagonalisation, you must find an orthonormal basis for each eigenspace. This often requires applying the Gram-Schmidt process to the basis vectors found for any eigenspace with dimension greater than 1.
Eigenvectors from different eigenspaces of a normal matrix are automatically orthogonal, so you only need to apply Gram-Schmidt within each eigenspace.
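A short NumPy check (our own sketch; the matrix below is an arbitrary example, not one from the guide) that a normal matrix is unitarily diagonalisable as \(A = UDU^*\):

```python
import numpy as np

# An example Hermitian (hence normal) matrix.
A = np.array([[2, 1j],
              [-1j, 2]])

# Normality check: AA* = A*A.
assert np.allclose(A @ A.conj().T, A.conj().T @ A)

# For Hermitian matrices, np.linalg.eigh returns real eigenvalues and
# an orthonormal set of eigenvectors (the columns of U).
eigenvalues, U = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
print(np.allclose(U @ D @ U.conj().T, A))      # True: A = U D U*
```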
Source: Subject Guide, 7.6
92
If \(A\) is a real skew-symmetric matrix (\(A^T = -A\)), what can you say about its eigenvalues?
A real skew-symmetric matrix is also a skew-Hermitian matrix, because if \(A\) is real, \(A^* = \bar{A}^T = A^T\). So \(A^* = -A\).
The eigenvalues of any skew-Hermitian matrix are purely imaginary or zero. Therefore, the eigenvalues of a real skew-symmetric matrix are purely imaginary or zero.
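For example (a small illustration of our own), \(A = \begin{pmatrix} 0 & 2 \\ -2 & 0 \end{pmatrix}\) is real skew-symmetric; its characteristic equation is \(\lambda^2 + 4 = 0\), giving the purely imaginary eigenvalues \(\lambda = \pm 2i\).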
Source: Anthony & Harvey, Problem 13.15
93
Let \(z = 2-2i\). Find \(|z|\) and \(\text{Arg}(z)\).
The complex number is \(z = 2-2i\). This corresponds to the point \((2, -2)\) in the complex plane.
  • Modulus: \(|z| = \sqrt{2^2 + (-2)^2} = \sqrt{4+4} = \sqrt{8} = 2\sqrt{2}\).
  • Argument: The point lies in the fourth quadrant, and the angle with the positive real axis satisfies \(\tan\theta = -2/2 = -1\). The principal argument in \((-\pi, \pi]\) is therefore \(\theta = -\pi/4\).
Source: Subject Guide, 7.1.5
94
If \(A\) is a normal matrix, is \(A - cI\) also normal for any \(c \in \mathbb{C}\)?
Yes. Let \(B = A - cI\). We check if \(BB^* = B^*B\). \[ B^* = (A - cI)^* = A^* - \bar{c}I^* = A^* - \bar{c}I \] LHS: \(BB^* = (A-cI)(A^*-\bar{c}I) = AA^* - cA^* - \bar{c}A + c\bar{c}I\).
RHS: \(B^*B = (A^*-\bar{c}I)(A-cI) = A^*A - cA^* - \bar{c}A + \bar{c}cI\).
Since \(A\) is normal (\(AA^*=A^*A\)) and \(c\bar{c}=\bar{c}c=|c|^2\), the LHS and RHS are equal. Thus, \(A-cI\) is normal.
Source: Generalization of properties
95
If \(A\) is unitarily diagonalisable, is \(A\) invertible?
Not necessarily. A matrix is invertible if and only if 0 is not one of its eigenvalues.
If \(A\) is unitarily diagonalisable, it means \(A = UDU^*\) where \(D\) contains the eigenvalues of \(A\). The determinant of \(A\) is the product of its eigenvalues: \[ \det(A) = \det(UDU^*) = \det(U)\det(D)\det(U^*) = \det(D) = \lambda_1\lambda_2\cdots\lambda_n \] If one of the eigenvalues is 0, then \(\det(A)=0\) and \(A\) is not invertible. For example, the zero matrix is normal and diagonal, but not invertible.
Source: Subject Guide, 7.6
96
What is the relationship between the column space of a complex matrix \(A\) and the null space of \(A^*\)?
They are orthogonal complements of each other in \(\mathbb{C}^m\) (where \(A\) is \(m \times n\)): \[ R(A)^{\perp} = N(A^*) \] This is a fundamental theorem of linear algebra. It means that every vector in the column space of \(A\) is orthogonal to every vector in the null space of its Hermitian conjugate.
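A small illustration (our own example): let \(A = \begin{pmatrix} 1 \\ i \end{pmatrix}\), so \(R(A) = \text{span}\{(1, i)^T\}\) and \(A^* = (1, -i)\). The null space \(N(A^*)\) consists of the vectors with \(x_1 - ix_2 = 0\), i.e. the multiples of \((i, 1)^T\), and indeed \(\langle (1, i)^T, (i, 1)^T \rangle = 1\cdot\bar{i} + i\cdot\bar{1} = -i + i = 0\).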
Source: Anthony & Harvey, 12.2.2 (generalized to complex case)
97
If \(A\) is a \(3 \times 3\) normal matrix with eigenvalues \(1, i, -i\), find the eigenvalues of \(A^2\).
If \(\lambda\) is an eigenvalue of \(A\) with eigenvector \(\mathbf{v}\), then \(A\mathbf{v} = \lambda\mathbf{v}\), and so \(A^2\mathbf{v} = A(A\mathbf{v}) = A(\lambda\mathbf{v}) = \lambda(A\mathbf{v}) = \lambda(\lambda\mathbf{v}) = \lambda^2\mathbf{v}\). So, if \(\lambda\) is an eigenvalue of \(A\), then \(\lambda^2\) is an eigenvalue of \(A^2\).
The eigenvalues of \(A^2\) are:
  • \(1^2 = 1\)
  • \(i^2 = -1\)
  • \((-i)^2 = -1\)
The eigenvalues of \(A^2\) are 1, -1, -1.
Source: Anthony & Harvey, 9.1
98
If \(A\) is Hermitian, is \(iA\) also Hermitian?
No. If \(A\) is Hermitian, \(A^*=A\). Check the conjugate transpose of \(iA\): \[ (iA)^* = \bar{i}A^* = (-i)A = -A \] For \(iA\) to be Hermitian we would need \(iA = -A\), i.e. \((i+1)A=0\), which forces \(A=0\) since \(i+1 \neq 0\). So \(iA\) is never Hermitian for a non-zero matrix \(A\).
In fact, if \(A\) is Hermitian, then \(iA\) is skew-Hermitian.
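For example, \(A = I\) is Hermitian, and \(iI\) satisfies \((iI)^* = -iI = -(iI)\), so \(iI\) is skew-Hermitian rather than Hermitian.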
Source: Anthony & Harvey, Problem 13.15
99
What is the key advantage of expressing a complex number in exponential form \(re^{i\theta}\) for multiplication and division?
Multiplication and division become much simpler.
  • Multiplication: To multiply two complex numbers, you multiply their moduli and add their arguments. \[ (r_1e^{i\theta_1})(r_2e^{i\theta_2}) = r_1r_2e^{i(\theta_1+\theta_2)} \]
  • Division: To divide two complex numbers, you divide their moduli and subtract their arguments. \[ \frac{r_1e^{i\theta_1}}{r_2e^{i\theta_2}} = \frac{r_1}{r_2}e^{i(\theta_1-\theta_2)} \]
This is much faster than working in Cartesian form, where multiplication requires expanding brackets and division requires multiplying through by the conjugate of the denominator.
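For instance, since \(1+i = \sqrt{2}\,e^{i\pi/4}\), we get \((1+i)^2 = 2e^{i\pi/2} = 2i\) immediately, with no Cartesian expansion needed.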
Source: Subject Guide, 7.1.5
100
How does the concept of an orthogonal matrix in \(\mathbb{R}^n\) translate to the complex case?
The concept of an orthogonal matrix in \(\mathbb{R}^n\) (where \(Q^TQ=I\)) is generalized to a unitary matrix in \(\mathbb{C}^n\).
The transpose operation is replaced by the Hermitian conjugate (conjugate transpose), denoted by \(*\).
A real matrix \(Q\) is orthogonal if \(Q^T = Q^{-1}\).
A complex matrix \(U\) is unitary if \(U^* = U^{-1}\).
Just as the columns of an orthogonal matrix form an orthonormal basis for \(\mathbb{R}^n\) with the real dot product, the columns of a unitary matrix form an orthonormal basis for \(\mathbb{C}^n\) with the complex inner product.
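For example, \(U = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}\) satisfies \(U^*U = I\), and its columns \(\frac{1}{\sqrt{2}}(1, i)^T\) and \(\frac{1}{\sqrt{2}}(1, -i)^T\) form an orthonormal basis of \(\mathbb{C}^2\) under the complex inner product.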
Source: Subject Guide, 7.5.3; Anthony & Harvey, 13.5.2