Showing \( \left( A B \right) C = A \left( B C \right) \) is a relatively simple matter using the super duper summation convention.
First, the entry in the \( i \)th row and \( j \)th column of a matrix \( A \) is designated \( A_{ij} \), \( A^{i}_{j} \), or \( A^{ij} \).
Matrix multiplication is then defined as
$$
\left[ XY \right]_{ij} = X_{ik}Y_{kj}
$$
where square brackets have been placed around \( X Y \) to designate the \( i j \) element of the matrix \( X \cdot Y \).
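This index-notation definition maps directly onto NumPy's `einsum` (a sketch added here for illustration, not part of the original post): the repeated index \( k \) is summed over automatically, exactly as in the summation convention.

```python
import numpy as np

# [XY]_ij = X_ik Y_kj: the repeated index k is summed over.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))

XY_einsum = np.einsum("ik,kj->ij", X, Y)  # explicit index notation
XY_matmul = X @ Y                         # ordinary matrix product

# Both spellings compute the same matrix.
assert np.allclose(XY_einsum, XY_matmul)
```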
$$
\begin{aligned}
\left[ \left( A B \right) C \right]_{ij} & = \left( A B \right)_{ik} C_{kj} \\
& = A_{il} B_{lk} C_{kj} \\
& = A_{il} \left( B_{lk} C_{kj} \right) \\
& = A_{il} \left[ B C \right]_{lj} \\
& = \left[ A \left( B C \right) \right]_{ij} \\
\therefore \left( A B \right) C & = A \left( B C \right)
\end{aligned}
$$
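The chained expression \( A_{il} B_{lk} C_{kj} \) in the middle of the derivation can be checked numerically (a sanity check sketched here with NumPy, not part of the proof): one `einsum` over both repeated indices agrees with either parenthesization.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))

lhs = (A @ B) @ C                             # [(AB)C]_ij
rhs = A @ (B @ C)                             # [A(BC)]_ij
chained = np.einsum("il,lk,kj->ij", A, B, C)  # A_il B_lk C_kj

# Associativity: all three agree up to floating-point error.
assert np.allclose(lhs, rhs)
assert np.allclose(lhs, chained)
```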
Monday, October 31, 2011
Sunday, October 30, 2011
Matrix Inverses are unique
Left Inverse = Right Inverse
Let \( A \) be an \( n \times n \) matrix with \( AB = \mathbb{I} \), so that \( B = A^{-1} \). It follows that \( BA = \mathbb{I} \)...
$$
\begin{aligned}
AB & = \mathbb{I} \\
B \left( AB \right) & = B \mathbb{I} \\
\left( B A \right) B & = B \\
\therefore \left( B A \right) & = \mathbb{I}
\end{aligned}
$$
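A quick numeric illustration (sketched with NumPy, not part of the post): the matrix returned by `np.linalg.inv` acts as both a right and a left inverse, as the derivation above requires.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)
I = np.eye(2)

assert np.allclose(A @ B, I)  # B is a right inverse of A
assert np.allclose(B @ A, I)  # ...and also a left inverse
```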
Inverses Are Unique
Suppose another inverse of \( A \) existed, \( C \), so that \( AC = CA = \mathbb{I} \). It follows that \( B = C = A^{-1} \)...
$$
\begin{aligned}
A B & = \mathbb{I} \\
C \left( A B \right) & = C \mathbb{I} \\
\left( C A \right) B & = C \\
\left( \mathbb{I} \right) B & = C \\
\therefore B & = C
\end{aligned}
$$
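Uniqueness can be illustrated the same way (a NumPy sketch, not part of the proof): computing a "second" inverse \( C \) by a different route, solving \( AC = \mathbb{I} \), recovers exactly the same matrix as `np.linalg.inv`.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 1.0]])
B = np.linalg.inv(A)              # one inverse
C = np.linalg.solve(A, np.eye(2))  # "another" inverse, from A C = I

# The two routes necessarily produce the same matrix.
assert np.allclose(B, C)
```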