Matrix Multiplication, Part 2

This is part two of the “boring stuff.” I put that in quotes to evoke a sense of perspective. How boring can it be? We’re still doing math.

Suppose I have linear transformations T:U\to V and S:V\to W. Let \mathcal A, \mathcal B, and \mathcal C be bases for U, V, and W respectively. What happens when I compose T and S? I sure hope it turns out to be multiplying the matrices (S)_{\mathcal B\to \mathcal C} and (T)_{\mathcal A\to \mathcal B}.

Let \mathcal A=\{a_1,\dots,a_\ell\}, \mathcal B=\{b_1,\dots,b_m\}, and \mathcal C=\{c_1,\dots,c_n\}.

Let M=(S\circ T)_{\mathcal A\to \mathcal C}. I want to know what the entry M_{j,i} of this matrix is. So I want to find out what S\circ T does to the basis vector a_i and then look at the coefficient of c_j. That will give me the appropriate entry.
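As a reminder of the convention in play here (this is the same convention from part one): column i of (T)_{\mathcal A\to \mathcal B} stores the \mathcal B-coordinates of Ta_i, and likewise for S. In symbols:

Ta_i=\displaystyle\sum_{k=1}^m t_{k,i}b_k,\qquad Sb_k=\displaystyle\sum_{p=1}^n s_{p,k}c_p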

(S\circ T)a_i=S(Ta_i)=S\left(\displaystyle\sum_{k=1}^m t_{k,i}b_k\right)=\displaystyle\sum_{k=1}^m t_{k,i}\,S(b_k)=\sum_{k=1}^m t_{k,i}\left(\sum_{p=1}^n s_{p,k}c_p\right)

Of course, we only care about the coefficient of c_j, so we can disregard every term with p\neq j. This gives us:

M_{j,i}=\displaystyle\sum_{k=1}^m s_{j,k}t_{k,i}
And of course, this is exactly the (j,i) entry you would get if you multiplied the matrices (S)_{\mathcal B\to\mathcal C}\cdot(T)_{\mathcal A\to\mathcal B} the way you were taught.
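If you want a numerical sanity check, here is a quick sketch (assuming NumPy is available; S and T are just small random integer matrices standing in for the coordinate matrices, with column i holding the coordinates of the image of the i-th basis vector):

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: dim U = 2, dim V = 3, dim W = 4.
# Column i of T holds the B-coordinates of T(a_i);
# column k of S holds the C-coordinates of S(b_k).
T = rng.integers(-5, 5, size=(3, 2))
S = rng.integers(-5, 5, size=(4, 3))

# Entry (j, i) of the composition, computed from the sum derived above.
M = np.array([[sum(S[j, k] * T[k, i] for k in range(3))
               for i in range(2)]
              for j in range(4)])

# The entry-by-entry formula agrees with the usual matrix product.
assert np.array_equal(M, S @ T)
```

The assertion passes because the double sum is, entry by entry, the definition of the matrix product.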

Okay. I’m pretty sure we’re done with annoying computations for a while.

