Matrix multiplication is defined precisely so that the expression \mathcal{M}(ST) = \mathcal{M}(S)\mathcal{M}(T) holds:

\begin{equation}
(AC)_{j,k} = \sum_{r=1}^{n} A_{j,r} C_{r,k}
\end{equation}
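As a quick numerical sanity check (a NumPy sketch; the maps `S` and `T` and the helper `matrix_of` are arbitrary examples of my own, not anything from the text), we can build \mathcal{M}(S), \mathcal{M}(T), and \mathcal{M}(ST) from the images of the standard basis vectors and confirm the identity:

```python
import numpy as np

def matrix_of(f, n):
    """Matrix of a linear map f: R^n -> R^m, built column-by-column
    from the images of the standard basis vectors."""
    return np.column_stack([f(e) for e in np.eye(n)])

# two example linear maps on R^2 (chosen arbitrarily for illustration)
S = lambda x: np.array([2 * x[0] + x[1], x[0] - x[1]])
T = lambda x: np.array([x[1], 3 * x[0]])

MS = matrix_of(S, 2)
MT = matrix_of(T, 2)
MST = matrix_of(lambda x: S(T(x)), 2)  # matrix of the composition ST

assert np.allclose(MST, MS @ MT)  # M(ST) == M(S) M(T)
```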
While matrix multiplication is distributive and associative, it is NOT commutative: in general, ST \neq TS.

memorization

It's always row-by-column: to get entry (j,k), move along row j of the first matrix and down column k of the second, multiply element-wise, and add ("row times column and add").

other ways of thinking about matrix multiplication

It is "row times column":

\begin{equation}
(AC)_{j,k} = A_{j, .} \cdot C_{., k}
\end{equation}

It is "matrix times columns":

\begin{equation}
(AC)_{., k} = A C_{., k}
\end{equation}

matrix as a linear combinator

Suppose A is an m-by-n matrix and c = \mqty(c_1\\ \vdots\\ c_{n}) is an n-by-1 matrix; then:

\begin{equation}
Ac = c_1 A_{., 1} + \dots + c_n A_{., n}
\end{equation}
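All three views, plus non-commutativity, can be checked directly (a NumPy sketch; the matrix shapes and entries are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 4))  # m-by-n
C = rng.integers(-5, 5, size=(4, 2))  # n-by-p

# "row times column": (AC)_{j,k} = A_{j,.} . C_{.,k}
entrywise = np.array([[A[j, :] @ C[:, k] for k in range(2)] for j in range(3)])
assert np.array_equal(A @ C, entrywise)

# "matrix times columns": (AC)_{.,k} = A C_{.,k}
columnwise = np.column_stack([A @ C[:, k] for k in range(2)])
assert np.array_equal(A @ C, columnwise)

# matrix as a linear combinator: Ac = c_1 A_{.,1} + ... + c_n A_{.,n}
c = rng.integers(-5, 5, size=4)
combo = sum(c[i] * A[:, i] for i in range(4))
assert np.array_equal(A @ c, combo)

# non-commutativity: even square matrices generally don't commute
X = np.array([[0, 1], [0, 0]])
Y = np.array([[0, 0], [1, 0]])
assert not np.array_equal(X @ Y, Y @ X)
```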
(i.e. you can use a vector to linearly combine the column vectors.)

linear maps are like matrix multiplication

\begin{equation}
\mathcal{M}(Tv) = \mathcal{M}(T)\mathcal{M}(v)
\end{equation}

"the matrix of the vector formed by applying some linear map T to v is the product of the matrix of T and the matrix of the vector v"

Proof: Let v_1, \dots, v_{n} be a basis of V, and write v = c_1 v_{1} + \dots + c_{n} v_{n}. So we have that Tv = c_1 Tv_{1} + \dots + c_{n} Tv_{n} by the additivity and homogeneity of T. Then, converting it all to matrices:

\begin{equation}
\mathcal{M}(Tv) = c_1 \mathcal{M}(T)_{., 1} + \dots + c_n \mathcal{M}(T)_{., n}
\end{equation}
because the columns of \mathcal{M}(T) represent where each basis vector gets taken in the target space, i.e. \mathcal{M}(Tv_k) = \mathcal{M}(T)_{., k}. You will notice now that c_1, \dots, c_{n} are exactly the entries of \mathcal{M}(v), and that \mathcal{M}(T)_{., 1}, \dots, \mathcal{M}(T)_{., n} are the columns of \mathcal{M}(T), so this linear combination is precisely a matrix-times-vector product in the "linear combinator" sense. So:

\begin{equation}
\mathcal{M}(Tv) = \mathcal{M}(T)\mathcal{M}(v)
\end{equation}
as desired. \blacksquare
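The proof can be mirrored numerically (a NumPy sketch; the example map T and vector v are arbitrary choices of my own): build \mathcal{M}(T) column-by-column from the images of the basis vectors, then check that applying T to v matches the matrix-vector product.

```python
import numpy as np

# an example linear map T: R^3 -> R^2 (chosen arbitrarily for illustration)
T = lambda x: np.array([x[0] + 2 * x[2], 4 * x[1] - x[0]])

# columns of M(T) are where each standard basis vector gets taken
MT = np.column_stack([T(e) for e in np.eye(3)])

v = np.array([3.0, -1.0, 2.0])  # M(v) is just v's coordinate column

# M(Tv) == M(T) M(v)
assert np.allclose(T(v), MT @ v)
```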