# Linear Transformations

A linear transformation is a special kind of function: one that maps a vector in one vector space to a vector in another vector space while preserving vector addition and scalar multiplication. (Note: the two vector spaces can be the same vector space.)

Let $V$ and $W$ be vector spaces with ordered bases $\beta =\left\{{x}_{1},\dots ,{x}_{N}\right\}$ and $\gamma =\left\{{y}_{1},\dots ,{y}_{M}\right\}$, respectively.

Let $T$ be a linear transformation from $V$ into $W$:

$T:V\to W$

The transformation $T$ takes any vector $x$ in $V$ and maps it to a vector $T\left(x\right)$ in $W$:

$x\to T\left(x\right)$

For example, the vectors in the basis $\beta$ are mapped as follows:

${x}_{j}\to T\left({x}_{j}\right)$

The above expression maps the basis vector ${x}_{j}$ of $V$ to the vector $T\left({x}_{j}\right)$ in $W$. The vector $T\left({x}_{j}\right)$ can be written as a linear combination of the $M$ basis vectors ${y}_{1},\dots ,{y}_{M}$ in the basis $\gamma$ of $W$:

$T\left({x}_{j}\right)=\sum _{i=1}^{M}{a}_{ij}{y}_{i}$

Each basis vector ${y}_{i}$ contributes to the combination by an amount determined by the (unique) real number ${a}_{ij}$. Therefore, there are $M$ such numbers for each $T\left({x}_{j}\right)$ that corresponds to ${x}_{j}$.

The coefficients ${a}_{ij}$ in the linear combinations above form a matrix $A$ which we call the transformation matrix that represents $T$. Each column of this matrix is a coordinate vector that corresponds to a basis vector in the basis $\beta$ of the vector space $V$. More specifically, the $j$th column of this matrix represents the coordinate vector ${\left[T\left({x}_{j}\right)\right]}_{\gamma }$ of $T\left({x}_{j}\right)$ corresponding to the basis vector ${x}_{j}$:

$A=\left(\begin{array}{ccc}{\left[T\left({x}_{1}\right)\right]}_{\gamma }& \cdots & {\left[T\left({x}_{N}\right)\right]}_{\gamma }\end{array}\right)$

We also use the notation ${\left[T\right]}_{\beta }^{\gamma }$ to denote this matrix, in order to stress that it represents $T$ with respect to the ordered bases $\beta$ and $\gamma$.

$A={\left[T\right]}_{\beta }^{\gamma }$ and ${A}_{ij}={a}_{ij}$ where $1\le i\le M$ and $1\le j\le N$
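As a concrete sketch (a hypothetical example, not from the text), consider $T:\mathbb{R}^{2}\to \mathbb{R}^{3}$ with $T\left({x}_{1},{x}_{2}\right)=\left({x}_{1}+{x}_{2},{x}_{1},2{x}_{2}\right)$ and $\beta$, $\gamma$ taken to be the standard bases. Building $A$ column by column from the images of the basis vectors might look like this:

```python
import numpy as np

# Hypothetical linear transformation T : R^2 -> R^3,
# T(x1, x2) = (x1 + x2, x1, 2*x2).
def T(v):
    x1, x2 = v
    return np.array([x1 + x2, x1, 2.0 * x2])

# beta = {e_1, e_2}, the standard (ordered) basis of R^2.
beta = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# With gamma the standard basis of R^3, the coordinate vector of T(x_j)
# is just T(x_j) itself, so the j-th column of A is T(e_j).
A = np.column_stack([T(x_j) for x_j in beta])

print(A)  # a 3 x 2 matrix: M = 3 rows, N = 2 columns
```

Note that multiplying $A$ by the coordinate vector of any $x$ in $\mathbb{R}^{2}$ reproduces the coordinates of $T\left(x\right)$, which is exactly what it means for $A$ to represent $T$.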

The following are important facts worth remembering about the transformation matrix $A$.

Each column of $A$ represents a coordinate vector corresponding to a basis vector in $\beta$ of $V$. Namely, the $j$th column of $A$ represents the coordinate vector ${\left[T\left({x}_{j}\right)\right]}_{\gamma }$ of $T\left({x}_{j}\right)$ corresponding to the basis vector ${x}_{j}$ in the basis $\beta$ of $V$.

The transformation matrix $A$ has $N$ columns, because its $j$th column represents the coordinate vector of $T\left({x}_{j}\right)$ corresponding to the basis vector ${x}_{j}$ and there are $N$ such basis vectors.

The transformation matrix $A$ has $M$ rows, because its $j$th column represents the coordinate vector ${\left[T\left({x}_{j}\right)\right]}_{\gamma }$ of $T\left({x}_{j}\right)$ corresponding to the basis vector ${x}_{j}$ and the coordinate vector has $M$ elements.
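The row and column counts are easiest to see in an example where $M\ne N$. A hypothetical sketch (not from the text): differentiation $D:{P}_{3}\to {P}_{2}$, where ${P}_{n}$ is the space of polynomials of degree at most $n$, with monomial bases $\beta =\left\{1,t,{t}^{2},{t}^{3}\right\}$ ($N=4$) and $\gamma =\left\{1,t,{t}^{2}\right\}$ ($M=3$):

```python
import numpy as np

# D(t^j) = j * t^(j-1), so the coordinate vector [D(t^j)]_gamma has a
# single nonzero entry j in row j-1 (for j >= 1); D(1) = 0 gives a
# zero first column.
N, M = 4, 3
A = np.zeros((M, N))
for j in range(1, N):   # columns for t^1, t^2, t^3
    A[j - 1, j] = j

# A has N = 4 columns (one per basis vector in beta)
# and M = 3 rows (one per coordinate with respect to gamma).
```

Applying $A$ to the coefficient vector of $p\left(t\right)=1+2t+3{t}^{2}+4{t}^{3}$ yields the coefficients of ${p}^{\prime }\left(t\right)=2+6t+12{t}^{2}$, confirming the shape reasoning above.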