Linear independence means that no vector in the set can be expressed as a linear combination of the other vectors in the set (the homogeneous system has no free parameters such as $s$ or $t$).
Linear dependence means that some vector in the set can be expressed as a linear combination of the other vectors in the set.
Vectors $ \vec{u}_1, \dots, \vec{u}_k \in \mathbb{R}^m $ are linearly independent when the homogeneous system $ \left[ \vec{u}_1 \; \cdots \; \vec{u}_k \mid \vec{0} \right] $ has only the trivial solution, and linearly dependent when it has a nontrivial solution.
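As a quick numerical check of this definition (a minimal sketch assuming NumPy; the specific vectors are only placeholders), the system has only the trivial solution exactly when the matrix with the vectors as columns has rank $k$:

```python
import numpy as np

# Put the vectors u_1 ... u_k as the columns of an m x k matrix.
# These particular vectors are only an illustration.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
u3 = np.array([1.0, 1.0, 2.0])  # u3 = u1 + u2, so the set is dependent

A = np.column_stack([u1, u2, u3])
k = A.shape[1]

# c_1*u_1 + ... + c_k*u_k = 0 has only the trivial solution
# exactly when rank(A) equals the number of vectors k.
if np.linalg.matrix_rank(A) == k:
    print("linearly independent")
else:
    print("linearly dependent")  # printed here, since rank(A) = 2 < 3
```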
Theorem: Vectors $\vec{u}_1, \dots, \vec{u}_k \in \mathbb{R}^m$ are linearly dependent $\Leftrightarrow$ for some $i$, $1 \leq i \leq k$, $\vec{u}_i$ can be expressed as a linear combination of the other vectors.
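For example, with the illustrative pair
$$ \vec{u}_1 = \left[ \begin{array}{r} 1 \\ 2 \end{array} \right], \quad \vec{u}_2 = \left[ \begin{array}{r} -2 \\ -4 \end{array} \right] = -2\,\vec{u}_1, $$
the set $\{\vec{u}_1, \vec{u}_2\}$ is linearly dependent because $\vec{u}_2$ is a linear combination (here a scalar multiple) of $\vec{u}_1$.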
Proposition: Let $ \vec{u}_1, \dots, \vec{u}_k \in \mathbb{R}^m $. If $k > m$ (i.e., more columns than rows in the matrix $[\vec{u}_1 \; \cdots \; \vec{u}_k]$), then the vectors $ \vec{u}_1, \dots, \vec{u}_k $ are linearly dependent, because the homogeneous system has more unknowns than equations and therefore a free variable.
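For instance, any three vectors in $\mathbb{R}^2$ ($k = 3 > m = 2$) must be dependent, e.g.
$$ \left[ \begin{array}{r} 2 \\ 3 \end{array} \right] = 2 \left[ \begin{array}{r} 1 \\ 0 \end{array} \right] + 3 \left[ \begin{array}{r} 0 \\ 1 \end{array} \right]. $$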
$$ A = \left[ \begin{array}{rrrr} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \dots & a_{mn} \end{array} \right] = \left[ \begin{array}{r} R_1 \\ R_2 \\ \vdots \\ R_m \end{array} \right] = \left[ \begin{array}{rrrr} C_1 & C_2 & \dots & C_n \end{array} \right] $$
If m = n, then this is called a square matrix. If a square matrix has 1s on the main diagonal and 0s everywhere else, it is the identity matrix.
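A small sketch (assuming NumPy) of a square matrix and the identity, showing that multiplying by the identity leaves a matrix unchanged:

```python
import numpy as np

# A 3 x 3 matrix is square: m = n = 3.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# np.eye builds the identity: 1s on the main diagonal, 0s elsewhere.
I = np.eye(3)

# Multiplying by the identity leaves A unchanged.
print(np.array_equal(A @ I, A))  # True
```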
Note: the following operations are only defined when the two matrices have the same dimensions, i.e., the same $m$ and $n$.
Let A and B be $m \times n$ matrices.
$A + B$ is defined entrywise: every entry of A is added to the corresponding entry of B.
$$ A + B = [\, a_{ij} + b_{ij} \,], \quad 1 \leq i \leq m, \; 1 \leq j \leq n $$
Entrywise multiplication (the Hadamard product, written $A \circ B$) is defined similarly: every entry of A is multiplied by the corresponding entry of B. Note that this is not the standard matrix product $A \cdot B$, which is defined by rows times columns.
$$ A \circ B = [\, a_{ij} \cdot b_{ij} \,] $$
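A quick sketch of both entrywise operations (assuming NumPy, whose `+` and `*` operators act entrywise on arrays of the same shape, while `@` is the standard matrix product):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entrywise sum [a_ij + b_ij]:
print(A + B)   # [[ 6  8]
               #  [10 12]]

# Entrywise (Hadamard) product [a_ij * b_ij]:
print(A * B)   # [[ 5 12]
               #  [21 32]]

# For contrast, the standard matrix product (rows times columns):
print(A @ B)   # [[19 22]
               #  [43 50]]
```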