A vector has both magnitude and direction.
In [[linear algebra]], vectors are typically represented as column vectors like
$\vec v = \begin{pmatrix} 5 \\ 3 \end{pmatrix}$
where the vector $\vec v$ has components $5$ and $3$. We could draw this in 2-dimensional space as a line segment of length $\sqrt{5^2 + 3^2} = \sqrt{34}$ with a slope of $\frac35$. Note that the starting point of this segment is not something we care about; however, by convention a vector is plotted from "standard position", with its tail at the origin $(0, 0)$. #diagram
## algebraic operations on vectors
To sum two vectors, sum the corresponding coordinates.
$\begin{pmatrix} a \\ b \end{pmatrix} + \begin{pmatrix} c \\ d \end{pmatrix} = \begin{pmatrix} a+c \\ b+d \end{pmatrix}$
Multiply a vector by a scalar by multiplying the scalar by each coordinate.
$c \begin{pmatrix} a \\ b \end{pmatrix} = \begin{pmatrix} ca \\ cb \end{pmatrix}$
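A minimal sketch of both operations in Python (the helper names `vec_add` and `vec_scale` are hypothetical, not from any library):

```python
# Coordinate-wise vector addition and scalar multiplication,
# matching the two formulas above.

def vec_add(u, v):
    """Sum two vectors by adding corresponding coordinates."""
    return [a + b for a, b in zip(u, v)]

def vec_scale(c, v):
    """Multiply each coordinate of v by the scalar c."""
    return [c * x for x in v]

print(vec_add([5, 3], [2, -1]))  # [7, 2]
print(vec_scale(2, [5, 3]))      # [10, 6]
```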
## unit vectors
Another way to represent vectors is using unit vectors, which are vectors of magnitude $1$. Unit vectors are often (but not always) written with a hat, as in $\hat u$.
For a 2-dimensional space, we can define two unit vectors: $\hat i$, representing one unit of magnitude in the first dimension, and $\hat j$, representing one unit of magnitude in the second dimension.
$\begin{align} \hat i = \begin{bmatrix} 1 \\ 0 \end{bmatrix} && \hat j = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \end{align}$
Thus, any vector can be represented in terms of the unit vectors. For example,
$\vec v = 5 \hat i + 3 \hat j$
The unit vector in the direction of a vector $v$ can be calculated as
$u = \frac{1}{||v||}v$
where $||v||$ is the magnitude of the vector. The result $u$ points in the same direction as $v$ but has magnitude $1$.
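A small Python sketch of this normalization (the helper names are hypothetical):

```python
import math

def magnitude(v):
    """||v|| = sqrt(v . v)."""
    return math.sqrt(sum(x * x for x in v))

def unit_vector(v):
    """Scale v by 1/||v|| so the result has magnitude 1."""
    m = magnitude(v)
    return [x / m for x in v]

u = unit_vector([5, 3])
print(magnitude(u))  # 1.0 (up to floating-point error)
```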
## magnitude of a vector
The magnitude (norm) of a vector, denoted $||\vec v||$, is the square root of the dot product of the vector with itself:
$||\vec v|| = \sqrt{v \cdot v} = \sqrt{v_1^2 + v_2^2 + \dots + v_n^2}$
Note that in the two dimensional case, this is simply the Pythagorean Theorem. Thus you can think of the magnitude as the length of the vector (as the hypotenuse would be the length of the vector in 2d space).
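A quick worked example with the vector $\vec v = 5 \hat i + 3 \hat j$ from above:

$||\vec v|| = \sqrt{5^2 + 3^2} = \sqrt{34} \approx 5.83$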
## distance between two vectors
The distance between two vectors $u$ and $v$ is simply the magnitude of their difference.
$\text{dist}(u,v) = ||u - v||$
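A minimal Python sketch of this distance (the helper name `dist` is hypothetical):

```python
import math

def dist(u, v):
    """Distance between vectors: the magnitude of their difference."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Difference is [3, 4], so the distance is sqrt(9 + 16) = 5.
print(dist([5, 3], [2, -1]))  # 5.0
```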
## linear combinations of vectors
A linear combination of vectors is any combination like
$a_1 \vec v_1 + a_2 \vec v_2 + \dots + a_n \vec v_n$
for any series of real numbers $a_1, a_2, \dots, a_n$.
The **parallelogram rule** gives a geometric picture of vector addition: draw both vectors from the origin; their sum is the diagonal of the parallelogram they form. Equivalently, you can shift one vector so its tail sits at the head of the other; the sum then runs from the origin to the shifted vector's head.
Solving a linear combination of vectors involves finding values for the coefficients $a_i$ such that the linear combination equals a given vector. Use Gauss-Jordan elimination on the augmented matrix to solve the resulting system of linear equations for the $a_i$.
## span
Span refers to the set of all vectors that can be created through a linear combination of a set of vectors. One might ask whether a vector $v_3$ is in the span of vectors $v_1, v_2$, which is to say: is there some linear combination of $v_1, v_2$ that results in $v_3$?
Given vectors
$v_1 = \begin{bmatrix} 1 \\ -2 \\ 5 \end{bmatrix}, \ v_2 = \begin{bmatrix} 5 \\ -13 \\ -3 \end{bmatrix} , \ b = \begin{bmatrix} -3 \\ 8 \\ 1 \end{bmatrix}$
is $b$ in the span of $v_1, v_2$?
The span of $v_1, v_2$ is
$\text{span} \{v_1, v_2 \} = \{ a_1v_1 + a_2v_2 \mid a_1, a_2 \in \mathbb{R} \}$
If the system
$a_1v_1 + a_2v_2 = b$
has no solution, $b$ is not in the span of $v_1, v_2$; if a solution exists, it is.
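This check can be sketched in Python by row-reducing the augmented matrix $[v_1 \ v_2 \mid b]$ with Gauss-Jordan elimination and looking for an inconsistent row (the helper name `in_span` is hypothetical):

```python
# Sketch: b is in span{v1, v2, ...} iff the augmented system
# [v1 v2 ... | b] is consistent after row reduction.

def in_span(vectors, b, tol=1e-9):
    rows, cols = len(b), len(vectors)
    # Augmented matrix: columns are the vectors, last column is b.
    A = [[vectors[j][i] for j in range(cols)] + [b[i]] for i in range(rows)]
    piv = 0
    for col in range(cols):
        # Find a row with a nonzero entry in this column.
        pr = next((r for r in range(piv, rows) if abs(A[r][col]) > tol), None)
        if pr is None:
            continue
        A[piv], A[pr] = A[pr], A[piv]
        p = A[piv][col]
        A[piv] = [x / p for x in A[piv]]
        # Eliminate this column from all other rows.
        for r in range(rows):
            if r != piv:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[piv])]
        piv += 1
    # A row of the form [0 ... 0 | nonzero] means no solution exists.
    return not any(
        all(abs(x) < tol for x in row[:-1]) and abs(row[-1]) > tol for row in A
    )

v1 = [1, -2, 5]
v2 = [5, -13, -3]
b = [-3, 8, 1]
print(in_span([v1, v2], b))  # False: b is not in span{v1, v2}
```

(For this particular $b$, the first two equations force $a_2 = -\frac23$, $a_1 = \frac13$, which fails the third equation, so the system is inconsistent.)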
## linear independence
A set of vectors is linearly independent if a linear combination of all vectors in the set equals the zero vector only when every scalar coefficient is zero. If this is not the case, then at least one of the vectors can be written as a linear combination of the others, and thus the set is not independent.
Similarly, the columns of a matrix $A$ are linearly independent if the only solution to $Ax = 0$ is the trivial solution, all zeros. To determine whether the columns of a matrix are linearly independent, row-reduce the augmented matrix with $0$ as the right-hand side.
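Equivalently, the columns are independent when row reduction produces a pivot in every column. A minimal sketch (hypothetical helper names):

```python
# Sketch: columns are linearly independent iff row reduction yields a
# pivot in every column (so Ax = 0 has only the trivial solution).

def rank(rows, tol=1e-9):
    A = [list(r) for r in rows]
    m, n = len(A), len(A[0])
    piv = 0
    for col in range(n):
        pr = next((r for r in range(piv, m) if abs(A[r][col]) > tol), None)
        if pr is None:
            continue
        A[piv], A[pr] = A[pr], A[piv]
        for r in range(m):
            if r != piv:
                f = A[r][col] / A[piv][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[piv])]
        piv += 1
    return piv

def columns_independent(matrix):
    return rank(matrix) == len(matrix[0])

print(columns_independent([[1, 0], [0, 1], [2, 3]]))  # True
print(columns_independent([[1, 2], [2, 4]]))          # False: col 2 = 2 * col 1
```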
## linear transformation
A linear transformation is a function $T$ between vector spaces that preserves addition and scalar multiplication: $T(u + v) = T(u) + T(v)$ and $T(cu) = cT(u)$. Every linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$ can be written as multiplication by an $m \times n$ matrix, $T(x) = Ax$; when $A$ is not square, the transformation changes the dimension of the vector.
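A small Python sketch of applying a linear transformation as matrix-vector multiplication, using a 90° rotation in the plane as the (assumed) example:

```python
# Sketch: a linear transformation T(x) = Ax, here a rotation in R^2.

def mat_vec(A, x):
    """Apply the matrix A to the vector x: T(x) = Ax."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Rotation by 90 degrees counterclockwise.
R = [[0, -1],
     [1, 0]]
print(mat_vec(R, [5, 3]))  # [-3, 5]

# Linearity check: T(u + v) == T(u) + T(v)
u, v = [1, 2], [3, -1]
uv = [a + b for a, b in zip(u, v)]
assert mat_vec(R, uv) == [a + b for a, b in zip(mat_vec(R, u), mat_vec(R, v))]
```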
## dot product
To get the dot product, or inner product, of two vectors, transpose one vector and multiply:
$u \cdot v = v \cdot u = u^Tv = v^Tu = u_1v_1 + u_2v_2 + \dots + u_nv_n$
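In code, this is just the sum of coordinate-wise products (the helper name `dot` is hypothetical):

```python
def dot(u, v):
    """Inner product: sum of coordinate-wise products (u^T v)."""
    return sum(a * b for a, b in zip(u, v))

print(dot([5, 3], [2, -1]))  # 5*2 + 3*(-1) = 7
print(dot([2, -1], [5, 3]))  # commutative: also 7
```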
## orthogonal
Orthogonal vectors are perpendicular vectors. If $u, v$ are orthogonal then
- $u \cdot v=0$
- $||u + v||^2 = ||u||^2 + ||v||^2$
- in $R^2$ and $R^3$, $u \cdot v = ||u|| \ ||v|| \cos(\theta)$, where $\theta$ is the angle between the vectors (orthogonality gives $\cos(90°) = 0$)
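The first two properties can be checked numerically in Python (hypothetical helper names; the Pythagorean identity holds up to floating-point error):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

u, v = [3, 4], [-4, 3]
print(dot(u, v))  # 0 -> orthogonal

# ||u + v||^2 equals ||u||^2 + ||v||^2 (up to floating-point error)
s = [a + b for a, b in zip(u, v)]
print(norm(s) ** 2)               # ~50
print(norm(u) ** 2 + norm(v) ** 2)  # 50
```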
## orthogonal projection
The orthogonal projection $\hat y$ of a vector $y$ onto a vector $u$ is calculated by
$\hat y = \frac{y \cdot u}{u \cdot u} u$
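A minimal Python sketch of this formula (hypothetical helper names; the vectors are an assumed example):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(y, u):
    """Orthogonal projection of y onto the line through u."""
    c = dot(y, u) / dot(u, u)
    return [c * x for x in u]

y, u = [7, 6], [4, 2]
y_hat = project(y, u)
print(y_hat)  # [8.0, 4.0]

# The residual y - y_hat is orthogonal to u:
print(dot([a - b for a, b in zip(y, y_hat)], u))  # 0.0
```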
> [!Tip]- Additional Resources
> - [Linear algebra | Khan Academy](https://www.khanacademy.org/math/linear-algebra)