Let $V$ be a vector space and let $v, w \in V$ be distinct. Show that $(v, w)$ is linearly dependent if and only if $v$ and $w$ are multiples of each other.
We have two directions to prove.

"$\Rightarrow$":

Assume that $(v, w)$ is linearly dependent. Then there exist scalars $a, b$, not both zero, such that
\[ av + bw = 0. \]
If $a \neq 0$, then we may multiply both sides by $a^{-1}$ to get
\[ v = -\frac{b}{a}\, w. \]
Otherwise, $b \neq 0$, and we analogously get
\[ w = -\frac{a}{b}\, v, \]
so in both cases, $v$ and $w$ are multiples of each other.

"$\Leftarrow$":

Suppose $v$ and $w$ are multiples of each other. Without loss of generality, we assume that $v$ is a multiple of $w$, i.e., there exists a scalar $\lambda$ such that $v = \lambda w$. In the other case, we can swap $v$ and $w$ in the following proof. Rearranging $v = \lambda w$, we get
\[ 1 \cdot v + (-\lambda) w = 0. \]
Thus, the zero vector is a non-trivial linear combination of $v$ and $w$ since $1 \neq 0$, so $(v, w)$ is linearly dependent.
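As a numerical sanity check of part 1 (not part of the proof), we can test the equivalence in $\mathbb{R}^2$ with NumPy: a pair of vectors is linearly dependent exactly when the matrix having those vectors as columns has rank at most $1$. This is only a sketch, and the concrete vectors are arbitrary choices of mine.

```python
import numpy as np

# A pair that are multiples of each other: w = 3 * v.
v = np.array([1.0, 2.0])
w = 3 * v

# (v, w) is linearly dependent iff the matrix [v | w] has rank <= 1.
rank_dep = np.linalg.matrix_rank(np.column_stack([v, w]))
assert rank_dep <= 1

# Vectors that are not multiples of each other, e.g. the standard
# basis vectors, give a rank-2 matrix: a linearly independent pair.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
rank_indep = np.linalg.matrix_rank(np.column_stack([e1, e2]))
assert rank_indep == 2
```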
Let $\mathbb{F}$ be a field of characteristic not equal to two, let $V$ be a vector space over $\mathbb{F}$, and let $v, w \in V$. Show that $(v, w)$ is linearly independent if and only if $(v + w, v - w)$ is linearly independent.

"$\Rightarrow$":

Assume that $(v, w)$ is linearly independent. Let's show that $(v + w, v - w)$ is linearly independent. Suppose we have scalars $a, b \in \mathbb{F}$ such that
\[ a(v + w) + b(v - w) = 0. \]
We need to show that $a = b = 0$. Distributing and refactoring, we get
\[ (a + b) v + (a - b) w = 0, \]
but $(v, w)$ is linearly independent, so this forces
\[ a + b = 0, \qquad a - b = 0. \]
Adding the equations together and subtracting them, we get the system
\[ 2a = 0, \qquad 2b = 0. \]
Since $\mathbb{F}$ has characteristic different from $2$, we know $2 = 1 + 1 \neq 0$, so $2$ is invertible in $\mathbb{F}$. Thus, multiplying the equations by $2^{-1}$, we get $a = b = 0$.
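This direction can be spot-checked over $\mathbb{R}$ (characteristic $0 \neq 2$) with a short NumPy sketch: the pair $(v+w,\, v-w)$ arises from $(v, w)$ by an invertible change of coordinates, so both pairs span the same subspace and their ranks agree. The random vectors and the seed are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

for _ in range(100):
    v, w = rng.standard_normal(3), rng.standard_normal(3)
    rank_vw = np.linalg.matrix_rank(np.column_stack([v, w]))
    rank_pm = np.linalg.matrix_rank(np.column_stack([v + w, v - w]))
    # Over R, (v, w) and (v+w, v-w) have the same rank, so linear
    # independence of one pair forces it for the other.
    assert rank_pm == rank_vw
```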
""
Assume that is linearly independent. Now assume are such that . We need to show that .
In order to use the fact that , we need to turn into a linear combination of and . Let's work backwards: assume there were scalars such that
By comparing coefficients, we know that if we could solve
for and , then our idea works. Adding and subtracting, this system becomes
Like before, because has characteristic not equal to , we can divide by , so
With these choices of and , we have
So because is linearly independent, this forces the coefficients to be :
Solving like before, we get .
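The coefficient bookkeeping in this direction can be spot-checked with exact rational arithmetic in Python (a sketch; the helper name `solve_cd` and the test values are mine): given $a$ and $b$, the choices $c = \frac{a+b}{2}$ and $d = \frac{a-b}{2}$ really do satisfy $c + d = a$ and $c - d = b$.

```python
from fractions import Fraction

def solve_cd(a, b):
    # Dividing by 2 is legal because char F != 2; here we work over Q.
    c = Fraction(a + b, 2)
    d = Fraction(a - b, 2)
    return c, d

# Spot-check on a grid of integer coefficients.
for a in range(-3, 4):
    for b in range(-3, 4):
        c, d = solve_cd(a, b)
        assert c + d == a and c - d == b
```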
If $\mathbb{F}$ has characteristic $2$, then $1 = -1$, so $v - w = v + w$. This means that $(v + w, v - w) = (v + w, v + w)$, a list with a repeated vector, which can never be linearly independent. As a concrete example, let $\mathbb{F} = \mathbb{F}_2$ and $V = \mathbb{F}_2^2$. Then if $v = (1, 0)$ and $w = (0, 1)$, then $(v, w)$ is linearly independent, but
\[ (v + w, v - w) = ((1, 1), (1, 1)) \]
is linearly dependent, since $1 \cdot (1, 1) + 1 \cdot (1, 1) = (0, 0)$ in $\mathbb{F}_2^2$.
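The counterexample can be verified directly with arithmetic mod $2$ (a small Python sketch; representing vectors in $\mathbb{F}_2^2$ as tuples and the helpers `add` and `sub` are choices of mine):

```python
def add(u, v):
    # Coordinatewise sum in F_2^2: arithmetic is mod 2.
    return tuple((x + y) % 2 for x, y in zip(u, v))

def sub(u, v):
    # Coordinatewise difference in F_2^2; since 1 = -1, sub == add.
    return tuple((x - y) % 2 for x, y in zip(u, v))

v, w = (1, 0), (0, 1)

# In characteristic 2, v + w and v - w coincide ...
assert add(v, w) == sub(v, w) == (1, 1)
# ... so (v+w, v-w) repeats a vector and is linearly dependent:
# 1*(1,1) + 1*(1,1) = (0,0) with coefficients not both zero.
assert add(add(v, w), sub(v, w)) == (0, 0)
```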
This is almost identical to part 1, so I'll omit it.