$\{s_n\}_{n \geq m}$ is monotone if it's increasing or decreasing.
A general trend in math is that monotone things tend to have nice properties, which is why we're interested in them. With sequences, we have the following theorem:
Theorem
All bounded monotone sequences converge.
As you may guess, a bounded increasing sequence converges to its supremum, and a bounded decreasing sequence converges to its infimum. This is a very useful tool when working with recursive sequences.
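For a quick sanity check, take $s_n = 1 - \frac{1}{n}$:

$$
s_1 \leq s_2 \leq s_3 \leq \cdots \leq 1
\qquad\text{and}\qquad
\lim_{n\to\infty}\left(1 - \frac{1}{n}\right) = 1 = \sup\{s_n \mid n \geq 1\},
$$

so the sequence is increasing and bounded, and it converges to its supremum, exactly as the theorem says.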
Example 1 (10.9).
Let $s_1 = 1$ and $s_{n+1} = \frac{n+1}{n+2}\, s_n^2$ for $n \geq 1$.
(1) Prove that $\lim_{n\to\infty} s_n$ exists.
(2) Prove that $\lim_{n\to\infty} s_n = 0$.
Solution.
You may be tempted to do the following (which is wrong!):
If we take limits on both sides, then we get the equation $s = s^2$, so $s = 0$ or $s = 1$. Since $s_n < 1$ for $n \geq 2$, this means $s \neq 1$, so $s = 0$.
The problem is the very first step: this fake proof is trying to apply the limit laws without knowing if the limits exist or not. This is why it's important to do (1) before doing (2).
Generally, when you're given a recursive sequence like this, you want to prove that it's bounded and monotone (usually by induction, though it's not always necessary). We want to figure out whether the sequence is increasing or decreasing, so a good way to start is to write out a few terms and take a guess:
$$
s_1 = 1, \qquad s_2 = \frac{2}{3}, \qquad s_3 = \frac{1}{3}.
$$
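Spelled out (with $s_4$ computed as well, for one more data point), these come from plugging into the recursion:

$$
s_2 = \frac{2}{3}\cdot 1^2 = \frac{2}{3}, \qquad
s_3 = \frac{3}{4}\cdot\left(\frac{2}{3}\right)^2 = \frac{3}{4}\cdot\frac{4}{9} = \frac{1}{3}, \qquad
s_4 = \frac{4}{5}\cdot\left(\frac{1}{3}\right)^2 = \frac{4}{45}.
$$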
Based on this, we're going to guess that $s_n$ is decreasing. Since anything squared is non-negative, we automatically get the lower bound $s_n \geq 0$. Next, we need to try to prove that the sequence is decreasing, i.e., we want to show that $s_{n+1} \leq s_n$. Generally, however, it's usually much easier to show that $s_{n+1} - s_n \leq 0$, so let's see what happens:

$$
s_{n+1} - s_n = \frac{n+1}{n+2}\,s_n^2 - s_n = s_n\left(\frac{n+1}{n+2}\,s_n - 1\right).
$$

The first factor is non-negative, and since $\frac{n+1}{n+2} \leq 1$, the second factor is at most $s_n - 1$. So if we can show that $s_n - 1 \leq 0$, then we will have shown that the sequence decreases, so let's try to do that by induction:
Base case ($n = 1$):
$s_1 = 1 \leq 1$, so the base case holds.
Inductive step:
Assume that $s_n \leq 1$. We need to show that $s_{n+1} \leq 1$ also:

$$
s_{n+1} = \frac{n+1}{n+2}\,s_n^2 \leq \frac{n+2}{n+2}\cdot 1^2 = 1.
$$
Thus, the inductive step holds, so $s_n \leq 1$ for all $n \geq 1$. This also proves that $s_{n+1} \leq s_n$ for all $n \geq 1$, which means that $\{s_n\}_n$ is a decreasing sequence bounded below, so it converges.
Now that we know that the sequence converges, we can take limits on both sides. Let $s = \lim_{n\to\infty} s_n$, so by our limit laws,

$$
s = \lim_{n\to\infty} s_{n+1} = \lim_{n\to\infty} \frac{n+1}{n+2}\,s_n^2 = 1 \cdot s^2 = s^2.
$$

Thus, $s(s-1) = 0$, which means $s = 0$ or $s = 1$. We saw that $s_n$ is decreasing, so $s_n \leq s_2 = \frac{2}{3}$ for all $n \geq 2$. This means that $s < 1$, so $s = 0$.
Constructing Sequences
This is an important technique and is best illustrated through an example.
Example 2.
Let $a \in \mathbb{R}$. Then there exists a sequence of rational numbers $\{q_n\}_n$ which decreases to $a$.
Solution.
The main idea is to use density of rationals over and over again, but in a careful way to get the properties we want. There are two things we want from our sequence:
(1) It gets closer and closer to $a$. For concreteness, we want $a \leq q_n \leq a + \frac{1}{n}$ for each $n \geq 1$, but you can replace $\frac{1}{n}$ with any other positive sequence that converges to $0$ (another good choice is $\frac{1}{2^n}$).
(2) It is a decreasing sequence.
To construct a sequence, we generally want to do it by induction, making sure that at each step, we pick a rational number that does both of these.
Base case ($n = 1$):
Since $\mathbb{Q}$ is dense in $\mathbb{R}$, there exists a rational $q_1 \in (a, a+1)$. To illustrate the inductive step, I will also do the case $n = 2$ by hand:
To accomplish (1), we want $q_2 \in (a, a + \frac{1}{2})$. However, we need to also guarantee that $q_2 \leq q_1$, so we also need to require that $q_2 \in (a, q_1)$. Let $\varepsilon = \min\{\frac{1}{2}, q_1 - a\}$, so $(a, a+\varepsilon)$ is a non-empty interval. Thus, because $\mathbb{Q}$ is dense, there exists a rational $q_2 \in (a, a+\varepsilon)$, and this satisfies both (1) and (2).
Inductive step:
Now suppose we have chosen $q_1, \ldots, q_n$ such that $q_k \in (a, a + \frac{1}{k})$ and $a \leq q_n \leq q_{n-1} \leq \cdots \leq q_1$. Let $\varepsilon = \min\{\frac{1}{n+1}, q_n - a\}$, so $(a, a+\varepsilon)$ is a non-empty interval. By density of $\mathbb{Q}$, there exists a rational $q_{n+1} \in (a, a+\varepsilon)$, and by construction,

$$
a \leq q_{n+1} \leq a + \varepsilon \leq a + \frac{1}{n+1}
\qquad\text{and}\qquad
a \leq q_{n+1} \leq a + (q_n - a) = q_n.
$$
Thus, the inductive step holds. By induction, we obtain a decreasing sequence $\{q_n\}_n$ such that $q_n \in (a, a + \frac{1}{n})$ for each $n \geq 1$, i.e., $a \leq q_n \leq a + \frac{1}{n}$. By the squeeze theorem, $\lim_{n\to\infty} q_n = a$, which completes the proof.
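To see the construction in action, here is one possible run with $a = 0$ (the particular rationals are just one valid set of choices):

$$
q_1 = \frac{1}{2} \in (0, 1), \qquad
q_2 = \frac{1}{3} \in \left(0, \min\{\tfrac{1}{2},\, q_1\}\right), \qquad
q_3 = \frac{1}{4} \in \left(0, \min\{\tfrac{1}{3},\, q_2\}\right), \qquad \ldots
$$

This run produces $q_n = \frac{1}{n+1}$, which is decreasing and satisfies $0 < q_n \leq \frac{1}{n}$, so it decreases to $0$ as promised.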
Cauchy Sequences
Definition
A sequence $\{s_n\}_n$ is Cauchy if for all $\varepsilon > 0$, there exists $N \in \mathbb{R}$ such that $|s_n - s_k| < \varepsilon$ whenever $n > N$ and $k > N$.
This definition looks a lot like the definition of a limit, but there is an important difference: there is no limit specified. Instead, we just put another term of the sequence in the absolute values.
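For instance, the sequence $s_n = \frac{1}{n}$ is Cauchy: given $\varepsilon > 0$, take $N = \frac{2}{\varepsilon}$, so that whenever $n, k > N$,

$$
\left|\frac{1}{n} - \frac{1}{k}\right| \leq \frac{1}{n} + \frac{1}{k} < \frac{1}{N} + \frac{1}{N} = \frac{2}{N} = \varepsilon.
$$

Notice that the limit $0$ never had to appear anywhere.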
You're probably thinking "what's the point?" which is a valid question. The point is the following theorem:
Theorem
A sequence is Cauchy if and only if it is convergent.
The "if and only if" tells you that the definition of Cauchy sequences and the definition of a convergent sequence describe the same thing. The key difference, though, is that in the definition of a Cauchy sequence, we don't have to know what the limit is. Generally, this is very useful when you do more abstract math since you have less information about the sequences you're working with.
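To get a feel for why, here is the easy direction (convergent implies Cauchy), which is just the triangle inequality; the converse is the substantial half of the theorem and relies on the completeness of $\mathbb{R}$. If $s = \lim_{n\to\infty} s_n$ and we choose $N$ so that $|s_n - s| < \frac{\varepsilon}{2}$ for all $n > N$, then

$$
|s_n - s_k| \leq |s_n - s| + |s - s_k| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon
\qquad\text{whenever } n, k > N.
$$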
Example 3.
Let's say we want to show that an infinite sum converges: $\sum_{n=1}^{\infty} s_n$, which is defined to be $\lim_{N\to\infty} \sum_{n=1}^{N} s_n$. We're already cheating here, since we're defining the infinite sum to be the limit of something that we don't even know exists.
We can't really use the definition of a limit since we don't know if the limit even exists, but if we use the Cauchy definition of a convergent sequence, then we're in good shape: we just need to show that the sequence $\left\{\sum_{n=1}^{N} s_n\right\}_N$ is Cauchy, and this is fine since every term in the sequence is a finite sum, so we're dealing with things that we know exist.
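Concretely, if we write $S_N = \sum_{n=1}^{N} s_n$ for the partial sums, then showing that $\{S_N\}_N$ is Cauchy amounts to showing: for every $\varepsilon > 0$ there is an $N_0$ such that

$$
|S_k - S_m| = \left|\sum_{n=m+1}^{k} s_n\right| < \varepsilon
\qquad\text{whenever } k > m > N_0.
$$

For example, with $s_n = \frac{1}{2^n}$ we have $\left|\sum_{n=m+1}^{k} \frac{1}{2^n}\right| = \frac{1}{2^m} - \frac{1}{2^k} < \frac{1}{2^m}$, which can be made smaller than any $\varepsilon$ by taking $m$ large, so $\sum_{n=1}^{\infty} \frac{1}{2^n}$ converges.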
Definition
Let $\{s_n\}_n$ be a sequence.
The limit superior of $\{s_n\}_n$ is

$$
\limsup_{n\to\infty} s_n = \lim_{N\to\infty} \sup\{s_n \mid n > N\}.
$$
The limit inferior of $\{s_n\}_n$ is

$$
\liminf_{n\to\infty} s_n = \lim_{N\to\infty} \inf\{s_n \mid n > N\}.
$$
Remark.
Let $u_N = \sup\{s_n \mid n > N\}$. Notice that $\{s_n \mid n > N+1\} \subseteq \{s_n \mid n > N\}$, so by the properties of the supremum,

$$
\sup\{s_n \mid n > N+1\} \leq \sup\{s_n \mid n > N\} \implies u_{N+1} \leq u_N,
$$

i.e., $\{u_N\}_N$ is a decreasing sequence. This tells you that $\limsup_{n\to\infty} s_n$, which is equal to $\lim_{N\to\infty} u_N$, always makes sense. If $\{u_N\}_N$ is bounded, then it converges by the earlier theorem. Otherwise, it diverges to $-\infty$. In the first case, this tells you that
$$
\limsup_{n\to\infty} s_n = \inf_N \sup\{s_n \mid n > N\}.
$$
Similarly,
$$
\liminf_{n\to\infty} s_n = \sup_N \inf\{s_n \mid n > N\}.
$$
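As a small worked example, take $s_n = \frac{1}{n}$. Then for each $N$,

$$
\sup\left\{\tfrac{1}{n} \,\middle|\, n > N\right\} = \frac{1}{N+1}
\qquad\text{and}\qquad
\inf\left\{\tfrac{1}{n} \,\middle|\, n > N\right\} = 0,
$$

so $\limsup_{n\to\infty} \frac{1}{n} = \inf_N \frac{1}{N+1} = 0$ and $\liminf_{n\to\infty} \frac{1}{n} = \sup_N 0 = 0$, both agreeing with $\lim_{n\to\infty} \frac{1}{n} = 0$.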
Intuitively, $\limsup_{n\to\infty} s_n$ is the "biggest thing a subsequence of $s_n$ can converge to" and $\liminf_{n\to\infty} s_n$ is the "smallest thing a subsequence of $s_n$ can converge to". For example:
Example 4.
Let $s_n = (-1)^n$. Look at the following "subsequences":
$$
\begin{array}{c|ccccccc}
s_n & -1 & 1 & -1 & 1 & -1 & 1 & \cdots \\
 & -1 & & -1 & & -1 & & \cdots \\
 & & 1 & & 1 & & 1 & \cdots
\end{array}
$$
If we only look at the odd terms (second row), then we get a "subsequence" which converges to $-1$. Similarly, if we look at the even terms (third row), we get one that converges to $1$. This tells us that

$$
\limsup_{n\to\infty}\, (-1)^n = 1
\qquad\text{and}\qquad
\liminf_{n\to\infty}\, (-1)^n = -1.
$$