What Are Sequences and Series? Ordered Numbers and Their Sums
A sequence is a list. A series is a sum.
That's the entire distinction. A sequence gives you numbers in order: 1, 2, 3, 4, 5. A series adds them up: 1 + 2 + 3 + 4 + 5 = 15.
The sequence is the pattern. The series is the accumulation.
This difference matters because sequences and series behave differently. A sequence can go to infinity without any crisis—the natural numbers 1, 2, 3, 4, ... are a perfectly well-defined sequence. But when you try to add them up? The sum explodes. 1 + 2 + 3 + 4 + ... doesn't equal any number. It diverges.
Yet some infinite series do have finite sums. Add up 1/2 + 1/4 + 1/8 + 1/16 + ..., and you get exactly 1. The terms get small fast enough that their infinite sum stays bounded.
This is the key question of series: When does adding forever give you a finite answer?
Sequences: Patterns with Positions
A sequence assigns a number to each position. The first term, the second term, the third term—each spot in line gets exactly one value.
We write sequences with subscripts: a₁, a₂, a₃, a₄, ... or with a formula: aₙ = n² gives you 1, 4, 9, 16, 25, ...
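A formula-defined sequence translates directly into code: the formula is a function of the position n. A minimal Python sketch (the function name `a` is just illustrative):

```python
# The sequence a_n = n^2: each position n gets the value n squared.
def a(n):
    return n ** 2

print([a(n) for n in range(1, 6)])  # [1, 4, 9, 16, 25]
```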
The positions matter. The sequence 1, 2, 3 is different from the sequence 3, 2, 1. Order is information.
Finite vs. Infinite Sequences
A finite sequence has a last term. The sequence 2, 4, 6, 8, 10 stops at 10.
An infinite sequence goes on forever. The sequence 2, 4, 6, 8, 10, ... (with the dots) never ends. Every even number appears somewhere.
Infinite sequences raise a natural question: Where does the pattern go? Does it approach some value? Blow up to infinity? Oscillate forever without settling?
When we ask "where does aₙ go as n → ∞?" we're asking about the limit of the sequence.
Series: Sums of Sequences
A series takes a sequence and adds its terms. If the sequence is a₁, a₂, a₃, ..., the series is:
S = a₁ + a₂ + a₃ + ...
For finite sequences, this is just arithmetic. The sum of 2 + 4 + 6 + 8 + 10 is 30.
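For a finite series this really is plain arithmetic, as a one-line check shows:

```python
# Sum of the finite sequence 2, 4, 6, 8, 10.
print(sum([2, 4, 6, 8, 10]))  # 30
```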
For infinite sequences, the question becomes profound: Can you actually add infinitely many numbers and get a finite result?
Intuition says no. Adding forever should give you infinity.
But intuition is wrong. Watch:
1/2 + 1/4 + 1/8 + 1/16 + ...
Each term is half the previous. The partial sums are:
- S₁ = 1/2
- S₂ = 1/2 + 1/4 = 3/4
- S₃ = 3/4 + 1/8 = 7/8
- S₄ = 7/8 + 1/16 = 15/16
See the pattern? Sₙ = 1 - (1/2)ⁿ. As n → ∞, the partial sums approach 1.
The infinite series converges to 1.
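You can watch the partial sums close in on 1 numerically. A small sketch, comparing the running sum against the closed form Sₙ = 1 - (1/2)ⁿ from above:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ..., next to the closed form 1 - (1/2)^n.
s = 0.0
for n in range(1, 6):
    s += (1 / 2) ** n
    print(n, s, 1 - (1 / 2) ** n)
# The final line is: 5 0.96875 0.96875 -- the two columns agree at every step.
```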
Partial Sums: The Bridge Between Sequences and Series
To make sense of infinite sums, we use partial sums. The partial sum Sₙ is what you get when you add the first n terms:
Sₙ = a₁ + a₂ + ... + aₙ
Now the partial sums themselves form a sequence: S₁, S₂, S₃, S₄, ...
If this sequence of partial sums approaches a limit L, we say the series converges to L.
If the partial sums don't approach any limit—they blow up, or oscillate without settling—we say the series diverges.
Convergence of an infinite series is really about the convergence of its partial sums. The infinite sum is defined as the limit of what you get by adding more and more terms.
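That definition is directly computable: build Sₙ for growing n and see whether the values settle. A minimal sketch (the helper `partial_sum` is illustrative, not a standard library function):

```python
def partial_sum(term, n):
    """S_n = a_1 + a_2 + ... + a_n for a given term function."""
    return sum(term(k) for k in range(1, n + 1))

# Geometric terms: the partial sums settle near a limit (here, 1).
print(partial_sum(lambda k: (1 / 2) ** k, 50))  # ≈ 1.0
# Natural-number terms: the partial sums just keep growing.
print(partial_sum(lambda k: k, 50))             # 1275
```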
The Two Questions
For any sequence, we can ask:
- What does aₙ approach? (The limit of the sequence)
- What does the sum of all aₙ equal? (The convergence of the series)
These are different questions with different answers.
The sequence 1/n approaches 0 as n → ∞. Good—the terms get small.
But the series 1 + 1/2 + 1/3 + 1/4 + ... (the harmonic series) diverges. The terms shrink, but not fast enough. Add enough of them and you exceed any bound.
Meanwhile, the series 1/2 + 1/4 + 1/8 + ... converges to 1. These terms shrink fast enough that their infinite sum is finite.
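The contrast is easy to check numerically. The harmonic partial sums do exceed any bound, but painfully slowly; a small sketch counting how many terms it takes just to pass 10:

```python
# Harmonic series: how many terms until the partial sum exceeds 10?
s, n = 0.0, 0
while s <= 10:
    n += 1
    s += 1 / n
print(n)  # 12367 -- the sum grows without bound, but only logarithmically
```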
The behavior of the terms constrains whether the series converges, but not in the way you might expect: terms shrinking to zero is necessary for convergence, yet, as the harmonic series shows, it isn't sufficient.
Why Sequences and Series Matter
Sequences and series appear everywhere:
In finance: Compound interest generates geometric sequences. The present value of future payments is a geometric series.
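As one concrete (illustrative) case not spelled out in the text: a perpetuity pays a fixed amount P every year forever, and its present value at interest rate r is the convergent geometric series P/(1+r) + P/(1+r)² + ... = P/r. A sketch with assumed figures:

```python
# Present value of a $100-per-year perpetuity at 5% interest.
P, r = 100.0, 0.05
# Truncate the geometric series at the first 1000 payments...
pv_truncated = sum(P / (1 + r) ** k for k in range(1, 1001))
print(pv_truncated)  # ≈ 2000.0
# ...which matches the closed form P / r of the full infinite series.
print(P / r)         # 2000.0
```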
In calculus: Taylor series express functions as infinite polynomials. e^x = 1 + x + x²/2! + x³/3! + ...
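Truncating that Taylor series gives a polynomial approximation that converges quickly; a sketch summing the first 20 terms at x = 1:

```python
import math

# Partial sums of e^x = 1 + x + x^2/2! + x^3/3! + ...
def exp_series(x, terms=20):
    return sum(x ** k / math.factorial(k) for k in range(terms))

print(exp_series(1.0))  # ≈ 2.718281828..., matching math.exp(1.0)
```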
In physics: Fourier series decompose signals into sums of sine waves. Every sound you hear is a series.
In computer science: Recurrence relations define sequences where each term depends on previous ones. The runtime of recursive algorithms follows these patterns.
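A standard instance of such a recurrence (used here purely as an illustration; the text doesn't name one) is the Fibonacci sequence, where each term is the sum of the two before it:

```python
# Fibonacci: F_1 = 1, F_2 = 1, and F_n = F_(n-1) + F_(n-2) thereafter.
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(n) for n in range(1, 9)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```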
The language of sequences and series lets you describe patterns that evolve, accumulate, and converge. It's the mathematics of process and accumulation—what happens when you follow a rule forever.
What Comes Next
This series explores:
- Arithmetic and geometric sequences — The two most common patterns
- Summation formulas — Shortcuts for adding up sequences
- Infinite series — When forever gives you a finite answer
- Convergence tests — How to tell if a series will converge
- Power series — Functions as infinite polynomials
The goal is not just to calculate, but to understand: What does it mean to add infinitely many things, and when can you actually do it?
Part 1 of the Sequences and Series series.