Permutations: Ordered Arrangements

Three runners cross the finish line. Alice is first. Bob is second. Carol is third.

Now swap them: Bob first, Carol second, Alice third.

Same three people. Different race. Different outcome. Different permutation.

This is what makes permutations different from combinations: order matters. ABC is not the same as BAC. First place is not the same as second. The sequence carries information that the set does not.

And it turns out that when order matters, the number of possibilities explodes.


The Definition

A permutation is an ordered arrangement of objects.

If you have n distinct objects and you arrange all of them in a row, there are n! (n factorial) possible permutations.

Why? Start with the first position: you have n choices. For the second position, you've used one object, leaving n-1 choices. For the third, n-2 choices. Continue until you place the last object (1 choice remaining).

By the multiplication principle: n × (n-1) × (n-2) × ... × 2 × 1 = n!

Examples:

  • 3 objects (A, B, C): 3! = 6 permutations (ABC, ACB, BAC, BCA, CAB, CBA)
  • 5 books on a shelf: 5! = 120 permutations
  • 10 people in a line: 10! = 3,628,800 permutations
  • 52 cards in a deck: 52! ≈ 8 × 10⁶⁷ permutations

The growth is super-exponential. By the time you reach 20!, you're already at 2.4 quintillion (2.4 × 10¹⁸). By 70!, you've exceeded the number of atoms in the observable universe.
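
A minimal Python sketch, using only the standard library, makes this growth concrete:

  import math

  # Number of ways to arrange n distinct objects in a row: n!
  for n in (3, 5, 10, 20, 52):
      f = math.factorial(n)
      print(f"{n:>2}! = {f:,}  ({len(str(f))} digits)")

  # 3! = 6, 5! = 120, 10! = 3,628,800, 20! = 2,432,902,008,176,640,000,
  # and 52! is a 68-digit number (≈ 8 × 10^67).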

This isn't just abstract scaling. It's the reason cryptographic keys work. The number of possible arrangements is so vast that brute-force search becomes impossible.


Partial Permutations: Arranging a Subset

What if you don't arrange all n objects—just k of them?

Example: You have 10 runners, but only 3 will medal (gold, silver, bronze). How many possible medal arrangements?

You're choosing 3 people from 10 and the order matters (gold ≠ silver ≠ bronze).

The formula: P(n, k) = n! / (n - k)!

Or, more intuitively: n × (n-1) × (n-2) × ... × (n-k+1)

You're filling k positions with n choices for the first, n-1 for the second, down to n-k+1 for the kth.

For our example: P(10, 3) = 10 × 9 × 8 = 720 possible medal outcomes.

Why does the division by (n-k)! work? Because n! counts all arrangements of all n objects, but we only care about the first k positions. Dividing by (n-k)! removes the arrangements of the remaining n-k objects we're not using.
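
A short Python sketch of this falling-factorial count; math.perm (standard library, Python 3.8+) computes the same thing:

  import math

  def partial_permutations(n: int, k: int) -> int:
      """Ordered arrangements of k objects chosen from n distinct objects: n! / (n-k)!."""
      return math.factorial(n) // math.factorial(n - k)

  print(partial_permutations(10, 3))  # 720 possible medal outcomes
  print(math.perm(10, 3))             # 720, via the standard library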


When Permutations Matter: Real-World Examples

Permutations aren't just abstract math. They model situations where sequence determines outcome.

Race Results

Olympic sprinters cross the finish line. The order determines medals. If 8 runners compete, there are P(8, 3) = 336 possible ways to assign gold, silver, bronze.

Passwords

A 4-digit PIN has 10⁴ = 10,000 possibilities if repetition is allowed. But if each digit must be unique (a permutation of 10 digits taken 4 at a time), there are P(10, 4) = 5,040 possibilities.

The security comes from counting. A system that locks after 3 failed attempts makes brute-force infeasible when there are thousands of possibilities.

Playlist Order

You have 20 songs and want to arrange 5 for a workout playlist. The order matters—starting energetic vs. ending energetic creates different experiences.

P(20, 5) = 1,860,480 possible playlists.

Spotify's shuffle isn't truly random—it avoids certain orderings (like playing the same artist twice in a row) to feel more "random" to human perception. The number of possible orderings is the design space they're navigating.

Task Scheduling

You have 7 tasks to complete today, but you can only do 3 before a deadline. The order you do them affects outcomes (some tasks depend on others, some are urgent).

P(7, 3) = 210 possible schedules. Not all are valid (dependencies constrain the space), but permutations define the full possibility space.


Permutations with Repetition

What if some objects are identical?

Example: How many ways can you arrange the letters in "BOOK"?

If all letters were distinct, it would be 4! = 24. But two O's are identical. If you swap them, the arrangement looks the same.

The formula for permutations with repetition: n! / (n₁! × n₂! × ... × nₖ!)

Where n is the total number of objects, and n₁, n₂, ..., nₖ are the counts of each repeated type.

For "BOOK":

  • Total letters: 4
  • O appears twice
  • B, K appear once each

Permutations = 4! / 2! = 24 / 2 = 12.

You can verify by listing them: BOOK, BOKO, BKOO, OBOK, OBKO, OOBK, OOKB, OKBO, OKOB, KBOO, KOBO, KOOB. (Twelve distinct arrangements.)

Another example: "MISSISSIPPI"

  • Total: 11 letters
  • I appears 4 times
  • S appears 4 times
  • P appears 2 times
  • M appears 1 time

Permutations = 11! / (4! × 4! × 2! × 1!) = 34,650.

Without accounting for repetition, you'd calculate 11! = 39,916,800—massively overcounting.
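
A small Python sketch of the formula, with a brute-force check for the BOOK example:

  import math
  from collections import Counter
  from itertools import permutations

  def multiset_permutations(word: str) -> int:
      """n! divided by the factorial of each repeated letter's count."""
      total = math.factorial(len(word))
      for count in Counter(word).values():
          total //= math.factorial(count)
      return total

  print(multiset_permutations("BOOK"))         # 12
  print(multiset_permutations("MISSISSIPPI"))  # 34650

  # Brute-force verification for the small case: enumerate all 4! orderings,
  # then deduplicate the ones that look identical because the O's are interchangeable.
  print(len(set(permutations("BOOK"))))        # 12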


Circular Permutations: When There's No Starting Point

Arranging objects in a line is different from arranging them in a circle.

In a line, ABC and CAB are different. But seated around a circular table, rotating everyone one position doesn't create a new arrangement—it's the same relative configuration.

The formula for circular permutations: (n - 1)!

Why? Fix one object's position (to eliminate rotational equivalence), then arrange the remaining n-1 objects. That's (n-1)! arrangements.

Example: 6 people sitting around a table. Linear permutations: 6! = 720. Circular permutations: 5! = 120.

Each of the 120 circular arrangements corresponds to 6 linear arrangements (one for each starting point as you rotate).
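
A quick Python check that treating rotations as equal collapses 6! linear arrangements into 5! circular ones:

  from itertools import permutations

  def count_circular(n: int) -> int:
      """Count seatings of n people around a round table, where rotations are the same."""
      seen = set()
      for p in permutations(range(n)):
          i = p.index(0)
          seen.add(p[i:] + p[:i])   # canonical form: rotate so person 0 comes first
      return len(seen)

  print(count_circular(6))  # 120, i.e. 5!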

Why this matters:

  • Seating charts for round tables.
  • Necklaces and bracelets (though bracelets also have reflectional symmetry, reducing the count further).
  • Round-robin tournaments where relative order matters but absolute position doesn't.


Derangements: Permutations Where Nothing Stays Put

A derangement is a permutation where no element appears in its original position.

Example: Four people check hats at a restaurant. The hat check scrambles them. How many ways can all four people get the wrong hat?

This is D(4), the number of derangements of 4 objects. The answer: 9.

The formula for derangements: D(n) = n! × (1 - 1/1! + 1/2! - 1/3! + ... ± 1/n!)

Or recursively: D(n) = (n - 1) × [D(n-1) + D(n-2)]

Why this is surprising: As n grows, the probability that a random permutation is a derangement approaches 1/e ≈ 0.368 (36.8%).

Roughly 37 percent of all permutations are derangements, regardless of how large n gets. Intuitively, you might expect the probability to keep dropping as n increases (more chances for at least one element to stay put), but it stabilizes near 1/e.
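
A minimal Python sketch using the recurrence, checking D(4) = 9 and the convergence toward 1/e:

  import math

  def derangements(n: int) -> int:
      """D(n) via the recurrence D(n) = (n - 1) * (D(n-1) + D(n-2))."""
      if n == 0:
          return 1
      if n == 1:
          return 0
      d_prev, d_curr = 1, 0          # D(0), D(1)
      for m in range(2, n + 1):
          d_prev, d_curr = d_curr, (m - 1) * (d_curr + d_prev)
      return d_curr

  print(derangements(4))                        # 9
  print(derangements(10) / math.factorial(10))  # ≈ 0.36788
  print(1 / math.e)                             # 0.36787944117144233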

Applications:

  • Secret Santa assignments: Everyone must give to someone else (no self-gifting).
  • Card tricks: Shuffle a deck so no card is in its original position.
  • Cryptography: Permutation ciphers where avoiding fixed points adds security.

Permutations in Cryptography

Modern cryptography relies on permutations at multiple levels.

Substitution Ciphers

A substitution cipher replaces each letter with another. The key is a permutation of the 26-letter alphabet.

How many possible keys? 26! ≈ 4 × 10²⁶.

That's astronomically large—unbreakable by brute force. Yet substitution ciphers are easily broken by frequency analysis (E is the most common letter in English, so whatever symbol appears most often likely represents E).

Security isn't just about key space—it's about structure. Permutations alone aren't enough if patterns leak information.

Block Ciphers

Modern encryption like AES (Advanced Encryption Standard) uses permutation-substitution networks. Data is repeatedly permuted and substituted in rounds. Each round scrambles the bits further.

The permutation step diffuses information across the data block. Without permutations, encryption would be vulnerable to localized attacks.

Public-Key Cryptography

RSA doesn't use permutation networks the way block ciphers do, although its encryption map (raising a message to the e-th power modulo n) is itself a bijection on the residues, and its security rests on the difficulty of factoring large numbers. The hardness of related group-theoretic problems, such as the discrete logarithm in cyclic groups, secures many other cryptographic protocols.

Permutations are the scrambling function at the heart of secure communication.


Permutations in Algorithms

Many algorithms fundamentally manipulate permutations.

Sorting

Sorting a list is finding the permutation that arranges elements in order. If you have n items, there are n! possible orderings. Only one (or a few, if there are duplicates) is sorted.

Every comparison in a sorting algorithm gives you information about which permutation you're in. To distinguish among n! possibilities using binary decisions (comparisons), you need at least log₂(n!) comparisons.

Using Stirling's approximation, log₂(n!) ≈ n log₂ n to leading order (more precisely, log₂(n!) = n log₂ n − n log₂ e + O(log n)).

This is why comparison-based sorting can't do better than O(n log n) in the worst case. It's a fundamental limit derived from counting permutations.
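
A quick Python calculation comparing the exact bound with the n log₂ n estimate:

  import math

  # Any comparison sort must be able to distinguish all n! input orderings,
  # so it needs at least ceil(log2(n!)) comparisons in the worst case.
  for n in (10, 100):
      exact = math.log2(math.factorial(n))
      estimate = n * math.log2(n)   # Stirling's leading term; an overestimate for finite n
      print(f"n={n:>3}  log2(n!) ≈ {exact:7.1f}   n·log2(n) ≈ {estimate:7.1f}")

  # n=10: log2(10!) ≈ 21.8 vs 33.2; n=100: ≈ 524.8 vs 664.4.
  # The bound grows like n log n, which is why O(n log n) comparison sorts are optimal.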

Random Shuffling

The Fisher-Yates shuffle algorithm generates a uniformly random permutation in O(n) time. Why does it work? Because at each step, it selects one of the remaining elements with equal probability, building up a permutation where all n! outcomes are equally likely.

Bad shuffling algorithms (like the naive loop that swaps each position with a uniformly random position anywhere in the array) don't uniformly sample the permutation space: they introduce bias, because nⁿ equally likely sequences of choices can't map evenly onto n! permutations. Casinos care about this; biased shuffles can be exploited.
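
A minimal Fisher-Yates sketch in Python (the standard library's random.shuffle provides this out of the box):

  import random

  def fisher_yates_shuffle(items: list) -> None:
      """Shuffle in place so that all n! orderings are equally likely."""
      for i in range(len(items) - 1, 0, -1):
          j = random.randint(0, i)          # uniform over positions 0..i, inclusive
          items[i], items[j] = items[j], items[i]

  deck = list(range(10))
  fisher_yates_shuffle(deck)
  print(deck)   # one of the 10! = 3,628,800 equally likely orderings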

Traveling Salesman Problem (TSP)

Given n cities, find the shortest route visiting each exactly once. This is asking: which permutation of cities minimizes total distance?

There are n! possible routes (well, (n-1)!/2 if you account for symmetry). For 10 cities, that's 181,440 routes. For 20 cities, it's 6 × 10¹⁶.

TSP is NP-hard: no polynomial-time algorithm is known, and brute-force enumeration of permutations is infeasible for large n. Finding a polynomial-time exact algorithm would imply P = NP.
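
A brute-force sketch in Python that literally tries every permutation; it works for a handful of cities and illustrates why the approach collapses beyond that:

  from itertools import permutations
  from math import dist

  def brute_force_tsp(cities):
      """Shortest closed tour, found by trying all (n-1)! orderings with city 0 fixed as the start."""
      n = len(cities)
      best_length, best_route = float("inf"), None
      for rest in permutations(range(1, n)):
          route = (0,) + rest
          length = sum(dist(cities[route[i]], cities[route[(i + 1) % n]]) for i in range(n))
          if length < best_length:
              best_length, best_route = length, route
      return best_length, best_route

  cities = [(0, 0), (1, 5), (4, 1), (6, 4), (3, 3), (5, 0)]   # 6 cities: 5! = 120 routes
  print(brute_force_tsp(cities))
  # Fine at this size; at 20 cities the same loop would need roughly 19! ≈ 1.2 × 10^17 iterations.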


Permutation Groups and Symmetry

In abstract algebra, permutations form groups—mathematical structures where you can compose permutations (apply one after another) and they obey certain properties.

The symmetric group S_n is the set of all permutations of n objects. It has n! elements.

Why this matters:

  • Rubik's Cube: Solving it is navigating the permutation group of cubelet positions. There are ~4.3 × 10¹⁹ possible configurations, but the group structure lets you solve it efficiently.
  • Molecular chemistry: Permutations describe symmetries in molecules. Benzene's hexagonal symmetry is a permutation group (rotations and reflections).
  • Quantum mechanics: Identical particles (like electrons) are described by symmetric or antisymmetric wavefunctions—permutations of particle states.

Permutation groups unify diverse phenomena under a common mathematical structure.


Generating Permutations Algorithmically

How do you systematically generate all permutations of n objects?

Lexicographic Order

Start with the smallest permutation (e.g., 123). Repeatedly apply a rule to generate the next permutation in dictionary order.

Algorithm:

  1. Find the largest index i such that a[i] < a[i+1]. (If no such index exists, this is the last permutation.)
  2. Find the largest index j > i such that a[i] < a[j].
  3. Swap a[i] and a[j].
  4. Reverse the sequence from a[i+1] to the end.

This generates each of the n! permutations exactly once, in O(n) time per permutation (a Python sketch follows the example below).

Example for [1,2,3]:

  • 123 → 132 → 213 → 231 → 312 → 321
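
A Python sketch of the four steps above (essentially the same algorithm C++ exposes as std::next_permutation):

  def next_permutation(a: list) -> bool:
      """Advance a in place to its lexicographic successor; return False if a was the last one."""
      # Step 1: largest i with a[i] < a[i+1].
      i = len(a) - 2
      while i >= 0 and a[i] >= a[i + 1]:
          i -= 1
      if i < 0:
          return False
      # Step 2: largest j > i with a[i] < a[j].
      j = len(a) - 1
      while a[j] <= a[i]:
          j -= 1
      # Steps 3 and 4: swap, then reverse the tail.
      a[i], a[j] = a[j], a[i]
      a[i + 1:] = reversed(a[i + 1:])
      return True

  seq = [1, 2, 3]
  print(seq)
  while next_permutation(seq):
      print(seq)   # [1,3,2], [2,1,3], [2,3,1], [3,1,2], [3,2,1]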

Heap's Algorithm

Heap's algorithm generates all permutations via minimal swaps (each permutation differs from the previous by a single swap). It's efficient and elegant, running in O(n!) time (which is optimal—you have to generate n! outputs).
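
A recursive Python sketch of Heap's method; note that each output differs from the previous one by exactly one swap:

  def heaps_permutations(a: list):
      """Yield every permutation of a (Heap, 1963); consecutive outputs differ by one swap."""
      def generate(k: int):
          if k == 1:
              yield tuple(a)
              return
          for i in range(k - 1):
              yield from generate(k - 1)
              # One swap brings a new element into the last slot of the prefix.
              if k % 2 == 0:
                  a[i], a[k - 1] = a[k - 1], a[i]
              else:
                  a[0], a[k - 1] = a[k - 1], a[0]
          yield from generate(k - 1)
      yield from generate(len(a))

  perms = list(heaps_permutations([1, 2, 3]))
  print(perms)             # 6 tuples; compare neighbours to see the single-swap property
  print(len(set(perms)))   # 6: all distinct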

Why it matters: Generating permutations is useful in brute-force search, testing, and combinatorial optimization. Having efficient generation algorithms makes exploration feasible for moderate n.


Permutations vs. Combinations: The Key Distinction

Let's make this crystal clear:

  Permutations                       Combinations
  Order matters                      Order doesn't matter
  ABC ≠ BAC                          ABC = BAC = CAB
  Racing podiums (1st, 2nd, 3rd)     Committees (just members)
  Passwords (sequence matters)       Lottery numbers (set matters)
  Playlist order                     Playlist selection
  P(n, k) = n! / (n-k)!              C(n, k) = n! / (k!(n-k)!)

Relation: C(n, k) = P(n, k) / k!

Why? Each combination (unordered set of k items) corresponds to k! permutations (all ways to order those k items). So permutations count k! times more than combinations.

If you're choosing and arranging, use permutations. If you're just choosing, use combinations.
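
The relationship is easy to confirm in Python with math.perm and math.comb (both in the standard library since 3.8):

  import math

  n, k = 10, 3
  print(math.perm(n, k))                        # 720 ordered selections
  print(math.comb(n, k))                        # 120 unordered selections
  print(math.perm(n, k) // math.factorial(k))   # 120: dividing out the k! orderings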


Real-World Permutation Problems

DNA Sequencing

DNA is a sequence of four bases: A, T, C, G. A gene is a specific ordering of these bases, and that order determines the protein it encodes.

Strictly speaking, these are arrangements with repetition rather than permutations of distinct objects: a sequence of length 100 has 4¹⁰⁰ ≈ 1.6 × 10⁶⁰ possibilities. Only a tiny fraction code for functional proteins. Evolution explores this space via mutation (random changes to the sequence).

Tournament Brackets

In a single-elimination tournament with n players, the bracket structure is a specific ordering (permutation) of who plays whom. For 8 players there are 8! = 40,320 ways to fill the bracket slots, though symmetries (such as swapping the two players in a first-round match) mean many of these produce the same set of match-ups.

Permutations determine match-ups. Seeding (assigning rankings) is choosing a permutation to balance competition.

Anagrams

An anagram is a permutation of letters forming a different word. "LISTEN" and "SILENT" are permutations of the same 6 letters.

How many anagrams of "LISTEN"? 6! = 720 (assuming all letters are distinct). Most won't be valid English words, but they're all mathematically valid permutations.

Word games like Scrabble and Wordle are navigating permutation spaces.


The Computational Cost of Permutations

Permutations have a dark side: they scale impossibly fast.

An algorithm with O(n!) runtime is computationally infeasible for even moderate n. By n = 15 you're in the trillions; by n = 20, in the quintillions. This is why brute-force approaches (try all permutations) fail for most problems.

Examples of O(n!) algorithms:

  • Brute-force Traveling Salesman: Try all permutations of cities. Infeasible beyond ~15 cities.
  • Brute-force subset sum: Try every subset of the elements. That's 2ⁿ possibilities: exponential rather than factorial, but still intractable.

The factorial barrier is why computer scientists seek heuristics, approximations, and polynomial-time algorithms. You can't enumerate permutations—you must reason about structure.

Why this matters: Understanding permutation growth explains why some problems are "hard" not because we lack cleverness, but because the possibility space is fundamentally vast.


Permutations as Bijections

In mathematics, a permutation is a bijection: a one-to-one, onto function from a set to itself.

For the set {1, 2, 3}, the permutation 2→3, 3→1, 1→2 (written as the cycle (1 2 3)) is a bijection. Every element maps to exactly one element, and every element is mapped to.

This perspective connects permutations to functions, making them objects of study in algebra, topology, and category theory.

Cycle notation: Permutations can be written as cycles. The permutation (1 2 3) means 1→2, 2→3, 3→1. Every permutation decomposes into disjoint cycles.

Example: The permutation written in one-line notation as [3, 1, 2, 4] (1 maps to 3, 2 maps to 1, 3 maps to 2, and 4 maps to itself) is the cycle (1 3 2)(4), or more simply (1 3 2), since 4 is a fixed point.

Cycle structure reveals properties like permutation parity (even or odd) and order (how many times you must apply it to return to the identity).
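
A short Python sketch that decomposes a permutation, given as an element-to-image mapping, into disjoint cycles:

  def cycle_decomposition(mapping: dict) -> list:
      """Split a permutation (element -> image) into disjoint cycles, omitting fixed points."""
      cycles, seen = [], set()
      for start in mapping:
          if start in seen:
              continue
          cycle, x = [], start
          while x not in seen:
              seen.add(x)
              cycle.append(x)
              x = mapping[x]
          if len(cycle) > 1:
              cycles.append(tuple(cycle))
      return cycles

  # The example above: 1 -> 3, 3 -> 2, 2 -> 1, and 4 is fixed.
  print(cycle_decomposition({1: 3, 2: 1, 3: 2, 4: 4}))   # [(1, 3, 2)]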


Why Order Matters

We began with a simple premise: order matters.

In permutations, the sequence carries information the set does not. ABC is not BAC. First is not second. Temporal order, spatial arrangement, priority ranking—these structures depend on permutations.

The number of permutations explodes super-exponentially because the counts of available choices at each position multiply together (the multiplication principle again). This explosion makes brute force infeasible but also creates vast spaces for cryptography, randomness, and complexity.

Permutations aren't just arrangements. They're the mathematics of ordered structure—the framework for understanding sequence, precedence, and configuration in a world where order matters.


Further Reading

  • Knuth, Donald E. The Art of Computer Programming, Volume 4A: Combinatorial Algorithms, Part 1. Addison-Wesley, 2011. (Generating permutations, cycle structure)
  • Stanley, Richard P. Enumerative Combinatorics, Volume 1. Cambridge University Press, 2nd edition, 2011. (Permutation theory)
  • Graham, Ronald L., Donald E. Knuth, and Oren Patashnik. Concrete Mathematics. Addison-Wesley, 2nd edition, 1994. (Factorial identities, Stirling numbers)
  • Rosen, Kenneth H. Discrete Mathematics and Its Applications. McGraw-Hill, 8th edition, 2018. (Permutations with repetition, circular permutations)
  • Heap, B. R. "Permutations by Interchanges." The Computer Journal, 1963. (Heap's algorithm)

This is Part 3 of the Discrete Mathematics series, exploring the foundational math of computer science, algorithms, and computational thinking.



Previous: Combinatorics: Counting Without Counting
Next: Combinations: Unordered Selections