Vectors: Quantities with Direction and Magnitude


A physicist, a computer scientist, and a mathematician walk into a bar. The bartender asks what they want.

The physicist says, "A beer. Five degrees north of due east from my position, magnitude 0.5 liters."

The computer scientist says, "An array: [5, 12, 3] representing ounces of beer, ice, and lime juice."

The mathematician says, "An element of R³, regarded as a normed vector space."

The bartender hands them all the same thing. They're all talking about vectors.


What Vectors Are (And Why You Already Know Them)

A vector is a list of numbers with geometric meaning.

That's it. That's the definition.

In two dimensions: (3, 4). Three dimensions: (2, -1, 5). Four dimensions: (1, 0, -2, 7). The numbers tell you how far to move in each direction.

But here's the thing: you've been working with vectors your entire life without calling them that.

When you say "go three blocks east and four blocks north," that's a vector: (3, 4).

When GPS tells you that you're at latitude 37.7749, longitude -122.4194, that's a vector: (37.7749, -122.4194).

When you describe a color as RGB (255, 0, 128), that's a vector.

When you track temperature, humidity, and pressure simultaneously, that's a vector.

Vectors are how you represent anything with multiple components. They're the natural language of multidimensional data.


The Geometric Picture

The cleanest way to think about vectors is as arrows.

The vector (3, 4) is an arrow that starts at the origin and points to the location (3, 4). The first number tells you how far to go horizontally. The second tells you how far vertically.

This geometric picture unlocks intuition.

Magnitude: The length of the arrow. For (3, 4), use Pythagoras: √(3² + 4²) = √25 = 5. The magnitude is 5 units.

Direction: Where the arrow points. The vector (3, 4) points northeast. The vector (-3, -4) points southwest. Same magnitude, opposite direction.

Position vs. displacement: Vectors don't care where they start. The arrow from (0,0) to (3,4) represents the same vector as the arrow from (1,2) to (4,6). What matters is displacement—how far and which direction—not absolute position.

This last point is crucial. Vectors represent changes, movements, directions. Not locations. Location requires choosing an origin. Displacement doesn't.
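If you'd rather check the arithmetic than take it on faith, here's a minimal sketch in plain Python (the helper names are ours, purely for illustration):

    import math

    def magnitude(v):
        # Length of the arrow: square each component, sum, take the root (Pythagoras).
        return math.sqrt(sum(c * c for c in v))

    print(magnitude((3, 4)))  # 5.0

    def displacement(start, end):
        # The vector from one point to another: subtract component-wise.
        return tuple(e - s for s, e in zip(start, end))

    print(displacement((0, 0), (3, 4)))  # (3, 4)
    print(displacement((1, 2), (4, 6)))  # (3, 4) -- same vector, different starting point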


Vector Operations

Vectors wouldn't be interesting if you couldn't combine them. Fortunately, you can add them and scale them, and the geometric interpretation is beautiful.

Addition: Tip-to-Tail

To add vectors (3, 4) and (1, 2), add component-wise: (3+1, 4+2) = (4, 6).

Geometrically, place the second arrow at the tip of the first. The sum is the arrow from the start of the first to the end of the second.

You go (3, 4), then you go (1, 2). Where did you end up? At (4, 6).

This is why vectors represent displacements. Displacements compose by addition. Go three east and four north, then one more east and two more north. Total displacement: four east and six north.

Addition is commutative: (3, 4) + (1, 2) = (1, 2) + (3, 4). Geometrically, tip-to-tail works regardless of order.
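In code, addition is a one-liner. A quick Python sketch (illustrative names again):

    def add(v, w):
        # Tip-to-tail in code: add corresponding components.
        return tuple(a + b for a, b in zip(v, w))

    print(add((3, 4), (1, 2)))  # (4, 6)
    print(add((1, 2), (3, 4)))  # (4, 6) -- commutative, as promised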

Scalar Multiplication: Stretching and Flipping

Multiply a vector by a number (called a scalar): 2 × (3, 4) = (6, 8).

Geometrically, you've stretched the arrow. Twice as long, same direction.

Multiply by ½: (1.5, 2). Shrunk to half length.

Multiply by -1: (-3, -4). Same length, opposite direction.

Multiply by 0: (0, 0). The zero vector. No displacement at all.

Scalar multiplication changes magnitude. It can flip direction (negative scalars). But it doesn't rotate. The arrow stays on the same line through the origin.
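The corresponding Python sketch, covering all four cases above:

    def scale(s, v):
        # Multiply every component by the scalar s.
        return tuple(s * c for c in v)

    print(scale(2, (3, 4)))    # (6, 8)     -- stretched
    print(scale(0.5, (3, 4)))  # (1.5, 2.0) -- shrunk
    print(scale(-1, (3, 4)))   # (-3, -4)   -- flipped
    print(scale(0, (3, 4)))    # (0, 0)     -- the zero vector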

Linear Combinations: The Real Power

Now combine both operations. Take vectors v and w. Form av + bw, where a and b are scalars.

This is a linear combination. You're scaling each vector then adding them.

Example: Let v = (1, 0) and w = (0, 1). Then 3v + 4w = 3(1,0) + 4(0,1) = (3, 0) + (0, 4) = (3, 4).

Here's the insight: every vector in the plane can be written as a linear combination of v and w. Want (7, -2)? That's 7v - 2w. Want (π, e)? That's πv + ew.

The vectors v and w span the plane. They're a basis. Every vector is some combination of them.

This generalizes. In three dimensions, you need three vectors to span space. In n dimensions, n vectors. The right choice of vectors gives you coordinates.
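Here's the same computation as a Python sketch (linear_combination is our own helper, not a library function):

    def linear_combination(a, v, b, w):
        # av + bw: scale each vector, then add component-wise.
        return tuple(a * vi + b * wi for vi, wi in zip(v, w))

    v, w = (1, 0), (0, 1)
    print(linear_combination(3, v, 4, w))   # (3, 4)
    print(linear_combination(7, v, -2, w))  # (7, -2)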


Vectors in Higher Dimensions

Two-dimensional vectors are easy to visualize. Three-dimensional vectors are manageable—think of actual space.

But vectors don't stop at three dimensions.

A vector in four dimensions: (2, -1, 3, 5). You can't visualize this as an arrow. But the algebra works identically. Add component-wise. Scale component-wise. Magnitude: √(2² + (-1)² + 3² + 5²) = √39.

A vector in a hundred dimensions: a list of a hundred numbers. Still a vector. Still follows all the same rules.

This is where linear algebra gets powerful. Your geometric intuition comes from two and three dimensions. But the mathematics works in any number of dimensions.

Machine learning lives in high-dimensional spaces. An image is a vector with one component per pixel. A document is a vector with one component per word. A customer is a vector of purchasing behaviors, demographic features, and interaction history.

You can't visualize million-dimensional space. But you can compute with it. And all your intuition from two dimensions—adding vectors, measuring distances, finding directions—generalizes perfectly.
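To see that the rules really don't care about dimension, here's a Python sketch running the same magnitude formula in four dimensions and in a hundred:

    import math

    def magnitude(v):
        # The same rule in any number of dimensions.
        return math.sqrt(sum(c * c for c in v))

    print(magnitude((2, -1, 3, 5)))  # 6.244997... = sqrt(39)
    print(magnitude((1,) * 100))     # 10.0 -- a 100-dimensional vector of ones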


Types of Vectors (They're Everywhere)

Once you start looking, vectors are everywhere.

Position vectors: Points in space. (x, y, z) tells you where something is.

Velocity vectors: Displacement per unit time. (dx/dt, dy/dt) tells you speed and direction of motion.

Force vectors: Magnitude and direction of push or pull. (Fx, Fy, Fz).

Color vectors: (Red, Green, Blue) or (Hue, Saturation, Value). Every color is a point in color space.

Sound vectors: Amplitude at each frequency. A Fourier transform turns sound into a vector.

Feature vectors: In machine learning, every object becomes a vector of features. An email is a vector of word frequencies. An image is a vector of pixel intensities. A user is a vector of behaviors and preferences.

The abstraction is powerful. Once something is a vector, all the machinery of linear algebra applies. You can add them, scale them, measure distances between them, find clusters, compute projections.

Turning things into vectors is how you make them mathematically tractable.
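As a tiny taste of that machinery, here's a Python sketch measuring the distance between two RGB color vectors (the colors are chosen arbitrarily; any feature vectors would do):

    import math

    def distance(v, w):
        # Euclidean distance: the magnitude of the difference vector.
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, w)))

    magenta = (255, 0, 128)
    red = (255, 0, 0)
    print(distance(magenta, red))  # 128.0 -- nearby colors are nearby vectors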


Dot Product: Measuring Alignment

There's one more fundamental vector operation: the dot product.

For vectors v = (v₁, v₂) and w = (w₁, w₂), the dot product is:

v · w = v₁w₁ + v₂w₂

Multiply corresponding components and add.

For (3, 4) · (1, 2): 3×1 + 4×2 = 3 + 8 = 11.

But what does it mean?

The geometric interpretation: v · w = |v| |w| cos(θ), where θ is the angle between the vectors.

The dot product measures alignment.

  • If the vectors point in the same direction, cos(0°) = 1, and the dot product is large and positive.
  • If they are perpendicular, cos(90°) = 0, and the dot product is zero.
  • If they point in opposite directions, cos(180°) = -1, and the dot product is negative.

The dot product projects one vector onto another. It tells you how much of v points in the direction of w.

This is used constantly:

  • Projections: Break a vector into components parallel and perpendicular to a direction.
  • Angles: Compute cos(θ) = (v · w) / (|v| |w|).
  • Orthogonality: If v · w = 0, the vectors are perpendicular (orthogonal).
  • Work in physics: Work = Force · Displacement. Only the component of force along the displacement direction does work.

The dot product turns geometric questions into arithmetic.
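Here's a Python sketch putting the dot product to work on the vectors from above: the angle formula, the orthogonality test, and a projection (the helper names are illustrative):

    import math

    def dot(v, w):
        # Multiply corresponding components and add.
        return sum(a * b for a, b in zip(v, w))

    def magnitude(v):
        return math.sqrt(dot(v, v))

    v, w = (3, 4), (1, 2)
    print(dot(v, w))  # 11

    # Angle between them: cos(theta) = (v . w) / (|v| |w|)
    cos_theta = dot(v, w) / (magnitude(v) * magnitude(w))
    print(round(math.degrees(math.acos(cos_theta)), 1))  # 10.3 degrees

    # Orthogonality: perpendicular vectors have dot product zero.
    print(dot((1, 0), (0, 1)))  # 0

    def project(v, w):
        # The part of v that points along w: scale w by (v . w) / (w . w).
        s = dot(v, w) / dot(w, w)
        return tuple(s * c for c in w)

    print(project(v, w))  # (2.2, 4.4)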


Unit Vectors: Pure Direction

A unit vector has magnitude 1. It represents pure direction: with the length fixed at 1, all the information is in where it points.

To make any vector into a unit vector, divide by its magnitude:

v̂ = v / |v|

The vector (3, 4) has magnitude 5. The unit vector: (3/5, 4/5). Still points the same direction, but length exactly 1.

Unit vectors are useful because they separate direction from magnitude. Any vector can be written as:

v = |v| × (direction)

The magnitude times a unit vector in that direction.

Standard basis vectors are unit vectors along the axes:

  • î = (1, 0, 0) points along x-axis
  • ĵ = (0, 1, 0) points along y-axis
  • k̂ = (0, 0, 1) points along z-axis

Every three-dimensional vector is a combination: v = xî + yĵ + zk̂.
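One last Python sketch: normalize a vector, then rebuild it as magnitude times direction:

    import math

    def magnitude(v):
        return math.sqrt(sum(c * c for c in v))

    def normalize(v):
        # Divide by the magnitude: same direction, length exactly 1.
        m = magnitude(v)
        return tuple(c / m for c in v)

    v = (3, 4)
    v_hat = normalize(v)
    print(v_hat)             # (0.6, 0.8)
    print(magnitude(v_hat))  # 1.0 (up to float rounding)

    # Rebuild v as magnitude times unit direction:
    print(tuple(magnitude(v) * c for c in v_hat))  # (3.0, 4.0)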


Why Vectors Matter

Vectors are the language of multidimensional relationships.

In physics, vectors represent physical quantities with direction: force, velocity, momentum, electric field, magnetic field. The entire formalism of classical mechanics is built on vector operations.

In computer graphics, vectors represent positions, directions, colors, normals. Every transformation—rotation, scaling, translation—is implemented using vector operations.

In machine learning, vectors represent data. Every data point is a vector in feature space. Similarity between data points is measured as vector distance. Learning algorithms find vectors (weights) that best transform input vectors to output vectors.

In economics, vectors represent allocations, preferences, prices. Optimization problems find the best vector subject to constraints.

Vectors are how you make multiple quantities work together as a coherent mathematical object. They turn multidimensional problems into single-object problems.

And once you have vectors, you need something to transform them.

That's where matrices come in.


The Insight

Here's what vectors really are: they're the native data structure of multidimensional space.

Arrays let you store multiple numbers. Vectors let you do geometry with multiple numbers. Addition corresponds to displacement. Scaling corresponds to magnitude change. Dot product corresponds to projection and angle.

The algebra follows the geometry. The operations aren't arbitrary—they're what you need for geometric reasoning to work.

And here's the thing: geometric reasoning works in any number of dimensions. Your intuition comes from visualizing arrows in two or three dimensions. But the mathematics scales. The same operations, the same principles, the same structure.

Vectors are everywhere because multidimensional data is everywhere. And once you see the world as vectors, you start seeing the transformations that act on them.

Those transformations are matrices. And that's where linear algebra gets really interesting.


This is Part 2 of the Linear Algebra series. Previous: "What Is Linear Algebra? The Mathematics of Many Dimensions." Next: "Matrices: Arrays That Transform Vectors."

