09 October 2005

The history of gauge theories in physics

The most basic gauge theory in physics is Maxwellian electrodynamics. Although Maxwell developed his theory in the 1860s and 1870s, it was not realized until the 1950s and 1960s that the concept of gauge invariance is crucial to building theories of the fundamental forces between elementary particles.

But what does gauge mean? Here's a hint from Webster's dictionary:
A measurement (as of linear dimension) according to some standard or system, as:
(1) the distance between the rails of a railroad
(2) the size of a shotgun barrel's inner diameter, nominally expressed as the number of lead balls each just fitting that diameter required to make a pound [a 12-gauge shotgun]
(3) the thickness of a thin material (as sheet metal or plastic film)
(4) the diameter of a slender object (as wire or a hypodermic needle)
(5) the fineness of a knitted fabric expressed by the number of loops per unit width

Hermann Weyl was the scientist who first introduced the idea of gauge invariance, though in a different context: he was trying to construct a theory that unified electromagnetism and gravitation. For him, gauge meant "scale", and he thought that physics might be invariant under changes of scale made independently at each point.

Weyl's ideas were a remarkable insight at the time, but unfortunately he was wrong -- his theory does not describe nature. In modern physics, gauge theories are physical theories whose Lagrangians are invariant under certain local symmetry transformations. For instance, in quantum electrodynamics the Lagrangian is unchanged when the electron field is multiplied by a position-dependent complex phase (technically known as a U(1) symmetry), provided the photon field shifts to compensate.
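To make that concrete, here is a minimal sketch of the U(1) transformation in one common sign convention (the explicit formulas are my addition, not something spelled out above):

    \psi(x) \to e^{i\alpha(x)}\,\psi(x), \qquad A_\mu(x) \to A_\mu(x) - \frac{1}{e}\,\partial_\mu \alpha(x)

    \mathcal{L}_{\mathrm{QED}} = \bar{\psi}\,(i\gamma^\mu D_\mu - m)\,\psi - \tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}, \qquad D_\mu = \partial_\mu + i e A_\mu

With a constant phase \alpha this is just a global symmetry; letting \alpha depend on x is precisely what forces the photon field A_\mu into the theory, since its gradient shift cancels the extra term coming from the derivative of the phase.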

But whenever you encounter "gauge" in physics literature, it might amuse you to think of railroad tracks!
