Though you've likely never given it much thought, a universally accepted unit of measurement like the humble meter is an amazing thing. It lets scientists separated by culture, language, race and even thousands of miles of geography work together on equations and problems as if they were sitting next to each other. So how did this unit of measurement come to be?
Well, before we discuss the meter, it's important to understand what came first. Prior to the meter, Europe's standard-ish units of measurement were yards and inches. Though today there is an agreed exact length for an inch, go back a few hundred years and the definition was a little more lax.
For example, for hundreds of years, the official definition of an inch was "three grains of barley, dry and round, placed end to end lengthwise."
For those of you who don't care for barley, in some places an inch was equal to the combined length of 12 poppy seeds instead. As discussed in the book The Britannica Guide to Numbers and Measurement, the above definition was brought in during the reign of King Edward II in the 14th century. However, it's known that barleycorns were a standard unit of measurement for hundreds of years before this, dating all the way back to the Anglo-Saxons.
Also, earlier than this, in 1150, King David I of Scotland declared "the breadth of a man's thumb" to be the standard unit of measurement, which, like many of these other measurements, is practical in some respects but massively stupid if you care about fine-point accuracy. However, throughout England, it was the barleycorn that reigned supreme, virtually unchallenged, for hundreds of years.
Amazingly, a universally accepted value for the inch wasn't put into practice worldwide until July 1, 1959, after a number of countries collectively signed the International Yard and Pound Agreement earlier that year in February. The countries, which included the US, Canada, Britain, South Africa, New Zealand and Australia, came to the conclusion that an inch was to be officially and universally recognised as being exactly 25.4 millimeters.
So what made these fancy metric units so accurate that they were deemed better for measuring the length of an inch than barleycorns? Well, it's because the meter was derived from something everyone on Earth could use for reference: the Earth itself.
The idea for the meter as a unit of measurement was first proposed during the French Revolution. As an example of just how badly a universally accepted unit of measurement was needed, according to Ken Alder, author of The Measure of All Things: The Seven-Year Odyssey that Transformed the World, there were around 250,000 different units of weights and measures in use in France during that time.
Now, originally there were two proposed methods of defining the standard unit: the first involved a pendulum with a half period of a single second. The alternative idea put forward was to take the length of one quadrant of the Earth's meridian (the distance from the North Pole to the equator) and divide it by 10 million.
The French Academy of Sciences opted for the latter because gravity varies ever so slightly depending on where you are on Earth. This would affect the swing of a pendulum and make a standard, worldwide measurement impossible to discern.
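You can see the Academy's objection in the numbers. The sketch below works out the length of a "seconds pendulum" (half period of one second) at the equator and at the poles, then the meridian alternative; the gravity values and the quadrant length are approximate modern figures I've supplied for illustration, not numbers from the original debate.

```python
import math

# Gravity varies by roughly half a percent across the Earth's surface.
# These are approximate modern values, assumed here for illustration.
g_equator = 9.780  # m/s^2, near the equator
g_poles = 9.832    # m/s^2, near the poles

def seconds_pendulum_length(g):
    """Length of a pendulum whose half period is 1 s (full period T = 2 s).

    From the small-swing period formula T = 2*pi*sqrt(L/g),
    solving for L gives L = g * (T / (2*pi))**2.
    """
    T = 2.0
    return g * (T / (2 * math.pi)) ** 2

print(seconds_pendulum_length(g_equator))  # ~0.991 m
print(seconds_pendulum_length(g_poles))    # ~0.996 m
# A difference of several millimeters -- hence no single worldwide standard.

# The meridian alternative: one quadrant of the Earth's meridian
# (about 10,002 km by modern measurement) divided by 10 million.
quadrant_m = 10_002_000  # approximate modern value, in meters
print(quadrant_m / 10_000_000)  # ~1.0002 m
```

The several-millimeter spread between the two pendulum lengths is exactly the problem the Academy wanted to avoid; the meridian, whatever its length, is at least the same for everyone.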
However, even though a method of deriving the unit was agreed upon in 1791, the exact distance of one quadrant of the Earth's meridian wasn't known at that time. To discover it, two notable French astronomers of the era, Pierre Méchain and Jean-Baptiste Delambre, were sent in opposite directions from Paris to work out the length of the Earth's meridian arc between Dunkirk and Barcelona.
What should have taken the two men little more than a year actually ended up taking seven, which is where the title of Ken Alder's book mentioned above came from. Why did it take so long? Loads of reasons, not the least of which was that they were frequently arrested during their respective journeys: traipsing around surveying things presumably looked suspicious to authorities during the French Revolution.
They eventually got the needed measurements, but there was a problem: very early on in the process of mapping the meridian, Méchain made a small but nonetheless significant error that wasn't discovered until later. He failed to account for the fact that the Earth's rotation flattens it slightly, making its shape non-uniform. As a result, the entire result was unintentionally thrown off by a very small margin. This mistake proceeded to haunt Méchain for the rest of his life, which wasn't long. While traveling and attempting to correct the error a few years later, he contracted yellow fever and died.
In the end, the mistake resulted in the first meter being off by approximately 1/5 of a millimeter from what the definition stated.
While the pair were off gallivanting around Europe, the French still needed something to call a meter, so they had several platinum bars cast based on earlier, less exact calculations. When the pair returned and the exact-ish figure for a meter was calculated, the bar closest to this result was placed in a vault, and it became the official standard of meter measurement in 1799. Later that year, the so-dubbed metric system was implemented across France.
This platinum bar, known as the mètre des Archives, was actually used as the literal measuring stick against which all other meters were measured for a number of years. However, pressure quickly mounted on the scientific community to find a more effective, easily reproduced method of discerning the length of a meter as more and more countries began to implement the metric system.
After all, meter sticks being cast from the platinum original were prone to both damage and general wear and tear, resulting in no one being totally sure they were using the exact same definition as the other guy, which is kind of a bad thing when you're trying to do science that requires exacting measurements.
To combat this confusion, and so that a universally agreed-upon standard for the meter could be arranged, representatives from over two dozen countries were invited to attend the International Metre Commission in Paris. These representatives met several times from 1870 to 1872 and decided on the casting of several new "metric prototypes" made of 90% platinum and 10% iridium, which would become the new standard everyone measured off of.
As time has progressed, we've gotten a bit more exacting about the process of measuring the meter. Starting in 1960, the official definition changed to:
…the length equal to 1,650,763.73 wavelengths in vacuum of the radiation corresponding to the transition between the levels 2p10 and 5d5 of the krypton 86 atom.
This lasted only until 1983 when the definition of a meter again changed as the technology to measure it continued to improve.
Today, the measuring of a meter has come full circle bringing us back to the original discarded suggestion of using time, though we've gotten a bit more advanced than pendulums. Specifically, a meter is defined as being exactly:
The length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second.
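Both the 1983 and 1960 definitions are just arithmetic once you write them down. The sketch below checks that the light-travel definition yields exactly one meter (it fixes the speed of light by definition), and works backwards from the krypton-86 definition to the wavelength it implies; the figures come straight from the two quoted definitions.

```python
# Modern (1983) definition: a meter is the distance light travels in vacuum
# in 1/299,792,458 of a second. This makes c exact by definition.
c = 299_792_458  # speed of light, m/s (exact)
meter = c / 299_792_458  # distance = speed * (1/299,792,458 s)
print(meter)  # 1.0, exactly, by construction

# The 1960 definition: 1,650,763.73 wavelengths of the krypton-86 line
# per meter. That implies a wavelength of about 605.78 nanometers,
# an orange-red line in the visible spectrum.
krypton_wavelength_nm = 1 / 1_650_763.73 * 1e9
print(round(krypton_wavelength_nm, 2))
```

The direction of the 1983 definition is worth noting: rather than measuring the speed of light against a meter bar, the speed of light is declared exact and the meter becomes whatever distance makes that true.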
This is a figure that was agreed upon after scientists measured it using something all good science should try to incorporate for maximal awesomeness- lasers.
How does the modern version compare to the original measurements by Méchain and Delambre? It turns out their meter was off from the modern definition by only about a fifth of a millimeter. Not too shabby.
This post has been republished with permission from TodayIFoundOut.com.
Image by Håvar og Solveig under Creative Commons license.