The Evolution of the Metre

Though you’ve likely never given it much thought, a universally accepted unit of measurement like the humble metre is an amazing thing. It lets scientists separated by culture, language, race and even thousands of miles of geography work together on equations and problems as if they were sitting next to each other. So how did this unit of measurement come to be?

Well, before we discuss the metre, it’s important to understand what came first. Prior to the metre, Europe’s standard-ish units of measurement were the yard and the inch. Though today there is an agreed exact length for an inch, go back a few hundred years and the definition was a little more lax.

For example, for hundreds of years, the official definition of an inch was as follows:

three grains of barley, dry and round, placed end to end, lengthwise

For those of you who don’t care for barley, in some places an inch was equal to the combined length of 12 poppy seeds instead. As discussed in The Britannica Guide to Numbers and Measurement, the above definition was brought in during the reign of King Edward II in the 14th century. However, it’s known that barleycorns were a standard unit of measurement for hundreds of years before this, dating all the way back to the Anglo-Saxons.

Also, earlier than this, in 1150, King David of Scotland declared “the breadth of a man’s thumb” to be the standard unit of measurement, which, like many of these other measurements, was practical in some respects but massively stupid if you care for fine-point accuracy. However, throughout England, it was the barleycorn that reigned supreme, virtually unchallenged, for hundreds of years.

Amazingly, a universally accepted value for the inch wasn’t put into practice worldwide until July 1, 1959, after a number of countries collectively signed the International Yard and Pound Agreement earlier that year in February. The countries, which included the US, Canada, Britain, South Africa, New Zealand and Australia, came to the conclusion that an inch was to be officially and universally recognised as being 25.4 millimetres.
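
As a side note, the agreement was actually worded in terms of the yard rather than the inch: the international yard was fixed at exactly 0.9144 metres, and the familiar 25.4 mm inch simply falls out of the arithmetic. A quick sanity check in Python (the variable names are just for illustration):

```python
# The 1959 agreement fixed the international yard at exactly 0.9144 m;
# the 25.4 mm inch follows directly from that definition.
yard_mm = 0.9144 * 1000      # 914.4 mm, exact by agreement
foot_mm = yard_mm / 3        # 304.8 mm
inch_mm = foot_mm / 12       # 25.4 mm
print(inch_mm)               # 25.4
```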

So what made these fancy metric units so accurate that they were deemed better for measuring the length of an inch than barleycorns? Well, it’s because the metre was derived from something everyone on Earth could use for reference: the Earth itself.

The idea for the metre as a unit of measurement was first proposed during the French Revolution. As an example of just how badly a universally accepted unit of measurement was needed, according to Ken Alder, author of The Measure of All Things: The Seven-Year Odyssey that Transformed the World, there were around 250,000 different units of weights and measures in use in France during that time.

Now, originally there were two proposed methods of deriving a standard unit of measurement; the first involved a pendulum with a half-period of a single second. The alternate idea put forward was to find the length of one quadrant of the Earth’s meridian and divide it by 10 million.

The French Academy of Sciences opted for the latter due to the fact that gravity can vary ever so slightly depending on where you are on Earth, which would affect the swing of a pendulum and make a standard, worldwide measurement impossible to discern.
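
To put some rough numbers on the two proposals, here’s a back-of-the-envelope sketch in Python using modern reference values for gravity and the meridian quadrant (figures the Academy obviously didn’t have; they’re assumptions purely for illustration):

```python
from math import pi

# Rough comparison of the two 1791 proposals, using modern reference values
# (illustrative assumptions, not the Academy's own figures).

def seconds_pendulum_length(g):
    """Length of a pendulum with a half-period of one second (full period T = 2 s).

    From T = 2*pi*sqrt(L/g)  =>  L = g * (T / (2*pi))**2 = g / pi**2
    """
    return g / pi**2

# Local gravity varies with latitude, so the "pendulum metre" would too:
for place, g in [("equator", 9.780), ("mid-latitudes", 9.806), ("poles", 9.832)]:
    print(f"{place:>13}: {seconds_pendulum_length(g) * 1000:.1f} mm")

# The meridian proposal: one quadrant of the Earth's meridian / 10,000,000.
# Using the modern WGS84 quadrant length of roughly 10,001,966 m:
print(f"meridian metre: {10_001_966 / 10_000_000 * 1000:.1f} mm")
```

The pendulum metre would have drifted by several millimetres depending on latitude, while the meridian quadrant gives one fixed number for everybody, which was exactly the Academy’s point.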

However, even though a method of deriving the unit was agreed upon in 1791, the exact distance of one quadrant of the Earth’s meridian wasn’t known at that time. To discover it, two notable French astronomers of the era, Pierre Méchain and Jean-Baptiste Delambre, were sent in opposite directions from Paris to work out the length of the Earth’s meridian between Dunkirk and Barcelona.

What should have taken the two men little more than a year actually ended up taking seven years, which is where the title of Ken Alder’s book mentioned above came from. Why did it take so long? Loads of reasons, not the least of which was that they were frequently arrested during their respective journeys: traipsing around surveying things presumably looked suspicious to authorities during the French Revolution.

They eventually got the needed measurements, but there was a problem: Méchain made a small but nonetheless significant error very early on in the process of mapping the meridian, which wasn’t discovered until later. He failed to take into account that the rotation of the Earth gives it a slightly flattened, non-uniform shape. As a result, this unintentionally threw off the entire result by a very small margin. The mistake proceeded to haunt Méchain for the rest of his life, which wasn’t long. In the process of traveling and attempting to correct the error a few years later, he contracted yellow fever and died.

In the end, the mistake resulted in the first metre being off by approximately 1/5 of a millimetre from what the definition stated.

While the pair were off gallivanting around Europe, though, the French still needed something to call a metre, so they had several platinum bars cast based on earlier, less exact calculations. When the pair returned and the exact-ish figure for a metre was calculated, the bar closest to this result was placed in a vault, and it became the official standard of metre measurement in 1799. Later that year, the so-dubbed Metric System was implemented across France.

This platinum bar, known as the mètre des Archives, was used as the literal measuring stick against which all other metres were measured for decades. However, pressure mounted on the scientific community to find a more effective, easily reproduced method of discerning the length of a metre as more and more countries began to implement the metric system.

After all, metre sticks being cast from the platinum original were prone to both damage and general wear and tear, resulting in no one being totally sure they were using the exact same definition as the other guy, which is kind of a bad thing when you’re trying to do science that requires exacting measurements.

To combat this confusion, and so that a universally agreed upon standard for the metre could be arranged, representatives from over two dozen countries were invited to attend the International Metre Commission in Paris. These representatives met several times from 1870 to 1872 and decided on the casting of several new “metric prototypes” made of 90% platinum and 10% iridium, which would become the new standard everyone measured off of.

As time has progressed, we’ve gotten a bit more exacting about the process of measuring the metre. Starting in 1960, the official definition changed to:

…the length equal to 1,650,763.73 wavelengths in vacuum of the radiation corresponding to the transition between the levels 2p10 and 5d5 of the krypton 86 atom.
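
To get a feel for what that mouthful actually pins down, you can run the definition backwards and ask how long one of those wavelengths must be. A quick Python check, using only the count from the definition itself:

```python
# How long is one krypton-86 wavelength, if exactly
# 1,650,763.73 of them add up to one metre?
wavelengths_per_metre = 1_650_763.73
wavelength_m = 1 / wavelengths_per_metre
print(f"{wavelength_m * 1e9:.2f} nm")   # ~605.78 nm, an orange-red spectral line
```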

This lasted only until 1983 when the definition of a metre again changed as the technology to measure it continued to improve.

Today, the measuring of a metre has come full circle, bringing us back to the original discarded suggestion of using time, though we’ve gotten a bit more advanced than pendulums. Specifically, a metre is defined as being exactly:

The length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second.

This is a figure that was agreed upon after scientists measured it using something all good science should try to incorporate for maximal awesomeness: lasers.
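
Since 1983 the speed of light in vacuum has been a fixed, exact constant rather than a measured quantity, so the modern metre really is just a bit of arithmetic. A minimal sketch in Python (the 1 km example at the end is purely illustrative):

```python
# Under the 1983 definition, the speed of light is an exact constant,
# and the metre is whatever distance light covers in 1/299,792,458 s.
c = 299_792_458              # speed of light in vacuum, in m/s (exact by definition)
time_interval = 1 / c        # the time interval from the definition, in seconds
print(c * time_interval)     # 1.0 -- exactly one metre

# The same relation works in reverse; light crosses 1 km in about 3.34 microseconds:
print(1000 / c)              # ~3.3356e-06 seconds
```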

How does the modern version compare to the original measurements by Méchain and Delambre? It turns out their metre was only off from the modern definition by half a millimetre. Not too shabby.
