Wednesday, April 08, 2009

Timing Accuracy

As technology improves you would expect our ability to measure things to improve with it. So if we take the accuracy of measurements as a crude gauge of technological progress, can we graph how quickly we are progressing?

So take time. How quickly is our ability to measure time accurately improving?

I'm going to have a look at how accurately people have been able to measure time since the start of the modern world (about the time Newton got hit by an apple) till now. I take accuracy to mean how many seconds you could count before the clock would be out by one second, on average.

1656: Dutch astronomer Christiaan Huygens built the first pendulum clock, which had an error of less than one minute a day according to here
1721: George Graham improved the pendulum clock's accuracy to within a second a day
1761: John Harrison's marine chronometer was accurate to half a second a day
1889: Siegmund Riefler's clock, with a nearly free pendulum, attained an accuracy of a hundredth of a second a day
1952: Quartz clocks reached an accuracy of 1 part in 10^8
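
To turn those "error per day" figures into the accuracy measure I defined above, you just divide the number of seconds in a day by the daily error. Here's a quick Python sketch of that conversion, using the error figures listed above (the labels and the rounding to powers of ten are mine):

SECONDS_PER_DAY = 24 * 60 * 60  # 86400 seconds in a day

milestones = [
    ("1656 Huygens pendulum clock", 60.0),   # about a minute a day
    ("1721 Graham pendulum clock", 1.0),     # a second a day
    ("1761 Harrison chronometer", 0.5),      # half a second a day
    ("1889 Riefler free pendulum", 0.01),    # a hundredth of a second a day
]

for name, error_per_day in milestones:
    # accuracy = seconds you can count before the clock is out by one second
    accuracy = SECONDS_PER_DAY / error_per_day
    print(f"{name}: about 1 part in {accuracy:.1e}")

That works out to roughly 10^3 for Huygens, 10^5 for Graham and 10^7 for Riefler, which are the round figures I use in the table below.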

Modern times are graphed here.

For the atomic age I got the figures from here.
So that gives the following data:

year    accuracy (seconds counted before being out by one)
1650    10^3
1721    10^5
1889    10^7
1952    10^8
1955    10^10
1970    10^13
1989    10^16
2008    10^17
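
To put a number on "how quickly", you can fit a straight line to the logarithm of the accuracy against the year; the slope is the rate of progress in orders of magnitude per year. A minimal sketch in Python, using only the table above (the least-squares fit is my own back-of-the-envelope addition, not something from the linked sources):

import math

# (year, log10 of accuracy) taken from the table above
data = [
    (1650, 3), (1721, 5), (1889, 7), (1952, 8),
    (1955, 10), (1970, 13), (1989, 16), (2008, 17),
]

n = len(data)
mean_year = sum(y for y, _ in data) / n
mean_log = sum(a for _, a in data) / n

# Ordinary least-squares slope: orders of magnitude gained per year
slope = sum((y - mean_year) * (a - mean_log) for y, a in data) / sum(
    (y - mean_year) ** 2 for y, _ in data
)

print(f"about {slope:.3f} orders of magnitude per year")
print(f"accuracy doubles roughly every {math.log10(2) / slope:.0f} years")

On this data the slope comes out at roughly 0.03 orders of magnitude a year, which is a doubling in accuracy about every nine years.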

One problem with this line of thought is that exponential increases tend to stop. Otherwise we would be up to our ears in Fibonacci's rabbits. The problem with this sort of exponential reasoning is described here.

There are all sorts of other things whose measurement accuracy has improved over time. Have temperature and mass measurements improved at the same rate?
