Worth emphasizing that decimal time is not just a historical curiosity or a fringe idea; the astronomical community in particular uses decimal time fairly regularly, as the various "Julian date" systems attest. https://en.wikipedia.org/wiki/Julian_day
The USPS used it when I worked there in the 90s. They used 24-hour days, with each hour divided into 100 ticks of 36 seconds. The time clocks operated in this format (a rough sketch of the arithmetic is below).
Which was interesting, until you also realized they expected you to clock in precisely at the start of your shift, giving you a 36-second window to do it in.
On the first day I was wondering why there were so many time clocks for the size of the office, and then I witnessed the shift-change "stampede" that this policy incidentally created.
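Here is a rough sketch of the arithmetic described above, assuming the format is exactly as stated (24-hour days, 100 ticks of 36 seconds per hour); the function names are mine:

    def to_usps_ticks(hour: int, minute: int, second: int) -> tuple[int, int]:
        """Convert an H:M:S wall-clock time to (hour, tick)."""
        seconds_into_hour = minute * 60 + second
        return hour, seconds_into_hour // 36      # each tick is 36 seconds

    def from_usps_ticks(hour: int, tick: int) -> tuple[int, int, int]:
        """Convert (hour, tick) back to the start of that 36-second window."""
        seconds_into_hour = tick * 36
        return hour, seconds_into_hour // 60, seconds_into_hour % 60

    print(to_usps_ticks(8, 4, 30))   # (8, 7)  -> 8:04:30 falls in tick 7
    print(from_usps_ticks(8, 7))     # (8, 4, 12)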
How does the time system relate to the clock-in expectation? Otherwise you would have had 60 seconds to do it in, which doesn't feel like much more time?
Oh, interesting! Is there an historical explanation somewhere on how and why they adopted this system?
What happened if you missed it and clocked in the following "minute"?
If I recall correctly, for up to 5 "ticks" you could make up the time at the end of your shift; for anything more you got reported. It was a union job, and so there were a lot of little "parsimonious" rules like this.
To answer the other question at the same time: in the 90s, most shops that used an analog time clock would only advance the stamped time every 5 minutes. So, if you clocked in at 8:04, it would still punch 8:00 on your card. It was notable to me at the time that they had clocks with a finer grain of accuracy that fed into their computers immediately, and that they actually cared about the "slop" involved in timekeeping.
Thanks! Very interesting.
A decimal fraction is not the same as decimal time. There may be a simple conversion (× 1000), but the conversion to 24-hour time is almost as simple. The problem with decimal time is that it is a proposal for people to use, and that it solves no problems at an immense conversion cost.
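To make that concrete, a small sketch (names and output format are my own choices): given a fraction of the day, the decimal reading is a single multiplication, and the 24-hour reading is only a little more work.

    def decimal_time(day_fraction: float) -> str:
        """10 decimal hours/day, 100 decimal minutes/hour, 100 decimal seconds/minute."""
        ds = round(day_fraction * 100_000)       # decimal seconds since midnight
        h, rem = divmod(ds, 10_000)
        m, s = divmod(rem, 100)
        return f"{h}:{m:02d}:{s:02d}"

    def sexagesimal_time(day_fraction: float) -> str:
        s = round(day_fraction * 86_400)         # SI seconds since midnight
        h, rem = divmod(s, 3_600)
        m, s = divmod(rem, 60)
        return f"{h:02d}:{m:02d}:{s:02d}"

    print(decimal_time(0.75), sexagesimal_time(0.75))   # 7:50:00 18:00:00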
As someone who's done quite a bit of astronomy programming, I can say the Julian Date is not really a replacement for decimal time. The Julian Date really isn't much different from a Unix timestamp (in fact, you can convert between the two quite easily: jd = UnixTimestamp / 86400 + 2440587.5). They are both good for punching into algorithms, but not good for direct use by people.
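That conversion as runnable code; the constant 2440587.5 is the Julian Date of the Unix epoch (1970-01-01 00:00 UTC):

    import time

    def unix_to_jd(unix_ts: float) -> float:
        return unix_ts / 86_400 + 2_440_587.5

    def jd_to_unix(jd: float) -> float:
        return (jd - 2_440_587.5) * 86_400

    print(unix_to_jd(0))            # 2440587.5
    print(unix_to_jd(time.time()))  # current Julian Date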
Not to forget that an astronomical (sidereal) day isn't 24 hours: https://en.wikipedia.org/wiki/Sidereal_time
Previous (as recently as two days ago): https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu... and misnamed: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
Reminds me of Vernor Vinge's future human starfaring civilization using kiloseconds, megaseconds, and gigaseconds to measure time. <https://neurocline.github.io/dev/2015/08/13/telling-time-in-...>
Same in Glasshouse by Charles Stross, and probably in many other sci-fi novels. It even touches on acclimatization to a new time representation.
A while ago there was an HN post about a metric time website; I still look at it sometimes: https://metric-time.com/
That's a neat website, but it did confuse me for a moment. It titles itself "metric time", but seems to be the same idea as detailed in the "Decimal time" Wikipedia article - https://en.wikipedia.org/wiki/Decimal_time.
However, that article warns not to confuse it with its own "Metric time" article, which isn't the same as that metric-time website; it's just about the SI units related to time - https://en.wikipedia.org/wiki/Metric_time.
This website indexes its decimal time to the timezones of the "standard" 24-hour clock, though, which is fundamentally broken in my opinion.
Decimal time was designed before the introduction of fixed timezones, and was relative to Paris in Revolutionary France, so a "correct" decimal clock should either use local time (using the browser's geolocation API) or Paris time, or else devise a timezone system based on decimal time.
But keeping the sexagesimal timezones means the Paris-London difference becomes a fraction of a decimal hour (one standard hour is 10/24 ≈ 0.42 decimal hours), which is neither practical nor meaningful.
I have a strong opinion on this because I have implemented decimal time (and a decimal calendar) a few times, but of course it's just my opinion. Nobody uses this for actual, current applications that actually require accuracy. (A sketch of the Paris-referenced option follows this comment.)
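A minimal sketch of that Paris-referenced option, assuming the Europe/Paris civil timezone rather than true solar time at the Paris meridian (a simplification on my part):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    def paris_decimal_time() -> str:
        # Fraction of the current civil day in Paris, expressed as
        # French Revolutionary decimal time (10 h / 100 min / 100 s per unit).
        now = datetime.now(ZoneInfo("Europe/Paris"))
        seconds = now.hour * 3600 + now.minute * 60 + now.second
        ds = seconds * 100_000 // 86_400     # decimal seconds since Paris midnight
        h, rem = divmod(ds, 10_000)
        m, s = divmod(rem, 100)
        return f"{h}:{m:02d}:{s:02d}"

    print(paris_decimal_time())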
https://news.ycombinator.com/item?id=37853181 (336 comments)
Me too, I go back to it from time to time. And the more I think about metric time, the more I love it.
Related https://en.wikipedia.org/wiki/Swatch_Internet_Time
I prefer this. I'm actually building it into one of my web sites. The only issue is that the thing was a bit of a stunt, and as part of that they centered it on Switzerland, which is GMT+1, instead of aligning it with UTC.
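For reference, .beats are easy to compute. A minimal sketch; the UTC+1 reference is the "Biel Mean Time" quirk objected to above:

    from datetime import datetime, timedelta, timezone

    def swatch_beats() -> float:
        bmt = timezone(timedelta(hours=1))        # "Biel Mean Time" = UTC+1
        now = datetime.now(timezone.utc).astimezone(bmt)
        seconds = now.hour * 3600 + now.minute * 60 + now.second
        return seconds / 86.4                     # 1000 .beats per day -> 86.4 s each

    print(f"@{swatch_beats():06.2f}")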
If we didn't have ten fingers, would we care so much about decimals? They aren't particularly better suited to anything than any other number system. Base 12 actually has a lot more going for it.
The best thing about SI is that it only has one unit per dimension, and no conversion factors between units (it is "coherent"). Unfortunately, this simplicity and elegance usually go unappreciated, while people focus on the arbitrary powers of ten (e.g. that 100 hundredths make one).
http://www.chriswarbo.net/projects/units/metric_red_herring....
PS: The SI unit of time is the "second", which is actually a sexagesimal fraction (a "pars minuta secunda", i.e. a second small part: a sixtieth of a "pars minuta", which is itself a sixtieth of an hour).
Just appending 0s to change scale is true in any base. If you'd grown up in base 12, you'd write that 100 one-hundred-forty-fourths is one, and that there's nothing particularly simple about A A's (84 in your natural numerals), where A = 9 + 1.
That is a somewhat idealistic view of SI, and even more so of the metric system. While SI is nominally coherent, in practice this is accomplished by having very awkward values for the defining constants, and that factor is excluded from the coherence consideration. E.g. to convert from seconds to meters you need to use conversion factor c, which is very much not 1.
I would also be careful not to conflate the metric and SI systems. There are all sorts of wacky metric systems (although most of them are historical) that are not nearly as clean as SI. And even SI has historically been messier than it is today.
> e.g. to convert from seconds to meters you need to use conversion factor c, which is very much not 1.
SI cannot "convert from seconds to meters", since it treats length and duration as separate dimensions. Any multiplier which turns a duration into a length must have dimension of length/duration (and hence SI units of meters/seconds); something with those units is a speed, not a conversion factor (which is dimensionless). That's not specific to any particular speed, whether c or otherwise.
My example was weak, but the greater point is that SI's coherence is oversold because it still requires many unnecessary (non-one) constant factors in equations; SI merely brushes the issue under the rug by labeling those as "fundamental", as if they were not almost completely arbitrary.
c is used in the definition of the meter, but you don’t remotely need to know that in day-to-day life. As long as the seven base units (length, time, mass, temperature, electric current, luminous intensity, amount of substance) end up with reasonable values, the system is practically coherent.
(are the values reasonable? well, human-scale capacitors and resistors seem pretty heavily skewed away from the center of the scale…)
The duodecimal meme needs to die. Base 12 is not that much better than base 10. Yes, it has more integer divisors (2, 3, 4, 6) than base 10 (2, 5), but there are only two reasons to care about that.
The first one is to split a batch of something: there's more ways to split 12 of something than 10 of something, but you can use base 10 and still do things in batches of 12.
The second reason is for easy divisibility checks, but for that all that matters is prime divisors. Base 12 and base 10 have the same number of prime divisors: 2 and 3 versus 2 and 5. But base 10 has an interesting property here: because 10 − 1 is 9, which is a power of 3, you get somewhat easy divisibility rules for both 9 and 3 (illustrated after this comment). 12 − 1, on the other hand, is 11, which is prime, so you only get a relatively uninteresting divisibility rule for 11.
I think at best you can claim that 12 is about as good as 10. Base 60 would be strictly better than base 10 if it weren't for the fact that we can't easily memorize multiplication tables for base 60.
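A tiny illustration of that digit-sum rule (the example number is arbitrary):

    # Because 10 - 1 = 9, a base-10 number is divisible by 9 (or by 3)
    # exactly when its digit sum is.
    def digit_sum(n: int, base: int = 10) -> int:
        s = 0
        while n:
            n, d = divmod(n, base)
            s += d
        return s

    n = 7_425
    print(digit_sum(n))                        # 18
    print(n % 9 == 0, digit_sum(n) % 9 == 0)   # True True
    print(n % 3 == 0, digit_sum(n) % 3 == 0)   # True True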
> The first one is to split a batch of something: there's more ways to split 12 of something than 10 of something, but you can use base 10 and still do things in batches of 12.
I think you underestimate how useful that is.
If I am at the grocery store and want to figure out how much something costs relative to something else, I have to divide quantity by price for two different things, which is annoying to do when only 1/2 and 1/5 come out as round numbers.
When splitting a check between people the same thing applies.
Just because we have credit cards and calculators doesn't mean these abilities aren't useful, or wouldn't have been useful for the vast majority of human existence. I mean, British money was on a 12/20 basis up until the 1970s, and the Romans used 12, which is still present in US measures. So obviously people found it useful. It is by chance and history that we ended up basing SI units on decimals -- it could easily have been 12 if, say, the few French scientists who would have convinced the rest hadn't had their heads chopped off during the Revolution, or whatever.
>we can't easily memorize multiplication tables for base 60.
I don't think anybody memorizes multiplication tables anymore. At least I wasn't taught to when I was in school (I'm 29).
I think you do memorize them, albeit not consciously. The results of operations on small numbers are "cached" in your head while operations on large numbers are subdivided into more operations on smaller numbers, and so on.
I guess OP meant that for a base-60 system we would have to do a lot of caching.
You would still have to memorize 60 different symbols. A base so large is unwieldy either way.
How do you calculate 5 * 7? With a calculator?
Base 10 isn't so special either; there are some cultures that used, or still use, base 8, counting the spaces between their fingers instead: https://en.wikipedia.org/wiki/Octal
But we do have 10 fingers. And apart from that, base 10 is just about right: not so small that numbers become long and hard to comprehend, not so big that it strains your memory.
Would it be easier for you if you were using base 2 or base 16?
I think having 1/4 and 1/3 come out as round numbers would be more useful than being able to count on my fingers.
There are finger-counting methods for pretty much any base. Most are more efficient (capable of counting higher) than the base-10 approach of associating each finger with a single digit.
https://en.wikipedia.org/wiki/Finger-counting
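For example, treating each of the ten fingers as a binary digit lets you count to 1023 instead of 10; a quick sketch (the function name is mine):

    def fingers_for(n: int) -> list[int]:
        """Which fingers (numbered 0-9) to raise to show n in binary."""
        assert 0 <= n < 2**10                  # ten fingers -> 0..1023
        return [i for i in range(10) if n & (1 << i)]

    print(fingers_for(10))    # [1, 3]  -> fingers 1 and 3 (values 2 + 8)
    print(fingers_for(1023))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]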
Are 3 and 4 magical? Why not 1/2 and 1/5?
They're all magical because they are small. 12 is a highly composite number and 10 is not. Ancient Greece was not ignorant of this fact.
https://en.wikipedia.org/wiki/Highly_composite_number
https://youtu.be/2JM2oImb9Qg
It is easy to eyeball divisions by 2 and 3 by hand, and get results with a high degree of accuracy. A division by 4 can be accomplished by dividing by 2 twice.
It is not easy to eyeball divisions of 5 by hand; you pretty much need to use a ruler for that to get results of tolerable accuracy.
Well, 10 is only divisible by 2 and 5, while you can divide 12 by 2, 3, 4 and 6 (in the integer space, of course). So you can express more basic divisions without remainder.
A better idea is to go duodecimal for everything else!
While we're at it, also switch all writing systems to Tengwar (which is a great, highly logical writing system) and maybe teach Sindarin in schools.
About as realistic.
Only when you explain timekeeping to a small kid do you realize how strange the whole system is.
We also have 12 months; is there a decimal-year thing too?
IIRC, there actually is one: 13 months, all with the same number of days (28), plus one special day that belongs to no month, plus a leap day every four years as usual (again, outside any single month).
The International Fixed Calendar: https://en.wikipedia.org/wiki/International_Fixed_Calendar
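A toy sketch of that scheme for a common (non-leap) year, using the month names from the linked article; leap-day placement is ignored here for brevity:

    # 13 months x 28 days cover days 1..364; day 365 is "Year Day", outside any month.
    MONTHS = ["January", "February", "March", "April", "May", "June", "Sol",
              "July", "August", "September", "October", "November", "December"]

    def ifc_date(day_of_year: int) -> str:
        assert 1 <= day_of_year <= 365
        if day_of_year == 365:
            return "Year Day"
        month, day = divmod(day_of_year - 1, 28)
        return f"{MONTHS[month]} {day + 1}"

    print(ifc_date(1))    # January 1
    print(ifc_date(59))   # March 3
    print(ifc_date(365))  # Year Day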