> A one-day “world holiday” between the final Saturday of the year and Sunday, Jan. 1, would bring the total number of days to 365.
I was thinking to myself that I liked this idea until I saw this. So there are days that no longer belong to a week at all. I can't imagine how much special-casing this would require.
> “A ‘month’ does not mean anything,” they wrote. “A day means something. A year means something. But a month?”
Then by the same logic the week needs to be abolished too, and the problem above wouldn't exist. Now go tell that to the religions.
I know, right? Let's skip all this stupid special-casing nonsense and just make each new second in this calendar equal to ~1.002747253 current seconds. Extra day solved! If we start this on the day equivalent to January 1 in the current calendar, my plan has the added benefit of drastically reducing skin cancer by making 8 am on July 1 of the new calendar roughly equivalent to 8 pm in the current calendar. Can't get sun-induced skin cancer if you aren't awake enough to be in the sun all summer!
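A quick back-of-the-envelope check of that joke arithmetic, in Python (181 days is just Jan 1 to July 1 in a non-leap year):

```python
# Stretching each "new" second by 365/364 makes 364 new days span 365
# current days; by midsummer the stretched clock has drifted ~12 hours.
STRETCH = 365 / 364              # ~1.002747253, the factor claimed above
days_elapsed = 181               # Jan 1 -> July 1, non-leap year
drift_hours = days_elapsed * 24 * (STRETCH - 1)
print(f"stretch factor: {STRETCH:.9f}")
print(f"drift by July 1: {drift_hours:.1f} hours")   # ~11.9, so 8 am ~ 8 pm
```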
Let's just drop time zones. Then noon is not at 12 o'clock in most parts of the world ... who cares? The sun will rise at your place not at 6 or 7 but at maybe 11 or 17 or 23. I would love it, humanity would have the same time all over the planet. I think it is a silly tradition from medieval times to expect 12 to be high noon.
Yeah, I love time zones because they mean the sun roughly corresponds to the same time on the clock everywhere, which instantly makes decisions about other time zones easy. I hate daylight saving time because it means my sleep schedule is messed up for about two weeks twice a year, and I get absolutely nothing from it except being screwed out of drive-in movies being as much of a thing as they should be, and an increased risk of sudden death from a cardiovascular event.
That works great for people who live close to the equator, somewhere with time zones that actually reflect the sun. Many regions are not like that, though.
In the middle of Europe (Paris, Berlin) the time the sun is visible varies by over 5 hours, and if you go further north it's even worse. Where I live it is already dark when I leave work at 5pm. Three months ago, it was brighter at 10pm.
Many regions have timezones that are too big. In Europe at the east of the CET timezone the sun rises at 6:36. At the west it is 8:16 - nearly two hours difference!
China is even more extreme: although the whole country is on one time, people in the west start their days later. So if you are in the office at 8am in Beijing, even though it's 8am in the western cities too, you don't know whether people there are in the office or not. At that point time zones have lost their value, since you can't use them alone for scheduling.
You aren't thinking about this right. Instead of worrying about the sun rising at 0330, worry about how you are going to build enough blackout-blind factories to profit from the obvious solution that avoids writing thousands of lines of code.
It's really not that difficult compared to what we have now. In our current system, there's one hour a year that happens twice and one that gets skipped. It's a weird special case, but we manage.
Badly; we manage badly. I have wasted hundreds of hours debugging other people's code and writing involved workarounds because the team couldn't resist using "standard" ISO timestamps instead of the much more "arrow of time"-like numeric offset (like Unix time)...
How are ISO timestamps a problem? You need a time zone, of course. Usually people stick to UTC. But it's trivial to convert ISO timestamps to Unix time, so it's hard to imagine how one would cause more bugs.
ISO 8601 timestamps do require a time zone when it is relevant. You can only use non-offset timestamps if "local time" is constant. If your time zone varies, you cannot use non-offset timestamps. Your choices are to choose a single tz and stick to it (in which case offsets should be used to clarify for humans), use UTC (which is probably the simplest option since you're sticking with one tz to begin with) or use offsets properly, marking the current timezone on all of them. Trying to use "local time" in places with DST is like trying to build a bridge out of paper.
For this reason, most people either use UTC time or they use offset timezones for timestamps. The latter is nicer for humans who like understanding when things happened in terms other than hundreds of millions of seconds since 1970, but more complex to implement.
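To make the two options concrete, a minimal sketch using only Python's standard library; the timestamps are made-up example values, not anything from the thread:

```python
from datetime import datetime, timezone

utc_stamp    = "2023-11-05T14:30:00+00:00"   # option 1: store everything in UTC
offset_stamp = "2023-11-05T09:30:00-05:00"   # option 2: local time + explicit offset

a = datetime.fromisoformat(utc_stamp)
b = datetime.fromisoformat(offset_stamp)

print(a == b)                                  # True: both name the same instant
print(a.timestamp())                           # 1699194600.0 -> trivially Unix time
print(b.astimezone(timezone.utc).isoformat())  # normalize the offset form to UTC
```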
Using standard ISO timestamps is good. The reason why people would like to use them is because they sort well, they parse everywhere, and they're human-readable. That last reason is the main and really only reason why you would prefer them to a Unix timestamp. They're standard, but they're a standard for how to communicate things in a human-readable manner. If you wanted to just store accurate time, store a binary 128-bit fixed-point number instead of UTF-8 text, since it doesn't need to be human-readable.
Is there anything in the standard to support what you're saying? Based on [1], the guidance is:
> [5.3.1.1] When the application identifies the need for an expression only of a time of the day then the complete representation shall be a single numeric data element comprising six digits in the basic format, where [hh] represents hours, [mm] minutes and [ss] seconds.
> [5.4.1] The components of a time-point shall be written in the following sequence
> For calendar dates:
> year - month - day - time designator - hour - minute - second - zone designator
> The zone designator is empty if use is made of local time of the day in accordance with [5.3.1.1]
Yeah, exactly. When there's only a need for time of day, then the time alone is enough. The standard lets you pick the tool you need here; it's not going to have some kind of procedure for determining whether your application should use one thing or not. It's very clear here that undesignated timestamps are the wrong one. The standard only describes a set of tools; what I said was common sense. However, the information is there. In 5.3.4.1 you have "When it is required to indicate the difference between local time and UTC". When it is required. If your time zone changes, it obviously is required.
> That last reason is the main and really only reason why you would prefer it to a Unix timestamp.
I think this sort of argument is a bit shortsighted. Using ISO timestamps brings with it the semantics of including UTC offsets in pretty much every operation, and historically date library APIs have had their ways of confusing consumers. Sure, you can read the data while troubleshooting without having to convert it. You'll also end up troubleshooting more, because somebody somewhere used a library call that discarded an offset or something similar.
I had to look up that last bit. There was a system I was working with whose developers told me “Assume Eastern Time” all the time (I'm on PDT/PST)…until the daylight saving time change in March. I missed some data from their end and their developers told me to “just fetch again and correct your code”
> While it may be safe to assume local time when communicating in the same time zone, it is ambiguous when used in communicating across different time zones.
I love how the time zone designator section of the Wikipedia article also illustrates a common complexity pitfall.
> So the zone designation for New York (on standard time) would be "−05:00"
Except for the other half of the year, when it is "-04:00". If you happen to pick that offset up in logic, make sure that where you apply it, it is still the same half of the year! A classic off-by-one.
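A small Python sketch of exactly that off-by-one, assuming the standard zoneinfo database is available; the July instant is just an example:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

utc = datetime(2023, 7, 1, 16, 0, tzinfo=timezone.utc)   # some summer instant

cached_winter_offset = timezone(timedelta(hours=-5))     # "-05:00" picked up in January
print(utc.astimezone(cached_winter_offset))              # 2023-07-01 11:00:00-05:00, an hour wrong
print(utc.astimezone(ZoneInfo("America/New_York")))      # 2023-07-01 12:00:00-04:00, correct (EDT)
```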
Calendar items that I added a month ago have now shifted by an hour. It's all rather exasperating. Not to mention the three weeks of headache each spring after the change.
Absolutely nothing is preventing you from celebrating your birthday the way we do holidays. My birthday isn't September 22, it's the third Saturday in September.
Heck, it's not even weird to skip this redefinition entirely and just have a party on a more convenient day.
There's nothing preventing that, sure, but I don't think I'd really enjoy that. I want to celebrate my birthday on the anniversary of the day I was born, not on the nearest weekend day.
(My birthday falls on a national holiday where I live, though, so in practice I've never had to worry about school or work on my birthday.)
In this calendar system, though, every date falls on the same day of the week every year, so the third Saturday in September would always be the same date, and there'd be no separate "nth weekday" format left to celebrate in.
Nah, the opposite is true for me. If I know for sure it's always a Tue, I can plan around it. I can schedule PTO for Mon and Tue, and have a nice long weekend, for example.
What's so frustrating to me is when I realize that, for example, a public holiday falls on the weekend (I know some countries solve this by moving the holiday to Monday or Friday). Last year, the 24th of December was a Saturday, pretty much the worst timing for Christmas.
That's not a reason to prefer one calendar over another. Celebrating the anniversary of your birth is an arbitrary decision. Preferring it to fall on a weekend is odd. Anyone can celebrate anything at any time.
That doesn't really prove anything; I could invent a language where my word for Wednesday meant "first day". But that doesn't make it so. The language merely would follow my convention, not describe something inherently true about the numbering.
(FWIW, I agree that Monday is the second day, but mainly just because that's what I was raised to believe. Plus if Sunday is the end of the week, that means Saturday isn't a weekend anymore, which is nonsense.)
It's similar in Georgian: Monday through Thursday are named roughly "two-Sabbath" to "five-Sabbath", with Saturday being the Sabbath, of course. Not sure why those numbers, but I assumed it was the number of days past Saturday?
Friday and Sunday get different names, unrelated to Sabbath as far as I can tell -- paraskevi and kvira.
Oh nice, the more you know! Kvira probably refers to the same word meaning "week", but seems like there's theories it could derive from Greek mythology as well.
Lots of influence from Greek culture in Georgia for sure. For a fun fact, the Georgian for "Greece" is saberdzneti -- the land of the wise. Berdzeni itself means wise (as well as Greek), but I'm not certain if it's derived from an alternative historical name for the Greek.
Doesn't it mean "Moon's day"? From a quick Google it seems it derives from the Greek "Selēnēs hēmera", where "Selēnē" is the goddess of the moon and "Hēmera" is the personification of day (all of this is straight from Google [1] btw, I know nothing about etymology; the only thing that sparked my interest is that in most Romance languages the word for Monday has a similar root to "moon").
He said the Greek (modern, not ancient) name -- Δευτέρα (Deftéra), which has nothing to do with moons, but means "second day". Obviously this only makes sense in a Christian context which starts the week on Sunday. Monday is "first day" (星期一) in Chinese, for example.
Christian context is a bit fuzzy here. On one hand, Monday is called the same in Hebrew (יום שני [yom sheni] = "second day"). On the other hand, Russian "вторник" [vtornik] ≈ "second" is Tuesday, same in other Slavic languages.
I don't think I'll ever get this. Sunday and Saturday are at the (opposite) ends of the week. If Monday is the first day of the week and Sunday is the last, Saturday is no longer a weekend. It's a... near-the-weekend.
Not sure I follow? I've always considered both Saturday and Sunday to be the weekend. When someone asks me what I'm doing on the weekend, I'll assume they are asking about Saturday and Sunday, and often Friday evening as well.
The weekdays currently act as a weak parity check: if I said "let's have a meeting on Tuesday 12/8", you'd double-check, because December 8th isn't a Tuesday.
The weak parity check didn't prevent a friend from booking a rental car for April even though we were going to be in the destination city in July; the days of the week matched!
Changing the calendar in general seems stupid. The Gregorian calendar is already well established and simple to understand. What benefit does changing it actually bring?
More importantly the US should switch to the metric system.
Your use of "switched" is leaving out a lot of nuance. Metric is by law preferred for trade and commerce, but what to actually use is left up to each individual and business, and individuals and business don't need to actually care about this law.
Like, your arguments against changing the calendar to something more rational are the same arguments a lot of US folks use to justify not switching to the metric system.
> importantly the US should switch to the metric system
Why switch to a system that is widely agreed to be worse for normal human communication than the one we have?
Fahrenheit is a far superior system, where 0 to 100 roughly reflects the range of temperatures one might expect in the lower 48. When you are outside that range, you know you're in for very uncomfortable weather. Celsius doesn't give much benefit beyond the vague notion that water will freeze around 0.
And then we get to distance. The US has a system where a foot is divisible by 2, 3, 4, and 6. Metric, being base 10, is only divisible by 5 or 2. It is horrid if you need to divide things into thirds.
> When you are outside that range, you know you’re in for very uncomfortable weather.
Which is... pretty random? What is "very uncomfortable"? 'Switch to autumn/spring clothes' uncomfortable (<20ºC)? 'Switch to winter clothes' uncomfortable (≈0ºC)? 'Put a sweater under your winter clothes, and maybe double trousers too' uncomfortable (<-20ºC)?
Celsius fits the mental model of normal human communication quite well if you use it.
> Celsius doesn’t give much benefit beyond the vague water will freeze around 0.
Which is a very major shift: rain turns into snow, roads get slippery, you should dry your hair extra-carefully (> 0ºC it's raining anyway, you'll get wet anyway, why bother; <0ºC it will turn to ice on your head, you definitely don't want that).
> Celsius fits the mental model of normal human communication quite well if you use it.
That last bit is the key: Fahrenheit also fits the mental model of normal human communication quite well if you use it.
The F vs. C wars are silly; if you were raised using a particular unit system, then that system will most likely feel the most natural to you. If you didn't, you can certainly learn another and adapt, but for most people it won't feel right.
This is all just fine. Yes, it's annoying when you travel to a place that uses a different unit system than you're used to, but life is full of such things, and we should just live with it rather than trying to argue what's "right". Nothing is right, nothing is wrong. Whatever is, just... is.
(Note that I'm not making any arguments about which one is better for math or science or whatever. Just about what feels comfortable. For the record, I do think many things -- including global interop -- would be easier if we all used metric. But I don't think, no matter how hard I try, I'll ever be able to "feel" celsius properly.)
Neither Fahrenheit nor Celsius is "superior" on its own. However, Celsius slots into scientific notation nicely, which gives it a huge benefit.
I'm Canadian. I deal with a bastard mix of metric and imperial daily. Fahrenheit is the one unit I've never needed to learn in the slightest. I can deal with lbs, feet, inches, etc.
As for claiming base 12 is superior because you can divide by 3, what happens when you divide by 10? Surely if base
> Fahrenheit is far superior of a system, where 0 to 100 roughly reflects the range of temperatures one might expect in the lower 48. When you are outside that range, you know you’re in for very uncomfortable weather. Celsius doesn’t give much benefit beyond the vague water will freeze around 0.
First, 0 and 100 F seem completely arbitrary.
Second, if they have such immense value, people using centigrade could manage to remember "-18" and "38."
My brain is sadly stuck on the default "F" units since I grew up in and continue to reside in the US, yet even I can remember general mappings of Centigrade to "how it feels" as a human. This is not a difficult thing to do/learn.
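For reference, a tiny check of that mapping with the usual conversion C = (F - 32) * 5/9:

```python
def f_to_c(f):
    """Convert Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

for f in (0, 32, 68, 100):
    print(f"{f:>3} F = {f_to_c(f):6.1f} C")
# 0 F is about -17.8 C and 100 F about 37.8 C, i.e. roughly -18 and 38.
```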
I sometimes appreciate the extra dynamic range of Fahrenheit for describing temperatures relevant to human environmental comfort. With C you need a decimal point and an extra significant digit. It's rarely that important, though. You could get by using just the even numbers of F temps for casual purposes, or whole numbers of C.
I personally doubt I could ever really tell a 1F difference.
Other factors (sunlight, wind, humidity) make that level of precision kind of useless for a "how does it feel" situation, which IMO just further degrades the value of 0-100F thing as conveying any particular significance.
Your argument appears to be that Americans are too stupid to figure out metric.
Water freezing around 0 is a good indicator that the temperature is cold enough to kill you given that you're mostly water.
If I'm dividing a length of wood in thirds, I use my metric ruler, and I mark out three even cuts. It's crazy how they sold me a 2x4 in feet and it still divided into centimeters.
I agree that base 10 is not optimal but the US/imperial distance system is hard to defend.
A foot is 12 inches, a yard is 3 feet and a mile is 1760 yards. And don't get me started on the weight and volume units.
I entertain a hypothesis that such inconsistent unit systems (and heavy usage of fractions instead of decimal or other positional notation) do make people less apt at handling numbers. Having lived for a while in England (which is legally metric but in practice imperial for many things), I think it is reflected in how things involving numbers are often stated in quite convoluted terms.
Why does the foot being divisible by 12 help anything? If you're dealing with something 7.3 ft long, what's the advantage over 2.23 meters?
Hell, if you've got something 2.4 meters long and 7.87 ft long, it's the metric length that happens to be conveniently divisible by 12.
The situation with machine tools in the US is unbelievable, by the way. So many stupid mistakes have been caused by the confusion between mils (1/1000 of an inch) and millimeters. And many many tools and bits are designated in fractional inches rather than whole or decimal units (as in countries using metric) which is a massive pain in the ass because both CAD software and quick mental comparisons are generally not conducive to bizarre fractions like 9/64".
The situation is legitimately a little bit different with minutes and hours, since we are able to pick durations somewhat arbitrarily to match the units; i.e., if hours were 64 minutes instead of 60, many meetings would instead be 64 (or 32) minutes long. This has to do with the reality that we generally do not know accurately in advance exactly how much time is required, so in general there's a lot more approximation involved with common measures of time than with common measures of distance, and it's handy to be able to split the hour cleanly in multiple ways.
Carpentry and construction are both also areas where you don't really have an "exact" need to be at certain measure. You can usually get away pretty well with just picking a nice round number and going from there.
Machining is different, and metric is accordingly often used there.
> Carpentry and construction are both also areas where you don't really have an "exact" need to be at certain measure. You can usually get away pretty well with just picking a nice round number and going from there.
This does not sound like you’ve done much of either. I spent my summer rebuilding my kitchen and, oh boy, do millimetres matter.
> The situation with machine tools in the US is unbelievable, by the way. So many stupid mistakes have been caused by the confusion between mils (1/1000 of an inch) and millimeters.
To be fair, this wouldn't be an issue if we had only imperial units. It's only possible to mix these up because we have an awkward mix of both systems.
> The situation is legitimately a little bit different with minutes and hours, since we are able to specify units of time somewhat arbitrarily to match the units, i.e., if hours were 64 minutes instead of 60, many meetings would instead be 64 (or 32) minutes long.
The same situation is true of carpentry, though. When we're building something out of wood we rarely have specific dimensions we must meet; we're building something artificial and can pick the closest round number. 2x4s are 2" by 4" not because they need to be slotted into some naturally-occurring fixture but because it's a nice round number that is sufficiently useful. The standard ceiling height in the US is 8' not because of any law of the universe but because it's a round number that's easy to measure and easy to divide (quarters, halves).
Your examples of something 7.3 feet long or 7.87 feet long simply don't come up in carpentry because we'd usually just round up to a nice number and scale the whole project accordingly. And if we do have something that needs to be precise within +/- a certain tolerance, a base unit that is easily divisible is more likely to be able to round conveniently while remaining within the tolerance.
Ugh, so I was just making some new window screen frames to replace some that were missing, and what a pain that was to measure things out. I had fun dimensions like 21-5/16" all over the place. Now, they certainly didn't need to be accurate to a sixteenth-inch, but being off by even a quarter inch could easily make them not fit.
Too large and you just can't get the screen into the frame, and too small and it can just fall out, or at the very least leave gaps large enough that bugs can get through.
I agree with you that there are many things where we don't have a specific physical need for certain dimensions, so we just pick round numbers. But I think the cases where we do have to conform to some messy existing physical dimensions come up more frequently than you think.
Hours being base 12 is why time is way more painful to calculate with than other quantities, and why SI just sticks to seconds.
Would Americans love pre-decimal British currency? All those shillings and florins with lots of integer divisions? I bet the average metric hating American would be unironically demanding decimal currency back when faced with it.
I always get curious about how American construction works when I hear this argument. Do builders spend all their time dividing stuff into 3 rather adding, multiplying or subtracting? I would've thought adding a sequence of measurements up would be a far more common operation.
What happens when you need to divide something into 3 a second time? Or you need to divide an arbitrary length into 3? Or what if you need to divide into 3 sections, but there is something extra between each section (like a frame or a post), or you have to account for the width of the cuts etc?
Is an 8x4ft sheet (eg plywood etc) really easier to mentally divide into 3 (on either dimension) than a 2400x1200mm one?
But if your Sabbath falls on a Tuesday this year, a Monday next year, etc., it's going to cause lots of trouble with store hours, work weeks, school schedules, and so on.
Because the religious make up a majority of society (and it is apparently only going to grow due to secularism failing to pass on the idea that having more than one kid matters).
Ah, yes, the "programming it's too hard, let's go shopping!" school of thought.
Visionaries at companies like Google and Apple are known for their "ugh, special cases!?" approach to not finishing Jira tickets, which is why they are where they are today.
> I cannot imagine how much special casing this would require.
Someone who administered a scheduling system for flight crew was telling me how hard it is. Multiple time zones, contracts, unions, jurisdictions, companies and staff roles (which require frequent recertification) etc.
Imagine handling a bonus day in this context. It does not sound good.
I have tried explaining some of this complexity to people when discussing Y2K (usually with them rolling their eyes at the naked cash grab by overpaid consultants and me arguing that, while there clearly was a gold rush, there were also real problems that got fixed, which is why the stuff wasn't breaking left, right and centre).
A useful suggestion, definitely worth considering. But having a year that is seven days longer would complicate business and taxes, which was the main reason for the regular calendar. Also, dates might get too much out of sync with things like the equinox for some religious tastes.
The French Republican calendar (used by the Revolutionary government in France in the 1790s and early 1800s before Napoleon abolished it, as he did most revolutionary innovations) had a similar system of 5 or 6 holidays outside any "week" (technically "décades", as they used 10-day blocks as part of the same base-10 obsession that led to the metric system) or month, called the "Sansculottides". It apparently worked fine. The main drawback was the 10-day week, as workers only got one day off in 10 rather than 1 in 7 (weekends weren't a thing then and people normally only got Sundays off).
I think the best backwards-compatible calendar we could adopt is the ISO week date calendar, commonly used in finance: forget months, every day is a week number + a day. So today is 44.7, which means week 44, day 7. This also gets you a nice almost-even 13 weeks per quarter, and easy calculations (that report is due w48.5, aka Friday in 4 weeks). We used a worse version of this calendar at Intel and it was great (we called them work weeks instead, and IIRC it was not ISO 8601-aligned).
This way, people can play with legacy dates for their birthdays, religious observances, etc but use a sane system at all other times. This already happens with Chinese/Islamic calendars anyways, Chinese new year is a date I have to look up every year, and that’s OK because the calendar is used for literally nothing else outside traditional festivals. Optimise for the common case, after all. Christmas is w52.1 this year, w52.3 the next, etc, just as CNY is 22-jan and 10-feb this and next year (but 1/1 on the Chinese calendar).
The only trouble is that some years have 52 weeks and some 53, but such is the price to pay for backwards compatibility.
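If you want to play with it, Python's standard library already speaks this calendar; a minimal sketch (the dates are just examples matching the ones above):

```python
from datetime import date

d = date(2023, 11, 5)                     # a Sunday
year, week, weekday = d.isocalendar()
print(f"{year}-W{week:02d}-{weekday}")    # 2023-W44-7, i.e. "44.7"

# "due w48.5" back to a legacy calendar date:
print(date.fromisocalendar(2023, 48, 5))  # 2023-12-01, a Friday
```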
Having worked in banking and hedge funds for many years for US, UK and European firms, I have never once encountered this ISO calendar. I also partly trained as an auditor when I started out; similarly, I never saw this. In fact this is the first time I've ever heard of it.
This sounds nice — but is it really necessary to treat our good friend, the decimal point, so badly?
Calling today 44.7 means that we're effectively using base 7 after the decimal point (but it's 1 through 7 instead of the usual 0 through 6, for extra fun), while using base 10 before it.
Why not just something nice and easy-to-parse without all the mathematical confusion, like "44d7"?
That dot has many interpretations; in dates, sometimes it separates the month from year, or the day from a month. In version numbers, it separates major from minor, etc.
It's so much not a "decimal point", that in some languages (Hungarian and German, for instance) it is a thousands separator, whereas for decimal fractions a comma is used (10,000.00 in English is 10.000,00 in Hungarian, making Excel flip)
It is really just a dot. Our interpretation is what makes it a certain kind of separator. Or not a separator at all, like at the end of this sentence.
But... It doesn't in IP addresses, version numbers, dates, with different locales, and possibly many other uses.
These examples clearly show that "numbers with a dot" doesn't always mean decimal.
A dot is a dot. Mainly with the _role_ of a separator. When you interpret it, it might mean a _decimal_ separator for you, but as well might mean a thousand-separator for MS excel with the German locale.
That depends on the country. For example, in Japan it is very common to write dates with either a dot, or with the full kanjis for year, month and day.
“As a consequence, if 1 January is on a Monday, Tuesday, Wednesday or Thursday, it is in week 01. If 1 January is on a Friday, Saturday or Sunday, it is in week 52 or 53 of the previous year (there is no week 00). 28 December is always in the last week of its year.”
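A quick way to see the rule in action, assuming the standard library's isocalendar() faithfully implements ISO 8601:

```python
from datetime import date

for year in range(2015, 2026):
    jan1, dec28 = date(year, 1, 1), date(year, 12, 28)
    # Jan 1 lands in week 01 only when it falls on Mon-Thu; Dec 28 is
    # always in the last ISO week of its own year.
    print(year, jan1.strftime("%a"), jan1.isocalendar()[:2], dec28.isocalendar()[:2])
```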
No, the one where there are years without a January 1, and years with two January 1s. There's nothing that says January 1 has to be New Year's, and in some cultures it already isn't. There's also nothing that says you can't continue celebrating January 1 as the new year while using a week-day calendar for all official purposes.
A new calendar not mapping cleanly to the current months hardly seems like a valid criticism of a calendar that intentionally doesn't map to the current months.
Because there's no confusion about whether you're talking about the new February or the old February. Is Feb 1 the 32nd day of the year or the 29th day?
I typically use day of year only; it's super helpful, especially when calculating time between dates, or time until a date. I also use metric time, which makes life even easier for to-the-second calcs. For many people, myself included, regular breaks through things like weekends do not truly exist, so there is no point tracking them in the main date format. You might need it on occasion, but it is information so rarely useful that recording it every time I write or use a date is wasted energy, especially when you consider how frequently you need to handle long-range dates, which DOY does more directly than day of week / week of year.
Not sure if you’re serious, but the 7-day week as so entrenched that no system that breaks it will see any adoption. Maintaining the 7-day week and the 365 day year is non negotiable for any kind of calendar reform.
But those 365 days don't have to all fit into weeks. Take 364 days, divide those evenly into 7-day weeks, and then have the one (two if leap) new days be not part of the weeks. Call it Caturday or something. Hold a contest.
I always figured the funnest thing would just be not to give it a name, and then when people ask be like, well yesterday was Sunday and tomorrow is Monday.
But the more reasonable proposal I heard was just to call it New Years Day/Weekend, depending?
The fine article talks about that---there was Jewish resistance to the idea because they felt that the Sabbath was always the 7th day and that would get out of sync.
While we obviously can't change the length of the year in days, is there any actual reason to have a 7 day week beyond "that's what we've always done"?
That’s really a strong reason, is it not? I have no idea for how long humanity has tracked the seven day week, but suspect it’s been well over 2000 years.
Seven days is roughly how long the moon takes to go a quarter of a cycle: full to half, or new moon to half moon. A pretty good demarcation of a period of time between one day and a month (a full lunar cycle).
Yeah, momentum. I can't think of what authority would be able to change it and make it be globally followed. A tradition of thousand years, with roots in religious texts. I don't think it is within anyone's power to change it.
Just plain day number as the calendar is worse than weeks imo. We want to optimise for the common case, and wanting to know the day of week and number of weeks from today is much more common than day of year.
For example, “our next meeting is w48.3” is much more understandable than “day 339”.
I don't see how. You are making assumptions and claims for others with no support. To me that week.day number looks more complicated and less meaningful than plain days.
Perhaps weeks seem to make sense since we sometimes use them for offsets and ranges, like "this project will take 6 weeks", and figuring out how many days that is is inconvenient. But if we were using days then we simply wouldn't use groups of 30 or 7 for things like that, we would use groups and multiples of 10 or 5. We'd probably end up using kilodays and millidays too instead of years or minutes, and years would be arbitrary, but 52.1-something weeks is already arbitrary anyway.
And as that # of weeks example just showed, .day is ambiguous with decimal fractions of a unit.
All in all I see no special obvious rightness here. It's kinda crap actually.
If making such a change, why not make it even simpler: use year + day number in year. e.g. today is 2023 day 309 (or day 308 if more sensible 0-based indexing would be used)
And keep the named 7-day week cycle as-is independently of that, this one has been going for millenia and we need some weekends after all
This is called the Julian (or Ordinal; I don't think there is a difference) calendar and is in wide use in (at least) parts of the US military for tracking things.
Actually, looking it up now it looks like the usage of the name "Julian" is due to the way the old tracking systems were programmed and used "JDATE" or similar. Nowadays it's just burned in habit that it's called the Julian Date.
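For what it's worth, the ordinal-date format is a one-liner in most languages; a small Python sketch with an arbitrary date:

```python
from datetime import date

d = date(2023, 11, 5)
print(d.strftime("%Y-%j"))       # 2023-309, 1-based day of year
print(d.timetuple().tm_yday)     # 309, the same thing as an integer
```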
I know not to hope for it to ever catch on, but my dream date-time format would be:
year/day/hectosecond.second(.millisecond and so on to the desired precision)
The current date-time, to the second, would be 2023/308/342.65
The days, like the years and second-based units, are 0-indexed.
There are 100 seconds in a hectosecond, and normally 864 hectoseconds in a day.
And to go with it, I'd advocate either a "3-on, 2-off", or "6-on, 4-off" work cycle, depending on the nature of the work, or even just personal preference. This way, today(308) would obviously be on a "weekend" either way.
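Purely for illustration, a sketch of that (entirely hypothetical) format; the 9:31:05 UTC input is just reverse-engineered to reproduce the 2023/308/342.65 example:

```python
from datetime import datetime, timezone

def to_hecto(dt: datetime) -> str:
    """Format a datetime as year/0-indexed day/hectoseconds.seconds."""
    day = dt.timetuple().tm_yday - 1                  # 0-indexed day of year
    secs = dt.hour * 3600 + dt.minute * 60 + dt.second
    return f"{dt.year}/{day:03d}/{secs / 100:.2f}"    # 100 seconds per hectosecond

print(to_hecto(datetime(2023, 11, 5, 9, 31, 5, tzinfo=timezone.utc)))  # 2023/308/342.65
```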
Why even have years? Like Unix epoch, pick a start date, and just have the number of weeks increment indefinitely. If you really wish to do some separation, just say "last 50 weeks" or something.
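A tiny sketch of that idea, arbitrarily reusing the Unix epoch (a Thursday) as week zero:

```python
import time

# Whole weeks elapsed since 1970-01-01; no years, no months.
weeks = int(time.time() // (7 * 86400))
print(weeks)   # around 2809 in early November 2023
```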
So… are we going to talk about which day is day 1? Because even here we can't agree. This also translates to not having the same week number in certain years.
> The Jewish day of rest — the Sabbath — falls on every seventh day. With an added blank day inserted each December, the Jewish seven-day cycle (believed to be dictated by God) would no longer align with the days of the week. The Sabbath — a day on which work is prohibited for Jews — would land on a different day of the week (and not necessarily on a weekend) each year.
So did mankind keep the universal 7 day week (as god intended it) and in doing so, inadvertently prove it to be a man-made construct?
The Jews have been keeping track of 7-day weeks consistently for the past 4 millennia. If anything, it proves that trying to break the cadence gets ugly fast.
I think you are confusing the Sumerians with the Hebrews here. It was the Sumerians who have the oldest documented reference to a 7-day week, not the Jews.
Well, you know, these things, like the ban on eating pork (pigs don't walk in the sand, so you would have to buy pork from somewhere) and morals in general, are a bit opportunistic. There are 28 days in a lunar month, easy to divide by 2, impossible to divide by 3, so 4 is a convenient number. Nevertheless, God makes the sun and the stars and the moon, which go round every 28 days - the light before the stars, which could be well aligned with Einstein. When it's time to harvest, look at the moon, don't worry about the sun. And perhaps Jehovah of the Seven Armies was El in the beginning, a local deity who was certainly a harvest deity. The problem is difficult: the year doesn't agree with the months and the months don't agree with the days, hence the calendar.
I like to fantasize about what life would be like if we had different length weeks. What if we ended up with three or 5 day weeks? Or perhaps 10 days?
7 days feels like it aligns well with the way we think and communicate about time. But maybe that is because it’s the only week length I’ve ever known.
The phases of the moon cycle every ~29.5 days [1], so you could use those to make an argument that a ~7-day week is pretty "natural."
One could imagine early humans being much more aware of the phase of the moon (whereas I haven't even seen that thing in months), and the 29.5 day cycle naturally splits into two pieces, since it's (new moon) -> (full moon) -> (new moon).
And one could also imagine people were pretty aware of when they were halfway between a new moon and a full moon — so now we've naturally arrived at a rough time interval of 29.5/4 = ~7 days.
We even have words for the halfway phases between new moon and full moon: first quarter and last quarter. The 7 day week is not directly derived from natural cycles (unlike the month and year lengths), but the moon phases are convenient synchronization points for (approximately) 7 day cycles.
I think it would be interesting to try a 14-day workweek, working 9 days on and 5 days off.
The advantage being that you can do a lot more on your weekends (international travel on a weekend is now possible), and during the week you'd have a lot of uninterrupted time to work. I imagine the productivity would be similar to a five-day week but it would feel like a lot more time off.
I feel like it would be better to split up the current streak of 5 working days we have than to clump it into 9 days of continuous working. You would definitely get burnt out after 9 days, as people already get burnt out after 5.
We should go in the opposite direction - something like Weekend Wednesday [1] appeals to me. That way, you always have a break to look forward to, and you don't feel like your off days are unproductive. For longer breaks and international travel, that's what vacations and holidays could be for.
I could see that a one-day weekend might not seem long enough for some people, in which case I have a compromise: the current system. It isn't perfect for everyone, and isn't immune from change, but it's a decent enough middle ground.
Only slightly related, but I sometimes wonder how much more advanced we might be if we'd lucked into adopting a base-8 number system (8 fingers, ignoring the thumbs!) instead of base-10. Obvious comp-sci benefits aside, imagine the utility of a metric system based on (no pun intended) base-8.
(I know some people say base-12 would be better, as it has lots of integer divisors, but I think the ease of repeated doubling and halving, along with that of binary conversion, makes base-8 a clear win.)
It is an interesting thought experiment to wonder if certain things were discovered earlier or later owing to a particular type of maths that we practiced.
I was amused to learn that there are societies dedicated to the cause of the dozenal system who write papers and books urging everyone to switch. On the other hand, I don't know if it is exaggerated, but the French revolutionaries had grand ideas of creating decimal-based systems even for time. But better sense prevailed in the long run!
It is interesting to think about how changes of the form "if we can align everyone to change and adopt this new way" can be brought about at any scale beyond a few thousand folks. I think large mass-adopted switches (horses to cars, smartphones) have been about incentives and convenience demonstrated by some early adopters. But for basic arithmetic this seems impossible to pull off!
Yes, it's a coordination problem where the incentives for a single individual don't match the optimal global solution. Aka trapped by network effects. You don't see this problem with cars or smartphones because the benefit to the individual is obvious and independent of other people's choices.
Or we could adopt a number system that has actual, proven arithmetic benefits. See the Kaktovik numerals (https://en.wikipedia.org/wiki/Kaktovik_numerals). As a bonus the numerals are all one single stroke, so it's super fast to write!
Though, the benefits to calculation come from the fact that the numerals are iconic, not due to it being base-20.
I'm not sure we would be any more advanced. Anyone to whom binary conversion is relevant has sufficient intelligence to not need the convenience of base 8. For practical needs, convenient representation of repeated doubling and halving are of limited use. Much more relevant to the daily needs of actual people for most of history was the ability to reckon measurements using common objects such as the human body, hence the bewildering diversity of different measurement systems across different peoples, see e.g. https://en.wikipedia.org/wiki/Medieval_weights_and_measures. The base of the number system was a non-factor.
> I know some people say base-12 would be better, as it has lots of integer divisors
If people with an extra digit on their hands were more common and they were somehow elevated to a priestly/godly class at some point by a dominant tribe... we would probably have had a base-12 number system.
My point was that you could imagine an alternate history where people landed on base-8 (which of course would still be called "base-10" in that timeline) because of 8 fingers ignoring thumbs. And that that would have had some advantages. Obviously society isn't going to change now.
The problem would be solved if you could just change the orbit of the Earth around the sun so that it took exactly 364 days to orbit instead of 365 and some change. Right?
That would mean moving the Earth slightly closer to the sun. I wonder what side effects that would create (assuming of course, you could actually find a way to do it - not exactly in the realm of possibility).
Well the Earth's orbit is an ellipse, so there are times of the year where we are closer to the sun (January is when we are at perihelion).
We could either change our eccentricity in order to shorten our year to exactly 364 days, or we could keep our eccentricity constant and change our perihelion and aphelion.
I'm too lazy to do the math to figure out whether the change via either approach would be less than the difference we already experience due to our current orbital eccentricity.
Would love to know, if anybody feels like doing the algebra at the very least in terms of insolation.
Or, if that’s too hard, leave the orbit the same but slow the spin down a little so each day is longer. A 0.35% decrease in spin speed seems fine right?
According to Wikipedia [0], 70 million years ago the Earth rotated about 7 more times per year than it does today, meaning we just need to wait about another 10 million years and it'll happen on its own.
I remember doing a physics problem way back in high school where the teacher asked how long it would take to slow the Earth down 1 m/s if everyone on the planet started walking in unison westward and never stopped. The population was several hundred million less then than it is today, but I think the answer was still measured in centuries.
We could potentially do it over the span of millions of years. Not without risk though. And generally we would want to move further from the sun when it reaches a later stage in its life and starts to expand. https://en.m.wikipedia.org/wiki/Moving_Earth
All I am saying is, if we change the earth's rotation and period of revolution, we can fix it all on something sane for once.
May as well correct that eccentricity while we're fiddling about.
We slow the rotation, give us twenty-five hours in a day. We could all use a little extra. Then we go for four hundred days per year, with the extra near-thirty-five days as a kind of bonus. This gives us a nice ten thousand hours per year. We just tighten up the second a tad, so we have a hundred thousand of those per day.
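Checking the arithmetic (one reading of those numbers, anyway):

```python
hours_per_year = 25 * 400                 # 10,000 hours in the proposed year
old_secs_per_new_day = 25 * 3600          # a 25-hour day is 90,000 current seconds
new_second = old_secs_per_new_day / 100_000
print(hours_per_year, new_second)         # 10000 0.9 -> the "tightened" second
```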
There was a Saturday Night Live skit about the metric day that did this. If I remember right, a minute was 100 seconds, an hour was 100 minutes, and a day was 100 hours.
How will you stay up for a 40 hour work day? There will be an adjustment period that will be helped by G.A.S. Government Amphetamine Supplement.
This is really cool! I can't imagine that many of the people who are alive now being able to adjust to something like this easily (having to completely change their understanding of what a second, minute, hour etc. is), but I do wonder how well someone learning this system from scratch would fare.
> When rendering a clock that ticks with this new time, I realize that a second is too fast for a human to count. In fact, it's a little over 10 times faster (11.574x). So I invented a new term: Tenet. A tenet is 10 seconds. It's a human friendly time counter. Not affiliated with Christopher Nolan.
The word "month" descends from the Indo-European "mehns" which also translates as Moon so if we are to be true to the meaning of the word a month of 28 days makes no sense given that the Moon's sidereal period is 27.3 days and its synodic period is 29.5 days. The synodic period makes much more sense as a basis for a calendar as there are 12.3 lunations in a solar year. Man's earliest observations, looking at the night sky, would have been the 12 lunations in the year. We forget this when discussion veers off down the inevitable blind alley of pure numerical reasoning. Nature is showing us the way if we can see it.
Just start the months between days. Why do they have to align? The month is currently a completely useless measurement of time and only used to tell the approximate time of year. It doesn't matter if the next month starts half way through a day (which, of course, it does).
And the US almost achieved official metrication in the 80's, but the head of NPR (and Kissinger bagman) with another bagman convinced Reagan to stop it.
> By 1928, Eastman had already implemented the IFC internally at Kodak and was spending his own money to persuade the rest of the world to follow suit. Soon 140 American companies joined him, maintaining a 13-month calendar for their businesses and wagering that calendar reform would continue to gain traction.
I've seen this repeated before, but never with any details or attribution. Did Kodak really use this 13 month calendar? What about the 140 other companies? If it was really so popular, I would have expected it to leave more of a mark.
Almost adopted, or almost back to it. I remember reading in Toynbee that the solar calendar was an effect of the hard transition from a matriarchal society to a patriarchal one. I also remember reading Works and Days and how the Boreas impregnated the mares. So yes, sure, it was a hard revolution, I think the most important one. 13 was the unlucky number, the Satan number and so on. But 28 days is easy and convenient, and even in the High Middle Ages, as I said yesterday, people counted months as 28 days.
I'd love the Positivist calendar. That said, even just having 7 months with 30 days and 5 with 31, grouping all the months of the same length together, would be nice. E.g. Jan-May have 31 days, all others 30.
Also might be nice if we can fix names so that months match their names again (September being the seventh, October the eighth and so on.). It's insane we lost this because two emperors named some months after themselves 2000 years ago and we are still stuck with it.
The emperors are not at fault. They renamed months and didn’t insert new ones. It’s just that the beginning of the year changed from March to January at some point. (Hence the leap day at the end of February as well.)
It's so weird to me that they assassinated him, but didn't rename the month back. Imagine Trump had renamed June to "Trump", he then gets impeached and 2000 years later the month is still called Trump.
Apart from a better calendar and one universal time zone, time should be stated in Swatch Internet Time / French Revolutionary time.
One day has 1000 beats, each 86.4s long. Perfect!
When I questioned the [intentional?] billing confusion, it was sneakily explained that the rental company uses 28-day "billing periods" for its "monthly rental promotion."
Sneaky. What I did (since we had an agreement for "toilet rental, six months") was not pay for the fictitious/additional "month." They can literally eat a turd.
Wouldn't their 6 periods be shorter than your 6 months? I assume this means that they tried to charge you for the 7th period. But probably also didn't tell you when the 6th period was ending and the unit was expected back.
Edit: I wonder if there is a legal definition of "month" and contracts are not allowed to redefine it...
Sure, not harder to remember a single date. But correlating your life events to past/future dates requires too much unintuitive math.
What's the next date that we'll celebrate Christmas on? On what date is your 4 week follow-up appointment?
You could also switch to decimal periods of 10/100/1000 days, but the whole source of the calendar problem is that humans care about the seasons outside.
Personally, I always thought 19 months, 19 days each, for 361 days and take the last four or five off (we do anyway) made the most sense. Then I learned the Baha’i calendar uses this system.
One variation I would support is a 19 day week, with 14 days on then 5 days off. That’s 266 working days per year, about the same as our present 260.
Alternatively, 10 days on 9 days off is 190 working days, basically instituting 2 months of vacation, which our friends in Europe already enjoy.
Personally, I like the calendar with twelve 30-day months and 6-day weeks, with an additional "new year" holiday 5-6 day quasi-month. You get equal quarters, sane numbering inside "working" months, and leap days stay inside holiday quasi-month.
Because you get exactly 5 weeks inside a month. We could use six 5-day weeks instead, but I think that having an even number of days in a week is convenient in many cases (gym routines is one such example). Also, with 6-day weeks, the quasi-month will not be longer than a week, so "holiday month's Monday/Sunday" will be unambiguous.
If an economy is advanced enough to allow it, it would be a 2-day weekend with 4 working days. I think it would be more balanced than the 4-day workweek which is slowly getting traction today.
I think the Solar Hijri calendar is better, but with Scientific Anno Mundi (instead of the Hijra), and then the local time zone should be based on mean solar time (with solar noon at 12:00 and midnight at 00:00), using UTC for nonlocal timekeeping.
For the sabbath issue, I'd say make the "extra" day an additional day of rest. That way you never go more than 6 days without one. It matches the spirit of the issue, no?
But you would certainly never convince everyone. :-)
Well, it certainly doesn't resonate with me, although I think I understand it well enough for these purposes. What I understand is that some Jews would be fine with it and some would not, and there would be endless argument among rabbis. Jewish religious tradition is amenable to logic and informed debate, even if the logic doesn't make personal sense to me.
When someone asks you "Do you have any plans for the weekend?", will you answer with what you're going to do on Saturday, or Sunday, or both?
For me, "weekend" very much means one unit consisting of Saturday and Sunday, and most often Friday night as well. "Weekend" meaning a single day which can be either Saturday or Sunday doesn't match any use of the word "weekend" I've ever encountered, either in personal life or in things like TV shows.
In the US, the week has a free day at the start, a free day at the end, and 5 work days in the middle.
In the Jewish calendar Saturday is the only free day, Sunday is a regular work day, and is the start of the week - in Hebrew the days of the week are not named, they are simply "1st day", "2nd day", etc. (Israel added Friday as a free day because that's the Islamic holy day.)
Nope. In Portuguese, Monday is "Segunda", which literally means "second". It's named as such because it's the second day of rest before the Passover Sunday. This makes Sunday the first day of the week.
Something that is not mentioned in the article but the wikipedia page explains well is how the IFC was handling leap years, which is basically by adding another day that is not part of the week.
Sounds like yet another manifestation of the fake rationality that was so popular in the late nineteenth and early twentieth centuries. We already have a calendar that is universally understood and works perfectly well for any day-tracking problem we throw at it.
I use a personal system, more for filenames than anything else, calling files blah_yyyymmdd, so that I can easily sort them with ls and even grep on the filenames if I'm looking for a particular year or month.
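Something like this, for anyone who wants the same scheme; the file name is of course just an example:

```python
from datetime import date

# yyyymmdd keeps lexicographic order equal to chronological order,
# so plain `ls` sorts the files by date and `grep 2023` picks out a year.
name = f"notes_{date.today():%Y%m%d}.txt"   # e.g. notes_20231105.txt
print(name)
```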
Tweaks definitely needed. Perhaps I leaned toward what would be fine for my familiar non-location/time-dependent work, which is already very flexible. Not true for most people, of course, but things change.
UK has 252 working days in 2023. Above is 243. The 41 single “off” days could be treated specially to give people more chance to set appointments or do certain kinds of business, like a more business-y Saturday.
If we redo the calendar, then can we rename the months so they make sense? Sept/Oct/Nov/Dec need to be the 7th/8th/9th/10th months.
Also, can we just switch to the duodecimal system? 12 is a much better base, as it's divisible by 2, 3, 4 and 6. Divisions into 12 are already fairly natural: hours, months, inches in feet, dozens...
Most civilizations started with a lunar calendar (still used in the Chinese and Islamic calendars), because it was much easier to observe the lunar cycle than the solar cycle. The solar calendar was the more useful one for getting good harvests, but technically much more challenging (requiring several tweaks over many centuries to get the calibration right).
Lining up with the solar cycle / journey is more useful though (away from the equator at least). I suspect that is why people landed on a ~365 day calendar.
If our orbit around the sun were a perfect circle instead of elliptical we could ignore the orbit completely, or at least I would be willing to, and focus on the moon or just some interval of rotations - day/night cycles.
While we're changing systems, could we all just agree to move to military time? In today's global economy, 7 o'clock for an international meeting isn't obviously am or pm. And it's easier to know how close you are to the 24th hour of the day when you're at 2200 vs 10pm.
Americans resist the 24 hour clock for the same reason they resist metrification and any other sane sensible scientific simplification of anything under the sun.
Ok, hear me out: for business stuff and basically everything we need to do with scheduling etc., why don't we use Unix time, and let everyone have whatever religious calendar they want?
There were no emperors when the Julian calendar was created, during the Roman Republic (even though Caesar was de-facto proto-emperor at this point), and it was based on the Egyptian calendar which had 12 months. The Roman calendar was crap and had to be manually tweaked every year by the Pontifex Maximus (Caesar), and he had been inspired by the Egyptian calendar during his time there with Cleopatra.
The only months named after emperors are July and August.
I'm not entirely sure about Roman times specifically, but the number 12 was used a lot more in the past (hence words like "dozen") because it's easily divisible by 2, 3, 4, and 6, and doesn't have nonsense like 10/3 (well, 12/9, but who does that?)
The old Roman calendar had 10 months, which is why "December" comes from the Latin "decem" for 10. Herp derp. IIRC they could add an extra "leap month" decided manually, or something. I'd have to look up how it worked exactly. It was pretty complex and irregular, which is why it was replaced with what we have now. Aside from one minor bugfix it's worked pretty well for about 2,100 years.
The new Roman calendar makes more sense if you start in March. Then the leap day is intercalary, and the numbered months match their position. Oh well, people want their new year's festival in January.
Janus is a god, not an emperor. February is named after a purification ritual, not an emperor. March is named for Mars, a god, not an emperor. April’s etymology is unknown but we can be confident it wasn’t named after an emperor. July is named after Caesar, August after the man once known as Octavian.
Our dumb calendar. Daylight savings time. Hell, time zones themselves. How many more dumb decisions of the past will we have to continue enduring just for the sake of, well, keeping things the same?
Daylight savings time is dumb. But I think time zones are not. It's very useful to have some measure of the hours that is roughly the same around the world.
Maybe some time zone borders are dumb, but mostly they follow the necessary complexities of political boundaries, geography, history, etc.
And then you talk to anyone from western parts of China and they tell you the single-timezone policy is blatant geographical discrimination by the central government.
> It's very useful to have some measure of the hours that is roughly the same around the world
You could just have the same time everywhere. The only thing that changes is that 12 o'clock isn't midday anymore, which it already isn't, almost anywhere, under the current system.
So in a story someone says, "I woke up at 5:00 every day" it would be meaningless without also knowing roughly what longitude they were at (With time zones it's approximate, but we know it means they woke up early). If we had a "universal time" - like all our clocks are on UTC - I think people would naturally invent local times and we'd be back to time zones.
Time zones acknowledge that everyone synchronizes with the people around them, possibly with some added information about people who are theoretically too far away to be in the same time zone together but still want to synchronize with each other. (This gets really absurd in the Pacific, but every time zone has odd jogs.) Removing time zones won't make that synchronization stop happening, it'll just make it more difficult to predict which clock distant people are really keeping in terms of when they're up and active and doing business.
One of the most relevant countries in the world still uses length measurements based on the length of the foot of their previous despot, and they absolutely refuse to change.
I grew up in Canada where we use SI for everything official but Imperial for most daily things. For instance, I only know my height in cm because of the license I was issued when I was 16. The funny thing with units of measure is that they are all mostly irrelevant to daily life because people inherently think in things they deal with. For example, 12 Mg is just as useless to both SI and Imperial. But, if you say "as heavy as 6 cars", everyone immediately understands. SI is nice because you can attempt to change 12 Mg into something like 12 kL of water (since every L of water weighs a kg) or the amount of water in a pool of size 1m by 1m by 12m. Still, 6 cars is much easier to reason about.
> For example, 12 Mg is just as useless to both SI and Imperial. But, if you say "as heavy as 6 cars", everyone immediately understands.
I’m confused. Everyone understands what 12 tonnes are here. Besides, is your average car really 2 tonnes? That sounds like an awful lot, and it varies depending on location.
> SI is nice because you can attempt to change 12 Mg into something like 12 kL of water (since every L of water weighs a kg) or the amount of water in a pool of size 1m by 1m by 12m.
SI is nice because it does not depend on your cultural background. A gram is a gram regardless of your daily experience. So you can be sure that it will be understood the same way by anyone on Earth, contrary to overly heavy cars.
> Still, 6 cars is much easier to reason about.
I think this is highly culture dependent. As a matter of fact it is very rare to see this sort of analogies here (it’s more common to see football pitches for large areas, for example).
> it’s more common to see football pitches for large areas, for example
That's the entire point of my comment. Everyone "knows" what 12 tonnes, 12 Mg, or 12,000 kg are, but they are internally recalculating to something they actually know. The Imperial system basically emerged from how people actually perceive things, which is why every conversion is wonky. For instance, a bushel is about how many apples someone can carry; a cord of wood was originally an amount of wood bundled with a cord, which has since changed definition; the same goes for pecks, fathoms, and knots. If this still isn't convincing, consider that Mach is basically the go-to unit when talking about aeroplane speeds, because thinking in multiples of the speed of sound is what many would instinctively do internally anyway.
A litre is exactly the size of a carton of milk over here, so that is a very common amount that you use every day. A cubic metre or if you really wish a kilolitre (kl, not the Americanism kL) is an easy to imagine amount since we work with metres every day as well. Cars differ widely in size, and I have barely any idea about their weight.
But… You kind of know basically how long an average literal foot is, right? Within an order of magnitude at least, and surely a lot more precisely and usefully than that. A 5-foot monster will be a few times bigger than a house cat, a few times smaller than a camel. Approximately.
How long is a literal meter? However long whoever says, by definition.
It is good that they define the foot, inch, etc. by SI units, because SI units are well-defined scientific units (except that, until recently, the definition of the kilogram was no good), and for the appropriate purposes the foot, inch, etc. can be reasonable measures (while SI units are good for general purposes, the others are good for specific purposes), so now they are also well-defined units.
You're alluding to the US customary units, but as far as I can tell, at no point was the US foot or any antecedent unit ever based on the length of someone's foot, let alone a ruler's foot.
The primary length standard at the time of US independence appears to have been the length of a physical yardstick, which first appears to have come into existence sometime in the Middle Ages. What was used to determine the size of the various yardsticks is extremely unclear, but at no point does it seem to have been based on anybody's foot (although an arm is one of the suggestions).
The people who named those months had years starting in March, so it checks out, like that legacy variable in that legacy code whose name, type, description in the comment, and actual purpose have all grown to contradict each other.
Odds are that bothered people in the past more than it bothers people now, making it a somewhat strange example of being forced to live with the stupid decisions of the past.
Genes are, at best, a correlate to culture and ethnicity, and sometimes a weak one. For example, the "one-drop rule" ensured that people with very little African ancestry would be Black (or other, ruder terms) for the purposes of American racial discourse and, therefore, culture.
Of course it will. If you can, say, change races when you move to a different country, or fundamentally modify the human drive to be tribal, you can change a lot.