Why should the country I'm in and the language that I'm currently speaking determine if I want periods or commas as thousands separators? Or if I want dates to be presented in one of dozens of insane formats instead of RFC 3339/ISO 8601?
Because that's what you grew up with, learned, and have used forever?
Let's ask it another way - why would the country you are in and the language you are currently speaking NOT decide those things (which are defined by said languages)?
What are we doing right now to each other besides banging out some symbols that have a shared meaning?
>why would the country you are in and the language you are currently speaking NOT decide those things (which are defined by said languages)?
Because these have no relation to how a particular piece of content should be formatted. The fundamental mistake made by many of these legacy APIs with respect to localization is the assumption that the locale should be determined from some property of the user, reflected in an OS- or application-wide setting that applies to all content. That only works as a rough approximation in the small minority of cases where the user is a member of a cultural sphere/bubble where exposure to multiple languages is a rare exception, like the United States. In the rest of the world, it is an everyday occurrence for multiple languages to exist side by side, including within individual web pages, documents, spreadsheets, and other content, which is why the only reasonable solution is the one where a locale is a property of a piece of content in its most atomic form, not of the user.
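To make that concrete, here's a minimal C++ sketch of content-scoped formatting (the locale names assume glibc and that both locales are installed; they'd be spelled differently on Windows). The point is that the locale is an argument travelling with each piece of content, not process-wide state:

```cpp
#include <iostream>
#include <locale>
#include <sstream>
#include <string>

// Format a value for one piece of content, with the locale passed in
// per call instead of read from a process-wide setting.
std::string format_for(double v, const char* locale_name) {
    std::ostringstream out;
    out.imbue(std::locale(locale_name));  // throws if locale isn't installed
    out << v;
    return out.str();
}

int main() {
    // The same document can hold both renderings side by side.
    std::cout << format_for(1234.5, "en_US.UTF-8") << '\n';  // 1,234.5
    std::cout << format_for(1234.5, "de_DE.UTF-8") << '\n';  // 1.234,5
}
```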
I grew up with dd/mm/yy, comma as the decimal separator, and dot as the thousands separator, but:
1. The fact that I grew up with those formats holds little weight for me. At this point, most of the literature I read and content I consume doesn't come from my own country. And why should it, when I have access to books, movies, and websites from all over the world, and my country makes up a tiny fraction of that? With the massification of international remote work this tendency will only increase.
2. Even when working with people from my own country, I can agree with them to use formats different from the ones we grew up with. In fact, every time we type a floating point literal in any programming language, we do it without following the rules we grew up with.
3. In the 21st century, in the context of globalization, the massification of the Internet, and widespread access to computing, to keep doing these things differently by country/language makes no sense, especially when there are already good international standards that we can follow. It's not that hard to learn, either. We even accept English as the de facto language of programming and software development, and that's a full natural language that we have to spend years learning.
4. In my opinion, yyyy-mm-dd and dot as decimal separator are simply better for practical reasons. However, if we collectively decide that other formats are the "international standard" then I'd follow whatever rule we decide, as long as it's not too bad.
Can you imagine if instead of using SI units, each country used their own special units of measurement? We are definitely better off with SI; standardization makes everything so much easier. I can talk to someone from Japan about meters and kilograms and they will understand without any ambiguity.
The trouble with this is it essentially reduces all differences to "biggest player wins". Because it almost never matters what symbol or representation you use for these things, your personal preference is largely based on what you grew up with.
I could resign myself to Americans* being allowed to pick the global date format to save a few cycles on an operation, but frankly I don't want to. It's not that important to me, and not having aspects, albeit minor aspects, of my culture steamrolled in the name of pointless efficiency is at least a little bit important to me.
Differences create friction yes, but I'm alright with that. I would prefer a little friction to grey uniformity. You're free to think differently, but you're not the speaker for everybody else.
* I should clarify that most Americans don't seem to want this either, this isn't a jab, it's just that if we did standardise everything their choices would probably be the ones which won out.
I'm not OP but a related point is that locale-aware formatting can be offputting if your application is not actually translated into the language. At a previous job we delivered an English-language product that used locale-aware number formatting. Our German customers asked us specifically to use English number formatting, both because it was a technical programming product and because the application was entirely in English. German number formatting with English text was undesirable to them, and they didn't expect us to translate into German. They saw it as an American product and that carried into their expectations about formatting. We went into the code and explicitly specified the American English locale for all string formatting, and they were happy.
Out of sheer idiosyncrasy, I've developed the habit of formatting dates as dd.mm.yyyy rather than the more typical US mm/dd/yy (and long dates as dd mmm yyyy). Setting that as my date format in OS X for a long time caused the paper size to default to A4 even though everything else was set with US settings. Somewhere along the line, that particular "feature" was removed, although I couldn't say exactly when.
So dates in the US are not RFC 3339, they're MM/DD/YYYY drunk-endian. And units are all US Customary, not SI. If you want something else, you need a locale other than en_US, and if you want other things (like spell check) to match US conventions you end up needing to create a custom locale.
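For what it's worth, POSIX locales can at least be mixed per category, which gets you partway there without defining a custom locale. A sketch, assuming glibc with en_US.UTF-8 and en_DK.UTF-8 installed (en_DK being the traditional hack for ISO 8601 dates in an otherwise English setup):

```cpp
#include <clocale>
#include <cstdio>
#include <ctime>

int main() {
    // Messages, collation, numbers: US English.
    std::setlocale(LC_ALL, "en_US.UTF-8");
    // Dates only: en_DK, whose date format is ISO 8601 (%Y-%m-%d).
    std::setlocale(LC_TIME, "en_DK.UTF-8");

    std::time_t now = std::time(nullptr);
    char buf[64];
    // %x = the locale's preferred date representation, per LC_TIME.
    std::strftime(buf, sizeof buf, "%x", std::localtime(&now));
    std::printf("%s\n", buf);  // e.g. 2024-05-17 instead of 05/17/2024
}
```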
Having locales by itself makes sense: you want, after all, to map real-world information into the digital realm, and for that to work you have to stick with whatever real-world conventions are already established. Keep in mind that computers are often just used as better typewriters and a lot of information exchange still happens on paper. Moving from paper to fully digital is a process that takes decades, so you can't just decide that digital uses '.' instead of ',', as all your printouts will be wrong.
The part that doesn't make sense in C and other languages is that locales are forced on you as global state. And not just by default: they don't even provide locale-free alternatives, so you have to modify the global state and reset it after every use, which is cumbersome, slow, and error-prone. C++ added the locale-free `std::to_chars()`, so things are slightly improving at least, but it's still an ugly and largely unnecessary mess.
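To illustrate the difference (C++17; floating-point `std::to_chars()` needs a fairly recent standard library, and the named locale assumes glibc):

```cpp
#include <charconv>
#include <clocale>
#include <cstdio>
#include <system_error>

int main() {
    std::setlocale(LC_NUMERIC, "de_DE.UTF-8");

    // printf consults the global LC_NUMERIC, so the decimal
    // separator silently depends on ambient process state:
    std::printf("%g\n", 3.14);  // prints "3,14" under de_DE

    // std::to_chars is locale-independent by specification:
    // always '.', no global state read or written.
    char buf[32];
    auto [ptr, ec] = std::to_chars(buf, buf + sizeof buf, 3.14);
    if (ec == std::errc{})
        std::printf("%.*s\n", int(ptr - buf), buf);  // prints "3.14"
}
```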
I find this annoying as well because none of the premade locales match my preferences. Why can't I just set my preferences, loading the defaults from a locale?
Maybe I'll have to write my own locale file. That doesn't sound fun.
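For the record, on glibc that means writing a locale source file and compiling it with `localedef`. A rough sketch from memory, so treat the details as an assumption rather than a recipe:

```
comment_char %
escape_char /

% en_XX: like en_US everywhere, except ISO 8601 dates.
% glibc sources spell strings as <Uxxxx> code points;
% <U0025> is '%', so this d_fmt is "%Y-%m-%d".
LC_TIME
d_fmt "<U0025><U0059>-<U0025><U006D>-<U0025><U0064>"
END LC_TIME

LC_NUMERIC
copy "en_US"
END LC_NUMERIC
```

A real source file needs the remaining LC_* categories as well (most can just `copy "en_US"`); you'd then compile it with something like `localedef -i ./en_XX -f UTF-8 en_XX.UTF-8` and select it via `LC_TIME=en_XX.UTF-8`. So no, not fun.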
Ditto this. One person, one device, one language, one locale is a terrible assumption.
My Android phone is set to a language that doesn't match the country I'm in, and it hyphenates all the local phone numbers wrong. This is beyond stupid. The phone knows which country the numbers belong to. I would expect it to format each number according to the conventions of the country they belong to, not according to the language I've selected. I'm not doing anything fancy like RTL, either.
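That per-number behavior is exactly what Google's libphonenumber implements, for what it's worth. A hedged C++ sketch (API names as I recall them from that library; the Swiss number is the example from its docs):

```cpp
#include <iostream>
#include <string>
#include <phonenumbers/phonenumber.pb.h>
#include <phonenumbers/phonenumberutil.h>

using i18n::phonenumbers::PhoneNumber;
using i18n::phonenumbers::PhoneNumberUtil;

int main() {
    PhoneNumberUtil* util = PhoneNumberUtil::GetInstance();

    // The region code ("CH") describes the number itself, and the
    // formatting follows that country's conventions -- independent of
    // whatever language the device UI happens to be set to.
    PhoneNumber number;
    if (util->Parse("0446681800", "CH", &number) ==
        PhoneNumberUtil::NO_PARSING_ERROR) {
        std::string formatted;
        util->Format(number, PhoneNumberUtil::NATIONAL, &formatted);
        std::cout << formatted << '\n';  // "044 668 18 00"
    }
}
```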