In addition to ignoring the big picture economics, it makes rather silly claims such as "And with that [the elimination of the need to own cars], the elimination of entire industries built up around the existence of car ownership like: mechanics, car washes, parking..."
I think he's confusing eliminating the need to buy a whole car (dubious in the first place, even if/especially if they're automated) with the elimination of cars in general (???). Or he somehow thinks that part of "self-driving" also means self-fixing, self-washing, and cars that never park... roaming the streets in packs, I suppose, ganging up on people when the cops aren't looking and stealing their jobs...
Thanks for sharing the link. I see that the organization of the paper is rather odd (compared to most scientific papers I read). I would have expected the methods to be described in more detail, nearer to the beginning of the paper, and for the results section to show up after the methods section. I am beginning to get a sense of why this paper was not published in a more notable journal.
Fun fact- Nimrod means mighty hunter. The negative connotation came from early Bugs Bunny cartoons where he applies the name sarcastically to Elmer Fudd.
Fun to try with code (mildly). Here's the geekiest I could do that's worth sharing :-) http://i.imgur.com/IBaLCn4.png - based on the first stanza from the following:
Tiger got to hunt,
Bird got to fly;
Lisper got to sit and wonder, (Y (Y Y))?
Tiger got to sleep,
Bird got to land;
Lisper got to tell himself he understand.
— Kurt Vonnegut, modified by Darius Bacon,
I enjoyed that far, far more than I expected. His intro from the index:
"So I was me and I was in math class watching paint dry it was starting to crack when suddenly I realized there was a page for which the internet was invented. I set out to create that page, ultimately succeeding with the sierpinski triangle page to end most sierpinski triangle pages ™.
...
So while the sierpinski triangle page to end most sierpinski triangle pages ™ purports to be some kind of exploratory rundown of the Sierpinski triangle, it's also a fractal expression of just how carried away I get..."
It's math the way I like my music: the author takes the subject seriously without taking themselves too seriously.
In my experience it's very responsive and rational. It benefits as a language from being opinionated which is sure to ruffle feathers from time-to-time. Obviously not as big of a community as some others, but that hasn't been an issue for us.
I've rarely seen heated flamewars about important issues, and ones about non-important/emotional issues only very rarely and only in IRC (where the current official policy is "we're a new language, so don't kick someone out unless absolutely necessary"). More often with Nim you'll see the occasional "wtf- who implemented this? it needs to change to ---". To be honest, IRC isn't the best place to judge a "community"- trolls and good people venting have a disproportionate voice there. And honestly, if that referenced exchange was disturbing... just hope you never have to be involved in a decision with the Linux kernel team or (the most abrasive example I have first-hand knowledge of) the Apache development team 10 years ago ;-)
Notice there's only one comment there from Araq and it's trying to de-escalate things. If you were to pop onto IRC and ask something technical or philosophical that wanted a rational response Araq or many of the others not represented in that slashdot comment will most likely answer very politely and rationally within a minute or so (I think he's in Germany though? so timing may be an issue).
I think a better "feel" for a community can be gauged by the tone and quality of blog entries coming out and threads in github issues / pull-requests than IRC snippets.
That and the fact that if you decide to use Nim based on its technical merits then you _are_ the community- and at this early stage you can easily influence the tenor of discussions and decision-making for good if you choose.
Rust takes a completely different approach to this. If you behave badly, you're out. Even on IRC. It also keeps discussions direct and accusation-free most of the time.
It's one of the reasons I chose Rust as the next community to work in. I have zero tolerance for such things and hate being in discussions with people that cannot - well - discuss. I just don't want to waste my time on such communities.
Given that Rust now has a surprising number of meetups around the world (Berlin alone has a regular learners group with ~ 25 attendees weekly (some regular, some new)), they have a knack for good community building. And keeping insulting behavior to zero is one important cornerstone of this.
I've been doing community org and tracker triage for quite a few years now: you _can_ and _should_ judge a community by their community spaces (IRC, trackers, bulletin boards) and not by single-person publications (GH, blogs).
We just switched from Rust to Nim for a very large proprietary project I've been involved with, after rejecting proofs of concept in Go and Erlang earlier in the process. This despite the fact that we had formally decided on Rust, had tons of code developed in it, and only stumbled upon Nim by complete accident one day. Even though many large components were already developed in Rust, we have already reached parity and shot ahead of where we were. I don't want to diss Rust in any way (or Go for that matter), but I figure they both have corporate backing and can easily speak for themselves, whereas Nim does not- so I don't feel too bad offering our somewhat informed opinion :-)
In a nutshell, Go ended up not really being a systems language and Rust has the beautiful goal of memory safety and correctness without garbage collection but in the end feels like it's designed by committee.
Nim, with Araq as its benevolent dictator, has had the opportunity to be highly opinionated. Much of the time that approach seems to simply allow projects to either shoot themselves in the foot or stagnate, but occasionally it produces true gems- and I really feel like that's what Nim is.
Some aspects of Nim that keep blowing me away more and more:
* Truly allows you to be close to the metal/OS/libraries (most beautiful and simple FFI ever- see the little sketch after this list) AND simultaneously allows you to think about the problem at hand in terms of the problem- i.e., high-level abstractions and DSL creation like you can do with Ruby (but even cleaner in many ways). I always thought those were two opposite ends of a spectrum and didn't expect a language to even attempt to bridge the chasm- then along comes Nim and not only attempts but succeeds beyond what I ever would have thought possible. To illustrate: it has become my go-to language for quick scripting, faster to whip something together than shell or Ruby or Perl (and many orders of magnitude faster to run- often even including the automatic compilation the first time)- while also being the fastest, cleanest, safest way to do something low-level, like an mmapped memory-mirrored super-fast circular buffer or CPU-cache-optimized lock-free message passing...
* Even though it has C as an intermediate step, it doesn't feel or act anything like C. While not as strong as Rust in this area (I know, Rust goes straight to LLVM's IR instead of C- but the risks are the same)- it generates code that is much safer (and more concise) than writing it straight in C. I know, there are other languages that do this well too- but usually by sacrificing the ability to easily and cleanly do system-level coding- like a raw OS system call, for example.
* On a related note, despite feeling more high level than Rust, it can do so much more at a low level. For example, using musl instead of glibc (which at least last time I checked wasn't possible in Rust due to some accidental coupling).
* So fast. While premature optimization is considered the root of all evil, and linear optimization often does not end up adding significant real value, I've been reminded more in the last few weeks than ever before that when speedups are around 2+ orders of magnitude _it is often a complete game changer_- it often allows you to model the problem in a completely new way (don't have a generally understandable example for Nim offhand but take docker vs dual-boot for an extreme non-Nim example- VMs were a mostly linear improvement over dual-booting and other network-based workarounds, but only with linux containers and their orders-of-magnitude "startup" time, snapshotting, etc. etc. was docker able to say "these are so cheap and fast we can assume a single running process per 'image'").
* Even though it's not as formally safe as Rust yet, in practice it feels and acts as safe, without the cognitive overload.
* Rust gets tons of new commits from tons of contributors every day. I worried when looking at Nim that the lower volume of commits meant that the language was less... alive or long-term-tenable. But on closer evaluation I realized that there was literally less that needed changing, and that the changes that seemed to need code to change all over the Rust repository would, given an equivalent scope in Nim, require trivial changes to just one or two files. (For example, Rust's massive [and extremely slow] bootstrapping pipeline and parsing architecture, vs Nim's built-in PEG syntax and 5-line loop that keeps compiling itself using the last iteration's outputs until the outputs don't change anymore.)
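To make the FFI point concrete, here's roughly what binding straight to a couple of libc calls looks like. This is a throwaway sketch from memory rather than anything from our codebase (the pragma spellings are the standard ones, but double-check against the manual):

  # one pragma per C function; no wrapper generator, no separate interface files
  proc getpid(): cint {.importc, header: "<unistd.h>".}
  proc printf(fmt: cstring) {.importc, header: "<stdio.h>", varargs.}

  printf("running as pid %d\n", getpid())

Compile it with a plain nim c and the C backend does the rest; that's the entire ceremony.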
In short:
- All the simple beauty (IMO) and conciseness of a python-like/pseudocode-like syntax without all the plumbing and pedantic one-way-to-do-it attitude that comes with Python. In other words, beats Python at Python's own strengths (not even mentioning speed or language features etc...)
- Top-down modeling and DSL affinity like Ruby with less feeling of magic- e.g., better "grep"-ability and tracing. In fact, the DSLs end up being cleaner than idiomatic Ruby ones. So beats Ruby IMO at one of Ruby's core strengths.
- Seems as safe as Rust with much cleaner syntax and simpler semantics. (this is something we're actively quantifying at the moment). Easily 10x the productivity. _Almost_ beats Rust at its core strength.
- Easiest FFI integration of any language I've worked with, including possibly C/C++.
- All the low-level facilities of C/C++ without the undefined-behavior, with better static checking, much better meta-programming, etc. etc. Beats C/C++ at their core strength.
- Utterly intuitive obliteration of autotools/configure/makefile/directives type toolchains required for system development using C/C++.
- Its very fast and efficient per-thread garbage collection (when you want it) essentially allows it to eat Go's and Java's lunch as well.
- It's a genuinely fun language (for us at least).
I've already been too verbose and am out of time, but for some sense of completeness I should include some current weaknesses of Nim. None of these ended up being remotely show-stoppers for us, but at least initially we worried about:
* Too much attention to Windows (doing *nix and even Linux-only development is so easy that it has turned out to be a non-issue).
* Less safety than Rust when doing manual memory management (hasn't been an issue and we believe unlikely to be an issue in practice).
* Lack of (implied) long-term corporate support (already more stable than some others and community is strong where it counts. Also, no matter how much corporate support they get- Rust will still be more verbose and designed by committee and Go will still fall short of being a true to-the-metal systems programming language and neither will ever have Ruby and Lisp's moldability and endless potential to get out of your way).
* Smaller community/package ecosystem than many others (so easy to do FFI or to even _reimplement something done in C using 1/5 the code_ that it also has turned out to be a non-issue. Now I worry that it's so easy to code that it will have a library-bloat issue like Ruby requiring something like ruby-toolbox)
* "How have I not heard about this before?? Why isn't it bigger? Is something wrong with it?" I start worrying about things like D's standard-library wars or lisp & scheme's utter flexibility combined with poor readability... Nope, just new and only spread through word-of-mouth, like Ruby, Python, and Perl long ago...
[there, once I've written three separate lists in a single comment I can safely say I've said too much and should get back to work].
> - Seems as safe as Rust with much cleaner syntax and simpler semantics. (this is something we're actively quantifying at the moment). Easily 10x the productivity. _Almost_ beats Rust at its core strength.
> Less safety than Rust when doing manual memory management (hasn't been an issue and we believe unlikely to be an issue in practice).
One major safety issue with Nim is that sending garbage collected pointers between threads will segfault, last I checked (unless you use the Boehm GC). Moreover, Nim is essentially entirely GC'd if you want (thread-local) safety; there is no safety when not using the GC. So Nim is in a completely different category from Rust; Rust's "core strength" is memory safety without garbage collection, which Nim (or any other industry language, save perhaps ATS) does not provide at all.
So since this is such an amazing language.. I wonder why it doesn't have a Wikipedia article? Oh, wait.. I remember. It's because Wikipedia admins are an incestuous cabal and will do anything to avoid admitting one of them was wrong. Or because Wikipedia in general is a joke.
Think I am exaggerating? I believe this is the best programming language out there. Just try to add a Wikipedia article. Not in a million years.
The Wikipedia notability rules and process are ridiculous and completely unfair, when every porn star, popular smut video on the internet, rare mushroom, and Pokemon DVD has an article, but the best programming language in the world cannot.
This is an example of what is wrong with our society.
And also, from what I've heard, the tooling isn't very good. Autocomplete isn't context-sensitive, and because of name mangling in the generated code, resolving a variable like "foo" in GDB actually means looking for something like "foo_randomnumber".
This is actually one of the things that keeps turning me away each time I try Nim. All software has bugs, got it, but in my mind a language nearing 1.0 should squash some of that list (or remove/feature-gate the things causing them) before even thinking about a 1.0.
In my opinion, 1.0 is about the language specification becoming stable. That said, some experimental features have actually been gated in preparation for the 1.0 release.
I see similar lists for GCC when a branch is underway. I'm actually impressed with the number of active contributors, and I find the design very compelling. I'll be keeping an eye on this project.
As for scripting: I just tested on Debian Jessie, and at least for trivial code (read: not nim itself) -- nim seems quite content to work with tcc. Tcc is a pretty awful choice of C compiler in general -- but while my current desktop is a little too fast to be able to tell -- nim w/tcc was typically as fast on first compile (read: compiler not cached in RAM) as clang/gcc were on a second run. Honestly, on this box:
time ./bin/nim --cc:tcc c -r examples/hallo.nim
vs
time ./bin/nim --cc:gcc c -r examples/hallo.nim
time ./bin/nim --cc:clang c -r examples/hallo.nim
is a toss-up -- and they all lose against:
time python3 -c 'print("Hello, world")'
(by an order of magnitude that ends up being almost insignificant: it's ~0.5 seconds for clang/gcc on first run, ~0.2 seconds for tcc and ~0.02 seconds for python). But the binary tcc makes runs in ~0.001s -- basically too fast to time -- and the gcc/clang versions are presumably faster.
Normally I think of python's startup time as less than instant (especially without dropping some of the standard includes/search path with -sS) -- but apparently when running on a quad-core i7 at ~4GHz with the OS on an SSD it makes no practical difference. I'll try later on my slower laptop (which is slow enough that python -c "print('hello')" doesn't feel quite instant) -- but the main point I wanted to make was that nim c -r with --cc:tcc makes for a quite usable "scripting" tool, thanks to tcc's compilation/startup/parsing speed (if nim w/gcc/clang wasn't fast enough already).
I also apologize beforehand for a long reply. Long posts get long replies.
I can definitely understand this point of view, but I just can't agree. The parent probably wants his/her claim that Nim seems as memory-safe as Rust to not be interpreted literally, as a literal interpretation would make the statement false (by any fair comparison using idiomatic code from both languages to accomplish the same thing).
What the parent is surely talking about is how it pans out in practice. Different languages have their different trade-offs here, with different pitfalls, and denying that Nim can crash and burn due to memory management mistakes would be false. Denying it with respect to Rust would also be false, due to Rust's optional unsafe features, but the important distinction is how easy it is to make these mistakes in idiomatic code and what the consequences will be. Only time will tell, which is why anecdotes are of interest, of course - both the parent's and everyone else's.
However, I find some choices of words to be a bit disingenuous (though hopefully unintentionally so).
Take the claim about being able to do "so much more at a low level", e.g. being able to switch out libc variants, which allegedly is not possible in Rust due to accidental coupling: is this a temporary difference? If so, it may only be relevant in the short term. I can't answer this question, but it would be interesting if someone did.
Most importantly: "Even though it's not as formally safe as Rust yet, in practice it feels and acts as safe, without the cognitive overload." Yet? Making Nim as formally safe as Rust would require completely changing key aspects of the language. Feeling as safe is possible, and acting as safe is possible too...
... until it doesn't anymore, that is, because the team grew (as it always does; some leave, some join, etc.), the code base ballooned, and someone made a simple memory management mistake somewhere that is now a serious debugging problem- and no code can be eliminated beforehand from the necessary auditing, because the entire code base is vulnerable to these classes of errors.
Memory management errors have a way of resulting in seriously trashed core dumps, etc., sometimes severely complicating and limiting debugging possibilities. Where's my stack trace? Oh, we seem to have been executing data and not code. Where did we come from? Oh, no intelligible stack frames. No valid return address in the register, etc. I've been there, as I'm sure many of us have. Memory management errors can lead to complete debugging nightmares, and that's if they're even reproducible by developers. If they're only triggered at the customer's site due to their unique circumstances, good luck. Having a deterministic test trigger it and being able to run it through valgrind until it's solved is the optimal cakewalk scenario, but that's not real life most of the time.
Rust can step quite easily from low-level stuff to high-level features and meta-programming too, and I feel no comparison is really made by the parent, only talk of Nim's features. The central premise as always for Rust is that it provides what it can provide while still maintaining memory safety. Rust without this prerequisite would not be Rust, and the constraints for everything else flow from it.
The repeated claim of design-by-committee is also not the best one. Having followed Rust's back-and-forths for years, I have to say I feel the discussion has functioned extremely well, and most importantly: the choices have been very pragmatic within the constraints of preserving the key safety features of the language.
Personally, having gone through many languages all over the abstraction level spectrum and specifically having spent quite some time in embedded C/C++, I am terribly, horribly tired of fatal runtime errors in general and memory management errors in particular. They can cost so much time to debug and fix that development time can swoosh past what it would have been in a language with a type system preventing them in the first place. Your mileage may vary, of course!
There is something to be said for languages that simply eliminate these classes of errors compile-time, and that something is actually a lot. For the small programs, tooling, scripts... I can write them in anything. There are hundreds of choices. That's not what this is about. For the software that matters, that ships and that others will expect to work, I no longer have the patience or tolerance for these error classes.
Many languages with such safety guarantees (and Nim is not one of them) have already existed for a long time, but very few that can be applied to all the use cases that Rust can. That is what it's about. This is why people are excited.
Software development is a form of art and a form of engineering, at the same time. A lot of software doesn't have to be as reliable as space shuttle firmware, and I'm not claiming it has to be, but the general bar could sure as heck be raised several notches. We know how the world works, and yesterday's quick hack or proof of concept is today's firmware shipment for use in live environments. Successful software lives for a long, long time. Software is eating the world, and society is now at its mercy.
Personally, I will sleep so much better knowing that these error classes were wiped out at compile time in 99.?% of the code I shipped to those customers, while being able to maintain on-par performance with the C code it replaced.
These are of course my $0.02, and I hope it didn't come across as combative, as that was definitely not my intention - only passionately conveying my own perspective. :)
Thanks for the thoughtful response- it didn't come across as combative at all to me.
The true, provable safety of Rust was what drew me to it as well. I've always hated having to choose between unprincipled memory management (with its security and functionality vulnerabilities that can lie dormant for many years before kicking your butt) and garbage collection forcing you away from the metal and removing deterministic reasoning about memory usage, runtime behavior, and runtime overhead.
I've been going through the academic papers, forerunners, and source-code for Rust's static memory routines and borrowing semantics. My hope and suspicion is that it can be added to Nim without core changes to the language like lifetimes. It's definitely not a guarantee, but with lots of experience in both languages now I feel very strongly that adding region-based-memory-management to Nim is possible while adding Nim's clarity, abstractions, and efficiency to Rust feels impossible.
I agree that at the moment Rust is the only responsible choice if provable memory safety is a primary concern, but I suspect that will change. In the meantime, for us anyway, the price in productivity was too high once we discovered that we could do manual memory management in Nim in a few well-considered, isolated places and confidently use Nim's fast, deterministic, real-time per-thread garbage collection for everything else without a noticeable performance penalty.
Having said that, I don't think I actually disagree with anything you said (:
> I've been going through the academic papers, forerunners, and source-code for Rust's static memory routines and borrowing semantics. My hope and suspicion is that it can be added to Nim without core changes to the language like lifetimes. It's definitely not a guarantee, but with lots of experience in both languages now I feel very strongly that adding region-based-memory-management to Nim is possible while adding Nim's clarity, abstractions, and efficiency to Rust feels impossible.
I'm not so sure. The trickiest part of getting memory safety without garbage collection working is not the lifetimes but the borrow check, which relies on inherited mutability and, most importantly, the lack of aliasable mutable data. The APIs and libraries of garbage collected imperative languages invariably depend on aliasable, mutable memory. Consider something as simple as a tree or graph data structure with mutable nodes. Or consider taking two mutable references to different indices of an array, or splitting an array into mutable slices with dynamically computed indices. These are all things you (presumably) can do today in Nim, and a borrow checker would break them. The likelihood that the library APIs depend on being able to do it is very high.
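To make that concrete, here's a toy sketch (not from any real library, and my Nim may be off in the details) of the kind of perfectly ordinary code a borrow checker would reject:

  type
    Node = ref object
      value: int
      children: seq[Node]
      parent: Node   # back-pointer: the same node is reachable, mutably, via two paths

  proc addChild(p: Node, v: int): Node =
    result = Node(value: v, parent: p, children: newSeq[Node]())
    p.children.add(result)

  var root = Node(value: 0, children: newSeq[Node]())
  let child = root.addChild(1)
  child.parent.value = 99   # mutation through an alias: fine with a GC
  echo root.value           # prints 99; this aliasing is what a borrow check forbids

Every tree or graph data structure with back-pointers or observer lists leans on exactly this pattern.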
I never say never: you could implement multiple types of references, some GC'd and some not, and copy the Rust borrowing semantics. But they would be incompatible with most existing APIs and libraries. I don't think it can be realistically retrofitted onto a language without breaking most APIs: aliasable, mutable data is just too common.
Regarding efficiency/performance, what in particular seems impossible to add to Rust?
Thanks for your reply! An enjoyable exchange in the midst of what often feels like a very tiring flame war.
I am constantly on the lookout for languages that could be suitable for replacing (or greatly diminishing) the use of C/C++ in my work, and so far Rust is one of the front runners.
However, I am also very much aware of some of the troubles I would most likely face in convincing my colleagues, like language complexity and productivity, and I completely respect the decision that it may not be worth it, depending on a wide variety of factors.
I try to keep an open mind, and I look forward to reading more about the improvements to Nim you envision! Thanks again (and good night). :)
To be fair to Nim, I don't see any reason why it couldn't be made memory safe by using the Boehm GC (though I'm not an expert in Nim by any means). Of course, using the Boehm GC negates the advantages of the thread-local heaps, but I don't think that Nim's implementation of them scales up to large-scale software in any case for the reasons I detailed in my other comments. IMHO, if you have a garbage-collected, multithreaded language that must compile to C (and doesn't need interoperability with a reference-counted object system like e.g. Swift does), the Boehm GC is the best choice.
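For what it's worth, picking the collector is (if I'm remembering the flags right; they may have shifted between releases) just a per-build switch, e.g.:

  nim c --gc:boehm app.nim   # conservative Boehm collector, one shared heap
  nim c --gc:refc app.nim    # the default deferred-refcounting GC, thread-local heaps

(app.nim is just a placeholder name here.)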
Thanks for correcting me, Patrick. I certainly didn't mean to be unfair (especially as I replied to a comment I felt wasn't being completely fair itself, intentionally or not), and I should have been more precise about the use cases.
I agree about the GC considerations. I meant my points to mainly apply to the use cases where safety such as that offered by Boehm is eschewed in order to achieve other powers at its expense, which I feel is brought up a lot by Nim proponents as strengths during these discussions.
> Take the claim about being able to do "so much more at a low level", e.g. being able to switch out libc variants, which allegedly is not possible in Rust due to accidental coupling: is this a temporary difference? If so, it may only be relevant in the short term. I can't answer this question, but it would be interesting if someone did.
It's definitely intended that the Rust standard library can compile against many libc's. I personally hope that it can eventually be completely self contained and not even link to libc in certain configurations.
I actually coded a predecessor to the system under development a few years ago and have used Erlang and more recently Elixir a lot, but in the end it just wasn't low-level enough, fast enough, or runtime-free-enough for our current project (for example we need to generate real-time processing kernels that can run on GPUs and FPGAs-- a realm not even really contemplated at the Erlang level).
I am not anti-metric by any means, but having done a lot of carpentry in the past, it always strikes me when this comes up that one of the central arguments for (a limited use case of) the imperial system usually gets glossed over: in many crafts (especially historically), using base-12 makes certain things much easier. It divides into 3rds far more easily, divides into 4ths slightly more easily, and still divides into 5ths with only one digit after the decimal.
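A throwaway sanity check of that arithmetic (the snippet below happens to be Nim, but the language is beside the point):

  # how cleanly do 12 and 10 split into the divisions a carpenter actually uses?
  for base in [12.0, 10.0]:
    for parts in [2.0, 3.0, 4.0, 5.0, 6.0]:
      echo base, " into ", parts, " parts -> ", base / parts
  # 12 divides evenly by 2, 3, 4 and 6 (and gives 2.4 for fifths);
  # 10 only divides evenly by 2 and 5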
Just like computer programmers have no problem immediately recognizing that 256 is 2^8, it became intuitive when working with the Imperial system (at limited scales) that 48 inches is the same as 4-feet but that it is also 3-stud-distances long (studs in walls are often placed 16 inches apart).
Even if you don't work in crafts where dividing things by 3 is more frequent than dividing by 5, it is easy to imagine how certain things might be more difficult if we used base-10 for time (as, it has been pointed out, has been attempted)- thereby losing the ability to easily divide an hour into 3 parts (for example).
Consistent base-10 and international standardization has advantages that far outweigh these minor things- but I think it's important to recognize that there is, surprise, a rational practical reason for sticking in some cases to Imperial units- it's not purely tradition or politics or their "organic-ness" (anymore).
My house is 100 years old, and having some construction experience myself I take care of small maintenance issues. I'm European but I grew up with both imperial and metric systems side-by-side, so it was easy for me to pick up American standards on things like stud distances and so on. It is convenient to have some things like that standardized, although the particular standardized measures themselves are highly arbitrary.
However, because my house is old nothing is perfectly standard any more - all the angles are off by a degree or two, different parts of the house have slightly stretched or compressed over the course of a century, and so on. So whenever I measure something I end up noting both metric and imperial - imperial because I am going to be forced to deal with it at the store/supply depot, metric because I want to get the numbers right and I would way rather work in base 10 that mirrors my 10 fingers than juggling fractions of an inch (a unit which is divided into 16ths instead of 12ths because...er...um...).
Unfortunately, I don't expect this to change any time soon.
> 48 inches is the same as 4-feet but that it is also 3-stud-distances long (studs in walls are often placed 16 inches apart).
To some extent that's an artifact of the units we use. We could just as easily put studs 40cm apart, which is approximately the same distance, or 50cm (0.5m) apart, which would be particularly convenient.
From Wikipedia: "In the United States [...] typically placed 16 inches (406 mm) from each other's center, but sometimes also at 12 inches (305 mm) or 24 inches (610 mm)." (https://en.wikipedia.org/wiki/Wall_stud)
A quick search suggests that 400mm is common in metric-using countries.
It was 50x100 when I was building in Chch ~10 years ago.
It was explained to me that the rough cut timber was closer to the stated size, and that the finished timber we used for framing was named after the unfinished dimensions. Of course, I don't know how true that actually is.
Certainly the unfinished 75x50 we used for roofing purlins (spelling?) was noticeably thicker than the finished structural 100x50s.
When dealing with fractions, twelve is more efficient. I agree imperial units still aren't worth the trouble of being different from the rest of the world.
You should count only the prime divisors, and both have just two of them (2 and 5 for ten, 2 and 3 for twelve).
By that measure 60 would be of real use (and it has been used in the past[1]... and still is in angular measurement, with the famous 360 degrees), but even it only adds 5 as an extra prime divisor.
Carpentry is a bad example... 1/4 inch vs 6mm or 7mm doesn't make any difference in this low precision craft... it is far from being precise to the millimeter... Even if you were doing higher precision wood working, you could make your box-joint 6mm wide instead of 1/4" and it would still be a perfect fit.... Also, try dividing an inch into tenths using a ruler?
In fields where precision does matter - such as machining or PCB design - decimal inches are used. Calipers work in decimal inches and steel rules are often marked in tenths. PCB layout programs typically allow you to use either millimeters or mils.
Give 240cm or 360cm a name (like STL for "standard timber length", 1 stl = 240 cm) and you're effectively doing the exact same thing as the imperial system. Then it's just a popularity contest (which is fine and appropriate, and metric wins, except with respect to time), but at least it concedes that base-10 scaling of units is not intrinsically superior for all units and situations.
Edit: If the argument is that metric is more intuitive (less memorization because of consistent scales) then I think that's a great argument. If the argument is that it's more standardized and more widely adopted, I think that's also a great argument. But I think that the argument that the metric-system is somehow superior intrinsically because only scales of powers of 10 are worth naming (as the name implies) is a poor argument, as you've helped illustrate.
You're claiming that the ability to divide evenly is an intrinsic quality of Imperial, and then when I point out that actually it can be done in any measurement system (including metric), you claim that this makes metric like Imperial, because...well, because dividing evenly is a property of Imperial.
Unless you're suggesting that timber should only be available by 1 inch, 1 foot and 1 yard measurements in the US and by 1 cm, 1 meter, ermm... 1 kilometer(?) in other countries?
Edit: I didn't claim metric was better at all (it is, but I didn't claim it!). You claimed that Imperial was better for dividing up lengths of timber. I explained why it wasn't.
I guess the real issue is which units get names. Yes, you can do with 12cm exactly what you can do with 12in- but the latter gets its own named unit. The Imperial system has a preference for scaling its units by some slightly more practical number of sub-units- 12, 60, etc. I can imagine someone saying, for example, "why name 1,000 meters as another unit? Can't I say 'thousands' using the same number of syllables? What's worth naming as a different unit is 240cm, since that's used a lot with timber..."
Someone strictly advocating the metric system would say "that's the point, kilo is another way of saying 1000 no matter where you live in the world or what you're measuring. Feel free to call 240cm a 'frob' if you like, but please, only do so in private- don't order 14 frobs of lumber, order 33.6 meters." (edit: which is a perfectly valid point. It's the slow accumulation of frobs that made the imperial system untenable. We trade a little bit of efficiency at a local level for greater global efficiency when we adopt metric.)
Actually, I can remember at least one commonly used alternative name from my time in a German-speaking country. The term "Pfund" (literally, pound) was frequently used to refer to a half-kilogram. I remember it being particularly used in reference to a loaf of bread, by both bakers and customers. (That was a long time ago, don't know if it's still common.)
There's really only one unit for length: the meter. Centimeters aren't a different unit, they're "hundredths of a meter".
You can call your lengths of timber "frobs" if you want, even in public! You could have people order them that way and sell them that way. The only requirement most places have is that you also specify what that is in meters so that people who don't know what a frob is, know what they're buying.
It just makes sense that your frobs should be a useful number of meters so that they can be divided or handled easily and don't require 15 decimal places to express.
Edit: I have a question for you: would you support changing your currency away from 100 cents to the Dollar to something like the old Pound with 240 pennies to the Pound?
After all, if you're talking about measurements everyone uses and need to divide up, it's far more commonly required for cash in people's everyday lives than length or volume or anything else!
Seriously though, good question. First, it made me realize that you never see prices in thirds of a dollar- as if everyone avoids it and has simply gotten used to avoiding it. I can't imagine a situation where ease of dividing by three for money actually adds any efficiency. Similarly, while I do see the value in dividing the day into 24 hours, I certainly wouldn't advocate a unit that's defined as one 60th of a second (even though it has even more prime factors than 12 ;)
I concede that the use-cases where having more prime factors and therefore easy non-decimal division are few and far between. I guess what surprised me when doing construction was that there was a very rational reason for a foot being 12 inches rather than 10- that it's not simply a relic of the fact that a human foot seems to be about 12 thumbs long- some arbitrary number accidentally ingrained in some cultures. And as illustrated by the fact that stocks were eventually decimalized and then made to trade at penny-granularity, computers and the fact that we don't do a lot of division in our heads or on paper anymore will probably eventually erase most remaining efficiencies.
> I certainly wouldn't advocate a unit that's defined as one 60th of a second
Veering sharply offtopic, seconds are actually called seconds because they're "second order minutes". So, just as a minute is 1/60 of an hour, a second-order minute is 1/60 of 1/60 of an hour.
In the past, people have indeed used "thirds" (1/60 of a second) and in the 13th century, Roger Bacon went as far as using "fourths" (1/3600 of a second)!
I actually would support changing the divisions of the dollar to a non-base-ten standard. Specifically, I would make it dollars and quarters and dispense with anything smaller. We used to have a half-penny coin. We got rid of it back when a penny was worth about what a quarter is today.
You miss the base-10 point: you can switch in zero time from 240cm to 2.4m and vice-versa, just as you can switch in zero time from $2.40 to 240¢ when you expect 60¢ in return.
And you can totally ignore the "2" and focus on the "40" part, if you care about the precision, in zero time, and then switch back to a global view.
The real problem then is our counting system. If base-12 is the most convenient, then we could count in base 12 too. That, however, is clearly never going to happen, so we are stuck with the less efficient base 10 system.
The other thing is that if everything is in powers of 10, then complex calculations that require multiple units become much easier to carry out. You don't need to keep a dictionary of 'conversion factors' in your head for these calculations to make sense.
Not one of the downvoters, but working in (hundreds of) centimetres (240cm) as opposed to meters (2.4m) is not as crazy as it sounds. Tape measures or carpenter's rulers will have centimetres on them all the way.
Also, people consider it a matter of precision. If I ask for a piece of wood that's 2.4m long, I am probably less concerned with the precise length than if I ask for one that's 2400mm long.
What? The precision of $2.40 and 240 cent is exactly the same. Two significant digits.
It's kinda funny, every time this debate comes around, I see that the people advocating imperial are always confused by the concept of precision, which is handled very naturally in the metric system, but people advocating metric are confused about the way the imperial system uses subdivision.
For a metric person, a measurement such as 13/64" looks weird, and for an imperial person, a measurement such as 5.2mm looks weird, when in reality they are very close, and of the same precision.
They're usually referred to as 2.4m and 3.6m if that makes it easier for you to track. I converted them to cm as I thought that would be simpler for people.
(replying to the wish for a name for 2.4m; edited out): Why for the 2.4m, but not the 1.8m or the 3.6m? Surely it makes just as much sense to refer to it as the "two point four" as anything else?
A piece of 2" x 4" timber is still referred to as "2 by 4" in the UK, except it now measures 50mm x 100mm (actual conversion is 50.8mm x 101.6mm, the difference being moot when building scale is taken into account)
Perhaps not to you, but most folk I know (in the UK) aren't quite so hung up about it. I'll sometimes refer to a half-liter as a pint, or a liter as a "couple of pints", or a meter as a yard.
Though the pint is by far the stupidest unit. Why on earth wasn't it standardised at 500ml? It would have fixed pints, quarts and gallons all in one go!
I know, you can think up immediate objections, but having 568ml to the pint and 4.546 litres to the gallon, is just wrong!
Nah, I just couldn't picture anyone doing woodwork in hundreds or thousands of units (now, where did I get that idea...?), or dividing base 10 units. Since it sounds like wood is sold and worked in base 12, I'd use metric no problem. But that makes the difference between imperial and metric pretty arbitrary, doesn't it?
I've actually worked in the construction industry in New Zealand, which was thoroughly metric when I was building houses ~10 years ago.
And we did everything in mm; standardised timber was "100x50" - mm was assumed - which was close to 2x4 inches.
(Actually, I think the finished standardised timber is smaller than 100mmx50mm, but for some reason was still called 100x50)
Sheets of Gib plaster (drywall?) were 1200x3600 (possibly 1600x.. I don't remember), sometimes bigger. Rooms had 2400mm or 3600mm stud heights.
90mm nails were used to assemble the house frames, and 50mm nails were used when the 90mm nails would have been excessive.
We only really used Meters when we were being vague - "Go about a meter further out!" and usually converted to mm when we were actually cutting or fastening something. I never poured any concrete myself, but I'm pretty sure the foundation boxing would have been measured out to the mm, even over 10s of meters of distance.
If I ever gave someone a measurement in cm I'd get told "Only dressmakers use centimetres!" I don't think I ever used feet, and I only ever used inches when discussing lumber, and would always be told to use metric.
I strongly disagree. Using the right tool for the right job is worth far more than enforcing arbitrary universal consistency. In a math course, you'd measure angles in radians so you're dealing with nice small multiples of pi. I think it'd be downright dangerous to use radians in a fighter jet, say, when the precision you need is much better provided by degrees.
Imagine if we enforced only one programming paradigm, since they're all equivalent. Or if we demanded that logicians and CS professors only prove things using Turing machines, instead of picking the model of computation that best suits the problem. If there's a case where the English measurement system is more convenient, people should use it.
You just get used to whatever units you're working in. If you'd been trained in radians when learning how to fly, everything would be fine. It would be funny though to hear ATC clear someone to land on runway pi/2. That's probably not as succinct as runway 9, but people would be used to it if they learned it in their primary training.
The real problem though is switching costs. There are tens of thousands of planes out there and more than a million pilots. To switch all of the compasses, GPSs, flight computers, heading indicators, etc. as well as retrain a metric pantload of pilots would be ludicrous for no appreciable gain.
Don't get me wrong though. I love the metric system and think it's great, however first and foremost we are creatures of practicality. If US dominance in the world slips over the next few generations, I would imagine it will at some point join the rest of the world and abandon imperial measurements.
>Using the right tool for the right job is worth far more than enforcing arbitrary universal consistency.
Except units of measure are not "tools"...they are ways of communicating information. And when each person feels the need to enforce their own version of "right tool for right job" then things like the mars orbiter incident happen.
>Imagine if we enforced only one programming paradigm, since they're all equivalent.
Hardly. A better analogy would be to enforce identical calling conventions for DLLs/shared libraries across all languages. That would benefit all languages and improve interoperability.
>units of measure are not "tools"...they are ways of communicating information.
How you communicate information is important and has real impacts. Given a graph, you could choose to encode it as an adjacency matrix or as a collection of adjacency lists. Both convey the same information, but nobody would say we should pick one and only one. I disagree with the assertion that measurement units aren't tools or that they are exempt from similar consideration.
Sure, the Mars incident is a call for consistency, but it's a call for consistency inside NASA. That shouldn't affect carpenters who need divisibility by 2, 3 and 4 more than they need 5. Or those who like to use inches because they're significantly larger than centimeters and consequently easier to approximate by eye.
>Sure, the Mars incident is a call for consistency, but it's a call for consistency inside NASA.
No it's not. Realistically, space travel is going to be a humanity-as-a-whole deal, not "inside NASA". Remember that the USA is like 4% of the world population. Even if the US contributes 10x more than everyone else per person, that still puts the US in the minority. Consistency here is not a "nice to have"... it's mission critical.
>That shouldn't affect carpenters
So you propose teaching your future carpenters and space engineers different units of measure?
>approximate by eye.
Approximate by eye? Seriously? Intel is aiming at 10 nanometers. Nobody is eyeballing anything in the modern world. Note...nanometers not nanofeet...this from an American company.
> No it's not. Realistically, space travel is going to be a humanity-as-a-whole deal, not "inside NASA".
I think it's a fair bet that when spaceflight becomes widespread and commonplace, we're going to have many discrete groups of people developing, producing, operating, and maintaining their own particular spacecraft independently of each other, and not "humanity as a whole", as a singular undifferentiated mass, working on a single uniform spaceflight project.
Consistency within a specific project is clearly necessary; uniformity among distinct projects is, speaking at the macro level, a liability. Variation is an evolutionary advantage; artificial uniformity slows progress.
> So you propose teaching your future carpenters and space engineers different units of measure?
Why would this even be a question? Should programmers only be familiar with one single programming language? Should people in general only ever learn to speak a single verbal language? Is there ever an advantage to only being familiar with a single set of tools, and ignorant of all others?
> Nobody is eyeballing anything in the modern world.
Most people are eyeballing most things in the modern world. It's only in the case of activities on the scale of building spaceships and 10-nm-process integrated circuits that people require the level of precision that you're talking about. The vast majority of human activity remains outside of these domains.
The original article was about American resistance to adopting metric units as a default practice in day-to-day life, and not about the use of the metric system by people engaged in highly specialized disciplines. It'd be a bit absurd to suggest that the measuring units selected for high-precision work by the small set of people currently working on microprocessor design are necessarily the optimal ones for e.g. baking a cake or tiling your bathroom. In the latter use cases, one could make a very compelling case that units optimized for alignment with intuition are vastly more useful than ones optimized for micro-scale precision.
>Consistency within a specific project is clearly necessary
I'll concede that it only matters within a project...but projects invariably consist of many people and each of them "think" in their unit of measure. Sure you can mix them and hope they remember to "think" in metric at 1AM when pushing for a deadline, but really...
>Is there ever an advantage to only being familiar with a single set of tools
That's the thing. These are not tools. They are units of measure. Aside from the odd instance where it's easier to divide by X, all you're gaining from using many units of measure is chaos.
>Most people are eyeballing most things in the modern world.
Definitely. Eyeballing happens regardless of unit of measure. And usually when someone is eyeballing it, it's not life or death.
>It'd be a bit absurd to suggest that the measuring units selected for high-precision work by the small set of people currently working on microprocessor design are necessarily the optimal ones for
Absurd indeed, but not what I was getting at. Units of measure are something that's internalized from high-school age. So unless you have a way of splitting the kids between space engineers and carpenters at that age, why teach imperial? And even if you could split them, the mix of units of measure employed nationally would be much worse than just picking one at random.
I get that Americans are attached to imperial... it's just very difficult for everyone else to understand why, given this: http://i.imgur.com/YJzhkZl.jpg
> but projects invariably consist of many people and each of them "think" in their unit of measure.
Obviously, you'd assemble project teams who are familiar with the tools, techniques, and conventions that you intend to use with your project.
> Sure you can mix them and hope they remember to "think" in metric at 1AM when pushing for a deadline, but really...
And the Chinese engineer who's working on a project where all of the documentation is in English might slip up when he's punchy at 1 AM and accidentally complete some of his work in Chinese.
This is a good argument for making sure that team members are well-rested and alert while doing their work. It's also a good argument for scheduling work reviews, proofreading, and time for correcting errors in the project plan. It's not at all an argument for abolishing the Chinese language.
> That's the thing. These are not tools. They are units of measure.
Units of measure are tools. Tools are devices, whether physical or conceptual, that we use to extend our capacities for interacting with the world. In this case, since human beings do not natively have the capacity to quantify continuities, we apply the tool of measuring units to break continuities down into discretely countable chunks.
And, like all tools, how well they work depends on what goals you're trying to accomplish, and in what order of priority, in a given set of circumstances. Metric units are great in a limited set of contexts in which uniformity of post-hoc representation is more important than practicality in the activity of measurement itself; but this means that they are, for the same reason, less effective than customary units for the vast majority of situations that involve measurement.
> Definitely. Eyeballing happens regardless of unit of measure. And usually when someone is eyeballing it, it's not life or death.
Very few situations are matters of life and death, and in those rare circumstances that are, people will naturally be cautious and rigorous in their methods: I'd expect people to use precise measuring instruments, and to perform measurements multiple times, so in such a situation, questions of familiarity with particular units are scarcely relevant. When you're relying on the precision of instruments, the actual measuring units you're using are less important: metric, imperial, or otherwise, they'll all work just as well.
> Units of measure are something that's internalized from high-school age
I don't know how much anyone "internalizes" any measuring units, but to the extent that they familiarize themselves with them, there's certainly no cause to familiarize oneself with only one set, at the exclusion of another.
> So unless you have a way of splitting the kids between space engineers and carpenters at that age, why teach imperial?
How about we keep doing things the way we are - teaching everyone both sets of units - and letting them determine for themselves which ones are most useful to them for each particular application?
> I get that Americans are attached to imperial... it's just very difficult for everyone else to understand why, given this: http://i.imgur.com/YJzhkZl.jpg
All that graphic demonstrates to me is that while, with imperial/customary measures, there are a variety of separate base units to choose from, each appropriate to a particular scale of operation, the metric system only offers one base unit, and pretends that applying a 10^x coefficient to that single base unit somehow makes it a different unit.
That's what you're not getting: feet, yards, inches, etc. are fundamentally distinct units that have been tweaked to relate to each other, where necessary, by factors that are often much more convenient than 10. But you're just as capable with customary units as with metric of sticking with a single unit and applying scaling factors: I can just as easily say e.g. 24.2 x 10^4 feet as 45.83 miles.
I can even use metric prefixes if I'm so inclined, and say 24.2 kilofeet! Or 2.9 megainches! All the same value. But using these prefixes is just a bizarre re-implementation of scientific notation: in what way does it make sense to encode quantitative information as a verbal prefix appended to the name of the thing you're counting, instead of just using numbers?
Wow, I would not have believed someone would make the argument that everyone needs to use metric because some day we'll all be in space until I read it. Thanks for broadening my horizons.
Degrees are no more precise than radians. If you're a human you use a certain number of digits and they both work. If you're a computer using integers you're far too imprecise either way. If you're a computer using floats they are equally precise. If you're a computer using fixed point you're better off with something like 2^32ths of a circle.
Wood is an easily workable, relatively inexpensive, and, importantly, renewable resource. We'll be seeing more of 3D printing for sure, but you're kidding yourself if you think it'll replace carpentry.
I want a 3d printer / wood router hybrid. All I've managed to do so far is a 3d printer / laser cutter hybrid, but it takes forever for it to cut wood.
"Even the car's name tag positioned on the rear bumper, is carved in a 24 hour process from a solid block of aluminium. It is the Pagani way of reassuring its customers that their cars are built to their own standards exceeding the highest in the industry."
Great example really. Basically highlighting my message; it can be done (and can be awesome- machined parts are top-quality) but the Huayra costs $1.3M.
I would love to hear from someone who does carpentry (or some craft-like equivalent) with the metric system- specifically what kind of ruler/tape they use and whether it has a special 1/3 and 2/3 mark on it (since you don't want to have to eyeball 0.33333....) or if they just avoid dividing things by three altogether.
Because the truth is, your perspective is a political one more than a practical one. The whole idea of "officially" adopting either system is somewhat misleading- regardless of what politicians or zealots decide, most craftspeople would use rules that easily divide into three, most programmers would often reference things in terms of base-2, and most people would still prefer 24 hours in a day. There is no general-case optimal winner that applies to all measurement and work-flow situations.
In the case of base-2, international bodies have already found a nice dual-system ( http://physics.nist.gov/cuu/Units/binary.html ). A simple solution for (for example) carpentry that would give the best of both worlds would be to define a unit that is exactly 1/60th of a meter. Let's call it a "sixer." Now someone crafting things can have all the convenience and easy division they need (here- cut this at 1/5 of 1/3 of a meter- no problem!) and still have it unified with the meter at some scale. There would be issues- but it's situational, which is my point.
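Just to show the arithmetic of that example landing on a whole number of the (entirely hypothetical) unit:

  # 1/5 of 1/3 of a metre, expressed in made-up 1/60-metre "sixers"
  echo 60 div (5 * 3)   # 4 -- an exact mark, no recurring decimals

whereas in centimetres the same cut is 100/15 = 6.666... cm.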
You know, there's about 25 millimeters to the inch, and I've never seen anyone measure anything with more than 16th-inch precision for any construction job - maybe for very high-end cabinetmaking people use 32nds, but generally millimeters will give you more precision than you need, certainly if you're using a carpenter's pencil, which makes marks thicker than the smallest scale on a ruler.
I have no problem dividing things by 3 in metric. Rounding up to the next millimeter is going to result in an error equivalent to a couple of human hairs. If I'm working with wood and I need a perfect fit I'm going to be using sandpaper long before that.
Seriously, do you imagine that construction workers and craftsmen in Europe, Japan and the rest of the world spend their days in a state of helpless anxiety because of their inability to divide things with sufficient precision, or do you think they just get on with it and make buildings and furniture as good as any you can find in the US? The construction industry didn't collapse in the UK, Ireland, or Australia when they adopted the metric system.
No, I assumed they had a very good way of doing it, which is why I asked about it :-) It was a sincere request for comment but it obviously came off as a cynical challenge, which wasn't my intention.
> I would love to hear from someone who does carpentry (or some craft-like equivalent) with the metric system- specifically what kind of ruler/tape they use and whether it has a special 1/3 and 2/3 mark on it (since you don't want to have to eyeball 0.33333....) or if they just avoid dividing things by three altogether.
I would love to hear from someone that uses the imperial system how they determine exactly 1/3 of 10 inches and how they eyeball 1/3 of 5/6 of an inch.... My point is it is easy to find measurement scenarios that would make life a little harder for either system. 12cm (metric) for example is trivial to divide into 1/3, 2/3, 1/2, 1/4, 1/6 etc etc. 4.72 inches (imperial equivalent of 12cm) would be harder to do.
Besides there are very easy geometric methods to divide a line exactly into any number of equal parts, even if you do not know its actual length precisely.
I've done a bit of carpentry around my house. Since I live in Sweden I always use the metric system.
For me the focus on division by three is a bit strange. Sure I sometimes need to divide by three, but I also need to divide by five, or two or seven. I guess the thing is that our building standards are expressed in the metric system, so it makes sense thinking about them in terms of meters or centimeters. US building standards are expressed in the imperial system, so it makes sense to think about them in feet and inches.
Why would I want a unit that's 1/60th of a meter when I have a perfect 1/100th of a meter? A third of a meter is 33.3cm, working with wood you don't need more precision than that, if the decimal sign makes it difficult to do calculations, just step it up to 333mm.
I'm not a carpenter by trade but I guess that a professional carpenter here learns quite well how to divide by three even in metric. I see this argument popup every time there's a discussion about using the metric system in the US and I find it weak. For me this argument only sounds like "but it's hard to learn something new".
I'm sure there are pros and cons with both the metric and the imperial system, but in the globalized world we live in, the fact that The Rest Of The World uses the metric system should be a pretty convincing argument to use it in the US as well. I mean after all, The Rest Of The World has accepted English (in one form or the other) as the Lingua Franca in order to make communication across borders easier. You don't see us whine about stuff like "but in Swedish I can much more succinctly express that it's my paternal or maternal grandparent I'm talking about, so I don't want to use English".
Point taken, and I acknowledged in my first comment that it was a weak reason compared to the reasons for adopting. It wasn't meant as advocacy :-)
I would point out however that no one (that I know of) claims that English is the most popular because it is intrinsically the most efficient for all situations- i.e., on its merits as an efficient language rather than geopolitical factors. I definitely think that there are certain situations where communicating using non-English would be (locally) optimal- even if the parties communicating were fluent in English. Even as a native English speaker I truly wish English had the equivalent of the Tagalog particle "daw/raw," for example ( http://tagaloglang.com/Tagalog-English-Dictionary/English-Tr... ), or the fact that Cantonese can often communicate the same information as English using far fewer syllables...
In other words, I'm perfectly fine saying that using English is globally optimal (and convenient for me) because of its adoption, while still being perfectly happy knowing that some languages have attributes that make them more efficient, if you know them, in certain circumstances. I don't pretend after the fact that English got this way because it is a truly superior solution in all circumstances if only everyone knew it. Nor do I feel like English's status as a (quasi-)standard is threatened by someone pointing out that Swedish would be more efficient in some settings (say, family history) even if everyone did know English. I would say "cool."
Defining units in terms of other units via something other than base-10 scaling in order to have more integer factors than 2 and 5- in some cases trading the 5 for 3 and 4- can be advantageous. That's all I'm saying, nothing more or less :-)
When I was building stuff out of timber I would typically only work in mm (or metres for longer lengths; timber is sold in full lengths of 6 metres, or shorter bits at various lengths, at least in .au) and didn't need any further precision.
A 334mm length will quite happily go in a 333mm gap, though again I'm not a professional carpenter, I just worked with my father a lot.