All of the popular interpreted languages we use now were dog slow when they originally came out, and we still had to deal with many of the quirks of low-level languages back then, but here we are. If you don't like it, just don't use it.
This really depends on the task in question and which languages you were comparing it to. Back in the day it was fairly common to have Perl, PHP, Python, etc. outperform languages like C, either because the task was complex enough that the C programmer was too busy debugging to ever implement the better (but more complex) algorithm, or because the C code behind the Perl regex engine, Python list/dict, etc. already had those optimizations built in. Java and C++ were compiled, but back then the implementations were far less mature and it was easy to find workloads which should have been easy on paper but underperformed. Not having a package manager really favored languages that had one, or that had a strong standard library, since the alternative was often someone doing a quick naive implementation and never getting around to significant optimizations. The other thing to remember is that in the pre-SSD era it was much easier to be I/O-bound, which masked most of the differences.
Someone once posted Perl6 and C/C++ code to #perl6 on freenode.net.
According to them, Perl6 was faster.
(The Perl6 code was also a lot shorter, and I could argue it was easier to understand.)
---
My guess is that the reason was that the C/C++ code had to scan for null terminators often, and copy strings around. (Or perhaps more precisely the stdlib had to do that.)
MoarVM doesn't use null terminated strings, and it treats strings as immutable objects.
If you do something like create a substring, it creates a substring object that points into the original string. So rather than copying a bunch of data it basically just creates a pointer.
(Strings are technically immutable in Perl6, though it is easy to assume otherwise.)
Exactly. If someone else gets wrongly convicted for the crime, then the actual perpetrator no longer has any fear of being prosecuted for what they did. The fear of being caught for a past criminal act is a deterrent against committing more criminal acts. Once someone else is convicted of that crime, the criminal has, as far as they are concerned, gotten away with it. If a novice gambler wins big early, they're more likely to take bigger risks later. The same is true of criminals.
It's all fun and games until you're the one imprisoned, or sometimes executed, for a crime you didn't commit. Why do I doubt the author would just accept their fate for the greater good of society if they were the one wrongly convicted?
There's a key difference though. In Linux and other UNIX-inspired OSes, /etc is only writable by root, but the Windows registry holds config data for users too. Any userspace program in Windows could potentially bork any registry key that the user has access to.
I understand the point you're making, but according to the fancy chart you posted, it did reach a peak in '95, whereas your statement implies that it's been in decline ever since. According to the same chart, it then steadily rose to almost the same level in '04. It fell again to its lowest level in '12 and has consistently been above that level in the following years. Also, the article makes it clear that the current rise is a result of local government policy and a reduction in the regulations that had been one of the primary obstacles to mass deforestation. If we repeat the causes, we'll inevitably repeat the results.