As a practical matter, this would require the enforcement agency to publish the correct price for every regulated asset.
Publishing a formula for this calculation wouldn't be good enough, because then you would be required to value things based on the formula and that leads you back to...accounting.
All that talk of "doubling the size of its customer service staff" sounds good, but it doesn't mention numbers. Did they go from 1 to 2? 100 to 200?
This is one of my pet peeves about journalism--quantitative statements that aren't anchored to any numbers. The most outrageous example is the perennial talk of government budget "cuts," which are nearly always "growth in spending at a slower rate than originally planned" rather than actual reductions in spending.
Since you are hung up on "objective analysis," what objective evidence do you have that this claim is true?
In any case, here is an objective problem: the Common Lisp standard does not include regular expressions.
That isn't anecdotal and it isn't misplaced blame and it isn't an unnamed idiosyncrasy. It's a failure of the CL standard to include one of the most powerful tools around for text processing--a tool that pretty much every dynamic language builds in.
I've tried using Lisp before for text processing and found it brutal--practically impossible--compared to other dynamic languages. That isn't due to popularity; it's because Python and Perl and Awk have built-in facilities for manipulating the hell out of text that sit right on the surface, are easy to find, and work well. Despite having lots of functions for chars and strings, Common Lisp never felt anywhere near as easy for those tasks.
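To illustrate (a made-up example, not from any actual script of mine): even something as simple as grabbing the text after a colon takes a little ceremony with only the standard string functions, where Perl or Awk would match it in one line:

    ;; Extract the value after the first colon, standard CL only.
    (let* ((line "name: Ada Lovelace")
           (colon (position #\: line)))
      (when colon
        (string-trim " " (subseq line (1+ colon)))))
    ;; => "Ada Lovelace"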
If I'm wrong, then I will look forward to being educated, but I honestly believe that practical limits have everything to do with why people don't turn to CL for scripting.
I am no Lisp expert by a long shot, but when I want regular expressions, I use this library: http://www.cliki.net/CL-PPCRE (available for easy install via Quicklisp). Since I like to pick on Perl... this library's logic is, as it happens, allegedly faster than Perl's regular expressions. That probably is not difficult to do, since Perl is interpreted, whereas most CL implementations are both interpreted and compiled: you get the extra advantage of the expression being preprocessed.
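To make that concrete, here is a minimal sketch of the library in action (the patterns and strings are just made-up examples):

    ;; After (ql:quickload "cl-ppcre"):
    (cl-ppcre:scan-to-strings "\\d+" "order 42 shipped")
    ;; => "42", #()
    (cl-ppcre:regex-replace-all "\\s+" "too   much   space" " ")
    ;; => "too much space"
    (cl-ppcre:split "," "one,two,three")
    ;; => ("one" "two" "three")

    ;; Precompiling the scanner once is where the speed advantage
    ;; mentioned above comes from:
    (let ((scanner (cl-ppcre:create-scanner "^foo")))
      (cl-ppcre:scan scanner "foobar"))
    ;; => 0, 3, #(), #()

Note that the precompiled scanner is an ordinary object you can store and reuse, which is exactly the "expression being preprocessed" advantage.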
CL has a nice variety of libraries that extend the language. It is certainly debatable what constitutes good "core language" features versus what belongs in "language extensions" or libraries, but I should make one comment on what I have seen of Lisp libraries: they are of high quality, and wherever something is lacking, it is documented. There is even a humility about it, a distinct lack of showmanship or marketing, as if the community cares more about really good theory than the next buck. I suspect this faithfulness to the reality of the various algorithms is part of what turns away people who are on the hunt for shinies.
As a counterexample, Perl is well known for having a huge base of libraries via CPAN, but the signal-to-noise ratio is very low. There are few libraries of any real quality, and there is a general lack of consistency and interoperability between them, with the notable exception of packages like Moose.
From what I understand in reading about this lack of built-in regexp support, it has something to do with regexes being a theoretically weak approach to parsing--the usual argument being that classical regular expressions recognize only regular languages, so they can't handle arbitrarily nested structure the way a real parser can. I have some ideas about what that means, but I don't participate much in such discussions, so I am cautious about venturing into that territory.
It is possible that what allowed me to get to the point where I was confident I could solve any problem I wanted in Lisp (web, databases, random scripting, 3D games, etc.) was the fact that I spent a lot of time playing with Lisp and shifting my mental model several times. And I still feel like an egg compared to what some of the folks are doing on some of the Lisp forums. It seems like the people who are breaking new ground in programming theory (and not just rehashing the same old concepts in different syntaxes) are in the Lisp communities, although I should give a good nod to Haskell's continuation of the ML line.
Anyway, I am a little hung up on this definition of "practical". Maybe I am lucky and happen to look in the right places? I did start off on the wrong foot and dug deeply into ASDF stuff a while back when it was messier, but that kind of thing is outdated now with the introduction of Quicklisp.
One last note: when you said "scripting", did you really mean "programming"? There is a connotation that scripts are more for one-offs, for hacking stuff together, and Lisp is more oriented towards serious, large applications, for managing huge and complex problems.
I know the lisp community stands behind Edi Weitz and I respect him, but compared to Python's "Batteries Included" or Perl's built-in regular expressions, your solution is problematic. Consider: Weitz's library is just one of five regular expression libraries listed on CLiki! Why did you choose his? It isn't even the top choice! Are the others broken? Unreliable? Do I have to try them all?
Now, I realize such simple questions as these may not be "breaking new ground in programming theory," but a lot of us just want a turn-key solution that works everywhere; the sort that sysadmins use every day to keep companies humming and the internet buzzing.
That's the definition of "practical" that I'm hung up on.
If I want to do command line text processing with pipes--and many do--how does lisp help me more than Awk? Awk is brilliant in its problem space; it's fairly standardized; it's guaranteed to be everywhere. Choosing Awk or Perl or Python is practical--not being a slave to fashion (trapped in the popularity contest you allude to).
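For concreteness (a sketch, with "ERROR" as a stand-in pattern): the Awk one-liner awk '/ERROR/ { print }' turns into an explicit loop you have to write yourself in portable CL:

    ;; Filter stdin to stdout, keeping lines containing "ERROR".
    ;; Awk supplies this read-match-print loop implicitly.
    (loop for line = (read-line *standard-input* nil)
          while line
          when (search "ERROR" line)
            do (write-line line))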
I think you know this, and I think you know just how practical Python/Perl/Awk/Ruby are, which is why you subtly changed your argument from "practical" without qualification to "managing huge and complex problems" by the end of your response, even though the parent specifically includes hobby programming in his classification of "practical" problems.
I think it's cool that you've done "web, database, scripting, 3D game programming, etc" in lisp and can even think in lisp. And I admire your willingness to battle past CL's 1000+ page spec and then the sea of competing libraries to find the one you like and are willing to debug yourself if something is broken.
But that doesn't invalidate all the other tools--it just means there can be more than one way to do it.
The original post mentioned C and career use as well. I have no problem with using awk and so forth for quick one-offs, but that was not the issue. I suppose the real question is "practical for what?". You exaggerate the complexity. ;)
> As a counterexample, Perl is well known for having a huge base of libraries via CPAN, but the signal-to-noise ratio is very low.

Sturgeon's Law applies, so it's no good picking on just Perl libraries here!

> There are few libraries of any real quality, and there is a general lack of consistency and interoperability between them, with the notable exception of packages like Moose.
> Either way, it seems like we all benefit from devices that make doctors more effective.
The problem is that your statement and approval in no way justify the magic $44K number. Why not a $2 billion incentive for each doctor that switches?
I'm assuming it's based on the perceived cost of the software in question (not just Dr Chrono but all EMR systems). I was surprised that Dr Chrono doesn't charge around that much for its software. A lot of those packages have so few clients that they need to be priced ridiculously to be viable. This is mostly due to overkill design-by-committee that makes them expensive to develop. Dr Chrono, as a small shop, probably has the advantage here.