Oh my, why would anyone want to learn Perl anymore? Unless you are working with a legacy codebase, learn Python, Ruby, Java, C#, Erlang, Scala, Haskell... in other words, run the f away from Perl if you can, unless you are into S&M.
Because it's powerful, it's easy to start, it's flexible, it's productive, it's ubiquitous, and it has an unparalleled extension ecosystem devoted to quality and ease of use.
Thank you, I did read the post. Python has PyPI. Ruby has gems. I'm not going to google the rest, but if they don't have one, there is no reason why they can't.
And FYI, PyPI is better than CPAN, so all the other stuff in that post is crap imho. I was a Perl dev for ~10 years. Now I have about 5 years' worth of Python. So thank you.
I don't know about Python, but Ruby's gems are quite inferior. Just two points:
gem has no idea whether a specific gem is already installed, so if you ask it to install one, it will just install over whatever is there, whether it's the same version or not.
Additionally, it does not automatically run a gem's tests and abort if they fail, the way CPAN clients do, so the chance of installing a broken gem over your existing working one is pretty big.
There is a culture of documentation, cooperation and tolerance in the Perl community that, IME, doesn't really exist for other language communities (Ruby and Python).
Compare, for example, AnyEvent's documentation with that of Twisted - or any of Perl's standard documentation with that of Ruby.
> I'm not going to google the rest but if they don't have one there is no reason why they can't.
That somehow fails to convince me that other language ecosystems parallel the CPAN in breadth, scope, maturity, and tooling. See CPAN Testers, for example.
Oh, my apologies, I wasn't trying to convince anyone of anything, if that's what you are thinking. I was only stating facts based on my experience, but that's okay, just downvote this and make yourself right.
Nobody was downvoting you to make themselves right. But the remarks you made about CPAN, to be frank, didn't make any sense at all to anybody who has worked with CPAN seriously over the years.
In terms of both quantity and quality, CPAN beats any other scripting ecosystem by a very great margin. And it's really not just about the number of modules: that many modules would not have been possible if Perl (its syntax, its extension system) were not flexible enough to allow them.
The traditional approach used by languages is to first build a set of semantics, then define a syntax and standard library for it; any other development in the language happens through frameworks and libraries. Perl is special in that it allows syntax extensions through modules. Therefore you will find not just modules that do your task, but also modules that add to and extend existing Perl syntax with sugar. Perl 6 extends this concept further through grammars.
How many languages are there today (counting Python and Ruby especially) that can add something like Moose (and its MooseX extensions) and other syntactic extensions to the language without breaking backwards compatibility?
Python took around 8 years and broke backwards compatibility to make changes as small as the scoping of a for-loop variable and turning the print statement into a function. Now imagine what it would take for Python to fix its object system or its scoping problems.
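For what it's worth, here is how small those breaking changes are; a minimal sketch that runs under Python 3, with comments noting the Python 2 behavior (reading "for-loop scoping" as the comprehension change is my interpretation):

    # Runs under Python 3; comments note what Python 2 did instead.
    x = "outer"
    [x for x in range(3)]  # Python 2 leaked the loop variable into this scope
    print(x)               # Python 3 prints "outer"; Python 2 printed 2

    # print "hello"        # Python 2's statement form: a SyntaxError in Python 3
    print("hello")         # Python 3's function form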
Thank you for the excellent reply. That was the response I was hoping for. I don't know everything, and I appreciate you talking on point and not just pressing the down arrow because someone said something you don't like.
> Are you really surprised about the down votes...?
For that post of course not. For the others yes.
I can only assume you are coming from this viewpoint:
"Please avoid introducing classic flamewar topics unless you have something genuinely new to say about them."
and no, I don't think I had anything genuinely new to bring, but how could I know about something I didn't know about?
I guess what I am really saying is: if people can't acknowledge the flaws in a thing, how could they ever make it better? And to say even more, if they can't vocalize their reasoning, how does downvoting make it okay? But I am sure that is another thing I have missed.
Hi. Thank you for taking all the things I said out of context. I am going to try my best to do the same despite the fact that I believe this is what HN is trying to avoid, moderator.
> My trivial point was that your claim about the down votes was contradicted in the first sentences. It seems you missed that, too.
I am sorry you think your thoughts are trivial, and despite your grammatical errors I would love to know how a first sentence can be plural. I have no doubt that I missed a lot of things prior; that was the entire point of my following posts. Maybe if I knew which first sentence you were referring to, I could give a better response.
> Quite fun to do personal attacks like that after such a creative way of misreading.
There were no personal attacks in the previous posts, unless of course you are Perl. In this post maybe there are many, but only because you made it so.
And I would appreciate if you read this small thread, moderator, and came back with an intelligent reply.
Yes, I could have not been a dick about this, but you reap what you sow.
I hear about new Perl projects starting in large enterprises every day!
Unless you are in the web programming domain (somehow web developers think theirs is the only software being written in the whole world), where Python and Ruby frameworks seem to be famous, Perl is pretty big in the backend.
Perl is here to stay, and it's simply too useful to be going away anytime soon.
I didn't get your point. I (and many others) have found Perl very useful for whatever we do.
So we continue to use it. Just because there is some new shiny stuff around, we don't use it just for the sake of using it.
Trolling Perl to advocate Python hasn't helped for decades and won't help now either. A better debate would be to argue on technical merits. Trolling and whining just makes people curious to verify whether any of it is true, thereby getting them to read about Perl, and some of those readers end up adopting it too.
To be fair, I didn't think I was trolling, but I can completely understand why you think I was. I could only talk about things as I knew them to be. Whether that makes me an ass or an idiot, I don't care. I did want to hear a good argument as to why I was wrong, and again I appreciate your reply.
EDIT: My original comment was really just meant to be humorous; I am sure you know why, since you are aware of Perl's syntax. It did make me a lot of money at one time, but nevertheless I stand by what I said, as that is the best advice I could give myself right now.
I wrote a Pandora clone using Django and Last.fm for recommendations. Because of the licensing issues, I plan on open-sourcing it on GitHub in the near or distant future.
Can anyone say when Oracle is justified? I have yet to encounter a problem that Postgres or MySQL didn't support, but alas, I have not worked on everything.
Support. You can probably find some solution to your issue with Postgres online. Oracle can send an Oracle-certified engineer who can solve your problem.
Consistency. Your enterprise depends on terabytes of hyper-valuable information (Walmart with their sales data); can you guarantee that you won't end up with corruption issues? Or that the next version will work with your system too?
That said, as long as you make less than $20 mil/year, Oracle isn't likely to be the best solution.
Just a quick question: which engine do you use for your MySQL system?
InnoDB. I understand your support viewpoint, but both of those DBs are open source, which leaves support open. From what I understand, paid support really comes into play when you can sue the other party for not providing the service expected, but there is commercial support for MySQL and Postgres too, so that still leaves me in the dark. And I have never experienced corruption issues with either, though my DB experience is limited and I don't claim to be an expert.
When the ERP application you are installing has in its specs
- Supported database: Oracle
I am sure a crack Postgres guy can bash and file that sucker to work. But the vendor will not support the result, and in some places, in some situations, that really counts for a great deal.
Oracle isn't selling databases. They're selling database support. Similar to how IBM operates - they make good stuff, but you're really not paying for the hardware. If you were, they'd be outrageously expensive compared to something built more simply - see Backblaze, for an example: http://blog.backblaze.com/2009/09/01/petabytes-on-a-budget-h... .
When you need multiple spatially distributed active/active write nodes (multiple masters...) you pretty much need Oracle RAC.
When confronted with the cost, you end up designing something where writes are pushed to a caching layer that fronts an active/passive setup, with enough buffering to promote the passive node to active if something goes wrong.
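A minimal sketch of that kind of buffering layer, in Python for illustration; the connection objects and their execute method are hypothetical stand-ins, and a real deployment would put a proper queue in front of the databases:

    import collections

    class BufferedWriter:
        """Queue writes in memory; flush to the active node, and promote
        the passive node if the active one fails."""

        def __init__(self, active, passive):
            self.active = active    # hypothetical DB connection objects
            self.passive = passive
            self.buffer = collections.deque()

        def write(self, stmt, params=()):
            # The application never blocks on, or even sees, a failing master.
            self.buffer.append((stmt, params))

        def flush(self):
            failovers = 0
            while self.buffer:
                stmt, params = self.buffer[0]
                try:
                    self.active.execute(stmt, params)
                except ConnectionError:
                    if failovers >= 1:
                        raise  # both nodes down; keep the buffer for later
                    failovers += 1
                    # Failover: swap roles and retry the buffered writes.
                    self.active, self.passive = self.passive, self.active
                    continue
                self.buffer.popleft()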
They should be, if they are paid for by public funds, and the same applies to software. Of course that doesn't apply to defense or otherwise potentially dangerous software, but the general rule should be to open-source government-contracted development, because if they don't have a good reason for not doing so, they are probably doing it wrong.
What's even more ridiculous is that they did it even though there is now a competitor with a better product. Yes, they have a much larger installed base, but I do believe G+'s lack of a 140-character limit and other amenities will win out.
Wow, I never looked at it like that before. Of course, if the currency collapses, the companies operating in it are f'd. We need to resolve the housing foreclosure mess and stop accruing debt.
There are a lot of messes that need to be resolved, but notice that mortgages barely figure in the chart; the housing foreclosure mess is 'merely' shit rolling downhill in this picture.
I don't understand the jump in logic you are making from "the US government is at risk of defaulting on its debt" to "the US dollar is at risk of failing as a currency". Can you explain this?
You seem to be jumping from one conclusion to another without much reasoning to link them, and then drawing another conclusion from that.
I'm afraid there is no jump. If the US defaults, interest rates will rise, but they can't, as the market is dependent on cheap money. That's why interest rates have been at 0% for as long as I can remember, and it is also why QE1 and QE2 were done: to increase the supply of cheap money. If the US defaults, the dollar will crash. Look at what happened to Argentina in the late '90s. No, I don't think the chances are very significant now, but the conditions are certainly there for it to happen. I'm sure a whole paper could be written on this, but I am only trying to explain where I am coming from.
jellicle is correct, the ratings are complete BS. If AAA actually meant something, the US would have been downgraded years ago. AAA means there is no chance of default whatsoever. It should only be reserved for countries with little or ideally no debt. The fact that our politicians and media are openly talking about default means that AAA should not apply to the US, but it does. So it means nothing.
The (non-BS) default risk of the US is very close to nil. Hence the AAA is deserved. But the currency risk and the interest rate risk is very high. Here's the pair trade if you want it:
Well, there are 7 levels of A rating. The US certainly doesn't deserve the topmost tier, since there is active discussion of default. No, I don't think it actually would default, but what is the point of these ratings if they are being used like this?
OK, let's look at structural deficit, GDP growth, and Net International Investment Position.
NIIP is interesting. Despite the US's huge debt, they have a reasonable NIIP of about -25% (I think). So the money is mostly there. It's mostly in private hands, but as long as the private sector is OK, the government will keep collecting taxes.
The obvious problems with the US are its health system (more public funding than countries with free healthcare, but terrible value for money), its military deployments (the military can be OK if they use it to drive high-tech growth, but actually deploying it is very expensive), and its prison system (they tend to lock criminals up for a little too long, and they have way too much crime for other reasons).
>It seems really ugly to store derived information in a commit
I don't understand how generation numbers are derived information. They are used to find the position of a commit in relation to another; that makes them information that is essential to the commit. To get around them not being there, timestamps were compared instead, and that is not reliable for obvious reasons. So I really don't understand why anyone would complain about this.
They're derived. You can tell that they're derived information by the fact that you can compute them for old commits, long after commit time, which is exactly what part of this proposal is to do. You can derive them simply by counting the maximum number of steps between a commit and any of its roots. The essential information isn't the generation numbers; it's the structure of the commit history -- the actual chains of commits, with all of the branches and merges. Generation numbers are just an artifact of counting.
On the other hand, this information is very handy, once you have it, for certain algorithms, and it could be expensive to re-compute all the time, which is why the proposal is to generate and store them explicitly. (This is also the reason that timestamps have been used before, even if they were a bit of a hack -- they're readily available, and way faster than recomputing generation numbers all the time.)
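A minimal sketch of that counting, with the commit graph as a plain dict from commit id to parent ids (the names here are hypothetical, not git's actual data structures):

    from functools import lru_cache

    # Hypothetical commit graph: commit id -> list of parent ids.
    parents = {
        "a": [],          # a root commit
        "b": ["a"],
        "c": ["a"],
        "d": ["b", "c"],  # a merge commit
    }

    @lru_cache(maxsize=None)
    def generation(commit):
        # Roots are generation 1; everything else is one more than the
        # largest generation among its parents (the longest path to a root).
        if not parents[commit]:
            return 1
        return 1 + max(generation(p) for p in parents[commit])

    assert generation("d") == 3

Memoized like this, it is linear in the number of commits, but a cold start still has to walk everything back to the roots, which is exactly the cost the proposal avoids by storing the numbers.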
I was wondering how they got along without generation numbers for so long. It was by comparing timestamps, and those are unreliable because systems can be misconfigured. How they are going to handle legacy repos with that problem I still don't get. I am guessing that history is f'd.
New versions of git will actually go back and generate this information for old commits. This will make git slightly slower in old repositories until all the commits contain the generation information, but that should happen fairly quickly.
You just walk all the way up to the root nodes counting parents (taking the max length when there are multiple routes), so there is no problem with ambiguity. The problem is: it's slow.
Not necessarily "all the commits", as I understand it (there's some debate in the thread, so I could be wrong). As long as at least one commit on each merged branch has a generation stored, it's simple to compute without going all the way back to all the roots.
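Continuing the sketch above, a lazy variant that stops wherever a number is already stored; the stored dict is a hypothetical stand-in for generation numbers living in the commit objects themselves:

    # Hypothetical graph, plus generations already written by a newer git.
    parents = {"a": [], "b": ["a"], "c": ["a"], "d": ["b", "c"]}
    stored = {"b": 2, "c": 2}

    def generation(commit):
        if commit in stored:
            return stored[commit]   # stop: no need to walk back to a root
        if not parents[commit]:
            gen = 1                 # a root commit
        else:
            gen = 1 + max(generation(p) for p in parents[commit])
        stored[commit] = gen        # persist so future walks stop here too
        return gen

    # "d" is resolved from its parents' stored numbers alone;
    # the root "a" is never visited.
    assert generation("d") == 3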