
> Can you imagine, then, diving into a website's back-end to see @ all over? It turns out, the previous developer realized all those nasty notices and errors stopped happening if he slapped a @ on everything.

You've turned something that's a person's fault into something that's the language's fault.

Other languages have some sort of warning suppression as well, like Java's @SuppressWarnings or C#'s #pragma warning disable. Although they won't suppress all errors the way PHP does, they can still bite you if you don't fix the underlying warnings.

It IS snobbery. You're taking a look at other people's code, and judging the language from it. I've written PHP for about 3 years and I've never once used the @ to suppress errors.




> You've turned something that's a person's fault into something that's the language's fault.

> You're taking a look at other people's code, and judging the language from it.

At some point, you have to start blaming the language for fostering an environment where that code is acceptable.


Meanwhile, people still manage to write memory leaks in garbage-collected languages specifically designed not to leak memory.
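A minimal PHP sketch of the pattern (class name invented for illustration): the collector can only free memory that is unreachable, so a reference you never drop leaks no matter how good the GC is.

    // Unbounded cache: every entry stays reachable through the static
    // array, so the garbage collector can never free any of it.
    class Cache {
        private static $entries = array();
        public static function remember($key, $value) {
            self::$entries[$key] = $value; // never evicted
        }
    }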


GC is not about preventing memory leaks; it's there to make managing memory easier, not automatic. One of its biggest advantages is the ability to deal with memory fragmentation, which is ridiculously hard to do well in C++-style languages.


You can write bad code in any language. It's the programmer's responsibility to make sure the stuff they are writing is good, and that only comes from experience with the language.


Of course you can. But is it always an equal share of bad code for each language? If not, then you have to admit that the language itself will encourage or discourage bad code or bad coders.


It's easier to write bad code in certain languages, just as it's easier to write good code in others.


The article tries to be a comprehensive list of problems with PHP and @ is a notorious one, even if you personally are disciplined enough to avoid it.

Not to mention that I have no idea why the original commenter picked on this. It was listed as one of the 7 or so things that can go wrong with that one single, not unusual line of code. It's not like he had a whole paragraph about why @ is bad.


> The article tries to be a comprehensive list of problems with PHP and @ is a notorious one

Nonsense. The @ error suppression is a tool, just like any other. It should be used sparingly, but it does have its uses. I have been writing in PHP for over a decade and I have used it exactly one time. And yes, it irritates me when I see it all over the place in others' code ... which is why I refactor all external PHP code before I place it inside of mine.
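A sketch of the kind of sparing use I mean (the file path is invented): suppress the warning only where the return value is checked immediately anyway.

    // fopen() both emits a warning on failure *and* returns false,
    // so the warning is redundant when you check the result yourself.
    $handle = @fopen('/tmp/maybe-missing.txt', 'r');
    if ($handle === false) {
        // handle the failure explicitly
    }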

The amazing thing about PHP is that it has a plethora of tools available and the language doesn't force you to write code in some constrained manner according to what some snob perceives as the right way. The only right way is the way that works and works well.

You don't like a feature of PHP? Don't use that feature. Simple as that.


Ok, let me fix that: "The article tries to be a comprehensive list of problems with PHP and @ is an error suppression tool that is notorious for being misused throughout the community".

> I refactor all external PHP code before I place it inside of mine

> You don't like a feature of PHP? Don't use that feature. Simple as that.

I don't know anything about you, but judging from that attitude you haven't worked in many teams. Of course it's not as simple as that. If I had a penny for every time I had to fix someone not checking that strpos() === FALSE, well, I could fund my own startup. You may have the luxury of refactoring all over the place, but the reality out there is that horrible code like this is left to fester until it causes real business damage.
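For anyone who hasn't been bitten yet: strpos() returns 0 when the needle sits at the very start of the haystack, and 0 is falsy, so the loose comparison silently misfires.

    $url = 'http://example.com';
    if (strpos($url, 'http') == false) {
        // WRONG: runs even though 'http' was found -- at position 0
    }
    if (strpos($url, 'http') === false) {
        // RIGHT: strict comparison tells false apart from 0
    }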


> I don't know anything about you, but judging from that attitude you haven't worked in many teams.

I haven't worked on any teams. I've always written software solo. What of it?

> You may have the luxury of refactoring all over the place, but the reality out there is that horrible code like this is left to fester until it causes real business damage.

Refactoring is not a luxury, but a necessity. Not only do I refactor other people's code, but I refactor my own. That's the only way to get to a quality code base.


> I haven't worked on any teams. I've always written software solo. What of it?

Well, that means you have zero experience with the majority of concerns expressed in this thread, and so are not qualified to opine on them. You work in a happy bubble and I envy you for it, but in the real world you very very rarely get the go-ahead to refactor old code. So in the real world, you very very rarely get to see a quality code base. In any language, really, but PHP compounds this problem with its idiosyncrasies. But you wouldn't know about that.


Isn't the fact that you have to refactor so much of other people's code an indication that something might be wrong? I'm not familiar with the PHP ecosystem, but I don't know of many people having to refactor Ruby gems.


> Other languages have some sort of warning suppression as well

But there is a huge difference here. PHP directly encourages it, making it so easy to do "just prefix it with @" and dedicating a part of the core language syntax to it (thus spending such a nice character for such a triviality). I believe (correct me if I'm wrong) that in Java, it is just another annotation, and in C# just another preprocessor directive.

And I think that's what the article is about - so many things are wrong in the very foundation of the language.


In Java, annotations can be used to ignore compiler warnings, not runtime errors.

To get the same effect as @ you'd have to use try {} catch {} blocks all over the place and leave the catch blocks empty, as in any other language with runtime exceptions. Sadly this is done more than one would think...


But that was exactly my point. When you do it in Java, you are obviously doing something wrong (or at least something not intended to be done). It just looks wrong at first glance, with the empty blocks and all. In PHP, it's just a single-character prefix, no bother at all to add, and it looks just like any other sigil. Its usage is definitely not discouraged by design; quite the contrary.
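To make the contrast concrete (variable names invented):

    // One character, easy to miss in review:
    $value = @$config['missing_key'];   // notice silently swallowed
    // versus the explicit check the notice is nudging you toward:
    $value = isset($config['missing_key']) ? $config['missing_key'] : null;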


> You've turned something that's a person's fault into something that's the language's fault.

PHP programmers have only one single thing in common. PHP.


> I've written PHP for about 3 years and I've never once used the @ to suppress errors.

Is it so wrong to use @? I've always used (@$_REQUEST['foo'] === 'bar') as a shorter way of writing (isset($_REQUEST['foo']) && $_REQUEST['foo'] === 'bar') - I'm curious if there's a problem with that approach.


Actually, yes: the error gets generated anyway (and that is slow) and is only suppressed at the last moment.

You'd probably be better off creating a function to do what you want succinctly rather than using the suppression operator.
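Something like this, for instance (the function name is hypothetical):

    // No notice is ever raised, so nothing needs suppressing,
    // and the default value is explicit at the call site.
    function request_get($key, $default = null) {
        return isset($_REQUEST[$key]) ? $_REQUEST[$key] : $default;
    }

    if (request_get('foo') === 'bar') {
        // ...
    }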


You should be using array_key_exists() instead.
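Note the semantic difference, though (example values invented): isset() reports false for a key whose value is null, while array_key_exists() reports the key's presence regardless of its value.

    $data = array('foo' => null);
    var_dump(isset($data['foo']));            // bool(false) -- value is null
    var_dump(array_key_exists('foo', $data)); // bool(true)  -- key exists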


"You've turned something that's a person's fault into something that's the language's fault."

Out of curiosity, where do you side on the C-is-an-unsafe-evil-language-because-of-pointers debate?



