
The thing about "Class based Programming" - or Object Oriented Programming - is that it allows you to model a problem domain in real-world terms. No one approach is ever going to be perfect; OO being mutable makes it perhaps less founded (on the face of things) on formal principles, but for a lot of large codebases it can have a great beneficial effect on readability and maintainability - cognitive overhead is, perhaps, reduced when you can "ground" your understanding in real-life terms.

At the end of the day, pick what language and coding paradigms best suit your problem domain and you as a coder. Pick right, and there's no mistake there.



>The thing about [OOP] is that it allows you to model a problem domain in real-world terms.

That's only true on a superficial level. Sure, in CS 101 it's easy to explain how to make a Point object and add a Move() method to it. But dive into any real-world code base and you're more likely to meet RectangleCollectionColorPickerFactory instead. Let's face it, writing a large program is an exercise in abstract symbol manipulation. Trying to make it correspond to "real" objects is about as hopeless as basing literature on coloring books.


"... RectangleCollectionColorPickerFactory ..." Sounds like you're blaming the OO world for Java's problems. But I digress...

Trying to make code correspond to real world objects is annoying because after you've modeled your Car with Wheels that support Tires, and an Engine with EngineParts attached ... you then have to write an adaptation layer to make all that fit into your storage engine, your user interface, your query engine, etc ad nauseam. Strings are not "real world" objects, but we use them in our code nonetheless. They're a great UI item that lets us interface with our users. "I need input from the user. Here, type some text in this field and I shall call it String." Now you can manipulate the String with operations (methods on the object) that make sense for Strings.

Point: OO is not about "modeling the real world," it's about code organization. Some people organize with OO. Some with other means.


It's not just java - any good OO platform has those problems. If you don't have that problem, it's worse: that tends to mean you failed to abstract away critical concepts like factories, and that you will have a hard time elsewhere, for example in testing (this is what you often see in C#, for instance) or with global state (a common issue in ruby code).

There is of course an alternative: use a function rather than a factory; but that's kind of the point of this discussion :-).

(Before we start a flame war: This isn't a criticism of ruby or C# specifically, it's just something I've seen a lot in those code bases. Both languages allow writing factories or using lambdas.)


No, "any good OO platform has those problems" is just wrong. Point-like classes without factories or any other BS are alive in kicking in big code bases. And geometry (visualization) is actually an excellent example for where mutable state and simple classes make things easier (I can actually move the point/scene object and everyone who has a reference to it gets the update). I really hope those articles at some point will start being a little more balanced and stop trying to pretend like "the purer and functional the better" (or the opposite).


I'm not suggesting that everything has a factory; I'm suggesting that having a factory isn't some java-specific code smell; it's a necessary pattern in an OO language. You obviously don't need one for a point. The context was some color picker example - and I could well imagine multiple color pickers and a useful color picker factory.

And I entirely agree with you that a PointFactory is unwanted (barring special circumstances) - though I entirely disagree that you'd want points to be mutable. Certain control points? Certainly - make a MutablePoint which is (conceptually) simply a reference to a point value. All points? That's just asking for pain; I really have better things to do than track down which nested submodule thought it was mutating a copy but, due to some optimization interaction, turned out to be mutating a point someone else was actually looking at. Not to mention that reference semantics don't work very nicely with hashtables and lots of other data structures, which become a lot more complicated when the values can change right under them.
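That split can be sketched in a few lines (illustrative TypeScript, not from any particular library): point values stay immutable, and MutablePoint is just a reference cell over them.

```typescript
// An immutable point value: safe to share, compare, and keep in collections.
interface Point {
  readonly x: number;
  readonly y: number;
}

// "Moving" an immutable point yields a new value; the old one is untouched.
function translate(p: Point, dx: number, dy: number): Point {
  return { x: p.x + dx, y: p.y + dy };
}

// A MutablePoint is conceptually just a reference to a point value:
// only holders of the cell see updates; the values themselves never change.
class MutablePoint {
  constructor(public value: Point) {}
  moveBy(dx: number, dy: number): void {
    this.value = translate(this.value, dx, dy);
  }
}
```

Hashtables and other value-keyed structures can then store plain Points without fear of keys changing underneath them.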


OOP is not fundamentally about mapping your code to physical real-world objects, but about mapping it to natural, intuitive concepts. Those concepts can, and in many cases should, be as abstract as you like: eg, an "Algorithm" object.

Something like "RectangleCollectionColorPickerFactory" is obviously a confusing and ridiculous concept, and exemplifies a failed application of OOP, not a failure of OOP itself. One could invent equally absurd demonstrations of functional programming, or indeed of any other paradigm.


> Those concepts can, and in many cases should, be as abstract as you like: eg, an "Algorithm" object.

I only really "got" design patterns when I realized that the strategy pattern was just treating an algorithm/implementation as an object and stopped thinking that OO was exclusively about mapping things to real world objects.
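That reading of the strategy pattern can be sketched as follows (illustrative names, nothing domain-specific): the algorithm itself is the object being passed around.

```typescript
// The strategy: an algorithm reified as an object.
interface SortStrategy {
  sort(xs: number[]): number[];
}

class Ascending implements SortStrategy {
  sort(xs: number[]): number[] {
    return [...xs].sort((a, b) => a - b);
  }
}

class Descending implements SortStrategy {
  sort(xs: number[]): number[] {
    return [...xs].sort((a, b) => b - a);
  }
}

// The caller is parameterized by the algorithm object, not hard-wired to it.
class Report {
  constructor(private strategy: SortStrategy) {}
  render(data: number[]): number[] {
    return this.strategy.sort(data);
  }
}
```

Nothing here maps to a physical object; the "thing" being modeled is the implementation choice itself.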


Of course no true Scotsman would write a class like RectangleCollectionColorPickerFactory.


The inability to comprehend why a mashup of nouns into a long class name might be useful is one of the most annoying memes on HN. Is it just that programmers are slow typists or something? Surely any competent programmer should be able to string together what the individual terms mean in your example (Rectangle, Collection, Color Picker, Factory) and easily grok what the class is intended to be used for.


The meme is more about the abuse of that idea into absurdly overengineered designs, and about the approach to programming where everything needs 3x the amount of work in the form of scaffolding and code bureaucracy.


>The thing about [OOP] is that it allows you to model a problem domain in real-world terms.

>> That's only true on a superficial level.

Exactly. I had forgotten that I learned OOP by modelling a CD collection. When thinking about it now, "mapping" a program into real-world models and terms seems totally backwards.

What's the point of thinking in real-world models unless your program actually manipulates real-world objects? Of course there are properties that are useful to carry over, but why limit your program to real-world models?


OOP actually doesn't model the real world well at all. It seems to on the surface, but it completely ignores time.

In the real world, everything is a process. Nothing ever stays the same or stands still. No object is the same object from one millisecond to the next.

Immutable state isn't just a formal exercise--it's a more authentic model of the world.


"everything is a process"

To me that's as bad as saying "everything is an object".

I'd rather try and understand where each approach is best rather than picking a single approach and dogmatically applying it in every scenario.


I think his point is that the real world is concurrent states, not context switching. To loosely paraphrase Joe Armstrong, in the real world, data isn't shared, it's communicated. In this sense, the process as a fundamental abstraction more correctly models "the real world."


>> In the real world, everything is a process. Nothing ever stays the same or stands still. No object is the same object from one millisecond to the next.

Been reading up on Heraclitus lately?

While this is an interesting statement for a philosophical dialogue, it doesn't really make much sense in the context of programming, where many things are constant, many things are not processes, and there are plenty of use cases for both mutable and immutable state (which is exactly why I always get really defensive reading articles like this one, that pretend all problem domains would be best modelled by a single programming language/paradigm/model of computation).


It makes a lot of sense in the context of programming when you look at the big picture. I think Rich Hickey articulates it best: http://www.infoq.com/presentations/Value-Identity-State-Rich...


I'm sorry, are you arguing that immutability models the world better than OO because no object in the real world ever stays the same?


Yes. Due to Special Relativity, different observers perceive events (changes to object state) at different moments, the state of the whole universe is not consistent. Therefore, we model the universe using a sequence of immutable universe snapshots, with different computational agents independently moving through the (branching) timeline of snapshots, so that each and every one of them views the universe consistently.


Most events humans care about happen in plain old classical physics, and no computer is moving at some large fraction of light-speed relative to any other computer. If you maintain synchronized clocks, everyone can agree on the exact same order of events (which would not be true in a relativistic situation).

Now it turns out to be difficult to maintain synchronized clocks, and Lamport timestamps and vector clocks are alternatives. The end result looks similar to a relativistic situation, but (and I'm not a physicist), it seems wrong to claim this situation is because of Special Relativity.

Thoughts?


Yes. I was half joking, half explaining why immutability actually is better to model the universe.

In practice, replace Special Relativity with Network Problems.


So what you're saying is the universe is mapped COW, and every time there's a new observer, UniverseOS runs it with fork().


Check out the first 20 minutes or so of this SICP lecture. In it, Abelson specifically mentions special relativity as a way for us humans to "model" the real world as immutable values over the continuum of time (in contrast to a mutable world that fits with OO).

http://ocw.mit.edu/courses/electrical-engineering-and-comput...


No need to apologize. With immutability you have to explicitly model temporal changes.


Immutable doesn't mean unchanging. It means no mutation. I can throw a ball in the air and model it as a function of time without mutation.
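For instance, under deliberately simplified physics (constant gravity, made-up initial velocity), the thrown ball is a pure function of time:

```typescript
const g = 9.81; // gravitational acceleration, m/s^2 (assumed constant)
const v0 = 20;  // initial upward velocity, m/s (illustrative)

// Height above the launch point at time t: a value per instant, no state updated.
function height(t: number): number {
  return v0 * t - 0.5 * g * t * t;
}
```

Nothing in the model mutates, yet it fully describes change over time; asking "where is the ball at t = 1?" is just evaluating `height(1)`.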


> OOP actually doesn't model the real world well at all. It seems to on the surface, but it completely ignores time.

This is important, and something almost everybody ignores.

Get an OO codebase, and try to implement a time-dependent feature, like snapshots, or even a simple undo. The abstractions fall apart.
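A quick illustration of why the immutable version of this is easy (a hypothetical sketch, not from the article): if each edit produces a new snapshot, undo is just dropping the latest one.

```typescript
type Doc = readonly string[];

class Editor {
  // History of immutable snapshots; the last one is the current document.
  private history: Doc[] = [[]];

  current(): Doc {
    return this.history[this.history.length - 1];
  }

  addLine(line: string): void {
    // A new snapshot; earlier ones are never touched.
    this.history.push([...this.current(), line]);
  }

  undo(): void {
    if (this.history.length > 1) this.history.pop();
  }
}
```

With pervasive in-place mutation, the same feature means inverting every operation or deep-copying object graphs.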


"OOP actually doesn't model the real world well at all."

Surely that is in part due to the developer. Poor developers will choose the wrong abstractions, and have a poor OOP model as a result. Better developers will choose the correct abstractions, and have better models. Obviously the domain you are modeling will make it easier or harder to model things, and in some cases OOP may not be a good choice at all.

In my day to day work I use Django. I start with a data model - based on an ER diagram. I code that in the Django model classes. Is that OOP modeling or ER modeling? I don't know. I don't really care too much. It works fairly well for most things I am doing.


We can think of the state of the world as immutable. When the state changes, it's a different world.

If space-time is discontinuous, then we can think of any change of state, like motion, as a set of discrete changes. If we think the motion of a particle from one energy bin to another is immediate (meaning the particle cannot be found on the border between the bins; one moment the particle is in Bin A, the next it is in Bin B), then we can think of the particle as destroyed in Bin A and another one created in Bin B. And this is what we call motion from A -> B.


A criminal tells the arresting police officer: - You're wrong, officer. It was yesterday-me who committed the crime but you are trying to arrest now-me. How wrong are you!

I bet the authorities around the world don't like your time-inclusive-dimension view much.


You do realize that there's a difference between an object with identity (DDD names it an entity) and one without it (DDD names it a value object)?

No matter what you do, identity does not change. You committing a crime yesterday was an event that included you as an object with identity, and today you have the same identity, so you are clearly responsible for what you did yesterday.

GUARDS! :P
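The entity vs. value-object distinction above can be sketched like this (illustrative types, no particular DDD library assumed):

```typescript
// A value object: compared by contents, no identity of its own.
interface Money {
  readonly amount: number;
  readonly currency: string;
}

function sameValue(a: Money, b: Money): boolean {
  return a.amount === b.amount && a.currency === b.currency;
}

// An entity: identity is stable even while its attributes change.
class Person {
  constructor(readonly id: string, public name: string) {}
}

function sameEntity(a: Person, b: Person): boolean {
  return a.id === b.id; // identity, not current attributes
}
```

Which is exactly why the officer wins the argument: yesterday-you and now-you differ in attributes, not in identity.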


Yes. The article correctly points out potential pitfalls of OOP but glosses over its benefits:

> See, mutable state makes shorter English sentences, and the agent concept helps make analogies with our fellow humans. In the end, this first impression trumps the fact that avoiding mutable state where possible ultimately yield simpler programs.

- But shorter English sentences are awesome.

- Good analogies are awesome.

- Effectively communicating with our fellow humans is awesome.

Indeed, all these things contribute to simplicity.

I find that often critiques of OOP are actually critiques of poorly implemented OOP. When it's done well, it can retain many benefits of immutability while still mapping more naturally to domain concepts. This talk by Gary Bernhardt is a great discussion of using functional concepts in OOP, and doing OOP well:

https://www.destroyallsoftware.com/talks/boundaries

The talk "Functional Core, Imperative Shell" is also a must see.


> - But shorter English sentences are awesome.

You missed the author's point here. He of course prefers shorter sentences as well. He is complaining that mainstream language syntax is biased in favor of mutability, since you need an extra keyword to make an expression const. In F#, for example, it is just the opposite: you have to use the keyword "mutable" in your declaration.
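The same bias shows up in TypeScript (used here purely as a sketch, instead of the F# mentioned above): class fields are mutable by default, and immutability is the case that needs the extra keyword.

```typescript
class Config {
  name = "default";          // mutable unless you say otherwise
  readonly version = "1.0";  // immutability is the marked, extra-keyword case
}

const cfg = new Config();
cfg.name = "changed";        // allowed by default
// cfg.version = "2.0";      // compile error: cannot assign to a readonly property
```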


I love Hal Abelson's opening of this SICP lecture[1], in which he acknowledges that OO (with mutable state) is born from the idea of modeling computer programs the way we perceive the world. But then he goes on to say that the reason OO can be so complicated (having to deal with the can of worms that is mutable state) may be that we have the wrong view of reality.

He then proceeds to launch his piece of chalk across the room! (good stuff). And proposes that instead of the chalk having changing state (position, velocity, etc), it's better to think of the chalk as a collection of immutable values, each value existing at a moment in time. And this of course aligns more with functional programming; you see the same notion (immutable values over time) in descriptions of Datomic.

[1] http://ocw.mit.edu/courses/electrical-engineering-and-comput...


What a great lecture. Thank you for sharing. I own and have read a good portion of SICP and these lectures are a great supplement to that.


Mutable state can be a huge pain in the ass. A common mistake OOP beginners, and even "experienced" developers, make is the creation of this complex jungle of entangled objects that all directly or indirectly manipulate each other's internal states. If you've ever had to work with such a code base for a prolonged amount of time you are pretty quickly ripe for a year-long sabbatical. There is simply no way to reason about the control flow and state of the program without investing a lot of time and mental effort. Sometimes it's almost impossible.

I don't know, I think it's a point where education fails. Your average enterprise developer should be forced to read books like "Clean Code" by Bob Martin before being allowed to touch a keyboard.


>> Mutable state can be a huge pain in the ass [..] I don't know, I think it's a point where education fails

I'd argue that the problem with software quality is not so much in the (ab)use of mutable state, but the fact that (especially in enterprises) software often reflects the organizational structure of the people who worked on it.

Exhibit A: the software solution where every tiny part of the system is developed by some (semi-)random group of people that often changes, doesn't directly communicate with any of the other groups, doesn't bother too much with the 'architecture' of the complete system (a big ball of mud), and has no interest in improving either the overall architecture of the system or any of the components outside their scope. Add pervasive mutable state into this mix, and you have a recipe for disaster.

That's not to say the mantra 'eliminate (almost) all forms of mutable state' will improve this problem much.


>> Immutable state can be a huge pain in the ass. A common mistake FP beginners, and even "experienced" developers, make is the creation of this complex jungle of entangled functions.

See what I did there.


The analogy would be that those functions are mutually recursive, and while mutually recursive functions can be an elegant solution to several problems (state machines, some algorithms), overuse is not a mistake I have seen very often. FP does coerce, or at least very strongly suggests, a very simple program structure. This makes it hard to model some more complex relationships that are easy in OO. Anyway, my point is that FP is indeed different in this regard compared to OO.
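For the state-machine case, a tiny hypothetical example of mutual recursion: two functions recognizing strings that strictly alternate 'a' and 'b', starting with 'a'.

```typescript
// State "expecting an 'a'": accept end-of-input, or an 'a' followed by state B.
function expectA(s: string, i: number): boolean {
  if (i === s.length) return true;
  return s[i] === "a" && expectB(s, i + 1);
}

// State "expecting a 'b'": accept end-of-input, or a 'b' followed by state A.
function expectB(s: string, i: number): boolean {
  if (i === s.length) return true;
  return s[i] === "b" && expectA(s, i + 1);
}
```

Each function is one state and the calls are the transitions; no mutable state variable is needed.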


My main point was that bad code exists in all paradigms. If there were a perfect language or paradigm, everybody would be using it and there would be no such debate. Immutability is not necessarily good or bad, same for mutability. We should all do a better job at explaining to beginning programmers how to use all paradigms appropriately, and how to choose among them. And of course, to use vi.


Why should I use vi? Why wouldn't this decision be similar to what paradigm I choose?


It was a joke. Emacs vs. Vi, FP vs OOP, etc. All members of the set of useless debate topics.


Note that Class based != Object Oriented.

Also note that theories that have tried to categorize real-world objects into classes have a tendency to fail (although they also have a tendency to become enormously popular [see Plato and Aristotle, for example]).

So the claim that classes allow you to model a problem in real-world terms is most likely false.

See for example http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.56.... for a review.


> The thing about "Class based Programming" - or Object Orientated Programming

You can have OOP without classes - prototypal inheritance for example. In JS, you just clone an exemplar with Object.create() for example.

That said, the article author does the same thing.
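A minimal sketch of that classless style (plain Object.create, no framework assumed):

```typescript
// An exemplar object; no class is declared anywhere.
const pointExemplar = {
  x: 0,
  y: 0,
  moveBy(dx: number, dy: number) {
    this.x += dx;
    this.y += dy;
  },
};

// The clone delegates to the exemplar through its prototype chain.
const clone = Object.create(pointExemplar);
clone.moveBy(3, 4); // reads x/y via the prototype, writes own properties
```

Note the subtlety: the write creates own properties on `clone` that shadow the exemplar's, which is precisely the "am I updating the parent or the current object?" ambiguity another comment in this thread complains about.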


Having used Angular for a month or so in summer, I find prototypal inheritance horrible and unpredictable. I am never 100% sure if the data is coming from or updating the parent object or the current object. Class based inheritance seems a lot more predictable to me.


If Angular is your only experience of JS, I'm surprised you don't think JavaScript is the most complicated language in existence.


> cognitive overhead is, perhaps, reduced when you can "ground" your understanding in real-life terms.

It absolutely reduces cognitive overhead. Having "containers" of functionality that allow me to easily know at a glance what I'm working with is invaluable. I'm not knocking functional programming, since I haven't yet built more than a few projects in a purely functional language, but trying to do some things with the functional paradigm (or at least, my understanding of it) left me recognizing class-based programming as a powerful tool when used correctly.


In fp, those containers are known as modules.


> pick what language and coding paradigms best suit your problem domain and you as a coder

An addendum: there is often no single obvious best way to do something, though some approaches are obviously less wrong than others.


The idea that the world appears to be mutable, and that programming languages should therefore encourage mutation in order to model it, is one that doesn't make any sense.

To me, it makes much more sense to try to create programming languages that are able to express human thought. People think in terms of things that are, to a first approximation, immutable; memories are a good example.


I agree wholly that picking the right tool is critical. I also think you have to be skilled at a broad subset of all the tools to be able to even make any sort of comparison.

Many times I've heard this argument from someone who only knows the one or two tools they learned in college or in their first job, and they try to dress up their ignorance as wisdom for "not wasting time on the wrong tools".

I recommend http://norvig.com/21-days.html as a great place to start (I have only learned, at best, 3/6 of his suggested language categories, and I am on the 4th).


The problem is it's not real-world most of the time.



