Writing a book on practical analysis and being both an OO and FP programmer (and architect, whatever that means), I've spent some time thinking about patterns.
I think the key phrase is this: Marick's provocative claim that software patterns failed as an idea is true and false in varying degrees, depending on how you define 'patterns' and 'failed'.
Yes. What we run into, over and over again, is the difference between human languages and understanding on the one hand, and formal languages and understanding on the other. Human languages are mostly spoken, extremely loose, improvisational, and change while we're using them. Mathematical languages are all written, tight, consistent, and stay the same over decades or centuries.
One of the things I learned from the linguists was that written human language, which we mostly think of as language, is in fact a very recent thing -- and once a language gets written, all sorts of other things happen as a result. People start viewing the symbols on paper as having some kind of power that a few grunts and turns of phrase do not. Somehow they seem more important, more real...but just the opposite is true. Written words give the illusion of being just like formal mathematics without actually being so.
(There's a wonderful scene in "The Wire" where two detectives view a recent murder scene and have a conversation using only the word "fuck". Masterful example of the difference between spoken and written language in action.)
The way this plays into patterns failing is that yes, there are recurring situations where the same types of problems come up. At some point, you can mathematically generalize these kinds of problems into a formal pattern of constructs, and the formal pattern is less of a hassle than simply continuing to analyze and code. But that's a different concept entirely from saying that these problems are an example of Pattern X. It doesn't work like that. Our brains work like that, but solving problems doesn't.
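To make that concrete, here's a minimal sketch of what 'a formal pattern of constructs' looks like once it's been extracted -- the GoF Strategy pattern, in Java, with hypothetical names I made up purely for illustration:

    // A made-up pricing problem that keeps recurring, generalized into
    // the Strategy pattern: the varying part (how a discount is
    // computed) is pulled out behind a single interface.
    import java.util.List;

    interface DiscountStrategy {
        double apply(double price);
    }

    class NoDiscount implements DiscountStrategy {
        public double apply(double price) { return price; }
    }

    class PercentOff implements DiscountStrategy {
        private final double percent;
        PercentOff(double percent) { this.percent = percent; }
        public double apply(double price) { return price * (1 - percent / 100.0); }
    }

    public class Checkout {
        public static void main(String[] args) {
            // Swap behaviors without touching the calling code.
            List<DiscountStrategy> strategies = List.of(new NoDiscount(), new PercentOff(10));
            for (DiscountStrategy s : strategies) {
                System.out.println(s.apply(100.0)); // prints 100.0, then 90.0
            }
        }
    }

Note that nothing about the discount problem *is* a Strategy. The interface is just a convenient generalization after you've seen the shape enough times; the pattern name describes our recognition, not the problem.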
This also explains the author's observation that students find patterns most useful. Patterns give them a formal construct, expressed in the computer language they already know, that appears to give them traction on the problem. It also explains why people new to programming, architecture, and patterns tend to overuse them. Neither group has the larger context to know how to solve the problem, what the language can and can't do, how patterns fail, and so forth, yet a template for a solution looks to be right in front of them. Why not use it? After all, it's good enough! That's the way we think. We are naturally attracted to the purity of math and are inveterate over-generalizers. We have to be; otherwise we couldn't get out of bed in the morning.
For those interested in learning more about some of the concepts, here's a Wiki page I wrote up on the book: http://wiki.info-ops.org/?ref=hn