Supporting the analogy of this construction error as a bug, the error itself was found by a person completely unrelated to the project who had access to the 'source code' (the building plans).
For anyone interested in this sort of thing "Why Buildings Fall Down" [1] is a good related book. I did civil engineering at university and you'd be surprised how many beautiful and interesting buildings only ended up that way because of constraints forced on the engineers and architects by the surrounding environment.
> ... you'd be surprised how many beautiful and interesting buildings only ended up that way because of constraints forced on the engineers and architects by the surrounding environment.
I recently did a major remodel of my home, and many of the decisions were ultimately directed by things outside my control (e.g. physics, availability of materials, environmental conditions, etc.). I didn't get exactly what I originally wanted, but the result was better than what I had originally imagined.
This also goes into how the bug was found by a student who managed to contact the lead architect and the process of quietly patching the building. The repair happened under a deadline of an approaching tropical storm for added suspense.
I encountered concern about unorthodox approaches when building my house: the code says that if you have a central support for the stairs and it is a rectangular pipe of cross section X, then it's OK. My staircases floated on a set of plate struts -- basically climbable sculptures designed by the architect. In order to get the permit approved, we had to pay someone to write a 15-page report full of differential equations to show that they wouldn't, in fact, collapse.
I think the author underestimates the inherent complexity in large scale systems.
Yes, incidental complexity makes us skip tests that might have been good, but it is the inherent complexity that explodes in our faces when apparently unrelated issues cause emergent behaviour.
It has nothing to do with software being a young field; it's just that we can create complexity that is not bound by the laws of physics.
I usually show 'business people' Conway's game of life, and they are always fascinated, and then I ask them how many rules WE have.
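The point lands because Life's entire rule set fits in a handful of lines. A minimal Python sketch of those rules (the function and variable names are my own, for illustration):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life.

    `live` is a set of (x, y) cells. Two rules decide everything:
    a live cell survives with 2 or 3 neighbours; a dead cell is
    born with exactly 3 live neighbours.
    """
    # Count, for every board position, how many live cells border it.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row oscillate with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Two rules, endless emergent behaviour -- and our systems have vastly more rules than two.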
Not sure if I understood it correctly - so there's a building that would collapse sooner or later in a very dangerous way, but it's still there after almost 40 years?
The building wasn't in imminent danger of collapsing. One would assume that had a gale-force storm been forecast for the city, someone would have ordered an evacuation. However, had they tried to explain the situation, many people would have over-reacted and refused to set foot inside, causing huge losses for everyone.
Right, and it wasn't "in secret" - they notified the authorities, etc. An evacuation plan was developed and would have been used if the storm had gotten closer to the city.
I've encountered requirements bugs where the requirement was flat-out wrong. Those are usually easy to find. Contradictory ones can hide until you try to implement them.
But one of my favorite tricky ones was an apparently very clearly written requirement. It passed all the requirements reviews, the design passed design review, the code passed, etc. Then the original writer of the requirement went to use the product and said, "but that's not what I meant." Sure, I implemented exactly what he very clearly wrote, but that turned out to be very different from what he meant to say.
Design errors are bugs. They're some of the deepest and most difficult to fix, in no small part because someone meant for it to be that way.
A site that's falling over or won't load is obviously broken. One whose design subtly invites bad behavior may not be immediately evidently broken. But it's broken all the same.
A website failing due to too many users is not a bug, it's a design issue.
And remember that we build both buildings and programs while acknowledging they can catastrophically fail under certain possible real-world conditions; we can only minimise that risk.
Here they accidentally didn't minimise the failure rate enough.
Plus if it never failed in testing or real life, was it even a bug?
There's no proof they were right that it would have failed. Lots of other fail-safes might have come into play.
> Plus if it never failed in testing or real life, was it even a bug?
Yes. I have proven that we shipped software with a bug that can crash the system given a very likely set of inputs. The fact that we never saw that particular combination of inputs was sheer dumb luck due to how it was most commonly configured.
I still call that a bug: if a user does X, where X is a reasonable thing to do, I guarantee the system will crash. That no user has done X yet does not mean it's not a bug.
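A toy illustration of that kind of latent bug (entirely hypothetical code, not the actual system): a function that works in every configuration we happened to deploy, and crashes the moment a user does a perfectly reasonable thing we never tested.

```python
def average_latency(samples):
    # Hypothetical: every deployed configuration recorded at least one
    # sample, so this never failed in testing or production. But if a
    # user disables sampling (a reasonable thing to do), `samples` is
    # empty and this divides by zero. The bug was always there; the
    # crashing input just never arrived.
    return sum(samples) / len(samples)

def average_latency_fixed(samples):
    # The fix: handle the empty case explicitly.
    return sum(samples) / len(samples) if samples else 0.0
```

That the crash never fired is luck, not correctness -- which is exactly why it's still a bug.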
It's clearly not just a design error. A conscious design decision was made: build the building with supports in the middle of the sides instead of the corners. This, in and of itself, was not a bug. In addition, a conscious implementation decision was made: hold the thing together with bolts instead of welds. Similarly, this alone was not a bug.
The two taken together hit a corner (heh) case and give rise to a bug.
They built something to take a particular input (hurricane-force winds) and produce a specific output (keep standing). Their calculations and over-engineering led them to expect that output all the way through production. Further investigation revealed that another output was more likely. I call that a bug.
http://people.duke.edu/~hpgavin/cee421/citicorp1.htm