Hacker News

Every engineer, not just the ones building bridges. I was required to take an Engineering Ethics course that covered Tacoma Narrows as well as other incidents, such as the famous Hyatt Regency walkway.



I second that. As a mechanical engineer, I had to learn about public works and prior disasters that happened when rigor or oversight was lacking, including the Tacoma Narrows bridge and the Hyatt Regency walkway.


I covered them twice, even: once as a mechE undergrad, and again in business school. The former focused primarily on the engineering dynamics and ethics, the latter on the organizational behavior and decision-making failures.


We covered software-induced disasters in my CS courses. This isn’t standard, which explains a lot about our profession.

My two favorite examples were:

The Therac-25 — it’s just frontend GUI code. Why test it? What could go wrong?

The Siberian gas pipeline explosion of ‘82 — not technically an accident, but it shows why you can’t establish the correctness of untrusted code through testing alone. It was also the biggest non-nuclear man-made explosion, at least at the time.

The Russians had stolen pipeline schematics from US companies. The theft was discovered before they stole the control software. Rather than block that theft, US intelligence modified the software so that an integer would overflow after a year (or two) of operation. The Russian economy would have been ruined if the pipeline wasn’t operational well before the bug would trigger, so the overflow would never show up during testing.

When it triggered, it slammed a bunch of valves shut, causing multiple parts of the network to explode at the same time.
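The actual sabotage code was never published, so as a purely hypothetical sketch of the mechanism being described: an uptime counter stored in a fixed-width signed integer silently wraps to a negative value once it passes its maximum, and any control logic that trusts the counter then misbehaves. (The 16-bit width, tick function, and `valve_open` check below are all invented for illustration.)

```python
def tick(counter: int) -> int:
    """Advance a 16-bit signed counter by one, wrapping on overflow."""
    counter = (counter + 1) & 0xFFFF                  # keep only 16 bits
    return counter - 0x10000 if counter >= 0x8000 else counter

def valve_open(uptime_ticks: int) -> bool:
    # Naive check: "we've been running for a while, keep the valves open."
    # Once the counter wraps negative, this silently flips to False.
    return uptime_ticks >= 0

counter = 32766                # near the positive limit of a signed 16-bit int
for _ in range(3):
    counter = tick(counter)

print(counter)                 # -32767: the counter has wrapped negative
print(valve_open(counter))     # False: the control logic now misfires
```

The point is not the specific arithmetic but the failure mode: the bug is invisible for as long as any realistic test run lasts, then triggers everywhere at once.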

The US military’s seismologists detected it, and thought the Russians had detonated a new type of nuclear weapon. The military was going to escalate until the intelligence service told them they were responsible for the blast.

Here’s a decent list of other incidents:

https://royal.pingdom.com/10-historical-software-bugs-with-e...


Fun talk by Paul Fenwick from a long-ago OSCON about various particularly bad engineering failures: https://www.youtube.com/watch?v=KkoyVPPXt5w


wow, that's crazy




