> the staff had become “de-sensitized” to the risk of a serious accident.
Yet again serious problems happen due to the Normalization Of Deviance. We really need to find a way to create working conditions that encourage correcting problematic situations immediately before the behavior becomes normalized.
Regarding the management/other problems... I encourage everyone in every industry that deals with safety (i.e. most industries) to watch Richard Cook's short talk "Resilience in Complex Adaptive Systems"[1].
I'm picturing workers who have never spent time on wikipedia reading about nuclear accidents, much less in any books.
The list of nuclear accidents is hair-raising. The thought that a co-worker might assemble a near-critical mass nearby is... beyond hair-raising. Where the heck are they getting these guys, and what are they telling them?!
They are normal people, who follow normal psychological patterns.
In your root cause analysis, 99% of the time the answer should not be "human error". An error that one human makes will be repeated in the future by another human.
The problem is setting up patterns and processes in the working environment that are aware of these human limitations and that work around them. The air travel industry is a _great_ example of what happens when you don't simply blame the human, but look to fix the true problems that caused the human to make a mistake.
> The air travel industry is a _great_ example of what happens when you don't simply blame the human, but look to fix the true problems that caused the human to make a mistake.
Contrast to my company:
Our phones require you to dial "9" to get an outside line, and then "1" to dial long distance. A lot of my coworkers are foreign, and in the process of trying to dial a cell phone number (international or otherwise) they accidentally dial "911". When they realize their mistake, they hang up; not having grown up in America, they never learned to stay on the line and tell the operator that you made a mistake.
Company response? Threatening emails, verbal and written company-wide wrist slaps, and stickers on every phone that say not to hang up if you dial 911 or you'll get in trouble.
As a result people are now afraid of dialing long-distance numbers, so they simply avoid it.
Imagine if the company had instead tried to correct the true issue causing the errors (poor phone routing numbers).
In many (all?) parts of the US, even if you stay on the line and tell the operator that the call was a mistake, you'll still get a visit from the police. The thinking is that someone who got a chance to dial 911 but then is discovered by a captor might be coerced into saying the call was a mistake. So they send the police to investigate ALL 911 calls -- hangups, mistakes, everything.
Another fun fact is that a "911" embedded in almost any dial sequence will still connect to 911. One of my colleagues discovered this when he was testing some dial plan thing years ago. I think he called 911 a few thousand times.
Interesting. I definitely made this mistake as a child. My dad's cell phone number started with 915-XXXX. I misdialed once and ended up calling "911".
I explained that it was a mistake, and they said not to worry about it and hung up. No officer was dispatched.
I wonder if something has changed since I made that mistake, if it's just inconsistent across the country, or if there is some discretion of the dispatcher to decide that the child probably did make a mistake.
I don't think anything has changed; I made test-calls to 911 (to confirm the call was routed properly) when commissioning new phone systems and never had police dispatched. I'd tell the operator that I was testing and wanted to confirm that they were showing the correct phone # and address.
This issue was covered in an NPR segment last year, based on an even older Harvard Business Review article, about how to prevent the next oil rig disaster. The proposed solution: get the workers to admit vulnerability.
The investigation further revealed that the penalties imposed by the government on the private firms that make America’s nuclear weapons were typically just pinpricks, and that instead the firms annually were awarded large profits in the same years that major safety lapses occurred. Some were awarded new contracts despite repeated, avoidable accidents, including some that exposed workers to radiation.
I don't think the author was trying to create a conversation about the misalignment of the lab's management and safety. The article goes out of its way to sensationalize industrial accidents with "NUCLEAR RADIATION" to construct a narrative of uninformed fear of anything associated with nuclear material.
It reminded me a bit of some of the Discovery Channel's shows where they stress that if this multi-million ton container ship doesn't stay between the two entrance markers to the harbor, it will kill everyone on board and probably destroy millions of dollars in goods. The entrance markers are 5 miles apart. The fact is accurate, the narrative is not.
If you want to be really scared, go read the OSHA reports on any of DOW's chemical plants. I would be willing to wager that more people have died in industrial accidents at their plants in the last 5 years than at Los Alamos in its entire history. The Union Carbide Bhopal disaster killed over 2,000 people in a single accident.
The point is the article was not written to compare the risk of accidents at Los Alamos with that of other similarly sized industries, nor was it written to highlight the misalignment of lab management with safety goals. It was written to characterize nuclear materials as scary and dangerous, and it carefully avoids giving the reader any way to compare that risk with other risks they are familiar with but unconcerned about.
On the topic of scary chemical plants: check out the Chemical Safety Board's videos on YouTube. Mostly computer-animated recreations of serious industrial accidents.
This is a great idea. I wish more of the serious post-mortems in our industry were this thorough. But I guess people rarely die; they just have millions of people's data leaked or suffer serious service interruptions, so we don't have much of an incentive.
On the topic of those shows, I hate the false drama of "will they make it on time?"
Well yes, global logistics is incredible, the ship is working fine, the weather is good, and there isn't a US destroyer for miles, so of course they will.
I wish they'd just focus on the engineering and logistics and cut all the bullshit.
I'd also really love it if they'd take a deep dive into specific areas: navigation, maintenance, scheduling, all that stuff.
If they made documentaries I'd like, they probably wouldn't have many viewers.
Deadliest Catch was good in that regard; some interpersonal stuff (more in later seasons), but the Bering Sea is dramatic enough on its own.
Is this another situation where these are the only companies that can do the work, so the government has no choice but to work with them in order to stay on some kind of schedule?
It must be pretty hard to disrupt the "government contracts for nuclear weapons production" industry..
>>It must be pretty hard to disrupt the "government contracts for nuclear weapons production" industry..
not to pick on you, but this is an interesting turn of phrase (I know it is sarcastic). There is an interesting gulf in SV culture nowadays between needing to "disrupt" an industry, and the industry actually just needing some renewed competition. A major problem is usually raising enough capital to really compete. And an interesting SV and banking assertion often made is that there is way too much capital chasing too few opportunities.
I guess that raising relatively small VC funds precludes putting large sums of money to work, because that means no diversification. On the other hand, if we are done with low-hanging-app-fruit for outsized returns, is the solution just to raise bigger funds? It seems there are problems that can be profitably solved with modern software, modern teams, and new ideas, but which are not solved due to too much diversification of capital.
Pretty much. VCs didn't start SpaceX, Tesla, and Blue Origin; billionaires did, after already earning money through other means.
Another issue that plays into diversification is the inherent riskiness of startups. Most will fail to create so much as a decent app while remaining on budget; who do you trust to create a reliable nuclear research lab? Even experts in the field might not have the requisite business experience, and if it fails, it fails big time. You need someone like a Musk or Bezos to obsessively drive such an endeavor, and it's hard to pick out the future Musks and Bezoses if they haven't already proved themselves.
Yeah, in that context startups aren't really companies. They're proofs of executive competence. As a side benefit, if they are successful it enables the executives to self-capitalize.
Slotin's cautionary tale is such scary founding lore of the nuclear industry... Complacency born from habit is awfully powerful and that may be even scarier.
Slotin's story is such a great example of complacency. "Using a screwdriver was not a normal part of the experimental protocol. ... the screwdriver slipped and the upper beryllium hemisphere fell, causing a prompt critical reaction and a burst of hard radiation."
The radiation dosage section of that article is a mess and, I think, shows a limitation of Wikipedia. The writers managed to not use the same unit in any two consecutive sentences, sometimes not even within the same sentence, and they don't bother providing any way to relate the different units.
"The resulting blue glow — known as Cherenkov radiation — has accidentally and abruptly flashed at least 60 times since the dawn of the nuclear age,"
I'm not a physicist, but as far as I know the blue flash generated by a criticality is caused by gases in the air fluorescing. Cherenkov radiation is a different phenomenon.
"Cherenkov radiation could also be responsible for the "blue flash" experienced in an excursion due to the intersection of particles with the vitreous humour within the eyeballs of those in the presence of the criticality."
Astronauts have reported seeing flashes of light when they close their eyes in space. It's believed to be caused by Cherenkov radiation from cosmic rays traveling through the vitreous humour of their eyes.
Essentially, they used a chemically incompatible kitty litter in bins of nuclear waste (I read another article somewhere that went more in depth about how the decision to switch went down...unfortunately I can't find it quickly, but I recall it had something to do with a lack of clarity in communication). That bad reaction caused the bin to explode underground in the Waste Isolation Pilot Plant and release radioactivity.
Shortly before this event, a truck caught fire underground at the waste repository, shutting down operations. I think that event kept down the number of workers exposed in the bigger incident.
From my experience, safety in these sorts of situations is treated more as a burden than a lifestyle from a worker's perspective. Holding accountable not only the individuals who fail to follow safety practices but also their supervisors is, I think, the only way safety will actually be fully practiced.
"Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety" by Eric Schlosser enumerates accidents related to nuclear weapons, many of which do not end in criticality due to sheer luck in most cases. Illusion of safety indeed.
And also "A Review of Criticality Accidents", a short but action packed read referenced in the article. Nuclear materials are deadly in the most unexpected ways.
I read that book differently. In nearly every case the payload didn't reach criticality because of the design of the weapon. Instead, the harm came from ancillary effects rather than nuclear explosions.
That is true, although I was left with the impression that in many cases it was due to luck. Older weapons with poor safeguards ended up in service far longer than was reasonable and the addition of necessary safeguards was resisted at nearly every level.
There were also many stories where a weapon was dropped during loading or caught in a fire, etc., and the general idea seemed to be that it could just as easily have ended in disaster.
The story of the Damascus incident, which served as the framing story for the rest of the book, also seems relevant. In that case there were procedures to ensure safety, but over time the reasons for those procedures seem to have been forgotten, or the knowledge was inadequately passed on to newer team members. Equipment (like HAZMAT suits) was allowed to age, rendering it useless. Many people in the administration chain had weak or little knowledge of the facilities... And then there was an accident where a small mistake compounded into a pretty scary incident.
I didn't care for this book - I felt it was like a bad cookie with a few good chocolate chips. The Titan silo explosion that the book is structured around is more a demonstration of designed nuclear safety than not (and the nuclear part is almost nonexistent). Industrial accidents have been so much worse than this.
It's really entertaining and worth reading, but it's still intentionally biased, almost suggesting that "a little bit of radiation can be good for you" in the way smoking was presented by the tobacco companies in older times.
And it's not so. An individual can be lucky, but it's not something to be approached with "don't worry". The proof is in Eileen Welsome's "The Plutonium Files: America's Secret Medical Experiments in the Cold War":
"Los Alamos National Laboratory told me they knew nothing about the experiment and in fact denied that it occurred even though it was a subject of the congressional report in 1986."
It's been a pretty slow, sad unraveling of LANL (and to some degree the other two DP labs). The latest stories are pretty predictable given the transition from UC to LANS about a decade ago (I left the lab after the transition, since it was a morale mess - I really enjoyed my time there for about a decade before the change though). Things were already bad before that - the whole change of management from UC was in part due to a string of safety and security issues. Interestingly, I believe the issues that are currently being reported about have roots that trace all the way back to the last facility for doing that sort of work that got shut down up near Denver (Rocky Flats). The issues today are hardly new or novel - there has been some serious rotting going on within the NW complex since the Cold War began to wind down almost 30 years ago.
I wonder if anyone has considered building faux plutonium rods that can simulate criticality events. That is, they're packed with electronics and lights that display how much radiation they're giving off and how close they are to going critical. They'd be electronic devices that measure each other's position. Accelerometers could tell if they're dropped or banged on something. You'd also pair them with a faux radioactivity meter (Geiger counter), and perhaps a faux radiation dosage badge.
"Huh, if I bang these rods on each other, I see a flash of lights showing it was 50% to critical, I hear the Geiger counter jack up, and my radiation badge measures that I've just instantly received a year's worth of radiation. Ouch."
Maybe if technicians could experience first-hand how easy it is for these plutonium rods to go critical, they might be more wary of mishandling them. It's understandable how a person could become desensitized to the risk of something that they've never seen happen, where the threshold of that event occurring is not intuitively known.
When I'm driving my car in a high-performance way, I can feel when I'm beginning to push the bounds of its traction. I can feel when antilock braking kicks in. I get to experience the threshold where something bad can begin to happen. With fissile material, perhaps the problem is that workers never get to experience the threshold boundary at all until it's too late.
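To make the training-rod idea a bit more concrete, here's a rough sketch (Python, entirely hypothetical; the distance threshold, inverse-square model, and dose scaling are made-up numbers for illustration) of how such a rod's firmware might turn the reported positions of nearby rods into a simulated "percent to critical" and dose readout:

    import math

    # Hypothetical positions reported by each faux rod, in meters.
    rod_positions = [(0.00, 0.00), (0.15, 0.00), (0.60, 0.45)]

    CRITICAL_DISTANCE_M = 0.05  # made-up separation at which a pair "goes critical"
    DOSE_SCALE_MSV = 2.0        # made-up scaling from proximity to simulated dose

    def criticality_fraction(positions):
        # Worst-case pairwise "fraction of critical" using a toy inverse-square
        # model: the closer any two rods get, the higher the fraction.
        worst = 0.0
        for i, a in enumerate(positions):
            for b in positions[i + 1:]:
                d = max(math.dist(a, b), 1e-6)
                worst = max(worst, (CRITICAL_DISTANCE_M / d) ** 2)
        return worst

    frac = criticality_fraction(rod_positions)
    print(f"{frac:.0%} to critical, simulated dose {frac * DOSE_SCALE_MSV:.2f} mSv")

Banging two rods together would spike the fraction toward 100% and light up the faux badge, which is the visceral feedback the real material never gives you until it's too late.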
Even when you can see or taste the hazard (air pollution) or hear it (loud noises like gunshots or machinery), people still actively resist measures to mitigate or prevent harm!
There are a few things about training and so on here, but the problem is that plutonium rods should never be somewhere they are not meant to be, imho. So to remove the risk, whenever a rod is set down it must go in a special stand built for plutonium rods, with little holes spaced the correct distance apart so criticality can't occur. The rule then is simpler: plutonium rods will always be somewhere they should be. Also maybe paint them blue or something.
Edit: on reflection, if I were anywhere that had a rod lying on a table I'd get the hell out of there as soon as possible, because it's just a recipe for disaster.
>PF-4 is also the only place where existing cores removed randomly from the arsenal can be painstakingly tested to see if they remain safe and reliable for use in the nuclear stockpile. That work has also been blocked, due to PF-4’s extended shutdown, according to internal DOE reports.
I wonder if having another facility that can perform this work wouldn't provide the right incentive for the underperforming unit to improve. Competition is a healthy thing. Knowing that there is another lab that is doing the work to specifications should provide good motivation to improve.
In the years leading up to World War I, the Royal Navy was convinced that rate of fire was everything.
Every ship drilled to maximize their rate of fire. Commanders of vessels who could not keep up with their peers were sacked.
When the Battle of Jutland came around, the Royal Navy greatly outnumbered the German High Seas Fleet, and caught it unaware. The odds were overwhelmingly in the United Kingdom's favour.
Numerically, the battle was a disaster for the Royal Navy. Its losses were double those of its opponent. One of the reasons was that a number of RN ships exploded after taking a single hit, while German ones took shell after shell and stayed afloat.
In their zeal to optimize their firing rate, the Royal Navy did away with the simple precaution of installing airlocks between their ammunition storerooms and their gun rooms. You see, opening and closing an airlock every time you needed to get to and from the ammunition room wasted too much time.
Hitting a German gun room would destroy the gun. Hitting a British gun room would destroy the entire ship. But boy, did those gun crews shoot fast.
Unless your competitive process rigorously accounts for every form of cut corners, you're going to get what you measure... With some shortsighted optimizations thrown in.
Jutland: there were issues around the culture of waiting for signals (flags!) from the flagship, and also around the personality of the Commander in Chief and some of the other admirals, as well as the issue you mention.
See The Rules of the Game by Andrew Gordon if you have an interest in naval history or how large organisations respond to change.
"Finally, there is a brief summary of the findings which assert that the loss of the British BCs, contrary to conventional wisdom was the result of over sensitive British propellant charges rather than the tendency of the crews to ignore safe ammunition handling practices to increase rate of fire." Review of
https://www.amazon.com/Jutland-Analysis-Fighting-John-Campbe...
It is a way to incentivize and improve efficiency, but it can also be a very expensive way to do so. It requires a lot of redundancy. You're effectively doing the work multiple times in parallel and discarding the loser.
That's fine when you're talking small farmers growing strawberries who can switch to another crop if it turns out their strawberries aren't good enough.
But running multiple nuclear weapon processing facilities simultaneously and pitting them against each other is an insanely costly way of improving safety.
Competition in search of excellence is a dangerous tool: management implements a metric and the departments optimize for the metric no matter what the long-term goals are. This is how Sears got run into the ground: http://www.salon.com/2013/12/10/ayn_rand_loving_ceo_destroys...
Armed forces around the world rely on esprit de corps. The British SAS don't have to compete against the Navy SEALs to know they are the best; they know already.
Also the training. In special forces each serviceman is trained beyond his immediate job duties. If the commander or anyone with special skills dies during a mission, someone else must step up immediately to replace them, and of course they all have the requisite training.
You'd ask just how a nuke technician could even think of getting close to assembling a critical mass. Are they really that dumb? Have they never been told?
Yes, I guess that's what I'm getting at. And the SAS and SEALs are very aware of each other's existence, I'm sure, as well as that of the Russian Spetsnaz and similar forces.
Most of these problems are what I call "paper safety" noncompliance (e.g. somebody not taking a refresher safety class on time). The problem will be "solved" by employees taking even more mind-numbing safety training. You can also apply these principles to security violations.
Without genuine selflessness, homeostasis or worse is inevitable. This is why in Japan you traditionally find the most enlightened Zen monks not teaching, but cooking or supervising the kitchen - because that's the prime point of safety vulnerability for a monastery.
As a former DOE lab employee, I can't even begin to imagine the hassle of changing operating contracts, although LANL has really been putting forth extraordinary effort to get the DOE to try.
What's most irritating is that the DOE uses a broad brush, going overboard where it's easy and clearly not doing enough where it's hard. I had to attend ladder training before I was allowed to climb down under the 4' raised floor in my computer room, and the rule was made that we had to use ladders every time. You can't even fit the ladders in the tile holes, but the rule was on the books because it was "safe!" There were so many of those boneheaded rules, but these boneheads are allowed near plutonium to photograph it for no scientific purpose? Grrrrr
The current political talking point is that the private sector can do it cheaper. Which is exactly what you get with contractors: a mandate to pick the lowest bidder and to not let the project languish in RFP (by lobbying, typically).
France has the right model, with nuclear power creation and generation run through a vertically integrated, state-owned company, Areva. Our stockpile has to be an effective deterrent. We shouldn't trust contractors with our stockpiles; that process should be vertical too, damn the costs involved.
I am very offended by the improper use of the term "accidental". Nothing described here was accidental at all; it was intentional. Insisting this is about accidents, perhaps requiring endless more "training", is counterproductive when the described acts were done with full awareness by qualified individuals who knew better. More training won't fix that, so the solution of more training is doomed to fail.
What should happen, in my opinion, is that the University of California, all the contractors and corporations, and the employees and workers involved should be fired, banned from the industry permanently, and the entire system replaced, because none of these, as the article points out, are isolated incidents; they are systematic and endemic problems.
[1] https://www.youtube.com/watch?v=PGLYEDpNu60