Many years ago, I had an opportunity to work on a similar type of system (though more recent than this). In the final round of interviews, one of the executives asked if I would be comfortable working on a device that could deliver a potentially dangerous dose of radiation to a patient. In that moment, my mind flashed to this story. I try to be a careful engineer and I am sure there are many more safeguards in place now, but, in that moment, I realized I would not be able to live with myself if I harmed someone that way. I answered truthfully and he thanked me and we ended things there.
I do not mean this as a judgement on those who do work on systems that can physically harm people. Obviously, we need good engineers to design potentially dangerous systems. It is just how I realized that I don't have the character to do it.
> I do not mean this as a judgement on those who do work on systems that can physically harm people
In industrial, non-software settings this is not as rare as you'd think; look into the tower crane, truck crane, and rigging industries. Lots of things can kill people. The important part is having the appropriate 'belt and suspenders' safety checks and engineering practices in place to prevent them from doing so.
> In the final round of interviews, one of the executives asked if I would be comfortable working on a device that could deliver a potentially dangerous dose of radiation to a patient
Automotive software engineer here: I've asked the same in interviews.
"We work on multi-ton machines that can kill people" is a frequently uttered statement at work.
I would have considered the fact that, for the vast majority of people suffering from cancer, this device helps them rather than harms them. However, I can also imagine leadership at some places trying to move fast and pressuring ICs into delivering something that isn't completely bulletproof in the name of the bottom line. That is something I would have tried to discern from the executives. Similar tradeoffs have been made before in the auto industry, weighing safety fixes against expected legal costs.
There is plenty of other high-stakes software that involves human lives (Uber self-driving cars, SpaceX, Patriot missiles), and much of it scares me and morally frustrates me to the point where I would not want to work on it, but I totally understand if your personal profile is different from mine.
I feel like if you're comfortable working on such software, you'd probably be the least qualified person to do so. It seems to me that you can't be paranoid enough when developing these kinds of systems.