I used to work in aerospace as a hardware/software engineer, and while my team did use unit tests, that wasn't how software was tested and qualified.
The process in aerospace is vastly slower, as illustrated by the fact that on average a programmer in aerospace produces just 1,000 lines of code a year. There is a clear reason why aerospace engineers produce far less code:
- Documentation - You get a pile as tall as your desk for a few thousand lines of code. The software is designed to the degree of knowing precisely what the maximum N going into a function can be and its exact runtime; every function has fixed maximum and minimum values and a runtime budget allocated before anyone writes code. Then the code is checked against the documentation.
- Bench testing - We had a complete dev/test environment in which all in-development hardware/software could fly missions, with all the different teams and their avionics devices interacting on a network. You then flew countless "missions" (real cockpit parts, but swivel chair plus simulator), testing all the different scenarios against whatever you had just changed.
- Code walkthroughs - The wider team of engineers working on the avionics software would print out the entire code, all branches included, and "walk" the entire system from beginning to end with documentation in hand. Every line is validated by someone from a different team and every decision scrutinised.
- System test - multiple phases of it. Completely separate teams would rigorously test every new release to death: against the client's specs, against the documentation produced, and with their own independent simulator and cockpit setup. Not only that, but a second team did the same thing, trying to catch the first team missing something.
- Prototype testing - Then the software spends years being tested in prototype vehicles before finally being signed off as ready.
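The "bounds fixed before anyone writes code" style from the documentation bullet above can be sketched roughly as follows. This is only an illustration; the function name, limits, and contract are all invented, not taken from any real avionics codebase:

```python
# Hypothetical sketch: every function's input size and value ranges are
# documented before coding, and the code enforces that contract.
MAX_TRACKS = 32          # documented maximum N of samples into this function
ALT_MIN_FT = -1_000      # documented minimum valid altitude sample
ALT_MAX_FT = 60_000      # documented maximum valid altitude sample

def mean_altitude_ft(altitudes):
    """Integer mean of 1..MAX_TRACKS altitude samples.

    Documented contract: 1 <= len(altitudes) <= MAX_TRACKS and every
    sample within [ALT_MIN_FT, ALT_MAX_FT]. A violation is a defect in
    the caller, caught here rather than silently propagated.
    """
    assert 1 <= len(altitudes) <= MAX_TRACKS
    total = 0
    for a in altitudes:
        assert ALT_MIN_FT <= a <= ALT_MAX_FT
        total += a
    return total // len(altitudes)
```

Because the bounds are fixed up front, worst-case runtime and memory are knowable before the code exists, which is what makes the later verification steps tractable.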
End to end, it took about 10 years of full-time work to get 60k lines of code released for one avionics device. We had unit tests, but only for the purpose of testing the hardware, not really for our own software beyond startup tests.
All of that rigour comes out of one principle: every time you find a problem in the code, at any point, it's not the individual's fault but the team's, and you work out the genuine cause and ensure it can't slip through again. Everything must be testable.
There are quite a lot of parallels with unit testing, and indeed it could be a better way to capture tests for aerospace, potentially a slightly less labour-intensive way to run all the tests every time a new release is produced. But it wouldn't look anything like what your typical commercial company does, because to them it's not worth reducing the 10 bugs per 1,000 lines of code they currently have down to more or less zero. Unit testing in aerospace would be about efficient repetition of known tests, much as it is in the commercial space, rather than driving any form of design process, though I could see test definitions being produced from the function lists. We did a lot of specification testing and limited the language to allow that to occur and to make the code more straightforward.
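The idea of deriving test definitions from the documented function lists could look something like the sketch below. Everything here is hypothetical (the function, its bounds, and the table format are invented); the point is only that once every function's documented limits exist as data, a harness can exercise the extremes mechanically on every release:

```python
# Hypothetical sketch: a spec table derived from the documented function
# list, where each entry records a function's documented input bounds and
# the expected results at those extremes.

def clamp_throttle(pct):
    """Clamp a throttle command to the documented 0..100 percent range."""
    return max(0, min(100, pct))

# (function, documented min input, documented max input,
#  expected output at min, expected output at max)
SPEC = [
    (clamp_throttle, -50, 150, 0, 100),
]

def run_spec_tests():
    """Exercise every function at its documented extremes; return failures."""
    failures = []
    for fn, lo, hi, want_lo, want_hi in SPEC:
        if fn(lo) != want_lo:
            failures.append((fn.__name__, lo))
        if fn(hi) != want_hi:
            failures.append((fn.__name__, hi))
    return failures
```

This keeps the design process documentation-driven, as described above, while letting a machine do the repetitive re-running on each release instead of people.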
If I didn't know anything about software engineering, I'd assume it worked something like what you just described.
Compared to what it is in reality, it's a little disappointing, and it's not surprising that a lot of people question the title "engineer" for the majority of developers. I've written semi-critical parts of large businesses' systems and didn't do half of what you described.
I know the trade-off between getting things done and doing it right, and we wouldn't have our current abundance (in tech) if we took the latter path, but the security breaches, the lack of care for performance, and the errors, crashes, and deaths (more to come as we depend on self-driving cars, IoT, etc.) make it questionable sometimes.
They are very conservative about writing software in aerospace. One of my first jobs out of school was working as a controls engineer for satellites, where I tested the controls and data handling subsystems. For a new satellite, my department determined that either we needed to add a single minus sign to our standard code or the structures group needed to design some exotic assembly for some antennas and sensors. They sent me, the new guy, to the systems engineering meeting to discuss this, because it was a given that they would NOT try to update the code when there was another solution, however complicated it was.
The only problem with producing 1000 LOC per year is that your brain will rot from being so narrowly focused on this one thing.
It could be a tolerable job with a programming hobby on the side (FOSS projects or whatever) in which you produce another 15,000 lines to get your "coding fix".
The process described is not narrowly focused. It's full system design, implementation, and validation. Engineering is more than just coding and can be much more satisfying.
I'm sorry, but that's such a narrow view of things. Different people are attracted to different kinds of jobs. What they described is essentially my dream job. I hate the haphazard way modern software is written - it's not engineering at all.
> All of that rigour comes out of one principle: every time you find a problem in the code, at any point, it's not the individual's fault but the team's, and you work out the genuine cause and ensure it can't slip through again.
This. In every organization, it is the motivation of its parts that drives it towards the goal (or not). It is easy to misalign the common goal and individual goals through bad management. In fact, in my experience this is one of the most common mistakes.