Wednesday, August 28, 2013

Homework 5 – Software Engineering – CSCI 362-001

The software problems that caused the aerospace accidents seem, in hindsight, like things that should have been easy to avoid, but perhaps they were not so obvious at the time. I don't know what the engineers' training or common procedures were like then; today we seem to have processes that try to cover as much as possible, along with better standards for deciding what should be included and for verifying that it works. I was most troubled by the mention that software was apparently assumed correct until proved faulty, as if they were just checking whether a program worked on a cheap, replaceable computer instead of launching hardware into space.

The lack of responsibility in these projects shocked me as well. I thought people in higher-level agencies at least tried to make sure someone could be held accountable, even if only out of simple concern for their own skins. The fact that no one was performing certain tasks was also baffling; surely checking whether something had been done, or at least basically tested, would be on a checklist? It seems odd to put more software and procedures than necessary on something that, once deployed, would be essentially unmodifiable. The old adage that less is more should carry some weight when everything must go perfectly; and who wouldn't be willing to specify exactly what they need when the result is being shipped off to space?

The Sentinel and Virtual Case File articles disappointed me. As a kid I always assumed the government was constantly listening and keeping up to date with information and how to manage it; surely the FBI should have a better record-keeping system than paper. The fact that the agency does not seem able to communicate with the people it has working on a more modern system is also baffling; surely it should be pressing them for regular progress updates, if nothing else? These agencies are usually thought of as more capable, or at least more capable of intimidation when they want something done. The fact that security was apparently an issue was also appalling; for any government system, I would imagine security would be the top priority.

The situation with medical equipment, radiation-therapy machines specifically, seems like something that should have happened with at most one device. I would expect more caution from anyone working on a medical device, and I would think the makers would be more worried about lawsuits, or about being held responsible for their product malfunctioning so horribly. None of the systems seemed to have nearly enough feedback, and they attempted to be far too complicated, offering features such as automatic treatment when something so life-threatening should be driven by very explicit, discrete instructions. For the Therac-25, it seemed like a terrible legal and PR move to deny knowledge of other accidents at one point, especially when evidence existed to the contrary.