Despite the 737 MAX crashes, many observers say the current safety certification process for aircraft software has generally worked well. Safety-critical programming rarely fails to operate as designed; rather, the problems that have occurred have tended to stem from failures to foresee danger points in the design specifications, including the unexpected ways that pilots can interact with the system, as seems to be the case with MCAS.
With a limited budget and staff, the FAA has for decades relied on industry to shoulder most of the burden of certifying the safety of aircraft. As of 2013, more than 90% of the work was being done by deputized consultants and employees at the manufacturers it oversees, according to a report from the Government Accountability Office. With software, the certification process is focused on spelling out what the software is required to do and ensuring that the code matches up to those requirements.
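To illustrate the requirements-based approach described above, here is a minimal, hypothetical sketch: a single made-up requirement, a function claimed to satisfy it, and tests that trace each behavior back to that requirement. The requirement text, function name, and threshold value are all illustrative assumptions, not actual avionics code or any real MCAS logic.

```python
# Hypothetical requirement (illustrative only, not a real avionics spec):
# REQ-001: If the sensed angle of attack exceeds the configured threshold,
# the system shall command nose-down trim; otherwise it shall command no trim.

def trim_command(angle_of_attack_deg: float, threshold_deg: float = 15.0) -> float:
    """Return a trim increment per REQ-001 (hypothetical requirement)."""
    if angle_of_attack_deg > threshold_deg:
        return -1.0  # nose-down trim increment
    return 0.0       # no trim commanded


# Certification-style verification: each test case is traceable to REQ-001,
# including the boundary condition at exactly the threshold.
def test_req_001():
    assert trim_command(20.0) == -1.0  # above threshold -> nose-down trim
    assert trim_command(10.0) == 0.0   # below threshold -> no trim
    assert trim_command(15.0) == 0.0   # at threshold -> no trim (boundary)


test_req_001()
```

The point of the sketch is the traceability, not the logic: certification reviewers check that every requirement has tests and every behavior maps back to a requirement, which is exactly why a hazard absent from the requirements, such as an unforeseen pilot interaction, can slip through.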
The software controlling autonomous cars and aircraft will have to be capable of learning from experience and reacting to situations the designers couldn’t anticipate, and its decisions may be hard to interpret.