New Flaw In Boeing 737 Forewarns Emergency Procedure Bugs In Driverless Cars

A newly discovered flaw in the Boeing 737 flight system prompts consideration of how self-driving cars might harbor similar flaws, specifically in the emergency procedures invoked during driving.

Curated via Twitter from Forbes Tech’s Twitter account…

This applies not only to fully autonomous cars, such as Level 4 and Level 5, but also to Level 2 and Level 3 semi-autonomous cars.

• Roadway tryouts need to somehow encompass emergencies, yet do so safely and in a carefully contained manner, meaning on closed tracks or proving grounds rather than on our public streets.

• Simulations of driverless cars need to specifically tackle and exercise the emergency procedure aspects of self-driving systems, which otherwise could be neglected or shortchanged.

• Automakers and tech firms would be wise not to wait until someone somewhere gets into a deadly crash involving a self-driving car, only for it to belatedly come out that an emergency procedure in the system had a lurking flaw.
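The simulation point above can be sketched in code. Below is a minimal, hypothetical Python sketch of a fault-injection test harness that deliberately triggers emergency events so the emergency-procedure code paths get exercised, rather than waiting for them to fire by chance; every function name and event name here is illustrative, not taken from any real self-driving stack.

```python
import random

# Hypothetical emergency events a simulation might inject into a drive.
EMERGENCY_EVENTS = ["sensor_dropout", "brake_fault", "sudden_obstacle"]

def emergency_response(event):
    """Toy stand-in for the driving stack's emergency procedure logic."""
    responses = {
        "sensor_dropout": "fallback_to_redundant_sensor",
        "brake_fault": "engage_secondary_braking",
        "sudden_obstacle": "emergency_stop",
    }
    # Returning "unhandled" exposes a gap: an emergency with no procedure.
    return responses.get(event, "unhandled")

def run_scenario(seed):
    """Simulate one drive, injecting a randomly chosen emergency event."""
    rng = random.Random(seed)  # seeded, so failures are reproducible
    event = rng.choice(EMERGENCY_EVENTS)
    action = emergency_response(event)
    return event, action

def exercise_emergency_procedures(num_runs=100):
    """Run many seeded scenarios; report any emergencies left unhandled."""
    failures = []
    for seed in range(num_runs):
        event, action = run_scenario(seed)
        if action == "unhandled":
            failures.append((seed, event))
    return failures

if __name__ == "__main__":
    failures = exercise_emergency_procedures()
    print(f"unhandled emergencies: {len(failures)}")
```

The key design choice is that emergencies are injected on purpose and runs are seeded, so a latent gap in the emergency handling surfaces during testing, reproducibly, instead of surfacing for the first time on a public road.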

The flaw was found somewhat by happenstance, arising during simulator tests meant to gauge the overall readiness of the Boeing 737 to resume flights.

• This latest error has been hidden in the Boeing 737 all along and could have been encountered at any time (during the invocation of emergency procedures).

• It was discovered not through a deliberate hunt for it, but merely as a byproduct of general, overall testing (via simulation).

• That added scrutiny and testing happened only because of the attention sparked by the AOA sensor case; the flaw might never have been overtly found otherwise, except if an incident led to a crash and the crash investigation traced the error as the root cause.

In short, the driverless cars already on our public roadways might very well contain hidden errors (I’d say it is nearly a guarantee that latent bugs exist, not merely a slim chance). Those driverless cars might get the green light for expanded use while still carrying a latent error, and the error could remain undiscovered, unmasked only once (sadly, regrettably) a deadly crash involving the self-driving car occurred.

Link to original article….
