itissid 2 days ago

For anyone who has not read the cockpit recording of Air France 447, I would encourage them to [1]. It is simply a jaw-dropping study in how things go wrong so fast: a risk with AI we have barely begun to acknowledge, let alone regulate, as a community.

[1](https://tailstrike.com/database/01-june-2009-air-france-447/)

jcalvinowens 2 days ago

Anybody who is interested should read the full report: https://www.faa.gov/sites/faa.gov/files/AirFrance447_BEA.pdf

ApolloFortyNine a day ago

That's the one where one of the pilots pulled up the entire time, ignoring an alarm literally blaring the word "stall" for two minutes.

The poor captain found out in the last 10 seconds what he had been doing, but it was too late.

A couple of accidents occurred largely because Airbus averages conflicting side-stick inputs with nothing more than a small warning light when it happens. I'm pretty sure they would have gotten the Boeing treatment if social media had been more entrenched at the time.

stevarino a day ago

It's a bit more complicated: the aircraft itself was unable to detect the stall conditions due to icing of the pitot tubes, so the warning cut in and out several times. The copilots clearly did not understand the situation, so an inconsistent alarm could be seen as spurious or a secondary effect.

> At the same time he made an abrupt nose-up input on the side-stick, an action that was unnecessary and excessive under the circumstances. The aircraft's stall warning sounded briefly twice due to the angle of attack tolerance being exceeded

...

> The crew's lack of response to the stall warning, whether due to a failure to identify the aural warning, to the transience of the stall warnings that could have been considered spurious, to the absence of any visual information that could confirm that the aircraft was approaching stall after losing the characteristic speeds, to confusing stall-related buffet for overspeed-related buffet, to the indications by the flight director that might have confirmed the crew's mistaken view of their actions, or to difficulty in identifying and understanding the implications of the switch to alternate law, which does not protect the angle of attack.

It's a complicated interplay of systems, where autonomous control systems are changing modes and receiving bad information during a complex, rapidly developing situation.

ApolloFortyNine 11 hours ago

>the aircraft itself was unable to detect the stall conditions due to icing of the pitot tubes, so the warning cut in and out several times.

The stall warning blared 74 times [1].

Of the three pilots in the cockpit, only one thought he had to pull up (see page 31); unfortunately, he was one of the ones in control.

>rapidly developing situation.

It was the same situation from beginning to end: iced-over pitot tubes. The stall warning only started blaring once the pilot stalled the plane. Bad airspeed indications don't stall the plane, and they're something pilots are supposed to be able to handle; that's why two of the three were shocked that one did the exact opposite of what the situation called for.

It was pilot error. Just look at the report: every finding starts with "the Crew". Planes aren't supposed to crash into the ground just because an airspeed sensor failed.

[1] https://bea.aero/uploads/tx_elyextendttnews/annexe.01.en.pdf

macrocosmos 2 days ago

That catastrophe is entirely on Bonin the bonehead.

tra3 2 days ago

I read through the link. The other pilot and the captain are complicit by virtue of being there. The autopilot disengages at 2:10 and they crash at 2:14. Terrible.

My other immediate thought: Tesla's Autopilot. I've never used it, so I'm not sure I'm fully correct here, but apparently it requires you to stay vigilant and take over in certain situations? I wonder how well that works out in practice.

fragmede 2 days ago

In practice, there's a camera in the Tesla that watches the driver to make sure they're paying attention. If they're not, perhaps because they're fiddling with their phone or looking at something on the passenger seat, the system gives a warning and then a strike. Get five strikes and you can't use FSD for the next week or two. So drivers are directly incentivized to keep their eyes on the road, because if they don't, they can't use the system, which would suck on a long road trip.