| ▲ | operatingthetan 3 days ago |
| We already have advanced autopilots that can fly commercial airliners. We just don't trust them enough to not have human pilots. I would trust the autopilot more than freaking Claude. We already do, every day. |
|
| ▲ | dewey 3 days ago | parent | next [-] |
| I don't think anyone is suggesting we should do that...but it's still a fun project to play around with? |
| |
| ▲ | codingconstable 3 days ago | parent | next [-] | | Agreed. I think that's a really fun way to test out Claude's ability to perform an abstract task it's probably not trained on, and it was a nice read | |
| ▲ | freedomben 3 days ago | parent | prev [-] | | Yeah, I think GP misunderstood the nature of a thing like this. It's what hackers do, we play with things. Nobody is suggesting we replace the pilots in real planes with Claude, certainly not OP |
|
|
| ▲ | 16bytes 3 days ago | parent | prev | next [-] |
| In aviation there's a saying, "Aviate, Navigate, Communicate", which describes the hierarchy of things to pay attention to while piloting an aircraft. Autopilot is better thought of as "auto-aviate": if there is already a navigation plan, the aircraft can follow that plan. Simple autopilots just keep the wings level; others can hold an altitude and change heading. More sophisticated ones can change altitude or even fully land the plane. All of those things, however, require people to manage the "Navigate" part. "Aviate" is a deterministically solved problem, at least in normal flight operations. As you point out, we trust autopilots today, including on (nearly) every single commercial flight. LLMs are a poor alternative for "aviate", but they could be part of a better flight management automation package. The parent article tries to use the LLM to aviate, with predictable results. If paired with a capable autopilot (not the relatively basic one on that C-172), the LLM could figure out how to operate the FMS, take you from post-takeoff to final approach, and aid in situational awareness. Currently, I don't think there is a commercial solution for GA aircraft that could say, "Ok, I'm 20NM from KVNY, but there are three people ahead of me in the pattern, so I have to do a right 360 before descending and joining downwind on 34L". Having an LLM propose that course of action and tell the autopilot to execute it would definitely be an improvement to GA safety. |
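The split this comment proposes, an LLM that only *proposes* maneuvers while a deterministic layer validates and executes them, can be sketched in a few lines. Everything here (the `Maneuver` type, the allowed-maneuver list, the envelope limits) is a made-up illustration, not any real avionics or FMS interface:

```python
from dataclasses import dataclass

# Hypothetical contract: the LLM emits a structured maneuver proposal;
# a deterministic guardrail decides whether the autopilot may execute it.
@dataclass
class Maneuver:
    kind: str          # e.g. "right_360", "join_downwind"
    altitude_ft: int
    heading_deg: int

ALLOWED_KINDS = {"hold", "right_360", "join_downwind", "descend"}

def validate(m: Maneuver, floor_ft: int = 1000) -> bool:
    """Deterministic envelope check: reject anything outside known-safe bounds."""
    return (
        m.kind in ALLOWED_KINDS
        and m.altitude_ft >= floor_ft
        and 0 <= m.heading_deg < 360
    )

def execute(m: Maneuver) -> str:
    # Stands in for commanding a real autopilot; invalid plans never reach it.
    if not validate(m):
        return "rejected: pilot retains control"
    return f"autopilot executing {m.kind} at {m.altitude_ft} ft"

# The KVNY pattern-entry example: a right 360 proposed by the planner.
print(execute(Maneuver("right_360", altitude_ft=2400, heading_deg=340)))
# A nonsense proposal is refused by the deterministic layer.
print(execute(Maneuver("loop", altitude_ft=200, heading_deg=340)))
```

The design point is that the LLM never touches the controls: only whitelisted, envelope-checked maneuvers reach the (deterministic) executor.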
|
| ▲ | Ekaros 3 days ago | parent | prev | next [-] |
| I think we can trust them to not have human pilots. It is just that having a human in the loop is very useful in scenarios that are not that rare. Say the airfield has too much wind or fog, or another plane has crashed, closing all runways... Someone needs to decide what to do next. Or when there is some system failure nobody thought about. And if they are there anyway, they might as well fly for practise. And no, I would not allow an LLM into the loop of any decision involving the actual flying part. |
| |
▲ | LiamPowell 3 days ago | parent [-] | | There's also the issue that when something goes wrong, many people will never trust an autopilot again. Just look at how people have reacted to a Waymo running over a cat in a scenario where most humans would have made the same error. There are now many people calling for self-driving cars to never be allowed on roads, citing that one incident. | | |
| ▲ | girvo 3 days ago | parent [-] | | Which makes sense: a robot can’t be responsible for anything, a human can be. |
|
|
|
| ▲ | boring-human 3 days ago | parent | prev | next [-] |
| > We just don't trust them enough to not have human pilots. Much of the value of a human crew is as an implicit dogfooding warranty for the passengers. If it wasn't safe to fly, the pilots wouldn't risk it day after day. Come to think of it, it'd be nice if they posted anonymized third-party psych evaluations of the cockpit crew on the wall by the restrooms. The cabin crew would probably appreciate that too. |
| |
▲ | sandworm101 3 days ago | parent [-] | | There are soooo many pilot decisions that AI is nowhere near making. Managing a flight is more than flying. It is about making safety decisions during crisis, from deciding when to abort an approach to deciding when to eject a passenger. Sure, someone on the ground could make many of those decisions, but I prefer such things be decided by someone with literal skin in the game, not a beancounter or lawyer in an office. | | |
▲ | DoctorOetker 3 days ago | parent | next [-] | | It doesn't sound ethical to eject passengers while aborting an approach, regardless of precise timing. | |
▲ | cucumber3732842 3 days ago | parent | prev | next [-] | | > It is about making safety decisions during crisis, from deciding when to abort an approach to deciding when to eject a passenger. Everyone likes to hand-wring about this sort of stuff, but I think it's the exception. Nailing the "macro level" decisions, like "we'll go around this storm but we'll go over that one" or "we must divert to A or B, and we will choose B because it's better for our passengers/company/crew even if it's 10min more flying to get there", is what keeps the industry humming along mostly in the black rather than in the red. And it's these sorts of things that AI just tends to yolo and get mostly right when they're obvious, but also get immensely wrong when any sort of gotcha materializes. | |
| ▲ | ButlerianJihad 3 days ago | parent | prev [-] | | I sincerely doubt that pilots decide "when to eject a passenger". Mostly it would be the cabin crew: the flight attendants are 100% in charge of flight safety, and they would be managing relationships with passengers, and they would be the ones to make the call. It would ultimately be them calling some kind of law enforcement. If an Air Marshal is onboard already, obviously they would be on the front line as well. Furthermore, the concept of "ejecting a passenger" from a flight would mostly not be something you do while in the air, unless you're nuts. Ejecting a passenger is either done before takeoff, or your crew decides to divert the flight, or continue to the destination and have law enforcement waiting on the tarmac. Naturally, pilots get involved when it's a question of where to fly the plane and when to divert, but ultimately the cabin crew is also involved in those decisions about problem passengers. | | |
▲ | rounce 3 days ago | parent | next [-] | | The Pilot in Command has ultimate legal responsibility over the operation of the flight; ICAO conventions explicitly state this. Whilst in practice the cabin crew will be the ones dealing with the passenger(s) and supplying information to the PIC, it won't be them making the final decision. | |
| ▲ | sandworm101 3 days ago | parent | prev [-] | | No. Cabin crew recommend. Pilots actually decide. | | |
| ▲ | ButlerianJihad 3 days ago | parent [-] | | Do the pilots also decide whether to issue a parachute to the ejected passenger? | | |
| ▲ | stnikolauswagne 3 days ago | parent [-] | | Pretty sure ejection here is meant as shorthand for "Transfer the passenger to an entity on the ground to proceed from there" whether that entity is emergency medical services or law enforcement is secondary. |
|
|
|
|
|
|
| ▲ | SR2Z 3 days ago | parent | prev | next [-] |
| Yeah, I think it's been technically possible to automate jetliners for a while now, but when a metal tube with hundreds of people in it develops a technical fault while moving 500+ mph there's no substitute for a pilot. |
|
| ▲ | ekianjo 3 days ago | parent | prev | next [-] |
| > We just don't trust them enough to not have human pilots Never mind that most crashes are caused by humans, and very rarely by technical issues running amok. |
| |
▲ | stnikolauswagne 3 days ago | parent | next [-] | | >never mind that most crashes are caused by humans, very rarely by technical issues going amok Because humans are the fallback for all the scenarios that the tech cannot reliably cover. And my intuition says that the tech around planes is so heavily audited that only things that work with 99.999...% reliability will be left to tech. | |
▲ | Mawr 3 days ago | parent | prev | next [-] | | That's so incredibly reductive that I'd go ahead and call it plain wrong. "Caused by a human" is the lowest-tier, first-instinct analysis of any accident, and as such, unless proven otherwise, can be discarded out of hand. It comes down to this: if a human mistake is capable of causing an accident, your system is badly designed, because it assumes a part of the system known to be unreliable (a human) is always reliable. The whole trick is designing systems that are safe despite humans being in the loop. Then you get to benefit from the advantages humans bring over machines without suffering the downsides. | |
▲ | reeredfdfdf 3 days ago | parent | prev | next [-] | | Still, those technical issues do happen, and in those situations it's good to have a human pilot in control. See for example Qantas Flight 72: faulty air data made the flight computers think the aircraft was stalling, and they sent the plane into a dive. It could have ended very badly without human supervision. | | |
▲ | ekianjo 2 days ago | parent [-] | | And then you have Air France Rio-Paris, where the pitot sensors iced over, leading to a disconnection of the autopilot, and the pilots did everything they could to crash the plane by themselves, while it was otherwise fully operational. |
| |
| ▲ | 3 days ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | zenmac 3 days ago | parent | prev | next [-] |
| It would be interesting to see if Claude can land and take off. Don't think the autopilot can do that yet. |
| |
| ▲ | delta_p_delta_x 3 days ago | parent | next [-] | | > Don't think the autopilot can do that yet. It absolutely can; it's called autoland[1]. In really bad visibility, pilots simply can't see the runway until too late, and most aerodromes which expect these conditions have some sort of autoland system installed. The most advanced ones will control every aspect of the plane from top-of-descent (TOD), flaps and throttle configuration, long and short final, gear down, flare, reverse thrust, and roll-out, all the way to a full stop on the runway. Zero pilot input needed. And most of this was already available in the late 1970s. We have absolutely no need for LLM-based AI in aviation; traditional automation techniques have proven extremely powerful given how restricted the human domain of aviation already is. [1]: https://en.wikipedia.org/wiki/Autoland | |
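The appeal of autoland as described above is precisely that it is a fixed, deterministic sequence from top-of-descent to a full stop. A toy sketch of that progression as a phase machine (the phase names loosely paraphrase the stages listed in the comment; this is an illustration, not a real FMS interface):

```python
# Toy model of an autoland sequence as a strict phase progression.
# Phase names roughly follow the stages above; none of this is real avionics.
PHASES = [
    "top_of_descent", "approach", "final",
    "flare", "touchdown", "rollout", "stopped",
]

def next_phase(current: str) -> str:
    """Advance deterministically; phases cannot be skipped or reordered."""
    i = PHASES.index(current)
    return PHASES[min(i + 1, len(PHASES) - 1)]

# Walk the whole sequence from TOD to a full stop on the runway.
phase = "top_of_descent"
trace = [phase]
while phase != "stopped":
    phase = next_phase(phase)
    trace.append(phase)
print(" -> ".join(trace))
```

No learned component anywhere: every transition is enumerable and auditable, which is why this kind of automation was certifiable decades ago.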
▲ | LiamPowell 3 days ago | parent | prev [-] | | Autopilots can. Both on airliners and small planes, although only landing on the latter as far as I know, and it's only meant for emergencies. Airbus ATTOL is probably the most interesting of these in that it's visual rather than ILS (note that no commercial airliners are using this). |
|
|
| ▲ | aaron695 3 days ago | parent | prev [-] |
| [dead] |