  #661  
Old Posted Feb 19, 2024, 4:38 PM
WarrenC12 is offline
Registered User
 
Join Date: May 2007
Location: East OV!
Posts: 21,693
Quote:
Originally Posted by aberdeen5698 View Post
Is that true? I thought the issue was that the manual door release isn't in an obvious position and in normal use people don't need to activate it and therefore are often not aware of its existence. That's what I've read about the Models 3 and Y, anyway.

See: https://lucidowners.com/attachments/...-fun-jpg.2279/

Edit: Or maybe you're talking about the rear doors?
Model Ys now have manual releases for the rear doors too.
  #662  
Old Posted Feb 19, 2024, 5:11 PM
whatnext is offline
Registered User
 
Join Date: Feb 2009
Location: Vancouver
Posts: 22,284
Quote:
Originally Posted by Migrant_Coconut View Post
The whole reason why MCAS even exists in the first place? Because Boeing's engineers had accidentally redesigned the 737 Max to pitch up... so instead of fixing the weight balance like they should and stabilizing the plane... they had the software automatically pitch it down instead. No amount of training will 100% compensate for a design flaw.

Ditto AVs. More people drive than fly, and one lazy team of car engineers can (and have) hurt a lot more people at once than Boeing ever could. Did you know some Teslas don't have a manual door release?
At its heart, that was all done to save money by not developing a new airframe. Simplified: take the ancient 737, slap some new engines on it, and try to software your way out of the problems that creates.
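Roughly, that "fix" boils down to something like this (a toy sketch with made-up numbers and names, obviously not the actual flight-control code):

[CODE]
# Toy sketch of the idea only -- hypothetical thresholds, not Boeing's MCAS.
# A high angle-of-attack reading triggers automatic nose-down stabilizer trim.

AOA_LIMIT_DEG = 14.0   # hypothetical activation threshold
TRIM_STEP_DEG = 0.6    # hypothetical nose-down trim applied per activation

def mcas_like_step(aoa_deg: float, stab_trim_deg: float) -> float:
    """Return the new stabilizer trim after one pass, given a single AoA reading."""
    if aoa_deg > AOA_LIMIT_DEG:
        # Nose-down trim is commanded automatically; a sensor stuck reading
        # high will keep commanding it, pass after pass.
        return stab_trim_deg - TRIM_STEP_DEG
    return stab_trim_deg
[/CODE]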
  #663  
Old Posted Feb 19, 2024, 8:08 PM
Migrant_Coconut is offline
Registered User
 
Join Date: Oct 2015
Location: Kitsilano/Fairview
Posts: 8,396
Quote:
Originally Posted by WarrenC12 View Post
But when it makes a dumb mistake a "human would never make", people lose their ability to understand statistics.
This, but unironically. People aren't 100% rational.

Also see previous answers about liability and design flaws; a few idiot humans are just a few idiot humans, but several AV collisions can recall the entire fleet (and may or may not trigger class-action lawsuits), so the stakes are much higher.
  #664  
Old Posted Feb 19, 2024, 9:48 PM
cganuelas1995 is offline
Registered User
 
Join Date: Jan 2017
Posts: 1,276
Felt like I'd add to this thread by saying that I saw an Uber pick up a passenger from the middle of a residential intersection.

That is all.
  #665  
Old Posted Feb 19, 2024, 10:48 PM
aberdeen5698 is online now
Registered User
 
Join Date: May 2010
Posts: 4,435
Quote:
Originally Posted by Migrant_Coconut View Post
This, but unironically. People aren't 100% rational.
People are barely 10% rational. 😲

There are a lot of studies which suggest our decisions are made unconsciously and the "thinking" part of our brains simply rationalizes those decisions after the fact.
  #666  
Old Posted Feb 19, 2024, 11:30 PM
Spr0ckets is offline
Registered User
 
Join Date: Oct 2015
Posts: 1,430
Quote:
Originally Posted by aberdeen5698 View Post
I take strong exception to this. The MCAS software that caused the complete loss of two 737 MAX aircraft and all aboard had IMHO two major flaws that were not the fault of the hapless pilots who fell victim to it.

The first is that the software inexplicably relied on the inputs of only one of two redundant angle of attack sensors. If the active sensor suffered a failure that caused erroneous readings, the software would wrest elevator control from the pilots even though the other sensor was still providing valid data. This resulted in a single point of failure that could doom the aircraft. It's a patently absurd design decision to have redundant sensors upon which the safety of the aircraft relies and then to completely fail to check to see if those sensors agree with each other or not.

The second is that, in Boeing's eagerness to certify a new version of the 737 without having to update the aircraft's type certification, which would have required costly retraining of pilots, the very existence of the newly-introduced MCAS software and its behaviour was hidden from pilots. They literally were unaware of its presence and what it could do. Pilots had no training for scenarios where faulty angle of attack indications could cause MCAS to pitch the aircraft's nose down using a force stronger than they were able to counteract by pulling back on the control stick. Post-accident investigations have concluded that pilots would likely not have had enough time to understand what was happening to the aircraft and recover from it before it was too late.

There's a good reason that the 737 MAX was grounded for the better part of two years - it has nothing to do with pilot error and everything to do with the design and implementation of the aircraft's flight control system.
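For what it's worth, the cross-check described in the quote would have been almost trivial to write down - something like this (a made-up sketch, not the real MCAS code or Boeing's eventual fix):

[CODE]
# Made-up illustration of a two-sensor sanity check; threshold is hypothetical.

DISAGREE_LIMIT_DEG = 5.5   # hypothetical allowed disagreement between sensors

def aoa_for_automation(aoa_left_deg: float, aoa_right_deg: float):
    """Return an AoA value the automation may act on, or None to stay disengaged."""
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_LIMIT_DEG:
        # The redundant sensors disagree, so one of them is wrong:
        # refuse to command any automatic nose-down trim.
        return None
    # Sensors agree; act on the lower (more conservative) reading.
    return min(aoa_left_deg, aoa_right_deg)
[/CODE]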
Sooo,...

STILL human error, right?

Whether it was the lack of proper training or instruction on the new software, the failure to inform the pilots of the existence of said new system, or the (poor) design and implementation of said flight system (both of which were presumably done by humans).

You're just taking exception as to the point at which the human error occurred.

The point remains that the software doesn't act sentiently: it doesn't do anything it wasn't programmed to do, and it doesn't fail in any way its programming doesn't allow.

Maybe the pilots in those particular cases weren't directly at fault, but the failure or inability to rectify what was going wrong in the moment still lies with their fellow Homo sapiens, not with non-sentient lines of code.

And at the end of the day, we STILL have the overriding fact, which even you didn't dispute: ever since the advent of automated flight control systems, the overwhelming majority of crashes have been the result of pilot (or, if you prefer, "human") error rather than of the systems themselves.
  #667  
Old Posted Feb 20, 2024, 12:09 AM
Migrant_Coconut is offline
Registered User
 
Join Date: Oct 2015
Location: Kitsilano/Fairview
Posts: 8,396
... Who do you think programs the systems?
  #668  
Old Posted Feb 20, 2024, 12:15 AM
VancouverOfTheFuture is online now
Registered User
 
Join Date: Apr 2014
Posts: 3,281
Quote:
Originally Posted by aberdeen5698 View Post
Edit: Or maybe you're talking about the rear doors?
Well, that seems dangerous... I wasn't aware of that. Eek.

Quote:
Originally Posted by Migrant_Coconut View Post
And don't get me started on Ford's plan to make an AV which can repo itself...
That's also eek. It reminds me of one of the most infuriating parts of this new tech age: subscriptions. Yes, make me subscribe to use heated seats. That stuff should be illegal.

Quote:
Originally Posted by Spr0ckets View Post
Sooo,...

STILL human error, right?
Isn't the software for self-driving made by humans???
  #669  
Old Posted Feb 20, 2024, 12:37 AM
Spr0ckets is offline
Registered User
 
Join Date: Oct 2015
Posts: 1,430
Quote:
Originally Posted by Migrant_Coconut View Post
... Who do you think programs the systems?
Well, that's sort of my point: people act as if these automated systems - be they the autopilots of planes or of self-driving cars - are less reliable and more dangerous than human pilots and drivers, despite statistics showing the opposite, that humans in control are more error-prone and more given to accidents or crashes. In reality they would likely be no better or worse, because at the end of the day they are created and conceived by human beings, with all their inherent flaws and imperfections.

Indeed, the reality is that the software is only as good as the person who programmed it and the system it was implemented in.

But when they are programmed correctly and implemented as they should be, with proper training and instruction for any humans who oversee them, they demonstrate better safety records than humans on their own.

Perhaps we're more comfortable in the event of crashes and errors if it's a human being behind the wheel or in the cockpit - maybe because we can then directly hold them more accountable than a room full of faceless programmers, each only responsible for a part and none for the whole.

I liken it a bit to the fear that most people have over flying despite it statistically being the safest mode of travel.

It's more a psychological block that we as human beings have - the perception rather than the reality of what actually is - whether it's the fear of flying generally, the use of automated systems in flying, or now self-driving cars.