
First Tesla Autopilot Death

Posted: Thu Jun 30, 2016 10:45 pm UTC
by KnightExemplar
http://money.cnn.com/2016/06/30/technol ... index.html

The crash occurred on May 7 in Williston, Florida, when a tractor-trailer made a left turn in front of the Tesla at an intersection, according to the National Highway Traffic Safety Administration.

Tesla (TSLA) said in a blog post that its autopilot system did not recognize the white side of the tractor trailer against a brightly lit sky, so the brake wasn't activated. It also noted that this is the first known fatality in over 130 million miles when autopilot was activated.


The "autopilot" system of Tesla literally did not see the white of a tractor trailer. The lower-cameras of the Tesla S did not see the tractor trailer because it was too high off the ground. The driver didn't notice the trailer either. But that's the problem with these semi-autonomous driving rigs: if an edge case comes up that it wasn't designed for, the driver is still bearing the responsibility.

Tesla's official response is here.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.


----

This isn't the first time that Tesla's sensor setup demonstrated a weakness to trailers with high clearance.

Re: First Tesla Autopilot Death

Posted: Thu Jun 30, 2016 11:06 pm UTC
by sardia
I guess I'll wait for the second generation autonomous car?

Re: First Tesla Autopilot Death

Posted: Thu Jun 30, 2016 11:39 pm UTC
by Mutex
Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?

Tesla seem to be bolting automation on as an afterthought, I thought it was meant to be basically cruise control on their cars anyway. And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.

Re: First Tesla Autopilot Death

Posted: Thu Jun 30, 2016 11:59 pm UTC
by ucim
Mutex wrote:And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.
No, not by itself it doesn't. You'd need to know that the driver was paying as much attention as xe would have if xe were driving unaided.

What it suggests to me is that engaging the autopilot is correlated with disengaging the meat processor.

If it turned out that a driver paying full attention to the road would have also missed the tractor trailer crossing in front of xim, then yes, I'd give the Tesla a pass. A re-enactment would be enlightening. But I suspect that a human paying attention would see it. And in the case where xe would not have, driving would be contraindicated in the first place.

Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?
Lasers are cool, but they aren't magic. Google cars use radar though, and that should be unaffected by visible light.

Jose

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 12:00 am UTC
by sardia
Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?

Tesla seem to be bolting automation on as an afterthought, I thought it was meant to be basically cruise control on their cars anyway. And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.

The driver in an autonomous car can't be expected to take control. It makes the situation more dangerous because you're forcing an unaware person to take emergency control, which just increases the chance of an accident. The only emergency control a human should have is an emergency stop button.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 12:28 am UTC
by Diadem
It's not clear from the newspaper, but it sounds like this crash is actually the tractor trailer's fault, right? If you make a left turn, other traffic has right of way. Although of course you want your autonomous car to still avoid a collision in such a scenario.

ucim wrote:
Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?
Lasers are cool, but they aren't magic. Google cars use radar though, and that should be unaffected by visible light.

Google cars use radar, lidar (radar via laser) and normal cameras. Both radar and lidar would most likely have spotted the trailer in this situation.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 1:11 am UTC
by morriswalters
sardia wrote:
Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?

Tesla seem to be bolting automation on as an afterthought, I thought it was meant to be basically cruise control on their cars anyway. And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.

The driver in an autonomous car can't be expected to take control. It makes the situation more dangerous because you're forcing an unaware person to take emergency control, which just increases the chance of an accident. The only emergency control a human should have is an emergency stop button.
Tesla isn't fully autonomous, is it? The driver is supposed to stay alert. And this is Google's reason for building cars with no steering wheels: their position seems to be that either the car is fully autonomous or the driver is in control.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 1:22 am UTC
by Soupspoon
Reminds me of the old tale (maybe I could check the likes of Snopes for it, but I'm gonna blunder on from memory1) of the new owner of a Winnebago or similar with the then relatively new feature of cruise control, the thing that keeps your speed constant so that you only need to hover over the brake in case of emergencies, with none of the strain of personally having to keep the accelerator/gas-pedal pressure just right for hours at a time. On a long, boring stretch of road he turned it on - then went into the back to fry some bacon, or somesuch.

Of course there was no lane-keeping tech involved, just the rather introverted speed-governor mechanism, and after an indeterminate amount of time his driverless vehicle edged itself off the road to, IIRC, disastrous results.


Without knowing much more about the accident than what's quoted in the OP and mentioned in another news source I saw earlier, I don't know what the nominal driver was doing, as ultimate master and commander of the vehicle, but obviously there's been a similar disconnect between expectation and reality. Driverless cars (to whatever extent) have always had to deal with the problem that, in practice, they share the roads with meatware piloting systems; transponder-based communication between vehicles (and/or static sensors) on a closed track could eliminate much of the complexity whilst raising safety considerably. But we're going for the complicated problem... And accidents between people happen all the time. It'd be nice to actuarialise miles driven per accident for human and machine guidance, but there's just too little data for the latter. (A Google car crashed into the side of a bus recently, I think; and some lane-holding car suffered a number of swerve events recently too, across a number of its fleet of vehicles, but that's still not much to go on for statistical certainties, I think.)

1 Of a story that may be as apocryphal as the "there's this guy who strapped a JATO to his car..." one.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:35 am UTC
by KnightExemplar
sardia wrote:I guess I'll wait for the second generation autonomous car?


This isn't even a first-generation autonomous car, despite Tesla's marketing and "autopilot" moniker. This is a collision avoidance system and cruise control with lane assist. The fact that Tesla let its marketing get ahead of the technology is part of the problem. When you read these articles, the article writer clearly believes that these cars are "autonomous". Example quote:

Tesla's autopilot system has been the most aggressive deployment of an autonomous driving system by an automaker.


Erm... kinda not really? BMW and Subaru have Level 2 driver-assist systems of their own (i.e. a camera that follows a lane on the highway sort of deal). It's definitely not a "stop paying attention to the road" deal, and no other carmaker markets these capabilities as "autonomous" except Tesla. In fact, the other carmakers call it "adaptive cruise control" explicitly so that no one gets the wrong idea.

Diadem wrote:It's not clear from the newspaper, but it sounds like this crash is actually the tractor trailer's fault, right? If you make a left turn, other traffic has right of way. Although of course you want your autonomous car to still avoid a collision in such a scenario.


It's not like Tesla's adaptive cruise control sees red lights and stops the car. For all we know, the car approached a stop sign or red light. More information about this intersection is needed before we reach any conclusion.

Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?

Tesla seem to be bolting automation on as an afterthought, I thought it was meant to be basically cruise control on their cars anyway. And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.


https://www.teslamotors.com/blog/dual-m ... -autopilot

The launch of Dual Motor Model S coincides with the introduction of a standard hardware package that will enable autopilot functionality. Every single Model S now rolling out of the factory includes a forward radar, 12 long range ultrasonic sensors positioned to sense 16 feet around the car in every direction at all speeds, a forward looking camera, and a high precision, digitally controlled electric assist braking system.


* The ultrasonic sensors are short-range. They won't see shit. They are more for parking assist than for actual driving.

* The radar was aimed too low. 18-wheelers are huge. The radar didn't see the truck at all. This seems to be a major problem in the current Tesla design.

* The camera was blinded by the sun.

* The driver wasn't paying attention.

EDIT: To give people an idea of why the radar would miss a tractor trailer:

Spoiler:
Image


That's why. Basically what happened in this case is that the Tesla T-boned the trailer, so unlike the picture above, there was no "lower guard". The trailer goes straight into the windshield, where there are no real protections for the driver.
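To make that failure mode concrete, here's a toy sketch of an "only brake if some sensor reports an obstacle" policy. This is not Tesla's actual code - the sensor ranges, the 1 m radar height cutoff, and the contrast threshold are all made-up numbers for illustration:

Code:
# Hypothetical, simplified sensor check. All thresholds are invented for
# illustration; they are not Tesla's real parameters.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float       # distance ahead of the car
    bottom_height_m: float  # height of the obstacle's lowest edge above the road
    contrast: float         # 0.0 = blends into the background, 1.0 = high contrast

def ultrasonic_sees(obs: Obstacle) -> bool:
    # Ultrasonic sensors are short-range parking aids (a few metres at best).
    return obs.distance_m < 5.0

def radar_sees(obs: Obstacle) -> bool:
    # A low-aimed forward radar mostly gets returns from objects near the road
    # surface; a trailer bed riding above the beam gives no usable echo.
    return obs.bottom_height_m < 1.0

def camera_sees(obs: Obstacle) -> bool:
    # A vision system needs some contrast between the object and the sky behind it.
    return obs.contrast > 0.2

def should_brake(obs: Obstacle) -> bool:
    # Naive fusion: brake only if at least one sensor reports the obstacle.
    return ultrasonic_sees(obs) or radar_sees(obs) or camera_sees(obs)

# White trailer side, bed ~1.2 m off the ground, against a bright sky, 50 m ahead:
trailer = Obstacle(distance_m=50.0, bottom_height_m=1.2, contrast=0.05)
print(should_brake(trailer))  # False - every sensor misses it, so no braking

The point isn't the numbers, it's the structure: if every sensor has a blind spot and the blind spots happen to line up, an OR over the sensors produces nothing.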

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:37 am UTC
by commodorejohn
"...but the theory is sound!"

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:49 am UTC
by sardia
According to the article, we need cars to be twice as safe as human drivers to reap safety benefits, though being merely as safe as human drivers still returns efficiency benefits.

Hopefully this will turn out like how airline crashes are handled, where new patches and procedures are issued after each crash. I thought radar scans a large field of view. Why is the radar limited to low targets?

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:59 am UTC
by KnightExemplar
sardia wrote:According to the article, we need cars to be twice as safe as human drivers to reap safety benefits, though being merely as safe as human drivers still returns efficiency benefits.


It's this awkward middle ground that technology has reached: enough that these technologies provide great convenience, but their failure cases are acute. Until we reach truly autonomous vehicles (type a destination into the GPS and go), these issues will continue.

Mercedes Benz has "Traffic Jam mode", where the car will automatically stop and go for you and stay within its lane. A huge number of high-end luxury cars now implement adaptive cruise control. The difference is that the hype around Tesla is getting ahead of the technology, and people are beginning to believe that this car can do things that it can't.

From my understanding, when demoing the car, Tesla sales reps tell you to take your hands off the steering wheel to demonstrate the autopilot. On second thought, maybe it's just the YouTube dumbasses who do this. I know that Mercedes Benz / BMW have sensors that force you to touch the wheel every 3 seconds to prove you're still watching, but Tesla apparently doesn't have that sort of thing.

I think automatic braking is a good feature. This is a feature you can "forget about". If it fails, the driver SHOULD have been braking anyway. If it works, well, that's a lot of money saved (and potentially a life). A lot of these "convenience" features, however, need to make it clear: the driver is the one at fault. And sales reps should NOT be teaching drivers to take their hands off the steering wheel.

Especially when this shit is still buggy as hell.

Or when the car doesn't see the curb and almost sends you right into it.

----------

Another note: the amount of spyware that Tesla installs on their systems allows for very large-scale data collection. In fact, we know that the "autopilot" was on because Tesla's computers say it was. Tesla knows the steering wheel position, the position of the pedals, and everything else going on in their cars.

This isn't just a car crash; this is a confirmed car crash with Tesla Autopilot on. The previous crashes that Tesla has reported were confirmed to be driver error.

How many car crashes have there been with Mercedes Benz "Traffic Jam Mode"? Well, no one really knows, because I doubt that Mercedes is actually tracking the configuration of all the cars they sell.

sardia wrote:Hopefully this will turn out like how airline crashes are handled, where new patches and procedures are issued after each crash. I thought radar scans a large field of view. Why is the radar limited to low targets?


Bad design, clearly.

EDIT: And also, airplanes are far more expensive projects than personal driving vehicles. Even at $100,000 per Tesla Model S, forcing every owner to pay, say, $10,000 every 2 or 3 years to upgrade the safety mechanisms of the car (i.e. add a second radar to scan for trucks and update all the software, in this case) is just not going to work. I mean, if Tesla can manage a software-only solution here, maybe it'd be interesting. But the fundamental problem of the current radar setup missing high-riding vehicles (like tractor trailers / 18-wheelers) does not give me much confidence.

Spending $1 million upgrading an Airbus A320 (which costs $90 million or so) to keep up with safety procedures makes sense, especially as these airplanes get so much more usage in terms of passenger-miles traveled.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 6:42 am UTC
by BattleMoose
This just reminds me of the general nature of engineering new technologies. The sad reality is that with just about any engineering system that involves high speeds or large amounts of energy, people die. Be it cars, trains, dams, aeroplanes, bridges, ships, submarines, boilers, nuclear reactors or elevators. It is from those deaths that we learn to make a technology safer. And considering how unsafe human driving is, I really hope we continue down the automation route.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:02 am UTC
by KnightExemplar
BattleMoose wrote:This just reminds me of the general nature of engineering new technologies. The sad reality is that with just about any engineering system that involves high speeds or large amounts of energy, people die. Be it cars, trains, dams, aeroplanes, bridges, ships, submarines, boilers, nuclear reactors or elevators. It is from those deaths that we learn to make a technology safer. And considering how unsafe human driving is, I really hope we continue down the automation route.


With one death in 130 million miles of Autopilot... Tesla is actually on track to match "unsafe human driving" (which is approximately 1 death every 100 million miles, 32% of which involved drunk drivers).

Furthermore, the Tesla is one of the best-equipped cars from a safety perspective. It's a 4,500 lb vehicle... it weighs as much as an F-150 TRUCK, for crying out loud, and has spacious crumple zones. The Tesla is "winning" a lot of accidents from a momentum perspective, so we SHOULD expect fatalities to be lower.

And also "autopilot" is immune to drunk driving, fatigue, and many other factors. To see it line up with current human standards is... a bit disappointing, isn't it?

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:09 am UTC
by commodorejohn
"Sure, it failed on basically all counts, including ones that are pretty basic oversights on the part of the designers...but at least it isn't more fatal than drunk drivers! Yet!"

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:39 am UTC
by BattleMoose
It has the potential to get much, much better. It's a future I look forward to. Well, unless your opinion prevails and prevents the technology from being developed. Now that would be a real tragedy.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 8:46 am UTC
by Zamfir
It's not all or nothing. Many companies are introducing the same technologies, but more carefully. They rely on more sensors, do more testing before roll-out, put in more restrictions, and perhaps most of all: they make fewer promises about the capabilities.

KE above mentions that only YouTubers take their hands off the wheel, not Tesla representatives. Anecdote ahead: when they introduced the Autopilot feature, some colleagues of mine were in the process of getting a Tesla. They all came back with the strong hint that legal nannying prevented Tesla from officially selling them a hands-off-the-wheel-read-your-newspaper feature, but wink-wink-nudge-nudge... Then the YouTube videos with empty driver seats came out, and Tesla started backtracking.

We shouldn't just give Tesla the benefit of the doubt. They took a technology that other companies were very careful about, and deliberately dropped that care to make themselves look more cutting-edge. Now, perhaps they were right about that. This one fatal accident might have been an outlier. The next few years might show that Tesla was reasonable and everyone else overly conservative. But I like some conservatism in safety technology.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 9:10 am UTC
by HES
On this side of the pond, HGVs have side guards fitted to prevent vehicles (especially motorcyclists) from passing underneath. Sounds like that would have prevented this incident, as the car would have detected the lower object.

Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky

This could just as easily have occurred with a fully human-driven vehicle. Zero deaths is an unreasonable expectation, as long as the death rate is significantly lower than with human drivers. There is, of course, plenty of room for improvement in this relatively new technology.

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S.

Uncontrolled right turns across dual carriageways (GB) / left turns across divided highways (US) are statistically the most dangerous manoeuvre, so there's also a highway design element to consider.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 9:19 am UTC
by Nathan1852
In Tesla's blog they stated that the car makes 'frequent checks to ensure that the driver's hands remain on the wheel'.
They also state that Autopilot is disabled by default and that the driver has to acknowledge that the feature is still in a public beta.

To me it seems more like the driver's fault (since he should still have been watching the road) than the car's.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 10:28 am UTC
by Neil_Boekend
commodorejohn wrote:"Sure, it failed on basically all counts, including ones that are pretty basic oversights on the part of the designers...but at least it isn't more fatal than drunk drivers! Yet!"

That's not what I read from the data (source: NHTSA). Statistical relevance of a small sample size aside, it is better than human driving (the US average). That was 1.08 deaths per 100 Mmiles (which works out to 1 per 93 Mmiles) in 2014, while the Teslas have driven 130 Mmiles on Autopilot with one death.
The number of miles driven with alcohol involved is unknown (it's not like people are going to report how many miles they drove drunk), and thus there is no statistic available for deaths per 100 Mmiles of drunk driving. But I bet that mileage per death is lower.
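Just to show the arithmetic behind those figures (the 1.08 is the 2014 NHTSA number above; the 130 Mmiles and single death are from Tesla's statement):

Code:
# Back-of-the-envelope comparison of the figures quoted above.
us_rate_2014 = 1.08              # deaths per 100 million vehicle miles (NHTSA, 2014)
miles_per_death_us = 100e6 / us_rate_2014
print(f"US average: one death per {miles_per_death_us / 1e6:.0f} million miles")   # ~93

autopilot_miles, autopilot_deaths = 130e6, 1   # Tesla's figures so far
autopilot_rate = autopilot_deaths / autopilot_miles * 100e6
print(f"Autopilot so far: {autopilot_rate:.2f} deaths per 100 million miles")      # ~0.77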

Nathan1852 wrote:In Tesla's blog they stated that the car makes 'frequent checks to ensure that the driver's hands remain on the wheel'.
They also state that Autopilot is disabled by default and that the driver has to acknowledge that the feature is still in a public beta.

To me it seems more like the driver's fault (since he should still have been watching the road) than the car's.

Technically yes. But if the car drives itself, it's difficult to stay focused. Since this tech has already surpassed the human average (if we ignore the statistics problem of a small sample size) I do not see this as a problem.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 1:55 pm UTC
by Zamfir
Neil_Boekend wrote:Since this tech has already surpassed the human average (if we ignore the statistics problem of a small sample size) I do not see this as a problem.

That's a low standard. The current level of car safety is not so comfortable that it can act as a sufficient target. It's a risk level where reasonably achievable improvements should be made, not a level where you can rest your efforts.

And Teslas are young, heavy, expensive cars. It wouldn't be good if they had merely average safety. For example, the sensor suite can be used for purely safety-enhancing actions, instead of the autopilot. If Teslas on autopilot are merely comparable in safety to the fleet average, then the autopilot has effectively eaten up all the safety advantages.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 2:20 pm UTC
by Neil_Boekend
Zamfir wrote:
Neil_Boekend wrote:Since this tech has already surpassed the human average (if we ignore the statistics problem of a small sample size) I do not see this as a problem.

I think that standard is too low. The current level of car safety is not so comfortable that it can act as a sufficient target. It's a risk level where reasonably achievable improvements should be made, not a level where you can rest your efforts.

Of course there is no reason to rest efforts in increasing the effectiveness of the system. I'm saying that it's wise to use it if you're having trouble paying attention due to exhaustion.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 2:30 pm UTC
by Dauric
Neil_Boekend wrote:
Zamfir wrote:
Neil_Boekend wrote:Since this tech has already surpassed the human average (if we ignore the statistics problem of a small sample size) I do not see this as a problem.

I think that standard is too low. The current level of car safety is not so comfortable that it can act as a sufficient target. It's a risk level where reasonably achievable improvements should be made, not a level where you can rest your efforts.

Of course there is no reason to rest efforts in increasing the effectiveness of the system. I'm saying that it's wise to use it if you're having trouble paying attention due to exhaustion.


Ehhh... The problem with "safety measures" in general is that the presence of such measures makes people careless; after all, there's a safety measure to protect them in case they fuck up. To wit: people lean over safety rails.

A measure that says "I improve safety if you're too tired to drive" will ironically encourage people to drive tired instead of pulling over and catching a nap (which is always safer than continuing to drive, safety measures or not). And such features will prove most attractive to people who work and drive to the extent that they are frequently driving while tired, the safety equipment giving them the delusion that they are "safe" when doing so.

I think the failures in this incident are evidence enough that we are not -yet- at the point where we can turn over complete autonomous control to the vehicle; they're still a guard rail that requires the human beings in the situation to not lean over it, and we still need to tell people to not lean on the railings (and work out what we do when they invariably do lean on those rails).

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 2:34 pm UTC
by BattleMoose
Dauric wrote:Ehhh... The problem with "safety measures" in general is that the presence of such measures makes people careless; after all, there's a safety measure to protect them in case they fuck up. To wit: people lean over safety rails.


While this is certainly a true aspect of human behavior, we have still continually managed to make air travel safer. As well as sea travel (there was a time when sea travel was pretty darn dangerous). It's a novel technology; there are going to be bumps, but it will mature. Fewer bumps would be better, but there will be bumps.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 2:41 pm UTC
by Tyndmyr
Eh, if it's as safe as a person, fair enough.

If I hired a chauffeur to drive for me, I wouldn't maintain alertness to grab the wheel if he made a mistake. Same, same. Sure, more improvement is great, but so far, it's not really all that scary. At least, not compared to routine traffic.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 2:53 pm UTC
by cphite
Mutex wrote:Don't the Google cars use lasers? So they shouldn't be affected by the collision object being as bright or the same colour as the background?

Tesla seem to be bolting automation on as an afterthought, I thought it was meant to be basically cruise control on their cars anyway. And the fact that the driver didn't see it as well supports the automobile advocates' argument that automated cars will at the very least outperform humans.


It doesn't support that argument at all. It shows that these systems aren't perfect, and that the assumption that they're perfect may in itself be dangerous. The driver didn't notice a tractor trailer most likely because he assumed he didn't need to pay attention.

My wife recently bought an SUV that, while it isn't autonomous, has a whole lot of driver assist features. For example, it has adaptive cruise control - which is great - you basically set your speed, and it'll stay at that speed but will adapt to traffic. If traffic slows down or stops, it slows down or stops. When traffic speeds up again, it speeds up again to your set speed. It has automatic braking in the case of an obstacle, lane departure, blind spot detection, etc, etc.

And these things are great... on a long trip, your job as driver is basically to steer. Set the speed to 65 or whatever and just go... usually.
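For what it's worth, the core decision an adaptive cruise controller makes each cycle is easy to sketch. This is a generic illustration, not any manufacturer's actual controller, and the two-second gap and the speeds are just example numbers:

Code:
# Generic adaptive-cruise-control sketch (illustrative only).
def acc_target_speed(set_speed, lead_speed=None, lead_gap_m=None, desired_gap_s=2.0):
    """Return the speed (m/s) to aim for in this control cycle."""
    if lead_speed is None or lead_gap_m is None:
        # Nothing detected ahead (or the sensors missed it): cruise at the set speed.
        return set_speed
    desired_gap_m = desired_gap_s * max(lead_speed, 1.0)   # e.g. a two-second gap
    if lead_gap_m < desired_gap_m:
        # Too close: slow toward the lead vehicle's speed, more so the shorter the gap.
        return min(set_speed, lead_speed * lead_gap_m / desired_gap_m)
    # Enough space: hold the set speed (never exceed it).
    return set_speed

print(acc_target_speed(set_speed=29.0))                                    # 29.0 (~65 mph, road clear)
print(acc_target_speed(set_speed=29.0, lead_speed=20.0, lead_gap_m=25.0))  # 12.5, opening the gap

The "nothing detected ahead" branch is exactly where a missed detection - rain, snow, dust, or a white trailer against a bright sky - quietly turns into "keep cruising".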

However, heavy rain can mess with the detection... as can snow, or dust buildup from driving on unpaved roads, for example. Or bright sunlight. We've seen the system thrown off by all of these things. The Tesla was thrown off by a truck being too high off the ground and bright white - two separate sensors failed to detect something as large as a truck.

And even when conditions are perfect, these things fail in ways that humans usually do not... lane departure detection can be thrown off if the lines on the road aren't right, or if they're obscured, or if they're missing.

You can throw all of the sensors and computing power you want at the act of driving - the problem is that computers do not have judgement, the ability to make a reasoned decision. They depend on programmers trying to account for a multitude of specific inputs, and there are simply too many of those in the real world to account for.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:10 pm UTC
by Zamfir
Neil_Boekend wrote:I'm saying that it's wise to use it if you're having trouble paying attention due to exhaustion.

I don't think there's enough evidence for that conclusion. Look at it this way: real-life use of this system is a mixture. On one end, there are actively engaged drivers for whom the system works as a safety backstop. For this situation, the risks are likely to be very low.

On the other end, there are people who use the system to ignore traffic altogether and do something else. Or, like Tyndmyr says, assume that the system knows best and don't interfere. It's quite possible that this part of the attention spectrum is very dangerous. Like drunk driving, perhaps. We hardly have enough data to judge the average safety of the mixture, let alone to tease out more subtle effects.

So, if you're tired, you might well drift off towards this side of the spectrum. Perhaps literally fall asleep. People are not good at staying focused under such circumstances. Average levels of safety won't tell you whether this is safe, because those average vehicle miles are mostly driven by people who were paying more attention.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:22 pm UTC
by HES
Average isn't good enough - but Neil never said it was average, he said it has surpassed the average.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:35 pm UTC
by Tyndmyr
Zamfir's right - we are dealing with small numbers here. In deaths, literally one. That means a significant possibility for error, particularly when considering varied driver behavior, which we don't really have detailed data on for this.
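To put a rough number on how little one data point tells us - a quick Poisson interval, which treats fatalities as rare independent events (itself a simplification):

Code:
# Rough 95% interval for the fatality rate given 1 death in 130M Autopilot miles.
from scipy.stats import chi2

deaths, miles = 1, 130e6
lo = chi2.ppf(0.025, 2 * deaths) / 2          # exact Poisson lower bound on expected deaths
hi = chi2.ppf(0.975, 2 * deaths + 2) / 2      # exact Poisson upper bound

print(f"{lo / miles * 100e6:.2f} to {hi / miles * 100e6:.2f} deaths per 100M miles")
# -> roughly 0.02 to 4.3, a band that easily contains the ~1.08 human average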

Granted, I would happily use it at present. But I'll also get in a car with someone I haven't established is an average or above-average driver. It's an acknowledgement of potential risk, not a denial of its existence.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:41 pm UTC
by KnightExemplar
HES wrote:
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky

This could just as easily have occurred with a fully human-driven vehicle. Zero deaths is an unreasonable expectation, as long as the death rate is significantly lower than with human drivers. There is, of course, plenty of room for improvement in this relatively new technology.


In my decade+-long run of driving, I have NEVER failed to notice a fucking 18-wheeler because it was white.

Spoiler:
Image


Let alone a tractor trailer right in front of me. Have you seen how those things accelerate? Those things are SLOW.

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S.

Uncontrolled right turns across dual carriageways (GB) / left turns across divided highways (US) are statistically the most dangerous manoeuvre, so there's also a highway design element to consider.


It's a tractor trailer, and it moved far enough that the Tesla hit it on its side.

It's a fucking 18-wheeler that takes maybe 30+ seconds to make a left turn. This excuse is fucking bullshit. Have you driven around tractor trailers before? They don't exactly zip into the middle of a road. They don't make hairpin turns. Hell, they have issues making a left turn even when using two lanes, and may hit the sidewalk due to their massive turning radius.

Yes, inattentive drivers have crashed into 18-wheelers before, and those collisions are extremely dangerous due to the weight of these things. (You might as well crash into a brick wall, really.) But these things are literally the biggest vehicles legally allowed on a road: it's utterly bullshit to claim "I didn't see it" if you hit one.

I mean, sure, maybe it's a good excuse for a Tesla camera. But the driver? The proof is in the pudding: the driver didn't see the trailer because he wasn't driving.

Nathan1852 wrote:In Tesla's blog they stated that the car makes 'frequent checks to ensure that the driver's hands remain on the wheel'.
They also state that Autopilot is disabled by default and that the driver has to acknowledge that the feature is still in a public beta.

To me it seems more like the driver's fault (since he should still have been watching the road) than the car's.


It is surely the driver's fault. Don't get me wrong.

But Tesla is covering their ass. This isn't "frequent". That's like... touching the wheel once every MINUTE - gaps of over 60 seconds.
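For illustration, the mechanism being argued about is basically a timeout on the last detected steering-wheel torque; the whole argument is about what that timeout is. The values below are assumptions, not Tesla's or anyone else's real numbers:

Code:
import time

class HandsOnWheelMonitor:
    # Hypothetical hands-on-wheel nag. The only interesting design choice is the timeout.
    def __init__(self, timeout_s=60.0):   # 60 s = "once every minute"; stricter systems use a few seconds
        self.timeout_s = timeout_s
        self.last_touch = time.monotonic()

    def report_torque(self, torque_nm):
        # Any measurable torque on the wheel counts as "hands on".
        if abs(torque_nm) > 0.1:
            self.last_touch = time.monotonic()

    def needs_warning(self):
        return time.monotonic() - self.last_touch > self.timeout_s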

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 3:51 pm UTC
by Soupspoon
Dauric wrote:they're still a guard rail that requires the human beings in the situation to not lean over it, and we still need to tell people to not lean on the railings (and work out what we do when they invariably do lean on those rails).

Relevant? :P

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 4:23 pm UTC
by HES
KnightExemplar wrote:In my decade+-long run of driving, I have NEVER failed to notice a fucking 18-wheeler because it was white.

That you have never experienced a very specific set of circumstances is hardly surprising. Doesn't mean they don't happen. I guess you've never been snow- or rain-blinded either. Good for you.

Spoiler:
Image

Gee, I retract my entire statement because I'm a clueless idiot that doesn't know what a fucking truck looks like.

It's a fucking 18-wheeler that takes maybe 30+ seconds to make a left turn. This excuse is fucking bullshit.

Nobody is claiming the truck jumped out in front of the car. The fact that it is slow moving makes it even harder to spot the thing that is already, under the specific circumstances, hard to spot.

Sure, a more attentive driver would have spotted it before impact, but not necessarily soon enough. And maybe the story is bullshit - we weren't there - but it isn't nearly as far-fetched as you insinuate.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 4:40 pm UTC
by KnightExemplar
HES wrote:
KnightExemplar wrote:In my decade+-long run of driving, I have NEVER failed to notice a fucking 18-wheeler because it was white.

That you have never experienced a very specific set of circumstances is hardly surprising. Doesn't mean they don't happen. I guess you've never been snow- or rain-blinded either. Good for you.


If I can't see, I generally slow down.

Look, man: when the sun gets in your eyes, it's from a particular direction. You can always look at different angles. The worst-case scenario is probably going uphill directly into the sun.

Nobody is claiming the truck jumped out in front of the car. The fact that it is slow moving makes it even harder to spot the thing that is already, under the specific circumstances, hard to spot.


You're gonna have to do a lot of work to convince me that an 18-wheeler tractor trailer is "hard to spot" with human eyes. The only situation where it would be is if the human were completely blinded by the sun, at which point you'd want to slow down a bit.

Bikes in blind spots, pedestrians on sidewalks, or cyclists are the "hard things to spot" on the road. Not literally the largest road-legal vehicle.

Sure, a more attentive driver would have spotted it before impact, but not necessarily soon enough. And maybe the story is bullshit - we weren't there - but it isn't nearly as far-fetched as you insinuate.


For better or for worse, this driver regularly posted his adventures with the Tesla Model S Autopilot on YouTube. Considering other situations that he failed to notice, I'm thinking he wasn't paying attention at all.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 4:46 pm UTC
by HES
KnightExemplar wrote:I mean, sure, maybe it's a good excuse for a Tesla camera. But the driver?

I missed this in my first parsing, and I don't disagree.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 4:57 pm UTC
by commodorejohn
HES wrote:
KnightExemplar wrote:I mean, sure, maybe it's a good excuse for a Tesla camera. But the driver?

I missed this in my first parsing, and I don't disagree.

It's almost like delegating responsibility to a computer means that the person nominally responsible isn't likely to be paying attention if the computer runs into problems, or something.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:26 pm UTC
by Diadem
KnightExemplar wrote:In my decade+-long run of driving, I have NEVER failed to notice a fucking 18-wheeler because it was white.

Where did you get that it was an 18-wheeler? Tractor-trailers come in all shapes and sizes. I paid some attention during my commute today and saw them with 10, 12, 16, 18, 22 and 34 wheels.

Let alone a tractor trailer right in front of me. Have you seen how those things accelerate? Those things are SLOW.

Funny. All the tractor-trailers I meet on the road seem to be driving 100 km/h. Not that slow. Sure, they accelerate more slowly, but it still won't take them more than a few seconds to cross a junction.

It's a fucking 18-wheeler that takes maybe 30+ seconds to make a left turn.

I have to ask. Have you driven a car before? You don't sound like someone with experience.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:44 pm UTC
by Zohar
Also I'd like to mention you wouldn't notice if you missed a white 18-wheeler. Like, it might happen every week for all you know!

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:49 pm UTC
by KnightExemplar
Zohar wrote:Also I'd like to mention you wouldn't notice if you missed a white 18-wheeler. Like, it might happen every week for all you know!


Point.

Diadem wrote:
It's a fucking 18-wheeler that takes maybe 30+ seconds to make a left turn.

I have to ask. Have you driven a car before? You don't sound like someone with experience.


https://youtu.be/Gl01w6vl9KI?t=57s

Okay, so it takes about 20 seconds to make a left turn while taking up two lanes. My point is that the truck is going to be in the road for a long time. It's an event that you'll see coming.

Big rigs have a very large turning radius and take quite some time to turn.

Furthermore, we now have reports that the driver was watching Harry Potter during this event. So we can be fairly sure he wasn't looking at the road.

Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."

The movie "was still playing when he died and snapped a telephone pole a quarter mile down the road," Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he didn't see the movie, only heard it.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 7:55 pm UTC
by Zohar
I'm not really surprised, to be honest. I don't know how long the driver of the Tesla had been playing test driver - I'm sure at the beginning they paid very close attention to the road, but after a while it's completely reasonable to expect someone's attention to waver, or to be lost entirely.

Re: First Tesla Autopilot Death

Posted: Fri Jul 01, 2016 8:22 pm UTC
by Zamfir
Diadem wrote:Where did you get that it was an 18-wheeler? Tractor-trailers come in all shapes and sizes. I paid some attention during my commute today and saw them with 10, 12, 16, 18, 22 and 34 wheels.

An 18-wheeler is a specific layout (2 steered wheels and 2 load-carrying axles on the truck, 2 axles on the trailer) that's so common in the US that you can assume it by default, but it's exceedingly rare around here. Quite possibly, you haven't seen a single "18-wheeler" in the American sense today.

I don't know why, but European semis put much more weight on the trailer's wheels. By the time they need a second load-carrying axle on the truck, there are almost always 3 axles already on the trailer. It's somehow due to length limits.