First Tesla Autopilot Death


User avatar
sardia
Posts: 5726
Joined: Sat Apr 03, 2010 3:39 am UTC

Re: First Tesla Autopilot Death

Postby sardia » Wed Jul 06, 2016 3:43 pm UTC

The key point is that humans are super special snowflakes. Also this
[img]
http://www.smbc-comics.com/comic/doom
[/img]
But people don't like it when we call them out on it. It's better to keep quietly researching and reap the benefits.

User avatar
ucim
Posts: 5432
Joined: Fri Sep 28, 2012 3:23 pm UTC
Location: The One True Thread

Re: First Tesla Autopilot Death

Postby ucim » Wed Jul 06, 2016 3:50 pm UTC

BattleMoose wrote:Especially parents buying cars for their children, such an easy sell for an autonomous vehicle!
Teens want to drive, not be driven by a mechanical chauffeur. Why take driver's ed in the first place if the machines will do it for you?

Tyndmyr wrote:Sure you can. You patch the code.
No you don't. Somebody else does... if they are pressed to do so.

Tyndmyr wrote:Humans tend to not be hovering in the air, where the apparent sensory weakness is. There's no particular reason to believe the weakness at detecting tractor trailers would apply to pedestrians.
The flaw isn't that it couldn't detect "things hovering in the air". The flaw is that the designers didn't know that this was a problem. It was an unknown unknown, despite the fact that we know it now.

Neil_Boekend wrote:legally and on some roads overtaking is not allowed. On those roads tractors are usually exempt.
Bicycles would not be exempt. In fact, they are specifically included (at least in California). If you're stuck behind a bike on a two-lane undivided highway, you're stuck at 10mph for the duration. Passing is prohibited (unless you can legally give it three feet clearance, which in practice means waiting for a bona fide passing lane).

Neil_Boekend wrote:What's the effective difference between a human not paying attention and missing something and a sensor array missing something and not knowing it missed something?
Humans can learn from their mistakes. Machines cannot upgrade their sensors. Also humans have a stake in their own survival. Machines don't care whether or not they die.

Jose
Order of the Sillies, Honoris Causam - bestowed by charlie_grumbles on NP 859 * OTTscar winner: Wordsmith - bestowed by yappobiscuts and the OTT on NP 1832 * Ecclesiastical Calendar of the Order of the Holy Contradiction * Please help addams if you can. She needs all of us.

KnightExemplar
Posts: 5489
Joined: Sun Dec 26, 2010 1:58 pm UTC

Re: First Tesla Autopilot Death

Postby KnightExemplar » Wed Jul 06, 2016 3:53 pm UTC

Neil_Boekend wrote:
KnightExemplar wrote:And this "Tesla Autopilot" is an optional, beta system that is only being used for highway driving, and is supposed to be used with an attentive driver ready to take over "at any time" (because it's still buggy as shit)
Spoiler:
No. This system is not even as good as the crappy humans we are. Maybe one day in the future we will have autonomous drivers that are better than us. But let's not drink the kool-aid and pretend that the day has come yet.

Yes. The failure mode of the Tesla sensor array is different from the failure mode of a human driver. That is not relevant. Human drivers often try to overtake a car while they can't see shit or misestimate speeds and distances. What is relevant is what your chances are of dying in a car in case of a human driving versus yourself driving.


Dude. This is a fact. If you don't want to believe me about this, then at the very least believe Elon Musk himself on this issue. Tesla's "autopilot" is NOT an autonomous system, and requires 100% attentive drivers.

https://www.teslamotors.com/blog/tragic-loss

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
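The behaviour described in that excerpt (periodic hands-on checks, escalating alerts, then a gradual slow-down) is essentially a watchdog loop. A rough illustrative sketch follows; this is not Tesla's actual implementation, and all names, interfaces and timings are invented:

[code]
import time

HANDS_OFF_WARN_S = 15   # hypothetical: alert after 15 s without hands detected
HANDS_OFF_SLOW_S = 30   # hypothetical: begin slowing after 30 s without hands detected

def hands_on_watchdog(sensors, alerts, control):
    """Toy escalation loop: warn, then gradually slow, until hands are detected again."""
    hands_off_since = None
    while True:
        if sensors.hands_on_wheel():
            hands_off_since = None
            alerts.clear()
        else:
            if hands_off_since is None:
                hands_off_since = time.monotonic()
            elapsed = time.monotonic() - hands_off_since
            if elapsed > HANDS_OFF_SLOW_S:
                control.reduce_speed(mph_per_check=2)   # gentle, gradual slow-down
            elif elapsed > HANDS_OFF_WARN_S:
                alerts.visual_and_audible("Always keep your hands on the wheel.")
        time.sleep(1.0)   # check roughly once per second
[/code]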


The failure cases of Tesla are well documented. This is NOT the first Tesla autopilot crash, this is simply the first fatal one.

I talked to Tesla support and they acknowledged that the current sensors/software do not see non-moving cars very well, or sometimes see them too late to allow full collision avoidance.


Autopilot doesn't see stopped cars very well? They better fix that problem! I'm pretty sure that stopped cars are on a lot of roads.

--------------

We aren't even at the point where we can be talking about autonomous drivers yet. Tesla's system is solidly a "Driver Assist" technology. Nothing more. In all cases, the driver should have their hands on the wheel, stay attentive, and keep looking out. In fact, based on the failure cases of Tesla's sensor system, it looks like you need to be on the lookout for Tesla mistaking a shadow for a lane or something, and then veering over the double-yellow line.

This technology is actually pretty shit. Not "the shit", but a load of crap.

Maybe when we have a REAL autonomous car, we can start talking about theoretically using those to drive us around. But when we only have "driver assist" technologies in beta-release full of bugs? Yeah, don't use them. That's too dangerous.
First Strike +1/+1 and Indestructible.

User avatar
HES
Posts: 4746
Joined: Fri May 10, 2013 7:13 pm UTC
Location: England

Re: First Tesla Autopilot Death

Postby HES » Wed Jul 06, 2016 4:02 pm UTC

ucim wrote:
Neil_Boekend wrote:legally and on some roads overtaking is not allowed. On those roads tractors are usually exempt.
Bicycles would not be exempt. In fact, they are specifically included (at least in California). If you're stuck behind a bike on a two-lane undivided highway, you're stuck at 10mph for the duration. Passing is prohibited (unless you can legally give it three feet clearance, which in practice means waiting for a bona fide passing lane).

That's a poorly thought out law and nothing else. The UK equivalent of the double yellow line (ours is white) specifically exempts things moving less than 10mph, and is used where specifically needed rather than along entire stretches of road.

How wide is a typical lane in the US anyway? 3 feet clearance is hardly generous.
Last edited by HES on Wed Jul 06, 2016 4:06 pm UTC, edited 1 time in total.
He/Him/His

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Wed Jul 06, 2016 4:06 pm UTC

ucim wrote:Humans can learn from their mistakes. Machines cannot upgrade their sensors. Also humans have a stake in their own survival. Machines don't care whether or not they die.
Dauric wrote:Computers don't learn from "OH SHIT!" moments as quickly as people do.
I doubt that people learn from them either. Narrowly avoiding an accident doesn't make you better; it simply means you came up on the right side of the coin toss. Practicing doing the right things gives you skills, and most drivers don't practice. Most people never learned how to pump the brakes in a panic stop; what made things better was automating the required behavior with anti-lock brakes. Cars became safer, not people. Tires are better, the shell protects you better, lighting of all types was improved (including that nifty extra brake light), and manufacturers got rid of things that killed people, like steel dashes. They added seat belts and airbags, and now backup cameras to help you avoid hitting people while backing up. You should be noticing a trend.
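What anti-lock brakes automate is, in essence, the "pumping": a fast wheel-slip control loop. A toy sketch of the idea, where the brake interface and the 0.2 slip threshold are made up for illustration:

[code]
def abs_control_step(wheel_speed_mps, vehicle_speed_mps, brake):
    """One pass of a (greatly simplified) anti-lock braking loop, run many times
    per second: release pressure when the wheel starts to lock, reapply once it
    grips again. The 0.2 slip threshold and the brake interface are made up."""
    slip = (vehicle_speed_mps - wheel_speed_mps) / max(vehicle_speed_mps, 0.1)
    if slip > 0.2:          # wheel turning much slower than the car: near lock-up
        brake.release_pressure()
    else:
        brake.apply_pressure()
[/code]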

@ucim
Sensors improve every year. I'm going to add a backup camera to my car.

commodorejohn
Posts: 935
Joined: Thu Dec 10, 2009 6:21 pm UTC
Location: Placerville, CA
Contact:

Re: First Tesla Autopilot Death

Postby commodorejohn » Wed Jul 06, 2016 4:22 pm UTC

HES wrote:
ucim wrote:
Neil_Boekend wrote:legally and on some roads overtaking is not allowed. On those roads tractors are usually exempt.
Bicycles would not be exempt. In fact, they are specifically included (at least in California). If you're stuck behind a bike on a two-lane undivided highway, you're stuck at 10mph for the duration. Passing is prohibited (unless you can legally give it three feet clearance, which in practice means waiting for a bona fide passing lane).

That's a poorly thought out law and nothing else. The UK equivalent of the double yellow line (ours is white) specifically exempts things moving less than 10mph, and is used where specifically needed rather than along entire stretches of road.

How wide is a typical lane in the US anyway? 3 feet clearance is hardly generous.

It's absolutely terribly thought out. Interstate lanes are twelve feet wide, which would be room enough for most cars to give bicyclists the required berth, but you never run into bicyclists on the interstate - and most highways and streets have a lot less wiggle room, particularly as you get out into more rural areas (where it's not even a given that the road will have a "shoulder" space outside the lanes.) So you all-too-frequently do find yourself stuck behind a bicyclist who has no safe space to move into so that you can pass him with the required berth, or even with less than that but still a safe distance. It's absolutely maddening, and I can't imagine it's much comfort to bicyclists either, having a car (or, not uncommonly, a line of cars) riding their ass waiting for them to just get the hell out of the way already.
"'Legacy code' often differs from its suggested alternative by actually working and scaling."
- Bjarne Stroustrup
www.commodorejohn.com - in case you were wondering, which you probably weren't.

User avatar
Neil_Boekend
Posts: 3215
Joined: Fri Mar 01, 2013 6:35 am UTC
Location: Yes.

Re: First Tesla Autopilot Death

Postby Neil_Boekend » Wed Jul 06, 2016 4:27 pm UTC

Dauric wrote:
Spoiler:
Neil_Boekend wrote:
Dauric wrote:
Tyndmyr wrote:
Tyndmyr wrote:People hit obvious obstacles all the time.
But that's because they aren't looking, not because they can't see. In the case of the Tesla, it's because it couldn't see, and didn't know it couldn't see.


....what's the difference?



When a device detects a fault it throws an error, the error can then instigate a warning to the driver, reduce speed, record the error, and/or make some other response to the error itself that changes the way the device is operating.

When an error isn't thrown by the system, then the device continues operating normally without responding to the fact that it isn't operating normally.

What's the effective difference between a human not paying attention and missing something and a sensor array missing something and not knowing it missed something?


Dauric wrote:First off: as noted elsewhere, if an individual human being misses something it's that individual that misses it; others may catch the circumstance and alert the person who missed the detail. When a machine misses something -all- of the same machines are going to miss the same circumstance.
Humans are unlikely to pay much attention to the road when someone else is driving. I certainly don't. It's much better for my blood pressure and my friendships, because some people have vastly different driving styles from mine, and so far I have always survived those trips.
Dauric wrote:Secondly: If a human being misses some detail, they miss it in that one instance and may learn from the experience to pay attention to that detail. If a machine misses something it will continue to miss that circumstance until the error is addressed (by upgrading software or hardware).
Humans don't learn from the "OH SHIT!" moments of others. Once the programmers have upgraded the software, all cars will be able to download the upgrade. I'd be surprised if this incident didn't spark an improvement project within Tesla Motors. Perhaps image recognition of the wheels of a truck, so it doesn't attempt to drive between them.
Dauric wrote:If the machine doesn't know it missed a detail then it didn't throw an error code for the developers to track down the fault, which makes those upgrades to stop missing a critical detail that much more difficult to narrow down and address with a functional fix.
Humans who don't pay attention miss a lot without knowing they missed it, and software can (and should) be written with that in mind too. If a car is not visible to the software from one angle, it is still likely to be visible from an angle that occurs later (assuming most angles are covered).
Dauric wrote:Computers don't learn from "OH SHIT!" moments as quickly as people do.
True, but humans grow lax and lose the benefit of learning from an "OH SHIT!" moment. And they only learn from their own.
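Dauric's distinction above, between a fault the system detects (and can respond to) and one it never notices, comes down to something like the following; a minimal sketch with entirely hypothetical sensor and vehicle interfaces:

[code]
class SensorFault(Exception):
    """Raised when a sensor can tell that its own reading is unusable."""

def read_obstacle_distance(sensor):
    reading = sensor.sample()                      # hypothetical sensor interface
    if reading.signal_to_noise < 2.0:              # hypothetical self-check
        raise SensorFault("radar return too weak to trust")
    return reading.distance_m

def control_step(sensor, vehicle, driver_alerts):
    try:
        distance = read_obstacle_distance(sensor)
    except SensorFault as fault:
        # Detected fault: the system knows it is blind and can degrade safely.
        driver_alerts.warn(str(fault))
        vehicle.reduce_speed()
        return
    # Silent miss: if the self-check above is absent or wrong, a bad reading
    # flows straight into the planner and the car carries on "normally".
    vehicle.plan_with(distance)
[/code]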




ucim wrote:
BattleMoose wrote:Especially parents buying cars for their children, such an easy sell for an autonomous vehicle!
Teens want to drive, not be driven by a mechanical chauffeur. Why take driver's ed in the first place if the machines will do it for you?
I didn't. I wanted the freedom provided by driving: to be able to go whenever I wanted and not have to ask someone to drive me. Presumably many teens also want to have a place where they can get intimate with their partner without their parent in the front seat. This is not damaged by a self-driving car.
Tyndmyr wrote:Sure you can. You patch the code.
No you don't. Somebody else does... if they are pressed to do so.
A true self-driving car would place the legal liability on the manufacturer. There: pressed to do so.
Neil_Boekend wrote:legally and on some roads overtaking is not allowed. On those roads tractors are usually exempt.
Bicycles would not be exempt. In fact, they are specifically included (at least in California). If you're stuck behind a bike on a two-lane undivided highway, you're stuck at 10mph for the duration. Passing is prohibited (unless you can legally give it three feet clearance, which in practice means waiting for a bona fide passing lane).
I meant that as an example of how it could be solved. I didn't mean to imply it was legal.
Neil_Boekend wrote:What's the effective difference between a human not paying attention and missing something and a sensor array missing something and not knowing it missed something?
Humans can learn from their mistakes. Machines cannot upgrade their sensors. Also humans have a stake in their own survival. Machines don't care whether or not they die.
(See my replies to Dauric.) And I do not care whether they care if I live or die. My seat belt doesn't care; I still wear it. Autonomous driving doesn't add that same amount of safety yet, but it doesn't need to care in order to get there.




KnightExemplar wrote:
Neil_Boekend wrote:
KnightExemplar wrote:And this "Tesla Autopilot" is an optional, beta system that is only being used for highway driving, and is supposed to be used with an attentive driver ready to take over "at any time" (because it's still buggy as shit)
Spoiler:
No. This system is not even as good as the crappy humans we are. Maybe one day in the future we will have autonomous drivers that are better than us. But let's not drink the kool-aid and pretend that the day has come yet.

Yes. The failure mode of the Tesla sensor array is different from the failure mode of a human driver. That is not relevant. Human drivers often try to overtake a car while they can't see shit or misestimate speeds and distances. What is relevant is what your chances are of dying in a car in case of a human driving versus yourself driving.


Dude. This is a fact. If you don't want to believe me about this, then at the very least believe Elon Musk himself on this issue. Tesla's "autopilot" is NOT an autonomous system, and requires 100% attentive drivers.

I didn't dispute the fact that the Tesla autopilot system failed in that clip.
Neither did I say it was an autonomous system.
I'm saying it is an anecdote. I'm saying that it does not tell us much except that the autopilot system is not perfect yet. Which is hardly surprising, since Tesla says the same. It doesn't have a lick of statistical relevance because we do not know how many near misses humans would have had due to stupidity on the human's part.
Mikeski wrote:A "What If" update is never late. Nor is it early. It is posted precisely when it should be.

patzer's signature wrote:
flicky1991 wrote:I'm being quoted too much!

he/him/his

User avatar
LaserGuy
Posts: 4370
Joined: Thu Jan 15, 2009 5:33 pm UTC

Re: First Tesla Autopilot Death

Postby LaserGuy » Wed Jul 06, 2016 4:33 pm UTC

Zamfir wrote:
When it is established that autonomous automobiles are somewhat safer, insurance companies will start offering lower premiums for people who use autonomous automobiles.


Most (if not all) of the promised gains in safety are due to systems that do not require autonomy. Collision avoidance, for example, will be widespread, probably before practical autonomous cars even exist.


Sure, but then humans will pay less attention to the road and drive more recklessly. See, for example, the guy who was watching Harry Potter instead of driving his car, assuming his Tesla's collision avoidance system would keep him safe.

ucim wrote:Teens want to drive, not be driven by a mechanical chauffeur. Why take driver's ed in the first place if the machines will do it for you?


Then you don't take driver's ed anymore. Same reason we don't teach teens how to ride a horse or drive a wagon. Most people, teens or otherwise, don't drive because they enjoy the act of driving; most do it because they want to get from point A to point B. You honestly think if they had the choice between sitting in the car playing Angry Birds on their cell phone while the car drove itself or actually driving the car, they're going to choose the latter?

User avatar
ucim
Posts: 5432
Joined: Fri Sep 28, 2012 3:23 pm UTC
Location: The One True Thread

Re: First Tesla Autopilot Death

Postby ucim » Wed Jul 06, 2016 4:33 pm UTC

Neil_Boekend wrote:Presumably many teens also want to have a place where they can get intimate with their partner without their parent in the front seat. This is not damaged by a self-driving car.
It is if the self-driving car has too many "sensors". But that's a different rant. Or maybe youtube will have a new channel: makeout.youtube.com

Jose
Order of the Sillies, Honoris Causam - bestowed by charlie_grumbles on NP 859 * OTTscar winner: Wordsmith - bestowed by yappobiscuts and the OTT on NP 1832 * Ecclesiastical Calendar of the Order of the Holy Contradiction * Please help addams if you can. She needs all of us.

User avatar
HES
Posts: 4746
Joined: Fri May 10, 2013 7:13 pm UTC
Location: England

Re: First Tesla Autopilot Death

Postby HES » Wed Jul 06, 2016 4:52 pm UTC

commodorejohn wrote:Interstate lanes are twelve feet wide

Excuse my stereotyping misconceptions, then. I assumed they would be larger, but they are in fact identical (except, of course, that ours are technically 3.65 metres). I'll spare the good-and-bad-lane-widths tangent as I'm already well off topic.
He/Him/His

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Wed Jul 13, 2016 10:12 pm UTC

I just ran across this and I link without comment.

User avatar
LaserGuy
Posts: 4370
Joined: Thu Jan 15, 2009 5:33 pm UTC

Re: First Tesla Autopilot Death

Postby LaserGuy » Wed Jul 13, 2016 10:39 pm UTC

I'm not sure that there's really much of a technological solution that could have helped that much in such a scenario, beyond, I suppose, forcibly preventing the car from reaching 115 mph in the first place. In principle, collision avoidance/assisted braking could work on targets 1000+ feet away, but I think the system would probably get a lot of false positives and be pretty unreliable. A cursory search suggests most of them aren't rated above 60 mph at the very fastest.
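As a back-of-the-envelope check on those numbers (assuming a 1.5 s reaction time and roughly 0.7 g of braking, both round-number guesses, not manufacturer figures):

[code]
def stopping_distance_ft(speed_mph, reaction_s=1.5, braking_g=0.7):
    """Reaction distance plus braking distance, in feet (rough estimate)."""
    v = speed_mph * 5280 / 3600            # mph -> ft/s
    reaction = v * reaction_s
    braking = v ** 2 / (2 * braking_g * 32.2)
    return reaction + braking

for mph in (40, 60, 115):
    print(mph, round(stopping_distance_ft(mph)))
# roughly: 40 -> ~164 ft, 60 -> ~304 ft, 115 -> ~884 ft
[/code]

On those assumptions, a system rated to 60 mph only needs a few hundred feet of reliable range, while 115 mph would indeed need close to a thousand.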

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2307
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: First Tesla Autopilot Death

Postby Soupspoon » Thu Jul 14, 2016 12:41 am UTC

The collision avoidance would surely disallow any speed above which it did not have (sensory) confidence in its ability to spot and respond to dangers in time. Theoretically, long grass by the side of a road could conceal kamikaze wildlife of sufficient mass to cause problems as it bolted straight into the path of the vehicle, such that the vehicle would be travelling at a much reduced speed if it 'realised' this. Long, straight desert highways would (barring the artificial limits to speed as prescribed/proscribed by law) let the vehicle go full tilt, without too much in the way of small scrub concealing potential dangers.

The trouble with this thread's headline case is that it didn't know it couldn't see. Blindfold a human, and he knows not even to attempt to drive, probably. Similarly, set him on an unlit road at night with no headlights of his own and he shouldn't drive off into the dark without a great amount of trepidation about neither seeing nor being seen.

But if the sensors say "there is something ahead" or "I cannot see anything ahead", then there's clearly room for misinterpretation of a blind spot as a false-negative All Clear. Better if they convey "I can confirm clear road for X yards, a central barrier with no breaks, a feeder road with enough view along it to rule out vehicles at 10% above legal speeds being an issue by the time we pass it", and probably far more information than that, so that a positive envelope is created (perhaps not without error, but that's where degrees of confidence would come in, adding a non-negotiable buffer below the perceived safety envelope for this speed and heading) rather than negative envelopes sliced off when problematic signals return.
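That "positive envelope" idea reduces to: never go faster than the speed whose stopping distance fits inside the distance the sensors can positively confirm as clear, shrunk by a confidence buffer. A toy sketch, where every number and parameter is illustrative rather than taken from any real system:

[code]
import math

def max_safe_speed_mps(confirmed_clear_m, confidence,
                       decel_mps2=6.0, reaction_s=1.0, min_confidence=0.5):
    """Highest speed whose stopping distance fits inside the positively
    confirmed clear distance, shrunk by sensor confidence.
    Every number here is illustrative, not from any real system."""
    if confidence < min_confidence:
        return 0.0                              # can't confirm anything: don't move
    usable = confirmed_clear_m * confidence     # confidence buffer
    # Solve usable = v*reaction_s + v**2 / (2*decel) for v (take the positive root).
    a = 1.0 / (2.0 * decel_mps2)
    b = reaction_s
    c = -usable
    return max((-b + math.sqrt(b * b - 4 * a * c)) / (2 * a), 0.0)
[/code]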

For all I know, the system does it more like that already, but then it obviously wasn't programmed with the same trepidation as a knight in armour who knows that his view of the battle is necessarily restricted to the thin slit in his protective helm, and who thus acts somewhat defensively if he is to avoid being struck down by an unseen enemy.

Even with the white-sided artic trailer being invisible against the night sky and beyond the vertical sweep of the radar, or whatever analogue to that it was again, the car-level presence of the wheels (tractor + trailer sets) and related chassis components ought to have appeared on the radar, and a camera that identified them as merely similar to clear road is akin to a person driving with a steamed-up windscreen/shield or glasses, or without necessary glasses, or with cataracts, etc. Tests are doubtless being done to ensure the artificial vision isn't being Mr Magoo, at least not without realising it.

(Two sensor sets also sounds problematic. Three methods would perhaps be best: an inkling of danger from one should prompt aggressive reanalysis by the other two, seeking to rule out possibilities, and two sets identifying a danger would trigger the required evasive/avoidance response. With two, there's "do on one", which is ripe for false positives, or "think on one, do on two", which is ripe for false negatives lasting far too long.)
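The "think on one, do on two" scheme in that parenthetical is two-out-of-three voting across independent sensing methods. A minimal sketch, with the sensor names used only as placeholders:

[code]
def fuse_detections(radar_sees_obstacle, camera_sees_obstacle, lidar_sees_obstacle):
    """Two-out-of-three voting: one positive triggers re-analysis ('think'),
    two or more trigger the avoidance response ('do')."""
    votes = sum([radar_sees_obstacle, camera_sees_obstacle, lidar_sees_obstacle])
    if votes >= 2:
        return "brake_or_evade"   # agreed danger: act
    if votes == 1:
        return "reanalyse"        # lone report: aggressively re-check with the other two
    return "clear"
[/code]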

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Thu Jul 14, 2016 2:10 am UTC

His collision avoidance didn't work. Physics ruled here: anything coming over that hill at that speed was limited in its ability to avoid the obstruction, given the limits of current traction technology. Extreme control inputs would have rolled the car. There was going to be an accident. An intelligently designed system would be pre-braking coming over a blind hill, and it wouldn't have been speeding. Pre-braking involves nothing more than releasing acceleration. Most people don't. You should do it at intersections. A cheap infrared reflector on top of the cars, oriented at 90 degrees to the direction of travel, could warn of obstructing cars. A "look here, stupid" for vehicles crossing a traffic lane. Or something. Each new event will inform engineers about the things they haven't yet considered.

User avatar
sardia
Posts: 5726
Joined: Sat Apr 03, 2010 3:39 am UTC

Re: First Tesla Autopilot Death

Postby sardia » Thu Jul 14, 2016 3:29 am UTC

LaserGuy wrote:I'm not sure that there's really much of a technological solution that could have helped that much in such a scenario, beyond, I suppose, forcibly preventing the car from reaching 115 mph in the first place. In principle, collision avoidance/assisted braking could work on targets 1000+ feet away, but I think the system would probably get a lot of false positives and be pretty unreliable. A cursory search suggests most of them aren't rated above 60 mph at the very fastest.

Solutions at higher speeds are more engineering problems than anything else. Look at how the interstate highways are designed: they aren't anything like the local roads. No sidewalks, no stops, one-way roads with dividers. The road markers are super elongated, and everything is stretched or enlarged to facilitate driving at higher speeds. You'd have to do all of that, doubled, for even higher speeds, like, say, the autobahn.

User avatar
HES
Posts: 4746
Joined: Fri May 10, 2013 7:13 pm UTC
Location: England

Re: First Tesla Autopilot Death

Postby HES » Thu Jul 14, 2016 8:06 am UTC

morriswalters wrote:There was going to be a collision.

FTFY. No such thing as an accident in this context.
He/Him/His

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2307
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: First Tesla Autopilot Death

Postby Soupspoon » Thu Jul 14, 2016 9:48 am UTC

morriswalters wrote:His collision avoidance didn't work. Physics ruled here: anything coming over that hill at that speed was limited in its ability to avoid the obstruction, given the limits of current traction technology. ...
I didn't think the offending car was the one that might have had collision avoidance. I thought (in the deliberate and stated absence of comment) that it was intended as a general example of how the arrogant stupidity of human drivers easily outweighs the simple stupidity of a driving-aid-cum-full-automation package. (I also didn't assume the electric car 'victim' was at fault for not detecting trouble, given the driver had not... Not even sure that car has suitable assistive technology.)

I don't think you were posting in response to my message, but you say much the same as I did, only more briefly: if there had been a collision avoidance system in use that actually knew what it was doing (even one that did not also obey actual/assumed speed limits), it would never have allowed its driver to be going so fast as to be caught out by (especially!) legally moving, manually-driven vehicles not actually being driven suicidally1, by any reasonable expectation.

But now someone have to define 'reasonable'...


1 In steadily driven, bunched multi-lane traffic, it is feasible for a neighbouring driver to get it into his mind to yank the steering over and sideswipe an automated car (or 'encourage' it to swerve onto the shoulder itself) - and I predict cases of that will happen in the future, from militant self-drivers who flip when they see a stream of auto-automobiles doing slightly better in their own lane than they are in theirs, due to more responsive position maintenance, perhaps with obviously distracted drivers, perhaps even whilst empty as they head towards their 'driver' picking-up spot from the out-of-town free parking spots, or are sent to pick up the worker of the family after doing the school run for the house-parent. Even on undivided highways2, poking out into the opposing stream to disrupt an auto-convoy. But probably humans 'lane drifting' will be the basic standard to guard against.

2 I also await the use of 'kits' to lay down false road markings and cover up legitimate ones, sufficient to fool the 'lane camera systems' of the next automated vehicle (like the classic 'diversion - bridge out' arrow switcheroo, beloved/befallen of the Dick Dastardlys and Terry-Thomases of the fictional car-race world). Perhaps strips of line-bright/road-dark material laid out artfully to disguise the true route and depict an alternative, to be quickly swiped away by fishing line after the target vehicle has enditched itself.

User avatar
Neil_Boekend
Posts: 3215
Joined: Fri Mar 01, 2013 6:35 am UTC
Location: Yes.

Re: First Tesla Autopilot Death

Postby Neil_Boekend » Thu Jul 14, 2016 10:15 am UTC

Soupspoon wrote:1 In steadily driven, bunched multi-lane traffic, it is feasible for a neighbouring driver to get it into his mind to yank the steering over and sideswipe an automated car (or 'encourage' it to swerve onto the shoulder itself) - and I predict cases of that will happen in the future, from militant self-drivers who flip when they see a stream of auto-automobiles doing slightly better in their own lane than they are in theirs, due to more responsive position maintenance, perhaps with obviously distracted drivers, perhaps even whilst empty as they head towards their 'driver' picking-up spot from the out-of-town free parking spots, or are sent to pick up the worker of the family after doing the school run for the house-parent. Even on undivided highways2, poking out into the opposing stream to disrupt an auto-convoy. But probably humans 'lane drifting' will be the basic standard to guard against.

2 I also await the use of 'kits' to lay down false road markings and cover up legitimate ones, sufficient to fool the 'lane camera systems' of the next automated vehicle (like the classic 'diversion - bridge out' arrow switcheroo, beloved/befallen of the Dick Dastardlys and Terry-Thomases of the fictional car-race world). Perhaps strips of line-bright/road-dark material laid out artfully to disguise the true route and depict an alternative, to be quickly swiped away by fishing line after the target vehicle has enditched itself.

1.
a/ If a manually driven car makes an illegal move against an automatically driven car, I would be very surprised not to see, in a court case, a 3D animation based on the data dump of everything the auto-auto recorded before, during, and after the crash.
b/ If the manual car didn't make an illegal move, then the automatically driven car made an illegal move, and there should be repercussions.
c/ If nobody made an illegal move and all vehicles worked without mechanical problems, somebody should take a damn hard look at the specific gap in the law that allowed such a situation.

2.
There will be a lot of data gathering. If the lane markings do not fit the predicted lane markings (based on data from previous auto cars that have driven there), there are likely roadworks, which the car must be able to detect in the foreseeable future anyway. In that case the car should slow down anyway. If the car detected no road signs indicating roadworks, there should be large alarm bells and a controlled safe stop (if possible), including (probably) an automated 911 call to indicate something is seriously wrong there. Perhaps the car could slowly progress along the known route while extensively checking for obstacles, road damage, etc.
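A sketch of that decision logic, comparing observed markings against markings recorded by earlier vehicles; every threshold and interface here is hypothetical:

[code]
def lane_marking_check(observed_markings, expected_markings, roadworks_signs_seen, car):
    """Toy version of the logic above: markings that don't match what previous
    cars recorded mean either roadworks (slow down) or something seriously
    wrong (controlled stop and report). Thresholds and interfaces are hypothetical."""
    mismatch_m = observed_markings.deviation_from(expected_markings)
    if mismatch_m < 0.5:            # markings roughly where the recorded data predicts
        return
    if roadworks_signs_seen:
        car.slow_down(reason="probable roadworks")
    else:
        car.controlled_safe_stop()
        car.report_emergency("lane markings inconsistent with recorded route data")
[/code]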
Mikeski wrote:A "What If" update is never late. Nor is it early. It is posted precisely when it should be.

patzer's signature wrote:
flicky1991 wrote:I'm being quoted too much!

he/him/his

User avatar
HES
Posts: 4746
Joined: Fri May 10, 2013 7:13 pm UTC
Location: England

Re: First Tesla Autopilot Death

Postby HES » Thu Jul 14, 2016 10:28 am UTC

A fully autonomous car will not be solely reliant on road markings. A semi-autonomous car will be able to return control to a human driver if something is amiss.

Markings wear off, are removed for roadworks, can be obscured by spillages and bad weather, and in some cases they don't exist in the first place. There are other methods of navigation.
He/Him/His

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Thu Jul 14, 2016 11:16 am UTC

Soupspoon wrote:I thought (in the deliberate and stated absence of comment) that it was intended as a general example of how the arrogant stupidity of human drivers easily outweighs the simple stupidity of a driving-aid-cum-full-automation package. (I also didn't assume the electric car 'victim' was at fault for not detecting trouble, given the driver had not
You are correct. These accidents are common; sometimes they are the crossing car's fault and sometimes not. My wife was T-boned in this kind of accident. The car saved her. A close friend was almost killed in another. But the fact of the matter is that humans are not perfect drivers. In the original event, Tesla's Autopilot didn't see the truck; this is a fixable engineering failure, but it is not peculiar to Tesla's car.

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2307
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: First Tesla Autopilot Death

Postby Soupspoon » Thu Jul 14, 2016 11:19 am UTC

Indeed (@Neil, two messages up now). Humans aren't logical, but they are inventive enough to come up with spontaneous and/or premeditated abuses of a system, whilst not caring about (or positively revelling in) the consequences. (For the above post's footnote 2, I'm talking of a potential one-lane shift, to start with, possibly on a bend, where no GPS/inertial guidance location information is yet quite reliable enough to detect a small deviation if you can temporarily fool the cameras into a different calibration. Especially in a naive vehicle that hasn't a detailed memory of the route to check against, to ensure it isn't being tricked into verging itself or worse.)

Ninja:
HES wrote:A fully autonomous car will not be solely reliant on road markings. A semi-autonomous car will be able to return control to a human driver if something is amiss.

Markings wear off, are removed for roadworks, can be obscured by spillages and bad weather, and in some cases they don't exist in the first place. There are other methods of navigation.

Of course not. But they (and/or electronic/optical 'beacons', set up along/underneath roads by the Autoautomotive Transport Authority or whoever, at great expense) are going to be an obvious first recourse. I was tracking my GPS path earlier this week, as a passenger in a car, and my track was passing through off-road terrain (sometimes a number of parallel terraced streets away, sometimes in fields or parkland3), jinking all over the road4, or even, whilst stationary, moving around without the vehicle or device moving and with nobody moving about outside to mess up the signals5. Some of this is because it wasn't a vehicular GPS with large aerials installed (for the best view, even under trees, of all satellites), and some of it would be solved by Differential GPS beacons (assuming they were also proof against spoofing or jamming, like the 'white line' or 'roadside' beacons proposed above). But ultimately, if it looks like a white line and acts like a white line, it's going to be a difficult decision whether to ignore it in favour of more remote and variable positioning beacons, when obvious lines can be seen (as opposed to being seen to be absent or unreliable) and 99.9% of the time they're your best fallback option.

There'll be protections against spoofing, but it'll take a while to perfect them, new spoofings will emerge from the modern-day/near-future Luddite brigades, and if their tampering reduces self-driving vehicles to a snail's pace and/or forces the driver to Take Back Control, then that is likely as much a victory for the spiritual descendants of the frame-breakers (perhaps a militant group of chauffeurs, looking to repeat the attempt at better market positioning and recognition in the light of the rise of automation in their chosen field), who may not care that they aren't fully disrupting their targets, or may not even wish to.

(A spoofed/stolen/retasked "roadworks ahead, slow down" warning beacon could be an easy ask, and at no obvious risk to human life or automotive property, whilst making a point and/or making themselves awkward.)

3
Spoiler:
Screenshot_2016-07-14-11-43-16.png

4
Spoiler:
Screenshot_2016-07-14-11-44-29.png

5
Spoiler:
Screenshot_2016-07-14-11-45-29.png

(All the above tracks were recorded at a 30-second refresh rate with no obvious signal dropouts while the map segments (assumed accurate enough) were being transited. The middle one was in slow traffic; the bottom one was given a parked-car symbol at the point of stopping, around which half an hour or so of 'stationary squiggle' can be seen. These examples were picked semi-randomly, in reverse order, from a general SW-to-NE trip that was recorded for future use.)

KnightExemplar
Posts: 5489
Joined: Sun Dec 26, 2010 1:58 pm UTC

Re: First Tesla Autopilot Death

Postby KnightExemplar » Thu Jul 14, 2016 2:08 pm UTC

Neil_Boekend wrote:1.
a/ If a manually driven car makes an illegal move against an automatically driven car, I would be very surprised not to see, in a court case, a 3D animation based on the data dump of everything the auto-auto recorded before, during, and after the crash.
b/ If the manual car didn't make an illegal move, then the automatically driven car made an illegal move, and there should be repercussions.
c/ If nobody made an illegal move and all vehicles worked without mechanical problems, somebody should take a damn hard look at the specific gap in the law that allowed such a situation.


If the sensor data of the Tesla "automatic car" were reliable, then it wouldn't have crashed into the side of a tractor trailer. There's a key assumption there: that these sensors are actually perfect. This case proves they are not.

In this case, the Tesla car was blind to the tractor trailer, with deadly results. We are very far away from using the car's sensor data as a reliable source in court.
First Strike +1/+1 and Indestructible.

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Thu Jul 14, 2016 2:32 pm UTC

The sensor data is being used now by NHTSA and your insurance company, and even a failure such as the Tesla's blindness to white objects is useful in court. It contributes to the division of liability. And it serves as a nudge to improve the engineering. For my part, I try to remember that everything we use to aid drivers today comes from unexpected hazard modes introduced by new technologies. Four-way stops were invented after the car.

User avatar
Neil_Boekend
Posts: 3215
Joined: Fri Mar 01, 2013 6:35 am UTC
Location: Yes.

Re: First Tesla Autopilot Death

Postby Neil_Boekend » Thu Jul 14, 2016 2:47 pm UTC

KnightExemplar wrote:
Neil_Boekend wrote:1.
a/ If a manually driven car makes an illegal move against an automatically driven car, I would be very surprised not to see, in a court case, a 3D animation based on the data dump of everything the auto-auto recorded before, during, and after the crash.
b/ If the manual car didn't make an illegal move, then the automatically driven car made an illegal move, and there should be repercussions.
c/ If nobody made an illegal move and all vehicles worked without mechanical problems, somebody should take a damn hard look at the specific gap in the law that allowed such a situation.


If the sensor data of the Tesla "automatic car" were reliable, then it wouldn't have crashed into the side of a tractor trailer. There's a key assumption there: that these sensors are actually perfect. This case proves they are not.

In this case, the Tesla car was blind to the tractor trailer, with deadly results. We are very far away from using the car's sensor data as a reliable source in court.

In that case there is an obvious inconsistency between the sensor data and the situation. We treat people's memory as a reliable source in court, and not only do many humans have an agenda of their own, their memory omits details and twists things. We use them because we have nothing better. A court-appointed data interpreter would be required to properly interpret the data, of course, but that doesn't make the data invalid. Merely incomplete, in the case of the Tesla, but even there, if all the data were properly gathered, the cause of the accident would be obvious: if the Tesla were a fully autonomous car, the data would indicate that the car was not equipped with a sufficient sensor array to drive safely. Just like someone with severe cataracts.
When you have all the data, you can see what data is missing. When you have no data, only what people think happened, you're guessing.
In this specific case the Tesla data will probably indicate:
1. Nothing on the road, free to drive.
2. A truck pulls up, no trailer. The Tesla can drive on behind it safely.
3. Deceleration indicating a crash measured. Airbags deployed.
Anyone would be able to use that to prove the Tesla didn't detect something it crashed into. Together with the damage to the car and truck, it wouldn't have taken a genius to solve the riddle.

If a human had been driving that car there would be no data from that human, because the data recording system is dead. Not that in this case the data is really required to figure out what happened, but in some cases human testimony is key. Fallible humans with an agenda of their own.
And if it had been a survivable crash, the data would be more like:
"It all happened so fast! I was driving. A truck pulled up but it was clear behind it. Suddenly there was a trailer and I crashed into it." Of course this is not a specific mistake humans are likely to make, but that is beside the point.
Mikeski wrote:A "What If" update is never late. Nor is it early. It is posted precisely when it should be.

patzer's signature wrote:
flicky1991 wrote:I'm being quoted too much!

he/him/his

User avatar
ucim
Posts: 5432
Joined: Fri Sep 28, 2012 3:23 pm UTC
Location: The One True Thread

Re: First Tesla Autopilot Death

Postby ucim » Thu Jul 14, 2016 3:28 pm UTC

A: Too much bringing up of GPS and road beacons - neither humans nor autodrive cars will (or should) be following these signals, which only indicate where the clear path should be. Rather, they should follow direct observations of the immediate surroundings, which indicate where the clear path actually is. There's nothing besides direct observations that will detect and identify a deer, a paper bag blowing across the road, a child, or a stray hubcap.

B: Humans take a driver test which is totally inadequate for an autodrive car. I'm pretty sure autodrive cars would pass the test as is, but it's not good enough, because the driver test does not test things we take for granted in people. We know much better what people can and can't do, and why they do it. Sure, we know "more" about computer algorithms, but that is on the level of hormones and blood flow; not very useful for predicting behavior in unusual circumstances. We have a vision test, for example, but that is not to tell whether or not a person can see a truck against the sky. We don't test for that particular vision defect because based on what we know about people, it would have already shown up. But the Tesla, while it might pass the human vision test, would fail the "do you see a truck" test that is not administered. We don't know enough about computer vision (and where it is necessary) to know that we don't need such a test for autodrive cars, which is why we do.

C: Human reactions are on a higher level than (present) computer reactions. The human heuristics are vaguer, perhaps, but also cover a wider breadth of experience, and therefore are better able to deal with unusual circumstances (even if they are perhaps less quick about it). Computers are much quicker, but their algorithms are more direct, which means that more has to be programmed into them by hand, rather than generated by the computer's experience and evolution.

Not to say one is better than the other for driving; just that they need to be seen and treated differently for those reasons.

Jose
Order of the Sillies, Honoris Causam - bestowed by charlie_grumbles on NP 859 * OTTscar winner: Wordsmith - bestowed by yappobiscuts and the OTT on NP 1832 * Ecclesiastical Calendar of the Order of the Holy Contradiction * Please help addams if you can. She needs all of us.

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Thu Jul 14, 2016 4:01 pm UTC

ucim wrote:Too much bringing up of GPS and road beacons - neither humans nor autodrive cars will (or should) be following these signals. They indicate where the clear path should be. Rather, they follow direct observations of the immediate surroundings. That indicates where the clear path actually is. There's nothing besides direct observations that will detect and identify a deer, a paper bag blowing across the road, a child, or a stray hubcap.
We use things like beacons every day. From the reflectors glued to the road, to the other reflectors used on rural (and urban) road posts, to the reflectors on trucks. Lights themselves are beacons, both the lights on cars and street lights, as well as flashing caution signals. A car that can read signage can be as aware of trouble points (children-playing and deer-crossing signs come to mind) as a human. And as the sensors improve I can envision sensors that can see through optical concealment (brush and other things) that humans can't see through, and won't ever be able to. And GPS can be useful; good maps used in concert with GPS can be a check on position, as well as establishing the error level of the GPS fix. But people will still be killed by cars, no matter who is driving.

Tyndmyr
Posts: 10119
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: First Tesla Autopilot Death

Postby Tyndmyr » Thu Jul 14, 2016 4:17 pm UTC

More people have probably died trying to catch pokemons while driving than were killed by autopilot.

And the former's been around a lot less.

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2307
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: First Tesla Autopilot Death

Postby Soupspoon » Thu Jul 14, 2016 4:26 pm UTC

For a threat to humanity with more history in car accidents than GO has yet accumulated... "Tamagotchi: The Hidden Killer!"

Chen
Posts: 5193
Joined: Fri Jul 25, 2008 6:53 pm UTC
Location: Montreal

Re: First Tesla Autopilot Death

Postby Chen » Thu Jul 14, 2016 4:54 pm UTC

ucim wrote:B: Humans take a driver test which is totally inadequate for an autodrive car. I'm pretty sure autodrive cars would pass the test as is, but it's not good enough, because the driver test does not test things we take for granted in people. We know much better what people can and can't do, and why they do it. Sure, we know "more" about computer algorithms, but that is on the level of hormones and blood flow; not very useful for predicting behavior in unusual circumstances. We have a vision test, for example, but that is not to tell whether or not a person can see a truck against the sky. We don't test for that particular vision defect because based on what we know about people, it would have already shown up. But the Tesla, while it might pass the human vision test, would fail the "do you see a truck" test that is not administered. We don't know enough about computer vision (and where it is necessary) to know that we don't need such a test for autodrive cars, which is why we do.


Let's be fair here. The human driving test is inadequate for determining whether or not humans are safe drivers too. Now, that said, I don't think anyone is suggesting an automated car need only pass a human driving test. I imagine there will be all sorts of certification testing that would need to be done before an automated car would be deemed acceptable. Even then, I imagine there will be accidents caused by unknowns or by things that people didn't think to test for. The regulations will then be modified and cars will need to be retrofitted or redesigned appropriately. Really, I imagine automated car regulations are going to look a lot like aircraft, train, or large passenger ship regulations.

elasto
Posts: 3065
Joined: Mon May 10, 2010 1:53 am UTC

Re: First Tesla Autopilot Death

Postby elasto » Thu Jul 14, 2016 5:10 pm UTC

Chen wrote:Let's be fair here. The human driving test is inadequate for determining whether or not humans are safe drivers too. Now, that said, I don't think anyone is suggesting an automated car need only pass a human driving test. I imagine there will be all sorts of certification testing that would need to be done before an automated car would be deemed acceptable. Even then, I imagine there will be accidents caused by unknowns or by things that people didn't think to test for. The regulations will then be modified and cars will need to be retrofitted or redesigned appropriately. Really, I imagine automated car regulations are going to look a lot like aircraft, train, or large passenger ship regulations.

Yup.

Automated vehicles have a couple of big plusses when it comes to testing, also.

Firstly, you only really need to run the full battery of tests on a single sample vehicle. If it passes, you know that all vehicles running the same combination of software and hardware would also pass. You only then need further testing to affirm sufficient QC during manufacturing, to ensure a uniform quality of sensors etc.

Secondly, unlike with the emissions scandal, say, it won't be possible for a manufacturer to 'fake' a good result. Or, rather, if such testing were faked, it would be immediately obvious due to all the crashes and deaths - which would result in (a) massive compensation payouts and (b) a massive loss of market share. So, unlike with emissions, every manufacturer is going to ensure they don't simply meet the testing standards, but exceed them by as much as is humanly possible.

(Of course, some manufacturers will learn this lesson slower than others *cough*Tesla*cough*)

User avatar
ucim
Posts: 5432
Joined: Fri Sep 28, 2012 3:23 pm UTC
Location: The One True Thread

Re: First Tesla Autopilot Death

Postby ucim » Thu Jul 14, 2016 5:39 pm UTC

morriswalters wrote:We use things like beacons every day. From the reflectors glued to the road...
... yes, but we use them as parts of the landscape, not as proxies for the landscape. That's the thrust of my point. And sure, GPS is useful. But if an accident is caused by bad GPS or lack of GPS, then something is Very Wrong.

elasto wrote:Secondly, unlike with the emissions scandal, say, it won't be possible for a manufacturer to 'fake' a good result. Or, rather, if such testing were faked, it would be immediately obvious due to all the crashes and deaths
But you won't see "all the" crashes and deaths. You'll see some number of crashes and deaths, which might be a bit larger than it otherwise would have been, but you don't know what it would have been, and the difference is only a little bit. You'll never be able to pin it on any decisions made in the boardroom (which will include all sorts of cost/benefit tradeoffs). Teaching to the test is an old trick.

And when so many cars are self-driving and an issue like this is found, it will play out like Takata.

Jose
Order of the Sillies, Honoris Causam - bestowed by charlie_grumbles on NP 859 * OTTscar winner: Wordsmith - bestowed by yappobiscuts and the OTT on NP 1832 * Ecclesiastical Calendar of the Order of the Holy Contradiction * Please help addams if you can. She needs all of us.

Tyndmyr
Posts: 10119
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: First Tesla Autopilot Death

Postby Tyndmyr » Thu Jul 14, 2016 5:47 pm UTC

ucim wrote:
morriswalters wrote:We use things like beacons every day. From the reflectors glued to the road...
... yes, but we use them as parts of the landscape, not as proxies for the landscape. That's the thrust of my point. And sure, GPS is useful. But if an accident is caused by bad GPS or lack of GPS, then something is Very Wrong.


Literally none of these systems are GPS only.

And yes, a sign or reflector tape is indeed merely a symbol to convey information. It's not inherently different from a similar system optimized for digital needs.

Chen
Posts: 5193
Joined: Fri Jul 25, 2008 6:53 pm UTC
Location: Montreal

Re: First Tesla Autopilot Death

Postby Chen » Thu Jul 14, 2016 5:54 pm UTC

ucim wrote:But you won't see "all the" crashes and deaths. You'll see some number of crashes and deaths, which might be a bit larger than it otherwise would have been, but you don't know what it would have been, and the difference is only a little bit. You'll never be able to pin it on any decisions made in the boardroom (which will include all sorts of cost/benefit tradeoffs). Teaching to the test is an old trick.


While there will be some aspect of that, you can bet that any automated car crashes, especially if there are deaths, are going to be scrutinized FAR more than a regular car crash. Again, think of them as akin to aircraft or train crashes in regard to the type of investigation they will go through. It would make sense for the NTSB to be the body that investigates these, since, if automated cars are all around, the risk to the public is FAR higher than from any plane/train crashes.

elasto
Posts: 3065
Joined: Mon May 10, 2010 1:53 am UTC

Re: First Tesla Autopilot Death

Postby elasto » Thu Jul 14, 2016 6:07 pm UTC

Chen wrote:While there will be some aspect of that, you can bet that any automated car crashes, especially if there are deaths, are going to be scrutinized FAR more than a regular car crash. Again, think of them as akin to aircraft or train crashes in regard to the type of investigation they will go through. It would make sense for the NTSB to be the body that investigates these, since, if automated cars are all around, the risk to the public is FAR higher than from any plane/train crashes.

Yup. And, with a little thought, it's obvious why every automated crash will accrue far more investigation than any human one:

Crashes caused by human error are somewhat non-reproducible: You discover the particular way that a particular human failed. The outcome is limited in scope: An increase in insurance premiums and/or criminal sanctions just for that one human.

Crashes caused by automation errors are highly reproducible: If it failed under a particular set of circumstances, it's highly likely it'd fail every time that set of circumstances were to arise in the future. So it's worth many more resources being devoted to find the cause of the failure, since it will pay for itself dozens to millions of times over.

Efforts to improve human driving have to be done on a continual basis: Every new generation of drivers has to be trained, told not to drink and drive, told not to speed and so on. And every new generation will have drivers who think the rules don't apply to them. Hence the million fatalities a year.

In a sense, efforts to improve automated driving only have to be done once. Sure, sometimes bugs can creep back in, but, assuming a comprehensive battery of tests, once handling some situation has been solved once, it's solved for all time. Hence why self-driving car crashes will be investigated more or less like plane crashes are now.

DanD
Posts: 257
Joined: Tue Oct 05, 2010 12:42 am UTC

Re: First Tesla Autopilot Death

Postby DanD » Thu Jul 14, 2016 8:02 pm UTC

Soupspoon wrote:
The trouble with this thread's headline case is that it didn't know it couldn't see. Blindfold a human, and he knows not even to attempt to drive, probably. Similarly, set him on an unlit road at night with no headlights of his own and he shouldn't drive off into the dark without a great amount of trepidation about neither seeing nor being seen.


However, humans do routinely outdrive their headlights at night. In fact, low-beam headlights only provide clear illumination out to about 160 feet. That's shorter than the stopping distance from 40 mph (assuming average reflexes and brakes).

So yes, humans make some judgements well, but they're really stupid about others.
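To put rough numbers on that 40 mph figure (using round-number assumptions of a 1.5-second reaction time and about 0.7 g of braking): 40 mph is roughly 59 ft/s, so reaction alone covers about 88 ft, and braking (v²/2µg) adds roughly another 76 ft, for about 164 ft in total, just beyond the ~160 ft of clear low-beam illumination.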

User avatar
sardia
Posts: 5726
Joined: Sat Apr 03, 2010 3:39 am UTC

Re: First Tesla Autopilot Death

Postby sardia » Thu Jul 14, 2016 8:16 pm UTC

The thing about sensor deficiencies is that there are teams of engineers spending millions to improve them. The average human solution is to pump out more humans till the inconvenience of death is overcome.

Tyndmyr
Posts: 10119
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: First Tesla Autopilot Death

Postby Tyndmyr » Thu Jul 14, 2016 8:23 pm UTC

sardia wrote:The thing about sensor deficiencies is that there are teams of engineers spending millions to improve them. The average human solution is to pump out more humans till the inconvenience of death is overcome.


Been workin' pretty well so far.

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Tue Jul 26, 2016 10:33 pm UTC

For those interested, a link to an article and some pictures. The truck. [image]
The car. [image]
And the intersection. [image]

User avatar
LaserGuy
Posts: 4370
Joined: Thu Jan 15, 2009 5:33 pm UTC

Re: First Tesla Autopilot Death

Postby LaserGuy » Tue Jul 26, 2016 11:38 pm UTC

I think the general rule in such situations is that "the truck always wins".

morriswalters
Posts: 6869
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: First Tesla Autopilot Death

Postby morriswalters » Wed Jul 27, 2016 12:19 am UTC

Well, yeah.

Soupspoon wrote:The trouble with this thread's headline case is that it didn't know it couldn't see.
I was rereading this and realized just how odd this quote is.

