
U.S. opens probe into Tesla’s Autopilot over emergency vehicle crashes

Animats 2021-08-16 20:07:48 +0000 UTC [ - ]

Right. As I've pointed out previously, Tesla seems to be unable to detect sizable stationary obstacles that are partly blocking a lane, especially if they don't look like the rear end of a car. In addition to emergency vehicles, Teslas on autopilot have plowed into freeway barriers and a street sweeper. That's the usual situation for first responders, who usually try to block as little of the road as possible but often don't have enough shoulder space.

It's clear what Tesla really has - a good lane follower and cruise control that slows down for cars ahead. That's a level 2 system. That's useful, but, despite all the hype about "full self driving", it seems that's all they've got.

"Full self driving" just adds some lane-changing assistance and hints from the nav system.

icelandicmoss 2021-08-16 21:51:20 +0000 UTC [ - ]

I feel like part of the problem with the kind of autopilot crashes you describe here is how inexplicable they are to humans. Whilst humans can be dangerous drivers, the incidents they cause generally have a narrative sequence of events that are comprehensible to us -- for instance, driver was distracted, or visibility was poor.

But when a supposedly 'all-seeing always watching' autopilot drives straight into a large stationary object in clear daylight, we have no understanding of how the situation occurred.

This I think has a couple of effects:

1) The apparent randomness makes the idea of these crashes a lot more scary -- psychologically we seem to have a greater aversion to danger we can't predict, and we can't tell ourselves the 'ah but that wouldn't happen to me' story.

2) Predictability of road incidents actually is a relevant piece of information. As a road user (including pedestrian), most of my actions are taken on the basis of what I am expecting to happen next, and my model for this is how humans drive (and walk). Automated drivers have different characteristics and failure modes, and that makes them an interaction problem for me.

oaw-bct-ar-bamf 2021-08-16 22:13:00 +0000 UTC [ - ]

In my opinion, the underlying assumption autopilots are built on is wrong. It is assumed that the road is free to drive on.

Only when the vehicle computer detects a known object on the road, one that it knows should not be there, does it apply the brakes or try to steer around it.

I would feel safer if the algorithm assumed the negative case by default and only gave the „green light“ once it had determined that the road is free to drive on. In the case of unknown (not yet supervised) road obstructions, the worst needs to be assumed.

That's where the ‚unexplainable‘ crashes are coming from. Something the size of an actual truck is obstructing the road, but the system couldn't quite classify it because the truck has tipped over and is lying on the road sideways. Not yet learned by the algorithm. Can't be that bad, green light, no need to avoid or brake.
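
Roughly, the difference between the two defaults in pseudocode (the types, fields, and numbers are made up purely for illustration, not taken from any real autopilot stack):

    from dataclasses import dataclass

    @dataclass
    class Detection:
        classified: bool          # did the model assign a known class?
        is_known_obstacle: bool   # is that class something we brake for?

    def optimistic_should_brake(detections: list[Detection]) -> bool:
        # Assume the road is free; brake only for objects recognized as obstacles.
        # A tipped-over truck the model can't classify never triggers this.
        return any(d.classified and d.is_known_obstacle for d in detections)

    def pessimistic_should_brake(verified_free_space_m: float, stopping_distance_m: float) -> bool:
        # Assume the road is blocked; proceed only once the verified free space
        # ahead exceeds the distance needed to stop from the current speed.
        return verified_free_space_m <= stopping_distance_m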

sangnoir 2021-08-17 01:27:22 +0000 UTC [ - ]

> Only when the vehicle computer detects a known object on the road that it knows should not be there it is applying brakes or trying to steer around.

The problem with Tesla's "No LIDAR ever, cameras are good enough" approach is that it fails to detect emergency vehicles: stationary items are filtered out of the radar signal as noise[1], and Tesla's ML models probably can't reliably identify oblique vehicles and semi trailers as obstacles.

1. Makes sense in isolation: frequent radar returns from roadside and overhead signs would be a pain to deal with

Varriount 2021-08-17 07:46:06 +0000 UTC [ - ]

What is the reasoning behind "no lidar"? Cost?

MaxikCZ 2021-08-17 07:55:01 +0000 UTC [ - ]

The stated reason is "your eyes don't shoot lasers, so a camera is good enough". But the implied reason is cost for sure. With how fast the price of lidar drops, and its abilities increase (think solid state lidar), I wonder how long until the first Tesla with lidar rolls down the production line, or if Elon is too proud to ever allow that.

quartesixte 2021-08-16 22:32:28 +0000 UTC [ - ]

> It is assumed that the road is free to drive on.

Trying to remember if the opposite of this is how human drivers are taught, or if this is implicit in how we move about the world. My initial gut reaction says yes and this is a great phrasing of something that was always bothering me about automated driving.

Perhaps we should model our autopilots after horses: refusal to move against anything unfamiliar, and biased towards going back home on familiar routes.

cameron_b 2021-08-17 17:49:30 +0000 UTC [ - ]

In my high school’s Drivers Ed class I distinctly remember the one-question pop quiz: “What is the most dangerous mile of road?”

The answer was “the mile in front of you”

Additionally there was some statistic about the frequency of accidents within a very short distance of the driver's residence, which seemed to underscore the importance of being aware of just how much your brain filters out the “familiar” in contrast to a newly stimulating environment.

jiscariot 2021-08-17 20:54:04 +0000 UTC [ - ]

I had always assumed the "close to home" numbers were just bad statistics, because I never saw them control for % of driving that was done "close to home".

If I google it, I get like three pages of law firms.

Animats 2021-08-17 06:29:36 +0000 UTC [ - ]

> In my opinion, the underlying assumption autopilots are built on is wrong. It is assumed that the road is free to drive on. Only when the vehicle computer detects a known object on the road, one that it knows should not be there, does it apply the brakes or try to steer around it. I would feel safer if the algorithm assumed the negative case by default and only gave the „green light“ once it had determined that the road is free to drive on.

I agree, but it will up the false alarm rate in a system without good depth perception for all objects. This is tough with cameras only. Reflective puddles are a problem; they're hard to range with vision only. Anything that doesn't range well, which is most very uniform surfaces, becomes a reason to slow down. As you get closer, the sensor data gets better and you can usually decide it's safe to proceed.

Off-road autonomous vehicles have to work that way, but on-road ones can be more optimistic.

Waymo takes a hard line on this, and their vehicles drive rather conservatively as a result. They do have false-alarm problems and slowdowns around trouble spots.

oaw-bct-ar-bamf 2021-08-19 10:51:09 +0000 UTC [ - ]

Would you rather optimize for a faster overall fleet, or a fleet with stress-free driving, no incidents, and no need to intervene or to be worried?

If the system gets faster over time, even better. But I cannot imagine huge adoption unless the system gets actually reliable. I am pretty much in favor of the Waymo approach.

oaw-bct-ar-bamf 2021-08-19 10:53:45 +0000 UTC [ - ]

Having high false-positive rates with only one or two sensors just shows how ‚bad‘ we still are at controlled, secure automated driving.

willcipriano 2021-08-16 22:29:06 +0000 UTC [ - ]

I agree. In the north east at least, pothole avoidance is a critically important skill. Any "autopilot" without it would be fairly useless around me, as I'd have to take over every 30 seconds to not end up with a flat tire. I have adaptive cruise control and that's about as far as I'll trust a computer to drive given the current tech.

diggernet 2021-08-17 00:21:42 +0000 UTC [ - ]

My problem with those crashes is that they are entirely explicable: The car is blind to stationary objects in the road. (My best guess at the logic is they assume that "anything stationary cannot possibly be in the road, right?")

To me, that blindness is simply unacceptable. If there is anything in the road, whether identified or not, it should automatically be flagged as a hazard. That flag should only be removed if it is detected to be moving in a way such that it will be somewhere else when you get there.
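
In rough pseudocode, with a hypothetical track format and an assumed lane width, the rule looks like this: everything in our path stays flagged unless its predicted position at our arrival time is clear of the lane.

    from dataclasses import dataclass

    @dataclass
    class TrackedObject:
        distance_m: float        # along-path distance ahead of our car
        lateral_m: float         # offset from our lane center
        lateral_vel_mps: float   # how fast it is moving out of the lane

    LANE_HALF_WIDTH_M = 1.8  # assumed for illustration

    def still_a_hazard(obj: TrackedObject, ego_speed_mps: float) -> bool:
        # Hazard by default; clear the flag only if the object will have moved
        # out of our lane by the time we reach it.
        if ego_speed_mps <= 0.0:
            return abs(obj.lateral_m) < LANE_HALF_WIDTH_M
        time_to_arrival_s = obj.distance_m / ego_speed_mps
        predicted_lateral_m = obj.lateral_m + obj.lateral_vel_mps * time_to_arrival_s
        return abs(predicted_lateral_m) < LANE_HALF_WIDTH_M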

I have Subaru EyeSight. It has no problem seeing stationary objects. What's Tesla's problem?

harles 2021-08-17 00:25:42 +0000 UTC [ - ]

I’m not sure about newer models without radar, but the older ones explicitly discard stationary returns on their radar. As I understand it, without elevation data it can’t know if it’s a bridge you’ll pass under, a soda can in the road, or a stopped car - so just ignore it all.

Of course the vision system is supposed to compensate for this, and it performs poorly on objects it doesn’t see often, like emergency vehicles.

jiggawatts 2021-08-17 01:41:57 +0000 UTC [ - ]

The vision system is supposed to be able to determine an accurate depth map based on a combination of stereo vision and depth-from-defocus. I've seen demos of the real-time depth map, and it looks high-resolution and accurate to about 5-10cm.

So, if they have the input data, why is it being ignored by autopilot?
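
For reference, here is the basic stereo geometry with invented numbers (not Tesla's actual camera rig); it also shows how quickly accuracy falls off with range:

    def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
        # Classic pinhole stereo: depth = focal length * baseline / disparity.
        return focal_px * baseline_m / disparity_px

    # With an assumed 1000 px focal length and 0.2 m baseline, 5 px of disparity
    # means ~40 m of depth, and a single pixel of disparity error at that range
    # moves the estimate by ~10 m.
    print(stereo_depth_m(1000, 0.2, 5.0))  # 40.0
    print(stereo_depth_m(1000, 0.2, 4.0))  # 50.0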

harles 2021-08-17 03:53:38 +0000 UTC [ - ]

Tesla’s website[0] states it’s monocular depth estimation. I haven’t heard of them doing any form of stereo.

[0] https://www.tesla.com/autopilotAI

diggernet 2021-08-17 14:32:10 +0000 UTC [ - ]

Why should it matter how often it sees something? Or even if it's something the car has never seen before? All it should care about is whether there is an obstacle, not what the obstacle is. Whether it's an emergency vehicle, a sofa, a boulder, a canoe, a table saw, or a dolphin, you don't want to hit it!

harles 2021-08-17 14:39:06 +0000 UTC [ - ]

How often it’s seen in training data that is, which is pulled from data in the wild.

It’s simply not possible to do depth estimation like this without priors. That’s one of the serious limitations of such systems - you have to train on every class of object you don’t want to hit.
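
The scale ambiguity is easy to see from the pinhole model (illustrative numbers, not any real camera): a single image can't separate size from distance without a prior on how big the object is.

    def apparent_height_px(focal_px: float, real_height_m: float, depth_m: float) -> float:
        # Pinhole projection: pixel height = focal length * real height / depth.
        return focal_px * real_height_m / depth_m

    # A 1.5 m-tall car at 30 m and a 3.0 m-tall fire truck at 60 m both project
    # to 50 px with an assumed 1000 px focal length, so the network needs a
    # learned size prior for the object class to resolve the depth.
    print(apparent_height_px(1000, 1.5, 30.0))  # 50.0
    print(apparent_height_px(1000, 3.0, 60.0))  # 50.0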

diggernet 2021-08-17 18:15:47 +0000 UTC [ - ]

Then they are doing it wrong. There are all manner of things that can end up in the road that have never been (and will never be) classified. If their system must classify a thing to not hit the thing, then they will kill people. It's gross negligence to work so hard to not, at the very minimum, install two cameras for stereo vision.

harles 2021-08-17 22:22:57 +0000 UTC [ - ]

100% agree. I think depth is critical and monocular estimation doesn’t cut it.

elihu 2021-08-16 23:25:51 +0000 UTC [ - ]

Another aspect of unpredictability is that drivers are expected to be alert and vigilant while using ADAS features, but I get the impression that Tesla's implementation sometimes does things that are completely unexpected. Sometimes you might have to react immediately to something you didn't see coming, because you didn't expect the car to suddenly try to steer into a concrete pillar or something.

It's one thing to have to deal with inexplicable behavior from other cars, but to have to deal with inexplicable behavior from your own car seems quite a bit more unnerving.

riskable 2021-08-17 13:30:27 +0000 UTC [ - ]

I think the problem we're seeing here is that Tesla's autopilot system is on the cusp of a fully automated driving experience and that feels good enough to the driver. Yet it's not quite good enough, as we can see from the mistakes it has made.

Honestly, I see this as a necessary transition pain towards fully automated vehicles. No matter how you slice it there's going to be periods where fully automated driving systems aren't quite there yet but are good enough 97% of the time that human drivers let their guard down. It's going to take some sacrifices to get to fully autonomous driving.

The good news is that even with these accidents self-driving features are a bazillion times safer than human drivers. It sure seems like the occasional vehicle collision into stationary objects is going to throw a great big wrench into self-driving safety statistics but it isn't even a rounding error compared to the sheer number of accidents caused by human drivers.

pauljurczak 2021-08-17 03:45:10 +0000 UTC [ - ]

> to have to deal with inexplicable behavior from your own car seems quite a bit more unnerving.

And yet, tens of thousands of drivers are working as unpaid beta testers for Tesla. Mind-boggling.

tshaddox 2021-08-17 01:01:57 +0000 UTC [ - ]

> I feel like part of the problem with the kind of autopilot crashes you describe here is how inexplicable they are to humans.

I don't see why these are inexplicable to humans. It's certainly no more difficult to explain than, say, a (non-adaptive) cruise control in a car from 2000 doing the same thing.

> Whilst humans can be dangerous drivers, the incidents they cause generally have a narrative sequence of events that are comprehensible to us -- for instance, driver was distracted, or visibility was poor.

But that is arguably a sufficient explanation for these Tesla crashes as well. The driver being distracted or inattentive or unable to see clearly is a requirement for all of these Tesla crashes, as far as I know.

icelandicmoss 2021-08-17 01:21:20 +0000 UTC [ - ]

Perhaps 'unintuitive' is a better word to convey what I mean -- as in, there isn't an easily understandable (non-technical) narrative chain of events, there's just 'opaque box malfunctioned'. The cruise-control example you give feels a bit different, as CC doesn't claim to include automated collision avoidance, whereas something labelled 'autopilot' does.

kevin_thibedeau 2021-08-17 01:37:50 +0000 UTC [ - ]

It's perfectly explainable. You have a blind machine with an imperfect sensorium trying to describe an elephant. Correct identification is just getting lucky. The layers of ML improve the odds but can never achieve 100%. The whole scheme is playing dice with other people's safety.

postmeta 2021-08-16 20:58:12 +0000 UTC [ - ]

As this reddit post pointed out, this appears to be a common problem with radar TACC. https://www.reddit.com/r/teslamotors/comments/p5ekci/us_agen...

""" These events occur typically when a vehicle is partially in a lane and radar has to ignore a stationary object. This is pretty standard and inherent with TACC + radar.

The faster Tesla pushes the vision only stack to all cars after they’ve validated the data, the faster this topic becomes moot. Andrej Karpathy talks and shows examples of what that would do here. Minutes 23:00-28:00 https://youtu.be/a510m7s_SVI

Older examples from manuals of other TACC systems which use radar:

Volvo’s Pilot Assist regarding AEB/TACC.

According to Wired, Volvo’s Pilot Assist system is much the same. The vehicles’ manual explains that not only will the car fail to brake for a sudden stationary object, it may actually race toward it to regain its set speed:

“Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”

Cadillac Super Cruise - Page 252

Stationary or Very Slow-Moving Objects

ACC may not detect and react to stopped or slow-moving vehicles ahead of you. For example, the system may not brake for a vehicle it has never detected moving. This can occur in stop-and-go traffic or when a vehicle suddenly appears due to a vehicle ahead changing lanes. Your vehicle may not stop and could cause a crash. Use caution when using ACC. Your complete attention is always required while driving and you should be ready to take action and apply the brakes.

BMW Driving Assistant Plus - Page 124

A warning may not be issued when approaching a stationary or very slow-moving obstacle. You must react yourself; otherwise, there is the danger of an accident occurring.

If a vehicle ahead of you unexpectedly moves into another lane from behind a stopped vehicle, you yourself must react, as the system does not react to stopped vehicles. """

pkulak 2021-08-16 22:57:00 +0000 UTC [ - ]

I’ll believe it when I see it. From what I can tell, Tesla has made no progress at all in three years. I just drove my buddy’s 3, and it was still diving to the right when a lane merges and the line disappears. This drove me nuts when I test drove years ago. Other cars do lane keeping so much better than Tesla at this point.

btilly 2021-08-16 23:16:29 +0000 UTC [ - ]

These have different causes.

The problem with radar on the ground is that most of what comes back to a radar detector is reflections from a stationary world, with relative delays so small as to be undetectable. So the first step in processing is to filter out everything moving at the speed of that stationary world. All fixed objects therefore disappear, and you are left sorting out moving objects. Which means you now can't detect stationary objects at all.
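
A toy version of that filter (heavily simplified; real automotive radar pipelines work on Doppler velocity and are far more involved):

    def moving_targets(returns, ego_speed_mps, threshold_mps=2.0):
        # returns: list of (range_m, radial_velocity_mps) measured relative to the car.
        # The stationary world closes on us at roughly -ego_speed, so anything whose
        # radial velocity matches it gets discarded, including a stopped fire truck.
        return [r for r in returns if abs(r[1] + ego_speed_mps) > threshold_mps]

    # Driving at 30 m/s: a parked truck 30 m ahead shows up as (30.0, -30.0) and is
    # filtered out along with signs and guardrails; a slower-moving car is kept.
    print(moving_targets([(30.0, -30.0), (80.0, -5.0)], ego_speed_mps=30.0))  # [(80.0, -5.0)]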

Tesla has a different problem. They probably don't have depth perception. They therefore have to classify objects, and make educated guesses about where they are relative to the car. Unexpected kinds of objects, or objects in unexpected configurations, fail to be classified and therefore fail to be analyzed.

In principle, Tesla can succeed. After all we don't have binocular vision past 6 meters either. Tesla is improving.

But they haven't yet.

ajross 2021-08-16 22:17:37 +0000 UTC [ - ]

FWIW: Tesla AP is primarily vision based now. Newer cars in the US aren't even being fitted out with the radar units anymore (mine doesn't have it, for instance). So while this may in some sense be an unavoidable edge case for radar, it really shouldn't be for Tesla Autopilot.

It's worth checking out for sure. Not worth the headline bandwidth and flamage budget being spent on it.

sjg007 2021-08-16 21:05:11 +0000 UTC [ - ]

What about Subaru eyesight? I thought it did..

Loughla 2021-08-16 21:19:15 +0000 UTC [ - ]

Subaru's eyesight absolutely will stop you when you think you're about to hit something, regardless of whether or not that something was moving previously.

It's actually really annoying if you live in a rural area without clearly defined lanes, and large, stationary objects (tractors and whatnot) close to the road.

nzrf 2021-08-16 21:55:37 +0000 UTC [ - ]

I think I previously posted about this. It will also see exhaust rising on a cold winter day as an obstacle and brake unexpectedly at a light. It is literally the worst, and I wish it could be disabled by default.

Additionally, the backup sensor is a tad overzealous.

xattt 2021-08-16 23:36:26 +0000 UTC [ - ]

You list pages but fail to mention which models these are for. Do automakers issue manuals for advanced driving assistance systems?

bit_logic 2021-08-16 22:07:37 +0000 UTC [ - ]

I think we need to add a new level 2 assisted driving skills section to driving tests. Level 2 can be safer but it really requires understanding how level 2 works and limitations. For example, when I use level 2 (mine is Honda but applies to other level 2 as well since they mostly share the same vendor), these are the rules I follow:

- Car switching in/out of my lane, I manually take over

- Tight curve in the freeway, manually take over

- Very frequently check the dashboard indicator that shows if the sensors "sees" the car front or not

- Anything unusual like construction, cones, car on shoulder, manually take over

- Anything that looks difficult like weird merging lanes, manually take over

- Any bad weather or condition like sun directly in front, manual drive

- Frequently adjusting max speed setting on ACC. It's safer to not be too much above the prevailing speeds. Otherwise, if ACC suddenly becomes blind, it can accelerate dangerously as it tries to reach max set speed.

- I don't trust lane keep much, it's mostly a backup for my own steering and making my arms less tired turning the wheel

The key thing is to recognize just how dumb this technology is. It's not smart, it's not AI. It's just a bit above the old cruise control. With that mindset it can be used safely.

jumpkick 2021-08-16 22:34:27 +0000 UTC [ - ]

If level 2 requires such handholding, what’s the point? Seems to me like it just leads to a false sense of security, giving drivers the feeling that they can trust the self-driving system a lot more than they safely can.

throwaway0a5e 2021-08-16 23:12:50 +0000 UTC [ - ]

It's basically ultra advanced cruise control that can also handle stop and go traffic on a well marked road. That frees the driver up to dedicate more of the situational awareness budget to other things.

If you use it that way it's fine. If you expect it to be as smart as a student driver it's not fine.

tshaddox 2021-08-17 01:04:22 +0000 UTC [ - ]

The point is that it makes driving on the highway easier, just like automatic transmissions, power steering, etc. I'm sure early critics of all of these technologies have said things like "but if the driver doesn't need to spend mental and physical energy on this, they will stop paying attention to the road and will be less safe."

amanaplanacanal 2021-08-17 01:36:08 +0000 UTC [ - ]

You’re sure? Perhaps you could find where somebody actually said that.

tshaddox 2021-08-17 16:11:11 +0000 UTC [ - ]

I meant it sarcastically. Perhaps no one did say that at the time. My point was that it would obviously be ridiculous if they did.

LightG 2021-08-19 08:41:22 +0000 UTC [ - ]

100% ...

I'd rather at least get the pleasure of driving, rather than basically becoming the supervisor for my car. Quarterly performance reviews, checking on KPIs.

Autopilot and the like are absolutely not on my list of features I'm looking for when buying a new car. Cruise control? Handy. AP? A waste of (my) time.

tayo42 2021-08-16 22:14:23 +0000 UTC [ - ]

Seems like it's easier just to drive regularly. This sounds like very distracted driving.

ajross 2021-08-16 22:24:47 +0000 UTC [ - ]

> Level 2 can be safer but [...]

I think that requires more numerate analysis than you're giving though. The data from the story is a sample size of 11 crashes over three years (I think). If that's really the size of the effect, then your "but [...]" clause seems very suspect.

There are almost two million of these cars on the roads now. It seems extremely likely that the number of accidents prevented by AP dwarfs this effect, so arguing against it even by implication as you do here seems likely to be doing more harm than good.

That doesn't mean it's not worth investigating what seems like an identifiable edge case in the AP obstacle detection. But that's a bug fix, not an argument about "Level 2 Autonomy" in general.

sunshineforever 2021-08-16 22:32:29 +0000 UTC [ - ]

I think it's early to be adding stuff like that to government mandated driving tests when these cars are only theoretically available to the ever dwindling middle class and above. Unless my circumstances change there's no chance I'll be in one for at least 10-15 years.

mmcconnell1618 2021-08-16 22:18:41 +0000 UTC [ - ]

I wonder if some sort of standard display showing what the vehicle "sees" and is predicted to do will be regulated. For example, if the display shows the vehicle doesn't understand a firetruck parked half way in the lane or the tight curve on the freeway, at least the driver can validate on the display and have some time to react.

sunshineforever 2021-08-16 22:33:28 +0000 UTC [ - ]

I would strongly prefer a car with such features.

shrimpx 2021-08-16 22:42:21 +0000 UTC [ - ]

Elon keeps warning the world about the impending hyperintelligent AI that will retool human economies, politics, and religions, yet year after year his cobbled-together AI fails at basic object detection.

geekraver 2021-08-17 14:06:02 +0000 UTC [ - ]

AI is a bogus term. It was well-discredited in the 1980s and has gone through an ill-deserved rehabilitation. Deep neural nets are a step forward for some tasks like vision, but are still just scratching the surface of what really goes on in our brains when sense-making from visual data.

MisterTea 2021-08-17 14:20:03 +0000 UTC [ - ]

I'm split over this. On one hand, the AI is so incompetent that it could never take over the world and enslave humanity. On the other, it's so incompetent that it winds up killing humans anyway. Lose-lose.

president 2021-08-16 23:00:58 +0000 UTC [ - ]

It's all marketing and image projection.

RcouF1uZ4gsC 2021-08-16 23:42:34 +0000 UTC [ - ]

> That's the usual situation for first responders, who usually try to block as little of the road as possible but often don't have enough shoulder space.

Actually, from talking with friends who are first responders, many times they will park fire trucks, etc so that they are blocking enough of the road to protect the first responders and the victims. The last thing you want is to have another car come and crash into first responders or victims of the initial accident. That is why they will deliberately park the truck at an angle to protect the people.

FireBeyond 2021-08-17 03:51:33 +0000 UTC [ - ]

Exactly this. We'll place the heaviest apparatus 'upstream' of the event and in a way that 'encourages' cars away from the incident.

Larger departments or those dealing with busier freeways have even started re-purposing older engines with water ballasts and attenuators as 'blocker' engines.

asdff 2021-08-16 22:31:21 +0000 UTC [ - ]

I don't understand how it's even possible for these cars to be crashing. My car beeps like a missile is locked on when I am coming too close to an object I might hit. Just a simple sensor in the front. If my car can beep and give me enough time to slam on the brakes, why can't Tesla's do the same?

HALtheWise 2021-08-16 22:51:56 +0000 UTC [ - ]

You're probably referring to parking sensors, which are typically ultrasonic sensors mounted to the bumpers. Unfortunately, ultrasonics have both a limited range of ~20ft for practical uses, and more damningly, a relatively wide field of view with no ability to distinguish where in the field of view an object is. While 20ft range is more than enough to give you time to slam on the brakes in your garage, it's basically useless for high speed autonomy, except for some very limited blindspot-awareness type tasks.
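
Back-of-the-envelope numbers (assumed values) on why ~20 ft of range is hopeless at highway speed:

    range_m = 6.0         # ~20 ft ultrasonic detection range
    speed_mps = 31.0      # ~70 mph
    decel_mps2 = 7.0      # hard braking on dry pavement

    warning_time_s = range_m / speed_mps                    # ~0.19 s between detection and impact
    braking_distance_m = speed_mps ** 2 / (2 * decel_mps2)  # ~69 m needed just to stop
    print(round(warning_time_s, 2), round(braking_distance_m, 1))  # 0.19 68.6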

asdff 2021-08-17 05:40:32 +0000 UTC [ - ]

It's not a parking sensor and there isn't anything on the bumper as far as I can tell. It's Toyota's pre-collision system that uses a simple radar and a cheap camera. It even brakes for you if you don't.

quartesixte 2021-08-16 22:33:20 +0000 UTC [ - ]

Well for starters, they’re taking out the radars that other cars rely on to accomplish this.

shrimpx 2021-08-16 22:45:23 +0000 UTC [ - ]

In other parts of this thread, people are suggesting that these crashes are the radar's fault and deploying their vision-only system will fix the problem.

jazzyjackson 2021-08-17 05:08:00 +0000 UTC [ - ]

I don’t see how you can blame a radar for your vision system not being able to classify a fire truck (not to mention the unforgivable act of ignoring objects it can’t classify)

quartesixte 2021-08-17 07:58:10 +0000 UTC [ - ]

As the sibling post says, that's ridiculous at face value. All the radar does is check for collisions, because that's how a radar works... is there a non-intuitive answer here???

didntknowya 2021-08-17 04:21:58 +0000 UTC [ - ]

I'd say it's more just a fancy cruise control with obstacle detection than a true "self driving" system.

qweqwweqwe-90i 2021-08-16 20:54:33 +0000 UTC [ - ]

Yeah, let's ignore all the good and force everyone to go back to human drivers that are 10x worse.

kube-system 2021-08-16 21:08:11 +0000 UTC [ - ]

Tesla does not offer a vehicle that does not require human drivers (not even momentarily). Tesla's autopilot and FSD systems are both SAE Level 2, which means the human is still in operation of the vehicle at all times. All of Tesla's driving assistance technologies require a human to monitor operation of the vehicle and intervene if necessary. The fact that they have given anyone an impression otherwise is problematic.

https://www.sae.org/binaries/content/gallery/cm/articles/pre...

ajross 2021-08-16 22:20:10 +0000 UTC [ - ]

> The fact that they have given anyone an impression otherwise is problematic.

Good grief. This meme will not die. The car literally tells you to keep your hands on the wheel every time you engage autopilot, yells at you if you don't, will lock you out of the system as punishment if you don't comply, and if you really seem disabled will bring the car to a stop and turn the hazards on. It simply will not operate without an attentive driver, or at the very least one spending considerable energy at defeating the attention nags.

There are exactly zero Tesla drivers in the world who don't know these rules. Just stop with the nonsense. Please.

FireBeyond 2021-08-17 03:52:40 +0000 UTC [ - ]

And there are Tesla materials that say, word for word, "The driver is only there for legal purposes. The car is driving itself."

It only yells at you now because Tesla had to be forced to make it do so. Previously it'd let you go for a quarter of an hour before checking in on you.

Good grief yourself.

vkou 2021-08-16 22:23:15 +0000 UTC [ - ]

> There are exactly zero Tesla drivers in the world who don't know these rules. Just stop with the nonsense. Please.

Tesla's marketing also knows that there are exactly zero drivers in the world who follow those rules, but that doesn't stop them from overselling the capabilities of what they ship.

ajross 2021-08-16 22:26:46 +0000 UTC [ - ]

Stop it. Please. Again, there are no Tesla drivers who have been misled about the capabilities of the system. The people who have been misled are folks like you who read arguments on the internet and don't drive these cars. Go try one and see how the system works. It doesn't permit the kind of confusion that everyone constantly assumes. It just doesn't.

kube-system 2021-08-17 05:03:24 +0000 UTC [ - ]

There are many instances of people defeating the lockout system. Social media is full of these types of demonstrations. Plenty have the attitude that it is okay to do this. Some have died while showing it off.

ajross 2021-08-17 05:06:32 +0000 UTC [ - ]

"Defeating the lockout system" wasn't the discussion at hand. The contention upthread is that Tesla drivers did not know that they needed drive the car.

kube-system 2021-08-17 05:43:32 +0000 UTC [ - ]

The widespread defeat demonstrations are evidence of the pop-culture misunderstanding of the situation and owners’ willingness to:

1. concede to peer pressure and/or

2. doubt of the validity or seriousness of those warnings/lockouts

kube-system 2021-08-17 04:50:40 +0000 UTC [ - ]

Because the effectiveness of those warnings is diminished by mixed messaging and peer pressure from non-owner passengers.

There are plenty of people who have been convinced that those safety features/warnings are “just there for lawyers” and have attached items to the wheel to defeat the safety lockouts in order to show off their “self driving car” to their friends.

fxtentacle 2021-08-17 09:59:49 +0000 UTC [ - ]

Here's #1 out of your zero drivers ;)

"Tesla driver slept as car was going over 80 mph on Autopilot, Wisconsin officials say"

0x000000001 2021-08-17 20:11:07 +0000 UTC [ - ]

And that driver was cheating the attention nags

evanextreme 2021-08-16 20:58:05 +0000 UTC [ - ]

No one is saying this should be the case, just that the feature is not what the company advertises (the ability for the car to fully drive itself) and that said feature is further away from completion than many might lead you to believe. As someone who drives a Tesla with Autopilot, I agree with this. Autopilot is the best lane assistance system I've used, but that's all it is.

paxys 2021-08-16 21:51:26 +0000 UTC [ - ]

Less than 100 first responders are hit annually in the USA. The fact that just Tesla has managed to hit 11 since 2018 makes it pretty clear that human drivers are not "10x worse" than Tesla's tech, but quite the opposite.

qweqwweqwe-90i 2021-08-16 22:22:20 +0000 UTC [ - ]

You are comparing two different things. 11 responder vehicles crashes is not comparable to how many first responders (people) are hit.

breakfastduck 2021-08-16 20:59:31 +0000 UTC [ - ]

Go back? Human drivers are the norm, not the exception. This argument is tiresome beyond belief.

But no, just keep disregarding the clearly significant issues and mis-marketing, because progress, right?

thebruce87m 2021-08-16 22:27:28 +0000 UTC [ - ]

There is no independent data to support your 10X claim. Tesla marketing or an Elon tweet doesn’t count.

sillystuff 2021-08-17 00:48:20 +0000 UTC [ - ]

I was curious and went looking for corroboration / refutation of the 10x you stated. Tesla and Musk both have cited 10x safer (but I didn't see any independent confirmation). But, there was a critical analysis by Forbes that came to a different conclusion. Per Forbes, "[with autopilot] it looks like a Tesla is slightly less safe. But not a lot less safe."

https://www.forbes.com/sites/bradtempleton/2020/07/28/teslas...

clifdweller 2021-08-16 21:07:03 +0000 UTC [ - ]

The point isn't to ignore it, but to make sure everyone using it is 100% aware they need to be paying attention, as this is a common edge case that the system can't handle, so drivers are still responsible for taking over.

jacquesm 2021-08-16 20:56:27 +0000 UTC [ - ]

Can you please stop repeating this tripe, it's been debunked over-and-over again, and it is really getting tiring.

gundmc 2021-08-16 12:57:50 +0000 UTC [ - ]

This is why I believe the approach of incremental improvement towards full self driving is fundamentally flawed. These advanced driver assist tools are good enough to lull users into a false sense of security. No amount of "but our terms and conditions say you need to always pay attention!" will overcome human nature building that trust and dependence.

Robotbeat 2021-08-16 13:21:18 +0000 UTC [ - ]

I actually disagree. (And before you respond, please read my post because it’s not a trivial point.)

The fact that a huge formal investigation happened with just a single casualty is proof that it may actually be superior for safety in the long-term (when combined with feedback from regulators and government investigators). One death in conventional vehicles is irrelevant. But because of the high profile of Tesla's technology, it garners a bunch of attention from the public and therefore regulators. This is PRECISELY the dynamic that led to the ridiculously safe airline record. The safer it is, the more that rare deaths will be investigated and the causes sussed out and fixed by industry and regulators together.

Perhaps industry/Tesla/whoever hates the regulators and investigations. But I think they are precisely what will cause self driving to become ever safer, and eventually become as safe as industry/Tesla claims, safer than human drivers while also being cheap and ubiquitous. Just like airline travel today. A remarkable combination of safety and affordability.

This might be the only way to ever do it. I don’t think the airline industry could’ve ever gotten to current levels of safety by testing everything on closed airfields and over empty land for hundreds of millions of flight hours before they had sufficient statistics to be equal to today.

It can’t happen without regulators and enforcement, either.

sandworm101 2021-08-16 13:37:11 +0000 UTC [ - ]

Then why not flip the scheme? Instead of having the human as backup to the machine, make the machine back up the human. Let the human do all the driving and have the robot jump in whenever the human makes a mistake. Telemetry can then record all the situations where the human and the machine disagreed. That should provide all the necessary data, with the benefit of the robot perhaps preventing many accidents.

Of course this is impossible in the real world. Nobody is going to buy a car that will randomly make its own decisions, that will pull the wheel from your hands every time it thinks you are making an illegal lane change. Want safety? How about a Tesla that is electronically incapable of speeding. Good luck selling that one.

alistairSH 2021-08-16 13:54:52 +0000 UTC [ - ]

> Nobody is going to buy a car that will randomly make its own decisions, that will pull the wheel from your hands every time it thinks you are making an illegal lane change.

That's almost exactly what my Honda does. Illegal (no signal) lane change results in a steering wheel shaker (and optional audio alert). And the car, when sensing an abrupt swerve which is interpreted as the vehicle leaving the roadway, attempts to correct that via steering and brake inputs.

But, I agree with your more general point - the human still needs to be primary. My Honda doesn't allow me to remove my hands from the steering wheel for more than a second or two. Tesla should be doing the same, as no current "autopilot" system is truly automatic.

ohazi 2021-08-16 19:39:00 +0000 UTC [ - ]

> That's almost exactly what my Honda does. Illegal (no signal) lane change results in a steering wheel shaker (and optional audio alert). And the car, when sensing an abrupt swerve which is interpreted as the vehicle leaving the roadway, attempts to correct that via steering and brake inputs.

By the way, this is fucking terrifying when you first encounter it in a rental car on a dark road with poor lane markings while just trying to get to your hotel after a five hour flight.

I didn't encounter an obvious wheel shaker, but this psychotic car was just yanking the wheel in different directions as I was trying to merge onto a highway.

Must be what a malfunctioning MCAS felt like in a 737 MAX, but thankfully without the hundreds of pounds of hydraulic force.

peeters 2021-08-16 14:11:08 +0000 UTC [ - ]

> Illegal (no signal) lane change results in a steering wheel shaker (and optional audio alert).

To be clear, tying the warning to the signal isn't about preventing unsignaled lane changes, it's gauging driver intent (i.e. is he asleep and drifting or just trying to change lanes). It's just gravy that it will train bad drivers to use their signals properly.

sandworm101 2021-08-16 14:46:09 +0000 UTC [ - ]

Is a lane change without signal always illegal? I know that it almost certainly makes you liable for any resulting accident, but I'm not sure that it is universally illegal.

peeters 2021-08-16 15:28:37 +0000 UTC [ - ]

This is technically true in Ontario (TIL).

> 142 (1) The driver or operator of a vehicle upon a highway before turning (...) from one lane for traffic to another lane for traffic (...) shall first see that the movement can be made in safety, and if the operation of any other vehicle may be affected by the movement shall give a signal plainly visible to the driver or operator of the other vehicle of the intention to make the movement. R.S.O. 1990, c. H.8, s. 142 (1).

That said there's zero cost to doing so regardless of whether other drivers are affected.

https://www.ontario.ca/laws/statute/90h08#BK243

sandworm101 2021-08-16 20:51:53 +0000 UTC [ - ]

That's the sort of law I remember. It is considered a failure to communicate your intention rather than a violation per se in every circumstance.

peeters 2021-08-17 02:13:00 +0000 UTC [ - ]

It could be one of those "if a tree falls in the forest" scenarios. If a cop is near enough to see you not signal, he could easily argue that he himself might have been affected by the turn or lane change.

dragonwriter 2021-08-16 15:29:28 +0000 UTC [ - ]

> Is a lane change without signal always illegal? I know that it almost certainly make you liable for any resulting accident

Usually, it makes you liable because it is illegal. CA law, for instance, requires signalling 100ft before a lane change or turn.

hermitdev 2021-08-16 16:35:41 +0000 UTC [ - ]

Yes, failure to signal is a traffic violation. At least everywhere I've lived/traveled in the US. It's also a rather convenient excuse for police to "randomly" pull you over (I've been pulled over by Chicago PD for not signaling for a lane change, despite actually having done so).

alistairSH 2021-08-16 14:55:39 +0000 UTC [ - ]

I have no idea, but the point wasn't so much that the lane change is illegal, but that lack of signal is used to indicate lack of driver attention. I shouldn't have used "illegal" in my original post.

alistairSH 2021-08-16 14:24:42 +0000 UTC [ - ]

Correct. It's not (primarily) a training thing, but used to ensure the driver is driving and not sleeping/watching movies/whatever.

nthj 2021-08-16 21:36:23 +0000 UTC [ - ]

Just to add, I have a 2021 Honda, and disabling this functionality is a 1-button-press toggle on the dash to the left of the steering wheel. Not mandatory.

Robotbeat 2021-08-16 14:09:51 +0000 UTC [ - ]

Tesla's system also requires drivers to have their hands on the steering wheel and occasionally provide torque input.

alistairSH 2021-08-16 14:22:47 +0000 UTC [ - ]

Interesting, I assumed it didn't, given the prevalence of stories about driver watching movies on their phones. I guess they just leave one hand lightly on the wheel, but are still able to be ~100% disengaged from driving the car.

jeofken 2021-08-16 19:07:18 +0000 UTC [ - ]

On most or all roads below 100km/h autopilot won’t allow speeding, and therefore I drive at the limit, which I know I would not have done if I controlled it. It also stays in the lane better than I do, keeps distance better, and more. Sometimes it’s wonky when the street lines are unclear. It’s not perfect but a better driver than I am in 80% of cases.

My insurance company gives a lower rate if you buy the full autopilot option, and that to me indicates they agree it drives better than I, or other humans, do.

Johnny555 2021-08-16 19:18:55 +0000 UTC [ - ]

> On most or all roads below 100km/h autopilot won't allow speeding, and therefore I drive at the limit, which I know I would not have done if I controlled it

If following the speed limit makes cars safer, another way to achieve that without autopilot is to just have all cars limit their speed to the speed limit.

> Sometimes it's wonky when the street lines are unclear. It's not perfect but a better driver than I am in 80% of cases

The problem is in those 20% of cases where you'd lulled into boredom by autopilot as you concentrate on designing your next project in your head, then suddenly autopilot says "I lost track of where the road is, here you do it!" and you have to quickly gain context and figure out what the right thing to do is.

Some autopilot systems use eye tracking to make sure that the driver is at least looking at the road, but that doesn't guarantee that he's paying attention. But at least that's harder to defeat than Tesla's "nudge the steering wheel once in a while" method.

beambot 2021-08-16 19:46:48 +0000 UTC [ - ]

> just have all cars limit their speed to the speed limit.

The devil is in the details... GPS may not provide sufficient resolution. Construction zones. School zones with variable hours. Tunnels. Adverse road conditions. Changes to the underlying roads. Different classes of vehicles. Etc.

By the time you account for all the mapping and/or perception, you could've just improved the autonomous driving and eliminated the biggest source of driving errors: the human.

freeone3000 2021-08-16 20:42:15 +0000 UTC [ - ]

The single system you're describing, with all of its complexity, is a subset of what is required for autonomous vehicles. We will continue to have road construction, tunnels, and weather long past the last human driver. Improving the system here simply improves the system here -- you cannot forsake this work by saying "oh the autonomous system will solve it" -- this is part of the autonomous system.

Johnny555 2021-08-16 19:57:53 +0000 UTC [ - ]

But you can still impose a max speed limit based on available data to cover most normal driving conditions but it's still on the driver to drive slower if appropriate. And that could be implemented today, not a decade from now when autonomous driving is trustable.

The parent post said that autopilot won't let him go over the speed limit and implies that makes him safer. My point is that you don't need full autopilot for that.

So this is not a technical problem at all, but a political one. As the past year has shown, people won't put up with any inconvenience or restriction, even if it could save lives (not even if it could save thousands of lives).

asdff 2021-08-16 22:42:23 +0000 UTC [ - ]

GPS is extremely accurate, honestly. My Garmin adjusts itself the very instant I cross a speed limit sign to a new speed, somehow. Maybe they have good metadata, but it's all public anyway under some department of transportation domain and probably not hard to mine with the price of compute these days. Even just setting a top speed in residential areas of like 35mph would be good and save a lot of lives that are lost when pedestrians meet cars traveling at 50mph. A freeway presents a good opportunity to add sensors to the limited on and off ramps for the car to detect that it's on a freeway. Many freeways already have some sort of sensor based system for charging fees.

What would be even easier than all of that, though, is just installing speeding cameras and mailing tickets.

watt 2021-08-16 20:56:28 +0000 UTC [ - ]

Just add all those to the map system. It could be made incredibly accurate, if construction companies are able to actually submit their work zones and "geofence" them off on the map.

jeofken 2021-08-16 22:37:21 +0000 UTC [ - ]

During the 3 years I’ve owned it there are 2 places where lines are wonky and I know to take over.

I have not yet struggled to stay alert when it drives me, and it has driven better than I would have - so it certainly is an improvement over me driving 100% of the time. It does not have road rage and it does not enjoy the feeling of speeding, like I do when I drive, nor does it feel like driving is a competition, like I must admit I do when I am hungry, stressed, or tired.

> just have all cars limit their speed to the speed limit

No way I’d buy a car that does not accelerate when I hit the pedal. Would you buy a machine that is not your servant?

phyzome 2021-08-17 00:19:19 +0000 UTC [ - ]

> Would you buy a machine that is not your servant?

I mean... that's an odd thing for someone to say who has bought a vehicle with over-the-air firmware updates.

jeofken 2021-08-17 11:01:53 +0000 UTC [ - ]

Very true! And I’m typing this on an iPhone…

I wish I could hack my car, but also wouldn't trust others if they did.

tablespoon 2021-08-16 19:43:19 +0000 UTC [ - ]

> Of course this is impossible in the real world. Nobody is going to buy a car that will randomly make its own decisions, that will pull the wheel from your hands ever time it thinks you are making an illegal lane change.

Yeah, add to that the unreliability of Tesla's system means that it cannot pull the wheel from the driver, because it's not unusual for it to want to do something dangerous and need to be stopped. You don't want it to "fix" a mistake by driving someone into the median divider.

wizzwizz4 2021-08-16 13:47:43 +0000 UTC [ - ]

> Let the human do all the driving and have the robot jump in whenever the human makes a mistake.

Because when the human disagrees with the machine, the machine is usually the one making a mistake. It might prevent accidents, but it would also cause them, and you lose predictability in the process (you have to model the human and the machine).

foobiekr 2021-08-16 20:54:52 +0000 UTC [ - ]

I don't know how anyone can look at the types of accidents Tesla is having and conclude that it should override the human driver.

CrazyStat 2021-08-16 14:01:49 +0000 UTC [ - ]

> Want safety? How about a Tesla that is electronically incapable of speeding.

That would be unsafe in many situations. If the flow of traffic is substantially above the speed limit--which it often is--being unable to match it increases the risk of accident. This is known as the Solomon curve [1].

[1] https://en.wikipedia.org/wiki/Solomon_curve

treesknees 2021-08-16 14:35:44 +0000 UTC [ - ]

> Subsequent research suggests significant biases in the Solomon study, which may cast doubt on its findings

With the logic presented in the theoretical foundation section, it seems that the safer move would actually be to slow down and match the speed of all the trucks and other large vehicles... which won't happen.

Matching speed sounds great, except there are always people willing to go faster and faster. In my state they raised the speed limit from 70 to 75, it just means more people are going 85-90. How is that safer?

filoleg 2021-08-16 15:26:57 +0000 UTC [ - ]

To address your last paragraph, everyone going 85-90 is less safe than everyone going 70-75, you are correct.

However, you individually going 70-75 when everyone else is going 85-90 is less safe than you going 85-90 like everyone else in the exact same situation.

>there are always people willing to go faster and faster

That’s why no one says “go as fast as the fastest vehicle you see”, it is “go with the general speed of traffic”. That’s an exercise for human judgement to figure that one out, which is why imo it isn’t a smart idea to have the car automatically lock you out of overriding the speed limit.

FireBeyond 2021-08-17 03:57:15 +0000 UTC [ - ]

> However, you individually going 70-75 when everyone else is going 85-90 is less safe than you going 85-90 like everyone else in the exact same situation.

And yet the roads are full of vehicles literally incapable of going 85. Many trucks cannot do more than 69mph.

kiba 2021-08-16 18:54:10 +0000 UTC [ - ]

People are going faster because they feel it's safer, not because of the speed limit. You can design roads that cause humans to slow down and be more careful.

CamperBob2 2021-08-16 19:31:28 +0000 UTC [ - ]

Or you can just set the speed limit appropriately, in accordance with sound engineering principles. A radical notion, I guess.

MichaelGroves 2021-08-16 14:50:49 +0000 UTC [ - ]

A self driving car obviously needs to be aware of other cars on the road. I don't see any reason why the car couldn't observe other cars, see what speed they are going at, and refuse to go faster than the rest. A car that refuses to do 120mph when all the other cars are doing 60mph in a 50mph zone should be trivial.

(Trivial if the self driving tech works at all....)
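
A toy sketch of that rule, with a made-up tolerance, just to show it needs nothing more than the speeds the car is already tracking:

    import statistics

    def max_allowed_speed_mps(posted_limit_mps: float, nearby_speeds_mps: list[float]) -> float:
        # Never outrun prevailing traffic, and never exceed the posted limit
        # by more than a small assumed margin even if everyone else does.
        margin_mps = 2.0
        if not nearby_speeds_mps:
            return posted_limit_mps
        prevailing_mps = statistics.median(nearby_speeds_mps)
        return min(prevailing_mps, posted_limit_mps + margin_mps)

    # Traffic doing ~27 m/s (60 mph) in a 22 m/s (50 mph) zone: cap at 24 m/s,
    # and certainly never 54 m/s (120 mph).
    print(max_allowed_speed_mps(22.0, [26.0, 27.0, 28.0]))  # 24.0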

TheCapn 2021-08-16 20:59:19 +0000 UTC [ - ]

You're getting downvoted for this comment apparently, but I'm still of the firm belief that we will never see full autonomous driving without some sort of P2P network among cars/infrastructure.

There's just too much shit that can't be "seen" with a camera/sensor in congested traffic. Having a swarm of vehicles all gathering/sharing data is one of the only true ways forward IMO.

sandworm101 2021-08-16 14:32:44 +0000 UTC [ - ]

Ok. Electronically incapable of driving more than 10% faster than other traffic.

rad_gruchalski 2021-08-16 16:29:45 +0000 UTC [ - ]

How do you know which traffic is "the other traffic"? So basically - there's no top limit.

cactus2093 2021-08-16 19:54:26 +0000 UTC [ - ]

> How about a Tesla that is electronically incapable of speeding. Good luck selling that one.

Instead they did the exact opposite with the plaid mode model S, lol. It kind of works against their claims that they prioritize safety when their hottest new car - fully intended for public roads - has as its main selling point the ability to accelerate from 60-120 mph faster than any other car.

Someone 2021-08-16 13:57:10 +0000 UTC [ - ]

I think that’s the approach many car manufacturers have been on for decades.

As a simple example, ABS (https://en.wikipedia.org/wiki/Anti-lock_braking_system) only interferes with what the driver does when an error occurs.

More related to self-driving, there’s various variants of https://en.wikipedia.org/wiki/Lane_departure_warning_system that do take control of the car.

And it is far from “incapable of speeding”, but BMW, Audi and Mercedes-Benz “voluntarily” and sort-of limit the speed of their cars to 250km/hour (https://www.autoevolution.com/news/gentlemens-agreement-not-...)

FridayoLeary 2021-08-16 13:53:51 +0000 UTC [ - ]

>Let the human do all the driving and have the robot jump in whenever the human makes a mistake.

Nothing more annoying than a car that thinks I don't know how to drive (warning beeps etc.).

gambiting 2021-08-16 19:12:50 +0000 UTC [ - ]

I keep saying the same thing actually whenever people say that manual driving will be outlawed. Like, no, it won't be - because the computers will still save you in most cases either way, autopilot enabled or not.

>>How about a Tesla that is electronically incapable of speeding. Good luck selling that one.

From 2022 all cars sold in the EU have to have an electronic limiter that keeps you to the posted speed limit (by cutting power if you are already going faster) - the regulation does allow the system to be temporarily disabled however.

ggreer 2021-08-16 21:28:13 +0000 UTC [ - ]

Your summary is incorrect. The ETSC recommends that Intelligent Speed Assistance should be able to be overridden.[1] It's supposed to not accelerate as much if you're exceeding the speed limit, and if you override by pressing the accelerator harder, it should show some warning messages and make an annoying sound. It's stupid, but it doesn't actually limit the speed of your car.

I think it's a silly law and I'm very glad I don't live in a place that requires such annoyances, but it's not as bad as you're claiming.

1. https://etsc.eu/briefing-intelligent-speed-assistance-isa/

Symbiote 2021-08-16 21:57:51 +0000 UTC [ - ]

I hired a new car with Intelligent Speed Assistance this summer, though it was set (and I left it) just to "ping" rather than do any limiting. I drove it to a fairly unusual place, though still in Europe and with standard European signs. It did not have a GPS map of the area.

It could reliably recognize the speed limit signs (red circle), but it never recognized the similar grey-slash end-of-limit signs. It also didn't recognize the start-of-town or end-of-town signs, so it didn't do anything about the limits they implied.

I would certainly have had to disable it, had it been reducing the acceleration in the way that document describes.

9935c101ab17a66 2021-08-18 04:49:09 +0000 UTC [ - ]

What? Cars with collision detection systems already exist, and they can handle both side on and head on collision avoidance when a human is driving.

People literally are buying cars that “make their own decisions”. Importantly though, these systems only activate in the case of an imminent collision IF the corrective measure won’t cause another collision.

> that will pull the wheel …

Yah, of course no one is going to buy a car that does what you describe, because what you describe is insane and inherently unsafe. Unless a collision is imminent, nothing happens.

velcii 2021-08-16 13:45:47 +0000 UTC [ - ]

>Let the human do all the driving and have the robot jump in whenever the human makes a mistake.

I really don't think that would give many data points, because all of the instances would be when a human fell asleep or wasn't paying attention.

dboreham 2021-08-16 13:44:51 +0000 UTC [ - ]

This line of thinking is flawed because it assumes a smooth surface over the safety space, where if you make incremental improvements you will head towards some maxima of safety. e.g. : the wing fell off; investigate; find that you can't use brittle aluminum; tell aircraft manf. to use a more ductile alloy. Self driving technology isn't like that -- you can't just file a bug "don't mistake a human for a plastic bag", fix that bug and move on to the next one. No number of incremental fixes will make self driving that works as any reasonable human would expect it to work.

vntok 2021-08-16 13:30:26 +0000 UTC [ - ]

This argument is flawed, because when regulators investigate a Tesla crash, Waymo doesn't care the slightest. The technologies (emphasis on having skeuomorphic cameras vs a lidar), approaches (emphasis on generating as many situations as possible in simulated worlds and carefully transitioning to the business case vs testing as early as possible in the real world with background data capture) and results are so different between the actors in this specific industry that one's flaws being fixed or improved won't necessarily translate into others benefitting from it.

Conversely, when Waymo iterates and improves their own safety ratios by a significant amount, that evidently does not result in Tesla's improving in return.

unionpivo 2021-08-16 13:41:49 +0000 UTC [ - ]

Well, when regulators investigate Boeing, Airbus probably doesn't care either.

Until it leads to something systemic, which the regulator then mandates for all vehicles.

vntok 2021-08-16 19:18:09 +0000 UTC [ - ]

Boeing and Airbus operate largely in the same direction, with similar technical solutions to similar problems.

Not the case at all between lidars and cameras.

treeman79 2021-08-16 13:28:01 +0000 UTC [ - ]

Ignoring other issues.

Asking someone to pay attention when they are not doing anything is unrealistic. I would be constantly bored / distracted. My wife would instantly fall asleep. Etc etc.

3pt14159 2021-08-16 13:25:20 +0000 UTC [ - ]

I largely agree with you, but I just wish regulators would start by only allowing these assist programs for people that are already known to be poor drivers. The elderly and convicted drunk drivers, for example. That way we could have the best of both worlds.

lastofthemojito 2021-08-16 13:31:40 +0000 UTC [ - ]

Incentivizing drunk driving seems dangerous.

joshgrib 2021-08-16 13:38:44 +0000 UTC [ - ]

Requiring people to buy a special car/system to be able to drive doesn't seem like an incentive - it seems similar to the interlock system we currently require drunk drivers to purchase to be able to drive.

If anything a driver monitoring system seems even better than the interlock system, for example you couldn't have your kids/friends blow for you to bypass it.

unionpivo 2021-08-16 13:40:07 +0000 UTC [ - ]

I disagree. I would not put people who showed poor judgment in a situation where they can further hurt others or themselves. People like that are more likely not to pay attention and do other irresponsible things.

Go with safest drivers first.

Seanambers 2021-08-16 13:40:08 +0000 UTC [ - ]

Tesla's Autopilot system is almost 10X safer than the average human driver already, based on the latest 2021 Q1 numbers.

https://www.tesla.com/en_CA/VehicleSafetyReport

ra7 2021-08-16 13:53:34 +0000 UTC [ - ]

Tesla's safety report lacks data and is extremely misleading.

1. Autopilot only works on (or is intended to work on) highways. But they are comparing their highway record to all accident records, including city driving, where the accident rate is far higher than highway driving.

2. They're also comparing with every vehicle in the United States including millions of older vehicles. Modern vehicles are built for higher safety and have a ton of active safety features (emergency braking, collision prevention etc). Older vehicles are much more prone to accidents and that skews the numbers.

The reality is Teslas are no safer than any other vehicles in their class ($40k+). Their safety report is purely marketing spin.

deegles 2021-08-16 14:49:52 +0000 UTC [ - ]

They also include miles driven by previous versions of their software in the “safe miles driven” tally. There’s no guarantee any improvement would not have resulted in more accidents. They should reset the counter on every release.
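
If the fleet data were tagged by software version, the per-release breakdown would be easy to compute; a toy sketch with made-up numbers:

    # Hypothetical per-release tallies; real values would come from fleet telemetry.
    releases = {
        "2021.4.12":  {"miles": 800_000_000, "accidents": 190},
        "2021.12.25": {"miles": 650_000_000, "accidents": 140},
    }
    for version, r in releases.items():
        rate = r["accidents"] / r["miles"] * 1_000_000
        print(f"{version}: {rate:.2f} accidents per million miles")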

wilg 2021-08-16 20:03:56 +0000 UTC [ - ]

> The reality is Teslas are no safer than any other vehicles in its class ($40k+).

Would another way of saying this be that they are as safe as other vehicles in that class? And that therefore Autopilot is not more unsafe than driving those other cars?

pandaman 2021-08-16 23:49:58 +0000 UTC [ - ]

Do you know many vehicles $40K+ that don't have BLIS and rear/front cross-traffic alerts? While a radar-based blind spot alert (one that warns if a car behind is moving too fast to safely merge) is probably irrelevant for city driving, cross-traffic alerts are extremely useful when pulling out of a driveway obstructed by parked cars. I have personally seen several accidents just on my street that could have been prevented with cross-traffic detection. I think the expensive models (S/X) still have the front radar so they may have the front alert, but I don't think any model ever had the rear radar for the rear cross-traffic alert.

ra7 2021-08-16 20:23:37 +0000 UTC [ - ]

I would probably agree, but I also think it’s a case of “need more data”.

We should really compare Autopilot with its competitors like GM’s Super Cruise or Ford’s Blue Cruise, both of which offer more capabilities than Autopilot. That will show if Tesla’s driver assist system is more or less safe than their competitors product.

ggreer 2021-08-16 22:07:48 +0000 UTC [ - ]

What capabilities does GM or Ford have that Tesla doesn't? Neither GM nor Ford have rolled out automatic lane changing. Teslas have been doing that since 2019.

The reason GM's Super Cruise got a higher rating by Consumer Reports was because CR didn't even test the capabilities that only Tesla had (such as automatic lane change and taking offramps/onramps). Also, the majority of the evaluation criteria weren't about capabilities. eg: "unresponsive driver", "clear when safe to use", and "keeping the driver engaged".[1]

1. https://www.consumerreports.org/car-safety/cadillac-super-cr...

ben_w 2021-08-16 13:47:15 +0000 UTC [ - ]

> In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

I think the comparison should be Tesla with/without AI, not Tesla/not-Tesla; so roughly either x2 or x4 depending on what the other active safety features do.

It’s not nothing, but it’s much less than the current sales pitch — and the current sales pitch is itself the problem here, for many legislators.
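
Taking the quoted figures at face value, the back-of-the-envelope ratios are:

    # Miles per reported accident, straight from the quote above.
    autopilot          = 4_190_000
    active_safety_only = 2_050_000
    no_assist          =   978_000
    nhtsa_average      =   484_000

    print(autopilot / active_safety_only)  # ~2.0x vs Tesla with active safety only
    print(autopilot / no_assist)           # ~4.3x vs Tesla with no assist features
    print(autopilot / nhtsa_average)       # ~8.7x vs the NHTSA all-vehicle average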

Ajedi32 2021-08-16 14:03:22 +0000 UTC [ - ]

> we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged [...] for those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven

This still isn't the correct comparison. Major selection bias with comparing miles with autopilot engaged to miles without it engaged, since autopilot cannot be engaged in all situations.

A better test would be to compare accidents in Tesla vehicles with the autopilot feature enabled (engaged or not) to accidents in Tesla vehicles with the autopilot feature disabled.
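
Roughly, the proposed comparison groups cars by whether the feature exists on the car at all, rather than by whether it happened to be engaged; a sketch with hypothetical per-vehicle records:

    # Hypothetical records: one per vehicle, with total miles and accident count.
    vehicles = [
        {"autopilot_enabled": True,  "miles": 12_000, "accidents": 0},
        {"autopilot_enabled": False, "miles":  9_500, "accidents": 1},
        # ... one entry per car in the fleet
    ]

    def accidents_per_million_miles(group):
        miles = sum(v["miles"] for v in group)
        accidents = sum(v["accidents"] for v in group)
        return accidents / miles * 1_000_000

    with_ap    = [v for v in vehicles if v["autopilot_enabled"]]
    without_ap = [v for v in vehicles if not v["autopilot_enabled"]]
    print(accidents_per_million_miles(with_ap), accidents_per_million_miles(without_ap))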

HPsquared 2021-08-16 20:05:00 +0000 UTC [ - ]

Even then, there's selection: people who do a lot of highway driving are more likely to opt for Autopilot than those who mostly drive in the city.

bluGill 2021-08-16 14:16:14 +0000 UTC [ - ]

As was stated elsewhere, most accidents happen in city driving, where Autopilot cannot be activated, so the with/without AI comparison is meaningless. We need to figure out when the AI could have been activated but wasn't; if you do that, then you are correct.

Robotbeat 2021-08-16 14:24:40 +0000 UTC [ - ]

On the contrary to your overall point: The fatal crash rate per miles driven is almost 2 times higher in rural areas than urban areas. Urban areas may have more accidents, but the speeds are likely lower (fender benders).

https://www.iihs.org/topics/fatality-statistics/detail/urban...

freshpots 2021-08-16 21:08:46 +0000 UTC [ - ]

"..most accidents happen in city driving where autopilot cannot be activated so the with."

Yes it can. The only time it can't be activated is if there is no clearly marked center line.

throwaway0a5e 2021-08-16 13:45:02 +0000 UTC [ - ]

That impressive claim narrows to approximately the noise floor if you compare to comparable drivers in comparable cars.

Ajedi32 2021-08-16 13:47:59 +0000 UTC [ - ]

Tesla's vehicles are almost 10X safer than the average vehicle. Whether their autopilot system is contributing positively or negatively to that safety record is unclear.

The real test of this would be: of all Tesla vehicles, are the ones with autopilot enabled statistically safer or less safe than the ones without autopilot enabled?

FireBeyond 2021-08-17 03:59:44 +0000 UTC [ - ]

Tesla's data is hugely skewed. There are plenty of situations where AP is turned off that are less safe in general.

AP has the luxury of being able to be turned off in less than ideal conditions. Human drivers can't do that.

It's the reason why only Tesla touts these numbers. They're inaccurate and misleading.

input_sh 2021-08-16 13:48:00 +0000 UTC [ - ]

...according to Tesla, based on the data nobody else can see?

czzr 2021-08-16 13:49:51 +0000 UTC [ - ]

No, it’s not. I guess we’re doomed to see this at-best-misleading-but-really-just-straight-up-lying analysis every time there’s an article about this.

Makes me laugh, especially with the “geeks are immune to marketing” trope that floats around here equally as regularly.

Waterluvian 2021-08-16 13:06:24 +0000 UTC [ - ]

I have a Subaru Forester base model with lane keeping and adaptive cruise control.

I need to be touching the wheel and applying some force to it or it begins yelling at me and eventually brings me slowly to a stop.

I’ve had it for a year now and I cannot conceive of a way, without physically altering the system (like hanging a weight from the wheel maybe?), that would allow me to stop being an active participant.

I think the opposite is true: Tesla’s move fast and kill people approach is the mistake. Incremental mastering of autonomous capabilities is the way to go.

jeffnappi 2021-08-16 13:16:34 +0000 UTC [ - ]

I own a Model Y and am a pretty heavy Autopilot user. You have to regularly give input on the steering wheel and if you fail a few times it won't let you re-engage until you park and start again.

Personally Autopilot has actually made driving safer for me... I think there's likely abuse of the system though that Tesla could work harder to prevent.

DrBenCarson 2021-08-16 15:57:50 +0000 UTC [ - ]

I personally think the issue boils down to their use of the term "Autopilot" for a product that is not Autopilot (and never will be with the sensor array they're using, IMO).

They are sending multiple signals that this car can drive itself (going so far as charging people money explicitly for the "self-driving" feature) when it cannot in the slightest do much more than stay straight on an empty highway.

They should be forced to change the name of the self-driving features, I personally think "Backseat Driver" would be more appropriate.

jhgb 2021-08-16 19:52:15 +0000 UTC [ - ]

> the issue boils down to their use of the term "Autopilot" for a product that is not Autopilot

It is literally an autopilot. Just like an autopilot on an airplane, it keeps you stable and in a certain flight corridor. There's virtually no difference except for Tesla's Autopilot's need to deal with curved trajectories.

labcomputer 2021-08-16 20:37:56 +0000 UTC [ - ]

> There's virtually no difference except for Tesla's Autopilot's need to deal with curved trajectories.

Well, and it actively avoids collisions with other vehicles (most of the time). Airplane (and boat) autopilots don't do that.

"But you're using the word autopilot wrong!"

jhgb 2021-08-17 13:13:10 +0000 UTC [ - ]

Well, they still need to avoid the collisions more reliably, apparently. Once they do it perfectly reliably I will add it into the list of the things it does differently from an airplane autopilot. ;)

Kaytaro 2021-08-16 19:05:36 +0000 UTC [ - ]

Autopilot is precisely the correct term - An autopilot is a system used to control the path of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators.

tgsovlerkhgsel 2021-08-16 13:08:26 +0000 UTC [ - ]

Tesla had a similar system, and

> physically altering the system (like hanging a weight from the wheel maybe?)

was exactly what people were doing. But it's also possible to be physically present, applying force, but being "zoned out", even without malicious intent.

johnnyApplePRNG 2021-08-16 13:19:14 +0000 UTC [ - ]

>But it's also possible to be physically present, applying force, but being "zoned out", even without malicious intent.

I've occasionally noticed myself zoning out behind the wheel of my non-self-driving car as well.

It's actually very common. [0]

[0] https://www.actuarialpost.co.uk/article/quarter-of-fatal-cra...

FireBeyond 2021-08-17 04:01:57 +0000 UTC [ - ]

It also previously required only 15 minutes between steering wheel contacts.

shakna 2021-08-16 13:12:26 +0000 UTC [ - ]

> I need to be touching the wheel and applying some force to it or it begins yelling at me and eventually brings me slowly to a stop.

> I’ve had it for a year now and I cannot perceive of a way, without physically altering the system (like hanging a weight from the wheel maybe?) that would allow me to stop being an active participant.

That's exactly what people were doing with the Tesla. Hanging a weight to ensure the safety system doesn't kick in. [0][1]

[0] https://edition.cnn.com/2021/04/28/cars/tesla-texas-crash-au...

[1] https://twitter.com/ItsKimJava/status/1388240600491859968/ph...

Waterluvian 2021-08-16 13:18:36 +0000 UTC [ - ]

If people are consciously modifying their car to defeat obvious safety systems, I have a really hard time seeing how the auto manufacturer should be responsible.

I guess the probe will reveal what share of fatal accidents are caused by this.

rcxdude 2021-08-16 13:48:33 +0000 UTC [ - ]

Well, it doesn't help when the CEO of the company publicly states that the system is good enough to drive on its own and those safety systems are only there because of regulatory requirements.

theluketaylor 2021-08-17 00:37:59 +0000 UTC [ - ]

GM's Supercruise (which is the actual king of the hill for L2 systems) uses cameras to track the driver's eye position to ensure they are paying attention. It's significantly harder to defeat, is geofenced to prevent use in incompatible situations like surface streets, and has a much more graceful disengagement process. Most of the time autopilot is smooth, but sometimes it just hands control back to the driver without warning.

Teslas can famously be tricked by wedging an orange between the rim and spoke of the steering wheel to produce enough torque on the wheel to satisfy the detection. There are enough videos of it on YouTube that Tesla could easily be found negligent for not doing enough to prevent drivers from defeating a safety system, given that alternate technology that more directly tracks attention is available and tricking Tesla's detection method became common knowledge.
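
To make the difference concrete, a toy comparison of the two checks (entirely invented logic, not either manufacturer's actual code): a torque threshold only knows that something is twisting the wheel, which is why an orange or a hung weight satisfies it, while a gaze check needs eyes actually on the road.

    # Toy illustration only; neither Tesla's nor GM's real attention logic.
    TORQUE_THRESHOLD_NM = 0.5

    def torque_check(wheel_torque_nm):
        # Satisfied by any steady torque -- including a weight or a wedged orange.
        return wheel_torque_nm >= TORQUE_THRESHOLD_NM

    def gaze_check(eyes_detected, seconds_gaze_on_road, window_seconds=10):
        # Needs a visible face looking at the road for most of the window.
        return eyes_detected and seconds_gaze_on_road >= 0.7 * window_seconds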

SEJeff 2021-08-16 13:40:55 +0000 UTC [ - ]

You're literally describing how the Tesla system works. It requires you to keep your hand on the wheel and apply a slight pressure every so often. The cabin camera watches the driver and if they're looking down or at their phone, it does that much more often.

People causing these problems almost certainly are putting something over the cabin camera and a defeat device on the steering wheel.

ChrisClark 2021-08-16 13:15:58 +0000 UTC [ - ]

That's exactly what my Tesla does. I need a constant torque on the steering wheel or it yells at me and slowly comes to a stop.

Waterluvian 2021-08-16 13:17:29 +0000 UTC [ - ]

Personally I’ve found this to be sufficient in my Forester. Even holding the wheel but not being “there” isn’t enough. The car is really picky about it.

gccs 2021-08-16 13:25:21 +0000 UTC [ - ]

Shove a can of soda in the wheel and it will stop beeping.

mellavora 2021-08-16 13:36:31 +0000 UTC [ - ]

yes, but what if you also have to sing the jingle?

Damn, this 'drink verification can' is going to get us all killed.

slg 2021-08-16 19:23:00 +0000 UTC [ - ]

One problem that is often ignored in these debates is that people already don't always pay attention while driving. Spend some time looking at other drivers next time you are a passenger in slow traffic. The number of drivers on their phones, eating, doing makeup, shaving, or even reading a book is scary.

It therefore isn't a clean swap of a human paying attention to a human who isn't. It becomes a complicated equation that we can't just dismiss with "people won't pay attention". It is possible that a 90%/10% split of drivers paying attention to not paying attention is more dangerous when they are all driving manually than a 70%/30% split if those drivers are all using self-driving tech to cover for them. Wouldn't you feel safer if the driver behind you who is answering texts was using this incremental self-driving tech rather than driving manually?

No one has enough data on the performance of these systems or how the population of drivers use them to say definitively that they are either safer or more dangerous on the whole. But it is definitely something that needs to be investigated and researched.

helsinkiandrew 2021-08-16 13:28:19 +0000 UTC [ - ]

> you need to always pay attention

That is the fatal flaw in anything but a perfect system - any kind of system that takes the steering decisions away from the driver is going to result in the driver at best thinking about other things and at worst climbing into the back seat. If you had to develop a system to make sure someone was paying attention, you wouldn't make them sit in a warm comfy seat looking at a screen - you would make them actively engage with what they were looking at - like steering.

And ultimately it doesn't matter how many hundreds of thousands of hours of driving you teach your system with, it may eventually be able to learn about parked cars, kerbs and road signs, but there won't be enough examples of different accidents and how emergency vehicles behave to ever make it behave safely. Humans can cope with driving emergencies fairly well (not perfectly admittedly) no matter how many they've been involved in using logic and higher level reasoning.

Baeocystin 2021-08-16 19:35:52 +0000 UTC [ - ]

I remember reading Donald Norman's books decades ago, and one of the prime examples of the dangers of automation in cars was adaptive cruise control- which would then suddenly accelerate forward in a now-clear off-ramp, surprising the heck out of the previously-complacent driver, and leading to accidents.

We've known for a very long time that this sort of automation/manual control handoff failure is a very big deal, and yet there seems to be an almost willful blindness from the manufacturers to address it in a meaningful way.

fzzzy 2021-08-16 13:27:24 +0000 UTC [ - ]

Do you hate regular cruise control? How is that not partial self driving?

ghaff 2021-08-16 13:48:28 +0000 UTC [ - ]

To tell you the truth, I generally do and haven't used it for ages. Where I drive, the roads have some amount of traffic. I find (traditional) cruise control encourages driving at a constant speed to a degree that I wouldn't as a driver with a foot on the gas. So I don't "hate" regular cruise control but I basically never use it.

hcurtiss 2021-08-16 14:35:42 +0000 UTC [ - ]

I think you are in a distinct minority.

ghaff 2021-08-16 14:44:52 +0000 UTC [ - ]

Maybe a fairly small sample size but I don't know the last time I've been in a car where the driver has turned on cruise control. But it probably varies by area of the country. In the Northeast, there's just enough traffic in general that it's not worth it for me.

minhazm 2021-08-16 19:32:27 +0000 UTC [ - ]

In traffic is where traffic-aware cruise control is most useful. A lot of people I knew who bought Teslas in the Bay Area specifically bought them so their commutes would be less stressful in the bumper-to-bumper traffic. I drove 3000+ miles across the country last year with > 90% of it on AP, and I was way less tired with AP on vs off; it allowed me to just stay focused on the road and look out for any issues.

int_19h 2021-08-16 21:10:30 +0000 UTC [ - ]

One thing worth noting about Subaru's approach to this that is specifically relevant to bumper-to-bumper traffic, is that it will stop by itself, but it won't start moving by itself - the driver needs to tap the accelerator for that. It will warn you when the car in front starts moving, though.

ghaff 2021-08-16 20:47:45 +0000 UTC [ - ]

Yes. I was (explicitly) talking about traditional "dumb" cruise control. I haven't used adaptive cruise control but I agree it sounds more useful than traditional cruise control once you get above minimal traffic.

cmpb 2021-08-16 13:04:39 +0000 UTC [ - ]

I disagree. One feature my car has is to pull me back into the lane when I veer out of it (Subaru's lane keep assist). That is still incremental improvement towards "full self driving". I agree, however, that Tesla's Autopilot is not functional enough, and any tool designed to allow humans to remove their hands from the wheel should not require their immediate attention in any way.

tapoxi 2021-08-16 13:09:02 +0000 UTC [ - ]

I think people just assume Tesla's Autopilot is more capable than it really is.

My car has adaptive cruise control and lane keep assist, but I'm not relying on either for anything more complex than sipping a drink while on the highway.

rmckayfleming 2021-08-16 13:22:13 +0000 UTC [ - ]

Yep, if anything they’re just a way to make long drives or stop and go highway traffic more tolerable. When I got my first car with those features it seemed like a gimmick, but they really help to reduce fatigue.

Robotbeat 2021-08-16 13:25:04 +0000 UTC [ - ]

Tesla’s autopilot does not allow you to remove your hands from the wheel. You must keep them on, and apply torque occasionally, to keep it engaged.

Karunamon 2021-08-16 18:59:10 +0000 UTC [ - ]

In reality, it'll let you get away with going handsfree for upwards of 30 seconds. That's more than long enough to lose your attention.

darkerside 2021-08-17 12:03:38 +0000 UTC [ - ]

I think that's actually a step towards a local maximum that makes us less likely to achieve actual FSD. The safer we can make AI-guided driving where a person is still in control, the higher the bar becomes for a solo AI to be significantly safer than the alternatives.

MR4D 2021-08-16 19:48:14 +0000 UTC [ - ]

A car that beeps when I drift out of lane, or beeps when I go too fast before a curve, or beeps like hell if I cross over the center median would be hugely useful, because a record of every warning would be there, whether correct or not.

Conversely, if it didn't warn me right before an accident, then the absence of that warning would be useful too.

All of that information should be put back into the model based on crash reporting. Everything else can be ignored.

I would argue that the information should be available to all automakers (perhaps using the NHTSA as a conduit), so that each of them has the same safety information but can still develop their own models. The FAA actually does this already with the FAA Accident and Incident Data Systems [0] and it has worked pretty darn well.

[0] - https://www.asias.faa.gov/apex/f?p=100:2:::NO:::
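
As a sketch of the kind of record that could be pooled -- field names here are purely illustrative, not any agency's actual schema:

    from dataclasses import dataclass

    @dataclass
    class DriverAssistEvent:
        """One warning (or its absence) tied to a reported crash."""
        vehicle_make: str
        system_name: str       # e.g. lane departure warning, forward collision warning
        alert_fired: bool      # did the system warn before the event?
        crash_occurred: bool   # is this linked to an actual crash report?
        timestamp: str

    # Each manufacturer reports these to a central body (the NHTSA, say), which
    # shares the pooled data back so everyone can improve their own models against it.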

oceanghost 2021-08-16 20:29:35 +0000 UTC [ - ]

The new Toyota RAV4s have this feature-- if you go out of bounds in your lane they beep and the steering wheel gives a bit of resistance.

It also reads the speed limit signs and places a reminder in the display. I think it can brake if it detects something in front of it, but I'm not certain.

MR4D 2021-08-16 21:59:20 +0000 UTC [ - ]

Many other cars do as well.

My main point (perhaps buried more than it should have been) is that centralizing accident data, along with whether an alert went off (or not), and sharing that with all automobile manufacturers can help this process proceed better.

Right now the data is highly fragmented and there is not really a common objective metric by which to make decisions to improve models.

oceanghost 2021-08-17 02:26:23 +0000 UTC [ - ]

I'm sorry, I entirely missed your point.

I agree that would be reasonable and desirable :-)

dalbasal 2021-08-16 14:02:32 +0000 UTC [ - ]

I think the fundamental flaw is indisputable. Everyone is aware of in-between stage problems. I don't think it's an insurmountable flaw.

These things are on the road already. They have issues, but so do human only cars. Tweaks probably get made, like some special handling of emergency vehicle scenarios. But, it's not enough to stop it.

Meanwhile, it's not a permanent state. Self-driving technology is advancing, becoming more common on roads. Procedures, as well as infrastructure, are growing around the existence of self-driven cars. Human supervisor or not, the way these things use the road affects the design of roads. If your emergency speed sign isn't being heeded by self-driven cars, your emergency speed sign has a bug.

tgsovlerkhgsel 2021-08-16 13:07:15 +0000 UTC [ - ]

It depends. As long as the resulting package (flawed self driving system + the average driver) isn't significantly more dangerous than the average unassisted human driver, I don't consider it irresponsible to deploy it.

"The average driver" includes everyone, ranging from drivers using it as intended with close supervision, drivers who become inattentive because nothing is happening, and drivers who think it's a reasonable idea to climb into the back seat with a water bottle duct taped to the steering wheel to bypass the sensor.

OTOH, the average driver for the unassisted scenario also includes the driver who thinks they're able to drive a car while texting.

TacticalCoder 2021-08-16 13:15:00 +0000 UTC [ - ]

> As long as the resulting package (flawed self driving system + the average driver) isn't significantly more dangerous than the average unassisted human driver...

Shouldn't that be compared to "average driver + myriad of modern little safety features" instead of "average unassisted driver"? Someone who has the means to drive a Tesla with the "full self driving" mode certainly has the means to buy, say, a Toyota full of assistance/safety features (lane change assist, unwanted lane change warning and whatnot).

politician 2021-08-16 13:37:49 +0000 UTC [ - ]

Why isn’t defeating the self-driving attention controls a crime like reckless driving? Isn’t that the obvious solution?

tgsovlerkhgsel 2021-08-16 14:42:58 +0000 UTC [ - ]

It almost certainly is, at least when combined with the intentional inattention that follows.

Making it a crime isn't an "obvious solution" to actually make it not happen. Drunk driving is a crime and yet people keep doing it. Same with texting and driving.

politician 2021-08-16 15:31:21 +0000 UTC [ - ]

The problem is determining who is liable for damages, not prevention. Shifting the liability for willfully disabling a safety control puts them on notice.

Prevention as a goal is how we end up with dystopia.

rcxdude 2021-08-16 13:50:11 +0000 UTC [ - ]

Gonna be pretty difficult to enforce. Many US states don't even enforce a minimum roadworthiness of cars on the roads.

politician 2021-08-16 14:04:45 +0000 UTC [ - ]

Does that even matter? If the state doesn’t care to enforce its laws against reckless driving, why should the manufacturer be encumbered with that responsibility?

CaptArmchair 2021-08-16 13:28:53 +0000 UTC [ - ]

> drivers using it as intended with close supervision

Doesn't this hide a paradox? Using a self-driving car as intended implies that the driver relinquishes a part of the human decision making process to the car. While close supervision implies that the driver can always take control back from the car, and therefore carries full personal responsibility of what happens.

The caveat here is that the car might make decisions in a rapidly changing, complex context which the driver might disagree with, but has no time to correct for through manual intervention. e.g. hitting a cyclist because the autonomous system made an erroneous assertion.

Here's another way of looking at this: if you're in a self-driving car, are you a passenger or a driver? Do you intend to drive the car yourself or let the car transport you to your destination?

In the unassisted scenario, it's clear that both intentions are one and the same. If you want to get to your location, you can't but drive the car yourself. Therefore you can't but assume full personal responsibility for your driving. Can the same be said about a vehicle that's specifically designed and marketed as "self-driving" and "autonomous"?

As a driver, you don't just relinquish part of the decision making process to the car, what essentially happens is that you put your trust in how the machine learning processes that steer the car were taught to perceive the world by their manufacturer. So, if both car and occupant disagree and the ensuing result is an accident, who's at fault? The car? The occupant? The manufacturer? Or the person seeking damages because their dog ended up wounded?

The issue here isn't that self-driving cars are inherently more dangerous than their "dumb" counterparts. It's that driving a self-driving car creates its own separate class of liabilities and questions regarding responsible driving when accidents do happen.

jdavis703 2021-08-16 13:20:31 +0000 UTC [ - ]

The average driver breaks multiple laws on every trip. Most of the time no one gets hurt. But calibrating performance against folks violating traffic and criminal laws sets the bar too low for an automated system. We should be aiming for standards that either match European safety levels or the safety of modes of air travel or rail travel.

tgsovlerkhgsel 2021-08-16 13:25:24 +0000 UTC [ - ]

I disagree. Perfect is the enemy of good, and rejecting a better system because it isn't perfect seems like an absurd choice.

I'm not saying improvements should stop there, but once the system has reached parity, it's OK to deploy it and let it improve from there.

bcrl 2021-08-16 14:14:38 +0000 UTC [ - ]

Except that doesn't work if you're trying to produce a safe product. Investigations into crashes in the airline industry have proven that removing pilots from active participation in the control loop of the airplane results in distraction and an increased response time when an abnormal situation occurs. Learning how to deal with this is part of pilots' training, plus they have a co-pilot to keep an eye on things and back them up.

An imperfect self driving vehicle is the worst of all worlds: they lull the driver into the perception that the vehicle is safe while not being able to handle abnormal situations. The fact that there are multiple crashes on the record where Telsas have driven into stationary trucks and obstacles on roads is pretty damning proof that drivers can't always react in the time required when an imperfect self driving system is in use. They're not intrinsically safe.

At the very least, drivers should be required to undergo additional training to operate these systems. Like pilots, drivers need to be taught how to recognize when things go awry and react to possible failures. Anything less is not rooted in safety culture, and it's good to see there are at least a few people starting to shine a light on how these systems are being implemented from a safety perspective.

notahacker 2021-08-16 19:15:11 +0000 UTC [ - ]

> Perfect is the enemy of good, and rejecting a better system because it isn't perfect seems like an absurd choice.

Nothing absurd about thinking a system which has parity with the average human driver is too risky to buy unless you consider yourself to be below average at driving. (As it is, most people consider themselves to be better than average drivers, and some of them are even right!) The accident statistics that comprise the "average human accident rate" are also disproportionately caused by humans you'd try to discourage from driving in those circumstances...

Another very obvious problem is that an automated system which kills at the same rate per mile as an average human driver will tend to be driven a lot more, because it requires no effort (and it will probably replace better-than-average commercial drivers long before teenagers and occasional-but-disproportionately-deadly drivers can afford it).

Robotbeat 2021-08-16 13:23:54 +0000 UTC [ - ]

Yes, I agree. We should hold automated systems to a higher standard. Unless you’re proposing we ban automated systems until they’re effectively perfect because that would perversely result in a worse outcome: being stuck with unassisted driving forever.

dheera 2021-08-16 19:08:33 +0000 UTC [ - ]

Is it? Tesla is still alive because they're selling cars.

It's just that the companies that are NOT doing incremental approaches are largely at the mercy of some investors who don't know a thing about self-driving, and they may die at any time.

I agree with you that it is technically flawed, but it may still be viable in the end. At least their existence is not dependent on the mercy of some fools who don't get it, they just sell cars to stay alive.

That's one of the major problems of today's version of capitalism -- it encourages technically flawed ways to achieve scientific advancement.

yawaworht1978 2021-08-16 22:14:22 +0000 UTC [ - ]

Would be interesting to know how many buy based on the FSD hype (including the ones who don't pay for the package) and how many buy because of the "green" factor. However many buy because of the FSD promise, all that revenue is coming from vaporware (beta-ware at best) and is possible due to lack of regulatory enforcement. History shows that the longer self-regulating entities take the piss, the harder the regulatory hammer eventually comes down.

api 2021-08-16 14:24:56 +0000 UTC [ - ]

Full self driving is one of those things where getting 80% of the way there will take 20% of the effort and getting the remaining 20% of the way there will take 80% of the effort.

Tesla auto-drive seems like it's about 80% of the way there.

backtoyoujim 2021-08-16 13:36:36 +0000 UTC [ - ]

I mean, there are videos of a vehicle's occupants sitting in the rear seats making food and drinks while the vehicle is tricked into operating off its sensors alone.

It is not solely about trust and dependence; it also includes the group of idiots with access to wealth and no regard for human life.

bishoprook2 2021-08-16 13:36:46 +0000 UTC [ - ]

I expect that to design self-driving you need to push the limits (with some accidents) a bit with a bunch of telemetry. Going from not-much to full-self-driving requires a lot of design increments.

mnmmn123456 2021-08-16 13:03:07 +0000 UTC [ - ]

Today, there is at least one of the most advanced Neural Networks entering each car: A human being. If we could just implement the AI to add to this person and not replace it...

ben_w 2021-08-16 13:42:57 +0000 UTC [ - ]

What would such an AI even look like? If it spots every real danger but also hallucinates even a few dangers that aren’t really there, it gets ignored or switched off for needlessly slowing the traveler down (false positives, apparently an issue with early Google examples [0]); if it only spots real dangers but misses most of them, it is not helping (false negatives, even worse if a human is blindly assuming the machine knows best and what happened with e.g. Uber [1]); if it’s about the same as humans overall but makes different types of mistake, people rely on it right up until it crashes then go apoplectic because it didn’t see something any human would consider obvious (e.g. Tesla, which gets slightly safer when the AI is active, but people keep showing the AI getting confused about things that they consider obvious [2]).

[0] https://theoatmeal.com/blog/google_self_driving_car

[1] https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg

[2] https://youtube.com/watch?v=Wz6Ins1D9ak

salawat 2021-08-16 13:26:55 +0000 UTC [ - ]

This is the bit nobody likes to realize. FSD at its best... is still about as fallible as a human driver. Minus free will (it is hoped).

I will be amused, intrigued, and possibly a bit horrified if, by the time FSD hits level 5, and assuming they stick with the Neural Net of Neural Nets architecture, there isn't a rash of system-induced variance in behavior as emergent phenomena take shape.

Imagined news: All Teslas on I-95 engaged in creating patterns whereby all non-Tesla traffic was bordered by a Tesla on each side. Almost like a game of Go, says expert. Researchers stumped.

Then again, that'd imply you had an NN capable of retraining itself on the fly to some limited degree, which I assume no one sane would put into service... Hopefully this comment doesn't suffer the fate of not aging well.

antattack 2021-08-16 14:00:12 +0000 UTC [ - ]

All Level 2 systems need to better integrate with the driver. Upon engagement, driver and driver assist are a team where communication and predictability are crucial.

mrfusion 2021-08-16 13:05:18 +0000 UTC [ - ]

I’d be curious if there are studies out there on how to do automated assists in machines that require vigilance that don’t have this problem.

jdavis703 2021-08-16 13:23:25 +0000 UTC [ - ]

Both airplanes and trains have automated "assist." At least in the case of WMATA, they gave up on automatic train control after a fatal crash.

merrywhether 2021-08-16 15:30:01 +0000 UTC [ - ]

This is a misrepresentation of the dumpster fire that was the WMATA train situation. Yes, the fatal crash was the last straw, but the root problem was not the automation system but rather the complete lack of maintenance that led to its inability to work properly. Congress refusing to fund maintenance and then falling 10-15 years behind on it led to all kinds of systems failing. The fatal fire in the blue line tunnel under the river occurred with a human at the controls, but we're similarly not blaming that incident on the perils of human operation.

jdavis703 2021-08-16 16:10:54 +0000 UTC [ - ]

I don’t blame the operator for the crash. The other train was behind a blind curve and she hit the emergency brake within a reasonable amount of time given what she could see. However the speeds of the system were set too high for the operator to safely stop because they assumed the ATC would work perfectly.

userbinator 2021-08-16 13:42:56 +0000 UTC [ - ]

In a plane you also have far more time to react once the autopilot disconnects for whatever reason, than the fraction of a second that a car gives you.

jdavis703 2021-08-16 17:00:11 +0000 UTC [ - ]

Then the automation needs to be more conservative in its ability and request intervention sooner.

yawaworht1978 2021-08-16 22:10:00 +0000 UTC [ - ]

The difference is, they have traffic controllers, trains have their own dedicated rails with almost no obstructions, and train-into-train crash situations rarely arise. Planes have a lot of maneuvering space on all sides.

Car traffic and streets are denser and often have humans crossing them without regard to the law, plus bicycles, motorbikes, road construction and bad weather.

Not saying one auto pilot system is better than the other, however, they operate in different environments.

swiley 2021-08-16 13:34:02 +0000 UTC [ - ]

We have an education problem. People have no idea what computers do because they're illiterate (literacy would mean knowing at least one language well enough to read and write in it) so they just take other people's word that they can do some magical thing with software updates. The most extreme examples of this were the iPhone hoaxes telling people that software updates provided waterproofing or microwave charging.

supperburg 2021-08-16 19:15:06 +0000 UTC [ - ]

Lex Fridman said they studied this and found that people don’t become “lulled” even after using the system for a long period of time.

nathias 2021-08-16 13:01:05 +0000 UTC [ - ]

that's not human nature, that's user stupidity

weird-eye-issue 2021-08-16 13:02:22 +0000 UTC [ - ]

What's the... Difference?

nathias 2021-08-16 14:03:22 +0000 UTC [ - ]

One is a permanent property of our nature the other a choice.

weird-eye-issue 2021-08-17 12:22:23 +0000 UTC [ - ]

No, one is a fact and the other is a consequence

formerly_proven 2021-08-16 13:10:10 +0000 UTC [ - ]

The design is fine, it's all the users who are idiots.

P.S. /s. Obviously, Mr. Poe.

rvz 2021-08-16 13:36:35 +0000 UTC [ - ]

Well they did not 'pay' attention. They 'paid' for the "Fools Self Driving" package.

This is why 'attention' and 'driver monitoring' were not included.

salawat 2021-08-16 13:41:29 +0000 UTC [ - ]

T. Every maligned designer when someone points out a flaw

It's okay. I do it too. Really need to work on seeing yourself making that argument as a starting point and not an endpoint.

judge2020 2021-08-16 13:00:51 +0000 UTC [ - ]

Your reasoning doesn’t apply to the incremental improvements to self-driving, rather Tesla’s decision to allow all cars to use TACC/auto-steer. They haven’t even given people “the button” to enroll in FSD beta, likely because they know it would be extremely bad PR when a bunch of people use it without paying attention.

jedberg 2021-08-16 19:47:45 +0000 UTC [ - ]

A lot of people in here saying it is not possible to drive safely with partial self driving. I wonder, how many of those people have actually driven a car with autopilot?

I have autopilot on my car, and it definitely makes me a better and safer driver. It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things. Instead I can spend all my brainpower focused on looking for potential emergencies, instead of splitting time between lane keeping/following and looking for emergencies.

I no longer have to look at my speedometer or the lane markers, I can take a much broader view of the traffic and conditions around me.

Before you say it's impossible to be safe driving with an assistive product, I suggest trying one out.

tomdell 2021-08-16 19:52:28 +0000 UTC [ - ]

I would argue that partial self driving is an irresponsible product not because it's impossible to drive safely with it, but because so many people will use it as an excuse to pay little to no attention to the road. If you personally are a responsible driver and even a better driver with it, that's great - but most people probably aren't going to use it the same way, especially those without much of an understanding of the technology - and especially given the way that Tesla markets it.

Drunk_Engineer 2021-08-16 19:58:59 +0000 UTC [ - ]

The technical term for this is Risk Compensation:

"Risk compensation is a theory which suggests that people typically adjust their behavior in response to perceived levels of risk, becoming more careful where they sense greater risk and less careful if they feel more protected."

xahrepap 2021-08-16 20:34:12 +0000 UTC [ - ]

Reminds me of this kind of thing:

https://usa.streetsblog.org/2017/09/13/wide-residential-stre...

I was first introduced to "wide streets in neighborhoods are more dangerous than narrow" on HN years ago. (I don't think it was the linked article, but that was the first one that came up just now after a search :P )

Since having read that, I've actually noticed how true this is, at least to me anecdotally. When I'm driving in a neighborhood with crowded streets, I can't bring myself to go over 15MPH, much less over the speed limit (typically 25 in neighborhoods in the US).

Wide streets give a sense of security. So I feel like people are less likely to pay attention going around bends, parked cars, etc, than if they didn't have that sense of security.

MichaelZuo 2021-08-16 22:59:02 +0000 UTC [ - ]

Also moral hazard, kinda.

telside 2021-08-16 23:44:26 +0000 UTC [ - ]

I trust someone posting here to drive safely, and more safely with it.

Forget the average, how about the bottom 10-20% of all drivers? I don't trust the bottom 10% driving with "Autopilot" at all, zero. They are going to use it to go on autopilot while driving, exactly as the marketing implies. I mean there has to even be people who think the car itself is conscious. Car has advanced AI, must be conscious.

To think otherwise is just highly underestimating how clueless some people are.

vnchr 2021-08-17 00:52:33 +0000 UTC [ - ]

I don’t trust those people without Autopilot either. Is that the point?

kemiller 2021-08-16 20:00:25 +0000 UTC [ - ]

Yes there have been stories about irresponsible people. Do you have any evidence that this is the common case? The aggregate evidence seems to suggest reduced accidents and reduced fatalities.

gusgus01 2021-08-16 21:12:51 +0000 UTC [ - ]

There was a study showing that adaptive cruise control and lane assist lead to more people speeding: https://www.iihs.org/news/detail/adaptive-cruise-control-spu...

They then use a "common formula" to show that this leads to more fatal accidents, but they didn't actually study real crash data.
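
The article doesn't say which formula was used; the usual candidate for this kind of extrapolation is Nilsson's power model, in which fatal-crash risk scales roughly with the fourth power of mean speed. A sketch, assuming that is what was meant:

    # Nilsson-style power model, shown only as an illustration; the IIHS piece
    # does not state the exact formula it applied.
    def fatal_crash_multiplier(old_speed, new_speed, exponent=4):
        return (new_speed / old_speed) ** exponent

    print(fatal_crash_multiplier(70, 71))  # ~1.06, i.e. roughly 6% more fatal crashes from +1 mph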

birken 2021-08-16 20:12:06 +0000 UTC [ - ]

Absolutely not true as a blanket statement. Maybe if the driver monitoring is so lax that you could conceivably trick the car into poorly driving itself, but the system I use, Comma [1], has incredibly strict driver monitoring.

There is absolutely no doubt I'm a safer driver with Comma than without it. I'm still in control, but Comma not only allows me to expend less effort driving (which allows me to stay alert over longer periods of time), but also be much less emotional when driving. I'm pretty convinced that a large percentage of accidents are caused by frustrated or bored drivers doing crazy things that you just don't feel the urge to do with the assistance of self-driving.

1: https://comma.ai/

jedberg 2021-08-16 20:49:18 +0000 UTC [ - ]

I use the same system as you do, and I've noticed that if you mention that system's name, you tend to get downvotes. I haven't yet figured out why, not sure if there is a bot or just a lot of Tesla fans who downvote the mention of our system.

Edit: After one minute I got a downvote.

SECProto 2021-08-16 21:52:37 +0000 UTC [ - ]

It sounds like you're advertising it. "The future can be yours, today. For the introductory monthly price of 79.99. Sign up here[1]"

jedberg 2021-08-16 21:56:16 +0000 UTC [ - ]

This doesn't even make sense. Simply mentioning the name of a product I use is not advertising. Otherwise, is every person here who mentions Tesla advertising too?

SECProto 2021-08-16 23:33:43 +0000 UTC [ - ]

> Simply mentioning the name of a product I use is not advertising.

Sorry, I actually meant to refer to Birken's comment above. Advertising might not have been the best word - astroturfing? If you read their comment but replace "comma" with "tesla" it still reads as spammy.

Yours was fine (though discussing downvotes will always get you downvotes, my comment included).

> Otherwise, is every person here who mentions Tesla advertising too?

Only the ones who needlessly sing the praises of the Tesla autopilot in barely-related threads.

jskrn 2021-08-16 19:57:36 +0000 UTC [ - ]

Well said, that last bit especially. The regulations on medical devices are on how the manufacturer markets it. Should be the same for driving technology.

joshuanapoli 2021-08-16 21:44:03 +0000 UTC [ - ]

> so many people will use it as an excuse to pay little to no attention to the road

I guess that we have to look at the results here to judge whether too many people are not paying attention. Hopefully the investigation will reveal whether Autopilot collisions with emergency vehicles are significantly more frequent or less frequent than collisions from vehicles being driven the traditional way.
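
One way that comparison could be framed, if crash counts and mileage (exposure) were available for each group -- the numbers below are placeholders, not real data:

    import math

    # Placeholder inputs; real values would come out of the NHTSA investigation.
    ap_crashes, ap_miles         = 12, 3_000_000_000
    manual_crashes, manual_miles = 400, 100_000_000_000

    ap_rate     = ap_crashes / ap_miles
    manual_rate = manual_crashes / manual_miles
    rate_ratio  = ap_rate / manual_rate

    # Rough 95% interval for the rate ratio (log-normal approximation).
    se = math.sqrt(1 / ap_crashes + 1 / manual_crashes)
    low, high = rate_ratio * math.exp(-1.96 * se), rate_ratio * math.exp(1.96 * se)
    print(rate_ratio, (low, high))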

wilg 2021-08-16 19:59:00 +0000 UTC [ - ]

This is a question that is answerable with the right data – we can just see if it's safer or not.

malwarebytess 2021-08-16 20:06:37 +0000 UTC [ - ]

Doesn't the data show that cars with assistive technologies are in fewer non-fatal and fatal accidents?

new_realist 2021-08-16 20:29:33 +0000 UTC [ - ]

Tesla marketed Autopilot != responsibly implemented assistive safety systems.

tomdell 2021-08-16 20:40:38 +0000 UTC [ - ]

It looks like the federal government is beginning to collect and analyze relevant data, which will be interesting.

https://www.latimes.com/business/story/2021-06-29/nhtsa-adas...

Tesla released data in the past, but that’s quite suspect as they have an obvious agenda and aren’t known for open and honest communication.

https://www.latimes.com/business/autos/la-fi-hy-tesla-safety...

woah 2021-08-16 21:01:53 +0000 UTC [ - ]

I once talked to a guy who bragged about having Autopilot drive him home when he's drunk

samstave 2021-08-16 20:31:48 +0000 UTC [ - ]

One would think that with "autopilot" there would be a limit to speed and an increased "caution distance" the vehicle maintains with everything.

I also think there should be dedicated lanes for self driving cars..

A very good friend of mine was a sensor engineer at Google working on virtual sensors that interacted with hand gestures in the air... and is now a pre-eminent sensor engineer for a large Japanese company everyone has heard of...

We drove from the Bay Area to northern California in his Tesla, and it was terrifying how much trust he put in that car. I got carsick and ended up throwing up out the window...

Knowing what I know of machines, having worked in tech since 1995 -- I wouldn't trust self-driving for shit just yet.

rubicon33 2021-08-16 19:52:45 +0000 UTC [ - ]

Something tells me the majority of people with partial self driving aren't using it as a means of staying more focussed on the road. There's a pesky little device buzzing around in everyone's pocket that is more likely the recipient of this newfound attention.

theluketaylor 2021-08-17 00:20:32 +0000 UTC [ - ]

I do like using Tesla's Traffic Aware Cruise Control, but autopilot takes over enough of the driving I know I can't pay meaningful attention for more than 10 or 15 minutes. I just did a 1600 km round trip with traffic aware cruise control on the vast majority of the time, but I didn't turn on autopilot once.

Autopilot bothers me for a number of reasons. Fundamentally it's a poor driver, spending time in people's blind spots unnecessarily, braking and accelerating in rather abrupt ways, and just generally acting like a teenage driver who just got their license. It simply doesn't practice defensive driving.

I also spend a lot of time driving on undivided rural highways. These are highly dangerous roads, with closing speeds in excess of 200 km/hr at times. In those situations autopilot drives far too much according to the strict rules of the road and can't adjust to the situation. It doesn't use the lane space to leave additional room and it doesn't give a wide berth to cyclists.

It also bothers me deeply that one of the ways to override autopilot is to make steering input, and that's also the indicator Tesla uses to determine the user is participating. I haven't used autopilot much for the reasons above, but the few times I did activate it, the steering input required to tell the car I'm there was also enough to move the car several feet in the lane due to very direct steering. It feels like I'm just fighting the car. That is deeply unnerving, since the force required to override autopilot feels like enough to jump nearly half a lane and cause a collision.

jedberg 2021-08-17 00:28:36 +0000 UTC [ - ]

I don't drive a Tesla, I use a different autopilot system. The one I use isn't as aggressive as Tesla, so it doesn't exhibit these behaviors but also requires more manual takeovers. Also it uses facial recognition so you don't have to touch the wheel (but you can touch the wheel and steering manually doesn't disengage it, only the brake and gas do).

theluketaylor 2021-08-17 00:48:38 +0000 UTC [ - ]

Since autopilot is a specifically Tesla term for their L2 driver assistance features, you may be better served in this thread referring specifically to the system you have. Most other L2 systems require so much more driver input and attention that many of the incredulous replies are likely assuming you're talking about Tesla specifically.

I'm guessing GM Supercruise since it's the only one I'm aware of that uses eye tracking in production (though Tesla claims to have enabled that in the US just recently. Personally I'm not sure their camera placement can really do proper eye tracking). Supercruise's disengagement rate is low though, generally much lower than Tesla's.

I do like supercruise from what I've seen of it (haven't had a chance to actually use it since GM seems determined to waste their advantage by not rushing it into every car they make).

jedberg 2021-08-17 01:16:17 +0000 UTC [ - ]

I don't mention the one I use because every time I do I get immediate downvotes. But I use the system from Comma.ai which is based on Openpilot, which refers to itself as open source autopilot.

bob1029 2021-08-16 21:47:34 +0000 UTC [ - ]

> my brain no longer has to worry about those mundane things

I would be terrified to share the road with someone of this mindset. Your vehicle is a lethal weapon when you are driving it around (assisted or otherwise). At no point can someone claim that a tesla vehicle circa today is able to completely assume the duty of driving. You are still 100% responsible for everything that happens in and around that car. You had better have a plan for what happens if autopilot decides to shit the bed while a semi jackknifes in front of you.

The exceptions are what will kill you - and others - every single time. It's not the boring daily drives where 50 car pileups and random battery explosions occur. Maybe your risk tolerance is higher. Mine is not. Please consider this next time you are on the road.

nexuist 2021-08-16 22:03:44 +0000 UTC [ - ]

This...is entirely the point OP is making. You get more brain power to watch out for the semi jackknifing into you, the car switching lanes without signaling, the truck about to lose a bucket or chair from its bed. This is stuff you may not catch when you're spending your brain power focusing on staying between the lanes and keeping your distance between the car in front.

When you automate away the mundane, exceptions are much easier to catch.

crote 2021-08-16 23:30:15 +0000 UTC [ - ]

The problem is that it isn't automated away.

Self-driving cars will take that stuff over 99% of the time, but the 1% where it screws up is the dangerous part. There are plenty of examples where a self-driving car seemingly randomly seems to go completely haywire, without any obvious reason.

Instead of spending brain power on driving properly, you now have to spend brain power on looking at what you should be doing _and_ checking if the car actually agrees and is doing it.

Staying 100% focused without actually _doing_ anything is incredibly difficult. Many countries intentionally add curves to their highways to keep drivers alert: having a pencil-straight road for hundreds of miles really messes with the human brain.

stephencanon 2021-08-17 01:17:15 +0000 UTC [ - ]

> You get more brain power to watch out for the semi

That doesn’t help at all if you’re reading a book or playing on your phone, both of which are things I observe Tesla drivers doing pretty often when I’m in the Bay Area.

jedberg 2021-08-16 21:54:11 +0000 UTC [ - ]

Do you get concerned about mathematical errors because the computer is doing the calculation instead of someone doing it by hand?

It's the same thing here. The computer is assisting me so that I can take care of the novel situations, the exceptions if you will. I can pay closer attention to the road and see that jackknifed trailer sooner because I wasn't looking at my speedometer to check my speed.

And I don't have a Tesla, I use a different autopilot system.

throwaway0a5e 2021-08-16 23:19:22 +0000 UTC [ - ]

You're misunderstanding him at best and projecting at worst.

He's saying that he no longer has to worry about those things the same way cruise control lets you not worry about the speedometer needle and dedicate more of your attention budget outside the car.

Of course you can be an idiot and spend it on your cell phone but that's not really a failure mode specific to any given vehicle technology.

CommieBobDole 2021-08-16 20:25:21 +0000 UTC [ - ]

I drove a friend's Model 3, and within five minutes of driving on autopilot it got confused at an intersection and tried to make a pretty sudden 'lane change' to the wrong side of a divided road.

Obviously that's a single anecdote, and I don't know if it would have gone through with it because I immediately corrected, but that was my experience.

ec109685 2021-08-16 21:38:02 +0000 UTC [ - ]

I bet it would have made that mistake.

The question is whether a system that absolutely requires that you pay attention going through intersections (which you should obviously do) is safer in aggregate than not having those features enabled at all in those situations.

E.g. are weird lane changes that people don't catch happening more frequently than people zooming through red lights because they weren't paying attention? Only the data can show that, and Tesla should share it.

MisterTea 2021-08-16 20:30:18 +0000 UTC [ - ]

> "* It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things.*"

I've been driving for 25 years (cars, trucks, trailers, standard and auto transmissions), and I have never once thought to myself "I'd be such a better driver if I didn't have to pay attention to my speed, lane keeping or following distance." Why? Because those mundane things are already on autopilot in my brain.

Posts like yours are so absurd to me that I can't help but think shill.

jedberg 2021-08-16 20:52:58 +0000 UTC [ - ]

I've been driving for 29 years, and I never thought those things either until I got autopilot (and I don't have a Tesla BTW, I have a different autopilot system). While those things were autopilot in my brain, they still took brain power. It's so much more relaxing not worrying about those things.

It's like people who do math by hand and then get a calculator.

spywaregorilla 2021-08-16 20:54:17 +0000 UTC [ - ]

Maybe you're a great driver then. Have you ever shared the road with someone who was a terrible driver?

studentrob 2021-08-16 21:53:42 +0000 UTC [ - ]

Yes. Those are people who think they can go hands free, use their phone or watch a movie while on autopilot.

rrix2 2021-08-17 00:36:13 +0000 UTC [ - ]

they're also likely the same type of folks who fall for marketing like "full self driving" without investigating critically or even reading through the analog and digital shrinkwrap they have to tear through to get to their date or appointment or whatever on time.

int_19h 2021-08-16 21:05:02 +0000 UTC [ - ]

I have a car that does those things as well, and I use it a lot... but it's not a Tesla, and its manufacturer doesn't refer to it as "autopilot" or "self driving", but rather "advanced cruise control".

throwaway09223 2021-08-16 19:52:17 +0000 UTC [ - ]

Agreed. I have a rudimentary radar enhanced cruise control in my minivan and I've found it's really helpful for maintaining a safe stopping distance while driving.

breakfastduck 2021-08-16 21:00:25 +0000 UTC [ - ]

That's not what they bloody sell it as, though. That's the key.

aguasfrias 2021-08-16 20:01:48 +0000 UTC [ - ]

You might be giving too much credit to your ability to pay attention to your surroundings. It is possible that looking around as a passenger might actually increase risk. There's no way to tell other than looking at the data.

Personally, I tend to turn off things like lane keeping because I end up having to babysit them more than I would like. It doesn't always read the lanes correctly, though I have not tried Tesla's technology yet.

new_realist 2021-08-16 20:27:42 +0000 UTC [ - ]

As someone who has used Autopilot extensively, I can tell you: you only have the illusion of enhanced safety. In reality, parts of your brain have shut down to save energy, and you've lost some situational awareness, but you can't tell that's happened.

theopsguy 2021-08-16 23:13:07 +0000 UTC [ - ]

If you can’t tell, how do you know this is the case?

yumraj 2021-08-16 22:03:57 +0000 UTC [ - ]

It is possible that you've learnt to drive like that and that it works for you.

But I feel that this depends on the type of driver and their personality. I, for one, have never felt comfortable with cruise controls, even adaptive ones, let alone partial self-driving. I've always found that I'm more comfortable when I'm driving myself rather than trying to make sure that the adaptive cruise control is able to make a complete stop in case of emergencies. Perhaps I'm just a little untrusting and paranoid :).

andyxor 2021-08-16 20:20:47 +0000 UTC [ - ]

I rented a Model X with the latest FSD a few weeks ago, and even simple things like lane detection are very inconsistent and unpredictable.

I don't know if this "AI" has any sort of quality control, but how difficult is it to test whether it detects a solid white line on the side of the road in at least 6 out of 10 tries?

It also tends to suddenly disengage and hand control back to the driver at the most dangerous parts of the trip, e.g. when passing another car in a narrow lane.

This "driver assistant" is a series of disasters in the making.

hellbannedguy 2021-08-16 20:16:42 +0000 UTC [ - ]

There's not a small part of your psyche that tells you it's ok to drive while tired, or all the way home from that Las Vegas trip because the technology is so good?

jedberg 2021-08-16 20:51:00 +0000 UTC [ - ]

The thing is, I get a lot less tired when I'm driving now, because I get to focus only on the novel stimulus (possible emergencies) and not the mundane.

But no, I don't trust it to drive itself. If I'm tired I won't drive, regardless of autopilot.

phyzome 2021-08-17 00:17:42 +0000 UTC [ - ]

> I no longer have to look at my speedometer

I'm curious about this part. Do you manually input a limit, or trust it to read street signs?

And how often do you look at your speedometer anyhow? I think on the highway I glance at it maybe once every few minutes and otherwise match speed with the other vehicles, and in the city I look more often but mostly just drive at what feels a safe pace (which seems to match the limits, more or less.)

jedberg 2021-08-17 00:25:36 +0000 UTC [ - ]

I don't drive a Tesla, but the one time I did drive one it read the street signs. In my car I just set the speed manually.

It's true, I don't look at the speedometer all that often when driving manually, but it's just one less thing to worry about.

asdff 2021-08-16 22:36:23 +0000 UTC [ - ]

I think that's pretty reckless honestly, you put a lot of faith in the system being able to detect lane markers. Other than that I could see how adaptive cruise control can be nice, but it's also not hard to engage cruise control and fine tune your speed to the conditions by tapping up or down on the buttons on the wheel.

jedberg 2021-08-16 23:13:18 +0000 UTC [ - ]

I don't put any faith in it at all actually. That's why I pay attention while it drives, looking for novel situations, which would include self driving errors, so I can correct them.

jeffrallen 2021-08-16 21:55:01 +0000 UTC [ - ]

I tried it and found I was spending brainpower fighting the system, sending noise inputs unrelated to real-world conditions in order to trick it into not disengaging because it decided I wasn't "driving" it enough. The hacker in me loved it; the rational person in me said, turn that off before it makes you crash!

e40 2021-08-16 20:51:27 +0000 UTC [ - ]

For you, definitely. For the people I see reading while the car drives them, not at all.

sgustard 2021-08-16 21:53:52 +0000 UTC [ - ]

I agree 100% with jedberg as to my own driving experience with autopilot. Works great, and I still pay complete attention because I don't want to die in a fiery crash. If you're not going to pay attention, driving a dumber car doesn't make it safer.

sorokod 2021-08-16 20:32:23 +0000 UTC [ - ]

Survivorship bias?

spywaregorilla 2021-08-16 20:55:32 +0000 UTC [ - ]

That would only be relevant if a substantial portion of people who felt poorly about Tesla autopilot had literally perished from it.

sorokod 2021-08-16 21:44:45 +0000 UTC [ - ]

It was said in jest, but to your comment, no need to perish - just not be vocal about the negative feelings.

mensetmanusman 2021-08-16 13:07:44 +0000 UTC [ - ]

I remember when Google was first presenting on driverless technology about 10 years ago, and they mentioned how you have to go right to full self driving, because any advanced driver assistance will clash with human risk compensation behavior.

Risk compensation is fascinating; driving with a bike helmet causes the biker and drivers around the biker to behave more dangerously.

Is society sophisticated enough to deal with advanced driver assistance? Is it possible to gather enough data to create self driving ML systems?

WA 2021-08-16 19:51:34 +0000 UTC [ - ]

> Risk compensation is fascinating; driving with a bike helmet causes the biker and drivers around the biker to behave more dangerously.

Do you have a truly reliable source for that? Because I hear this statement once in a while, and it feels flawed.

A helmet protects you from severe head injury if you are in an accident. There are more reasons for accidents than reckless car drivers. For example:

- Bad weather

- Driver not seeing the biker at all (no matter with or without helmet)

- Crash between 2 cyclists

xsmasher 2021-08-16 21:21:25 +0000 UTC [ - ]

Parent did not say that helmets make you less safe. They said that helmets make drivers around the biker behave more dangerously.

https://www.bicycling.com/news/a25358099/drivers-give-helmet...

brandmeyer 2021-08-16 22:10:44 +0000 UTC [ - ]

3.5 inches on an average of ~1 meter was the measurement, in a study that a single researcher performed using himself as the rider.

This result is both weakly supported and small, and it shouldn't be considered actionable.

jacquesm 2021-08-16 14:17:14 +0000 UTC [ - ]

Risk compensation probably also works the other way. I'm looking forward to all new cars coming standard with a safety device that cuts traffic accidents to a small fraction of what they used to be; the only accidents remaining are all fatal for the driver.

A nice and very sharp 8" stainless steel spike on the steering wheel facing the driver.

toast0 2021-08-16 20:31:10 +0000 UTC [ - ]

> A nice and very sharp 8" stainless steel spike on the steering wheel facing the driver.

Didn't we have those in the 50s and 60s? Maybe not sharp, but collapsible steering columns are a significant improvement to survivability.

barbazoo 2021-08-16 20:04:38 +0000 UTC [ - ]

> Risk compensation is fascinating; driving with a bike helmet causes the biker and drivers around the biker to behave more dangerously.

Source please

bllguo 2021-08-16 20:28:06 +0000 UTC [ - ]

I remember reading that viewpoint in this essay: https://cyclingtips.com/2018/11/commentary-why-i-stopped-wea...

there are some sources and studies linked, e.g. countries with the highest rate of helmet use also have the highest cyclist fatality rates

xsznix 2021-08-16 20:15:20 +0000 UTC [ - ]

barbazoo 2021-08-16 20:48:00 +0000 UTC [ - ]

I cannot open the study in the first link, but the second one seems to actually refute the claim instead of supporting it.

> There is a body of research on how driver behaviour might change in response to bicyclists’ appearance. In 2007, Walker published a study suggesting motorists drove closer on average when passing a bicyclist if the rider wore a helmet, potentially increasing the risk of a collision. Olivier and Walter re-analysed the same data in 2013 and claimed helmet wearing was not associated with close vehicle passing.

xsmasher 2021-08-16 21:24:14 +0000 UTC [ - ]

Keep reading.

> We then present a new analysis of the original dataset, measuring directly the extent to which drivers changed their behaviour in response to helmet wearing. This analysis confirms that drivers did, overall, get closer when the rider wore a helmet.

barbazoo 2021-08-17 20:55:13 +0000 UTC [ - ]

Yes, you're right. I should have read the whole thing.

phoe18 2021-08-16 12:58:30 +0000 UTC [ - ]

> "The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes,"

No mention of the deceptive marketing name "Full Self Driving" in the article.

dmix 2021-08-16 15:12:57 +0000 UTC [ - ]

I checked the website and they seem to be contextualizing "Full-self driving" with it coming at a future date:

> All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. [...] As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates.

https://www.tesla.com/en_CA/autopilot

I also personally would prefer they stuck to 'autopilot' and avoided the word full in 'full self-driving' and otherwise be more specific about what it means.

Other car companies typically productize the various features like lane assist, following cruise control, etc. rather than bundling them into one. But that definitely makes communicating it more difficult.

Tesla probably doesn't want to call it 'limited self-driving' or 'partial self-driving'. Maybe 'computer assisted driving' but that doesn't sound as appealing. I can see the difficulty marketing here. But again not using 'full' as in it's complete and ready-to-go would help.

rvz 2021-08-16 13:23:55 +0000 UTC [ - ]

Exactly. Some of these cars do not even have 'Driver Monitoring', which means the car doesn't even track if the driver has their eyes on the road at all times, which puts many other drivers at risk.

On top of that, FSD is still admittedly Level 2; not exactly 'Full Self Driving', is it? And the controls can easily be tricked into thinking that the driver has their 'hands on the wheel', which is not enough to determine driver attentiveness while FSD is switched on.

xeromal 2021-08-16 19:48:24 +0000 UTC [ - ]

I'm pretty sure that's because FSD is out to a limited number of users at the moment. I think it totals around 1,000.

kube-system 2021-08-16 20:09:35 +0000 UTC [ - ]

This is just more evidence of the confusion that Tesla marketing has created. "Full Self-Driving Capability" is the literally quoted option they've been selling for years now.

guerby 2021-08-16 15:09:35 +0000 UTC [ - ]

There are about 36,000 deaths on the road per year in the USA across 280 million vehicles; that's 128.5 deaths/million vehicles/year.

If we assume the number of Tesla autopilot deaths doubles this year to 8 (from 4 at the time the probe launched), for about 900 thousand Teslas on the road in the USA, that's 8.9 autopilot deaths/million Teslas/year.

That's a ratio of 14.4 between the two numbers.

Tesla's own reporting says that for Q1 2021 there was one crash on autopilot per 4.19 million miles vs one crash per 484 thousand miles for all vehicles.

That's a ratio of 8.7.

All numbers are full of biases and their ratio probably aren't that meaningful but they end up in the same magnitude.
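For anyone who wants to sanity-check the arithmetic, here's a quick back-of-the-envelope script using only the figures quoted above (the same caveats about bias apply):

    # Back-of-the-envelope check of the two ratios quoted above.
    us_deaths_per_year = 36_000      # road deaths per year, USA
    us_vehicles_millions = 280       # vehicles on the road, in millions
    us_rate = us_deaths_per_year / us_vehicles_millions    # ~128.6 deaths/million vehicles/year

    tesla_autopilot_deaths = 8       # assumed doubling from 4 at probe launch
    teslas_millions = 0.9            # ~900k Teslas on US roads
    tesla_rate = tesla_autopilot_deaths / teslas_millions  # ~8.9 deaths/million Teslas/year

    print(round(us_rate / tesla_rate, 1))   # ~14.5 (the 14.4 above comes from the pre-rounded 128.5/8.9)

    # Tesla's own Q1 2021 crash reporting
    miles_per_crash_autopilot = 4_190_000
    miles_per_crash_all = 484_000
    print(round(miles_per_crash_autopilot / miles_per_crash_all, 1))  # 8.7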

Interesting data there "Fatality Facts 2019 Urban/rural comparison":

https://www.iihs.org/topics/fatality-statistics/detail/urban...

"Although 19 percent of people in the U.S. live in rural areas and 30 percent of the vehicle miles traveled occur in rural areas, almost half of crash deaths occur there. "

I was shocked that in the USA in 2019 about 40-46% of all people killed on the road were unbelted, while 90% of front-seat occupants wear seat belts according to observational studies.

Incidentally, a Tesla will beep to no end if weight is detected on a seat and the seat belt isn't clicked: I have to click the seat belt when I put my (not so heavy) bag on the passenger seat, since there's no software option to disable the beeping.

akira2501 2021-08-16 22:14:35 +0000 UTC [ - ]

> If we assume

You shouldn't. 16% of accidents are pedestrians. 8% are motorcyclists. 40% of accidents involve excessive speed or drugs and alcohol.

Accidents aren't a fungible item you can do this with.

> bag on the passenger seat since there's no software option to disable the beeping.

There is in the US for the rear seats. Additionally, you can just leave the belt always clicked in and just sit on top of them. There aren't many great technological solutions to human behavior.

jeffbee 2021-08-16 19:51:25 +0000 UTC [ - ]

You can't really make the comparison between the entire US fleet and Tesla alone. All Teslas are newer than the median car in the fleet, and Tesla owners are self-selected among wealthy people, because the cars are pretty expensive. The IIHS says that deaths per million vehicle-years among midsize luxury cars is 20. There are many cars where no driver died in a given year, for example the Mercedes C-class "4matic" sedan.

btbuildem 2021-08-16 19:29:13 +0000 UTC [ - ]

> I was shocked that in the USA in 2019 about 40-46% of all road death people were unbelted, while 90% of front seat people wear seat belts according to observation studies.

Doesn't that just speak to the effectiveness of seatbelts? Most people wear them, and two-fifths of those who died in a crash did not wear a seatbelt.

If belted occupants made up the same share of deaths as they do of drivers (about 90%), that would indicate the belts are ineffective.
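To put rough numbers on it (a crude relative-risk estimate from the figures upthread; it ignores that belted and unbelted drivers almost certainly differ in miles driven and behaviour):

    # Crude relative fatality risk, unbelted vs belted, from the figures above.
    share_belted = 0.90            # observed front-seat belt use
    share_deaths_unbelted = 0.43   # midpoint of the 40-46% quoted above

    # Risk per person in each group, up to a common constant:
    risk_unbelted = share_deaths_unbelted / (1 - share_belted)   # 0.43 / 0.10
    risk_belted = (1 - share_deaths_unbelted) / share_belted     # 0.57 / 0.90

    print(round(risk_unbelted / risk_belted, 1))  # ~6.8x higher fatality risk when unbelted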

guerby 2021-08-16 20:18:47 +0000 UTC [ - ]

Yes we agree.

It's just that in France "only" 20-25% of fatalities are people not wearing a seatbelt.

Observational statistics show about 98-99% of front-seat occupants wearing seat belts in France.

Front seat belts have been mandatory since 1 July 1973, and rear seat belts since 1 October 1990.

So seat belts, not Tesla Autopilot or whatever, would save around 8,000 lives per year in the USA.

Do Tesla cars in the USA nag about seat belts with no software off switch, like in France?

zzt123 2021-08-16 19:35:12 +0000 UTC [ - ]

Assuming that belted and unbelted people get into accidents at the same rate, and given another commenter mentioning that Autopilot users have an 8.7x lower accident rate than baseline, that makes Autopilot a greater safety add than seat belts, no?

foepys 2021-08-16 20:25:34 +0000 UTC [ - ]

Can Autopilot work in heavy rain or fog? If not, those comparisons are useless. Those are the conditions where most accidents occur, not in sunny Californian weather.

guerby 2021-08-16 20:39:54 +0000 UTC [ - ]

Yes up to a point where it gives up and asks the driver to take back the wheel.

But at this point you really see nothing and you'll limit your speed to 10-40 km/h by yourself.

I've used it in those situations on my Tesla Model 3 to be able to focus as much as possible on what little visibility was left, with both hands firmly on the wheel and a foot over the brake; low visibility is really dangerous and scary on the road.

Part of the issue is that you don't know what speed the car arriving behind you will have, so where's your optimum speed? Too slow and you get rear-ended by bad drivers; too fast and it won't go well.

It's fresh in my mind since I had such driving conditions two weeks ago on the highway. Trucks stayed at a suicidal 90 km/h ...

darkwizard42 2021-08-16 19:12:43 +0000 UTC [ - ]

Two notes:

1. Generally speaking, the right way to think about accidents/fatalities/hard braking events is per miles driven, given that risk scales with time spent on the road (and miles driven is the best proxy we have at the moment; insurance companies use this stat)

2. If wearing a seat belt prevents a ton of fatalities as advertised and generally proven, it would make sense that of the road fatalities that do happen, many are due to not wearing a seat belt.

10% of people not wearing seat belts is still hundreds of millions of miles driven without seat belts.

jazzyjackson 2021-08-16 19:18:49 +0000 UTC [ - ]

My issue with comparing these statistics is that on the highway I see no shortage of reckless driving: speeding 20 over, weaving through traffic, etc. Subtract this population (maybe by taking away their licenses and building public transit for them) and what do the numbers look like? Your seat belt stat supports this: a lot of drivers aren't even trying not to die.

Of course, highway driving is the least dangerous (per passenger mile) since everything is so predictable. I don't know how many deaths are caused by T-bones at intersections, but those at least should disappear now that automatic radar braking is a thing… (Tesla thinks it's too good for radar of course, even though it can't recognize a god damn fire truck is in the way)

yawaworht1978 2021-08-16 22:22:38 +0000 UTC [ - ]

I remember that video: "driver is only there because of regulatory rules". That is a flat-out lie, safe to say so by now. The autopilot accidents-per-distance figure is also cherry-picked: turn on autopilot everywhere, including bad weather, and see how that comparison goes. And the claim that the cars have all the hardware for future FSD is quite out there too. It's a bit like saying I have the next Michael Phelps here, he just cannot swim yet.

sidibe 2021-08-16 13:08:33 +0000 UTC [ - ]

Glad the regulators are looking into this. It bothers me that now Tesla seems to have no liability at all for the system not working, since it's always the driver's fault for not paying enough attention.

As Teslas get better at driving, drivers will inevitably pay less attention. Tesla needs to start taking responsibility at some point.

bob33212 2021-08-16 13:35:16 +0000 UTC [ - ]

Every year young drivers die because they were inexperienced and didn't realize they were going too fast or too slow for a certain situation.

Once full self driving is statistically safer than humans how will you not let people use it? It is like saying you would rather have 10 children die because of bad driving skills rather than 1 child die because they were not paying attention at all times.

sidibe 2021-08-16 13:41:44 +0000 UTC [ - ]

>Once full self driving is statistically safer than humans how will you not let people use it?

I'm fine with self-driving if/when it works (though I'm pretty sure from watching FSD Beta videos shot and edited by their biggest fans with a few interventions every 5 minutes, this is many many many years away for Tesla). But the company selling the self driving has to be responsible to some degree for the mistakes it makes.

WA 2021-08-16 19:58:29 +0000 UTC [ - ]

And responsible for the marketing it puts out:

https://www.tesla.com/videos/autopilot-self-driving-hardware...

"… HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF."

Online since 2016, debunked as a lie. Still on Tesla’s website.

thebruce87m 2021-08-16 22:46:48 +0000 UTC [ - ]

Just statistically safer won't cut it - it will have to be many orders of magnitude safer. Instead of drunk people and mobile phone users dying it will be random accidents that humans would easily have avoided but that are some weird edge case for the ML model. It'll be cars plowing down kids on trikes on a clear day, all captured in perfect HD on the car's cameras and in the press the next day with the crying driver blaming the car.

That’ll be a hard thing to overcome for the public. The drunk person “had it coming”, but did little Timmy?

bob33212 2021-08-17 02:08:34 +0000 UTC [ - ]

So drunk people don't kill innocent people? Obviously it can't be just barely safer than humans. It needs to be a lot safer to change public opinion.

jazzyjackson 2021-08-16 19:21:33 +0000 UTC [ - ]

As far as that goes, if we want to save lives we can just regulate that semi-autonomous cars have to enforce the speed limit, and that zero visibility from fog or heavy rain triggers an automatic pull-over to wait for conditions to improve.

Vecr 2021-08-17 00:18:31 +0000 UTC [ - ]

I don't think cars enforcing the law is a good idea at all. You have the moral and legally defensible right to break the law where not doing so would create substantial and immediate risk to yourself or others.

jazzyjackson 2021-08-17 03:13:12 +0000 UTC [ - ]

Ok so just log it and send the registered owner a bill, they can defend the action in court

Maybe Elon can add a prompt in the car: one time payment to unlock 100+ mph

kube-system 2021-08-16 20:45:39 +0000 UTC [ - ]

There are a lot of good points here in the comments already about the relative safety of Tesla's system compared to other vehicles and other automated driving system -- and I think they're probably right.

The differentiating issue with Tesla's system is the way it is sold and marketed. Important operational safety information shouldn't be hidden in fine print. Subtly misleading marketing has unfortunately become acceptable in our culture, but this idea needs to stay out of safety-critical systems.

We need a mandate for clear and standardized labelling for these features, à la the Monroney sticker. All manufacturers should have to label and market their cars with something like SAE J3016. https://www.sae.org/binaries/content/gallery/cm/articles/pre...

TacticalCoder 2021-08-16 13:11:21 +0000 UTC [ - ]

I drive a lot across Europe: as in, really a lot, long trip across several countries, several times a year. I drive enough on the highways to know a few scary situations, like the truck driver in a big curve slightly deviating out of his lane and "pushing" me dangerously close to the median strip for example.

To me driving requires paying constant attention to the road and being always ready to act swiftly: I just don't understand how you can have a "self driving car, but you must be ready to put your hands back on the steering wheel and your foot on the pedal(s)".

I have nothing against many "recent" safety features, like the steering wheel shaking a bit if the car detects you're getting out of your lane without having activated your blinker. Or the car beginning to brake if it detects an obstacle. Or the car giving you a warning if there's a risk when you change lane, etc.

But how can you react promptly if you're not ready? I just don't get this.

Unless it's a fully self-driving car, without even a steering wheel, a car should help you focus more, not less.

KronisLV 2021-08-16 13:21:37 +0000 UTC [ - ]

> But how can you react promptly if you're not ready? I just don't get this.

You cannot, that's the simple truth. You're supposed to focus on the road anyway and should be able to take over once any sort of autopilot or assist system starts working erroneously, yet in practice many people simply assume that those systems being there in the first place means that you can stop focusing on the road altogether.

It feels like the claim of a "fully self driving vehicle" is at odds with actual safety, or at least will remain so until the technology actually progresses far enough to be on average safer than human drivers, moral issues aside. Whether that will take 15, 50 or 500 years, I cannot say.

That said, currently such functionality could be good enough for the driver to take a sip from a drink, fiddle with a message on their phone, or mess around with the navigation system or the radio - things that would get done regardless because people are irresponsible, but which could be made a little bit safer.

ghaff 2021-08-16 13:56:46 +0000 UTC [ - ]

It's nothing (well certainly not everything) to do with people's assumptions. There's a ton of research around how people simply stop paying attention when there's no reason for them to pay attention 99% of the time. It doesn't even need to be about them pulling out a book or watching a movie. It can simply be zoning out.

Maybe, as you say, it's feasible today or soon to better handle brief distractions but once you allow that it's probably dangerous to assume that people won't stretch out those distractions.

Retric 2021-08-16 14:39:42 +0000 UTC [ - ]

We have empirical data showing how safe actual level 2 self driving cars are in practice, so there's no reason to work from base assumptions. Yes, level 2 self driving cars cause avoidable accidents, but the overall rate is very close to the rate for human drivers. The only way that's happening is if they are causing and preventing roughly similar numbers of accidents.

Which means people are either paying enough attention or these self driving systems are quite good. My suspicion is it’s a mix of both, where people tend to zone out in less hazardous driving conditions and start paying attention when things start looking dangerous. Unfortunately, that’s going to cause an equilibrium where people pay less attention as these systems get better.

Brakenshire 2021-08-16 15:10:09 +0000 UTC [ - ]

> We have empirical data showing how safe actual level 2 self driving cars are in practice.

Do we? Where does that come from? The data Tesla provides is hopelessly non-representative because it makes the assumption that the safety of any given road is independent of whether a driver chooses to switch on the system there.

Retric 2021-08-16 15:24:52 +0000 UTC [ - ]

Only overall numbers actually matter here; if self driving is off then that's just the default risk from human driving in those conditions. Talk to your insurance company, they can give you a breakdown by make, model, and trim level.

SpelingBeeChamp 2021-08-16 15:44:21 +0000 UTC [ - ]

I am pretty sure that if I call Geico they will not provide me with those data. Am I wrong?

Retric 2021-08-16 16:07:16 +0000 UTC [ - ]

Mine did, but I don’t use Geico. If they don’t give you the underlying data you can at least compare rates to figure out relative risks.

cma 2021-08-16 15:22:07 +0000 UTC [ - ]

I feel like driver monitoring can keep it safe, and should even be available without autopilot enabled.

Comma.ai makes the monitoring more strict when the system is less certain or when in denser traffic.

JohnJamesRambo 2021-08-16 13:31:38 +0000 UTC [ - ]

These are exactly my arguments to my girlfriend on why she shouldn't use the Autopilot on our Tesla. Your mind will stray; that's exactly what the feature does to you. The feedback loop goes the wrong way. Then boom, you don't see the emergency vehicles at a wreck, apparently. I do blame Elon; he did the Silicon Valley thing of promising a lot of untested stuff before the laws have solidified. Uber, Lime scooters, etc. The Tesla is a great car, but self-driving is orders of magnitude harder than he thinks.

jays 2021-08-16 14:26:46 +0000 UTC [ - ]

Agreed. I'd also add that other car manufacturers have made tradeoffs on safety issues for decades.

So I wonder if it's more about Tesla capitalizing on the hype of self driving cars (with the expensive self-driving add-on) in the short term and less about him misunderstanding the magnitude of difficulty.

Tesla is using the proceeds from that add-on to make them seem more profitable and fund the actual development. It's smart in some aspects, but very risky to consumers and Tesla.

ghaff 2021-08-16 15:24:05 +0000 UTC [ - ]

If you go back a few years, there were clearly expectations being set around L4/5 self-driving that have very clearly not been met.

I still wonder to what degree this was a collective delusion based on spectacular but narrow gains mostly related to supervised learning in machine vision, how much was fake it till you make it, and how much was pure investor/customer fleecing.

lastofthemojito 2021-08-16 15:08:38 +0000 UTC [ - ]

I learned this playing Gran Turismo video games way back when. The game has long endurance races (I seem to remember races that ran about 2 hours, but there may have been longer ones). Eventually you get hungry or thirsty or have to use the bathroom, so you pause the game, take care of business, and resume. It's really easy to screw up if the game was paused while your car was doing anything other than stable, straight travel. A turn that I successfully handled 100 times before can suddenly feel foreign and challenging if I resume there with little context.

Obviously that's not exactly the same thing as taking over for a real car when the driver assistance features give up, but seems similarly challenging to take over the controls at the most precarious moment of travel, without being sort of "warmed up" as a driver.

jcpham2 2021-08-16 18:13:53 +0000 UTC [ - ]

500 laps at Laguna Seca in a manual transmission car let's go!

zemptime 2021-08-16 14:45:49 +0000 UTC [ - ]

I see a lot of comments here postulating how autopilot is a terribly designed feature from people who appear not to be speaking from first hand experience and now I feel compelled to comment, exactly following that HN pattern someone posted about how HN discussions go. That said thanks for keeping this discussion focused & framed as a system design one, doesn't feel like a Tesla hate train so I feel comfortable hoppin' in and sharing. This is a little refreshing to see.

Anyway, perhaps I'm in a minority here, but I feel as though my driving has gotten _significantly safer_ since getting a Tesla, particularly on longer road trips.

Instead of burning energy making sure my car stays in the lane I can spend nearly all my time observing drivers around me and paying closer attention farther down the road. My preventative and defensive driving has gone up a level.

> I just don't understand how you can have a "self driving car but you must but be ready to put your hands back on the steering wheel and your foot on the pedal(s)".

I've not hit animals and dodged random things rolling/blowing into the road at a moment's notice. This isn't letting autopilot drive, it's like a hybrid act where it does the rote driving and I constantly take over to quickly pass a semi on a windy day, not pass it on a curve, or get over some lanes to avoid tire remnants in the road up ahead. I'm able to watch the traffic in front and behind and find pockets on the highway with nobody around me and no clumping bound to occur (<3 those).

To your suspicion, it is a different mode of driving. Recently I did a roadtrip (about half the height of the USA) in a non-Tesla, and I found myself way more exhausted and less alert towards the end of it. Could be I'm out of habit but egh.

Anyway, so far I've been super lucky. I don't think it's possible to avoid all car crashes no matter how well you drive. But I _for sure_ have avoided avoidable ones and taken myself out of situations where they later occurred thanks to the extra mental cycles afforded to me by auto-pilot. My safety record in the Tesla is currently perfect and I'll try and keep it that way.

I don't think autopilot is perfect either but I do think it's a good tool and I'm a better driver for it. Autopilot has definitely helped me spend better focus on driving.

malwrar 2021-08-16 15:13:55 +0000 UTC [ - ]

This expresses the mindset I find myself in when I use Autopilot. It's like enabling cruise control, you're still watching traffic around you but now you don't need to focus on maintaining the correct speed or worry about keeping your car perfectly in a lane. You can more or less let the car handle that (with your hands on the wheel to guard against the occasional jerky maneuver when a lane widens for example) while you focus on the conditions around you.

throwaway0a5e 2021-08-16 15:19:25 +0000 UTC [ - ]

Exactly. It frees the driver from increasingly advanced levels of mundane driving (cruise control manages just speed, adaptive cruise also deals with following distance, lane keeping deals with most of the steering input, etc.), allowing the driver to focus more on monitoring the situation and the strategic portion of driving rather than the tactical. Of course, this relies on the driver actually doing that. They could just devote that extra attention to their phone.

scrumbledober 2021-08-16 20:25:22 +0000 UTC [ - ]

my 2021 Subaru Forester does all of these things and I do feel like I am safer with them on and paying attention to the rest of driving.

somedude895 2021-08-16 16:58:26 +0000 UTC [ - ]

Exactly this. I treat AP like I'm letting a learner drive. Constantly observing to make sure it's doing the right thing. I've been on long road trips and with AP my mind stays fresh for much longer compared to with other cars.

AndrewBissell 2021-08-16 19:06:37 +0000 UTC [ - ]

The problem is, even if your subjective idea of how Autopilot affects your own driving is correct, it appears not to be the case for a significant subset of Tesla drivers, enough that they've been plowing into emergency vehicles at such an elevated rate as to cause NHTSA to open an investigation.

Also, your subjective impressions may be what they are simply because you have not yet encountered the unlucky set of conditions which would radically change your view, as was surely the case for all the drivers involved in these sorts of incidents.

TacticalCoder 2021-08-16 15:53:31 +0000 UTC [ - ]

There's zero Tesla hate here and certainly zero EV hate here, on the contrary: I just feel the interior build quality on the Tesla could be a bit better but I'm sure they'll get there.

I wouldn't want my comment (strangely enough, upvoted a lot) to be mistaken for Tesla hate. I like what they're doing. I just think the auto-pilot shouldn't give a false sense of security.

> I've not hit animals and dodged random things rolling/blowing into the road at a moment's notice.

> I don't think it's possible to avoid all car crashes no matter how well you drive.

Same here... And animals are my worst nightmare: there are videos on YouTube just terrifying.

I do regularly watch crash videos to remind myself of some of the dangers on the road.

somerandomqaguy 2021-08-16 16:38:43 +0000 UTC [ - ]

I think you two are talking about different things.

You're talking about Autopilot, which is just driver assistance technologies: lane keep assistance, adaptive cruise control, blind spot monitoring, etc. It's not meant to replace driver attention; it just monitors sections of the road that the driver can't pay attention to full time. The driver still remains in control and attentive to the road.

The person you're responding to seems to be talking about the Full Self Driving feature, whose initial marketing implied that the driver need not be mentally engaged at all, or could even be too impaired to drive normally. That was later back-pedalled to say that you need to pay attention.

gugagore 2021-08-16 15:15:59 +0000 UTC [ - ]

Some people activate cruise control and then rest their right foot on the floor. I activate cruise control whenever possible because while it is activated, I can drive with my foot resting on the brake pedal. I like being marginally more responsive to an event that requires braking since I don’t need to move my foot from the accelerator.

kwhitefoot 2021-08-16 19:01:13 +0000 UTC [ - ]

What I always tell people is that together me and my car drive better than either of us on their own (Tesla Model S 70D, 2015, AP1.5).

pedrocr 2021-08-16 13:49:03 +0000 UTC [ - ]

I drive a Tesla and don't use the self-steering feature exactly because of this. What I do instead is enable the warnings from the same software like the ones you describe. That is actually a large gain. I'm already paying attention as I'm driving the car at all times and the software helps me catch things I haven't noticed for some reason. Those features seem really well done as the false positives are not too frequent and just a nuisance but the warnings are often valuable.

oblio 2021-08-16 14:25:05 +0000 UTC [ - ]

Does it have/use emergency braking in case of danger, if you don't use self-driving?

caf 2021-08-16 14:30:29 +0000 UTC [ - ]

Yes.

jazzyjackson 2021-08-16 19:30:30 +0000 UTC [ - ]

Yes but they are phasing out radar in favor of vision-only. Model 3 and Y have been shipping without radar braking for the past few months.

wilg 2021-08-16 19:57:39 +0000 UTC [ - ]

They still do emergency braking regardless of the sensor technology.

nickik 2021-08-16 21:54:46 +0000 UTC [ - ]

The vision-only system has passed all required tests for certification and Tesla themselves consider it to be a much safer system now.

ocdtrekkie 2021-08-16 14:14:33 +0000 UTC [ - ]

It can be really jarring too when a car behaves differently than you expect. I regularly use cruise control on my Kia, which makes driving much less stressful. It keeps the car centered in the lane, more or less turns the car with the road, and of course matches the speed of the car in front of it with reasonable stopping distance. I wouldn't call it "self-driving" by any means, but if not for the alert that gets ticked off if your hands are off the wheel too long, it'd probably go on its own for quite a long time without an incident.

However, I have also once so far experienced what happens when this system hits a poorly-marked construction zone. Whilst most construction sites on the interstate system place temporary road lines for lane shifts, this one solely used cones. While I was paying attention and never left the flow of traffic, the car actually fought a little bit against me following the cones into another lane, because it didn't see the cones; it was following the lines.

It doesn't surprise me at all that if someone gets too comfortable trusting the car to do the work, even if they think they're paying attention, they could get driven off the roadway.

hermitdev 2021-08-16 15:54:01 +0000 UTC [ - ]

I was thinking about this the other day - driving in construction. The town I live in is currently doing water main replacement. So, lots of torn up roads, closed lanes and even single-lane only with a flagger alternating directions. No amount of safety cones will make it obvious what's going on.

How do automated systems deal with flaggers? Visibility of the stop/slow sign isn't sufficient to make a determination on whether it's safe to proceed (not to mention "stop" changes meaning here, entirely, from a typical stop sign). Often, whether or not you can proceed comes down to hand gestures from the flagger proper.

Not that I expect any reasonable driver to be using something like autopilot through such a situation, but we've also seen plenty of evidence that there are unreasonable drivers currently using these systems, as well.

ocdtrekkie 2021-08-16 17:08:18 +0000 UTC [ - ]

Conceivably in the somewhat-near future (10+ years), most cars on the road will have some sort of ADAS system, at which point I'd presume it'd start to make sense for construction zones to use some sort of digital signalling. Something like a radio broadcast that can send basic slow/stop flagging signals to a lane of traffic.

Of course, the problem is, if we haven't developed it today, the ADAS systems of today won't understand it in ten years when there's enough saturation to be practical to use it. Apart from Tesla, very few car manufacturers are reckless enough to send OTA updates that can impact driving behavior.

Lane-following ADAS systems of today, mind you, can work relatively fine in construction areas... provided lane lines are moved, as opposed to relying solely on traffic cones.

robomartin 2021-08-16 14:31:00 +0000 UTC [ - ]

It is my belief that the ideal form of truly self-driving vehicles will not happen until vehicles can talk to each other on the road to make each other aware of position and speed data. I don't think this has to be full GPS coordinates at all. This is about short-range relative position information.

A mesh network of vehicles on the road would add the ability for vehicles to become aware of far more than a human driver can ever know. For example, if cars become aware of a problem a few km/miles ahead, they can all adjust speed way before encountering the constriction in order to optimize for traffic flow (or safety, etc.).

Of course, this does not adequately deal with pedestrians, bikes, pets, fallen trees, debris on the road, etc.

Not saying cars would exclusively use the mesh network as the sole method for navigation, they have to be highly capable without it. The mesh network would be an enhancement layer. On highways this would allow for optimization that would bring forth some potentially nice benefits. For example, I can envision reducing emissions through traffic flow optimization.
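As a rough illustration of what such an enhancement layer might carry (the message format and names below are entirely hypothetical and don't correspond to any real V2V standard such as DSRC or C-V2X):

    from dataclasses import dataclass
    import time

    @dataclass
    class HazardBroadcast:
        """Hypothetical short-range V2V message: relative, not absolute, positioning."""
        hazard_type: str          # e.g. "lane_blocked", "sudden_slowdown"
        distance_ahead_m: float   # distance from the rebroadcasting vehicle
        lane: int                 # affected lane
        advised_speed_kph: float  # suggested approach speed
        hop_count: int            # how many vehicles have relayed this message
        timestamp: float

    def relay(msg: HazardBroadcast, gap_to_sender_m: float, max_hops: int = 50):
        """Rebroadcast a hazard warning, adjusting the relative distance for the next car back."""
        if msg.hop_count >= max_hops or time.time() - msg.timestamp > 60:
            return None  # stale or already travelled far enough; drop it
        return HazardBroadcast(
            hazard_type=msg.hazard_type,
            distance_ahead_m=msg.distance_ahead_m + gap_to_sender_m,
            lane=msg.lane,
            advised_speed_kph=msg.advised_speed_kph,
            hop_count=msg.hop_count + 1,
            timestamp=msg.timestamp,
        )

The point being that each car could start easing off long before the obstruction is visible, which is where the traffic-flow (and emissions) benefits would come from.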

Remember that electric cars still produce emissions, just not necessarily directly while driving. The energy has to come from somewhere and, unless we build a massive number of nuclear plants, that somewhere will likely include a significant percentage of coal and natural gas power plants.

The timeline for this utopia is likely in the 20+ year range. I say this because of the simple reality of car and truck ownership. People who are buying cars today are not going to dispose of them in ten years. A car that is new today will likely enter into the used market in 8 to 10 years and be around another 5 to 10. The situation is different with commercial vehicles. Commercial trucks tend to have longer service lives by either design or maintenance. So, yeah, 20 to 30 years seems reasonable.

mhb 2021-08-16 14:10:27 +0000 UTC [ - ]

Yes. This also makes me kind of nervous when just using normal car adaptive cruise control. I feel as though my foot needs to be hovering near the pedal anyway and that's often less comfortable than actually pushing on the pedal and controlling it myself.

comeonseriously 2021-08-16 14:04:30 +0000 UTC [ - ]

I agree with everything you said. I do hope that eventually the tech gets to the point where it can take over full time. We recently took a road trip for our vacation and the amount of road rage we witnessed was ... mind boggling. Don't get me wrong, not everyone is a raging asshole, but there were enough to make me wonder just why so many people are so freaking angry.

kbshacker 2021-08-16 15:58:09 +0000 UTC [ - ]

Exactly, the only driving assistance feature I use is adaptive cruise control, and I don't have plans to use anything more. If I trust autonomous systems too much, I would not be ready when it matters.

Faaak 2021-08-16 14:07:16 +0000 UTC [ - ]

To me they really are aids. Of course you stay focused, but I found that it takes out a lot of mental load, like keeping the car straight and constantly tweaking the accelerator.

It just makes the trips easier on the brain, and thus, for me, safer overall: it's easier to see the overall situation when you've got free mental capacity.

rad_gruchalski 2021-08-16 16:15:24 +0000 UTC [ - ]

The most useful button on my car is the speed limiter. Everything else can go.

zip1234 2021-08-16 14:44:07 +0000 UTC [ - ]

Also, these cars know the speed limits for the road but let you set cruise control/self driving above the speed limit. Seems like for safety purposes that should not be allowed. Not only are people paying significantly less attention but they also are speeding.

aembleton 2021-08-16 14:59:40 +0000 UTC [ - ]

> Also, these cars know the speed limits for the road

Does it always get this correct, or does it sometimes read a 30mph sign on a side road and then slow the car on the motorway down to that speed?

rad_gruchalski 2021-08-16 16:20:38 +0000 UTC [ - ]

Different manufacturers probably use different systems, but no. BMW attempts to read the speed limit signs using the front camera combined with some sort of stored info - it knows that the speed limit is about to change (Mobileye?) - but it very often won't catch a sign in a bend or when the weather is bad. Also, it does not recognize time-restricted speed limits, for example 30 kph from 7:00 to 17:00 Monday to Friday, so it would keep driving 30 kph outside of those hours while 50 kph is allowed. In some places in Germany, it does not recognize the city limits and carries on showing 70 kph for a kilometer longer than it should.
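For what it's worth, a minimal sketch of how a car might arbitrate between a camera-read sign and stored map data (purely hypothetical logic and thresholds, not how BMW, Mobileye or anyone else actually implements it):

    from typing import Optional

    def resolve_speed_limit(map_limit_kph: Optional[int],
                            camera_limit_kph: Optional[int],
                            camera_confidence: float,
                            seconds_since_sign: float) -> Optional[int]:
        """Prefer a recent, confident sign reading; otherwise fall back to map data."""
        sign_is_trustworthy = (camera_limit_kph is not None
                               and camera_confidence > 0.8
                               and seconds_since_sign < 120)
        if sign_is_trustworthy:
            return camera_limit_kph
        # The fallback may be stale and knows nothing about time-restricted limits.
        return map_limit_kph

    # A sign missed in a bend (or in bad weather) leaves the car on possibly-wrong map data:
    print(resolve_speed_limit(map_limit_kph=70, camera_limit_kph=None,
                              camera_confidence=0.0, seconds_since_sign=1e9))  # -> 70

Which matches the failure modes described above: miss one sign, or have an out-of-date map, and the displayed limit is simply wrong.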

zip1234 2021-08-16 15:02:39 +0000 UTC [ - ]

I'm not sure how the cars know the speed limit. Maybe someone else knows? My guess is combo of GPS/camera to position correctly on road and the lookup of known speed limit data. Perhaps it reads signs though?

hermitdev 2021-08-16 15:43:48 +0000 UTC [ - ]

My car shows the speed limit of roads it knows. It uses GPS and stored limits. It also doesn't know the limits of non-major roads and doesn't attempt to show a limit then. My car is a 2013, and I've not paid the $$ to update the maps in that time (seriously, they want $200-$400 to update the maps).

Since I bought my car, Illinois (where I live) has raised the maximum limit on interstates by 10 MPH. My car doesn't know about it. If my car limited me to what it thought the limit was, I'd probably be driving 20 MPH slower than prevailing traffic, a decidedly unsafe situation.

cranekam 2021-08-16 15:50:30 +0000 UTC [ - ]

The rental car I am using now certainly a) reads road signs for speed limit information, b) is definitely fooled by signs on off ramps etc.

It’s hard to imagine how speed limit systems would work without some sort of vision capabilities — a database of speed limits would never be up to date with roadworks and so on.

zip1234 2021-08-16 16:36:34 +0000 UTC [ - ]

Nobody should be using autopilot driving through roadworks anyways.

Sargos 2021-08-16 15:03:18 +0000 UTC [ - ]

Going slower than traffic is actually unsafe and increases the chances of collisions with other drivers.

zip1234 2021-08-16 16:43:23 +0000 UTC [ - ]

Going slower than traffic happens all the time. Over the road trucks often have speed governors set to 60-70 mph for example.

emerged 2021-08-16 15:24:13 +0000 UTC [ - ]

Almost nobody drives at or below the speed limit. It’s dangerous to do so in many places.

filoleg 2021-08-16 15:18:37 +0000 UTC [ - ]

That’s a feature accommodating realities of driving on public roads, not a bug.

If you drive on a 60mph speed limit highway, no one is driving 60mph, everyone is going around 70mph. If you decide to use autopilot and it limits you to 60mph, you singlehandedly start disrupting the flow of traffic (that goes 70mph) and end up becoming an increased danger to yourself and others.

Not even mentioning cases when the speed limits change overnight or the map data is outdated or if a physical sign is unreadable.

zip1234 2021-08-16 16:45:36 +0000 UTC [ - ]

Over-the-road trucks often have speed governors; some companies limit their trucks to 60 mph because it saves a lot of fuel and leads to a much (50%) lower risk of collisions.

filoleg 2021-08-16 17:28:20 +0000 UTC [ - ]

Apples to oranges. Stopping distance of a 16-wheeler is magnitudes larger than that of a typical sedan, so in their case it makes sense.

For specific numbers (after subtracting reaction distance, which is the same for both):

- 55 mph: car 165 ft, 16-wheeler 225 ft

- 65 mph: car 245 ft, 16-wheeler 454 ft

As you can see, the gap between a car's stopping distance and a 16-wheeler's stopping distance grows with speed, and non-linearly at that. Not even mentioning the destructive potential of a car vs. a 16-wheeler.
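Re-deriving the gaps from the numbers quoted above:

    # Braking distances quoted above (reaction distance excluded), in feet.
    braking_ft = {55: {"car": 165, "truck": 225},
                  65: {"car": 245, "truck": 454}}

    for mph, d in braking_ft.items():
        gap = d["truck"] - d["car"]
        print(f"{mph} mph: truck needs {gap} ft more ({d['truck'] / d['car']:.2f}x the car)")
    # 55 mph: truck needs 60 ft more (1.36x the car)
    # 65 mph: truck needs 209 ft more (1.85x the car)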

I would agree with your point if the majority of the road were occupied by 16-wheelers, but that isn't the case (at least in the metro area that I commute to work in).

Source for numbers used: https://trucksmart.udot.utah.gov/motorist-home/stopping-dist...

Note: I agree that it would be safer if everyone drove the exact speed limit, as opposed to everyone going 10mph above the speed limit. However, in a situation where everyone is driving 10mph above the speed limit, you are creating a more dangerous situation by driving 10mph slower instead of driving 10mph above like everyone else.

hnarn 2021-08-16 15:16:50 +0000 UTC [ - ]

> like the truck driver in a big curve slightly deviating out of his lane and "pushing" me dangerously close to the median strip for example

This is a situation that you simply shouldn’t put yourself in. There is no reason to ever drive right next to a large vehicle, on either side, except for very short periods when overtaking them on a straight road.

throwaway0a5e 2021-08-16 15:20:39 +0000 UTC [ - ]

This just isn't realistically possible on most highways except in the lightest traffic conditions. You are gonna spend some time beside trucks whether you like it or not.

hnarn 2021-08-16 15:25:47 +0000 UTC [ - ]

Spending time right next to a truck is completely optional. You can either speed up or slow down, either of them will put you in a position where you are no longer right next to them.

occamrazor 2021-08-16 16:44:30 +0000 UTC [ - ]

What if there is a more or less uninterrupted row of trucks in the right lane?

hnarn 2021-08-16 16:49:17 +0000 UTC [ - ]

We can play “what if” all day, but I’m not interested. In 99,9% of cases you can and should avoid driving next to a large vehicle.

paul7986 2021-08-16 13:55:27 +0000 UTC [ - ]

Fully automated self-driving cars are either a pipe dream or decades away, during which many more people will be killed on the road in the name of technological progress.

ra7 2021-08-16 14:05:48 +0000 UTC [ - ]

Fully autonomous cars are already a reality with Waymo in AZ and AutoX, Baidu in China. I don't know how safe the Chinese companies are, but Waymo's safety record [1] is nothing short of stellar.

[1] https://waymo.com/safety

ghaff 2021-08-16 14:30:21 +0000 UTC [ - ]

Good for Waymo, and hopefully Google keeps up this science project. But it's a very limited and almost as perfect an environment as you could have outside of a controlled test area. Those who were saying L4/5 was at least decades away seem to have been on the right track. Kids growing up today are going to have to learn to drive.

ra7 2021-08-16 14:34:58 +0000 UTC [ - ]

L5 may be decades away. I think we will see L4 in some major metro areas in the US by end of this decade. SF is heating up with Cruise and Waymo's heavy testing. Their progress will be a great indicator for true city driving.

ghaff 2021-08-16 14:40:19 +0000 UTC [ - ]

>we will see L4 in some major metro areas in the US by end of this decade

I think you're far more likely to see L4 on limited-access highways in good weather. A robotaxi service in a major city seems much more problematic given all the random behavior by other cars, pedestrians, cyclists, etc., and picking up/dropping off people in the fairly random ways that taxis/Ubers do. (And you'll rightly be shut down for 6 months for an investigation the first time you run over someone, even if they weren't crossing at a crosswalk.)

And for many people, including myself, automated highway driving would actually be a much bigger win than urban taxi rides which I rarely have a need for.

ocdtrekkie 2021-08-16 14:18:48 +0000 UTC [ - ]

Waymo selected the one state willing to entirely remove any safety reporting requirements for self-driving cars as the place to launch their service. Regardless of what they claim to the contrary, if they had confidence in their safety record, they would've launched it in California, not Arizona.

Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe. A former employee expressed disappointment they weren't the first self-driving car company to kill someone, because that meant they were behind.

ra7 2021-08-16 14:22:35 +0000 UTC [ - ]

> Regardless of what they claim to the contrary, if they had confidence in their safety record, they would've launched it in California, not Arizona.

California only months ago opened up permits for paid robotaxi rides. So no, they couldn't have launched it in CA. If you've noticed, they actually are testing in SF with a permit.

> Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe.

What lies? Their CA disengagement miles are for everyone to see, their safety report is open, they have had 0 fatalities in their years of operation. Seems like you just made this up.

dragonwriter 2021-08-16 15:00:05 +0000 UTC [ - ]

> California only months ago opened up permits for paid robotaxi rides. So no, they couldn't have launched it in CA.

Well, yeah, that's the logic of an established business. Disruptive startups flout laws rather than following them.

ocdtrekkie 2021-08-16 14:38:35 +0000 UTC [ - ]

I recall a particular incident where Waymo was marketing their car being able to drive a blind man to a drive-thru, way before the thing could safely drive more than a mile on its own. My understanding is that in 2021, it still can't navigate parking lots (which would preclude using it for drive-thrus).

Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.

https://www.technologyreview.com/2014/08/28/171520/hidden-ob...

The article is seven years old at this point, but it's also incredibly humbling in how much bull- Waymo puts out, especially compared to the impressions their marketing team puts out. (Urmson's son presumably has a driver's license by now.)

In at least one scenario, the former Waymo engineer upset he had failed to kill anyone yet ("I’m pissed we didn’t have the first death"), caused a hit-and-run accident with a Waymo car, and didn't report it to authorities, amongst other serious accidents: https://www.salon.com/2018/10/16/googles-self-driving-cars-i... Said star Waymo engineer eventually went to prison for stealing trade secrets and then got pardoned by Donald Trump. Google didn't fire him for trying to kill people, they only really got upset with him because he took their tech to Uber.

I'd say Waymo has a storied history of dishonesty and coverups, behind a technology that's more or less a remote control car that only runs in a narrow group of carefully premapped streets.

ra7 2021-08-16 14:53:10 +0000 UTC [ - ]

> I recall a particular incident where Waymo was marketing their car being able to drive a blind man to a drive-thru, way before the thing could safely drive more than a mile on it's own.

How is a marketing video from 2015 relevant to their safety record? They weren't even operating a public robotaxi service back then.

> My understanding is that in 2021, it still can't navigate parking lots (which would preclude using it for drive-thrus).

Completely false. Here is one navigating a Costco parking lot (can't get any busier than that) [1]. If you watch any videos in that YouTube channel, it picks you up and drops you off right from the parking lot. Yes, you can't use it for drive-thrus, but it doesn't qualify as "lying about capabilities".

> Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.

Here is one recognizing a handheld stop sign from a police officer while it stopped for an emergency vehicle [2].

[1] https://www.youtube.com/watch?v=p5CXcJD3mcU

[2] https://www.youtube.com/watch?v=MpDbX1FViWk&t=75s

nradov 2021-08-16 15:31:46 +0000 UTC [ - ]

The workers doing road repairs in my neighborhood don't even use handheld stop signs. Just vague and confusing gestures.

ra7 2021-08-16 15:36:45 +0000 UTC [ - ]

I think in those cases a Waymo vehicle would probably require remote assistance. It's a really difficult scenario for a computer to make sense of.

andreilys 2021-08-16 14:29:22 +0000 UTC [ - ]

> which many more people will be killed on the rise in the name of technological progress.

Seeing as car crashes are the leading cause of death for people aged 1-54, it may be an improvement on the status quo.

More than 38,000 people die every year in crashes on U.S. roadways. The U.S. traffic fatality rate is 12.4 deaths per 100,000 inhabitants. An additional 4.4 million are injured seriously enough to require medical attention. Road crashes are the leading cause of death in the U.S. for people aged 1-54.

hn8788 2021-08-16 14:34:16 +0000 UTC [ - ]

I'd say it depends on how many of those deaths are caused by the driver doing something unsafe. I'd be more comfortable with higher traffic deaths that primarily affect bad drivers than a lower number of deaths randomly spread across all drivers by a blackbox algorithm.

_ph_ 2021-08-16 14:51:44 +0000 UTC [ - ]

If you are texting while driving and hit a stopped car or run a red light, you are very likely to kill others. Actually more likely, as a side impact is more dangerous than a frontal one.

jazzyjackson 2021-08-16 19:35:54 +0000 UTC [ - ]

But the car doesn’t need to drive itself to avoid those factors, it just needs to have radar auto braking

ac29 2021-08-16 16:36:59 +0000 UTC [ - ]

> Road crashes are the leading cause of death in the U.S. for people aged 1-54

This isn't true according to the CDC. Cancer and heart disease lead for the 44-54 group, and while "accidental injury" does lead from 1-44, if you break down the data, in many cases vehicle-based accidents are not the largest single source. For example:

Drowning is the largest single cause in 1-4

Cancer is the largest single cause in 5-9

Suicide is the largest single cause 10-14

https://wisqars-viz.cdc.gov:8006/lcd/home

hnburnsy 2021-08-16 14:25:04 +0000 UTC [ - ]

Will changes such as machine-readable road markings, car-to-car communications, and traffic management systems make this happen quicker?

For example, couldn't emergency vehicles send out a signal directly to autonomous vehicles, or via a traffic management system, to slow down or require the driver to take over when approaching? An elementary version of this is Waze, which will notify you of road hazards or cars stopped on the side of the road.
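
As a purely hypothetical sketch (none of these message fields, names, or thresholds correspond to any real V2X standard), such a broadcast-and-react loop might look something like this:

    # Hypothetical sketch (not any real V2X standard): an emergency vehicle
    # broadcasts its position and status; an approaching autonomous car picks
    # a response based on distance. Field names and thresholds are invented.
    import math
    from dataclasses import dataclass

    @dataclass
    class EmergencyBroadcast:
        vehicle_id: str
        lat: float
        lon: float
        status: str          # e.g. "stopped_on_shoulder"

    def distance_m(lat1, lon1, lat2, lon2):
        # Rough equirectangular approximation; fine over a few hundred meters.
        k = 111_320.0        # meters per degree of latitude
        dx = (lon2 - lon1) * k * math.cos(math.radians(lat1))
        dy = (lat2 - lat1) * k
        return math.hypot(dx, dy)

    def respond(msg: EmergencyBroadcast, ego_lat: float, ego_lon: float) -> str:
        d = distance_m(ego_lat, ego_lon, msg.lat, msg.lon)
        if d < 150:
            return "hand control back to the driver"
        if d < 500:
            return "slow down and prepare to change lanes"
        return "no action"

    fire_truck = EmergencyBroadcast("FT-12", 37.7750, -122.4190, "stopped_on_shoulder")
    print(respond(fire_truck, ego_lat=37.7745, ego_lon=-122.4195))  # ~70 m away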

2021-08-16 14:42:36 +0000 UTC [ - ]

dotdi 2021-08-16 12:59:46 +0000 UTC [ - ]

Unfortunately the article doesn't mention anything about how common it is for human drivers to crash into first responder vehicles during the night. I'm not trying to downplay these cases, as hitting emergency vehicles is very bad indeed, yet ~4 such crashes per year might be in the same ballpark as, or even better than, the rate at which "unassisted" drivers cause such crashes.

josefx 2021-08-16 13:40:12 +0000 UTC [ - ]

Might be around 98 a year if this[1] is the correct list.

Edit: I think the page count at the bottom of that list is off, it seems to repeat the last page so it might be less.

[1]https://www.respondersafety.com/news/struck-by-incidents/?da...

onlyrealcuzzo 2021-08-16 19:32:50 +0000 UTC [ - ]

Considering that less than 0.3% of cars in the US are Teslas and that - I would guess - less than 10% of them are using Autopilot at any given time, they are probably hundreds of times more likely to hit first responders.
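
Rough back-of-envelope arithmetic, using the commenter's own guesses (0.3% of the fleet, ~10% Autopilot engagement), the ~98 struck-by incidents/year from the list linked above, and the probe's 11 Autopilot crashes over roughly 2.5 years mentioned elsewhere in the thread; all of these numbers are loose:

    # Back-of-envelope check of the "hundreds of times" claim; every input
    # here is a rough assumption taken from the thread, not official data.
    national_incidents_per_year = 98       # from the respondersafety.com list above
    tesla_share_of_fleet = 0.003           # commenter's guess: <0.3% of US cars
    autopilot_engaged_share = 0.10         # commenter's guess: <10% on Autopilot

    autopilot_incidents_per_year = 11 / 2.5    # NHTSA probe: 11 crashes in ~2.5 years
    expected_if_proportional = (
        national_incidents_per_year * tesla_share_of_fleet * autopilot_engaged_share
    )

    print(autopilot_incidents_per_year / expected_if_proportional)  # ~150x over-represented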

btbuildem 2021-08-16 19:32:45 +0000 UTC [ - ]

Indeed! I was looking for the same comparison.

In Canada the Highway Act states that you must move over (change lanes) for stopped emergency vehicles. It seems to solve that problem gracefully, leaving an empty lane between the stopped vehicles and traffic.

toomuchtodo 2021-08-16 13:01:29 +0000 UTC [ - ]

Also doesn’t mention that this is an inherent limitation in TACC+ systems, and is specifically called out as such in Volvo, BMW, and Cadillac vehicle manuals as a limitation. Much ado about nothing unless regulators are going to outlaw radar based adaptive cruise control (which, of course, they’re not).

salawat 2021-08-16 13:32:41 +0000 UTC [ - ]

Frankly, it's hard to even crash into an emergency vehicle, in my opinion, while actually driving and paying attention, given their lights have gotten so darn bright it's damn near blinding. Frankly, I have to slow to a crawl not out of rubbernecking fascination, but out of self-preservation, to adapt to the dang lighting searing my retinas at night.

Now, running into unlit emergency vehicles? I still think that's rather difficult sans inebriation or sleep deprivation.

lacksconfidence 2021-08-16 14:55:39 +0000 UTC [ - ]

The lights are actually what cause the crash. Some drivers just drive straight into the lights. This is part of why police, at least around here, have particular protocols around how far they stop behind a car, and never stand between the cop car and the car they stopped.

darkerside 2021-08-16 12:59:53 +0000 UTC [ - ]

Long overdue. We're going to need a more rapid and iterative way to do this if we're going to have even a chance of autopilot-type technologies succeeding over the long run. Companies and regulators need to be collecting feedback and pushing for improvement on a regular basis. I still don't think it's likely to succeed, but if it did, this would be the way.

crubier 2021-08-16 13:00:58 +0000 UTC [ - ]

Predictable outcome of a sensing system fully based on deep learning. Rare unusual situations don't have enough training data and lead to unpredictable output.

I still think that Tesla's approach is the right one, I just think they need to gather more data before letting this product be used in the wild unsupervised.

judge2020 2021-08-16 13:02:24 +0000 UTC [ - ]

Current TACC/auto-steer doesn't use deep learning except on the newest Model 3/Y vehicles with "TeslaVision". All cars with radar use the radar, and radar only, to determine whether they should stop for the car ahead.

dawnerd 2021-08-16 15:16:32 +0000 UTC [ - ]

They definitely use cameras as well to determine stopping. Otherwise there wouldn't have been the issue with bridges or shadows causing phantom braking.

laichzeit0 2021-08-16 15:36:46 +0000 UTC [ - ]

How does TeslaVision work with stationary objects at night? Like say a big ass truck with its lights off? Do you just pray the vision system recognizes “something” is there? I know they want to pursue a pure-vision system with no radar input, but it seems like there will be some crazy low light / low visibility edge cases you’d have to deal with.

marvin 2021-08-16 19:42:07 +0000 UTC [ - ]

How does a human detect a big ass truck with its lights off at night? This is solvable with computer vision. Tesla's dataset is almost nothing but edge cases, and they keep adding more all the time. My money says they'll get there.

laichzeit0 2021-08-17 04:14:23 +0000 UTC [ - ]

The thing is, we don't. Many people die from rear-ending broken-down trucks. I'm fine with that; I'm not so sure regulators will be fine with a Tesla killing someone and going "welp, a human wouldn't have seen it either, let's just add this to our edge case dataset".

360walk 2021-08-16 15:24:39 +0000 UTC [ - ]

I think it is necessary for the crashes to occur, to gather the data required to re-train the auto-pilot. We as a society need to decide whether we want to pay this cost of technological advancement.

crubier 2021-08-16 22:33:30 +0000 UTC [ - ]

No. Gathering data of human drivers braking in those circumstances would result in a perfectly fine dataset. This idea of needing human sacrifice is bonkers.

hetspookjee 2021-08-16 21:35:44 +0000 UTC [ - ]

I wonder how many accidents happened because people were fiddling with their screen to turn on the windscreen wipers and took their eyes off the road too long in poor visibility. I can't believe the law allowed the controls for the wiper speed to be put into this capacitive screen.

leroman 2021-08-16 13:38:49 +0000 UTC [ - ]

Seeing all the near-misses and life saving interventions on YouTube involving Tesla vehicles, 1 death seems not such a bad result..

The question should be - how many lives were saved by this system vs how many would die if driven "normally"?

thereisnospork 2021-08-16 19:29:12 +0000 UTC [ - ]

>The question should be - how many lives were saved by this system vs how many would die if driven "normally"?

It is also necessary to project this into the future, i.e. looking at the integral of expected lives lost 'rushing' self driving cars vs. 'waiting-and-seeing' (as Americans die at a rate of 40,000 per annum).

If twice as many people die for a fixed number of years to create a self driving system that results in half the fatality rate of the status quo, that becomes worth it very, very quickly.
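
A toy version of that trade-off, with the 40,000/year baseline from the comment above and a made-up five-year "rush" period, just to show how quickly the break-even arrives:

    # Toy version of the trade-off described above. The 40,000/year baseline
    # is from the thread; the five-year rush period and the doubling/halving
    # are the comment's hypothetical, not real projections.
    baseline_deaths_per_year = 40_000
    rush_years = 5                                        # illustrative assumption

    extra_deaths_during_rush = baseline_deaths_per_year * rush_years   # doubled rate
    deaths_saved_per_year_after = baseline_deaths_per_year / 2         # halved rate

    break_even_years = extra_deaths_during_rush / deaths_saved_per_year_after
    print(break_even_years)  # 10 years after deployment the "rush" has paid for itself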

young_unixer 2021-08-16 14:58:45 +0000 UTC [ - ]

I don't think that should be the question.

For example: it's not morally equivalent to die while drunk driving at 150 km/h vs dying as a pedestrian because someone ran over you.

I would prefer 10 drunk drivers die instead of just one innocent person.

leroman 2021-08-16 15:07:51 +0000 UTC [ - ]

There's a place for a discussion about the moral repercussions, I doubt it's a 10 drunk driver vs 1 soccer mom situation ;)

nickik 2021-08-16 21:58:37 +0000 UTC [ - ]

The only person who died fell asleep in the car and would have died in any car as far as I remember.

njarboe 2021-08-16 23:30:10 +0000 UTC [ - ]

Anybody have more information about how many, if any, of these wrecks happened after Tesla moved to vision-only Autopilot? Maybe this obvious problem has already been mostly fixed with the new system, and possibly it was a main reason for going to full vision in the first place. It seems to me NHTSA needs to move a little faster with this quickly changing technology, as this probe goes back to examine crashes that happened starting from Jan 2018.

joewadcan 2021-08-16 19:02:50 +0000 UTC [ - ]

This will end up being a very good thing for Tesla. They were able to operate semi-autonomous vehicles for years while they iterated through software versions. They are faaar from done, but a likely outcome will be more stringent regulation on companies that want to take a similar approach of putting betas in customers' hands. This makes it way harder for automotive companies to get the same leeway, putting Tesla further ahead.

jazzyjackson 2021-08-16 19:09:39 +0000 UTC [ - ]

This is good for Bitcoin

geekraver 2021-08-17 15:14:42 +0000 UTC [ - ]

It would be great if Tesla could actually get speed limit detection right for starters, and then slow down the car in time for a new reduced speed limit. Had my MY for just a few days and found that inability pretty annoying. It detected a 40mph road as 60, and a 60mph freeway (which is only ever 60mph) as 70. I’ve had 5 year old Garmin GPS units that do better than that offline. I mentioned this in a local owner group and a bunch others chimed in saying they see this too.

okareaman 2021-08-16 14:18:50 +0000 UTC [ - ]

When I was learning to drive, my grandmother drilled into me to never swerve for an animal that jumps out in front of the car. This saved me when I was driving by the Grand Canyon and jack rabbits kept jumping, out of nowhere seemingly, in front of my car. I drilled "never swerve" into my son and it saved him on a mountain road when he hit a deer. He didn't go into the trees. When I drove in Alaska I asked why the forest was cut back from the road. They said that moose like to step out in front of cars.

I have no idea how self-driving fits into this. I don't have a feel how self-driving responds to emergencies. I'd have to experience an emergency in one. For that reason, I don't see myself ever trusting self-driving.

lawn 2021-08-16 15:18:04 +0000 UTC [ - ]

It's always contextual. If you run into a moose head on, you're in for a very bad time.

rad_gruchalski 2021-08-16 16:42:53 +0000 UTC [ - ]

That's where the other three rules apply:

1. Always pay full attention to where you are, because there might be a truck or a family of 5 coming from the opposite direction.

2. Never lift.

3. Always look in the direction you want to travel, not in the direction you are currently traveling.

kwhitefoot 2021-08-16 19:08:26 +0000 UTC [ - ]

When my wife learned to drive in Norway she was instructed never to swerve to avoid a collision regardless of whether it was an elk, a dog or a human being in front of the car, just to stamp hard on the brake.

The rationale being that swerving most likely puts more people at risk more of the time. Especially true here where leaving the road often means either colliding with the granite cliff wall or ending up in the fjord or lake.

killjoywashere 2021-08-17 03:52:25 +0000 UTC [ - ]

TeslaDeaths.com estimates Tesla at ~240M miles per death as of late 2019 (1). NHTSA puts the national rate for all vehicles in 2020 at ~90M miles per death (2). If we exclude motorcycles, the national rate is 104M miles per death in 2020.

(1) https://www.tesladeaths.com/miles.html (2) https://www-fars.nhtsa.dot.gov/Main/index.aspx

myko 2021-08-16 14:23:48 +0000 UTC [ - ]

I don't know how Tesla ever presumes to achieve FSD when they cannot detect stopped objects in the road, especially emergency vehicles. This is incredibly disappointing.

Does anyone know if the FSD Beta has this ability?

_ph_ 2021-08-16 15:04:01 +0000 UTC [ - ]

All these crashes happened with the radar-controlled Autopilot. A radar system basically cannot detect static obstacles, as it doesn't have the spatial resolution to distinguish an obstacle in your lane from something right beside or above it (bridges). They can only use the radar to follow other cars, because those are not stationary objects.
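
A rough sketch of why that filtering happens (this is not Tesla's actual code; the numbers and names are invented): a radar target whose speed over ground is near zero is indistinguishable from a bridge or a sign, so it gets dropped before the controller ever considers braking for it.

    # Sketch of the stationary-target filter typical of radar-based ACC.
    # Targets whose ground speed is ~zero look like bridges, signs, and
    # roadside clutter, so they are discarded; only moving targets are tracked.
    EGO_SPEED = 30.0              # m/s, the car's own speed
    STATIONARY_TOLERANCE = 1.0    # m/s

    def ground_speed(radial_speed_relative):
        # Radar reports speed relative to the ego car (negative = closing).
        return radial_speed_relative + EGO_SPEED

    def is_followable(target):
        # Keep only targets that are themselves moving (e.g. a lead car);
        # a stopped fire truck has ground speed ~0 and is dropped here.
        return abs(ground_speed(target["radial_speed"])) > STATIONARY_TOLERANCE

    targets = [
        {"id": "lead_car", "radial_speed": -5.0},      # ~25 m/s over ground
        {"id": "fire_truck", "radial_speed": -30.0},   # stationary over ground
        {"id": "overhead_sign", "radial_speed": -30.0},
    ]

    tracked = [t for t in targets if is_followable(t)]
    print([t["id"] for t in tracked])  # only 'lead_car' survives the filter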

Recently, Tesla switched from radar-based to pure optical obstacle recognition. This should vastly improve this kind of behavior. Ironically, the investigation starts just as they have basically gotten rid of the old system.

Look on youtube for videos of the FSD beta. It is amazingly good at recognizing the surroundings of a car, including parked vehicles at the road side.

kwhitefoot 2021-08-16 19:11:34 +0000 UTC [ - ]

> cannot detect stopped objects in the road,

Neither can Volvo or VW.

Actually my 2015 Ap1.5 Model S does detect stopped vehicles, unfortunately not reliably.

zebnyc 2021-08-16 20:38:56 +0000 UTC [ - ]

I was excited to read about Tesla's "autopilot" until I read the details. To me as a consumer, autopilot would let me get in my car after dinner in SF, set the destination as Las Vegas and wake up in the morning at Vegas. Wake me up when that exists.

Or I can tell my car, "Hey tesla, go pickup my kid from soccer practice" and it would know what to do.

tyingq 2021-08-16 13:10:42 +0000 UTC [ - ]

The crash in the city of Woodlands, Texas, was pretty terrifying. After hitting a tree, the car caught on fire. The driver was found in the back seat, presumably because he couldn't figure out how to open the door to get out.

Meekro 2021-08-16 18:28:43 +0000 UTC [ - ]

Interesting! When HN discussed that story a few months ago[1], the common notion was that the driver had enabled autopilot and climbed into the back seat.

Your take seems a lot more plausible.

[1] https://news.ycombinator.com/item?id=26869962

tyingq 2021-08-16 18:52:48 +0000 UTC [ - ]

That discussion was because the local sheriff's office said that there was nobody in the drivers seat at the time of impact. No idea why they would have said that. Tesla says the steering wheel was deformed in a manner consistent with a person being in the driver's seat when it crashed.

bishoprook2 2021-08-16 13:40:01 +0000 UTC [ - ]

I just read about that after your post. Even with the typical lawyer hyperbole it's pretty bad.

It seems to me that Tesla door handles (in a world where they've been designing door latches for some time) are just plain ridiculous and likely unreliable but are a side effect of the market the company has been selling into. Gadgets go a long way with Tesla owners.

Obviously, things like a latch should not only work under all conditions including no-power, but they should probably be the same under all conditions. 'Emergency' latches aren't going to be used during an emergency as muscle memory is too important.

literallyaduck 2021-08-16 14:46:25 +0000 UTC [ - ]

I believe we should have safety probes. Lots of people who have taken money from the auto industry want this specifically for Tesla. There is a strong possibility this is political punishment for wrongthink.

kemiller 2021-08-16 20:03:21 +0000 UTC [ - ]

OK people. There have been a grand total of 11 cases in 2.5 years. NHTSA investigates a lot of things. How many regular drivers collided with emergency vehicles in the same time frame?

jdavis703 2021-08-16 20:38:37 +0000 UTC [ - ]

The FBI has stats on police deaths by type of death. If memory serves correctly slightly more cops were killed in traffic crashes than that.

However, I’m assuming the crashes were quite varied: anything from a driver recklessly fleeing a stop to some drunk crashing into a cop on the highway shoulder. Most likely these deaths didn’t have a systematic pattern to them that could be prevented if only we knew what the root cause was.

2021-08-16 20:45:00 +0000 UTC [ - ]

sunshineforever 2021-08-16 22:25:54 +0000 UTC [ - ]

I wonder how the Autopilot safety record compares to driving in conditions similar to those in which AP is typically used (open highway, good weather).

fallingknife 2021-08-16 13:24:40 +0000 UTC [ - ]

I always see anecdotes about Tesla crashes, but not any statistics vs other cars. I tend to assume that this is because they aren't more dangerous.

catillac 2021-08-16 14:09:24 +0000 UTC [ - ]

My thought would be that we hear lots of anecdotes because they claim a full self driving capability, so it’s particularly interesting when they crash and are using these assist features. Indeed when I bought my Model Y I paid extra for that capability.

Here’s more detail: https://www.tesla.com/support/autopilot

sitkack 2021-08-17 15:00:30 +0000 UTC [ - ]

Sounds like the system should be shut off completely until they find the flaws.

MonadIsPronad 2021-08-16 13:03:24 +0000 UTC [ - ]

Oopsie. This strikes me as perhaps one of the growing pains of not-quite-self-driving: common sense would dictate that manual control would be taken by the driver when approaching an unusual situation like a roadside incident, but we just can't trust the common sense of a minority of people.

Tesla perhaps isn't being loud enough about how autopilot isn't self-driving, and shouldn't even be relied upon to hit the brakes when something is in front of you.

ghaff 2021-08-16 14:04:34 +0000 UTC [ - ]

What on earth makes you think it's the minority of people who stop paying attention when machinery is handling some task all by itself the vast majority of the time? There's plenty of research that says otherwise even among highly-trained individuals.

tmountain 2021-08-16 13:22:10 +0000 UTC [ - ]

> In one of the cases, a doctor was watching a movie on a phone when his vehicle rammed into a state trooper in North Carolina.

Doesn't autopilot require you to put your hands on the wheel fairly regularly? Are these incidents just a matter of people using this feature outside of its intended use case?

2021-08-16 13:28:35 +0000 UTC [ - ]

Ajedi32 2021-08-16 13:36:18 +0000 UTC [ - ]

One hand on his phone one hand on the wheel, I assume.

Newer versions of Autopilot watch to make sure you keep your eyes on the road, probably to prevent this very scenario[1].

[1]: https://www.theverge.com/2021/5/27/22457430/tesla-in-car-cam...

LightG 2021-08-18 10:45:06 +0000 UTC [ - ]

Maybe the AI has become self-aware. It's now picking off its enemies, slowly, strategically, one by one.

tacobelllover99 2021-08-16 13:27:02 +0000 UTC [ - ]

Man, Tesla are the worst. Causing more crashes and more likely to catch on fire!

Oh wait, NM, that's traditional ICE cars.

FUD is dangerous

nikkinana 2021-08-16 13:42:35 +0000 UTC [ - ]

Shake 'em down, shut that shit down. Non union fuckers.

t0rt01se 2021-08-16 13:18:37 +0000 UTC [ - ]

About time some adults got involved.

thoughtstheseus 2021-08-16 14:08:32 +0000 UTC [ - ]

Ban human driving on the interstate, highways, etc. Boom, self driving now works at scale.

2021-08-16 12:57:18 +0000 UTC [ - ]

gamblor956 2021-08-16 17:01:47 +0000 UTC [ - ]

Craziest statistic: of the 31 serious crashes involving driver-assist systems in the U.S. since June 2016, 25 involved Tesla Autopilot.

blueplanet200 2021-08-16 17:16:00 +0000 UTC [ - ]

gamblor956 2021-08-16 17:43:01 +0000 UTC [ - ]

It does not matter how you dice up the statistics. Of the millions of cars with driver assist, Teslas make up the majority of serious accidents, and the majority of accidents involving vehicles crashing into emergency vehicles on the road. On an absolute basis, relative basis, per capita basis, etc., Tesla Autopilot has more serious crashes than all other driver assist systems. (And note that this does not include any FSD-related accidents.)

This is an issue because Tesla markets its cars as being "safer" than other companies' vehicles, and the data shows that their driver assist system is objectively not.

mshumi 2021-08-16 12:59:46 +0000 UTC [ - ]

Judging by the lack of a market reaction this morning, this is mostly immaterial.

phpnode 2021-08-16 13:01:22 +0000 UTC [ - ]

Stocks like Tesla have long been divorced from business realities so I wouldn't put too much stake in that

smallhands 2021-08-16 13:18:10 +0000 UTC [ - ]

time to short Tesla stocks!

phpnode 2021-08-16 13:20:08 +0000 UTC [ - ]

Markets and irrationality and solvency quote goes here

rvz 2021-08-16 13:48:42 +0000 UTC [ - ]

Well, I did call an NKLA short 17 days ago, and I ended up laughing all the way to the bank. [0]

[0] https://news.ycombinator.com/item?id=27996773

phpnode 2021-08-16 13:52:29 +0000 UTC [ - ]

Fortune favours the brave, occasionally!

bathtub365 2021-08-16 13:24:38 +0000 UTC [ - ]

The NASDAQ wasn’t open at the time of your comment, so how can you even make that determination?

TSLA is down almost 2% in pre-market trading at the time of this comment, though.

catillac 2021-08-16 14:12:17 +0000 UTC [ - ]

I don’t know much about trading, but it appears to be down nearly 5% this morning as of right now. Regardless, I think you’re conflating trading price with whether something is material in general.

antattack 2021-08-16 13:21:59 +0000 UTC [ - ]

NHTSA is lumping TACC with Autopilot to increase the number of incidents and make the case sound more serious:

"The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes," NHTSA said in a document opening the investigation.

TACC is very different from Autopilot.

jdavis703 2021-08-16 13:25:00 +0000 UTC [ - ]

NHTSA reports are usually very neutral. I’d be very surprised if they were out to get Tesla, or really any other corporation or individual.

antattack 2021-08-16 13:43:07 +0000 UTC [ - ]

Every car with adaptive cruise control has a similar disclaimer, pointing out how unreliable the system is at recognizing parked vehicles [1]:

Safety Consideration When Using Adaptive Cruise Control

• The system can only brake so much. Your complete attention is always required while driving.

• Adaptive Cruise Control does not steer your vehicle. You must always be in control of vehicle steering.

• The system may not react to parked, stopped or slow-moving vehicles. You should always be ready to take action and apply the brakes.

[1]https://my.gmc.com/how-to-support/driving-performance/drivin...

kelvin0 2021-08-16 13:37:37 +0000 UTC [ - ]

Having human drivers and assisted drivers on the same road is problematic currently.

I think the best situation would be to have 'automated' stretches of highway specially designed to 'help' self driving systems.

Only self driving vehicles would be allowed on such special highways, and everything would be built around such systems.

SCNP 2021-08-16 14:06:29 +0000 UTC [ - ]

Let me preface by saying that I hold no strong opinions on this matter and my comments are purely speculative.

This is kind of a position I've held for a long time, but a different aspect of the problem. I think a system similar to IFF in aircraft would solve all of these issues. If every car knew where every other car was at all times, you could easily devise a system that would be nearly flawless. The issue is, there is no incremental path to this solution. You would essentially have to start over with the existing transportation network.
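
Purely as a sketch of the idea (all names and thresholds here are invented), the core of such a system could be as simple as every car broadcasting its position and velocity while every receiver computes a time-to-closest-approach against each broadcast it hears:

    # Crude sketch of the "IFF for cars" idea: each vehicle broadcasts its
    # position/velocity in a shared frame, and each receiver estimates how
    # soon it will be closest to every other broadcaster.
    from dataclasses import dataclass

    @dataclass
    class StateBroadcast:
        vehicle_id: str
        x: float    # meters, shared map frame
        y: float
        vx: float   # m/s
        vy: float

    def time_to_closest_approach(a: StateBroadcast, b: StateBroadcast) -> float:
        rx, ry = b.x - a.x, b.y - a.y
        vx, vy = b.vx - a.vx, b.vy - a.vy
        speed_sq = vx * vx + vy * vy
        if speed_sq == 0:
            return float("inf")        # no relative motion
        t = -(rx * vx + ry * vy) / speed_sq
        return max(t, 0.0)

    ego = StateBroadcast("ego", 0, 0, 30, 0)
    other = StateBroadcast("other", 120, 0, 0, 0)    # stopped vehicle 120 m ahead
    print(time_to_closest_approach(ego, other))       # 4.0 seconds to react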

mattnewton 2021-08-16 14:14:10 +0000 UTC [ - ]

The problem is that you don’t just need to know about every other vehicle, you still need all the perceptual stuff for pedestrians, bikers, baby carriages, trash, road closures, traffic cops in the middle of the road, etc. All those things are arguably harder to detect reliably than a somewhat standard sized box of metal with pairs of lights in the front and back. I think shooting for superhuman perception of all these things is still where Tesla is failing.

SCNP 2021-08-16 14:50:09 +0000 UTC [ - ]

True. I guess I was thinking that if you build totally new infrastructure for these new overhauled cars, you'd keep it completely separate from other modes of transportation. My sci-fi inclinations had me imagining tubes like Logan's Run.

2021-08-16 19:50:23 +0000 UTC [ - ]

ghaff 2021-08-16 14:01:42 +0000 UTC [ - ]

Who is going to pay for these dedicated stretches of highway that only, presumably, relatively wealthy owners of self-driving cars are going to be allowed to use?

kelvin0 2021-08-16 20:14:38 +0000 UTC [ - ]

Any entity (individual or corporate) could use it, of course. Rich or not: electric vehicles such as buses could be a public form of transportation on these specially adapted roads.

rvz 2021-08-16 13:16:29 +0000 UTC [ - ]

Did I not just say this before? [0] [1] [2] It seems like the safety probe formally agrees with my (and many others') concerns over the deceptive advertising of the FSD package and its safety risks towards other drivers.

Perhaps this is for the best.

[0] https://news.ycombinator.com/item?id=27996321

[1] https://news.ycombinator.com/item?id=27863941

[2] https://news.ycombinator.com/item?id=28053883

nickik 2021-08-16 22:00:50 +0000 UTC [ - ]

That is literally not what this is about.

supperburg 2021-08-16 19:26:09 +0000 UTC [ - ]

The Tesla FSD beta is what we all dreamed of in the 90s. It's mind-blowing. It's so crazy that it's finally arrived, though not able to fully and reliably drive itself in every circumstance, and nobody seems to care. People only seem to be foaming at the mouth about the way the product is labeled. If HN found the holy grail but it was labeled "un-holy grail", then HN would apparently chuck it over their shoulder.

2021-08-16 19:52:02 +0000 UTC [ - ]

zugi 2021-08-16 19:59:58 +0000 UTC [ - ]

Teslas are the safest vehicles on the road, according to the National Highway Traffic Safety Administration. (https://drivemag.com/news/how-safe-are-tesla-cars-5-facts-an...).

Teslas crash 40% less than other cars, and 1/3 the number of people are killed in Teslas versus other cars.

Indeed once a common failure mode like this is identified it needs to be investigated and fixed. Something similar happened a few years ago when someone driving a Tesla while watching a movie (not paying attention) died when they crashed into a light-colored tractor trailer directly crossing the road. So an investigation makes sense. But much of the general criticism of self-driving and autopilot here seems misplaced. Teslas and other self-driving vehicle technologies are saving lives. They will continue to save lives compared to human drivers, as long as we let them.

derbOac 2021-08-16 20:23:42 +0000 UTC [ - ]

I really wrestle with this line of reasoning. Tesla keeps pointing this out, and it's appealing to me, but at the same time something about it seems off to me. I can't tell if this is erroneous intuition on my part blinding me to a more rational assessment, or if that intuition is onto something important.

Some top-of-my-head thoughts:

1. I think to make a fair comparison of Teslas versus other cars, you'd have to really ask "how much safer are Tesla owners in Teslas compared to other cars randomly assigned to them?" That is, comparing the accident rates of Teslas with other cars is misleading because Tesla owners are not a random slice of the population. I almost guarantee that if you looked at, e.g., their accident rates prior to owning a Tesla, those rates would be lower than the general population's.

2. In these autopilot situations, bringing up general accident rates seems like sort of a red herring to me. The actual causally relevant issue is "what would happen in this scenario if someone were driving without an autopilot?" So, for example, in the case of the driver who was killed when the autopilot drove them into a semi, the relevant question is "what would have happened if that driver, or someone interchangeable with them, had been driving without autopilot? Would they have driven themselves into a semi?"

3. Various experts have argued general vehicle accident rates aren't comparable to Teslas because average cars are much, much older. As such, you should be comparing accident rates of cars of the same age, if nothing else. So, aside from the driver effect pointed out earlier, you have the question of "what would the accident rate look like in a Tesla or a car identical to it without autopilot?"

4. At some point with autopilot -- whether it be Tesla or other companies -- you have to start treating it comparably to a single individual. So, for example, what are the odds of Person A27K38, driving the same number of miles as Tesla, having a certain pattern of accidents? If you found a specific person had driven into first responders on the side of the road 11 times, wouldn't that be suggestive of a pattern? Or would it? It's not enough to ask "how often do non-autopilot drivers drive into first responders on the side of the road"; it seems to me important to ask "how often would a single driver drive into first responders on the side of the road, given a certain number of miles driven in that same period?" At some point, autopilot becomes a driver, in the sense that it has a unique identity regardless of how many copies of it there are. Maybe that's not right, but it seems like it is.
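
One way to pose point 4 concretely (a minimal sketch with entirely made-up rates, just to show the shape of the question) is as a Poisson comparison: given Autopilot's cumulative mileage and an average driver's rate of striking emergency vehicles, how surprising would 11 such crashes be?

    # Minimal sketch of the question in point 4: treating Autopilot as one
    # "driver", how surprising are 11 emergency-vehicle strikes given its
    # total mileage? Both inputs below are invented purely for illustration.
    import math

    def poisson_tail(expected, observed):
        # P(X >= observed) for X ~ Poisson(expected)
        return 1 - sum(math.exp(-expected) * expected ** k / math.factorial(k)
                       for k in range(observed))

    baseline_rate_per_mile = 1e-9    # assumed rate for an average driver (made up)
    autopilot_miles = 3e9            # assumed cumulative Autopilot miles (made up)

    expected_strikes = baseline_rate_per_mile * autopilot_miles   # 3 expected
    print(poisson_tail(expected_strikes, 11))  # ~0.0003: 11 would be a strong signal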