U.S. opens probe into Tesla’s Autopilot over emergency vehicle crashes
gundmc 2021-08-16 12:57:50 +0000 UTC [ - ]
Robotbeat 2021-08-16 13:21:18 +0000 UTC [ - ]
The fact that a huge formal investigation happened over just a single casualty is proof that it may actually be superior for safety in the long term (when combined with feedback from regulators and government investigators). One death in conventional vehicles is irrelevant. But because of the high profile of Tesla’s technology, it garners a lot of attention from the public and therefore regulators. This is PRECISELY the dynamic that led to the ridiculously safe airline record. The safer it is, the more that rare deaths will be investigated and the causes sussed out and fixed by industry and regulators together.
Perhaps industry/Tesla/whoever hates the regulators and investigations. But I think they are precisely what will cause self driving to become ever safer, and eventually become as safe as industry/Tesla claims, safer than human drivers while also being cheap and ubiquitous. Just like airline travel today. A remarkable combination of safety and affordability.
This might be the only way to ever do it. I don’t think the airline industry could’ve ever gotten to current levels of safety by testing everything on closed airfields and over empty land for hundreds of millions of flight hours before they had sufficient statistics to match today’s record.
It can’t happen without regulators and enforcement, either.
sandworm101 2021-08-16 13:37:11 +0000 UTC [ - ]
Of course this is impossible in the real world. Nobody is going to buy a car that will randomly make its own decisions, that will pull the wheel from your hands every time it thinks you are making an illegal lane change. Want safety? How about a Tesla that is electronically incapable of speeding. Good luck selling that one.
alistairSH 2021-08-16 13:54:52 +0000 UTC [ - ]
That's almost exactly what my Honda does. Illegal (no signal) lane change results in a steering wheel shaker (and optional audio alert). And the car, when sensing an abrupt swerve which is interpreted as the vehicle leaving the roadway, attempts to correct that via steering and brake inputs.
But, I agree with your more general point - the human still needs to be primary. My Honda doesn't allow me to remove my hands from the steering wheel for more than a second or two. Tesla should be doing the same, as no current "autopilot" system is truly automatic.
ohazi 2021-08-16 19:39:00 +0000 UTC [ - ]
By the way, this is fucking terrifying when you first encounter it in a rental car on a dark road with poor lane markings while just trying to get to your hotel after a five hour flight.
I didn't encounter an obvious wheel shaker, but this psychotic car was just yanking the wheel in different directions as I was trying to merge onto a highway.
Must be what a malfunctioning MCAS felt like in a 737 MAX, but thankfully without the hundreds of pounds of hydraulic force.
peeters 2021-08-16 14:11:08 +0000 UTC [ - ]
To be clear, tying the warning to the signal isn't about preventing unsignaled lane changes, it's gauging driver intent (i.e. is he asleep and drifting or just trying to change lanes). It's just gravy that it will train bad drivers to use their signals properly.
sandworm101 2021-08-16 14:46:09 +0000 UTC [ - ]
peeters 2021-08-16 15:28:37 +0000 UTC [ - ]
> 142 (1) The driver or operator of a vehicle upon a highway before turning (...) from one lane for traffic to another lane for traffic (...) shall first see that the movement can be made in safety, and if the operation of any other vehicle may be affected by the movement shall give a signal plainly visible to the driver or operator of the other vehicle of the intention to make the movement. R.S.O. 1990, c. H.8, s. 142 (1).
That said there's zero cost to doing so regardless of whether other drivers are affected.
sandworm101 2021-08-16 20:51:53 +0000 UTC [ - ]
peeters 2021-08-17 02:13:00 +0000 UTC [ - ]
dragonwriter 2021-08-16 15:29:28 +0000 UTC [ - ]
Usually, it makes you liable because it is illegal. CA law, for instance, requires signalling 100ft before a lane change or turn.
hermitdev 2021-08-16 16:35:41 +0000 UTC [ - ]
alistairSH 2021-08-16 14:55:39 +0000 UTC [ - ]
alistairSH 2021-08-16 14:24:42 +0000 UTC [ - ]
nthj 2021-08-16 21:36:23 +0000 UTC [ - ]
Robotbeat 2021-08-16 14:09:51 +0000 UTC [ - ]
alistairSH 2021-08-16 14:22:47 +0000 UTC [ - ]
jeofken 2021-08-16 19:07:18 +0000 UTC [ - ]
My insurance company gives a lower rate if you buy the full autopilot option, and that to me indicates they agree it drives better than I, or other humans, do.
Johnny555 2021-08-16 19:18:55 +0000 UTC [ - ]
If following the speed limit makes cars safer, another way to achieve that without autopilot is to just have all cars limit their speed to the speed limit.
Sometimes it’s wonky when the street lines are unclear. It’s not perfect but a better driver than I am in 80% of cases
The problem is in those 20% of cases where you're lulled into boredom by autopilot as you concentrate on designing your next project in your head, then suddenly autopilot says "I lost track of where the road is, here you do it!" and you have to quickly regain context and figure out what the right thing to do is.
Some autopilot systems use eye tracking to make sure that the driver is at least looking at the road, but that doesn't guarantee that he's paying attention. But at least that's harder to defeat than Tesla's "nudge the steering wheel once in a while" method.
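A rough sketch of the difference (thresholds and function names are made up here, not any manufacturer's actual logic): the torque check only needs a periodic poke, while the gaze check needs a continuously refreshed signal.

    GAZE_TIMEOUT_S = 3.0    # assumed: max time eyes may be off the road
    NUDGE_TIMEOUT_S = 30.0  # assumed: how often a wheel nudge is demanded

    def torque_monitor_ok(seconds_since_torque_on_wheel: float) -> bool:
        # Defeatable: a hanging weight applies constant torque,
        # so this timer effectively never runs out.
        return seconds_since_torque_on_wheel < NUDGE_TIMEOUT_S

    def gaze_monitor_ok(seconds_eyes_off_road: float) -> bool:
        # Harder to defeat: the camera must keep seeing eyes on the road,
        # though it still can't prove the driver is mentally engaged.
        return seconds_eyes_off_road < GAZE_TIMEOUT_S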
beambot 2021-08-16 19:46:48 +0000 UTC [ - ]
The devil is in the details... GPS may not provide sufficient resolution. Construction zones. School zones with variable hours. Tunnels. Adverse road conditions. Changes to the underlying roads. Different classes of vehicles. Etc.
By the time you account for all the mapping and/or perception, you could've just improved the autonomous driving and eliminated the biggest source of driving errors: the human.
freeone3000 2021-08-16 20:42:15 +0000 UTC [ - ]
Johnny555 2021-08-16 19:57:53 +0000 UTC [ - ]
The parent post said that autopilot won't let him go over the speed limit and implies that makes him safer. My point is that you don't need full autopilot for that.
So this is not a technical problem at all, but a political one. As the past year has shown, people won't put up with any inconvenience or restriction, even if it could save lives (not even if it could save thousands of lives).
asdff 2021-08-16 22:42:23 +0000 UTC [ - ]
What would be even easier than all of that, though, is just installing speeding cameras and mailing tickets.
watt 2021-08-16 20:56:28 +0000 UTC [ - ]
jeofken 2021-08-16 22:37:21 +0000 UTC [ - ]
I have not yet struggled to stay alert when it drives me, and it has driven better than I would have - so it certainly is an improvement over me driving 100% of the time. It does not have road rage and it does not enjoy the feeling of speeding, like I do when I drive, nor does it feel like driving is a competition, like I must admit I do when I am hungry, stressed, or tired.
> just have all cars limit their speed to the speed limit
No way I’d buy a car that does not accelerate when I hit the pedal. Would you buy a machine that is not your servant?
phyzome 2021-08-17 00:19:19 +0000 UTC [ - ]
I mean... that's an odd thing for someone to say who has bought a vehicle with over-the-air firmware updates.
jeofken 2021-08-17 11:01:53 +0000 UTC [ - ]
I wish I could hack my car, but also wouldn’t trust others if they did.
tablespoon 2021-08-16 19:43:19 +0000 UTC [ - ]
Yeah, and on top of that, the unreliability of Tesla's system means it cannot be allowed to pull the wheel from the driver, because it's not unusual for it to want to do something dangerous and need to be stopped. You don't want it to "fix" a mistake by driving someone into the median divider.
wizzwizz4 2021-08-16 13:47:43 +0000 UTC [ - ]
Because when the human disagrees with the machine, the machine is usually the one making a mistake. It might prevent accidents, but it would also cause them, and you lose predictability in the process (you have to model the human and the machine).
foobiekr 2021-08-16 20:54:52 +0000 UTC [ - ]
CrazyStat 2021-08-16 14:01:49 +0000 UTC [ - ]
That would be unsafe in many situations. If the flow of traffic is substantially above the speed limit--which it often is--being unable to match it increases the risk of accident. This is known as the Solomon curve [1].
treesknees 2021-08-16 14:35:44 +0000 UTC [ - ]
With the logic presented in the theoretical foundation section, it seems that the safer move would actually be to slow down and match the speed of all the trucks and other large vehicles... which won't happen.
Matching speed sounds great, except there are always people willing to go faster and faster. In my state they raised the speed limit from 70 to 75, it just means more people are going 85-90. How is that safer?
filoleg 2021-08-16 15:26:57 +0000 UTC [ - ]
However, you individually going 70-75 when everyone else is going 85-90 is less safe than you going 85-90 like everyone else in the exact same situation.
>there are always people willing to go faster and faster
That’s why no one says “go as fast as the fastest vehicle you see”, it is “go with the general speed of traffic”. That’s an exercise for human judgement to figure that one out, which is why imo it isn’t a smart idea to have the car automatically lock you out of overriding the speed limit.
FireBeyond 2021-08-17 03:57:15 +0000 UTC [ - ]
And yet the roads are full of vehicles literally incapable of going 85. Many trucks cannot do more than 69mph.
kiba 2021-08-16 18:54:10 +0000 UTC [ - ]
CamperBob2 2021-08-16 19:31:28 +0000 UTC [ - ]
MichaelGroves 2021-08-16 14:50:49 +0000 UTC [ - ]
(Trivial if the self driving tech works at all....)
TheCapn 2021-08-16 20:59:19 +0000 UTC [ - ]
There's just too much shit that can't be "seen" with a camera/sensor in congested traffic. Having a swarm of vehicles all gathering/sharing data is one of the only true ways forward IMO.
sandworm101 2021-08-16 14:32:44 +0000 UTC [ - ]
rad_gruchalski 2021-08-16 16:29:45 +0000 UTC [ - ]
cactus2093 2021-08-16 19:54:26 +0000 UTC [ - ]
Instead they did the exact opposite with the plaid mode model S, lol. It kind of works against their claims that they prioritize safety when their hottest new car - fully intended for public roads - has as its main selling point the ability to accelerate from 60-120 mph faster than any other car.
Someone 2021-08-16 13:57:10 +0000 UTC [ - ]
As a simple example, ABS (https://en.wikipedia.org/wiki/Anti-lock_braking_system) only interferes with what the driver does when an error occurs.
More related to self-driving, there are various variants of https://en.wikipedia.org/wiki/Lane_departure_warning_system that do take control of the car.
And while it is far from “incapable of speeding”, BMW, Audi and Mercedes-Benz “voluntarily”, and sort of, limit the speed of their cars to 250 km/h (https://www.autoevolution.com/news/gentlemens-agreement-not-...)
FridayoLeary 2021-08-16 13:53:51 +0000 UTC [ - ]
Nothing is more annoying than a car that thinks I don't know how to drive (warning beeps etc.).
gambiting 2021-08-16 19:12:50 +0000 UTC [ - ]
>>How about a Tesla that is electronically incapable of speeding. Good luck selling that one.
From 2022 all cars sold in the EU have to have an electronic limiter that keeps you to the posted speed limit (by cutting power if you are already going faster) - the regulation does allow the system to be temporarily disabled, however.
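Roughly the behaviour being described, as a sketch (assumed logic, not the actual EU regulation text):

    def throttle_output(requested_throttle: float,
                        speed_kph: float,
                        posted_limit_kph: float,
                        override_active: bool) -> float:
        """Return the throttle actually applied, in [0, 1]."""
        if override_active:
            return requested_throttle  # driver has temporarily disabled the limiter
        if speed_kph > posted_limit_kph:
            return 0.0                 # cut power; the car coasts back toward the limit
        return requested_throttle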
ggreer 2021-08-16 21:28:13 +0000 UTC [ - ]
I think it's a silly law and I'm very glad I don't live in a place that requires such annoyances, but it's not as bad as you're claiming.
1. https://etsc.eu/briefing-intelligent-speed-assistance-isa/
Symbiote 2021-08-16 21:57:51 +0000 UTC [ - ]
It could reliably recognize the speed limit signs (red circle), but it never recognized the similar grey-slash end-of-limit signs. It also didn't recognize the start-of-town or end-of-town signs, so it didn't do anything about the limits they implied.
I would certainly have had to disable it, had it been reducing the acceleration in the way that document describes.
9935c101ab17a66 2021-08-18 04:49:09 +0000 UTC [ - ]
People literally are buying cars that “make their own decisions”. Importantly though, these systems only activate in the case of an imminent collision IF the corrective measure won’t cause another collision.
> that will pull the wheel …
Yah, of course no one is going to buy a car that does what you describe, because what you describe is insane and inherently unsafe. Unless a collision is imminent, nothing happens.
velcii 2021-08-16 13:45:47 +0000 UTC [ - ]
I really don't think that would give many data points, because all of the instances would be when a human fell asleep or wasn't paying attention.
dboreham 2021-08-16 13:44:51 +0000 UTC [ - ]
vntok 2021-08-16 13:30:26 +0000 UTC [ - ]
Conversely, when Waymo iterates and improves their own safety ratios by a significant amount, that evidently does not result in Tesla's improving in return.
unionpivo 2021-08-16 13:41:49 +0000 UTC [ - ]
Until it leads to something systemic that regulators then mandate for all vehicles.
vntok 2021-08-16 19:18:09 +0000 UTC [ - ]
Not the case at all between lidars and cameras.
treeman79 2021-08-16 13:28:01 +0000 UTC [ - ]
Asking someone to pay attention when they are not doing anything is unrealistic. I would be constantly bored / distracted. My wife would instantly fall asleep. Etc etc.
3pt14159 2021-08-16 13:25:20 +0000 UTC [ - ]
lastofthemojito 2021-08-16 13:31:40 +0000 UTC [ - ]
joshgrib 2021-08-16 13:38:44 +0000 UTC [ - ]
If anything a driver monitoring system seems even better than the interlock system, for example you couldn't have your kids/friends blow for you to bypass it.
unionpivo 2021-08-16 13:40:07 +0000 UTC [ - ]
Go with safest drivers first.
Seanambers 2021-08-16 13:40:08 +0000 UTC [ - ]
ra7 2021-08-16 13:53:34 +0000 UTC [ - ]
1. Autopilot only works on (or is intended to work on) highways. But they are comparing their highway record to all accident records, including city driving, where the accident rate is far higher than on highways.
2. They're also comparing with every vehicle in the United States including millions of older vehicles. Modern vehicles are built for higher safety and have a ton of active safety features (emergency braking, collision prevention etc). Older vehicles are much more prone to accidents and that skews the numbers.
The reality is Teslas are no safer than any other vehicles in their class ($40k+). Their safety report is purely marketing spin.
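A toy illustration of that bias with made-up numbers (not real crash data): a system that is exactly as good as humans on the highway still looks several times "safer" when its highway-only miles are compared against a fleet average that includes city driving.

    CRASHES_PER_MILE_HIGHWAY = 1 / 3_000_000   # assumed highway baseline
    CRASHES_PER_MILE_CITY    = 1 / 300_000     # assumed city baseline
    CITY_SHARE_OF_MILES      = 0.5             # assumed mix in the fleet-wide average

    fleet_average = (CITY_SHARE_OF_MILES * CRASHES_PER_MILE_CITY
                     + (1 - CITY_SHARE_OF_MILES) * CRASHES_PER_MILE_HIGHWAY)
    autopilot_rate = CRASHES_PER_MILE_HIGHWAY  # no better than humans on the highway

    print(fleet_average / autopilot_rate)      # ~5.5x "safer", purely from the road mix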
deegles 2021-08-16 14:49:52 +0000 UTC [ - ]
wilg 2021-08-16 20:03:56 +0000 UTC [ - ]
Would another way of saying this be that they are as safe as other vehicles in that class? And that therefore Autopilot is not more unsafe than driving those other cars?
pandaman 2021-08-16 23:49:58 +0000 UTC [ - ]
ra7 2021-08-16 20:23:37 +0000 UTC [ - ]
We should really compare Autopilot with its competitors like GM’s Super Cruise or Ford’s Blue Cruise, both of which offer more capabilities than Autopilot. That will show if Tesla’s driver assist system is more or less safe than their competitors product.
ggreer 2021-08-16 22:07:48 +0000 UTC [ - ]
The reason GM's Super Cruise got a higher rating by Consumer Reports was because CR didn't even test the capabilities that only Tesla had (such as automatic lane change and taking offramps/onramps). Also, the majority of the evaluation criteria weren't about capabilities. eg: "unresponsive driver", "clear when safe to use", and "keeping the driver engaged".[1]
1. https://www.consumerreports.org/car-safety/cadillac-super-cr...
ben_w 2021-08-16 13:47:15 +0000 UTC [ - ]
I think the comparison should be Tesla with/without AI, not Tesla/not-Tesla; so roughly either x2 or x4 depending on what the other active safety features do.
It’s not nothing, but it’s much less than the current sales pitch — and the current sales pitch is itself the problem here, for many legislators.
Ajedi32 2021-08-16 14:03:22 +0000 UTC [ - ]
This still isn't the correct comparison. Major selection bias with comparing miles with autopilot engaged to miles without it engaged, since autopilot cannot be engaged in all situations.
A better test would be to compare accidents in Tesla vehicles with the autopilot feature enabled (engaged or not) to accidents in Tesla vehicles with the autopilot feature disabled.
HPsquared 2021-08-16 20:05:00 +0000 UTC [ - ]
bluGill 2021-08-16 14:16:14 +0000 UTC [ - ]
Robotbeat 2021-08-16 14:24:40 +0000 UTC [ - ]
https://www.iihs.org/topics/fatality-statistics/detail/urban...
freshpots 2021-08-16 21:08:46 +0000 UTC [ - ]
Yes it can. The only time it can't be activated is if there is no clearly marked center line.
throwaway0a5e 2021-08-16 13:45:02 +0000 UTC [ - ]
Ajedi32 2021-08-16 13:47:59 +0000 UTC [ - ]
The real test of this would be: of all Tesla vehicles, are the ones with autopilot enabled statistically safer or less safe than the ones without autopilot enabled?
FireBeyond 2021-08-17 03:59:44 +0000 UTC [ - ]
AP has the luxury of being able to be turned off in less than ideal conditions. Human drivers can't do that.
It's the reason why only Tesla touts these numbers. They're inaccurate and misleading.
input_sh 2021-08-16 13:48:00 +0000 UTC [ - ]
czzr 2021-08-16 13:49:51 +0000 UTC [ - ]
Makes me laugh, especially with the “geeks are immune to marketing” trope that floats around here equally as regularly.
Waterluvian 2021-08-16 13:06:24 +0000 UTC [ - ]
I need to be touching the wheel and applying some force to it or it begins yelling at me and eventually brings me slowly to a stop.
I’ve had it for a year now and I cannot conceive of a way, without physically altering the system (like hanging a weight from the wheel maybe?), that would allow me to stop being an active participant.
I think the opposite is true: Tesla’s move fast and kill people approach is the mistake. Incremental mastering of autonomous capabilities is the way to go.
jeffnappi 2021-08-16 13:16:34 +0000 UTC [ - ]
Personally Autopilot has actually made driving safer for me... I think there's likely abuse of the system though that Tesla could work harder to prevent.
DrBenCarson 2021-08-16 15:57:50 +0000 UTC [ - ]
They are sending multiple signals that this car can drive itself (going so far as charging people money explicitly for the "self-driving" feature) when it cannot in the slightest do much more than stay straight on an empty highway.
They should be forced to change the name of the self-driving features, I personally think "Backseat Driver" would be more appropriate.
jhgb 2021-08-16 19:52:15 +0000 UTC [ - ]
It is literally an autopilot. Just like an autopilot on an airplane, it keeps you stable and in a certain flight corridor. There's virtually no difference except for Tesla's Autopilot's need to deal with curved trajectories.
labcomputer 2021-08-16 20:37:56 +0000 UTC [ - ]
Well, and it actively avoids collisions with other vehicles (most of the time). Airplane (and boat) autopilots don't do that.
"But you're using the word autopilot wrong!"
jhgb 2021-08-17 13:13:10 +0000 UTC [ - ]
Kaytaro 2021-08-16 19:05:36 +0000 UTC [ - ]
tgsovlerkhgsel 2021-08-16 13:08:26 +0000 UTC [ - ]
> physically altering the system (like hanging a weight from the wheel maybe?)
was exactly what people were doing. But it's also possible to be physically present, applying force, but being "zoned out", even without malicious intent.
johnnyApplePRNG 2021-08-16 13:19:14 +0000 UTC [ - ]
I've occasionally noticed myself zoning out behind the wheel of my non-self-driving car as well.
It's actually very common. [0]
[0] https://www.actuarialpost.co.uk/article/quarter-of-fatal-cra...
FireBeyond 2021-08-17 04:01:57 +0000 UTC [ - ]
shakna 2021-08-16 13:12:26 +0000 UTC [ - ]
> I’ve had it for a year now and I cannot perceive of a way, without physically altering the system (like hanging a weight from the wheel maybe?) that would allow me to stop being an active participant.
That's exactly what people were doing with the Tesla. Hanging a weight to ensure the safety system doesn't kick in. [0][1]
[0] https://edition.cnn.com/2021/04/28/cars/tesla-texas-crash-au...
[1] https://twitter.com/ItsKimJava/status/1388240600491859968/ph...
Waterluvian 2021-08-16 13:18:36 +0000 UTC [ - ]
I guess the probe will reveal what share of fatal accidents are caused by this.
rcxdude 2021-08-16 13:48:33 +0000 UTC [ - ]
theluketaylor 2021-08-17 00:37:59 +0000 UTC [ - ]
Teslas can famously be tricked by wedging an orange between the rim and spoke of the steering wheel to produce enough torque on the wheel to satisfy the detection. There are enough videos of it on YouTube that Tesla could easily be found negligent for not doing enough to prevent drivers from defeating a safety system, given that alternate technology that more directly tracks attention is available and tricking Tesla's detection method became common knowledge.
SEJeff 2021-08-16 13:40:55 +0000 UTC [ - ]
People causing these problems almost certainly are putting something over the cabin camera and a defeat device on the steering wheel.
ChrisClark 2021-08-16 13:15:58 +0000 UTC [ - ]
Waterluvian 2021-08-16 13:17:29 +0000 UTC [ - ]
slg 2021-08-16 19:23:00 +0000 UTC [ - ]
It therefore isn't a clean swap of a human paying attention to a human who isn't. It becomes a complicated equation that we can't just dismiss with "people won't pay attention". It is possible that a 90%/10% split of drivers paying attention to not paying attention is more dangerous when they are all driving manually than a 70%/30% split if those drivers are all using self-driving tech to cover for them. Wouldn't you feel safer if the driver behind you who is answering texts was using this incremental self-driving tech rather than driving manually?
No one has enough data on the performance of these systems or how the population of drivers use them to say definitively that they are either safer or more dangerous on the whole. But it is definitely something that needs to be investigated and researched.
helsinkiandrew 2021-08-16 13:28:19 +0000 UTC [ - ]
That is the fatal flaw in anything but a perfect system - any kind of system that takes the steering decisions away from the driver is going to result in the driver at best thinking about other things, and at worst climbing into the back seat. If you had to develop a system to make sure someone was paying attention, you wouldn't make them sit in a warm comfy seat looking at a screen - you would make them actively engage with what they were looking at - like steering.
And ultimately it doesn't matter how many hundreds of thousands of hours of driving you teach your system with: it may eventually be able to learn about parked cars, kerbs and road signs, but there won't be enough examples of different accidents and of how emergency vehicles behave to ever make it behave safely. Humans can cope with driving emergencies fairly well (not perfectly, admittedly) no matter how many they've been involved in, using logic and higher-level reasoning.
Baeocystin 2021-08-16 19:35:52 +0000 UTC [ - ]
We've known for a very long time that this sort of automation/manual control handoff failure is a very big deal, and yet there seems to be an almost willful blindness from the manufacturers to address it in a meaningful way.
fzzzy 2021-08-16 13:27:24 +0000 UTC [ - ]
ghaff 2021-08-16 13:48:28 +0000 UTC [ - ]
hcurtiss 2021-08-16 14:35:42 +0000 UTC [ - ]
ghaff 2021-08-16 14:44:52 +0000 UTC [ - ]
minhazm 2021-08-16 19:32:27 +0000 UTC [ - ]
int_19h 2021-08-16 21:10:30 +0000 UTC [ - ]
ghaff 2021-08-16 20:47:45 +0000 UTC [ - ]
cmpb 2021-08-16 13:04:39 +0000 UTC [ - ]
tapoxi 2021-08-16 13:09:02 +0000 UTC [ - ]
My car has adaptive cruise control and lane keep assist, but I'm not relying on either for anything more complex than sipping a drink while on the highway.
rmckayfleming 2021-08-16 13:22:13 +0000 UTC [ - ]
Robotbeat 2021-08-16 13:25:04 +0000 UTC [ - ]
Karunamon 2021-08-16 18:59:10 +0000 UTC [ - ]
darkerside 2021-08-17 12:03:38 +0000 UTC [ - ]
MR4D 2021-08-16 19:48:14 +0000 UTC [ - ]
Conversely, if it didn't warn me right before an accident, then the absence of that warning would be useful too.
All of that information should be put back into the model based on crash reporting. Everything else can be ignored.
I would argue that the information should be available to all automakers (perhaps using the NHTSA as a conduit), so that each of them have the same safety information, but can still develop their own models. The FAA actually does this already with the FAA Accident and Incident Data Systems [0] and it has worked pretty darn well.
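To illustrate, a hypothetical sketch of the kind of record that could be shared (field names are assumptions, not an actual NHTSA schema): every crash report notes whether the driver-assist system warned, stayed silent, or wasn't engaged at all.

    from dataclasses import dataclass
    from enum import Enum

    class AlertStatus(Enum):
        WARNED = "warned"            # system alerted before the crash
        NO_WARNING = "no_warning"    # system was engaged but silent
        NOT_ENGAGED = "not_engaged"  # driver-assist was off

    @dataclass
    class CrashReport:
        vehicle_id: str        # anonymized identifier
        timestamp_utc: str
        assist_engaged: bool
        alert_status: AlertStatus
        severity: str          # e.g. "property", "injury", "fatal"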
oceanghost 2021-08-16 20:29:35 +0000 UTC [ - ]
It also reads the speed limit signs and places a reminder in the display. I think it can brake if it detects something in front of it, but I'm not certain.
MR4D 2021-08-16 21:59:20 +0000 UTC [ - ]
My main point (perhaps buried more than it should have been) is that centralizing accident data, along with whether an alert went off (or not), and sharing it with all automobile manufacturers can help this process proceed better.
Right now the data is highly fragmented and there is not really a common objective metric by which to make decisions to improve models.
oceanghost 2021-08-17 02:26:23 +0000 UTC [ - ]
I agree that would be reasonable and desirable :-)
dalbasal 2021-08-16 14:02:32 +0000 UTC [ - ]
These things are on the road already. They have issues, but so do human only cars. Tweaks probably get made, like some special handling of emergency vehicle scenarios. But, it's not enough to stop it.
Meanwhile, it's not a permanent state. Self-driving technology is advancing, becoming more common on roads. Procedures, as well as infrastructure, are growing around the existence of self-driven cars. Human supervisor or not, the way these things use the road affects the design of roads. If your emergency speed sign isn't being heeded by self-driven cars, your emergency speed sign has a bug.
tgsovlerkhgsel 2021-08-16 13:07:15 +0000 UTC [ - ]
"The average driver" includes everyone, ranging from drivers using it as intended with close supervision, drivers who become inattentive because nothing is happening, and drivers who think it's a reasonable idea to climb into the back seat with a water bottle duct taped to the steering wheel to bypass the sensor.
OTOH, the average driver for the unassisted scenario also includes the driver who thinks they're able to drive a car while texting.
TacticalCoder 2021-08-16 13:15:00 +0000 UTC [ - ]
Shouldn't that be compared to "average driver + myriad of modern little safety features" instead of "average unassisted driver"? Someone who has the means to drive a Tesla with the "full driving" mode certainly has the means to buy, say, a Toyota full of assistance/safety features (lane change assist, unwanted lane change warning and whatnot).
politician 2021-08-16 13:37:49 +0000 UTC [ - ]
tgsovlerkhgsel 2021-08-16 14:42:58 +0000 UTC [ - ]
Making it a crime isn't an "obvious solution" to actually make it not happen. Drunk driving is a crime and yet people keep doing it. Same with texting and driving.
politician 2021-08-16 15:31:21 +0000 UTC [ - ]
Prevention as a goal is how we end up with dystopia.
rcxdude 2021-08-16 13:50:11 +0000 UTC [ - ]
politician 2021-08-16 14:04:45 +0000 UTC [ - ]
CaptArmchair 2021-08-16 13:28:53 +0000 UTC [ - ]
Doesn't this hide a paradox? Using a self-driving car as intended implies that the driver relinquishes a part of the human decision making process to the car. While close supervision implies that the driver can always take control back from the car, and therefore carries full personal responsibility of what happens.
The caveat here is that the car might make decisions in a rapidly changing, complex context which the driver might disagree with, but has no time to correct for through manual intervention. e.g. hitting a cyclist because the autonomous system made an erroneous assertion.
Here's another way of looking at this: if you're in a self-driving car, are you a passenger or a driver? Do you intend to drive the car yourself or let the car transport you to your destination?
In the unassisted scenario, it's clear that both intentions are one and the same. If you want to get to your location, you can't but drive the car yourself. Therefore you can't but assume full personal responsibility for your driving. Can the same be said about a vehicle that's specifically designed and marketed as "self-driving" and "autonomous"?
As a driver, you don't just relinquish part of the decision making process to the car, what essentially happens is that you put your trust in how the machine learning processes that steer the car were taught to perceive the world by their manufacturer. So, if both car and occupant disagree and the ensuing result is an accident, who's at fault? The car? The occupant? The manufacturer? Or the person seeking damages because their dog ended up wounded?
The issue here isn't that self-driving cars are inherently more dangerous than their "dumb" counterparts. It's that driving a self-driving car creates its own separate class of liabilities and questions regarding responsible driving when accidents do happen.
jdavis703 2021-08-16 13:20:31 +0000 UTC [ - ]
tgsovlerkhgsel 2021-08-16 13:25:24 +0000 UTC [ - ]
I'm not saying improvements should stop there, but once the system has reached parity, it's OK to deploy it and let it improve from there.
bcrl 2021-08-16 14:14:38 +0000 UTC [ - ]
An imperfect self-driving vehicle is the worst of all worlds: it lulls the driver into the perception that the vehicle is safe while not being able to handle abnormal situations. The fact that there are multiple crashes on the record where Teslas have driven into stationary trucks and obstacles on roads is pretty damning proof that drivers can't always react in the time required when an imperfect self-driving system is in use. They're not intrinsically safe.
At the very least, drivers should be required to undergo additional training to operate these systems. Like pilots, drivers need to be taught how to recognize when things go awry and react to possible failures. Anything less is not rooted in safety culture, and it's good to see there are at least a few people starting to shine a light on how these systems are being implemented from a safety perspective.
notahacker 2021-08-16 19:15:11 +0000 UTC [ - ]
Nothing absurd about thinking a system which has parity with the average human driver is too risky to buy unless you consider yourself to be below average at driving. (As it is, most people consider themselves to be better than average drivers, and some of them are even right!) The accident statistics that comprise the "average human accident rate" are also disproportionately caused by humans you'd try to discourage from driving in those circumstances...
Another very obvious problem is that an automated system which kills at the same rate per mile as the average human driver will tend to be driven a lot more, because it takes no effort (and it will probably replace better-than-average commercial drivers long before teenagers and occasional-but-disproportionately-deadly drivers can afford it).
Robotbeat 2021-08-16 13:23:54 +0000 UTC [ - ]
dheera 2021-08-16 19:08:33 +0000 UTC [ - ]
It's just that the companies that are NOT doing incremental approaches are largely at the mercy of some investors who don't know a thing about self-driving, and they may die at any time.
I agree with you that it is technically flawed, but it may still be viable in the end. At least their existence is not dependent on the mercy of some fools who don't get it, they just sell cars to stay alive.
That's one of the major problems of today's version of capitalism -- it encourages technically flawed ways to achieve scientific advancement.
yawaworht1978 2021-08-16 22:14:22 +0000 UTC [ - ]
api 2021-08-16 14:24:56 +0000 UTC [ - ]
Tesla auto-drive seems like it's about 80% of the way there.
backtoyoujim 2021-08-16 13:36:36 +0000 UTC [ - ]
It is not solely the trust and dependence, but also the group of idiots with access to wealth and no regard for human life.
bishoprook2 2021-08-16 13:36:46 +0000 UTC [ - ]
mnmmn123456 2021-08-16 13:03:07 +0000 UTC [ - ]
ben_w 2021-08-16 13:42:57 +0000 UTC [ - ]
[0] https://theoatmeal.com/blog/google_self_driving_car
salawat 2021-08-16 13:26:55 +0000 UTC [ - ]
I will be amused, intrigued, and possibly a bit horrified if, by the time FSD hits level 5 (assuming they stick with the Neural Net of Neural Nets architecture), there's a rash of system-induced variance in behavior as emergent phenomena take shape.
Imagined news: All Teslas on I-95 engaged in creating patterns whereby all non-Tesla traffic was bordered by a Tesla on each side. Almost like a game of Go, says expert. Researchers stumped.
Then again, that would imply you had an NN capable of retraining itself on the fly to some limited degree, which I assume no one sane would put into service... Hopefully this comment doesn't suffer the fate of not aging well.
antattack 2021-08-16 14:00:12 +0000 UTC [ - ]
mrfusion 2021-08-16 13:05:18 +0000 UTC [ - ]
jdavis703 2021-08-16 13:23:25 +0000 UTC [ - ]
merrywhether 2021-08-16 15:30:01 +0000 UTC [ - ]
jdavis703 2021-08-16 16:10:54 +0000 UTC [ - ]
userbinator 2021-08-16 13:42:56 +0000 UTC [ - ]
jdavis703 2021-08-16 17:00:11 +0000 UTC [ - ]
yawaworht1978 2021-08-16 22:10:00 +0000 UTC [ - ]
Car traffic and streets are more dense, and often have humans crossing them without regard to the law, plus bicycles, motorbikes, road construction and bad weather.
Not saying one auto pilot system is better than the other, however, they operate in different environments.
swiley 2021-08-16 13:34:02 +0000 UTC [ - ]
supperburg 2021-08-16 19:15:06 +0000 UTC [ - ]
nathias 2021-08-16 13:01:05 +0000 UTC [ - ]
weird-eye-issue 2021-08-16 13:02:22 +0000 UTC [ - ]
nathias 2021-08-16 14:03:22 +0000 UTC [ - ]
weird-eye-issue 2021-08-17 12:22:23 +0000 UTC [ - ]
formerly_proven 2021-08-16 13:10:10 +0000 UTC [ - ]
P.S. /s. Obviously, Mr. Poe.
rvz 2021-08-16 13:36:35 +0000 UTC [ - ]
This is why 'attention' and 'driver monitoring' was not included.
salawat 2021-08-16 13:41:29 +0000 UTC [ - ]
It's okay. I do it too. Really need to work on seeing yourself making that argument as a starting point and not an endpoint.
judge2020 2021-08-16 13:00:51 +0000 UTC [ - ]
jedberg 2021-08-16 19:47:45 +0000 UTC [ - ]
I have autopilot on my car, and it definitely makes me a better and safer driver. It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things. Instead I can spend all my brainpower focused on looking for potential emergencies, instead of splitting time between lane keeping/following and looking for emergencies.
I no longer have to look at my speedometer or the lane markers, I can take a much broader view of the traffic and conditions around me.
Before you say it's impossible to be safe driving with an assistive product, I suggest trying one out.
tomdell 2021-08-16 19:52:28 +0000 UTC [ - ]
Drunk_Engineer 2021-08-16 19:58:59 +0000 UTC [ - ]
"Risk compensation is a theory which suggests that people typically adjust their behavior in response to perceived levels of risk, becoming more careful where they sense greater risk and less careful if they feel more protected."
xahrepap 2021-08-16 20:34:12 +0000 UTC [ - ]
https://usa.streetsblog.org/2017/09/13/wide-residential-stre...
I was first introduced to "wide streets in neighborhoods are more dangerous than narrow" on HN years ago. (I don't think it was the linked article, but that was the first one that came up just now after a search :P )
Since having read that, I've actually noticed how true this is, at least to me anecdotally. When I'm driving in a neighborhood with crowded streets, I can't bring myself to go over 15MPH, much less over the speed limit (typically 25 in neighborhoods in the US).
Wide streets give a sense of security. So I feel like people are less likely to pay attention going around bends, parked cars, etc, than if they didn't have that sense of security.
telside 2021-08-16 23:44:26 +0000 UTC [ - ]
Forget the average, how about the bottom 10-20% of all drivers? I don't trust the bottom 10% driving with "Autopilot" at all, zero. They are going to use it to go on autopilot while driving, exactly as the marketing implies. I mean there has to even be people who think the car itself is conscious. Car has advanced AI, must be conscious.
To think otherwise is just highly underestimating how clueless some people are.
vnchr 2021-08-17 00:52:33 +0000 UTC [ - ]
kemiller 2021-08-16 20:00:25 +0000 UTC [ - ]
gusgus01 2021-08-16 21:12:51 +0000 UTC [ - ]
They then use a "common formula" to show that that leads to more fatal accidents, but didn't actually study on actual crash data.
birken 2021-08-16 20:12:06 +0000 UTC [ - ]
There is absolutely no doubt I'm a safer driver with Comma than without it. I'm still in control, but Comma not only allows me to expend less effort driving (which allows me to stay alert over longer periods of time), but also be much less emotional when driving. I'm pretty convinced that a large percentage of accidents are caused by frustrated or bored drivers doing crazy things that you just don't feel the urge to do with the assistance of self-driving.
jedberg 2021-08-16 20:49:18 +0000 UTC [ - ]
Edit: After one minute I got a downvote.
SECProto 2021-08-16 21:52:37 +0000 UTC [ - ]
jedberg 2021-08-16 21:56:16 +0000 UTC [ - ]
SECProto 2021-08-16 23:33:43 +0000 UTC [ - ]
Sorry, I actually meant to refer to Birken's comment above. Advertising might not have been the best word - astroturfing? If you read their comment but replace "comma" with "tesla" it still reads as spammy.
Yours was fine (though discussing downvotes will always get you downvotes, my comment included).
> Otherwise, is every person here who mentions Tesla advertising too?
Only the ones who needlessly sing the praises of the Tesla autopilot in barely-related threads.
jskrn 2021-08-16 19:57:36 +0000 UTC [ - ]
joshuanapoli 2021-08-16 21:44:03 +0000 UTC [ - ]
I guess that we have to look at the results here to judge whether too many people are not paying attention. Hopefully the investigation will reveal whether autopilot collisions with emergency vehicles are significantly more or less frequent than those from vehicles being driven in the traditional way.
wilg 2021-08-16 19:59:00 +0000 UTC [ - ]
malwarebytess 2021-08-16 20:06:37 +0000 UTC [ - ]
new_realist 2021-08-16 20:29:33 +0000 UTC [ - ]
tomdell 2021-08-16 20:40:38 +0000 UTC [ - ]
https://www.latimes.com/business/story/2021-06-29/nhtsa-adas...
Tesla released data in the past, but that’s quite suspect as they have an obvious agenda and aren’t known for open and honest communication.
https://www.latimes.com/business/autos/la-fi-hy-tesla-safety...
woah 2021-08-16 21:01:53 +0000 UTC [ - ]
samstave 2021-08-16 20:31:48 +0000 UTC [ - ]
I also think there should be dedicated lanes for self driving cars..
A very good friend of mine was a sensor engineer at google working on virtual sensors that interacted with hand gestures in the air... and is now a pre-eminent sensor engineer for a large japanese company everyone has heard of...
We drove from the bay to northern California in his Tesla and it was terrifying how much trust he put into that car. I got carsick and ended up throwing up out the window...
Knowing what I know of machines, working in tech since 1995 -- I wouldn't trust SHIT for self-driving just yet.
rubicon33 2021-08-16 19:52:45 +0000 UTC [ - ]
theluketaylor 2021-08-17 00:20:32 +0000 UTC [ - ]
Autopilot bothers me for a number of reasons. Fundamentally it's a poor driver, spending time in people's blind spots unnecessarily, braking and accelerating in rather abrupt ways, and just generally acting like a teenage driver who just got their license. It simply doesn't practice defensive driving.
I also spend a lot of time driving on undivided rural highways. These are highly dangerous roads, with closing speeds in excess of 200 km/hr at times. In those situations autopilot drives far too much according to the strict rules of the road and can't adjust to the situation. It doesn't use the lane space to leave additional room and it doesn't give a wide berth to cyclists.
It also bothers me deeply that one of the ways to override autopilot is to make a steering input, and that's also the indicator Tesla uses to determine the driver is participating. I haven't used autopilot much for the reasons above, but the few times I did activate it, the steering input required to tell the car I'm there is also enough steering input to move the car several feet in the lane due to very direct steering. It feels like I'm just fighting the car. That is deeply unnerving since the force required to override autopilot feels like enough to jump nearly half a lane and cause a collision.
jedberg 2021-08-17 00:28:36 +0000 UTC [ - ]
theluketaylor 2021-08-17 00:48:38 +0000 UTC [ - ]
I'm guessing GM Supercruise since it's the only one I'm aware of that uses eye tracking in production (though Tesla claims to have enabled that in the US just recently. Personally I'm not sure their camera placement can really do proper eye tracking). Supercruise's disengagement rate is low though, generally much lower than Tesla's.
I do like supercruise from what I've seen of it (haven't had a chance to actually use it since GM seems determined to waste their advantage by not rushing it into every car they make).
jedberg 2021-08-17 01:16:17 +0000 UTC [ - ]
bob1029 2021-08-16 21:47:34 +0000 UTC [ - ]
I would be terrified to share the road with someone of this mindset. Your vehicle is a lethal weapon when you are driving it around (assisted or otherwise). At no point can someone claim that a tesla vehicle circa today is able to completely assume the duty of driving. You are still 100% responsible for everything that happens in and around that car. You had better have a plan for what happens if autopilot decides to shit the bed while a semi jackknifes in front of you.
The exceptions are what will kill you - and others - every single time. It's not the boring daily drives where 50 car pileups and random battery explosions occur. Maybe your risk tolerance is higher. Mine is not. Please consider this next time you are on the road.
nexuist 2021-08-16 22:03:44 +0000 UTC [ - ]
When you automate away the mundane, exceptions are much easier to catch.
crote 2021-08-16 23:30:15 +0000 UTC [ - ]
Self-driving cars will take that stuff over 99% of the time, but the 1% where it screws up is the dangerous part. There are plenty of examples where a self-driving car seemingly randomly seems to go completely haywire, without any obvious reason.
Instead of spending brain power on driving properly, you now have to spend brain power on looking at what you should be doing _and_ checking if the car actually agrees and is doing it.
Staying 100% focused without actually _doing_ anything is incredibly difficult. Many countries intentionally add curves to their highways to keep drivers alert: having a pencil-straight road for hundreds of miles really messes with the human brain.
stephencanon 2021-08-17 01:17:15 +0000 UTC [ - ]
That doesn’t help at all if you’re reading a book or playing on your phone, both of which are things I observe Tesla drivers doing pretty often when I’m in the Bay Area.
jedberg 2021-08-16 21:54:11 +0000 UTC [ - ]
It's the same thing here. The computer is assisting me so that I can take care of the novel situations, the exceptions if you will. I can pay closer attention to the road and see that jackknifed trailer sooner because I wasn't looking at my speedometer to check my speed.
And I don't have a Tesla, I use a different autopilot system.
throwaway0a5e 2021-08-16 23:19:22 +0000 UTC [ - ]
He's saying that he no longer has to worry about those things the same way cruise control lets you not worry about the speedometer needle and dedicate more of your attention budget outside the car.
Of course you can be an idiot and spend it on your cell phone but that's not really a failure mode specific to any given vehicle technology.
CommieBobDole 2021-08-16 20:25:21 +0000 UTC [ - ]
Obviously that's a single anecdote, and I don't know if it would have gone through with it because I immediately corrected, but that was my experience.
ec109685 2021-08-16 21:38:02 +0000 UTC [ - ]
The question is whether a system that absolutely requires that you pay attention going through intersections (which you should obviously do) is safer in aggregate than not having those features enabled at all in those situations.
E.g. are weird lane changes that people don't catch happening more frequently than people zooming through red lights because they weren't paying attention. Only the data can show that, and Tesla should share it.
MisterTea 2021-08-16 20:30:18 +0000 UTC [ - ]
I've been driving for 25 years - cars, trucks, trailers, standard and auto transmissions - and I have never once thought to myself "I'd be such a better driver if I didn't have to pay attention to my speed, lane keeping or following distance." Why? Because those mundane things are already on autopilot in my brain.
Posts like yours are so absurd to me that I can't help but think shill.
jedberg 2021-08-16 20:52:58 +0000 UTC [ - ]
It's like people who do math by hand and then get a calculator.
spywaregorilla 2021-08-16 20:54:17 +0000 UTC [ - ]
studentrob 2021-08-16 21:53:42 +0000 UTC [ - ]
rrix2 2021-08-17 00:36:13 +0000 UTC [ - ]
int_19h 2021-08-16 21:05:02 +0000 UTC [ - ]
throwaway09223 2021-08-16 19:52:17 +0000 UTC [ - ]
breakfastduck 2021-08-16 21:00:25 +0000 UTC [ - ]
aguasfrias 2021-08-16 20:01:48 +0000 UTC [ - ]
Personally, I tend to turn off things like lane keeping because I end up having to babysit them more than I would like. It doesn't always read the lanes correctly, though I have not tried Tesla's technology yet.
new_realist 2021-08-16 20:27:42 +0000 UTC [ - ]
yumraj 2021-08-16 22:03:57 +0000 UTC [ - ]
But, I feel that this depends on the type of driver and their personality. I, for one, have never felt comfortable with cruise controls, even adaptive ones, let alone partial self-driving. I have always found that I am more comfortable when I am driving rather than trying to make sure that the adaptive cruise control is able to make a complete stop in case of emergencies. Perhaps I'm just a little untrusting and paranoid :).
andyxor 2021-08-16 20:20:47 +0000 UTC [ - ]
I don't know if this "AI" has any sort of quality control, but how difficult is it to test if it detects a solid white line on the side of the road in at least 6 out of 10 tries
it also tends to suddenly disengage and pass control to the driver at the most dangerous parts of the trip, e.g. when passing another car in a narrow lane, etc.
This "driver assistant" is a series of disasters in the making.
hellbannedguy 2021-08-16 20:16:42 +0000 UTC [ - ]
jedberg 2021-08-16 20:51:00 +0000 UTC [ - ]
But no, I don't trust it to drive itself. If I'm tired I won't drive, regardless of autopilot.
phyzome 2021-08-17 00:17:42 +0000 UTC [ - ]
I'm curious about this part. Do you manually input a limit, or trust it to read street signs?
And how often do you look at your speedometer anyhow? I think on the highway I glance at it maybe once every few minutes and otherwise match speed with the other vehicles, and in the city I look more often but mostly just drive at what feels a safe pace (which seems to match the limits, more or less.)
jedberg 2021-08-17 00:25:36 +0000 UTC [ - ]
It's true, I don't look at the speedometer all that often when driving manually, but it's just one less thing to worry about.
asdff 2021-08-16 22:36:23 +0000 UTC [ - ]
jedberg 2021-08-16 23:13:18 +0000 UTC [ - ]
jeffrallen 2021-08-16 21:55:01 +0000 UTC [ - ]
e40 2021-08-16 20:51:27 +0000 UTC [ - ]
sgustard 2021-08-16 21:53:52 +0000 UTC [ - ]
sorokod 2021-08-16 20:32:23 +0000 UTC [ - ]
spywaregorilla 2021-08-16 20:55:32 +0000 UTC [ - ]
sorokod 2021-08-16 21:44:45 +0000 UTC [ - ]
mensetmanusman 2021-08-16 13:07:44 +0000 UTC [ - ]
Risk compensation is fascinating; riding with a bike helmet causes the cyclist and the drivers around them to behave more dangerously.
Is society sophisticated enough to deal with advanced driver assistance? Is it possible to gather enough data to create self driving ML systems?
WA 2021-08-16 19:51:34 +0000 UTC [ - ]
Do you have a truly reliable source for that? Because I hear this statement once in a while, and it feels flawed.
A helmet protects you from severe head injury if you are in an accident. There are more reasons for accidents than reckless car drivers. For example:
- Bad weather
- Driver not seeing the biker at all (no matter with or without helmet)
- Crash between 2 cyclists
xsmasher 2021-08-16 21:21:25 +0000 UTC [ - ]
https://www.bicycling.com/news/a25358099/drivers-give-helmet...
brandmeyer 2021-08-16 22:10:44 +0000 UTC [ - ]
This result is both weakly supported and small, and it shouldn't be considered actionable.
jacquesm 2021-08-16 14:17:14 +0000 UTC [ - ]
A nice and very sharp 8" stainless steel spike on the steering wheel facing the driver.
toast0 2021-08-16 20:31:10 +0000 UTC [ - ]
Didn't we have those in the 50s and 60s? Maybe not sharp, but collapsible steering columns are a significant improvement to survivability.
barbazoo 2021-08-16 20:04:38 +0000 UTC [ - ]
Source please
bllguo 2021-08-16 20:28:06 +0000 UTC [ - ]
there are some sources and studies linked. i.e. countries with the highest rate of helmet use also have the highest cyclist fatality rates
xsznix 2021-08-16 20:15:20 +0000 UTC [ - ]
barbazoo 2021-08-16 20:48:00 +0000 UTC [ - ]
> There is a body of research on how driver behaviour might change in response to bicyclists’ appearance. In 2007, Walker published a study suggesting motorists drove closer on average when passing a bicyclist if the rider wore a helmet, potentially increasing the risk of a collision. Olivier and Walter re-analysed the same data in 2013 and claimed helmet wearing was not associated with close vehicle passing.
xsmasher 2021-08-16 21:24:14 +0000 UTC [ - ]
> We then present a new analysis of the original dataset, measuring directly the extent to which drivers changed their behaviour in response to helmet wearing. This analysis confirms that drivers did, overall, get closer when the rider wore a helmet.
phoe18 2021-08-16 12:58:30 +0000 UTC [ - ]
No mention of the deceptive marketing name "Full Self Driving" in the article.
dmix 2021-08-16 15:12:57 +0000 UTC [ - ]
> All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. [...] As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates.
https://www.tesla.com/en_CA/autopilot
I also personally would prefer they stuck to 'autopilot' and avoided the word full in 'full self-driving' and otherwise be more specific about what it means.
Other car companies typically productize the various features like lane assist, following cruise control, etc rather than bundle it into one. But that definitely makes communicating it more difficult.
Tesla probably doesn't want to call it 'limited self-driving' or 'partial self-driving'. Maybe 'computer assisted driving' but that doesn't sound as appealing. I can see the difficulty marketing here. But again not using 'full' as in it's complete and ready-to-go would help.
rvz 2021-08-16 13:23:55 +0000 UTC [ - ]
On top of that, FSD is still admittedly Level 2; Not exactly 'Full Self Driving'? And the controls can easily be tricked to think that the driver has their 'hands on the wheel' which is not enough to determine driver attentiveness while FSD is switched on.
xeromal 2021-08-16 19:48:24 +0000 UTC [ - ]
kube-system 2021-08-16 20:09:35 +0000 UTC [ - ]
guerby 2021-08-16 15:09:35 +0000 UTC [ - ]
If we assume the number of Tesla Autopilot deaths doubles this year to 8 (from 4 at the time of the probe launch), for about 900 thousand Teslas on the road in the USA, that's 8.9 Autopilot deaths per million Teslas per year.
Ratio between the numbers: 14.4.
Tesla's reporting says for Q1 2021 one crash on Autopilot per 4.19 million miles vs one crash per 484 thousand miles for all vehicles.
Ratio between the numbers: 8.7.
All numbers are full of biases and their ratios probably aren't that meaningful, but they end up in the same order of magnitude.
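For reference, the arithmetic above, using the figures as stated (not independently verified):

    ap_deaths_per_year = 8           # assumed doubling from 4 at the probe launch
    teslas_on_road_millions = 0.9    # ~900 thousand Teslas in the USA

    print(round(ap_deaths_per_year / teslas_on_road_millions, 1))  # 8.9 deaths/million/year

    # Tesla's Q1 2021 figures: one crash per 4.19M Autopilot miles
    # vs one crash per 484k miles across all vehicles.
    print(round(4_190_000 / 484_000, 1))                           # ~8.7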
Interesting data there "Fatality Facts 2019 Urban/rural comparison":
https://www.iihs.org/topics/fatality-statistics/detail/urban...
"Although 19 percent of people in the U.S. live in rural areas and 30 percent of the vehicle miles traveled occur in rural areas, almost half of crash deaths occur there. "
I was shocked that in the USA in 2019 about 40-46% of all road deaths were unbelted, while 90% of front-seat occupants wear seat belts according to observational studies.
Incidentally, Tesla cars will beep to no end if weight is detected on a seat and the seat belt isn't clicked: I have to click the seat belt when I put my (not so heavy) bag on the passenger seat since there's no software option to disable the beeping.
akira2501 2021-08-16 22:14:35 +0000 UTC [ - ]
You shouldn't. 16% of accidents involve pedestrians. 8% involve motorcyclists. 40% of accidents involve excessive speed or drugs and alcohol.
Accidents aren't a fungible item you can do this with.
> bag on the passenger seat since there's no software option to disable the beeping.
There is in the US for the rear seats. Additionally, you can just leave the belt always clicked in and just sit on top of them. There aren't many great technological solutions to human behavior.
jeffbee 2021-08-16 19:51:25 +0000 UTC [ - ]
btbuildem 2021-08-16 19:29:13 +0000 UTC [ - ]
Doesn't that just speak to the effectiveness of seatbelts? Most people wear them, and two-fifths of those who died in a crash did not wear a seatbelt.
If deaths were split in the same proportion as seatbelt use - i.e. 90% of the dead were belted - that would indicate the belts are ineffective.
guerby 2021-08-16 20:18:47 +0000 UTC [ - ]
It's just that in France "only" 20-25% of fatalities are for people not wearing seatbelt.
Observation statistics are at about 98-99% of front seat users wearing seat belt in France.
Seat belt in front is mandatory since 1st july 1973 and back seat since 1st october 1990.
So seat belts not Tesla autopilot or whatever would save around 8000 lives per year in the USA.
Are tesla cars in the USA nagging about seat belt with no software off switch like in France?
zzt123 2021-08-16 19:35:12 +0000 UTC [ - ]
foepys 2021-08-16 20:25:34 +0000 UTC [ - ]
guerby 2021-08-16 20:39:54 +0000 UTC [ - ]
But at this point you really see nothing and you'll limit your speed to 10-40 km/h by yourself.
I used it in those situations on my Tesla Model 3 to be able to focus as much as possible on what little visibility was left as a driver, with both hands firmly on the wheel and foot on the brake; low visibility is really dangerous and scary on the road.
Part of the issue is that you don't know what speed the car arriving behind you will have so where's your optimum speed? Too slow and rear ended by bad drivers, too fast and it won't go well.
It's fresh on my mind since I had such driving conditions two weeks ago on the highway. Trucks stayed at suicidal 90 km/h ...
darkwizard42 2021-08-16 19:12:43 +0000 UTC [ - ]
1. Generally speaking, the right way to think about accidents/fatalities/hard braking events is per mile driven, given that risk scales with time spent on the road (and miles driven is the best proxy we have at the moment; insurance companies use this stat).
2. If wearing a seat belt prevents a ton of fatalities as advertised and generally proven, it would make sense that of the road fatalities that do happen, many are due to not wearing a seat belt.
10% of people not wearing seat belts is still hundreds of millions of miles driven without seat belts.
jazzyjackson 2021-08-16 19:18:49 +0000 UTC [ - ]
Of course, highway driving is the least dangerous (per passenger mile) since everything is so predictable, I don’t know how many deaths are caused by t-bones at intersections but that at least should disappear now that auto-radar brakes are a thing… (tesla thinks it’s too good for radar of course, even tho it can’t recognize a god damn fire truck is in the way)
yawaworht1978 2021-08-16 22:22:38 +0000 UTC [ - ]
sidibe 2021-08-16 13:08:33 +0000 UTC [ - ]
As Teslas get better at driving the drivers will be paying less attention inevitably, Tesla needs to start being responsible at some point
bob33212 2021-08-16 13:35:16 +0000 UTC [ - ]
Once full self-driving is statistically safer than humans, how will you not let people use it? It is like saying you would rather have 10 children die because of bad driving skills than 1 child die because they were not paying attention at all times.
sidibe 2021-08-16 13:41:44 +0000 UTC [ - ]
I'm fine with self-driving if/when it works (though I'm pretty sure from watching FSD Beta videos shot and edited by their biggest fans with a few interventions every 5 minutes, this is many many many years away for Tesla). But the company selling the self driving has to be responsible to some degree for the mistakes it makes.
WA 2021-08-16 19:58:29 +0000 UTC [ - ]
https://www.tesla.com/videos/autopilot-self-driving-hardware...
"… HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF."
Online since 2016, debunked as a lie. Still on Tesla’s website.
thebruce87m 2021-08-16 22:46:48 +0000 UTC [ - ]
That’ll be a hard thing to overcome for the public. The drunk person “had it coming”, but did little Timmy?
bob33212 2021-08-17 02:08:34 +0000 UTC [ - ]
jazzyjackson 2021-08-16 19:21:33 +0000 UTC [ - ]
Vecr 2021-08-17 00:18:31 +0000 UTC [ - ]
jazzyjackson 2021-08-17 03:13:12 +0000 UTC [ - ]
Maybe Elon can add a prompt in the car: one time payment to unlock 100+ mph
kube-system 2021-08-16 20:45:39 +0000 UTC [ - ]
The differentiating issue with Tesla's system is the way it is sold and marketed. Important operational safety information shouldn't be hidden in fine print. Subtly misleading marketing has unfortunately become acceptable in our culture, but this idea needs to stay out of safety-critical systems.
We need a mandate for clear and standardized labelling for these features, à la the Monroney sticker. All manufacturers should have to label and market their cars with something like SAE J3016. https://www.sae.org/binaries/content/gallery/cm/articles/pre...
TacticalCoder 2021-08-16 13:11:21 +0000 UTC [ - ]
To me driving requires paying constant attention to the road and being always ready to act swiftly: I just don't understand how you can have a "self driving car, but you must be ready to put your hands back on the steering wheel and your foot on the pedal(s)".
I have nothing against many "recent" safety features, like the steering wheel shaking a bit if the car detects you're getting out of your lane without having activated your blinker. Or the car beginning to brake if it detects an obstacle. Or the car giving you a warning if there's a risk when you change lane, etc.
But how can you react promptly if you're not ready? I just don't get this.
Unless it's a fully self-driving car, without even a steering wheel, a car should help you focus more, not less.
KronisLV 2021-08-16 13:21:37 +0000 UTC [ - ]
You cannot, that's the simple truth. You're supposed to focus on the road anyway and should be able to take over once any sort of autopilot or assist system starts working erroneously, yet in practice many people simply assume that those systems being there in the first place means that you can stop focusing on the road altogether.
It feels like the claim of a "fully self driving vehicle" is at odds with actual safety, or at least will remain so until the technology actually progresses far enough to be, on average, safer than human drivers, moral issues aside. Whether that will take 15, 50 or 500 years, I cannot say.
That said, such functionality could currently be good enough for the driver to take a sip from a drink, fiddle with a message on their phone, or mess around with the navigation system or the radio - things that would get done regardless because people are irresponsible, but making them a little bit safer is feasible.
ghaff 2021-08-16 13:56:46 +0000 UTC [ - ]
Maybe, as you say, it's feasible today or soon to better handle brief distractions but once you allow that it's probably dangerous to assume that people won't stretch out those distractions.
Retric 2021-08-16 14:39:42 +0000 UTC [ - ]
Which means people are either paying enough attention or these self driving systems are quite good. My suspicion is it’s a mix of both, where people tend to zone out in less hazardous driving conditions and start paying attention when things start looking dangerous. Unfortunately, that’s going to cause an equilibrium where people pay less attention as these systems get better.
Brakenshire 2021-08-16 15:10:09 +0000 UTC [ - ]
Do we? Where does that come from? The data Tesla provides is hopelessly non-representative because it makes the assumption that the safety of any given road is independent of whether a driver chooses to switch on the system there.
Retric 2021-08-16 15:24:52 +0000 UTC [ - ]
SpelingBeeChamp 2021-08-16 15:44:21 +0000 UTC [ - ]
Retric 2021-08-16 16:07:16 +0000 UTC [ - ]
cma 2021-08-16 15:22:07 +0000 UTC [ - ]
Comma.ai makes the monitoring more strict when the system is less certain or when in denser traffic.
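Something in this spirit, roughly (the numbers and names are invented for illustration, not Comma's actual tuning):

    def attention_timeout_s(model_uncertainty: float, traffic_density: float) -> float:
        """Hypothetical driver-monitoring schedule: the less certain the system
        or the denser the traffic, the sooner it demands proof of attention."""
        base_s = 30.0
        scale = max(model_uncertainty, traffic_density)  # both assumed in [0, 1]
        return max(5.0, base_s * (1.0 - scale))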
JohnJamesRambo 2021-08-16 13:31:38 +0000 UTC [ - ]
jays 2021-08-16 14:26:46 +0000 UTC [ - ]
So I wonder if it's more about Tesla capitalizing on the hype of self-driving cars (with the expensive self-driving add-on) in the short term and less about him misunderstanding the magnitude of the difficulty.
Tesla is using the proceeds from that add-on to make itself seem more profitable and to fund the actual development. It's smart in some respects, but very risky for consumers and for Tesla.
ghaff 2021-08-16 15:24:05 +0000 UTC [ - ]
I still wonder to what degree this was a collective delusion based on spectacular but narrow gains mostly related to supervised learning in machine vision, how much was fake it till you make it, and how much was pure investor/customer fleecing.
lastofthemojito 2021-08-16 15:08:38 +0000 UTC [ - ]
Obviously that's not exactly the same thing as taking over for a real car when the driver assistance features give up, but seems similarly challenging to take over the controls at the most precarious moment of travel, without being sort of "warmed up" as a driver.
jcpham2 2021-08-16 18:13:53 +0000 UTC [ - ]
zemptime 2021-08-16 14:45:49 +0000 UTC [ - ]
Anyway, perhaps I'm in a minority here, but I feel as though my driving has gotten _significantly safer_ since getting a Tesla, particularly on longer road trips.
Instead of burning energy making sure my car stays in the lane I can spend nearly all my time observing drivers around me and paying closer attention farther down the road. My preventative and defensive driving has gone up a level.
> I just don't understand how you can have a "self driving car but you must but be ready to put your hands back on the steering wheel and your foot on the pedal(s)".
I've avoided hitting animals and dodged random things rolling/blowing into the road at a moment's notice. This isn't letting autopilot drive; it's a hybrid act where it does the rote driving and I constantly take over to quickly pass a semi on a windy day, to not pass it on a curve, or to get over some lanes to avoid tire remnants in the road up ahead. I'm able to watch the traffic in front and behind and find pockets on the highway with nobody around me and no clumping bound to occur (<3 those).
To your suspicion, it is a different mode of driving. Recently I did a roadtrip (about half the height of the USA) in a non-Tesla, and I found myself way more exhausted and less alert towards the end of it. Could be I'm out of habit but egh.
Anyway, so far I've been super lucky. I don't think it's possible to avoid all car crashes no matter how well you drive. But I _for sure_ have avoided avoidable ones and taken myself out of situations where they later occurred thanks to the extra mental cycles afforded to me by auto-pilot. My safety record in the Tesla is currently perfect and I'll try and keep it that way.
I don't think autopilot is perfect either but I do think it's a good tool and I'm a better driver for it. Autopilot has definitely helped me spend better focus on driving.
malwrar 2021-08-16 15:13:55 +0000 UTC [ - ]
throwaway0a5e 2021-08-16 15:19:25 +0000 UTC [ - ]
scrumbledober 2021-08-16 20:25:22 +0000 UTC [ - ]
somedude895 2021-08-16 16:58:26 +0000 UTC [ - ]
AndrewBissell 2021-08-16 19:06:37 +0000 UTC [ - ]
Also, your subjective impressions may be what they are simply because you have not yet encountered the unlucky set of conditions which would radically change your view, as was surely the case for all the drivers involved in these sorts of incidents.
TacticalCoder 2021-08-16 15:53:31 +0000 UTC [ - ]
I wouldn't want my comment (which, strangely enough, got upvoted a lot) to be mistaken for Tesla hate. I like what they're doing. I just think the autopilot shouldn't give a false sense of security.
> I've not hit animals and dodged random things rolling/blowing into the road at a moment's notice.
> I don't think it's possible to avoid all car crashes no matter how well you drive.
Same here... And animals are my worst nightmare: there are videos on YouTube that are just terrifying.
I do regularly watch crash videos to remind myself of some of the dangers on the road.
somerandomqaguy 2021-08-16 16:38:43 +0000 UTC [ - ]
You're talking about Autopilot, which is just driver assistance technology: lane keep assistance, adaptive cruise control, blind spot monitoring, etc. It's not meant to replace driver attention; it just monitors sections of the road that the driver can't pay attention to full time. The driver remains in control and attentive to the road.
The person you're responding to seems to be talking about the Full Self Driving feature, whose initial marketing implied that the driver need not be mentally engaged at all, or could even be too impaired to drive normally. That was later backpedaled to say that you need to pay attention.
gugagore 2021-08-16 15:15:59 +0000 UTC [ - ]
kwhitefoot 2021-08-16 19:01:13 +0000 UTC [ - ]
pedrocr 2021-08-16 13:49:03 +0000 UTC [ - ]
oblio 2021-08-16 14:25:05 +0000 UTC [ - ]
jazzyjackson 2021-08-16 19:30:30 +0000 UTC [ - ]
ocdtrekkie 2021-08-16 14:14:33 +0000 UTC [ - ]
However, I also once so far have experienced what happens when this system experiences a poorly-marked construction zone. Whilst most construction sites on the interstate system place temporary road lines for lane shifts, this one solely used cones. While I was paying attention and never left the flow of traffic, the car actually fought a little bit against me following the cones into another lane, because it didn't see the cones, it was following the lines.
It doesn't surprise me at all that if someone gets too comfortable trusting the car to do the work, even if they think they're paying attention, they could get driven off the roadway.
hermitdev 2021-08-16 15:54:01 +0000 UTC [ - ]
How do automated systems deal with flaggers? Visibility of the stop/slow sign isn't sufficient to make a determination on whether it's safe to proceed (not to mention "stop" changes meaning here, entirely, from a typical stop sign). Often, whether or not you can proceed comes down to hand gestures from the flagger proper.
Not that I expect any reasonable driver to be using something like autopilot through such a situation, but we've also seen plenty of evidence that there are unreasonable drivers currently using these systems, as well.
ocdtrekkie 2021-08-16 17:08:18 +0000 UTC [ - ]
Of course, the problem is, if we haven't developed it today, the ADAS systems of today won't understand it in ten years when there's enough saturation to be practical to use it. Apart from Tesla, very few car manufacturers are reckless enough to send OTA updates that can impact driving behavior.
Lane-following ADAS systems of today, mind you, can work relatively fine in construction areas... provided lane lines are moved, as opposed to relying solely on traffic cones.
robomartin 2021-08-16 14:31:00 +0000 UTC [ - ]
A mesh network of vehicles on the road would add the ability for vehicles to become aware of far more than a human driver can ever know. For example, if cars become aware of a problem a few km/miles ahead, they can all adjust speed way before encountering the constriction in order to optimize for traffic flow (or safety, etc.).
Of course, this does not adequately deal with pedestrians, bikes, pets, fallen trees, debris on the road, etc.
Not saying cars would exclusively use the mesh network as the sole method for navigation, they have to be highly capable without it. The mesh network would be an enhancement layer. On highways this would allow for optimization that would bring forth some potentially nice benefits. For example, I can envision reducing emissions through traffic flow optimization.
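As a rough sketch of the kind of message passing I have in mind (all field names, hop limits, and thresholds here are invented for illustration, not any real V2V protocol):

    import time
    from dataclasses import dataclass
    from typing import Optional, Set

    @dataclass
    class HazardAlert:
        # Hypothetical message a car rebroadcasts to the vehicles behind it.
        hazard_id: str
        position_km: float   # distance marker along the highway
        kind: str            # "congestion", "debris", "crash", ...
        created_at: float    # unix timestamp
        hops_left: int       # stop flooding the mesh after N rebroadcasts

    def maybe_rebroadcast(alert: HazardAlert, seen: Set[str]) -> Optional[HazardAlert]:
        """Forward each alert once, with one fewer hop, so it travels upstream."""
        if alert.hazard_id in seen or alert.hops_left <= 0:
            return None
        seen.add(alert.hazard_id)
        return HazardAlert(alert.hazard_id, alert.position_km, alert.kind,
                           alert.created_at, alert.hops_left - 1)

    def adjusted_speed(current_kmh: float, my_position_km: float,
                       alert: HazardAlert) -> float:
        """Ease off well before the constriction instead of braking hard at it."""
        distance_ahead_km = alert.position_km - my_position_km
        if 0 < distance_ahead_km < 5.0 and time.time() - alert.created_at < 300:
            return min(current_kmh, 70.0)   # arbitrary smoothing target
        return current_kmh

The point is just that every car a few kilometres back can start shedding speed gently, which a driver limited to line-of-sight can't do.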
Remember that electric cars still produce emissions, just not necessarily directly while driving. The energy has to come from somewhere and, unless we build a massive number of nuclear plants, that somewhere will likely include a significant percentage of coal and natural gas power plants.
The timeline for this utopia is likely in the 20+ year range. I say this because of the simple reality of car and truck ownership. People who are buying cars today are not going to dispose of them in ten years. A car that is new today will likely enter into the used market in 8 to 10 years and be around another 5 to 10. The situation is different with commercial vehicles. Commercial trucks tend to have longer service lives by either design or maintenance. So, yeah, 20 to 30 years seems reasonable.
mhb 2021-08-16 14:10:27 +0000 UTC [ - ]
comeonseriously 2021-08-16 14:04:30 +0000 UTC [ - ]
kbshacker 2021-08-16 15:58:09 +0000 UTC [ - ]
Faaak 2021-08-16 14:07:16 +0000 UTC [ - ]
It just makes the trips easier on the brain, and thus, for me, safer overall: it's easier to see the overall situation when you've got free mental capacity.
rad_gruchalski 2021-08-16 16:15:24 +0000 UTC [ - ]
zip1234 2021-08-16 14:44:07 +0000 UTC [ - ]
aembleton 2021-08-16 14:59:40 +0000 UTC [ - ]
Does it always get this correct, or does it sometimes read a 30mph sign on a side road and then slow the car on the motorway down to that speed?
rad_gruchalski 2021-08-16 16:20:38 +0000 UTC [ - ]
zip1234 2021-08-16 15:02:39 +0000 UTC [ - ]
hermitdev 2021-08-16 15:43:48 +0000 UTC [ - ]
Since I bought my car, Illinois (where I live) has raised the maximum limit on interstates by 10 MPH. My car doesn't know about it. If my car limited me to what it thought the limit was, I'd probably be driving 20 MPH slower than prevailing traffic, a decidedly unsafe situation.
cranekam 2021-08-16 15:50:30 +0000 UTC [ - ]
It’s hard to imagine how speed limit systems would work without some sort of vision capabilities — a database of speed limits would never be up to date with roadworks and so on.
zip1234 2021-08-16 16:36:34 +0000 UTC [ - ]
Sargos 2021-08-16 15:03:18 +0000 UTC [ - ]
zip1234 2021-08-16 16:43:23 +0000 UTC [ - ]
emerged 2021-08-16 15:24:13 +0000 UTC [ - ]
filoleg 2021-08-16 15:18:37 +0000 UTC [ - ]
If you drive on a 60mph speed limit highway, no one is driving 60mph, everyone is going around 70mph. If you decide to use autopilot and it limits you to 60mph, you singlehandedly start disrupting the flow of traffic (that goes 70mph) and end up becoming an increased danger to yourself and others.
Not even mentioning cases when the speed limits change overnight or the map data is outdated or if a physical sign is unreadable.
zip1234 2021-08-16 16:45:36 +0000 UTC [ - ]
filoleg 2021-08-16 17:28:20 +0000 UTC [ - ]
For specific numbers (after subtracting reaction distance, which is the same for both):
55 mph: car 165 ft, 18-wheeler 225 ft. 65 mph: car 245 ft, 18-wheeler 454 ft.
As you can see, the gap between a car's stopping distance and an 18-wheeler's stopping distance grows as speed increases, and non-linearly at that. Not to mention the destructive potential of a car vs. an 18-wheeler.
I would agree with your point if the majority of the roads were occupied by 18-wheelers, but that isn't the case (at least in the metro area that I commute to work in).
Source for numbers used: https://trucksmart.udot.utah.gov/motorist-home/stopping-dist...
Note: I agree that it would be safer if everyone drove the exact speed limit, as opposed to everyone going 10mph above the speed limit. However, in a situation where everyone is driving 10mph above the speed limit, you are creating a more dangerous situation by driving 10mph slower instead of driving 10mph above like everyone else.
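For the curious, the non-linearity falls straight out of the physics; a back-of-the-envelope sketch (the friction coefficients are illustrative, not the UDOT figures):

    def braking_distance_ft(speed_mph: float, mu: float) -> float:
        """Braking distance grows with the square of speed:
        d = v^2 / (2 * mu * g), ignoring reaction distance."""
        g_ftps2 = 32.2
        v_ftps = speed_mph * 5280 / 3600
        return v_ftps ** 2 / (2 * mu * g_ftps2)

    # Illustrative friction coefficients: a car can brake much harder
    # than a heavily loaded truck.
    for mph in (55, 65):
        print(mph,
              round(braking_distance_ft(mph, mu=0.7)),    # car-ish
              round(braking_distance_ft(mph, mu=0.35)))   # loaded-truck-ish

Going from 55 to 65 mph is an 18% increase in speed but roughly a 40% increase in braking distance, for both vehicles.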
hnarn 2021-08-16 15:16:50 +0000 UTC [ - ]
This is a situation that you simply shouldn’t put yourself in. There is no reason to ever drive right next to a large vehicle, on either side, except for very short periods when overtaking them on a straight road.
throwaway0a5e 2021-08-16 15:20:39 +0000 UTC [ - ]
hnarn 2021-08-16 15:25:47 +0000 UTC [ - ]
occamrazor 2021-08-16 16:44:30 +0000 UTC [ - ]
hnarn 2021-08-16 16:49:17 +0000 UTC [ - ]
paul7986 2021-08-16 13:55:27 +0000 UTC [ - ]
ra7 2021-08-16 14:05:48 +0000 UTC [ - ]
ghaff 2021-08-16 14:30:21 +0000 UTC [ - ]
ra7 2021-08-16 14:34:58 +0000 UTC [ - ]
ghaff 2021-08-16 14:40:19 +0000 UTC [ - ]
I think you're far more likely to see L4 on limited access highways in good weather. A robotaxi service in a major city seems much more problematic given all the random behavior by other cars, pedestrians, cyclists, etc. and picking up/dropping off people in the fairly random ways that taxis/Ubers do. (And you'll rightly be shut down 6 months for an investigation the first time you run over someone even if they weren't crossing at a crosswalk.)
And for many people, including myself, automated highway driving would actually be a much bigger win than urban taxi rides which I rarely have a need for.
ocdtrekkie 2021-08-16 14:18:48 +0000 UTC [ - ]
Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe. A former employee expressed disappointment they weren't the first self-driving car company to kill someone, because that meant they were behind.
ra7 2021-08-16 14:22:35 +0000 UTC [ - ]
California only months ago opened up permits for paid robotaxi rides. So no, they couldn't have launched it in CA. If you've noticed, they actually are testing in SF with a permit.
> Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe.
What lies? Their CA disengagement miles are for everyone to see, their safety report is open, they have had 0 fatalities in their years of operation. Seems like you just made this up.
dragonwriter 2021-08-16 15:00:05 +0000 UTC [ - ]
Well, yeah, that's the logic of an established business. Disruptive startups flout laws rather than following them.
ocdtrekkie 2021-08-16 14:38:35 +0000 UTC [ - ]
Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.
https://www.technologyreview.com/2014/08/28/171520/hidden-ob...
The article is seven years old at this point, but it's also incredibly humbling in how much bull- Waymo puts out, especially compared to the impressions their marketing team puts out. (Urmson's son presumably has a driver's license by now.)
In at least one scenario, the former Waymo engineer upset he had failed to kill anyone yet ("I’m pissed we didn’t have the first death"), caused a hit-and-run accident with a Waymo car, and didn't report it to authorities, amongst other serious accidents: https://www.salon.com/2018/10/16/googles-self-driving-cars-i... Said star Waymo engineer eventually went to prison for stealing trade secrets and then got pardoned by Donald Trump. Google didn't fire him for trying to kill people, they only really got upset with him because he took their tech to Uber.
I'd say Waymo has a storied history of dishonesty and coverups, behind a technology that's more or less a remote control car that only runs in a narrow group of carefully premapped streets.
ra7 2021-08-16 14:53:10 +0000 UTC [ - ]
How is a marketing video from 2015 relevant to their safety record? They weren't even operating a public robotaxi service back then.
> My understanding is that in 2021, it still can't navigate parking lots (which would preclude using it for drive-thrus).
Completely false. Here is one navigating a Costco parking lot (can't get any busier than that) [1]. If you watch any videos in that YouTube channel, it picks you up and drops you off right from the parking lot. Yes, you can't use it for drive-thrus, but it doesn't qualify as "lying about capabilities".
> Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.
Here is one recognizing a handheld stop sign from a police officer while it stopped for an emergency vehicle [2].
nradov 2021-08-16 15:31:46 +0000 UTC [ - ]
ra7 2021-08-16 15:36:45 +0000 UTC [ - ]
andreilys 2021-08-16 14:29:22 +0000 UTC [ - ]
Seeing as car crashes are the leading cause of death for people aged 1-54, it may be an improvement over the status quo.
More than 38,000 people die every year in crashes on U.S. roadways. The U.S. traffic fatality rate is 12.4 deaths per 100,000 inhabitants. An additional 4.4 million are injured seriously enough to require medical attention. Road crashes are the leading cause of death in the U.S. for people aged 1-54.
hn8788 2021-08-16 14:34:16 +0000 UTC [ - ]
_ph_ 2021-08-16 14:51:44 +0000 UTC [ - ]
jazzyjackson 2021-08-16 19:35:54 +0000 UTC [ - ]
ac29 2021-08-16 16:36:59 +0000 UTC [ - ]
This isn't true according to the CDC. Cancer and heart disease lead for the 44-54 group, and while "accidental injury" does lead from 1-44, if you break down the data, in many cases vehicle-based accidents are not the largest single source. For example:
Drowning is the largest single cause in 1-4
Cancer is the largest single cause in 5-9
Suicide is the largest single cause 10-14
hnburnsy 2021-08-16 14:25:04 +0000 UTC [ - ]
For example, couldn't emergency vehicles send out a signal directly to autonomous vehicles, or via a traffic management system, to slow down or to require the driver to take over when approaching? An elementary version of this is Waze, which will notify you of road hazards or cars stopped on the side of the road.
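A crude sketch of what that signal and the receiving car's reaction could look like (the message format and distances are entirely made up; real V2X work uses standardized DSRC/C-V2X messages, which this is not):

    from dataclasses import dataclass

    @dataclass
    class EmergencySceneBeacon:
        # Hypothetical broadcast from a parked fire truck or police cruiser.
        lat: float
        lon: float
        lane_blocked: bool
        scene_active: bool

    def on_beacon(beacon: EmergencySceneBeacon, distance_m: float,
                  autopilot_engaged: bool) -> str:
        """Decide how an approaching driver-assist system should react."""
        if not beacon.scene_active:
            return "ignore"
        if distance_m < 500 and autopilot_engaged:
            # Hand control back early enough for a human to actually take over.
            return "slow_down_and_request_takeover"
        if distance_m < 150 and beacon.lane_blocked:
            return "change_lane_or_stop"
        return "monitor"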
dotdi 2021-08-16 12:59:46 +0000 UTC [ - ]
josefx 2021-08-16 13:40:12 +0000 UTC [ - ]
Edit: I think the page count at the bottom of that list is off, it seems to repeat the last page so it might be less.
[1]https://www.respondersafety.com/news/struck-by-incidents/?da...
onlyrealcuzzo 2021-08-16 19:32:50 +0000 UTC [ - ]
btbuildem 2021-08-16 19:32:45 +0000 UTC [ - ]
In Canada the Highway Act states that you must move over (change lanes) for stopped emergency vehicles. It seems to solve that problem gracefully, leaving an empty lane between the stopped vehicles and traffic.
toomuchtodo 2021-08-16 13:01:29 +0000 UTC [ - ]
salawat 2021-08-16 13:32:41 +0000 UTC [ - ]
Now running into unlit emergency vehicles? I still think that's rather difficult sans inebriation or sleep deprivation.
lacksconfidence 2021-08-16 14:55:39 +0000 UTC [ - ]
darkerside 2021-08-16 12:59:53 +0000 UTC [ - ]
crubier 2021-08-16 13:00:58 +0000 UTC [ - ]
I still think that Tesla's approach is the right one, I just think they need to gather more data before letting this product be used in the wild unsupervised.
judge2020 2021-08-16 13:02:24 +0000 UTC [ - ]
dawnerd 2021-08-16 15:16:32 +0000 UTC [ - ]
laichzeit0 2021-08-16 15:36:46 +0000 UTC [ - ]
marvin 2021-08-16 19:42:07 +0000 UTC [ - ]
laichzeit0 2021-08-17 04:14:23 +0000 UTC [ - ]
360walk 2021-08-16 15:24:39 +0000 UTC [ - ]
crubier 2021-08-16 22:33:30 +0000 UTC [ - ]
hetspookjee 2021-08-16 21:35:44 +0000 UTC [ - ]
leroman 2021-08-16 13:38:49 +0000 UTC [ - ]
The question should be - how many lives were saved by this system vs how many would die if driven "normally"?
thereisnospork 2021-08-16 19:29:12 +0000 UTC [ - ]
It is also necessary to project this into the future, i.e. looking at the integral of expected lives lost 'rushing' self driving cars vs. 'waiting-and-seeing' (as Americans die at a rate of 40,000 per annum).
If twice as many people die for a fixed number of years to create a self driving system that results in half the fatality rate of the status quo, that becomes worth it very, very quickly.
young_unixer 2021-08-16 14:58:45 +0000 UTC [ - ]
For example: it's not morally equivalent to die while drunk driving at 150 km/h vs dying as a pedestrian because someone ran over you.
I would prefer 10 drunk drivers die instead of just one innocent person.
leroman 2021-08-16 15:07:51 +0000 UTC [ - ]
nickik 2021-08-16 21:58:37 +0000 UTC [ - ]
njarboe 2021-08-16 23:30:10 +0000 UTC [ - ]
joewadcan 2021-08-16 19:02:50 +0000 UTC [ - ]
geekraver 2021-08-17 15:14:42 +0000 UTC [ - ]
okareaman 2021-08-16 14:18:50 +0000 UTC [ - ]
I have no idea how self-driving fits into this. I don't have a feel how self-driving responds to emergencies. I'd have to experience an emergency in one. For that reason, I don't see myself ever trusting self-driving.
lawn 2021-08-16 15:18:04 +0000 UTC [ - ]
rad_gruchalski 2021-08-16 16:42:53 +0000 UTC [ - ]
1. always pay full attention to where you are because there might be a truck or a family of 5 coming from the opposite direction, 2. never lift, 3. always look in the direction you want to travel in, not in the direction you currently travel
kwhitefoot 2021-08-16 19:08:26 +0000 UTC [ - ]
The rationale being that swerving most likely puts more people at risk more of the time. Especially true here where leaving the road often means either colliding with the granite cliff wall or ending up in the fjord or lake.
killjoywashere 2021-08-17 03:52:25 +0000 UTC [ - ]
(1) https://www.tesladeaths.com/miles.html (2) https://www-fars.nhtsa.dot.gov/Main/index.aspx
myko 2021-08-16 14:23:48 +0000 UTC [ - ]
Does anyone know if the FSD Beta has this ability?
_ph_ 2021-08-16 15:04:01 +0000 UTC [ - ]
Recently, Tesla switched from radar-based to purely optical obstacle recognition. This should vastly improve this kind of behavior. It's ironic that the investigation starts at the moment when they have basically gotten rid of the old system.
Look on YouTube for videos of the FSD beta. It is amazingly good at recognizing the surroundings of the car, including parked vehicles at the roadside.
kwhitefoot 2021-08-16 19:11:34 +0000 UTC [ - ]
Neither can Volvo or VW.
Actually my 2015 Ap1.5 Model S does detect stopped vehicles, unfortunately not reliably.
zebnyc 2021-08-16 20:38:56 +0000 UTC [ - ]
Or I can tell my car, "Hey tesla, go pickup my kid from soccer practice" and it would know what to do.
tyingq 2021-08-16 13:10:42 +0000 UTC [ - ]
Meekro 2021-08-16 18:28:43 +0000 UTC [ - ]
Your take seems a lot more plausible.
tyingq 2021-08-16 18:52:48 +0000 UTC [ - ]
bishoprook2 2021-08-16 13:40:01 +0000 UTC [ - ]
It seems to me that Tesla door handles (in a world where they've been designing door latches for some time) are just plain ridiculous and likely unreliable but are a side effect of the market the company has been selling into. Gadgets go a long way with Tesla owners.
Obviously, things like a latch should not only work under all conditions including no-power, but they should probably be the same under all conditions. 'Emergency' latches aren't going to be used during an emergency as muscle memory is too important.
literallyaduck 2021-08-16 14:46:25 +0000 UTC [ - ]
kemiller 2021-08-16 20:03:21 +0000 UTC [ - ]
jdavis703 2021-08-16 20:38:37 +0000 UTC [ - ]
However, I’m assuming the crashes were quite varied: anything from a driver recklessly fleeing a stop to some drunk crashing into a cop on the highway shoulder. Most likely these deaths didn’t have a systematic pattern to them that could be prevented if only we knew what the root cause was.
sunshineforever 2021-08-16 22:25:54 +0000 UTC [ - ]
fallingknife 2021-08-16 13:24:40 +0000 UTC [ - ]
catillac 2021-08-16 14:09:24 +0000 UTC [ - ]
Here’s more detail: https://www.tesla.com/support/autopilot
sitkack 2021-08-17 15:00:30 +0000 UTC [ - ]
MonadIsPronad 2021-08-16 13:03:24 +0000 UTC [ - ]
Tesla perhaps isn't being loud enough about how autopilot isn't self-driving, and shouldn't even be relied upon to hit the brakes when something is in front of you.
ghaff 2021-08-16 14:04:34 +0000 UTC [ - ]
tmountain 2021-08-16 13:22:10 +0000 UTC [ - ]
Doesn't autopilot require you to put your hands on the wheel fairly regularly? Are these incidents just a matter of people using this feature outside of its intended use case?
Ajedi32 2021-08-16 13:36:18 +0000 UTC [ - ]
Newer versions of Autopilot watch to make sure you keep your eyes on the road, probably to prevent this very scenario[1].
[1]: https://www.theverge.com/2021/5/27/22457430/tesla-in-car-cam...
LightG 2021-08-18 10:45:06 +0000 UTC [ - ]
tacobelllover99 2021-08-16 13:27:02 +0000 UTC [ - ]
Oh wait, NM, that's traditional ICE cars.
FUD is dangerous
nikkinana 2021-08-16 13:42:35 +0000 UTC [ - ]
thoughtstheseus 2021-08-16 14:08:32 +0000 UTC [ - ]
gamblor956 2021-08-16 17:01:47 +0000 UTC [ - ]
blueplanet200 2021-08-16 17:16:00 +0000 UTC [ - ]
gamblor956 2021-08-16 17:43:01 +0000 UTC [ - ]
This is an issue because Tesla markets its cars as being "safer" than other company's vehicles, and the data shows that their driver assist system is objectively not.
mshumi 2021-08-16 12:59:46 +0000 UTC [ - ]
phpnode 2021-08-16 13:01:22 +0000 UTC [ - ]
smallhands 2021-08-16 13:18:10 +0000 UTC [ - ]
bathtub365 2021-08-16 13:24:38 +0000 UTC [ - ]
TSLA is down almost 2% in pre-market trading at the time of this comment, though.
catillac 2021-08-16 14:12:17 +0000 UTC [ - ]
antattack 2021-08-16 13:21:59 +0000 UTC [ - ]
"The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes," NHTSA said in a document opening the investigation.
TACC is very different from Autopilot.
jdavis703 2021-08-16 13:25:00 +0000 UTC [ - ]
antattack 2021-08-16 13:43:07 +0000 UTC [ - ]
Safety Consideration When Using Adaptive Cruise Control
• The system can only brake so much. Your complete attention is always required while driving.
• Adaptive Cruise Control does not steer your vehicle. You must always be in control of vehicle steering.
• The system may not react to parked, stopped or slow-moving vehicles. You should always be ready to take action and apply the brakes.
[1]https://my.gmc.com/how-to-support/driving-performance/drivin...
kelvin0 2021-08-16 13:37:37 +0000 UTC [ - ]
I think the best situation would be to have 'automated' stretches of highway specially designed to 'help' self driving systems.
Only self driving vehicles would be allowed on such special highways, and everything would be built around such systems.
SCNP 2021-08-16 14:06:29 +0000 UTC [ - ]
This is kind of a position I've held for a long time, but a different aspect of the problem. I think a system similar to IFF in aircraft would solve all of these issues. If every car knew where every other car was at all times, you could easily devise a system that would be nearly flawless. The issue is, there is no incremental path to this solution. You would essentially have to start over with the existing transportation network.
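As a toy illustration of the idea (the beacon fields are invented; aviation's actual analogue, ADS-B/TCAS, is far more involved than this):

    from dataclasses import dataclass

    @dataclass
    class PositionBeacon:
        # Hypothetical broadcast every car would emit a few times per second.
        car_id: str
        x_m: float
        y_m: float
        vx_ms: float
        vy_ms: float

    def seconds_to_closest_approach(a: PositionBeacon, b: PositionBeacon) -> float:
        """Straight-line extrapolation of when two broadcasting cars get closest,
        the core of any 'everyone knows where everyone is' conflict check."""
        rx, ry = b.x_m - a.x_m, b.y_m - a.y_m
        vx, vy = b.vx_ms - a.vx_ms, b.vy_ms - a.vy_ms
        speed_sq = vx * vx + vy * vy
        if speed_sq == 0.0:
            return float("inf")
        return max(0.0, -(rx * vx + ry * vy) / speed_sq)

Each car would run this against every beacon it hears and flag anything with a small time to closest approach. The hard part, as said above, is that it only works once essentially everything on the road is transmitting.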
mattnewton 2021-08-16 14:14:10 +0000 UTC [ - ]
SCNP 2021-08-16 14:50:09 +0000 UTC [ - ]
ghaff 2021-08-16 14:01:42 +0000 UTC [ - ]
kelvin0 2021-08-16 20:14:38 +0000 UTC [ - ]
rvz 2021-08-16 13:16:29 +0000 UTC [ - ]
Perhaps this is for the best.
[0] https://news.ycombinator.com/item?id=27996321
supperburg 2021-08-16 19:26:09 +0000 UTC [ - ]
zugi 2021-08-16 19:59:58 +0000 UTC [ - ]
Teslas crash 40% less than other cars, and 1/3 the number of people are killed in Teslas versus other cars.
Indeed once a common failure mode like this is identified it needs to be investigated and fixed. Something similar happened a few years ago when someone driving a Tesla while watching a movie (not paying attention) died when they crashed into a light-colored tractor trailer directly crossing the road. So an investigation makes sense. But much of the general criticism of self-driving and autopilot here seems misplaced. Teslas and other self-driving vehicle technologies are saving lives. They will continue to save lives compared to human drivers, as long as we let them.
derbOac 2021-08-16 20:23:42 +0000 UTC [ - ]
Some top-of-my-head thoughts:
1. I think to make a fair comparison of Tesla versus other cars, you'd have to really ask "how much safer are Tesla owners in Teslas compared to other cars randomly assigned to them?" That is, comparing the accident rates of Teslas compared to other cars is misleading because Tesla owners are not a random slice of the population. I almost guarantee that if you e.g., looked at their accident rates prior to owning a Tesla their accident rates would be lower than the general population.
2. In these autopilot situations, bringing up general accident rates seems sort of like a red herring to me. The actual causally relevant issue is "what would happen in this scenario if someone were driving without an autopilot?" So, for example, in the example of the rider who was killed when the autopilot drove them into a semi, the actually relevant question is "what would have happened if that driver, or someone interchangeable with them, was driving without autopilot? Would have they drove themselves into a semi?"
3. Various experts have argued general vehicle accident rates aren't comparable to Teslas because average cars are much, much older. As such, you should be comparing accident rates of cars of the same age, if nothing else. So, aside from the driver effect pointed out earlier, you have the question of "what would the accident rate look like in a Tesla or a car identical to it without autopilot?"
4. At some point with autopilot -- whether it be Tesla or other companies -- you have to start treating it comparably to a single individual. So, for example, what are the odds of Person A27K38, driving the same number of miles as Tesla, having a certain pattern of accidents? If you found a specific person drove into first responders on the side of the road 11 times, wouldn't that be suggestive of a pattern? Or would it? It's not enough to ask "how often do non autopilot drivers drive into first responders on the side of the road", it seems to me important to ask "how often would a single driver drive into first responders on the side of the road, given a certain number of miles driven in that same period?" At some point, autopilot becomes a driver, in the sense it has a unique identity regardless of how many copies of it there are? Maybe that's not right but it seems like that is the case.
Animats 2021-08-16 20:07:48 +0000 UTC [ - ]
It's clear what Tesla really has - a good lane follower and cruise control that slows down for cars ahead. That's a level 2 system. That's useful, but, despite all the hype about "full self driving", it seems that's all they've got.
"Full self driving" just adds some lane-changing assistance and hints from the nav system.
icelandicmoss 2021-08-16 21:51:20 +0000 UTC [ - ]
But when a supposedly 'all-seeing always watching' autopilot drives straight into a large stationary object in clear daylight, we have no understanding of how the situation occurred.
This I think has a couple of effects:
1) The apparent randomness makes the idea of these crashes a lot more scary -- psychologically we seem to have a greater aversion to danger we can't predict, and we can't tell ourselves the 'ah but that wouldn't happen to me' story.
2) Predictability of road incidents actually is a relevant piece of information. As a road user (including pedestrian), most of my actions are taken on the basis of what I am expecting to happen next, and my model for this is how humans drive (and walk). Automated drivers have different characteristics and failure modes, and that makes them an interaction problem for me.
oaw-bct-ar-bamf 2021-08-16 22:13:00 +0000 UTC [ - ]
Only when the vehicle computer detects a known object on the road that it knows should not be there does it apply the brakes or try to steer around it.
I would feel safer if the algorithm assumed the negative case by default and only gave the "green light" once it had determined that the road is free to drive on. In the case of unknown (not yet supervised) road obstructions, the worst needs to be assumed.
That's where the "unexplainable" crashes are coming from. Something the size of an actual truck is obstructing the road, but the system couldn't quite classify it because the truck has tipped over and is lying on the road sideways. Not yet learned by the algorithm. Can't be that bad, green light, no need to avoid or brake.
sangnoir 2021-08-17 01:27:22 +0000 UTC [ - ]
The problem with Tesla's "No LIDAR ever, cameras are good enough" approach is that it fails to detect emergency vehicles: they filter stationary items out of the radar signal as noise[1], and Tesla's ML models probably can't reliably identify oblique vehicles and semi trailers as obstacles.
1. Makes sense in isolation: frequent radar returns from roadside and overhead signs would be a pain to deal with
Varriount 2021-08-17 07:46:06 +0000 UTC [ - ]
MaxikCZ 2021-08-17 07:55:01 +0000 UTC [ - ]
quartesixte 2021-08-16 22:32:28 +0000 UTC [ - ]
Trying to remember if the opposite of this is how human drivers are taught, or if this is implicit in how we move about the world. My initial gut reaction says yes and this is a great phrasing of something that was always bothering me about automated driving.
Perhaps we should model our autopilots after horses: refusal to move against anything unfamiliar, and biased towards going back home on familiar routes.
cameron_b 2021-08-17 17:49:30 +0000 UTC [ - ]
The answer was “the mile in front of you”
Additionally there was some statistic about the frequency of accidents within a very short distance of the drivers residence, which seemed to underscore the importance of being aware of just how much your brain filters out the “familiar” in contrast to a newly stimulating environment.
jiscariot 2021-08-17 20:54:04 +0000 UTC [ - ]
If I google it, I get like three pages of law firms.
Animats 2021-08-17 06:29:36 +0000 UTC [ - ]
I agree, but it will up the false alarm rate in a system without good depth perception for all objects. This is tough with cameras only. Reflective puddles are a problem; they're hard to range with vision only. Anything that doesn't range well, which is most very uniform surfaces, becomes a reason to slow down. As you get closer, the sensor data gets better and you can usually decide it's safe to proceed.
Off-road autonomous vehicles have to work that way, but on-road ones can be more optimistic.
Waymo takes a hard line on this, and their vehicles drive rather conservatively as a result. They do have false-alarm problems and slowdowns around trouble spots.
oaw-bct-ar-bamf 2021-08-19 10:51:09 +0000 UTC [ - ]
If the system gets faster over time, even better. But I cannot imagine huge adoption unless the system gets actually reliable. I am pretty much in favor of the Waymo approach.
oaw-bct-ar-bamf 2021-08-19 10:53:45 +0000 UTC [ - ]
willcipriano 2021-08-16 22:29:06 +0000 UTC [ - ]
2021-08-16 22:30:51 +0000 UTC [ - ]
2021-08-16 23:16:35 +0000 UTC [ - ]
diggernet 2021-08-17 00:21:42 +0000 UTC [ - ]
To me, that blindness is simply unacceptable. If there is anything in the road, whether identified or not, it should automatically be flagged as a hazard. That flag should only be removed if it is detected to be moving in a way such that it will be somewhere else when you get there.
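That rule is essentially a time-of-arrival check; a rough sketch (the lane width and parameter names are my own invention):

    def still_a_hazard(distance_ahead_m: float, lateral_offset_m: float,
                       lateral_speed_ms: float, ego_speed_ms: float,
                       lane_half_width_m: float = 1.8) -> bool:
        """Keep the hazard flag unless, by the time we arrive, the object
        will have moved out of our path."""
        if ego_speed_ms <= 0.0:
            return abs(lateral_offset_m) <= lane_half_width_m
        time_to_arrival_s = distance_ahead_m / ego_speed_ms
        lateral_at_arrival_m = lateral_offset_m + lateral_speed_ms * time_to_arrival_s
        return abs(lateral_at_arrival_m) <= lane_half_width_m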
I have Subaru EyeSight. It has no problem seeing stationary objects. What's Tesla's problem?
harles 2021-08-17 00:25:42 +0000 UTC [ - ]
Of course the vision system is supposed to compensate for this, and it performs poorly on objects it doesn’t see often, like emergency vehicles.
jiggawatts 2021-08-17 01:41:57 +0000 UTC [ - ]
So, if they have the input data, why is it being ignored by autopilot?
harles 2021-08-17 03:53:38 +0000 UTC [ - ]
[0] https://www.tesla.com/autopilotAI
diggernet 2021-08-17 14:32:10 +0000 UTC [ - ]
harles 2021-08-17 14:39:06 +0000 UTC [ - ]
It’s simply not possible to do depth estimation like this without priors. That’s one of the serious limitations of such systems - you have to train on every class of object you don’t want to hit.
diggernet 2021-08-17 18:15:47 +0000 UTC [ - ]
harles 2021-08-17 22:22:57 +0000 UTC [ - ]
elihu 2021-08-16 23:25:51 +0000 UTC [ - ]
It's one thing to have to deal with inexplicable behavior from other cars, but to have to deal with inexplicable behavior from your own car seems quite a bit more unnerving.
riskable 2021-08-17 13:30:27 +0000 UTC [ - ]
Honestly, I see this as a necessary transition pain towards fully automated vehicles. No matter how you slice it there's going to be periods where fully automated driving systems aren't quite there yet but are good enough 97% of the time that human drivers let their guard down. It's going to take some sacrifices to get to fully autonomous driving.
The good news is that even with these accidents self-driving features are a bazillion times safer than human drivers. It sure seems like the occasional vehicle collision into stationary objects is going to throw a great big wrench into self-driving safety statistics but it isn't even a rounding error compared to the sheer number of accidents caused by human drivers.
pauljurczak 2021-08-17 03:45:10 +0000 UTC [ - ]
And yet, tens of thousands of drivers are working as unpaid beta testers for Tesla. Mind-boggling.
tshaddox 2021-08-17 01:01:57 +0000 UTC [ - ]
I don't see why these are inexplicable to humans. It's certainly no more difficult to explain than, say, a (non-adaptive) cruise control in a car from 2000 doing the same thing.
> Whilst humans can be dangerous drivers, the incidents they cause generally have a narrative sequence of events that are comprehensible to us -- for instance, driver was distracted, or visibility was poor.
But that is arguably a sufficient explanation for these Tesla crashes as well. The driver being distracted or inattentive or unable to see clearly is a requirement for all of these Tesla crashes, as far as I know.
icelandicmoss 2021-08-17 01:21:20 +0000 UTC [ - ]
kevin_thibedeau 2021-08-17 01:37:50 +0000 UTC [ - ]
postmeta 2021-08-16 20:58:12 +0000 UTC [ - ]
""" These events occur typically when a vehicle is partially in a lane and radar has to ignore a stationary object. This is pretty standard and inherent with TACC + radar.
The faster Tesla pushes the vision only stack to all cars after they’ve validated the data, the faster this topic becomes moot. Andrej Karpathy talks and shows examples of what that would do here. Minutes 23:00-28:00 https://youtu.be/a510m7s_SVI
Older examples from manuals of other TACC systems which use radar:
Volvo’s Pilot Assist regarding AEB/TACC.
According to Wired, Volvo’s Pilot Assist system is much the same. The vehicles’ manual explains that not only will the car fail to brake for a sudden stationary object, it may actually race toward it to regain its set speed:
“Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.”
Cadillac Super Cruise - Page 252
Stationary or Very Slow-Moving Objects
ACC may not detect and react to stopped or slow-moving vehicles ahead of you. For example, the system may not brake for a vehicle it has never detected moving. This can occur in stop-and-go traffic or when a vehicle suddenly appears due to a vehicle ahead changing lanes. Your vehicle may not stop and could cause a crash. Use caution when using ACC. Your complete attention is always required while driving and you should be ready to take action and apply the brakes.
BMW Driving Assistant Plus - Page 124
A warning may not be issued when approaching a stationary or very slow-moving obstacle. You must react yourself; otherwise, there is the danger of an accident occurring.
If a vehicle ahead of you unexpectedly moves into another lane from behind a stopped vehicle, you yourself must react, as the system does not react to stopped vehicles. """
pkulak 2021-08-16 22:57:00 +0000 UTC [ - ]
btilly 2021-08-16 23:16:29 +0000 UTC [ - ]
The problem with radar on the ground is that most of what comes back to a radar detector is reflections from the stationary world, with relative delays so small as to be undetectable. So the first step in processing is to filter out everything moving at the speed of that stationary world. All fixed objects therefore disappear, and you are left sorting out moving objects. Which means you now can't detect stationary objects at all.
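Crudely, that filtering step looks like this, which is why a parked fire truck vanishes along with the guardrails and overhead signs (toy numbers; production radar DSP is far more sophisticated):

    from typing import List, Tuple

    def moving_targets(radar_returns: List[Tuple[float, float]],
                       ego_speed_ms: float,
                       tolerance_ms: float = 2.0) -> List[Tuple[float, float]]:
        """Classic moving-target filter: anything whose range rate matches the
        ground sliding past at ego speed is treated as stationary clutter
        (signs, guardrails, bridges) and dropped -- and so is a stopped truck."""
        kept = []
        for range_m, range_rate_ms in radar_returns:
            looks_stationary = abs(range_rate_ms + ego_speed_ms) < tolerance_ms
            if not looks_stationary:
                kept.append((range_m, range_rate_ms))
        return kept

    # A stopped vehicle 80 m ahead closes at exactly our own speed, so it is
    # filtered out together with the roadside clutter; a slower-moving car
    # 60 m ahead survives:
    print(moving_targets([(80.0, -30.0), (60.0, -5.0)], ego_speed_ms=30.0))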
Tesla has a different problem. They probably don't have depth perception. They therefore have to classify objects, and make educated guesses about where they are relative to the car. Unexpected kinds of objects, or objects in unexpected configurations, fail to be classified and therefore fail to be analyzed.
In principle, Tesla can succeed. After all we don't have binocular vision past 6 meters either. Tesla is improving.
But they haven't yet.
ajross 2021-08-16 22:17:37 +0000 UTC [ - ]
It's worth checking out for sure. Not worth the headline bandwidth and flamage budget being spent on it.
sjg007 2021-08-16 21:05:11 +0000 UTC [ - ]
Loughla 2021-08-16 21:19:15 +0000 UTC [ - ]
It's actually really annoying if you live in a rural area without clearly defined lanes, and large, stationary objects (tractors and whatnot) close to the road.
nzrf 2021-08-16 21:55:37 +0000 UTC [ - ]
Additionally, the backup sensor is a tad overzealous.
xattt 2021-08-16 23:36:26 +0000 UTC [ - ]
bit_logic 2021-08-16 22:07:37 +0000 UTC [ - ]
- Car switching in/out of my lane, I manually take over
- Tight curve in the freeway, manually take over
- Very frequently check the dashboard indicator that shows if the sensors "sees" the car front or not
- Anything unusual like construction, cones, car on shoulder, manually take over
- Anything that looks difficult like weird merging lanes, manually take over
- Any bad weather or condition like sun directly in front, manual drive
- Frequently adjusting max speed setting on ACC. It's safer to not be too much above the prevailing speeds. Otherwise, if ACC suddenly becomes blind, it can accelerate dangerously as it tries to reach max set speed.
- I don't trust lane keep much, it's mostly a backup for my own steering and making my arms less tired turning the wheel
The key thing is to recognize just how dumb this technology is. It's not smart, it's not AI. It's just a bit above the old cruise control. With that mindset it can be used safely.
jumpkick 2021-08-16 22:34:27 +0000 UTC [ - ]
throwaway0a5e 2021-08-16 23:12:50 +0000 UTC [ - ]
If you use it that way it's fine. If you expect it to be as smart as a student driver it's not fine.
tshaddox 2021-08-17 01:04:22 +0000 UTC [ - ]
amanaplanacanal 2021-08-17 01:36:08 +0000 UTC [ - ]
tshaddox 2021-08-17 16:11:11 +0000 UTC [ - ]
LightG 2021-08-19 08:41:22 +0000 UTC [ - ]
I'd rather at least get the pleasure of driving, rather than basically becoming the supervisor for my car. Quarterly performance reviews, checking on KPIs.
Autopilot and the like are absolutely not on my list of features I'm looking for when buying a new car. Cruise control? Handy. AP? A waste of (my) time.
tayo42 2021-08-16 22:14:23 +0000 UTC [ - ]
ajross 2021-08-16 22:24:47 +0000 UTC [ - ]
I think that requires more numerate analysis than you're giving though. The data from the story is a sample size of 11 crashes over three years (I think). If that's really the size of the effect, then your "but [...]" clause seems very suspect.
There are almost two million of these cars on the roads now. It seems extremely likely that the number of accidents prevented by AP dwarfs this effect, so arguing against it even by implication as you do here seems likely to be doing more harm than good.
That doesn't mean it's not worth investigating what seems like an identifiable edge case in the AP obstacle detection. But that's a bug fix, not an argument about "Level 2 Autonomy" in general.
sunshineforever 2021-08-16 22:32:29 +0000 UTC [ - ]
mmcconnell1618 2021-08-16 22:18:41 +0000 UTC [ - ]
sunshineforever 2021-08-16 22:33:28 +0000 UTC [ - ]
shrimpx 2021-08-16 22:42:21 +0000 UTC [ - ]
geekraver 2021-08-17 14:06:02 +0000 UTC [ - ]
MisterTea 2021-08-17 14:20:03 +0000 UTC [ - ]
president 2021-08-16 23:00:58 +0000 UTC [ - ]
RcouF1uZ4gsC 2021-08-16 23:42:34 +0000 UTC [ - ]
Actually, from talking with friends who are first responders, many times they will park fire trucks, etc so that they are blocking enough of the road to protect the first responders and the victims. The last thing you want is to have another car come and crash into first responders or victims of the initial accident. That is why they will deliberately park the truck at an angle to protect the people.
FireBeyond 2021-08-17 03:51:33 +0000 UTC [ - ]
Larger departments or those dealing with busier freeways have even started re-purposing older engines with water ballasts and attenuators as 'blocker' engines.
asdff 2021-08-16 22:31:21 +0000 UTC [ - ]
HALtheWise 2021-08-16 22:51:56 +0000 UTC [ - ]
asdff 2021-08-17 05:40:32 +0000 UTC [ - ]
quartesixte 2021-08-16 22:33:20 +0000 UTC [ - ]
shrimpx 2021-08-16 22:45:23 +0000 UTC [ - ]
jazzyjackson 2021-08-17 05:08:00 +0000 UTC [ - ]
quartesixte 2021-08-17 07:58:10 +0000 UTC [ - ]
shrimpx 2021-08-18 18:59:50 +0000 UTC [ - ]
https://news.ycombinator.com/item?id=28198933
didntknowya 2021-08-17 04:21:58 +0000 UTC [ - ]
qweqwweqwe-90i 2021-08-16 20:54:33 +0000 UTC [ - ]
kube-system 2021-08-16 21:08:11 +0000 UTC [ - ]
https://www.sae.org/binaries/content/gallery/cm/articles/pre...
ajross 2021-08-16 22:20:10 +0000 UTC [ - ]
Good grief. This meme will not die. The car literally tells you to keep your hands on the wheel every time you engage autopilot, yells at you if you don't, will lock you out of the system as punishment if you don't comply, and if you really seem disabled will bring the car to a stop and turn the hazards on. It simply will not operate without an attentive driver, or at the very least one spending considerable energy at defeating the attention nags.
There are exactly zero Tesla drivers in the world who don't know these rules. Just stop with the nonsense. Please.
FireBeyond 2021-08-17 03:52:40 +0000 UTC [ - ]
It only yells at you now because Tesla had to be forced to make it do so. Previously it'd let you go for a quarter of an hour before checking in on you.
Good grief yourself.
vkou 2021-08-16 22:23:15 +0000 UTC [ - ]
Tesla's marketing also knows that there are exactly zero drivers in the world who follow those rules, but that doesn't stop them from overselling the capabilities of what they ship.
ajross 2021-08-16 22:26:46 +0000 UTC [ - ]
kube-system 2021-08-17 05:03:24 +0000 UTC [ - ]
ajross 2021-08-17 05:06:32 +0000 UTC [ - ]
kube-system 2021-08-17 05:43:32 +0000 UTC [ - ]
1. concede to peer pressure and/or
2. doubt of the validity or seriousness of those warnings/lockouts
kube-system 2021-08-17 04:50:40 +0000 UTC [ - ]
There are plenty of people who have been convinced that those safety features/warnings are “just there for lawyers” and have attached items to the wheel to defeat the safety lockouts in order to show off their “self driving car” to their friends.
fxtentacle 2021-08-17 09:59:49 +0000 UTC [ - ]
"Tesla driver slept as car was going over 80 mph on Autopilot, Wisconsin officials say"
0x000000001 2021-08-17 20:11:07 +0000 UTC [ - ]
evanextreme 2021-08-16 20:58:05 +0000 UTC [ - ]
paxys 2021-08-16 21:51:26 +0000 UTC [ - ]
qweqwweqwe-90i 2021-08-16 22:22:20 +0000 UTC [ - ]
breakfastduck 2021-08-16 20:59:31 +0000 UTC [ - ]
But no, just keep disregarding the clearly significant issues and mis-marketing, because progress, right?
thebruce87m 2021-08-16 22:27:28 +0000 UTC [ - ]
sillystuff 2021-08-17 00:48:20 +0000 UTC [ - ]
https://www.forbes.com/sites/bradtempleton/2020/07/28/teslas...
clifdweller 2021-08-16 21:07:03 +0000 UTC [ - ]
jacquesm 2021-08-16 20:56:27 +0000 UTC [ - ]