
Posted by doener 3 days ago

Tesla concealed fatal accidents to continue testing autonomous driving (www.rts.ch)
324 points | 203 comments | page 2
mikl 3 days ago|
Same article in German, if that’s more your thing: https://www.srf.ch/news/dialog/autonomes-fahren-wie-tesla-un...
RoxiHaidi 3 days ago||
One day an AI will obviously be infinitely better at driving than a human, but that day is not yet here.
JumpCrisscross 3 days ago||
> that day is not yet here

Have you been in a Waymo? SAE Level 4 is here, and it’s safer than humans [1].

[1] https://waymo.com/safety/impact/

nunez 3 days ago|||
Not the OP, but I have! I also have FSD v14 in my Tesla.

Vastly VASTLY prefer Waymo. It's very good at its mission and is, at minimum, infinitely better than being in an Uber rideshare. I'd rather wait 20 minutes for a Waymo than 5 for any Uber or 0 to use my own car.

Ironically, Waymo got me much more interested in using my city's public transportation offering which is much better than I previously thought.

That said, Tesla FSD v14 is the best autonomous option among supervised systems that you can actually use yourself.

gloosx 3 days ago|||
This is Waymo saying Waymo cars are safer than humans. Obviously the "it's safer than humans" claim is a selection-biased, statistically underpowered, apples-to-oranges comparison with a limited sample size.
JumpCrisscross 3 days ago||
> Obviously the "it’s safer than humans" claim is selection biased, statistically underpowered apples-to-oranges comparison with limited sample size

I haven't seen a good criticism of their methodology. If you have one, I'd be curious to hear your take.

On a more direct measure, Waymos have had starkly lower fatalities and at-risk incidents than human drivers on average and, I think, even than humans near their best.

baq 3 days ago|||
it is finitely better today and will be better still. this doesn't mean it's better at everything a human driver can do, it's just better on average. the jagged frontier is real and a very important safety consideration; nevertheless, the averages matter, too.
senordevnyc 3 days ago|||
“Infinitely” is a high bar, but Waymo is already demonstrably better than the majority of human drivers.
qsera 3 days ago||
But only in very controlled environments...
bluefirebrand 3 days ago|||
Personally I don't know if I care. Unless I can have some guarantee that the AI will prioritize my life and safety over literally any other concern, I'm not sure I would trust it

I don't ever want to be inside an AI driven vehicle that might decide to sacrifice me to minimize other damage

pmarreck 3 days ago|||
> to minimize other damage

You mean deaths to multiple other people, do you not? Let's just call a spade a spade here and point out the genuine ethical dilemma.

What ratio between "bodies of your own kids" and "other human bodies you have no connection with" should a "proper" AI controlling a car YOU purchased be willing to trade, in terms of injury or death?

I think most people would argue that it's greater than 1* (unless you are a pure rationalist, in which case, I tip my hat to you), but what "SHOULD" it be?

*meaning, in the case of a ratio of 2 for example, you would require 2 nonfamiliar deaths to justify losing one of your own kids

senordevnyc 3 days ago|||
Yeah, you also have to consider that your kids can be on either side of the equation too.
catlikesshrimp 3 days ago||
I honestly don't know if the other side of the equation is your kid being on the street when somebody else's AV causes the accident. Bonus points if the owner of the AV is not liable for the accident.
bluefirebrand 3 days ago||||
> You mean deaths to multiple other people, do you not

I mean deaths the AI predicts for other people, yes

And I'm not saying I would never choose to kill myself over killing a schoolbus full of children, but I'll be damned if a computer will make that choice for me.

AlotOfReading 3 days ago|||
I don't believe any AV software out there attempts to solve the trolley problem. It's just not relevant and moreover, actually illegal to have that code in some situations.

You can't get into a trolley situation without driving unsafely for the conditions first, so companies focus on preventing that earlier issue.

JumpCrisscross 3 days ago||||
> deaths the AI predicts for other people

Isn’t this entirely hypothetical? In reality, are any systems doing this calculus? Or are they mimicking humans, avoiding obstacles and reducing energies in a series of rapid-fire calls?

XorNot 3 days ago||
It was an entire media beat-up, because the media was too afraid to talk about anything real and the public wasn't interested.

There's plenty we could talk about: i.e. the failure scenarios of shallow reasoning systems, the serious limitations on the resolution and capability of the actual Tesla cameras used for navigation, the failure modes of LIDAR etc.

Instead we got "what if the car calculates the trolley problem against you?"

And, observationally, proof that a staggering number of people don't know their road rules (since every variant of it consists of concocting some scenario where slamming on the brakes happens far too late, but you somehow know perfectly well there's not a preschool behind the nearest brick wall or something).

I remember running some basic numbers on this in an argument, and you basically wind up at: assuming an AI is fast enough to detect a situation, either it's fast enough that it could always stop the car with the brakes, or no level of aggressive manoeuvring would avoid the collision.

Which is of course what the road rules say: you slam on the brakes. Every other option is worse, and gets even worse when an AI can brake quicker and harder, if it's smart enough to even consider other options.

saalweachter 3 days ago||
> Which is of course what the road rules are: you slam on the brakes.

Yeah, there are a shocking number of accidents which basically amount to "they tried to swerve and it went badly".

You can concoct a few scenarios where other drivers are violating the road rules so much as to basically be trying to murder you -- the simplest example is "you are stopped at a light and a giant truck is barreling towards you too fast to stop".

If you are a normal driver, you probably learn about this when you wake up in the hospital, but an autonomous vehicle could be watching how fast vehicles are approaching from behind you. There's going to be a wide range of scenarios where it will be clear the truck is not going to stop but there's still time to do something (for instance, a truck going 65mph takes around 5 seconds to stop, so if it's halfway through its stopping distance you've still got roughly 3.5 seconds to maneuver out of the way, since the first half of the distance is covered at higher speed and thus in less than half the time).
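A quick back-of-envelope check on that estimate, as a Python sketch under the simplifying assumption of constant deceleration (real braking is messier): because braking distance grows with the square of speed, the half-distance point comes well before the halfway point in time, so more than the naive half of the stopping time is still left.

```python
import math

def remaining_time_at_half_distance(v0_mph=65.0, stop_time_s=5.0):
    """Seconds left before a full stop, measured from the point where half
    the stopping distance has already been covered (constant deceleration)."""
    v0 = v0_mph * 0.44704            # mph -> m/s
    a = v0 / stop_time_s             # deceleration, m/s^2
    d_total = v0 ** 2 / (2 * a)      # full stopping distance, m
    # Solve v0*t - a*t**2/2 = d_total/2 for the time at the half-distance point.
    t_half = (v0 - math.sqrt(v0 ** 2 - a * d_total)) / a
    return stop_time_s - t_half      # works out to stop_time_s / sqrt(2)

print(round(remaining_time_at_half_distance(), 2))  # 3.54
```

So a truck halfway through a 5-second stop actually leaves about 3.5 seconds, which only strengthens the point that there is time to react.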

That does leave you all sorts of room to come up with realistic trolley problems.

JumpCrisscross 3 days ago||
> That does leave you all sorts of room to come up with realistic trolley problems

But all require a human (or malicious) driver on one hand. The more rule-following AVs on the road, the fewer the opportunities for such trolley problems.

And I'd still argue that debating these ex ante is, while philosophically fascinating, not a practical discussion. I'm not seeing a case where one would code anything further than collision avoidance and e.g. pre-activating restraints.

saalweachter 3 days ago||
Yeah, realistically the problems almost never happen and hopefully become rarer over time.

The typical human preference WRT the trolley problem ("don't take an action that leads to deaths, even if it would save more lives") is also a reasonable answer -- maybe the only reasonable one -- to these hypotheticals.

Ie, move against the light to avoid getting rear ended, but not if you're going to run over a pedestrian or cause an accident with another vehicle trying to do so. (Even if getting rear ended would push you into the pedestrian or other car.)

Timon3 3 days ago|||
The AI can also only ever predict that you might die. So how should these predictions be weighed? Say there's a group of five children - the car predicts a 90% chance of death for them, vs. 50% for you if the car avoids them. According to your comments, it seems like you'd want the car to choose to hit the children, right?

What is the lowest likelihood of your own death you'd find acceptable in this situation?
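For what it's worth, framed as naive expected fatalities (which, to be clear, no shipping AV stack is known to actually compute), the hypothetical above looks like this. Purely illustrative; the probabilities are the ones posed in the comment, not outputs of any real system:

```python
# Illustrative only: naive expected-fatality arithmetic for the hypothetical.
n_children = 5
p_child_death = 0.9      # predicted chance of death per child if not avoided
p_occupant_death = 0.5   # predicted chance the occupant dies if the car swerves

expected_if_continue = n_children * p_child_death  # 4.5 expected deaths
expected_if_swerve = 1 * p_occupant_death          # 0.5 expected deaths

print(expected_if_continue, expected_if_swerve)    # 4.5 0.5
```

A pure expected-value rule would swerve here at almost any occupant risk, which is exactly the weighing the comment is asking people to state a threshold for.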

CrazyStat 3 days ago|||
We can take the AI out of the question entirely and ask how many other humans you personally as a driver would be willing to mow down to avoid your own death—driving off a bridge, say.

I would suggest that all but the most narcissistic would have some limit to how many pedestrians they would be willing to run over to save their own lives. The demand that the AI have no such limit—“that the AI will prioritize my life and safety over literally any other concern”—is grotesque.

hermannj314 3 days ago||||
What would that guarantee look like and would it be legal to sell a product that made that guarantee?

"Prioritizing my life over every other concern" looks like plowing over pedestrians to get me to the hospital. I don't think you can legally sell a product that promises that.

JumpCrisscross 3 days ago||||
> not sure I would trust it

This is a fair concern. I’m unconvinced it’s even remotely a real market or political pressure.

On the market side, Waymo is constrained by some combination of production and auxiliaries. (Tesla, by technology.) On the political side, the salient debate is around jobs, in large part because Waymo has put to bed many of the practical safety questions from a best-in-class perspective.

bluefirebrand 3 days ago||
Sure, but what happens when the tech gains market capture and inevitably enshittifies, the same way every other piece of tech has?

I'm not really thinking about when self driving is State of the Art Research. I'm talking about when it becomes table stakes.

Honestly the real truth is I just do not trust tech companies to make decisions that are remotely in my best interest anymore.

I can't even trust tech companies to build software that respects a "do not send me marketing emails" checkbox, why would I ever trust a car driven by software built by the same sort of asshole?

JumpCrisscross 3 days ago||
> what happens when the tech gains market capture

Idk, we solve it then. Motor vehicles kill 40,000 Americans a year [1]. I'm willing to cautiously align with Google and maybe even Tesla if they can take a bite out of those numbers.

[1] https://www.cdc.gov/nchs/fastats/accidental-injury.htm

maxerickson 3 days ago||||
I find it interesting that you don't give other drivers any consideration in your analysis.
bluefirebrand 3 days ago||
Other drivers should take public transit if they don't want to / are afraid to operate their own vehicles

As for me I actually like driving and I'm good at it. I'm not afraid of operating my own vehicle like so many people seem to be

watwut 3 days ago|||
They are not afraid to operate their own vehicles. They are afraid you will kill them.

You just said that you do not care how many people you kill - regardless of whether they are pedestrians, whether they are driving cars or whether they are on the bus. That is what people react to.

maxerickson 3 days ago|||
No, I mean that they are not prioritizing you and many make poor choices.

Replacing bad other drivers with good autonomous systems is likely a great trade off for you, even if you are in an autonomous vehicle that is eager to sacrifice you if there is an unavoidable incident.

occamofsandwich 3 days ago||||
Sure, but then I don't want you to have a vehicle at all to minimize my own risk.
bluefirebrand 3 days ago||
Feel free to minimize your own risk by staying home and never leaving
occamofsandwich 3 days ago||
[flagged]
qsera 3 days ago|||
Appreciate the honesty.
rimliu 3 days ago||
Was it 2015 when HN was full of predictions that we wouldn't be driving in five years? From what I see, the serious accidents with human drivers are caused by deliberately doing the dangerous thing (in my corner of the world, mostly overtaking at the wrong place or time, or both). Besides that, humans drive very safely. Outside of tightly controlled environments, I don't see self-driving getting any better till systems have a proper world model. So, maybe never.
oblio 3 days ago||
Look, there is no way corporations would lie for their own interest. Especially when they spent tens of billions to develop something.

It's not like they sold us leaded gasoline or "healthy tobacco" for decades.

gchamonlive 3 days ago||
[flagged]
Forgeties79 3 days ago||
That’s certainly the myth musk and his compatriots repeat whenever they’re slightly inconvenienced by consideration for the broader public, yes.
philipallstar 3 days ago|||
[flagged]
post-it 3 days ago||
Usually when people provide examples, they're intended to serve as a representative sample of a larger trend, and not an exhaustive list. Hope that helps.
cj 3 days ago||
Their point still stands.

Not all companies do illegal things.

IMO it’s also a distraction to blame it on “capitalism” or some “larger trend” rather than just pointing directly at the company and people responsible.

The "system is broken" line hasn't worked for years now. Maybe if we stop blaming the system and start blaming the people?

tonyedgecombe 3 days ago|||
>Not all companies do illegal things.

The Koch brothers stopped breaking the law because it was too expensive. Instead they started lobbying to get the laws changed. This is where the idea that the system is rotten comes from.

ModernMech 3 days ago|||
No one claimed all companies do illegal things.
philipallstar 3 days ago||
All of this is a crazy overgeneralisation of the hundreds of millions of companies in the world:

> Look, there is no way corporations would lie for their own interest. Especially when they spent tens of billions to develop something.

> It's not like they sold us leaded gasoline or "healthy tobacco" for decades.

ModernMech 3 days ago|||
Saying "corporations have lied in the past for their own self interest" and then pointing to two very well known examples does not imply or over generalize that all corporations do that.

The point isn't to demonize all corporations, it's to say specifically that a pathology of some megacorporations is broadscale lying to the public about the safety of their products for personal gain.

sumeno 3 days ago|||
If I say "Ted is the Unabomber", do you think I'm saying everyone named Ted is the Unabomber? This is basic reading comprehension stuff.
philipallstar 3 days ago||
[flagged]
Forgeties79 3 days ago|||
You would be surprised how passionately people defend Tesla on HN sometimes, especially when safety records come up.
friendzis 3 days ago|||
Otherwise number go down
lotsofpulp 3 days ago|||
Liability insurance pricing tells the whole story, without clickbait articles or emotion.

If there was a significant problem, my liability only insurance premiums would be higher for the Tesla compared to a non Tesla. But they are not.

JumpCrisscross 3 days ago|||
> my liability only insurance premiums would be higher for the Tesla compared to a non Tesla. But they are not

You’re correct inasmuch as we have no evidence there is “a significant problem.” But if Tesla is hiding evidence, as this article suggests, that might just be because lawsuits are still gaining steam.

lotsofpulp 3 days ago||
Liability insurance premiums would reflect higher risk of Tesla vehicles causing collisions, regardless if Tesla is at fault or if the driver is at fault. The insurance company still has to pay, which means the Tesla owners have to pay.
JumpCrisscross 3 days ago||
> Liability insurance premiums would reflect higher risk of Tesla vehicles causing collisions, regardless if Tesla is at fault or if the driver is at fault

Why? They only pay out if you’re at fault. And if there aren’t final judgements in a deep pipeline of cases, premiums wouldn’t have a reason to adjust yet.

lotsofpulp 3 days ago||
I am assuming Tesla has been around long enough, and driven enough miles, to give insurance companies a sufficiently representative data set. I cannot imagine the pipeline of cases being so deep that people are still waiting on payments from collisions years ago.

I am also assuming that a collision involving a Tesla gets more accurate at-fault determinations than other brands, given the 6 or 7 cameras that are recording and should make determining fault easier.

Basically, if the Tesla was more dangerous to drive than a Toyota, because it was a Tesla, then insurance companies would be paying out more for insuring Teslas, and hence insurance companies would be charging higher liability only insurance premiums.

edit to respond to Forgeties79:

> The issue is they are potentially lying. It’s why we are even having this discussion. The numbers could be fraudulent

When your vehicle gets into a collision, no one contacts the auto manufacturer about who was at fault. Suppose two cars collide. The police write a report, collect evidence, maybe the drivers submit their video recordings to the insurer.

But no one is calling Tesla and asking them to determine who was at fault. And if they did, Tesla would say it never agreed to be liable and the driver should have been paying attention. There is no escaping that if it were costing insurers more to insure liability for a Tesla, they would be asking for higher premiums.

Whether or not Tesla is lying to the government or whoever is irrelevant for the goal of determining if Teslas cause more damage than other vehicle brands.

JumpCrisscross 3 days ago|||
The entire point of these articles about mounting lawsuits is those assumptions may be wrong. The liabilities involved are higher. And given Tesla is potentially mucking with the data, the exculpatory value of having all those cameras is diminished.

> if the Tesla was more dangerous to drive than a Toyota, because it was a Tesla, then insurance companies would be paying out more for insuring Teslas

You may be over-indexing on how much work liability insurers do. I have an umbrella policy. It absolutely doesn't take into account the fact that I ski and fly a plane, for example. At the end of the day, their liability is capped, and it's usually easier to weed out risk by claims history than by running models on small premiums.

lotsofpulp 3 days ago||
> The entire point of these articles about mounting lawsuits is those assumptions may be wrong.

And my entire point is I trust the incentives of the insurer to accurately price risk and determine at fault more than a publication that needs clicks.

> And given Tesla is potentially mucking with the data, the exculpatory value of having all those cameras is diminished.

Does the data from Tesla even come into play for an insurer? They need to pay the damaged parties regardless of whether or not Tesla and its software are at fault. For premium pricing purposes, what Tesla does is irrelevant until after Tesla is found liable.

In the meantime, a collision with a Tesla is the same as any other auto brand’s. I don’t think Ford/Toyota/anyone else’s software comes into play. No auto brand picks up the liability for the driver (except Mercedes in some circumstances, I think), so no automaker is in the picture for payment in the event of an individual collision.

JumpCrisscross 3 days ago||
> my entire point is I trust the incentives of the insurer to accurately price risk and determine at fault more than a publication that needs clicks

Fair enough. I agree with you in the long run. I just don't think we've seen the litigation that will define liability play out yet.

> Does the data from Tesla even come into play for an insurer?

Directly? No. At least, not unless AI actuaries make the work worth their while.

For juries calculating damages? Plaintiffs weighing whether to bring a case? Sure. That, in turn, plays into liability. And that is something insurers care about.

> In the meantime, a collision with a Tesla is the same as any other auto brand’s

In the meantime, yes. If collisions with Teslas predictably result in larger damages than with other brands, you'd expect to see more litigation when a Tesla is involved/suspected at fault, and with that, higher costs.

> No auto brand picks up the liability for the driver

Tesla has been assigned liability already [1].

[1] https://law.marquette.edu/facultyblog/2025/08/jury-awards-24...

Forgeties79 3 days ago|||
The issue is they are potentially lying. It’s why we are even having this discussion. The numbers could be fraudulent.

And unfortunately, musk has earned people’s default skeptical stance towards him.

belter 3 days ago|||
[dead]
chneu 3 days ago||
Or pushed beef that destroys the environment and gives people GI cancers while claiming the opposite.
IAmBroom 3 days ago||
I don't recall any TV ads claiming beef cured GI cancers. Maybe you should change your channel.
mnvsbl 3 days ago||
This discussion was #1 and just vanished. Why?
red75prime 3 days ago|
Because the article is sensationalized crap.
dham 3 days ago||
Here we go again. Autopilot != FSD. Autopilot is not "autonomous" driving. It's lane keeping with adaptive cruise control, the same system that Honda, Toyota, etc. have. Yes, the naming is wrong and the marketing is bad, but I don't see it as much worse than Toyota Safety Sense. If you use it to be "safe", you're going to swerve off the highway into a ditch. I used Super Cruise from GM in my friend's SUV; as soon as the lane markers went away on a bridge, I almost hit the railing.

I'll get downvoted but just giving you the facts. I'm glad the Autopilot name has been retired. Such a bad name, but maybe a good name because autopilot in planes can't see and avoid obstacles either.

idop 3 days ago||
Elon himself uses both terms interchangeably[1], and the two reportedly use the same stack, so why shouldn't we conflate the terms?

[1] https://electrek.co/2026/03/18/tesla-cybertruck-fsd-crash-vi...

iknowstuff 1 day ago||
They do not use the same stack.
ori_b 3 days ago|||
Can you explain why that makes it ok to cover up accidents and lie about the recordings of the event being corrupted?
estimator7292 3 days ago|||
How about the fact that Tesla is killing people and covering it up?

Would you go to a driver's funeral and tell their family that um, ackshully it's sparkling autopilot?

What do you think you're adding to the conversation? You're trying to distract from the fact that real, actual people have been actually killed by this.

x187463 3 days ago||
It's not a semantic issue: FSD is a completely different system, but many people mix up the terms due to poor naming. Autopilot is just cruise control and lane keeping. FSD handles navigation and full vehicle control. Articles discussing the dangers of Autopilot make perfectly reasonable claims about a system that was poorly named and marketed, but they are not meaningfully relevant to conversations about FSD.
dv_dt 3 days ago|||
The news isn't necessarily about the effectiveness of the particular tech stack, but about the integrity, or lack thereof, of the manufacturer in reporting incidents. If that is in question, any assessment of the effectiveness of Tesla's tech stacks, whether FSD, Autopilot, or the robotaxis, is in doubt.
Glemllksdf 3 days ago|||
I don't get it.

If "Autopilot" was misleading, isn't "Full Self-Driving" too?

Rohansi 3 days ago|||
Autopilot is completely different software from FSD. If you think FSD is stupid then Autopilot is worse because it won't do anything other than stay in the same lane and adjust speed to the car in front of you.

For some reason you could turn this on when you're not driving on the highway. It doesn't do anything for traffic lights, stop signs, obstacles, etc. because it's just cruise control. It's also included with every vehicle (unlike FSD).

x187463 3 days ago|||
The difference is FSD is properly annotated as (Supervised) and does exactly that. Autopilot does not 'autopilot' the vehicle by any reasonable measure.
Glemllksdf 3 days ago|||
Supervised self-driving would be correct. I don't think I was aware of the (Supervised) before your comment, tbh.
freejazz 3 days ago|||
FSD (not)
x187463 2 days ago||
FSD controls every aspect of the vehicle's operation and explicitly demands human supervision. There's nothing misleading or incorrect in the name FSD (Supervised).
freejazz 1 day ago||
Sure (let's disagree)
buellerbueller 3 days ago|||
Here we go again; Musk fanboy to the rescue!
trymas 3 days ago||
IMHO you're shifting goal posts (and I am not downvoting).

Tesla (or probably mostly Elon) was not selling "adaptive cruise control". It was selling "Autopilot" for $8k (now with a subscription, AFAIK), with a pinky promise that "soon" or "next year" or "in two weeks" (jk) you would essentially set a destination, go to sleep, and wake up at the destination [1].

It's same as saying that "LLM != AI" and arguing that "ChatGPT is not AI - it's a glorified statistics model that is good at creating human sounding texts". Yeah - you and I understand this - but the average guy most likely does not and will get burned by this, because dozen tech-bros are burning billions of dollars and try to convince everyone that it's a panacea to every problem you can think of.

[1] That's a slight exaggeration, and I won't spend time digging for quotes, but my main point is that this is what Tesla was selling to the average guy, not to nerds who can distinguish what's possible, what's working, and what levels of driving assist there are.

x187463 3 days ago||
"Autopilot" is not $8K, that's FSD. Autopilot was the default cruise control/lane keep software and was renamed "Traffic Aware Cruise Control" a few months ago. The original name was ridiculously misleading.
rvz 3 days ago||
[flagged]
x187463 3 days ago|
This article specifically mentions "Autopilot", not FSD. I'll call out Tesla for BS as much as the next person and I own no stock, but FSD (Supervised) is exactly what it says. There's no aspect of vehicle operation that isn't controlled by FSD, but it must be supervised.
dangus 3 days ago|
To pile on to this pathetic excuse for a company: anyone considering buying a Tesla should know that they are the #1 brand for fatal accidents in the United States, with over twice the accident rate of a typical automaker: https://www.roadandtrack.com/news/a62919131/tesla-has-highes...

This terrible statistic can’t just be explained by aggressive driving owners or some other factor like that. Dodge has plenty of aggressive drivers buying their 700HP V8 rear wheel drive vehicles but they have better fatal accident rates than Tesla.

I’m convinced that Tesla makes unsafe cars and covers it up wherever they can.

The crash test safety awards their vehicles have won are clearly not representative of reality.

The self-driving system Tesla offers is only “ahead” of the competition because the competition is unwilling to sell an unsafe system.

infecto 3 days ago||
Your link only suggests driver and road conditions as the things to blame. Considering the amount of power even a base model puts out, I would lean towards the driver. What they do with FSD stats is terrible, and it would be refreshing to have some unbiased looks at it. Your narrative, though, is too biased, and the link makes no connection between Tesla and responsibility for the fatalities.
dangus 3 days ago||
I am proud of my bias against fascists and trillionaires.

Tesla’s fatality statistics don’t share my bias and speak for themselves.

At some point “it’s the driver’s fault” turns into “bad vehicle design” when enough scale is applied. Tesla is not some kind of low-volume niche automaker.

If I write software where the “Save” button is small and grey and the “discard changes” button is bold and blue, it’s not the user’s fault when many of my users lose their data. I was responsible for bad design.

It was Tesla's choice to remove physical controls, make their screen as distracting as possible, make their cars accelerate faster than they need to despite competing in mainstream vehicle segments, deliver FSD in a beta state with misleading promises, remove manual door releases from the vehicle, trapping people inside, etc.

philipallstar 3 days ago|||
> Tesla vehicles have a fatal crash rate of 5.6 per billion miles driven, according to the study; Kia is second with a rate of 5.5,

Basically the same as Kia. Why are Kias so bad?

xutopia 3 days ago|||
2 reasons I can see.

Kia markets way smaller and cheaper cars with fewer safety features. Tesla, meanwhile, made front-page news at some point for supposedly being the safest car ever produced.

Tesla is giving people driving their cars a false sense of security.

philipallstar 3 days ago||
But the article doesn't say that at all - quite the opposite:

> The study's authors make clear that the results do not indicate Tesla vehicles are inherently unsafe or have design flaws. In fact, Tesla vehicles are loaded with safety technology; the Insurance Institute for Highway Safety (IIHS) named the 2024 Model Y as a Top Safety Pick+ award winner, for example. Many of the other cars that ranked highly on the list have also been given high ratings for safety by the likes of IIHS and the National Highway Transportation Safety Administration, as well.

estearum 3 days ago||||
Until recently, Kias were sub-entry level shitboxes

This would affect both driver selection and performance during impact

Slap a ridiculously powerful drivetrain on it and a premium price tag and you have a Tesla

infecto 3 days ago||||
I am sure there is a component of safety systems in a Kia but I would bet the bigger weighting is on driver profile.
dangus 3 days ago|||
[flagged]
philipallstar 3 days ago|||
> You’re so close to understanding!

Sorry, I don't understand this. I'm just asking a question. Do you reply to every question with that?

infecto 3 days ago|||
You’re missing the obvious explanation here. Driver profile. You could have the safest car around but if it’s being driven by unsafe drivers it will lead to higher accidents and fatalities.
dangus 3 days ago|||
The Model Y and Model S are far too high-volume to fall into a driver-profile issue.

Why are we looking at this from a perspective of making an attempt to exonerate Tesla first? Why not trust the data until a better explanation arises?

Basically you’re just guessing that it’s the drivers’ fault, but that’s not what the data says specifically.

infecto 3 days ago||
You’re arguing about some data from a study that came out a year ago that nobody could reproduce and has since been debunked because they were not using true vehicle miles traveled.

I am not guessing anything but suggesting that you’re making up narratives that make you feel good. I prefer truth and honesty over feeling good.

dangus 2 days ago||
Who debunked it?

Even the links given to Snopes and a Tesla employee/executive in this discussion didn’t debunk it.

The Tesla employee just said “nuh uh the denominator is bigger” as if we should believe a non-independent source.

I think it’s hilarious the lengths people will go to give Tesla the benefit of the doubt. Nobody would be questioning a study like this if Ford was the discussion topic. We would all just be saying “oh yeah, that checks out, Ford slaps together garbage vehicles, I’m not surprised.”

infecto 2 days ago||
Nobody is going to any lengths. This is an old article without any real reporting. This hit the meme cycle over a year ago. Tesla still rates high on safety from different agencies.

I have no feelings about whether it's true or not. I'm just tired of folks like yourself who have no issue perpetuating bad reporting or misinformation. It's like those comments on news articles, sad.

senordevnyc 3 days ago|||
I can get on board with the rationale that Tesla drivers are idiots.
maxcan 3 days ago|||
That study was pretty thoroughly debunked. Also, I believe it was put out by a lobbying group representing auto dealerships, who see the Tesla DTC model as a mortal threat. There is a lot of legitimate criticism to be directed towards Tesla, but the ISeeCars study "ain't it".
mzl 3 days ago|||
I've heard people saying the study is bad, but whenever I've asked why, the answers have been pretty bad. Do you have a good source for why we should disregard it?
dangus 3 days ago|||
Find a link that shows it’s debunked then? All they did was analyze federal crash data.

I don’t know what’s so hard to believe about the study. Tesla’s numbers are pretty similar to other low-performing brands.

maxcan 3 days ago||
https://www.snopes.com/news/2025/01/11/tesla-fatality-rates/

https://en.wikipedia.org/wiki/ISeeCars.com#Partnerships

https://x.com/larsmoravy/status/1860100416819855492

Looking for more. tl;dr is that NHTSA publishes accident counts but not mileage. ISeeCars has access to legacy automakers' mileage from dealership data but guessed at mileage for Teslas in the period in question. Their methodology was not released, and their mileage estimate was a fraction of the total mileage Tesla recorded over that period.
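
To see why the denominator matters so much: a per-mile fatality rate scales inversely with the mileage estimate, so undercounting miles inflates the rate proportionally. A minimal sketch with made-up numbers (not real NHTSA or ISeeCars figures):

```python
# Hypothetical illustration of denominator sensitivity (all numbers invented).
# A fatal-accident rate is fatalities divided by vehicle miles traveled (VMT),
# so the result is only as trustworthy as the mileage estimate underneath it.

def fatal_rate_per_billion_miles(fatalities: int, vmt_miles: float) -> float:
    """Fatalities per billion vehicle miles traveled."""
    return fatalities / vmt_miles * 1e9

fatalities = 200          # hypothetical fatal crashes in the period
estimated_vmt = 40e9      # mileage guessed from partial dealership data
actual_vmt = 80e9         # mileage actually driven (2x the guess)

print(fatal_rate_per_billion_miles(fatalities, estimated_vmt))  # 5.0
print(fatal_rate_per_billion_miles(fatalities, actual_vmt))     # 2.5
```

Halving the assumed mileage doubles the computed fatality rate, which is exactly the dispute here: same crash counts, different denominators, very different headline numbers.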

I do agree that Tesla could do a much better job with data transparency. But the claims of the ISC report are pretty difficult to reconcile with the crash test ratings they've gotten from many regulators across the world.

dangus 3 days ago||
So really, Snopes didn’t actually find any direct evidence that the study is wrong, per their own article, and the only way to "debunk" it is to take the word of a Tesla executive who simply asserted a bigger total-mileage denominator.

This doesn’t show that ISeeCars is biased against Tesla specifically. Certainly their methodology could be flawed, but we don’t really have anyone else debunking them directly, and we have no evidence that it’s a hit piece.

IIHS crash test scores being good are the only counterpoint, and those are synthetic in nature: crash tests in a controlled environment. They can’t test for things like whether the occupants could get out or whether Autopilot hallucinated and caused an accident that otherwise would not have happened.

post-it 3 days ago|||
For a while they were the safest car in crash tests, weren't they? Was there an inflection point where they were dropping like a rock? Or is this a case of measuring different things (crash tests vs fatal accident rates)?

I know you probably don't know off the top of your head, I'm hoping someone can chime in.

mzl 3 days ago||
Dan Luu had some interesting analysis about car safety, comparing how different auto-makers fared on newly introduced crash tests: https://danluu.com/car-safety/

The main take-away for me from that page is that very few manufacturers seem to design for actual safety (only Volvo had good results), and Tesla was angry that a new test had been introduced which feels indicative of a bad safety culture.

iugtmkbdfil834 3 days ago|||
I am admittedly not a fan, but I'll note that in my social circle nobody is considering one, the one person who has one wants to sell it, and one vendor has one (the truck) purely for marketing purposes, so at least that makes sense.
jeffbee 3 days ago|||
How do we know it can't be explained by self-selecting driver population? That sounds like the most likely explanation, and it's the only explanation advanced by the article you provided.
dangus 3 days ago|||
Who would have guessed that a vehicle with no turn signal stalk or physical control to shift gears is unsafe!

Tesla sells too many vehicles for it to be a “self-selecting driver population” thing anymore. They sell almost as many Model Ys as Honda CR-Vs.

I have a hard time believing that driver profile has anything to do with it, and I especially dislike the temptation to explain away the data by making unsubstantiated excuses for the company.

Dodge has better statistics than Tesla and they almost exclusively sell muscle cars.

post-it 3 days ago||||
I guess there's something to be said for "hey, if you're considering buying a Tesla, you may be the kind of person who's likely to die in a car crash. Consider buying a safer car or taking the bus!"
Forgeties79 3 days ago||
Reminds me of the first episode of Mad Men where the guy pitches appealing to everyone’s “inherent death wish” when selling cigarettes haha

“That’s it? If you’re gonna die, die with us?”

infecto 3 days ago|||
They don’t, these are the anti-Tesla folks. No level of reasoning is available for discussions like this.

I don’t like Elon but I also don’t think fiction and misleading stats serve anyone.

ymolodtsov 3 days ago|||
We're talking about a brand whose every car has at least 350HP, and most of them have more.

It's an apples-to-oranges comparison.

dangus 3 days ago||
So why is Dodge better on the list? Most Dodge models sold are rear wheel drive performance cars. They basically only sell the Challenger/Charger and the Hornet SUV that nobody’s buying.

The lengths people will go to defend Tesla continue to astound me. Can’t we just say that they suck without making excuses for them?

ymolodtsov 3 days ago||
Because the word "data" doesn't have magical properties? And the fact that you have some "data" that seems to contradict both the personal experience of many people and other data points is actually curious?

Like this: https://www.euroncap.com/assessments/tesla/model%2B3/1110/

Something has to be flawed or there has to be some bias.

dangus 3 days ago||
Why does something have to be flawed? Because we have to find some way to exonerate Tesla by default?

Synthetic controlled environment crash tests can’t test for a lot of fatal and documented problems with Teslas like electric door latches trapping occupants, distracted drivers due to most vehicle functions being screen-based, owners who think the vehicles are more autonomous than they really are due to misleading marketing, and other issues that crash tests just don’t catch.

I.e., they might do well once they’re in a crash but get into more crashes per mile, leading to more fatalities per mile, and a crash test can’t show that.

I can even accept that they get into more crashes per mile because they aren’t road-tripped as frequently as gasoline cars and are used in more high-fatality settings (e.g., stroads have higher fatality rates than highways). Obviously that in particular wouldn’t be Tesla’s fault.

ymolodtsov 3 days ago||
Their electric latches are stupid and should be illegal. Screens don't add distractions, certainly not more than any Uber driver juggling two phones has. People really overestimate that imaginary issue, especially when every car on the road has a screen with CarPlay now.

Carefree drivers could be the result of their marketing, for sure, but ultimately it's on them, and this is precisely the bias I'm talking about.

Look, I do drive a Model 3. I like some parts and hate others. Calling this "Autopilot" should have been illegal. But it does sound quite weird to me to insinuate they aren't safe as cars when European agencies found no safety issues with them.

dangus 2 days ago||
> Screens don't add distractions

lol. Those same European agencies you claim find the Tesla to be super safe are the ones mandating physical controls in future vehicles.

https://www.autoblog.com/news/europe-and-china-now-require-p...

Study on the distracting touch screens:

https://www.washington.edu/news/2025/12/16/video-drivers-str...

Eyes off road during level 2 autonomous driving:

https://www.sciencedirect.com/science/article/abs/pii/S00014...

Teslas can do well in synthetic crash tests and still get into more fatal accidents per mile for other reasons. The two concepts aren’t mutually exclusive.

I’d be fine with admitting that it’s because EV owners put fewer highways miles on them. But Tesla’s dismissive and combative reputation on safety along with removing simple controls like turn signal stalks isn’t doing them any favors.

I own a Mazda from their physical-controls era (sadly somewhat coming to a close with the 2027 CX-5). The idea that the median driver is an Uber driver juggling two phones is not realistic. My Mazda does not allow me to operate CarPlay with the touch screen while I’m driving. This alone stops me from messing with it. I can operate common functions by feel with physical buttons in the vehicle without looking at the screen.

I can operate the climate controls, mirror adjustments, turn signals, gear shifter, windshield wipers, air vents, drive modes, basically everything without looking completely by touch and muscle memory.

As a Model 3 owner, you have to admit your Tesla is basically impossible as a rental car for the average person who has never driven one. They’d have to take a tutorial on how to shift with a touch screen, how to adjust the mirrors, where the turn signals are, how to turn on the windshield wipers, etc.

Now imagine if every car brand was Tesla and they all had completely different touch screen software designs and everything was on the screen. I think Avis and Hertz might go out of business or have to start their own car company to compensate.

friendzis 3 days ago||
> I’m convinced that Tesla makes unsafe cars and covers it up wherever they can.

Tesla makes unsubstantiated, exaggerated claims about capabilities of their system and directly encourages unsafe behavior. How many other manufacturers encourage test subjects to drive full speed ahead into a concrete divider "to see what happens"?