Posted by doener 3 days ago

Tesla concealed fatal accidents to continue testing autonomous driving (www.rts.ch)
324 points | 203 comments
jasoncartwright 3 days ago|
Teslas turning off autopilot seconds before a crash, apparently avoiding being recorded as active during an incident, is wild https://futurism.com/tesla-nhtsa-autopilot-report
iugtmkbdfil834 3 days ago||
I think this is part of the reason I am wary of trying it ( including some of the competitor's variants ). They all want you to pay attention, because you may be forced to make a decision out of the blue. I might as well be in control all the time and not try to course correct at the literal last second.
Symmetry 3 days ago|||
SAE level 2 is just a bad idea. People can't be expected to carefully monitor a car and take over at a moment's notice when it's doing all the driving. My adaptive cruise control is great, and I hope to have a future car where I can zone out while it drives and take over after a few seconds' heads-up, but the zone in between shouldn't be a valid feature.
JumpCrisscross 3 days ago|||
I think you mean SAE Level 3. SAE Level 2 is “lane centering” and “adaptive cruise control” [1]. (Level 3 is “when the feature requests, you must drive”.)

[1] https://www.ncdd.com/images/blog/diagram.png

Symmetry 3 days ago||
I meant to include both SAE 2 and 3. I think having both lane keeping and cruise control on at the same time will tend to cause people to lose focus in a way they wouldn't if they had to do one or the other.
dualvariable 3 days ago|||
I don't even use cruise control. I like to be actively switched on all the time constantly making little decisions, including speed, so that I actually am instantly ready if I need to make some big decision.

People these days letting the car drive, thinking they can spring into action I think are underestimating just how cold their cache lines are getting and the major page fault they're going to take when they try to take over.

And I've seen comments by people that they were letting the car drive itself into a bad situation they could see developing, but didn't jump in to take over right away in anticipation (effectively betting on the car over their own skill but still realizing they had to jump in if the car got it wrong--which is just so incredibly confused).

pmarreck 3 days ago||||
Interestingly, I think that similar types of arguments are made against "agentic coding"

If you don't pay constant attention, you will never notice when it slips in a bug or security issue

catlikesshrimp 3 days ago|||
Car crash deaths are better known than software bug caused deaths. Worse: a car crash can cause the driver's death; I wouldn't offload work on which my life depends to an experimental tech.
TheScaryOne 3 days ago||
Today's car crash deaths are sometimes software bug caused deaths. Toyota failed their forensic audit of their drive by wire code back in 2013. https://capitolweekly.net/toyota-has-settled-hundreds-of-sud...
ownagefool 3 days ago|||
Sure, but you can do that in a diff after the event, rather than live.
IgorPartola 3 days ago||||
A self driving car should have no steering wheel. If it has a steering wheel it is a vote of no confidence from the manufacturer.
ghaff 3 days ago|||
I don't really buy that. There are a lot of situations (e.g. being directed to park in a space at a fairgrounds, ski area, or whatever) that you can't, AFAIK, reasonably expect to be programmed into a car's computer. Even if a car can legitimately handle roads under most circumstances, it's not going to be able to handle everything.
catlikesshrimp 3 days ago|||
"Because the Origin does not have manual controls, the NHTSA must issue an exception to the Federal Motor Vehicle Safety Standards to permit operation on public roads"

Too bad that project failed.

https://en.wikipedia.org/wiki/Cruise_(autonomous_vehicle)

butlike 3 days ago|||
I think their point was "it's not ready yet."
grog454 3 days ago||||
Throttle and yoke aren't a vote of no confidence from aircraft manufacturers. Some modes of operation are suitable for autopilot and some are not.
NekkoDroid 3 days ago|||
There is a reason that pilots are taught the ins and outs of a specific plane. Imagine the outrage if people needed to do month-long training for a specific car just to be able to drive it (and not just a general "here is how cars roughly work and the laws of the road").
sobellian 3 days ago||||
Would it be a vote of no confidence in Full Self Flying?
tclancy 3 days ago||
No, it would be an acknowledgement of the lack of perfection in human systems so far.
saalweachter 3 days ago|||
I mean, they kinda are.

Airline pilots aren't supposed to take a nap, and there are occasionally articles about the various things that have gone wrong because the pilots weren't paying attention.

tclancy 3 days ago||||
That presents an interesting failure mode challenge.
singpolyma3 3 days ago||||
Well we don't have any self driving cars outside of San Francisco. Only cars with advanced driver assistance.
lotsofpulp 3 days ago||
Quite a few more places have them now:

https://support.google.com/waymo/answer/9059119?hl=en

jerlam 3 days ago||
Also in Vegas (Zoox), and China has their own competitive market of self-driving taxis.
gambiting 3 days ago|||
How do you reverse such a car into your own driveway that's positioned in a funny way at an angle and an incline? What if you're parking off road for any reason? Like, you have to be able to manoeuvre your own vehicle sometimes.
x187463 3 days ago|||
Treat it like a driver assistance system. I treat FSD the same as I treat Augmented Cruise Control and Lane Keep Assist in my CRV. I keep my hands on the steering wheel and follow along with the decision making.
grvdrm 3 days ago|||
Reminds me of a situation not long ago.

I’m in left lane on highway. Tesla ahead of me but quite a ways away.

I realize as I’m driving that the Tesla is moving quite slow for left-lane driving. And before you say it, yes, there are lots of people speeding in highway left lanes too.

So - I passed on the right rather than tailgate. Look over and see a guy leaning back in his seat. No hands on wheel. Could’ve been asleep. And driving 10-15 mph slower than you’d expect in that lane.

To your point about using FSD the way you do, makes total sense to me. Which implies you would also cruise at the right speed depending on the lane you are in, unlike my example.

x187463 3 days ago|||
One of my major complaints about FSD is the 'speed profiles'. You used to be able to set a target speed directly. Now, you can only select a profile. You're either going the exact speed limit, 2-3mph over, or essentially 'with the flow of traffic' which can lead to speeding +15 over the limit.
grvdrm 3 days ago||
Didn't know about that feature. Thanks for the illumination. On verge of going full electric and looking at BMW, Lucid, Porsche, Rivian, Tesla.

I wonder what's taught to new drivers about this sort of situation. My intuitive feeling (driving for almost 30 years) is that you drive with the flow of traffic when traffic is present. I don't see too many left-lane drivers glued to speed limits, but it's obvious when someone is fast or slow.

x187463 3 days ago||
It's worth noting that older Teslas, pre-2024, are stuck on an old version of FSD due to compute limitations. Recent FSD generally does not hang out in the left lane and is very good at recognizing when vehicles approach from the rear. It will move to the right lane to allow them to pass.
grvdrm 3 days ago||
Excellent -- noted.
Mawr 2 days ago|||
> the Tesla is moving quite slow for the left lane driving. And before you say it, yes there are lots of people speeding in highway left lanes too.

Is that code for "the Tesla was following the law by driving within the speed limit and I don't find that acceptable" or what?

> I passed on the right rather than tailgate.

... right, since those are the only two options. Tailgating is just one of the potential valid options to choose from after all.

> And driving 10-15 mph slower than you’d expect in that lane.

So not "slower than the speed limit", but rather "slower than you'd expect". Sigh.

grvdrm 1 day ago||
I won't comment on whether it's acceptable to speed or not. I don't think that's the point.

Most highways I drive on exhibit a predictable pattern. Slower folks in right lane. Faster folks in left lane. Maybe those slower folks are at the speed limit, or above, or below. Left lane folks somewhat faster.

Should everyone obey the speed limit? Sure! Hard to argue that point.

My observation was a Tesla driving at - let's call it "right lane speed" in the left lane. Maybe slower. Slow enough that you'd soon see a predictable back-up behind the car - some tailgating, brake usage, etc. The stuff that in my view leads to more accidents, swerving, and phantom traffic that occurs when people pile on each other, use brakes excessively, and end up slowing to a crawl.

FWIW: The "is speeding acceptable" question is somewhat resolved by police. I rarely see people pulled over for speeding within the flow of traffic, vs. somewhat swerving in/out or just driving much faster than everyone.

Don't remember the last time I saw an officer pick a car out of a normally flowing left lane to issue just that one driver a ticket.

throwanem 3 days ago||||
Real question, then, from someone who only bothers driving when he must and even then in a 2016 model: Why do you use it? What beneficial purpose do you find it to serve?

I'm asking because I feel I must be missing something, inasmuch as to have my hands on the wheel while not controlling the car is an experience with which I'm familiar from skids and crashes, and thinking about it as an aspect of normal operation makes the hair stand up on the back of my neck. (Especially with no obviously described "deadman switch" or vigilance control!)

x187463 3 days ago||
Here's a simple example from last week. FSD was in control on my way to work, stopped at a red light early in the morning before the sun was up. The light turns green and FSD does not accelerate. I figured it was somehow confused and I was starting to move toward hitting the accelerator myself when a car comes flying through the red light from the driver's side. I hadn't noticed this car, but FSD saw it and recognized it wasn't slowing down. I could see there were headlights, but it wasn't clear how fast it was going.

It's just nice having a 'second set of eyes' in a sense. It's also very useful when driving in unfamiliar cities where much of my attention would be spent on navigation and trying to recognize markings/signs/light positions that are atypical. FSD handles the minutiae of basic vehicle operation so I can focus on higher-level decisions. Generally, at inner-city speeds, safety and time-to-act are less of an issue, and it just becomes a matter of splitting attention between pedestrians, obstacles, navigation, etc. FSD is very helpful in these situations.

throwanem 3 days ago|||
Huh.

I appreciate your thoughtful and detailed response. I'll need to think about it for a while, too. It had not occurred to me to consider the possibility that someone else's FSD might protect me from the general incompetence and unreliability of amateur motor vehicle operators.

(Jumping a light in the dark? Not thinking or learning to navigate by verbal instructions from your satnav or phone, instead of compromising the primary sense you must constantly use to drive without risking manslaughter? I'm sorry, but if this is the standard, I really can't describe it other than it is...to say nothing of your considering safety less important, as you say, in the "inner city" that is my home.)

x187463 3 days ago||
> Jumping a light in the dark?

I don't know what this means.

> Not thinking or learning to navigate by verbal instructions from your satnav or phone, instead of compromising the primary sense you must constantly use to drive without risking manslaughter?

Navigating involves reading street signs, block numbers, and traffic markings. These are all visual elements that can distract from safety monitoring. How many minor accidents result from drivers trying to figure out where they are, or where they need to go?

> I'm sorry, but if this is the standard, I really can't describe it other than it is...to say nothing of your considering safety less important, as you say, in the "inner city" that is my home.

My claim isn't that safety is less important in city driving, it's that driving is far safer due to lower speeds. There's more time to react and lower risk of catastrophic results when driving at 35mph. The challenge for a driver isn't sudden loss of control as you may experience at 65+mph. The city driving challenge is trying to track markings, signage, pedestrians, and parked cars while also navigating and managing the vehicle's basic operation. FSD can track all of that without distraction and leave the driver responsible for more human reasoning tasks.

throwanem 3 days ago||
> I don't know what this means.

You failed, in this case by hastening to cross the intersection as soon as the light came green, to account for the possibility of another driver's error. If you weren't taught to do that, as I was, then the mistake is not entirely your own. It was still a mistake, which you have already acknowledged would have led you into an accident had your vehicle not rescued you.

> There's more time to react and lower risk of catastrophic results when driving at 35mph.

Not for me. You're the one wearing power armor, remember.

x187463 2 days ago||
> You failed, in this case by hastening to cross the intersection as soon as the light came green, to account for the possibility of another driver's error. If you weren't taught to do that, as I was, then the mistake is not entirely your own. It was still a mistake, which you have already acknowledged would have led you into an accident had your vehicle not rescued you.

Even if we accept your interpretation of the situation as true, you're making the case for FSD. You can think of FSD (or other self-driving solutions) as raising the floor for bad drivers. If I'm a driver with some otherwise dangerous habits (nobody is perfect) then FSD is filling the gaps in my skill.

> Not for me. You're the one wearing power armor, remember.

But this is a joint interaction between the pedestrian and the vehicle. I can't make the pedestrian more aware. I can't give the pedestrian super-human reaction time. I can, however, give those traits to the vehicle. That's a major selling point for autonomous vehicles.

throwanem 2 days ago||
Well, sure. As I said yesterday, it hadn't previously occurred to me to think of someone else's FSD as helping keep me safe from them. (Thank you again, by the way, for helping me put two and two together on that!)

As a pedestrian, I don't need superhuman reaction time, because unlike some I move at speeds a human mind can comprehend. Nor, I promise you, need I be "more aware" - what a frankly foolish thing to say, when there is nothing on Earth even remotely as dangerous to me, day to day, than you and those like you! I assure you, I am about as aware as it is possible to be. I have to be! Look at you.

But this again is a splendid illustration of the problem, for which I again must give my gratitude: the old-school motorhuckle lifestyler dingbats were right all along, it turns out, to call cars "cages." You carry yours around in your head all the time, I see.

grvdrm 3 days ago|||
Glad you're ok!

I was watching the Tesla display on my way back home from LaGuardia airport last week (passenger, not driver).

No accidents or close calls, but it was obvious that I might be focused on 1 or 2 things in that very busy and chaotic environment whereas the car (FSD or otherwise) sees more than 2 things and possibly avoids something on my behalf.

throwanem 3 days ago||
> I might be focused on 1 or 2 things in that very busy and chaotic environment...

...so you hired a professional to do that job for you, instead of risking the wellbeing of everyone nearby. This was the correct decision!

XorNot 3 days ago|||
Which is just worse.

When I'm driving I know what I'm doing, what I'm planning to do and can scan the road and controls with that context.

Making me have to try and guess what the car is going to do at any given time is adding complexity to the process: am I changing lanes now, oh I guess I am because the autonomy thinks we should etc.

grvdrm 3 days ago|||
Not sure about your car but the car I have with augmented cruise requires hands on wheel. Turns off otherwise. (Volvo XC90)

I agree that there are situations where what I do as a trained driver is different from augmented cruise.

A good example (or perhaps I'm wrong) is this: I'm in a lane, and a car pulls into the lane in front of me, between me and the car further ahead. Now I don't have enough space between me and that new entrant. But instead of using the brakes (unless it's egregious), I bleed speed until I make the space I want. Augmented cruise doesn't do that - it hits the brakes.

So, from behind, I think it looks like I'm using my brakes a lot more than I am when on augmented cruise. And excessive brake use distracts the driver behind me.

x187463 3 days ago|||
Sure, but the practical experience is that FSD is fairly predictable. It's just a matter of personal preference that comes from experience. I wouldn't impose a system like FSD on everybody.
FeloniousHam 3 days ago||
I'm a >90% FSD user, and I approve this sentiment. My wife hates it for the mistakes it makes (e.g. there seems to be a recent shadow-recognition regression) and "errors in judgement" (not getting in the turn lane in a timely manner); she would never use it on her own.

I've got plenty of experience, and (feel as though) I know most of its failure points. I had to drive my 30-minute commute last week, and it was decidedly unfun. I have seen the future and I don't want to go back.

x187463 3 days ago||
96% here, including DC and Baltimore. Besides the bizarre navigation choices and waiting too long for lane changes, FSD has reached essentially zero interventions outside of bad mapping situations. I really wish Tesla would use better map data, for sure.
d1sxeyes 3 days ago|||
To be fair, that report says

> the self-driving feature had “aborted vehicle control less than one second prior to the first impact”

It seems right to me that the self-driving feature aborts vehicle control as soon as it is in a situation it can’t resolve. If there’s evidence that Tesla is actively using this to “prove” that FSD is not behind a crash, I’m happy to change my mind. For me, probably 5s prior is a reasonable limit.

idop 3 days ago|||
It's an insane reversal of roles. In a standard level 2 ADAS, the system detects an impending collision the driver has not responded to and pumps the brakes. Tesla FSD does the reverse: it detects an impending collision that it has not responded to, and shuts itself off instead of pumping the brakes. It's pure insanity.

Also, Tesla routinely claims that "FSD was not active at the time of the crash" in such cases, and they own and control the data, so it's the driver's word against theirs. They most recently used this claim for the person who almost flew off an overpass in Houston because FSD deactivated itself 4 seconds before impact[1]. They used it unironically as an excuse why FSD is not at fault, despite the fact that FSD created the situation in the first place.

[1] https://electrek.co/2026/03/18/tesla-cybertruck-fsd-crash-vi...

d1sxeyes 3 days ago||
AEB is enabled even when FSD is off, which sounds like the L2 ADAS behaviour you're describing. Just because FSD disengages, it doesn't mean that no other Collision Avoidance Assist features are operating.

> because FSD deactivated itself 4 seconds before impact

This isn't accurate. The driver deactivated FSD 4 seconds before impact. Don't get me wrong, the video looks pretty much like FSD wouldn't have been able to do anything better than the driver did, but she didn't give it a chance.

superxpro12 3 days ago||||
IDK, this has the same unethical energy as police turning off body cameras.

In the BEST CASE, this is a confluence of coincidences: engineering knows about this and leaves it "low prio, won't fix" because it's advantageous for metrics.

In the worst case, this is intentional.

In any case, the "right thing to do" is NOT turn off the cameras just before a collision, and yet it happens.

This is also Safety Critical Engineering 101. Like.... this would be one of the first scenarios covered in the safety analysis. Someone approved this behavior, either intentionally, or through an intentional omission.

JumpCrisscross 3 days ago||
> the "right thing to do" is NOT turn off the cameras just before a collision

Source for autopilot being disabled “seconds before a crash” also disabling cameras? (Sorry if I missed it above.)

onemoresoop 3 days ago||||
This is a policy that Tesla put in place, period. Handing control to the driver suddenly, at a weird moment, can make the whole situation even more dangerous, as the driver is not primed to handle it on the spot - it's all too unexpected.
scott_w 3 days ago|||
Yep, your comment reminds me of a time my mother was about to hit a bird in the road. However, she was too busy arguing with the passenger to notice, and her driving was starting to become erratic already. I decided not to tell her because I knew that the shock could cause her to do something more drastic, like crash the car trying to avoid it.
boringg 3 days ago|||
I guess i'll step in for the counter.

How is a car supposed to pre-empt when it is in a situation that is too challenging for it to navigate? Isn't it the driver who should see a situation that looks dicey for FSD and take control?

onemoresoop 3 days ago|||
Maybe the car should not have this dangerous feature in the first place? Or maybe train drivers thoroughly and frequently, so that when this situation arises it becomes less dangerous.

It seems to me FSD for Tesla is not ready to go into Prod as it is now.

throw0101a 2 days ago||||
> Isn't it the driver who should see a situation that looks dicey for FSD and take control?

How does a driver judge what is and is not "dicey" from the FSD's perspective?

If you don't have confidence in FSD, then you wouldn't use it in the first place. If you do have confidence, then why (and how often) would you ever take over?

Is there some kind of 'confidence gauge' that the FSD displays in how well it thinks it can handle the situation? If there is/was, perhaps the driver could see it dropping and prime himself to take over.

tremon 3 days ago|||
> How is a car supposed to pre-empt when it is in a situation that is too challenging for it to navigate?

By anticipating further ahead. If it finds itself in a situation that it can't get out of, it means it should have made more defensive choices earlier, or relinquished control earlier. And if it doesn't have either the reasoning capacity or the spatial-awareness data to do that, it is not fit for general usage and should be pulled.

boringg 3 days ago||
Was this case FSD, or was this the earliest-generation technology? And does this still happen?

I agree you're right - that's what you'd expect to happen.

x187463 3 days ago||||
This is reasonable, and you have to imagine many collisions involve the driver taking control at the last second causing the software to deactivate. That being said, this becomes a matter of defining a self-driving collision as one in which self-driving contributed materially to the event rather than requiring self-driving be activated at the exact moment of impact.
kombookcha 3 days ago||
Agreed. I also feel like there is a world of difference between the driver deliberately assuming control at the last second because they notice that an accident is about to happen, and the car itself yielding control unprompted because it thinks an accident is about to happen.

The former is to be expected. The latter seems likely to potentially make an already dangerous situation worse by suddenly throwing the controls to an inattentive driver at a critical moment. It seems like it would be much safer for the autopilot to continue doing its best while sounding a loud alarm to make it clear that something dangerous is happening.

x187463 3 days ago||
> It seems like it would be much safer for the autopilot to continue doing its best while sounding a loud alarm to make it clear that something dangerous is happening.

This is essentially what FSD does, today. When the system determines the driver needs to take over, it will sound an alert and display a take-over message without relinquishing control.

bena 3 days ago||||
So, the car puts itself in a situation it can't resolve, then just abdicates responsibility at the last moment.

That's still not a good look.

And it does mean that FSD isn't to be as trusted as it is because if the car is putting itself in unresolvable situations, that's still a problem with FSD even if it isn't in direct control at the moment of impact.

scott_w 3 days ago|||
The few Tesla post-mortems I’ve read early on stated that FSD turned off before impact and used this as a defence to their system. If they shared that this happened 1 second before impact (so far too late for a human to respond), I’d have sympathy. I have never read a Tesla statement that contained this information.

For normal incidents, 2 seconds is taken as the response time to be added before corrective action (avoidance, braking) takes effect. I'd expand this for FSD because it implies a lower level of engagement, so you need more time to reengage with the car.
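For a sense of scale, here's a quick sketch of how much road a car covers during a response window before any corrective action begins (the speeds and windows below are my own illustrative numbers, not figures from this thread):

```python
# Rough distance covered at constant speed during a driver's response
# window, before any braking or steering correction takes effect.
def distance_covered(speed_mph: float, response_s: float) -> float:
    """Feet traveled at a constant speed during the response window."""
    feet_per_second = speed_mph * 5280 / 3600
    return feet_per_second * response_s

# Illustrative windows: alert driver (~1s), the 2s rule of thumb
# mentioned above, and a longer re-engagement delay.
for window in (1.0, 2.0, 5.0):
    print(f"{window:.0f}s at 65 mph: {distance_covered(65, window):.0f} ft")
```

At 65 mph, even the standard 2-second window works out to roughly 190 feet of travel, which is why a disengagement one second before impact leaves a human no realistic chance to respond.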

Bluestrike2 3 days ago|||
Disregarding the fact that NHTSA findings apparently contradict it (though that may just be a more recent change than the 2022 report), Tesla claims to use five seconds before a collision event as the threshold for their data reporting on their FSD marketing page:

> If FSD (Supervised) was active at any point within five seconds leading up to a collision event, Tesla considers the collision to have occurred with FSD (Supervised) engaged for purposes of calculating collision rates for the Vehicle Safety Report. This approach accounts for the time required for drivers to recognize potential hazards and take manual control of the vehicle. This calculation ensures that our reported collision rates for FSD (Supervised) capture not only collisions that occur while the system is actively controlling the vehicle, but also scenarios where a driver may disengage the system or where the system aborts on its own shortly before impact.[0]

In theory, that should more than cover the common perception-response times of roughly 1 to 1.5 seconds used as a rule of thumb for most car accidents. But I'm quite curious what research has been done on the disengagement process as driver-assistance systems return control to the driver, and its impact on driver response times and overall alertness.

If drivers trust the car to handle braking and steering for them, are we really going to see perception-response times that low, or have we changed the behavior being measured? Instead of timing a direct response to a stimulus, we're now including the time required to re-engage their attention (even if they're nominally "paying attention"), transition to full control of the vehicle, and then react to the stimulus they're now barreling down on.

For that matter, this approach makes the implicit assumption that pressing the brake pedal or turning the steering wheel is a sign of now-active control and awareness. Is it? Or could it just be a sort of instinctual reaction? I've been in the passenger seat when a driver has slammed on the brakes, only to find myself moving my right foot as if to hit an imaginary brake pedal, even knowing I obviously wasn't the one driving. Hell, I remember my mom doing that back when I was learning to drive, during normal braking.

0. https://www.tesla.com/fsd/safety#:~:text=within five seconds
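The attribution rule quoted above is simple enough to sketch. This is a hypothetical illustration of the stated policy only; the function name, parameters, and data shapes are mine, not Tesla's actual telemetry schema:

```python
from typing import Optional

# Sketch of the attribution rule described on Tesla's safety page: a
# collision counts as "FSD engaged" if the system was active at any
# point within the 5 seconds before impact.
ATTRIBUTION_WINDOW_S = 5.0

def fsd_engaged_for_report(disengage_time_s: Optional[float],
                           impact_time_s: float) -> bool:
    """True if FSD never disengaged, or disengaged within the window."""
    if disengage_time_s is None:  # still active at the moment of impact
        return True
    return impact_time_s - disengage_time_s <= ATTRIBUTION_WINDOW_S

# A disengagement "less than one second prior to the first impact"
# (the NHTSA wording quoted upthread) would still be attributed to FSD:
print(fsd_engaged_for_report(disengage_time_s=99.2, impact_time_s=100.0))
```

Under this rule, the 1-second-before-impact disengagements discussed in the thread would still count against FSD in the Vehicle Safety Report figures, which is what Tesla's page claims.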

plqbfbv 3 days ago|||
It's been well known for a while now, and it's not to avoid being recorded as active; it's to prevent a possibly damaged computer from continuing to operate in a likely compromised situation. What happens if the car crashes and flips, AP/FSD has no training on that, and the wheels keep spinning at full speed while first responders try to secure the car?

AEB should still be working to pump the brakes AFAIK, but auto-steer and cruise control will be disabled, even while the computer and electronics are still perfectly operational, to make the car safer for the passengers and first responders after the event.

EDIT: IIRC the threshold for disengagement is 1s.

palmotea 3 days ago||
>> Teslas turning off autopilot seconds before a crash, apparently avoiding being recorded as active during an incident, is wild https://futurism.com/tesla-nhtsa-autopilot-report

> It's well known for a while now, and it's not to avoid recording being active, it's to avoid a possibly damaged computer to keep working in a likely compromised situation. What happens if the car crashes and flips, AP/FSD has no training on that, and wheels keep spinning at full speed while first responders try to secure the car?

That sounds like an ass-covering justification. There may be a good reason for triggering some kind of interlock to prevent the problems you outlined, but if their implementation 1) also stopped recording seconds before a crash or 2) they publicly claimed it wasn't responsible since it turned itself off, then Tesla is behaving unethically and dishonestly.

plqbfbv 3 days ago||
I'm just stating what I remember, I'm not trying to defend Tesla.

For 1), it's the first time I hear it from a technical point of view - Tesla's dashcam records the last 10 minutes continuously, and should save the data on the internal computer in case of a crash and send it back to Tesla if feasible, AFAIR (I'm an owner). IIRC it's not the first case, though, where Tesla claimed the data wasn't available or was corrupted, and then it was actually recovered some time later after pressure from the authorities. So I think technically the data is there, but I also believe Tesla is behaving unethically and dishonestly to cover up or delay retrieval.

2) I often hear it as FUD, as in: AP/FSD was off, the user just disengaged it by accident, wasn't accustomed to it, or just didn't know how it worked. AFAIR most of the accidents had the data released, and it showed some of the following: user touched the steering wheel and disengaged autosteer/FSD (whether knowingly or by accident), user was pressing the accelerator pedal by accident, user was pressing the accelerator instead of the brake, etc.

ymolodtsov 3 days ago||
Tesla has a very bad track record in terms of both compliance and disclosure when it comes to autonomy incidents.
boringg 3 days ago|
Did you find the article lacked any real numbers related to the claims? It was a bit weird that the information was so vague.

Individual tragic anecdotal incidents aside, the vagueness of the article really dilutes the merit of the claims.

culi 3 days ago||
[flagged]
boringg 3 days ago||
[flagged]
doener 3 days ago||
The article was also published in German: https://www.srf.ch/news/dialog/autonomes-fahren-wie-tesla-un...
raverbashing 3 days ago|
It's the Swiss national radio/tv service, they probably have the article in 4 languages or more
Geee 3 days ago||
This is about the old autopilot, not FSD, and there doesn't seem to be anything new in the article. This is based on the same leaked data which has been public since 2023. The title seems to be inaccurate, as there's nothing to indicate that they hid fatal accidents.
93po 3 days ago|
[flagged]
adev_ 3 days ago||
So... for a bit of context on the video and the article:

- The documentary is from RTS, the main publicly owned media outlet in Switzerland. They are not the typical European publicly owned media: they are generally pretty well funded (contrary to most), tend to produce good, high-quality content, and tend to be independent and rather neutral (leaning slightly to the left, politically speaking).

- The video is in French because, in Switzerland, public media is divided into branches for the regional languages: RTS for French, SRF for German, and RSI for Italian. That's why there is also a German translation.

- They are generally pretty cooperative and open-minded. If one of you wants to submit English subtitles, just contact them; they might accept (I don't promise anything).

limbero 3 days ago|
Sorry, but you seem to be implying that European publicly owned media outlets are not normally to be trusted. Why?

I started out writing a list of European countries with high quality public broadcasters, but the comment started looking silly since the list quickly grew very long.

outime 3 days ago|||
I've lived for many years in two large European countries, and in both cases I found the public broadcasters hard to trust. Perhaps you have deep, first-hand knowledge of multiple European countries, but in my experience they take too much money and are heavily biased. For that reason I'd prefer there to be no public broadcast companies, at least so my tax money doesn't support manipulation. In 30+ years, I've never encountered a truly neutral public broadcaster in Europe, though I'm sure there may be exceptions.
spiderfarmer 3 days ago|||
In my country I judge them purely by what they do and say in the sectors I know a lot about, and the facts they bring are mostly correct.

Also, they don't tout a single party line.

Gud 3 days ago|||
Which countries?
paganel 3 days ago||||
The national broadcaster here in Romania has leaned politically toward whoever was paying the bills, hence toward whoever holds political control over the country.

I can say the same about the foreign bureaus of state-owned media outlets like Deutsche Welle and Radio France Internationale; both actively rooted for the Romanian presidential candidate seen as closer to German and French interests (I'm talking about the last couple of rounds of Romanian presidential elections).

u_sama 3 days ago||||
They have left-leaning biases: RTVE is basically a propaganda channel for the PSOE at this point, and France Info/France 2 have center-left biases, which makes them neither neutral nor representative of society as a whole. They are all well funded, though.
orwin 3 days ago||
France Info is really right wing in my opinion, probably the most economically liberal public radio station in France. Even calling it center-right doesn't do it justice. Maybe borrowing from historians and calling it 'extreme center' is the best way to place it on the political spectrum.
u_sama 3 days ago||
I mixed up France Info and France Inter there. France Info is actually neutral, meaning it features both right-wing and left-wing voices, which I like.
adev_ 3 days ago||||
> Sorry, but you seem to be implying that European public owned media outlets are not normally to be trusted. Why?

The quality of European publicly owned media is highly country-specific and varies quite a lot:

- Some are critically underfunded, and it shows (a tendency toward cheap sensationalism, superficial investigation, or recycled content).

- Some are politically captured (left or right) or controlled through direct or indirect government involvement.

But all considered, I would say the average is still an order of magnitude better in terms of content quality and independence than the average private media outlet.

spiderfarmer 3 days ago|||
You're probably responding to a Swiss person who lives in the USA.
pmarreck 3 days ago||
Are these still accidents where the driver was not paying attention, though?
singpolyma3 3 days ago|
Of course. But the argument is that the nature of FSD causes drivers not to pay attention.
zulgin 3 days ago||
Look, I dislike Tesla as much as the next person; I think it is wildly over-hyped and over-valued. But this article is just slop.

The headline says "How Tesla hid accidents to test its Autopilot," but the actual article offers no explanation of (1) how Tesla hid anything or, for that matter, (2) from whom Tesla hid this information.

It mashes together a Tesla data leak from 2022 and an unconnected lawsuit from 2026 without ever explaining how the two are connected.

Tesla has a pattern of making deceptive promises and deceptive disclosures but this article doesn't make that case at all.

Lerc 3 days ago||
>Tesla has a pattern of making deceptive promises and deceptive disclosures but this article doesn't make that case at all.

This is something I find frequently as well, more so with Musk-related things than with Tesla. Lord knows there are plenty of things to be critical of.

If investigative journalism wants to regain the respect it once had, a few allegations backed by concrete claims serve both the public and faith in the media better than large quantities of vague ones.

I admit that if you want to sway public opinion, the latter is more effective, but it is also a mechanism that doesn't require alignment with the truth. When that approach is normalized, it opens the door for anyone to shove popular opinion around.

HFguy 3 days ago|||
After you wrote this, I went and read the article, and I didn't see much there either. I wonder why you are getting downvoted. And to be clear, I'm also not a Tesla fan (the truck is dumb).
tiberriver256 3 days ago||
Thanks
kotaKat 3 days ago||
Hot take, but I feel like Tesla owners (hell, anyone with an 'autonomous driving' vehicle) need to see some kind of modern lecture based on the 'Children of the Magenta' talk on automation dependence in aircraft. Mandatory, before you can switch the system on.

FSD has produced this generation's newest children of the magenta line.

https://www.youtube.com/watch?v=5ESJH1NLMLs

meindnoch 3 days ago|
>Tesla owners (hell, anyone with 'autonomous driving' vehicles)

Or LLM users.

mnvsbl 3 days ago|
Full report here (video): https://www.rts.ch/play/tv/temps-present/video/tesla-la-face...