alt.hn

2/20/2026 at 5:00:18 PM

Tesla has to pay historic $243M judgement over Autopilot crash, judge says

https://electrek.co/2026/02/20/tesla-has-to-pay-historical-243-million-judgement-over-autopilot-crash-judge-says/

by jeffbee

2/20/2026 at 7:02:56 PM

I’m not usually an apologist, and I’d agree with this judgement if the car had been left to its own devices, but the driver of the car held his foot on the accelerator, which is why it blew through those stop signs and lights.

As for the Autopilot branding, would a reasonable person expect a plane on autopilot to fly safely if the pilot suddenly took over and pointed it at the ground?

by tass

2/20/2026 at 7:09:48 PM

The average person does not know how to fly a plane or what a plane autopilot does. It's a ridiculous superficial comparison. Planes have professional pilots who understand the capabilities and limits of aviation autopilot technology.

Tesla has had it both ways for ages - their stock price was based on "self-driving cars" and their liability was based on "asterisk asterisk the car cannot drive itself".

by jrjeksjd8d

2/20/2026 at 10:21:53 PM

Autopilots in airplanes are kind of dumb (they keep heading, speed, and altitude, and won’t do much of anything else), which is why Tesla doesn’t use the name as branding for its full self-driving software. People at least know that much.

But then again even on HN people like parent think that autopilot is the same as full self driving, when it is and always has been just smarter cruise control. The payout was for autopilot (a feature that most new cars have these days under various names), not full self driving.

by seanmcdirmid

2/20/2026 at 7:21:12 PM

By your analogy, certified pilot = certified driving license holder. It's not like Tesla is advertising that someone without a driving license, or an ineligible person, can drive using Autopilot. I wonder how you can even justify your statement.

by nitinreddy88

2/20/2026 at 7:34:13 PM

Autopilot is part of a private pilot's license, and the systems are approved by the FAA. Tesla's Autopilot isn't part of a driving license, nor did it undergo review by the NHTSA prior to launch, because Elon considered it "legal by default".

by tapoxi

2/20/2026 at 7:16:01 PM

If the average person does not know what an autopilot does, why would they expect Tesla's 'autopilot' to take such good care of them? I am reminded of a case many years ago when a man turned on the cruise control in his RV and went to the back to make himself lunch, after which the RV went off some sort of hill or cliff.

Rudimentary 'autopilots' on aircraft have existed for about a century now, and the earlier versions (before transistorization) only controlled heading and attitude (if conditions and other settings allowed it), with little indication of failure.

by nickff

2/20/2026 at 9:40:04 PM

This would be more like they enabled cruise control, hit the brakes, and sued the manufacturer because they were rear-ended.

by tass

2/20/2026 at 8:36:07 PM

> If the average person does not know what an autopilot does

The average person does know what an autopilot does, they're just wrong.

I think the example you provided supports that.

by D-Coder

2/20/2026 at 9:46:24 PM

Not sure what "autopilot" means in a car. Is the self-parking feature called "landing gear"?

by zadikian

2/20/2026 at 9:54:10 PM

The original judgement held that the driver was 2/3 responsible, Tesla 1/3 responsible, which seems reasonable. The $243 million wasn't for causing the accident, but was a punitive amount for doing things that looked an awful lot like lying to the court and withholding evidence.

by Starman_Jones

2/20/2026 at 10:03:20 PM

This makes a lot of sense and makes the verdict seem reasonable, thanks for providing the context.

by carefree-bob

2/20/2026 at 7:56:35 PM

A “reasonable person” in a cockpit is not the same as a “reasonable person” behind the steering wheel.

Pilots undergo rigorous training with exam after exam they must pass.

No one is handed the keys to a Boeing 747 after some weekly evening course and an hours driving test.

by Gud

2/20/2026 at 9:37:10 PM

I don't mean a reasonable pilot. Would a reasonable person expect autopilot in a plane to prevent the plane from crashing into something the pilot was accelerating toward while physically overriding the controls? The claim is that Autopilot should not have been able to crash even with the driver actively overriding it and accelerating into that crash.

To me, it's reasonable to assume that the "autopilot" in a car I drive (especially back in 2019) is going to defer to any input override that I provide. I wouldn't want it any other way.
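
The deference being described here is just input arbitration. A toy sketch of the override behavior (purely hypothetical; the function name, threshold, and logic are illustrative, not Tesla's actual control code):

```python
def arbitrate(driver_accel, assist_accel):
    """Toy pedal-override arbitration for a driver-assist system.

    If the driver is pressing the accelerator, the human command wins
    and the assist system's deceleration request is ignored. This
    mirrors how 2019-era assist systems are commonly described, but
    the behavior here is an illustration, not Tesla's implementation.
    """
    if driver_accel > 0.0:
        return driver_accel   # human override: assist cannot slow the car
    return assist_accel       # otherwise follow the assist command
```

With a floored accelerator, `arbitrate(1.0, -0.5)` returns the driver's command, which is exactly the "defer to my input" behavior described above.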

by tass

2/20/2026 at 6:25:19 PM

This case will make settlement amounts higher, which is the main thing car companies care about when making decisions about driving features/marketing.

With Robotaxi it will get even higher as it will be clear 100% the company's fault.

by xiphias2

2/20/2026 at 6:41:56 PM

Fight Club 2.0: You pay to retrain it only if the AI will kill more people than our settlement fund can pay out.

by 1970-01-01

2/20/2026 at 6:59:20 PM

You're already downvoted, but this quote from Fight Club always annoyed me as it misunderstands how recalls work.

1. Insurance companies price in the risk, and insurance pricing absolutely influences manufacturers (see the absolute crap that the Big 3 sold in the 70s)

2. The government can force a recall based on a flaw whether or not the manufacturer agrees

by coredog64

2/20/2026 at 7:02:10 PM

v2.0- Tesla drivers insure with Tesla and the recalls are all OTA software fixes.

by 1970-01-01

2/20/2026 at 5:35:05 PM

Tesla Apologists: The judge/jury agreed that Tesla was "Full Self Driving" all the way to the scene of the crash.

by jqpabc123

2/20/2026 at 6:39:06 PM

If I read the article it says autopilot, not FSD.

by dekhn

2/20/2026 at 6:57:01 PM

> If I read the article it says autopilot, not FSD.

What's the difference? And does it matter?

Both are misleadingly named, per the OP:

> In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”

> Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.

> This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.

by palmotea

2/20/2026 at 7:12:09 PM

Autopilot is similar to cruise control that is aware of other cars, and lane keeping. I would fully expect the sort of accident that happened to happen (drop phone, stop controlling vehicle, it continues through an intersection).
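
That behavior can be sketched as a simple gap-keeping loop. This is a hypothetical illustration of the control idea, not Tesla's actual algorithm (the gains, the two-second gap policy, and the function name are all made up):

```python
def cruise_step(own_speed, set_speed, gap, desired_gap_s=2.0,
                k_gap=0.5, k_speed=0.3):
    """One step of a toy traffic-aware cruise controller.

    Holds set_speed when the road ahead is clear, otherwise slows to
    stay roughly desired_gap_s seconds behind the lead vehicle.
    Returns an acceleration command. All constants are illustrative.
    """
    desired_gap = own_speed * desired_gap_s  # meters needed at current speed
    if gap < desired_gap:
        # Too close: decelerate in proportion to the gap error.
        return -k_gap * (desired_gap - gap) / max(desired_gap, 1.0)
    # Clear ahead: converge on the driver's set speed.
    return k_speed * (set_speed - own_speed)
```

Note what is absent: nothing in a controller like this looks for stop signs or traffic lights, which is precisely the accident mode being described.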

FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.

by dekhn

2/20/2026 at 7:57:50 PM

Autopilot is similar to cruise control that is aware of other cars, and lane keeping.

Thanks for explaining why labeling it "Autopilot" is misleading and deceptive.

by jqpabc123

2/20/2026 at 8:14:56 PM

This is not even funny anymore. You reap what you sow.

by omnimus

2/20/2026 at 10:06:56 PM

> FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

FSD at one point had settings for whether it could roll through stop signs, or how much it could exceed the speed limit by. I've watched it interpret a railroad crossing as a weirdly malfunctioning red light with a convoy of intermittent trucks rolling by. It took the clearly delineated lanes of a roundabout as mere suggestions and has tried to barrel through them in a straight line.

I'd love to know where your confidence stems from.

by FireBeyond

2/20/2026 at 10:11:29 PM

My confidence comes only from what I hear people doing with the system. I have zero experience with it and consider most of the PR from Tesla to be junk.

"would not expect" is the way a cautious person demonstrates a lack of confidence.

by dekhn

2/20/2026 at 7:06:47 PM

I remember having this argument with a friend.

My argument was that the idea that the name Autopilot is misleading comes not from Tesla naming it wrong but from what most people think "autopilots" on an aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)

Autopilot on a Tesla historically did two things - traffic aware cruise control (keeps a gap from the car in front of you) and stays in its lane. If you tell it to, it can suggest and change lanes. In some cases, it'll also take an exit ramp. (which was called Navigate on Autopilot)

Autopilots on planes roughly also do the same. They keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane. (Like Tesla drivers still get you on the highway and off).

Full Self Driving (to which they've now added the word "Supervised," probably because of court cases, though it was always quite obvious that it was supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot, by the way) is a different AI model that even stops at traffic lights, navigates parking lots, everything. That's the true "summon my car from LA to NY" dream at least.

So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.

But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections); I don't know the details of each recent crash.

by atonse

2/20/2026 at 7:13:52 PM

Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking.

Tesla’s Autopilot being unable to swap from one road to another makes it way less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.

by Retric

2/20/2026 at 9:42:26 PM

>"Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking."

Can you elaborate? My knowledge is very limited, but it is of very real airplane autopilots in little Cessnas and Pipers, and they are in fact far simpler than cars' systems: a basic control feedback loop that maintains altitude and heading, and that's it. You can crash into the ground, a mountain, or other traffic quite cheerfully. I would not be surprised to find that adaptive cruise control in cars is a far more complex system than a basic aircraft "autopilot".
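
For what it's worth, that feedback loop really is tiny. A proportional altitude/heading hold, with all gains and names illustrative (a sketch of the concept, not any certified system), looks roughly like:

```python
def autopilot_step(altitude, heading, target_alt, target_hdg,
                   k_alt=0.1, k_hdg=0.5):
    """One step of a toy wing-leveler-style autopilot.

    Commands pitch toward the altitude error and bank toward the
    heading error. It has no notion of terrain or traffic: pointed
    at a mountain, it will happily hold course into it.
    """
    pitch_cmd = k_alt * (target_alt - altitude)           # climb/descend
    hdg_error = (target_hdg - heading + 180) % 360 - 180  # wrap to [-180, 180)
    bank_cmd = k_hdg * hdg_error                          # turn toward heading
    return pitch_cmd, bank_cmd
```

Two proportional terms and an angle wrap; there is nothing in the loop that could avoid an obstacle, which is the commenter's point.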

by NikolaNovak

2/20/2026 at 7:23:41 PM

Doesn’t basic airplane autopilot just maintain flight level, speed, and heading? What are some other things it can do?

by beering

2/20/2026 at 7:45:41 PM

Recovering from upsets is the big thing. Maintaining flight level, speed, and heading while upside down isn’t acceptable.

Levels of safety are another consideration; car autopilots don’t use multiple levels of redundancy on everything because cars can stop without falling out of the sky.

by Retric

2/20/2026 at 8:31:25 PM

That's still massively simpler than making a self-driving car.

It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude you want and over a reasonable timescale it will do just that.

by ErroneousBosh

2/20/2026 at 8:40:16 PM

> It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude

That seemingly shifts the difficulty from the autopilot to the airframe. But that’s not actually good enough, it doesn’t keep an aircraft flying when it’s missing a large chunk of wing for example. https://taskandpurpose.com/tech-tactics/1983-negev-mid-air-c...

Instead, you’re talking about the happy path, and if we accept the happy path as enough, there are weekend-project equivalents of self-driving cars built with minimal effort; being production-worthy, however, is about more than being occasionally useful.

Autopilot is difficult because you need to do several things well or people will definitely die. Self-driving cars are far more forgiving of occasional mistakes, but again it's the "or people die" bits that make it difficult. Tesla isn't actually ahead of the game; they are just willing to take more risks with their customers' and the general public's lives.

by Retric

2/20/2026 at 7:08:52 PM

Well, the other person in the comments said the guy literally held his accelerator to the floor the entire time. Is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? Ironic, given that you go out of your way to accuse others of this. Methinks you doth protest too much?

by keeganpoppen

2/20/2026 at 7:11:49 PM

Hope he sees this, bro

by selridge

2/20/2026 at 9:26:55 PM

Jeeze, save some for the rest of us!

by randyrand

2/20/2026 at 7:58:36 PM

This is 0.005 of what Musk allegedly has? He might be sad.

by motbus3

2/20/2026 at 8:07:42 PM

We are all paying these large verdicts through higher product costs, lower salaries, lower stock market returns, and higher insurance rates.

by hnburnsy

2/20/2026 at 10:10:09 PM

100% correct. Also, with 40,000 deaths due to car crashes each year in the U.S. (and 2 million+ severe injuries not resulting in death), I'd consider it a drop in the bucket.

by alamortsubite

2/20/2026 at 7:15:53 PM

Will this have any effect on other companies developing self driving tech? It sets a very high precedent for fines, and may discourage companies from further working on such tech.

by robotnikman

2/20/2026 at 7:31:38 PM

Developing, no, but once companies start releasing vehicles onto our shared public streets I have a lot less tolerance for launching science experiments that end up killing bystanders.

I can understand the argument that in the abstract over-regulation kills innovation but at the same time in the US the pendulum has swung so far in the other direction that it’s time for a correction.

by janalsncm

2/20/2026 at 8:36:02 PM

I have no tolerance for bystanders being killed in general. If the science experiments kill, on average, fewer bystanders, I'm all for them; if they don't, they should be stopped until made safer.

by Zababa

2/20/2026 at 7:19:40 PM

That's an old argument by corporations against liability. Should they not be fully liable?

It should discourage them from making unsafe products. If it's not economical for them to make safe products, it's good that they go bankrupt and the economic resources - talent, money - go to someone else. Bankruptcy and business failure are just as fundamental to capitalism as profit.

by mmooss

2/20/2026 at 7:31:20 PM

These product-liability lawsuits are out of control; perhaps this judgement is directionally correct, but the punitive damages seem insane. This reminds me of the lawsuits which drove Instant Pot bankrupt, where the users were clearly doing very stupid things, and suffered injuries because they were able to physically overpower the safety mechanisms on their pressure-cookers.

by nickff

2/20/2026 at 7:51:20 PM

> These product-liability lawsuits are out of control

Businesses also claim that, all the time. We need some evidence.

I remember doctors claiming that malpractice lawsuits were out of control; research I read said that it wasn't an economic issue for doctors and that malpractice was out of control.

by mmooss

2/20/2026 at 8:11:44 PM

I invite you to read both the claims and the judgements related to the Instant Pot lawsuits yourself; they're all quite clear, and you can come to your own decision about how reasonable they are.

My read is that people overpowered the safety interlock, after which the lid (predictably) flew off, and they were injured (mostly by the hot steam and bits of food). I think it's ridiculous for people to expect safety mechanisms to be impossible to bypass, but maybe you disagree!

by nickff

2/20/2026 at 10:25:52 PM

> My read is that people overpowered the safety interlock

And you obviously think doing so was next to impossible therefore Instant Pot shouldn't be liable.

But what evidence do you have of the difficulty of bypassing that specific safety mechanism?

by judahmeek

2/20/2026 at 10:33:03 PM

I've owned and used two of the affected models (for ~6 years in total), and also read the version of events presented by the plaintiffs (in their claims), as well as the judges' view of what happened.

by nickff

2/20/2026 at 7:07:54 PM

Good

It seems clear that "autopilot" was a boisterous overclaim of its capabilities that led to people dying.

It may be minorly absurd to win founder-IPO-level wealth in a lawsuit, but it's also clear that smaller numbers don't act as an effective deterrent to people like Elon Musk.

by bsimpson

2/20/2026 at 8:08:38 PM

I've always thought of it more as "Co-Pilot", but formally: "Autopilot" might truly be the better definition (lane-keeping, distance-keeping), whereas a "Co-Pilot" (in aviation) implies more active control, ie: pulling you up from a nose dive.

So... informally, "Tesla Co-Pilot" => "You're still the pilot but you have a helper", vs "Tesla Autopilot" => "Whelp, guess I can wash my hands and walk away b/c it's AuToMaTiC!"

...it's tough messaging for sure, especially putting these power tools into people's hands with no formal training required. Woulda-coulda-shoulda: similar to the 737 MAX crashes, should "pilots" of Teslas have required training in the safety and navigation systems before they were "licensed" to use them?

by ramses0

2/20/2026 at 7:35:13 PM

Right! We demand engineering perfection! No autopilot until we guarantee it will NEVER kill a soul. Don't worry that human drivers kill humans all the time. The rubric is not better than a human driver, it is an Angelic Driver. Perfection is what we demand.

by eYrKEC2

2/20/2026 at 8:20:55 PM

Waymo drives better than most people.

Tesla Autopilot seems to mostly drive hubris. The fine print says you're still supposed to maintain control. They don't have as sophisticated sensors as competitors because Elon decreed "humans don't have LiDAR, so we don't need to pay for it."

Nobody is saying it has to be perfect, but Tesla hasn't demonstrated that it's even trying.

by bsimpson

2/20/2026 at 9:27:34 PM

I can see where they're coming from with the video-only concept, but even they admit it's not self-driving yet, so just don't call it self-driving (or "FSD**" or "autopilot") until it is.

by zadikian

2/20/2026 at 6:54:39 PM

It's crazy that they weren't reeled in by a regulator and it had to make it all the way through the court system. People are dead. A court judgement can't change that. Preemptive action would have.

by tehjoker

2/20/2026 at 6:47:34 PM

> Tesla also claimed that references to CEO Elon Musk’s statements about Autopilot during the trial misled the jury....

> The company essentially argued that references to Elon Musk’s own public claims about Autopilot, claims that Tesla actively used to sell the feature for years, were somehow unfair to present to a jury. Judge Bloom was right to reject that argument.

Of course, since Elon Musk has lied and over-promised a lot about Tesla's self-driving technology. It's an interesting defense to admit your CEO is a liar and can't be trusted.

by palmotea

2/20/2026 at 10:29:24 PM

Yeah, when the SEC pushed on the same issue, the company's response was to add fine print that Elon's statements "may be aspirational" and "may not reflect engineering realities".

by FireBeyond

2/20/2026 at 6:28:08 PM

I'm not clear on what Tesla is doing these days. They've been left in the dust on autonomous driving, they've failed to update their successful car models, and their only new model was a spectacular failure.

by standardUser

2/20/2026 at 6:33:09 PM

Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?

by mey

2/20/2026 at 6:55:22 PM

>> I'm not clear on what Tesla is doing these days.

> Ask the CEO? Based on recent incentives and acquisitions, are they planning to remain a car company?

I believe Musk wants to hype humanoid robots, because he can't get away with irrationally hyping electric cars or self-driving technology like he used to.

Tesla was never a car company, their real product is sci-fi dreams.

by palmotea

2/20/2026 at 7:04:45 PM

Agreed, and he’s already behind in humanoid robots, so the hype there won’t last long. The problem is that China is obliterating him at every turn, because they actually build things that work instead of just hyping things and quoting fake numbers for how much money it could be if every human on the planet bought 20.

by blackjack_

2/20/2026 at 7:01:43 PM

By which metrics has Tesla been left in the dust wrt autonomous driving? Right now they are the only brand where you can buy a car and have it do basically 90% (or sometimes 100%) of your daily driving. Sure, it's supervised, but the alternatives are literally extremely geofenced taxis.

by Almondsetat

2/20/2026 at 8:40:07 PM

> By which metrics has Tesla been left in the dust wrt autonomous driving

By the fact that they don't have autonomous driving. And this very judgement demonstrates that.

If you have to keep your full attention on the road at all times and constantly look out for the 10% case where the autopilot may spectacularly fail, it instantly turns off the vast majority of prospective users.

Funny enough the tech that Musk's tweets and the Tesla hype machine has been promising for the last decade is actually on the streets today. It's just being rolled out by Waymo.

by paxys

2/20/2026 at 7:09:56 PM

It's not free, is it? You buy the car, subscribe to their arbitrarily-priced subscription service, and then it does 90% of your driving.

That's like paying for a "self-juicing juicer" that only works with proprietary juice packages sold through an overpriced subscription.

Edit: Mostly a criticism. I have no bone to pick with Elon, but subscription slopware is the reason why Chinese EVs are more desirable to average Joes like me.

by bigyabai

2/20/2026 at 7:15:40 PM

This is a goalpost move

by Almondsetat

2/20/2026 at 7:27:22 PM

Tesla has a level 3 system that it's willing to gamble on not needing intervention for a handful of miles for a handful of Tesla fanboys. It's very telling that their "level 4" robotaxis are basically unicorns and only exist (existed? it's not clear they're even available anymore) in a single neighborhood subsection of the level 3 robotaxis' full area in Austin.

Waymo on the other hand has a level 4 system, and has for many years, in many cities, with large service areas.

Tesla is unquestionably in the dust here, and the delusional, er, faithful are holding out for a mythical switch-flip where Elon snaps his fingers and every Tesla turns into a level 4 robotaxi (despite the compute in these cars being on the level of an RTX 5090, and the robotaxis having custom hardware loadouts).

by WarmWash

2/20/2026 at 7:45:04 PM

I don't understand the point of your reply. Waymo is geofenced taxis. You cannot buy a Waymo. It cannot drive basically wherever you want. Teslas mostly can. So, again, how is Tesla the one left in the dust?

by Almondsetat

2/20/2026 at 8:16:55 PM

Yes, Tesla leads on level 3 driving.

If you want to call this "autonomous" well then we are arguing semantics. But I think colloquially, autonomous means "no human".

by WarmWash

2/20/2026 at 6:55:19 PM

Yet Tesla is trading near its all time high.

by mr_00ff00

2/20/2026 at 6:56:06 PM

And as the lunar new year demo dance shows, China is leaving them in the dust building humanoid robots.

by bryanlarsen

2/20/2026 at 6:51:08 PM

My question too.

Though they did update the Model Y (looks like a duck), they just cancelled the Model S and X.

by m463

2/20/2026 at 7:19:33 PM

Optimus robots!

In 2 years Tesla will be replacing most factory workers with fully autonomous robots that will do most of the work. This will generate trillions in revenue and is totally definitely trust me bro possible.

Expect huge updates on this coming in the near future, soon. Tesla will be the most valuable company on Earth. Get in the stock now.

(cars, solar panels, energy storage, and robotaxis are no longer part of the roadmap because optimus bots will bring in so much money in 2 years definitely that these things won't matter so don't ask about them or think about them thanks.)

by WarmWash

2/20/2026 at 8:46:50 PM

https://www.jdpower.com/business/press-releases/2026-us-elec...

Tesla Model 3 highest overall in owner satisfaction.

"Left in the dust?"

by slowmovintarget

2/20/2026 at 8:58:03 PM

I don't understand how that's a retort against the claim that they've "been left in the dust on autonomous driving". Are you contending that autonomous driving is the only reason that Tesla owners would like their cars?

by protimewaster

2/20/2026 at 9:49:23 PM

JD power is a company you pay to give you an award.

That's why Chevy has a bunch from them, including "Highest initial quality"

by mrguyorama

2/20/2026 at 6:43:36 PM

[flagged]

by wget02

2/20/2026 at 6:41:19 PM

[flagged]

by WalterBright

2/20/2026 at 6:44:07 PM

> It's an absurd judgement.

> Consider also how many lives have been saved by the autopilot.

> Be careful what you wish for.

How many? Tell me.

by palmotea

2/20/2026 at 6:49:27 PM

It saved my life. I was standing on the Golden gate bridge looking over the edge. A Tesla model 3 pulled over and started playing baby I need your lovin at full volume. I cried and climbed into the car and it turned on the seat heater.

by breakyerself

2/20/2026 at 6:55:49 PM

There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here's a few to get your algorithm working:

https://www.youtube.com/watch?v=A3K410O_9Nc

https://www.youtube.com/watch?v=Qy6SplEn4hQ

https://www.youtube.com/watch?v=GcfgIltPyOA

https://www.youtube.com/watch?v=Tu2N8f3nEYc

by 1970-01-01

2/20/2026 at 7:05:05 PM

> There's legit dashcam video showing Autopilot preventing severe crashes. Go on YouTube instead of balking. Here's a few to get your algorithm working:

So? I just watched those. They don't prove anything about "lives [that] have been saved by the autopilot." They all look like scenarios a human driver could handle (and I, personally, have handled situations similar to some of those). If autopilot is saving lives, you have to show, statistically, that it's better than human drivers in comparable conditions.

Also the last one appears to be of a Tesla fanboy who had just left a Tesla shareholder meeting, and seems pretty biased. I'd say his Cybertruck actually reacted pretty late to the danger. It was pretty obvious from the dashcam that something was wrong several seconds before the car reacted to it at the last second.

by palmotea

2/20/2026 at 7:16:46 PM

I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile. Compared to a statistical human driver, Waymo prevented around 411 collisions that would have resulted in any injury, and 27 collisions that would have resulted in serious injury or death. It seems like for Waymo specifically, self-driving cars are demonstrably safer than human drivers. Not sure if that generalizes to Tesla FSD, though.
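
As a back-of-envelope check of those numbers (assuming, which the study may not frame exactly this way, that the 81% reduction applies to the injury-crash category), the 411 prevented collisions imply a human-driver baseline of roughly 507 over the same mileage:

```python
# Back-of-envelope arithmetic using the figures quoted above.
# The implied human baseline is derived here, not taken from the study.
reduction = 0.81                 # quoted: 81% fewer crashes per mile
injury_crashes_prevented = 411   # quoted: injury collisions avoided

# If 81% of baseline crashes are prevented, then
# prevented = reduction * baseline, so:
human_baseline = injury_crashes_prevented / reduction
waymo_actual = human_baseline - injury_crashes_prevented

print(round(human_baseline))  # implied human-driver injury crashes: 507
print(round(waymo_actual))    # implied Waymo injury crashes: 96
```

So the claim is internally consistent: roughly 96 injury crashes where a typical human driver would have had about 507.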

by ethmarks

2/20/2026 at 7:52:38 PM

> I can't speak for Tesla's FSD specifically, but Waymo did a study on the collision rate of their autonomous cars compared to human drivers: https://waymo.com/safety/impact/. They found that Waymos get into about 81% fewer crashes per mile.

I think that's true. Though I recall that Waymo limits their cars to safer and more easily handled conditions (totally the right thing to do), it probably means that statistic needs an asterisk.

> Not sure if that generalizes to Tesla FSD, though.

I don't think it does. They're two totally different systems.

by palmotea

2/20/2026 at 7:09:08 PM

The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.

by 1970-01-01

2/20/2026 at 7:14:09 PM

> The last link is literally a man stating he could not handle the situation without FSD driving him. You're experiencing cognitive dissonance with the evidence.

And I have doubts about that man's reliability.

by palmotea

2/20/2026 at 6:53:46 PM

Statistics aren't collected on that. But I've read anecdotes where individuals recounted the autopilot saving them from a severe accident.

You can also google "how many lives has tesla autopilot saved?" and the results suggest that the autopilot is safer than human drivers.

by WalterBright

2/20/2026 at 7:17:37 PM

That doesn't make any sense. If I, a human, hit the brakes in time and avoid an accident today, then hit someone tomorrow, should I not be held responsible for the second incident?

by triceratops

2/20/2026 at 7:30:05 PM

The point is to compare the deaths caused by not using autopilot with the deaths caused by the autopilot.

If you accidentally kill someone with your car, do you think you should have to pay $243m?

by WalterBright

2/20/2026 at 9:26:02 PM

> If you accidentally kill someone with your car, do you think you should have to pay $243m?

It would be a large portion of my net worth for sure. Maybe also prison time. Can we put Autopilot or Tesla in a prison?

by triceratops

2/20/2026 at 9:23:58 PM

The judgement is only so high because of punitive damages for misleading marketing. Their actual liability for the collision itself is relatively low (and indeed the jury found them only 33% at fault).

by recursivecaveat

2/20/2026 at 7:53:35 PM

> If you accidentally kill someone with your car, do you think you should have to pay $243m?

If you have billions of dollars, and somehow can't go to prison, yes you should. If not in compensation to the victim, then in some kind of fine scaled to wealth. The amount paid needs to be high enough to be a deterrent to the individual who did wrong.

by palmotea

2/20/2026 at 8:51:47 PM

If this extreme judgement causes Tesla to abandon autodrive, there will be more deaths on the road. It just isn't rational.

Besides, nobody makes you turn on the autodrive.

by WalterBright

2/20/2026 at 9:25:32 PM

You haven't established that it has saved any lives beyond vague, hand waving anecdotes.

What is autodrive? Are you talking about basic autopilot, enhanced autopilot, or full self-driving? They are separate modes:

https://en.wikipedia.org/wiki/Tesla_Autopilot#Driving_featur...

Which revision of the hardware and software is the "good one"? Remember that Tesla claimed in 2016 that all Teslas in production "have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver". But that was, of course, a lie:

https://web.archive.org/web/20240730071548/https://tesla.com...

https://electrek.co/2025/10/22/tesla-changes-all-cars-have-s...

What Tesla used to claim was "full autonomy" is now called "Full Self-Driving (supervised)", whatever that's supposed to mean. How many times has "Full Self-Driving (supervised)" gone dangerously wrong but was stopped? How many times was supervision not enough:

https://electrek.co/2025/05/23/tesla-full-self-driving-veers...

Show me some concrete numbers to back your claims. If you can't do that then I think you've fallen victim to Tesla's dishonest marketing.

by breve

2/20/2026 at 6:23:27 PM

Tesla would benefit from the board replacing the CEO. It's increasingly clear that there is a problem and it's not talent, it's decision-making.

by josefritzishere

2/20/2026 at 6:25:16 PM

Their stock would crash to $10 without the hype machine

by dolphinscorpion

2/20/2026 at 6:57:51 PM

Après moi, le déluge. ("After me, the flood.")

by mmooss

2/20/2026 at 6:45:51 PM

It will be zero if they keep doing the same shit

by breakyerself

2/20/2026 at 7:14:14 PM

Ultimately, I believe it will take something catastrophic to oust Musk/change leadership. And by that point, it's questionable whether anything worthwhile will be left to salvage.

My current bet is that Optimus will fail spectacularly and Tesla gets left far behind as Rivian's R2 replaces it.

One thing I will note: I know folks that work at TSLA. Musk is more of a distraction. If he goes and competent leadership is brought in, there's still enough talent and momentum to make something happen...

by pm90

2/20/2026 at 7:11:54 PM

This is literally one of the 1-3 companies with a decent strategy in the age of AI; the rest are pretending the changes won't affect them. Even this judgement: the guy decided to pick up his phone while driving a car not capable of red light detection. It could have been any other car with similar auto-steer capabilities. Today the same car, with OTA updates, would have kept him alive. Sure, they are doing something wrong.

by maxdo

2/20/2026 at 7:47:33 PM

Did mecha-Hitler tell you that?

by breakyerself

2/20/2026 at 7:23:01 PM

Will it actually? Has the market sent any signal that they won’t tolerate Musk?

You’re a lot more optimistic about this than I am.

by madeofpalk

2/20/2026 at 9:30:24 PM

Tesla has made some great cars, but their CEO is not making sound decisions. I really think a new CEO could turn Tesla around. It doesn't need to hit rock bottom first. Every major auto company has been through this.

by josefritzishere

2/20/2026 at 6:13:53 PM

This will continually be appealed until it’s reduced.

by DoesntMatter22

2/20/2026 at 6:27:14 PM

They claim to have a pretrial agreement to reduce it to 3x compensatory damages (which would make the total judgement 160 million instead of 243 million).

Appealing is expensive because they have to post a bond with 100% collateral, and they pay for it yearly. In this case, probably around 8 million a year.

So in general it's not worth appealing for 5 years unless they think they will knock off 25-30% of the judgement.

Here it's the first case of its kind, so I'm sure they will appeal, but if they lose those appeals, most companies that aren't insane would cut their losses instead of trying to fight everything.

by DannyBee
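The break-even arithmetic above can be sketched in a few lines. This is a rough illustration using the figures from the comment ($160M judgement, ~$8M/year bond cost, ~5-year appeal); the function name and the simple linear bond-cost model are assumptions, not an actual damages formula:

```python
def appeal_breakeven(judgement, annual_bond_cost, years):
    """Fraction of the judgement an appeal must knock off to break even,
    given the cumulative cost of carrying the appeal bond."""
    total_bond_cost = annual_bond_cost * years
    return total_bond_cost / judgement

# ~$8M/year over a ~5-year appeal against a $160M judgement
frac = appeal_breakeven(160e6, 8e6, 5)  # 0.25 -> need ~25% knocked off
```

So the quoted 25-30% threshold lines up: five years of bond costs roughly equal a quarter of the reduced judgement.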

2/20/2026 at 6:22:10 PM

This was the appeal.

by LeoPanthera

2/20/2026 at 6:27:57 PM

No it wasn't, it was a motion to set aside the verdict, made before the trial judge.

The appeal will go to the 11th circuit.

by DannyBee

2/20/2026 at 6:29:53 PM

No it wasn't. This was the trial judge deciding to not reduce it. $43 million in compensatory damages is unusually high for a wrongful death.

by tiahura

2/20/2026 at 6:33:50 PM

$43 million does not seem spectacularly high compensation for killing someone at the age of 22.

by jeffbee

2/20/2026 at 6:42:14 PM

When my spouse worked in the area of determining "the value of an individual" (economically, not morally), it was computed as the present value of lifetime earnings (PVLE): the individual's cumulative income, discounted back to its current value. IIRC, the PVLE averaged out to about $1-10M.

by dekhn

2/20/2026 at 6:50:47 PM

You shouldn't be down voted. Regardless of the moral or technical issues involved, there are established formulas used to calculate damages in wrongful death civil suits. Your range is generally correct although certain factors can push it higher. (Punitive damages are a separate issue.)

by nradov

2/20/2026 at 7:17:27 PM

There are not "established formulas" or, to the extent that they are, the coefficients and exponents are not determined. The parties always argue about the discount rates and whatnot.

by jeffbee

2/20/2026 at 7:29:03 PM

Sure, no argument there, I was just referring to research like this: https://escholarship.org/uc/item/82d0550k

"""Results. At a discount rate of 3 percent, males and females aged 20-24 have the highest PVLE — $1,517,045 and $1,085,188 respectively. Lifetime earnings for men are higher than for women. Higher discount rates yield lower values at all ages."""

by dekhn
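For a rough sense of how such a figure arises, here is a minimal sketch of the PVLE discounting described above. The salary, career length, and level-earnings assumption are illustrative only, not taken from the cited study:

```python
def pvle(annual_earnings, working_years, discount_rate):
    """Present value of a level stream of lifetime earnings:
    PV = sum over t=1..n of earnings / (1 + r)^t"""
    return sum(
        annual_earnings / (1 + discount_rate) ** t
        for t in range(1, working_years + 1)
    )

# Illustrative: $60k/year over a 40-year career
low_rate = pvle(60_000, 40, 0.03)   # ~$1.39M at a 3% discount rate
high_rate = pvle(60_000, 40, 0.06)  # lower, since higher rates shrink the PV
```

This reproduces the pattern in the quoted results: values around $1-1.5M at a 3% rate, and lower values as the discount rate rises.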

2/20/2026 at 7:02:18 PM

I generally don't complain about being downvoted, but it is always puzzling when I post a neutral fact without any judgement.

by dekhn

2/20/2026 at 10:01:00 PM

Why the f was this downvoted? Literally from the article:

> Tesla has indicated it will appeal the verdict to a higher court.

by EastSmith

2/20/2026 at 7:12:05 PM

Does the American legal system have infinite appeals?

by Hamuko

2/20/2026 at 7:15:37 PM

I'm so lost. The guy decided to pick up his phone from the floor while driving the car at high speed.

1. It could have been ANY car with similar auto-steer capabilities at the time.

2. Why the hate? Because of some false promise? Because as of today the same car would save the guy in the exact same situation, since FSD now handles red lights perfectly. Far better and safer than ANY other tech included in the average car price of the same segment ($40-50k).

by maxdo

2/20/2026 at 7:20:40 PM

Not sure if it's using the same FSD decision matrix, but last night my Model S chimed at me to drive into the intersection while sitting at a red light, with absolutely zero possibility it saw a green light anywhere in the intersection.

"Perfectly" isn't a descriptor I would use. But this is just anecdotal.

by madsmith

2/20/2026 at 7:35:18 PM

> Why the hate , because of some false promise ?

Another name for "false promise" when made for capital gain is "fraud". And when the fraud is in the context of vehicular autonomy, it becomes "fraud with reckless endangerment". And when it leads to someone's death, that makes it "proximate cause to manslaughter".

by BugsJustFindMe

2/20/2026 at 7:55:26 PM

As the source article says, the jury did agree that the driver was mostly liable. They found Tesla partially liable because they felt that Tesla's false promise led to the driver picking up his phone. If they'd been more honest about the limitations of their Autopilot system, as other companies are about their assisted driving functionalities, the driver might have realized that he needed to stop the car before picking up his phone.

by SpicyLemonZest

2/20/2026 at 8:48:37 PM

For every story like this, there are 10 stories of people who died the old fashioned way behind the wheel.

Accidents like this are obviously tragic, but let's remember that self driving software is already ~10x safer than a human. Unfortunately, lawsuits like this will slow down the rollout of this life-saving technology, resulting in a greater loss of life.

by joshfraser

2/20/2026 at 8:53:14 PM

Companies that roll it out responsibly aren’t having issues. Tesla deserves these judgments and should not be allowed to roll this software out in the irresponsible way they have been.

by bathtub365

2/20/2026 at 9:21:39 PM

Possibly, but this wasn't a self-driving car. The ruling gave Tesla some blame for falsely marketing it as one.

by zadikian

2/20/2026 at 9:07:51 PM

> self driving software is already ~10x safer than a human.

This is, technically speaking, pure bullshit. You have no proof because none exists.

by sonofhans