Tesla's 'Mad Max' FSD Mode Draws Sharp Safety Warning After $240 Million Autopilot Verdict

MarketDash
Prominent investor Ross Gerber calls Tesla's aggressive autonomous driving profile 'basically unsafe' as the company faces mounting legal and regulatory pressure.


Ross Gerber, co-founder of investment firm Gerber Kawasaki, isn't mincing words about the autonomous driving software from Tesla Inc. (TSLA). On Wednesday, he issued a stern warning that cuts to the heart of Tesla's self-driving ambitions, and it comes at a particularly awkward time for the electric vehicle maker.

Gerber's comments arrive fresh off a significant legal setback for Tesla involving its Autopilot system. It's the kind of one-two punch that makes investors and customers alike sit up and take notice.

The 'Mad Max' Problem

Taking to X (the platform formerly known as Twitter), Gerber addressed some backlash from Tesla's ardent supporters while doubling down on his core concern: safety. "Tesla people seem triggered that I think it's a bad idea to have a self driving profile that is basically unsafe," Gerber stated.

The profile in question is the ominously named "Mad Max" mode, introduced in a recent FSD software update. This isn't your gentle, cautious chauffeur mode. As Gerber describes it, the mode allows for higher speeds and more frequent lane changes than even the "Hurry" profile. His warning was blunt: "Mad Max mode might actually kill someone as drives too fast and erratic."

Think about that for a second. An investor who has been broadly supportive of Tesla is publicly saying one of its software features might be lethal. That's not your everyday market commentary.

The $240 Million Backdrop

Gerber didn't make this critique in a vacuum. He directly linked it to Tesla's very expensive recent history. "Tesla just had to pay $240 million for liability for someone who died while using autopilot," he noted, referencing a high-profile Florida case.

On February 20th, U.S. District Judge Beth Bloom upheld a $243 million jury verdict against Tesla. The case stemmed from a tragic 2019 crash in Key Largo, where a Model S operating on Enhanced Autopilot accelerated through an intersection at over 60 mph, killing 22-year-old Naibel Benavides. Judge Bloom found the evidence "more than supports" the jury's finding of liability.

So when Gerber talks about safety, he's pointing at a quarter-billion-dollar reason why it matters. It's not just philosophical; it's financial and legal, with real human cost.


Is Progress Stalling?

This isn't Gerber's first rodeo critiquing Tesla's hardware-software combo. Just a few days earlier, on February 18th, he posted that "Things don't seem to be improving" regarding FSD performance. He even floated the idea that Elon Musk's company might need "hardware adjustments"—a suggestion that goes against Tesla's long-standing camera-vision-only ethos.

These comments followed reports that Tesla's small Robotaxi fleet in Austin, Texas, was involved in five crashes in a single month. That kind of track record, even for a test fleet, fuels speculation. It leads people to wonder if Tesla might eventually have to adopt LiDAR, the laser-sensor technology used by competitors like Alphabet Inc. (GOOGL) and Rivian Automotive Inc. (RIVN).

The Regulatory Squeeze

Elon Musk remains wildly optimistic about the future, touting the coming "Cybercab" and what he calls the "largest autonomous fleet." But the road there is getting bumpier. Tesla's FSD system continues to be scrutinized by the National Highway Traffic Safety Administration (NHTSA).

Back in October, the agency opened an investigation into roughly 2.88 million Tesla vehicles following reports of more than 50 safety-related incidents. That's a big number.

Now, layer on top of that the launch of an aggressive "Mad Max" driving mode. Videos have surfaced showing it weaving through traffic at speeds above 80 miles per hour. It's a feature that seems almost designed to attract regulatory attention and public concern. For a company already under the microscope for driver-assist safety, choosing to release a mode called "Mad Max" is... a bold choice.

So here's the situation: A prominent investor is sounding a loud alarm on safety following a massive legal loss. The company's self-driving tech is facing questions about whether it's actually getting better. Regulators are watching closely. And in the middle of it all, Tesla has released a software mode that encourages the car to drive like it's in a post-apocalyptic action movie. It's a fascinating, and concerning, moment for the evolution of autonomous driving.