Tesla Cybertruck crashes into pole while using latest Full Self-Driving software

midian182

Facepalm: A Cybertruck owner has discovered what happens when you activate Tesla's latest Full Self-Driving system and fail to pay attention: his vehicle hit a curb and crashed into a pole. Thankfully, the person behind the wheel was fine, and he blames himself for the incident.

Jonathan Challinger, a Florida-based software developer who works for Kraus Hamdani Aerospace, posted a photo of his Cybertruck looking a lot worse than the pole it collided with.

Challinger explained that he was running the latest FSD v13.2.4 software while traveling in a right lane that was coming to an end. The Cybertruck failed to merge out of the ending lane even though the left lane was empty, and it made no attempt to slow down or turn until it had already hit the curb, which sent it into the pole.

Despite the close call, which could have caused serious injuries, Challinger remains a committed Tesla fan – he even thanked the company for having "the best passive safety in the world," which enabled him to walk away without a scratch.

"I don't expect it to be infallible but I definitely didn't have utility pole in my face while driving slowly on an empty road on my bingo card," Challinger said in another post.

The owner is also taking full responsibility, calling it a big fail on his part for not paying attention. Challinger noted that he hasn't heard of any accidents on FSD v13.

Challinger also tagged the @Tesla_AI account asking how he could ensure the company received the data from the incident, noting that the service center and others had been less than responsive. He added that he has the dashcam footage and wants to get it out there as a PSA, "but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material."

Tesla enthusiast Troy Teslike notes that although the pole is in an unusual position, he believes the incident shows three issues with FSD: its difficulty seeing road markings at night, its failure to store some road markings in digital maps, and its unreliable detection of certain solid objects due to FSD's vision-only approach.

Tesla CEO Elon Musk said that there would be unsupervised FSD in Texas and California this year. This crash suggests otherwise.

Tesla's owner's manual states that a vehicle's cabin camera monitors continued driver attentiveness when Full Self-Driving is engaged. FSD displays a series of escalating warnings if the driver repeatedly ignores prompts to apply slight force to the steering wheel or to pay attention. If these are also ignored, FSD is disabled for the rest of the drive.

If the driver still does not resume manual steering, FSD sounds a continuous chime, turns on the warning flashers, and slows the vehicle to a complete stop.
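
To make that escalation sequence concrete, here is a minimal, purely illustrative state-machine sketch in Python. This is not Tesla's code: the state names, the three-strike warning threshold, and the next_state helper are all hypothetical assumptions chosen only to mirror the behavior the owner's manual describes.

```python
# Illustrative sketch of the escalation behavior described above.
# NOT Tesla's implementation; all names and thresholds are hypothetical.

from enum import Enum, auto


class EscalationState(Enum):
    MONITORING = auto()       # cabin camera sees an attentive driver
    VISUAL_WARNING = auto()   # prompt to apply force to the wheel / pay attention
    FSD_DISABLED = auto()     # FSD unavailable for the rest of the drive
    CONTROLLED_STOP = auto()  # continuous chime, warning flashers, slow to a stop


def next_state(state: EscalationState,
               driver_attentive: bool,
               ignored_warnings: int,
               max_warnings: int = 3) -> EscalationState:
    """Advance the (hypothetical) driver-monitoring state machine one step."""
    if driver_attentive:
        # An attentive driver resets early escalation, but once FSD is
        # disabled for the drive, it stays disabled.
        if state in (EscalationState.MONITORING, EscalationState.VISUAL_WARNING):
            return EscalationState.MONITORING
        return state
    if state is EscalationState.MONITORING:
        return EscalationState.VISUAL_WARNING
    if state is EscalationState.VISUAL_WARNING:
        # Repeatedly ignored prompts disable FSD for the rest of the drive.
        if ignored_warnings >= max_warnings:
            return EscalationState.FSD_DISABLED
        return EscalationState.VISUAL_WARNING
    if state is EscalationState.FSD_DISABLED:
        # Driver still hasn't resumed manual steering: come to a stop.
        return EscalationState.CONTROLLED_STOP
    return state


if __name__ == "__main__":
    state = EscalationState.MONITORING
    ignored = 0
    # Simulate a driver who never responds to any prompt.
    for step in range(6):
        state = next_state(state, driver_attentive=False, ignored_warnings=ignored)
        if state is EscalationState.VISUAL_WARNING:
            ignored += 1
        print(f"step {step}: {state.name} (ignored warnings: {ignored})")
```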

I'm glad this article clarified the role that user error played in this. Every other article I've read about this has been using it as an example of how we should never have self-driving cars and that Tesla is a failure of a company. Then a few used the topic to make an editorial about how bad the Cybertruck is.

Good on you guys for doing some good reporting and leaving opinion out of it.
 
So the "truck" drives itself directly into a pole and the man praises the manufacturer for creating a safe vehicle?

Okay.
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't. How is that the vehicle's fault?
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't. How is that the vehicle's fault?

It's called Full Self Driving mode.
Does it do that? No, it sells the words.
It fully self drove into a pole.
I would expect a PoleStar to do that.

Change the name or stop letting people test it on real roads.
 
It's called Full Self Driving mode.
Does it do that? No, it sells the words.
It fully self drove into a pole.
I would expect a PoleStar to do that.

Change the name or stop letting people test it on real roads.
>ignores safety warnings
>gets hurt

Who cares what the marketing department calls it. The system had failsafes in place, they worked as intended, the user ignored them, and his car got wrecked.

Also, the DoT never approved Autopilot or FSD for driverless operation. This is the same as driving while texting and then being surprised when you get into an accident.
 
>ignores safety warnings
>gets hurt

Who cares what the marketing department calls it. The system had failsafes in place, they worked as intended, the user ignored them, and his car got wrecked.

Also, the DoT never approved Autopilot or FSD for driverless operation. This is the same as driving while texting and then being surprised when you get into an accident.
Not to mention the driver willfully ignored the messages from the car - it's worse than texting and driving - it's texting and driving while arguing with your friend on FaceTime that the car is fine...

I'm a bit suspicious of the driver's "I don't want the attention" comment though.... if that was REALLY true, perhaps a "no comment" would have sufficed when interviewed?

Regardless, everyone in the know (posters on various forums don't count) has agreed that the car is NOT at fault and that the DRIVER is.
 
The driver only praised the passive safety because he wasn't going fast enough to really feel how a rigid vehicle reacts to much greater impacts.

Elon admitted real FSD would need a hardware upgrade that isn't possible on current models, citing massive cost.

Imagine paying up to $10K to beta test a lie?
Ouch.
 
Makes sense to me. If the FSD system detects the driver is not paying attention, it disables itself. Why, this is the perfect solution and exactly what is to be expected from Tesla/Musk. /s

Seems to me that in this case, the safer alternative would be for the FSD to bring the vehicle to a safe stop, and then disable itself, refusing to go further under anything but manual control of the driver.

Common sense is not to be expected from Tesla or Musk, and yet the owner of this vehicle accepts full responsibility?

Does the FSD only give visual warnings? My Prius Prime gives both visible and audible warnings. Crap like this is part of the reason that I refused to consider a Tesla for my next vehicle.
 
So the "truck" drives itself directly into a pole and the man praises the manufacturer for creating a safe vehicle?

Okay.

If somebody had engaged cruise control, had not paid attention, and then had a serious crash as a result and walked away without a scratch, they would have been 100% justified in saying the same. FSD is currently a system that works in tandem with an alert driver. This person failed to do his part, and he took responsibility as he should have.
 
The driver only praised the passive safety because he wasn't going fast enough to really feel how a rigid vehicle reacts to much greater impacts.

From the images, the CT appeared to crumple quite nicely. Much of the front of the car collapsed and absorbed the energy. Not sure what more you would expect.
 
Not at all. Would the driver have seen the pole had he been paying attention? Of course. The cameras, therefore, can also see it. The software is what needs attention.
As an automotive engineer using cameras to automate assembly processes, I can say industrial cameras and the associated software have a long way to go. Camera-supported automation only works with Six Sigma reliability under favourable, constant lighting.
 
Drivers like these and cars like these shouldn’t be on the road. As simple as that.
The number of Tesla drivers not paying attention is simply appalling. The majority of the ones I encounter on my daily commute are simply too distracted and should not be operating a vehicle.
 