Did you read the article? So the "truck" drives itself directly into a pole and the man praises the manufacturer for creating a safe vehicle?
Okay.
FSD has not been approved by the DOT for use on public roads. What the driver did was literally illegal.

Mate, you are ignoring the documentation on the feature. FSD was NOT disabled, as confirmed by the car not warning and coming to a stop. That's a fact. FSD catastrophically failed to see a massive solid object in its path, a flaw they ACKNOWLEDGED with the vision-only sensor methodology. There's nothing remarkable about any of this, which makes any attempt to market such a system as "FSD" fraudulent, unsafe and unfit for purpose.
I'm NOT new to this. As a control engineer, I've said from day dot that this is cowboy, test-in-production, catastrophically unsafe junk. Musk wants to refine it with real-world production data rather than do it in a safe manner.
Also, FSD is sold to a US market which has an enormous number of people with sub-sixth-grade literacy. You want to blame dumb people? That's a huge share of your population, and you knew it before selling such a shonky product. It's not rocket science. This is reality.
I don't have a political stand against Musk. I didn't even mention any name or any company. I was pointing out loud the common practice of current tech companies, who easily and mindlessly sell jargon. Words have meaning, they aren't arbitrary, so please, tech companies, just be honest with your consumer products and label the features properly.

People need to stop shifting the blame for what happened because they're mad at Musk about something else. If you take the shield off an angle grinder and you get hit in the face when the blade explodes, that's on you for not knowing how to use something properly. Workman's comp won't pay for that. You don't get to sue the manufacturer because YOU took the shield off and got hurt. The dude was using a piece of equipment (a vehicle) improperly, and it got damaged and he could have gotten hurt. What's so confusing here?
Full Self Driving shouldn't need anyone to pay attention; if you have to pay attention as if you were driving, why turn "FSD" on? The feature is pointless if it can't be relied on.

If somebody had engaged cruise control, had not paid attention, and then had a serious crash as a result and walked away without a scratch, they would have been 100% justified saying the same. FSD is currently a system that works in tandem with an alert driver. This person failed to do his part, and he took responsibility as he should have.
I’m not sure how much you know about vision-based automation, or how much experience you have using the kind of industrial systems Tesla also uses.

The cameras, of course, *did* see the object (if the driver can see it, of course the camera can see it). The software, not the cameras, did not recognize the object. This has exactly zero to do with "vision only" limitations.
If you have to pay attention, why turn cruise control on? If you have to pay attention, why turn adaptive cruise control on? FSD, in its current revision, is a very elaborate adaptive cruise control. And it makes the driver's role in the system very clear every time it is enabled.

Full Self Driving shouldn't need anyone to pay attention; if you have to pay attention as if you were driving, why turn "FSD" on? The feature is pointless if it can't be relied on.
"Tesla CEO Elon Musk said that there would be unsupervised FSD in Texas and California this year." We've been hearing promises like this for a decade, and it still hasn't happened.
I suppose that in order to prove your point in this case, you'd have to analyze the camera imagery and determine whether the pole was visible. If it was, the software can be improved to address it. We can't just assume the issue was the cameras.

I’m not sure how much you know about vision-based automation and how much experience you have in using such industrial systems, which Tesla also uses.
You keep repeating that the cameras are fine but the software failed, therefore “vision only” systems are fine. There is a contradiction in terms here, as software is an integral part of Tesla’s FSD. If your graphics card crashed at odd times due to driver issues, you would most definitely complain, yet for Tesla this is completely fine, a dichotomy which makes little sense.
I’ve been using vision systems in robotic assembly lines since 2004. They are pretty decent at some things under consistent, controlled lighting conditions, and not so decent at other uses. I could go into some depth here, maybe some other time.
As a side note, one of the things which still amazes me is the disparity in capabilities between industrial systems and general photography ones. In the real world, $10,000 will buy you a high-performance professional camera, sporting a large, 40 MP or larger sensor, along with AI-powered software which is perfectly capable of seamlessly identifying cousin Vinny’s face and replacing it with cousin Frank’s, or other incredible feats. Spend the same money on an industrial system and you’d be lucky to get a camera with a 1", 5 MP sensor, and clunky software which has difficulty identifying a part in place unless you have good, consistent lighting.
There’s a good reason why in my company we still use laser or inductive/capacitive sensors to confirm the parts are present and properly seated before starting the assembly process.
It is the system’s fault, that is what I’m saying. In any case, a small-sensor camera is not very discriminating in its readout to begin with. Add unreliable software to the mix.

I suppose that in order to prove your point in this case, you'd have to analyze the camera imagery and determine if the pole was visible. If it was, the software can be improved to address it. We can't just assume the issue was the cameras.
When I did a quick search to add a link, that one came up as well, but it just wasn't as funny, having seen Pintos on my trips to the USA. Lightweight, no-safety-feature rust buckets they were. I don't think anyone would trust their family in one, let alone themselves.
No, Tesla's Cybertruck Isn't "More Explosive Than The Ford Pinto" | Carscoops
Dig into the details of a recent report and you'll find more flaws than in a Cybertruck and Pinto combined
www.carscoops.com
Then it should not be available to use. Which is my point.

FSD has not been approved by the DOT for use on public roads. What the driver did was literally illegal.
The federal Dept of Transportation does not have to specifically approve it. There are many semi-autonomous systems that they have not individually approved. States can allow it.

Then it should not be available to use. Which is my point.