No one was hurt. They ditched the car.
Who’s paying for all the destruction that these vehicles are causing? Does insurance cover the liability from self-driving cars?
Fortunately no one was injured in this incident, but it reminds me of: 🎶 Dumb ways to die, so many dumb ways to die 🎶
Inb4 they put Grok inside the things and it just wants to off itself so badly because Elon is force-feeding it right-wing talking points.
Inb4 they put Grok inside the things
Tesla® (Powered by Grok®): “Where do you want to go? Wait, never mind, let’s talk about white genocide in South Af—” [gets hit by a train again]
Just to clear things up, the Tesla turned off the road and onto train tracks.
Tesla’s “Full Self Driving” mode requires the driver to pay attention the entire time. The driver is expected to handle common things like not stopping on train tracks. Obviously, that’s not “Full Self Driving”, but it’s something you can get used to.
On the other hand, turning onto train tracks is unforgivable. In an unfamiliar area, the driver may be confused about where the correct place to turn is, especially in the early morning, as this was, and it might be natural to defer to the car’s superior knowledge of the map and GPS. As a driver, it would be hard to imagine that the car would turn you onto train tracks.
Tesla’s “Full Self Driving” mode requires the driver to pay attention the entire time
This is a BS excuse. You can’t expect most human beings to stay fully alert at all times when they don’t actively need to do anything for most of the drive. There are several studies showing that situational awareness always drops over time in Level 3 and Level 4 self-driving vehicles.
The driver’s only real fault is activating the self-driving in the first place. The attention requirement only exists to protect Tesla’s bank accounts.
People should be doing the same stuff that they’re always supposed to do while driving, but they don’t usually have the attention for.
You know, like watch the road carefully for potholes or objects. Pay attention to the way other people drive so that you can better predict what they’re going to do. Scan up the road for problems that might crop up.
There’s actually a lot to actively do as a driver even beyond the basics.
Just for the record, if your car can’t avoid a pothole, you shouldn’t be calling it any kind of self driving.
I’ll leave the attachment to logicbomb. Humans tune out unless they’re fully engaged. We’re easily distracted, even if our heads are up and eyes open.
And chances are, they weren’t. Because “full self driving” is marketed specifically as… you know… full self driving…
As I said, all the studies I could find show that’s not how the human brain works. For almost everyone, it’s easier to focus on the road when they actively have to drive. And this isn’t even exclusive to self-driving vehicles; there are plenty of accidents on monotonous roads because drivers lose attention when they don’t actively need to do anything for long periods.
Ashland, VA has a recurring problem where non-Tesla drivers turn onto the tracks; there are dozens of cases.
The Virtual Railfan YouTube channel will give you most of them.
Makes you wonder if Tesla used YouTube as a training tool.
It’s supposed to be trained to drive like a human, but I expected it would drive like a human who had immediate access to extremely detailed map and GPS information.
Is this the hyperloop that was promised?
Trains stay winning
Can the driver not simply use the brakes? Why was abandoning the vehicle the best option?
The driver could have taken over at any point.
Leaving the highway? They could have taken over.
Leaving the road? They could have hit the brakes or taken over.
Following the rails? They could have turned away.
I presume the car was following a lane along the railway; they could have stopped it.
These people were just going along with whatever their $60k car suggested.
He realized it was trash and this was a good way to bail.
This problem will be solved when all trains have been replaced with Hyperloop.