Anyone who gets into a "self-driving" car and lets the car have complete control on public roads... is a future Darwin Award Winner just waiting for the final conclusion.
Up there with taking COVID vaccine boosters.
It became self-aware, took a look around and decided life wasn't worth it.
It saw a photo of Hillary Clinton.
This was apparently not a fully autonomous vehicle. It was a data gathering test vehicle and did indeed have a driver. But it was an electric car.
It was equipped with Level 2 advanced driving assistance systems (ADAS) that can take over steering, acceleration, and braking in specific scenarios. But, even though Level 2 driver support can control these primary driving tasks, the driver must remain alert and is required to actively supervise the technology at all times.
The ADAS could have been a factor, but they are still investigating. Here's my source: https://www.msn.com/en-us/money/other/bmw-examines-if-fatal-crash-involved-driver-assist-system/ar-AA10Ia2k
Who is to say it didn't simply switch on by itself?
It would depend on how legitimate the log would be.
BMW has too much at stake to admit their car killed the occupant.
Since when did driving a car become so difficult you need a computer to do it for you?
Also, since when did it become so repugnant, you would want to avoid the experience? (I make a point of driving cars that are inherently enjoyable to drive.)
Exactly. I love the older stuff because you actually have to drive it, and I enjoy that connection.
Maybe if we could invent some way of improving people's awareness of what's going on around them, driving would become easy!
I've had my ass saved by good brakes and good handling.
The 2 biggest reasons (IMO): 1) safety standards that cause greater blind spots (mainly wider pillars), and 2) people who text and drive.
In this case the car may have been a factor in this deadly crash. Anyone willing to risk riding in one (with a baby) is like the people risking their lives with an experimental 💉 and giving it to their kids.
https://www.dailymail.co.uk/news/article-11115969/Electric-self-driving-BMW-test-car-veers-oncoming-traffic-leaving-one-dead-Germany.html
Robots have no preservation instinct. They are indifferent to everything. The technology derives from guided missiles, whose entire purpose is to end in an explosion. (I formerly worked on killer robots from outer space.)
The only answer to the question "Why did this accident happen?" is: Why not?
When they target you for termination they will just make the hit via 5G & then call it a system malfunction.
This is just anti-car nuts posting this, self driving cars are 100% safe and effective. We should be buying one every two months.
Now that's some thick and meaty sarcasm.
I really want to see a fleet of self-driving cars push their limits on speed and coordination at a proving ground on a rotary (roundabout) intersection. Just imagine the pile-up when 100 of those things pushing 120 mph suddenly make a slight mistake.
Apparently they test much of the AI in GTA V to get enough input data. Anyone who has played GTA V knows that in certain areas with bad sightlines and narrow roads, oncoming NPC traffic will randomly steer over to your side and crash head-on into you.
It's not a bug, it's a feature. And nobody seems to know why it happens. Perhaps self-driving vehicles simply can't handle the stopping maneuver in situations where the road is narrow, speed is high, and you can't meet oncoming traffic without slowing down and going as far out to the side as possible. Self-driving cars will never work in rural areas. Maybe in cities.