Still questioning if Elon Musk is a white hat?
(media.greatawakening.win)
If someone isn't watching the auto-pilot, ready to take manual control when it does something wrong, that's them using it incorrectly; it is not the fault of the product or the creators of the product.
From everything I've seen, dying to Tesla auto-pilot is textbook Darwinism.
Retarded thinking trying to justify a defective and DEADLY feature.
The AI cannot evaluate high-level situations far ahead, make decisions to maximize safety, or plan well in advance. Any driver would slow down, but the AI cannot do that kind of high-level evaluation of a scenario. These systems are imperfect pattern matchers, focused on interpreting the road, not on emergency situations. Suppose the car is barreling into a deadly situation and you get only a few seconds to take control. That is a sure-fire way of killing customers or innocent people.
You see a person with a stopped car, perhaps suicidal or unstable. He is not on the road yet, so you slow down a bit and stay alert, or even change lanes if you're in a bad neighborhood. If he dashes onto the road, you can avoid him easily. The AI? It might just keep cruising because the road ahead is empty; it cannot evaluate the person's body language. Then he dashes into the path of your car. The AI drops out, telling you to take over. It has just handed you a fucked-up situation that it allowed to happen. Even emergency braking is a poor outcome, because the AI let things escalate.
Justify your argument to the innocent policemen who, I believe, were killed while doing their duty by the roadside. In many cases, I believe the Tesla cars did not even slow down. The cars keep going at high speed because the AI CANNOT EVALUATE OFF-NORMAL SCENARIOS, and then it hands control over to you with a few seconds' notice, WHEN IT HAS ALREADY FUCKED UP AND YOU'RE IN DEEP SHIT because the car is still at high speed.
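The "few seconds' notice at high speed" point can be put in rough numbers. This is a back-of-the-envelope sketch with assumed reaction time and braking figures (not Tesla specs): how far the car travels between the moment the autopilot says "take over" and a full stop.

```python
# Rough stopping-distance arithmetic for an autopilot handover scenario.
# All figures are illustrative assumptions, not measured Tesla behavior.

def handover_stop_distance_m(speed_kmh, reaction_s=2.0, decel_ms2=7.0):
    """Distance covered during driver reaction plus hard braking.

    speed_kmh  -- travel speed when the takeover alert fires
    reaction_s -- assumed driver reaction time (hands back on wheel, assess)
    decel_ms2  -- assumed deceleration (~7 m/s^2 is a hard stop on dry road)
    """
    v = speed_kmh / 3.6                      # convert km/h to m/s
    reaction_dist = v * reaction_s           # car keeps cruising while driver reacts
    braking_dist = v * v / (2 * decel_ms2)   # v^2 / (2a), basic kinematics
    return reaction_dist + braking_dist

for speed in (50, 100, 130):
    print(f"{speed} km/h -> {handover_stop_distance_m(speed):.0f} m to stop")
```

At highway speed, even with a hard stop, the car covers well over a hundred meters after the alert, which is the window in which the "deep shit" above unfolds.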
IT IS A DEFECTIVE AND DEADLY FEATURE. Also, FUCK THE NHTSA for enabling manslaughter.
It's not meant to do any of those things in its current state. If you see someone on the side of the road and you're concerned they might do something, you should take manual control immediately. It's not the car's fault if you don't, and instead let it operate in situations it's not currently meant to handle.
The cars are not fully "self driving", they have "auto-pilot". Auto-pilot is not meant to do all the work, it's meant to offload certain tasks while you maintain full control and monitor it.
Maybe in the future, when the tech is improved, it can be fully self-driving, but you can't get there if you don't put the AI out on the road for it to improve (though general improvements in AI will most likely also be needed). But in its current state, it's perfectly safe if the driver is doing what they're supposed to be doing. Nowhere near "defective": it does what it's meant to do and doesn't do what it isn't.
Your irrational hatred and fear of auto-pilot is just that: irrational.
An auto-pilot is fine for aviators. Often you can glide for a while in an emergency. It's not 100% risk-free, of course, but planes are assumed to be flying in empty air and pilots are trained for various emergency scenarios. I don't even think autopilots in planes are linked to radar, so in all likelihood they simply assume empty air.
Not so with road vehicles. The technology is completely different. Why are you comparing apples and eggs? In a street scene there is a lot to interpret. Even if you interpret wrong only once per million hours, there will still be a lot of casualties. And that is just road pattern recognition. By plowing into policemen by the roadside (in China, as reported by ZH), Tesla's autopilot has shown an inability to interpret anything higher-level.
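The "once per million hours is still a lot" claim is simple fleet-scale arithmetic. Here is a minimal sketch; the fleet size and daily usage are assumed round numbers for illustration, not real deployment figures.

```python
# Fleet-scale arithmetic: a tiny per-hour error rate multiplied by an
# enormous number of fleet hours. All inputs are assumed for illustration.

fleet_size = 1_000_000        # assumed number of cars with the feature enabled
hours_per_day = 1.0           # assumed autopilot hours per car per day
error_rate = 1 / 1_000_000    # one dangerous misinterpretation per million hours

fleet_hours_per_year = fleet_size * hours_per_day * 365
expected_errors = fleet_hours_per_year * error_rate

print(f"{fleet_hours_per_year:,.0f} autopilot hours/year -> "
      f"~{expected_errors:.0f} dangerous errors/year")
```

With a million cars averaging one autopilot hour a day, even a one-in-a-million error rate yields roughly one dangerous error per day across the fleet.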
So, really, why are you comparing apples and eggs? Why are you even making such an argument? Maybe because you have Tesla stock? I have actually read plenty of the DARPA Grand Challenge research papers, and I have no doubt a rollout of the tech will get lots of people killed. Nobody has a magic solution. It only sort of works in controlled conditions, for example on limited, low-intensity routes.
Unleash Tesla autopilot on the general populace? Manslaughter 100%.
Bottom line: hordes of people are not going to use the Tesla "autopilot" in the proper way. The very term is misleading in the extreme.
Why, salesmen told prospective buyers that the car could almost drive itself.
Why do you think the two men who recently slammed into a tree died? Both, I believe, were professionals, and one was demonstrating the feature to the other. Did they remember to use it the way you say it should be used? Even those two professionals had no appreciation of the limitations of the technology. So they died.
If Tesla car drivers are trained and disciplined like pilots, then sure, by all means let them try out the beta feature.
But they are not. And that means it's manslaughter.