Again, not minimizing their problems, but... Someone with a degree can more readily become a substitute teacher or coffee shop manager than someone without one. The guy who drives a delivery van is going to have a much harder time finding something else to step down to.
"Or go the Golgafrincham route." We already did if you had bothered to read the history book by Douglas Adams. The B-Ark's destination was Earth.
“If you know you can save at least one person, at least save that one. Save the one in the car,” von Hugo said in the interview. And that's why my future cars will probably be Mercedes. No way I get into any self-driving car where the car has the option of deciding I'm less valuable than some idiot that walked out in the middle of the traffic lane because they are an asshole.
1. Mercedes are for Nazi assholes. No debate on that issue; it's a proven fact. 2. You're assuming it's a binary choice. It's not.
Keep that leftist shit to the Red Room. It pretty much is a binary choice. The car's primary function should be to deliver its passengers safely to their destination. That means the priority is always the passengers. No car should be given the ability to choose between possibly killing its passenger or possibly killing someone outside the car. Yes, the car should take care to avoid accidents like a kid running into the street, but if the car is in a situation where it has to choose between hitting the kid or slamming its passengers into a tree, it should always hit the kid. Always.
You're assuming that the car will be programmed to take the second option. If, however, it isn't permitted to have that option by law, then what? Also, how often do you think that a car will have to make such a choice? Given that with inept meat monkeys at the wheel now, a person is likely to be in only one major accident in their life, I can't imagine that number going up with robots driving.
Why does the choice have to be either killing a kid or killing the passenger? What about the choice between hitting a kid or hitting the elderly woman next to him? What about the choice between hitting a kid or hitting the dog he's walking? What about the choice between hitting a kid or, you know, just safely stopping? You're thinking binary. Driving situations are not binary. I don't think AI has advanced to the point where it can effectively make those kinds of judgments. Hence the very simple fact that Mercedes-Benz is going to open itself up to many, many, many lawsuits if it purposely programs its autonomous vehicles in such a way.
The car should do everything it can to safely stop or avoid an accident without hurting anyone. (Human or animal but yes humans should take priority over animals) The car should never be allowed to decide to hurt the occupants inside of it in order to avoid hurting someone outside of it. The car's first priority should always be to protect the passengers inside of it. Even if that means something outside the car gets hurt.
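The priority ordering that post describes could be sketched in a few lines. This is a purely illustrative toy, assuming a fixed three-tier ranking (passengers over humans outside over animals); every name here is hypothetical, and no manufacturer's actual control logic looks anything like this:

```python
# Toy sketch of the "passengers first" priority policy argued above.
# All names and structures are hypothetical illustrations, not real
# autonomous-vehicle code.
from dataclasses import dataclass

# Lower number = higher protection priority under the stated policy.
PRIORITY = {"passenger": 0, "human_outside": 1, "animal": 2}

@dataclass
class Maneuver:
    action: str       # the evasive action under consideration
    harms: str        # who that action risks harming

def choose_maneuver(options):
    """Pick the maneuver whose potential harm falls on whoever is
    lowest on the protection-priority list."""
    return max(options, key=lambda m: PRIORITY[m.harms]).action

options = [
    Maneuver("swerve_into_tree", harms="passenger"),
    Maneuver("continue_braking", harms="human_outside"),
]
print(choose_maneuver(options))  # continue_braking
```

Of course, the whole point of the counter-arguments in this thread is that real situations don't reduce to a clean list of two options with known outcomes, which is exactly where a lookup table like this falls apart.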
This is about Uber and Lyft, but it will certainly apply to self-driving cars as well. There's a link to the study in the article, BTW.
So riddle me this: if (according to the law and, more importantly, to my auto insurance company) I have to keep both hands on the wheel anyway, then I might as well just be driving the car myself! It's like mounting your TV's remote control on top of the TV itself!
Isn't there a "level of harm" factor, in that steering into a tree or guardrail and therefore possibly injuring the passenger is better than steering into a pedestrian and almost certainly killing them while leaving the passenger unharmed?
Self-driving cars can't see Black people. Lest anyone think this should be surprising, I submit to you this montage of clips from the 2009 TV series "Better Off Ted" that pointed out the problem of tech not being able to see Black people.
Really? How on earth did I get that idea from a headline that says, "Autonomous Cars Can't Recognize Pedestrians With Darker Skin Tones"? It cites work done by the UK government and Georgia Tech. But here's a bunch of links to articles from mainstream publications (as well as tech mags) all pointing out that when it comes to AI and people of color there are huge problems: https://www.wired.com/story/an-ai-run-world-needs-to-better-reflect-people-of-color/ https://time.com/5520558/artificial-intelligence-racial-gender-bias/ https://peopleofcolorintech.com/articles/meet-the-black-women-trying-to-fix-ai/ https://www.usatoday.com/story/tech...orities-needed-facial-recognition/3451932002/ https://www.washingtonpost.com/tech...tion-systems-casts-doubt-their-expanding-use/ https://www.cnet.com/news/why-facial-recognitions-racial-bias-problem-is-so-hard-to-crack/ https://towardsdatascience.com/the-...mmigrants-and-war-zone-civilians-e163cd644fe0 In short, it's a known problem in the tech industry, and they're slowly trying to address it.
The software they reviewed is 5% less likely to detect a dark-skinned person at night. They don't have any idea how well actual vehicles perform.
From my experiences in them over the years, I think Cruises are actually regressing. Not just relative to Waymo, but vs how they were in prior years. It really bodes poorly for the future of L4 and L5 autonomy. It's also possible that there's an uncanny valley of driving ability, and while it actually is getting better, the perception of it is getting worse.
Tech bros always seem to have a rush-in and trash-everything mentality, IME. Back in the 90s, when I worked for Borders/Waldenbooks, they installed an automation system, and instead of slowly ramping the thing up during testing, they just went fucking nuts and tried to run it at close to full capacity. Not only was this a nightmare due to the machine malfunctioning, but we had to gather up all the shit, organize it so that it could go back into the racks (instead of being shipped out), and then do it all again the next day. If they had taken a slightly slower approach (put a thousand or so copies of one book through the machine, see if it got the quantity right, then start adding in different titles to see how it handled that), it wouldn't have cost as much or taken so long to get the machine up and running.