
So out of 10,000 tests, it’s okay if they fail on 100 of them?


As nice as it may be to think that humans are perfect, it's not like they'd score 100% on this level of testing either.


Automated driving systems will have to clear a far higher bar than human drivers. People will get even more upset about self-driving tech causing injuries/wrecks/deaths/endangerment than about what human drivers cause.

Long after self-driving systems are superior to human drivers on average, the headlines will still scream about humans being killed by self-driving tech. The sensationalism will still sell and people will still be very outraged about it.

The expectation will be zero mistakes. Anything short of that will always draw a hyper-emotional negative response, which will invite political/regulatory responses.


> Long after self-driving systems are superior to human drivers on average

For starters, that's not the correct metric. Self driving systems have to at least surpass the median driver, not the average (mean). Auto-related fatality stats are heavily skewed by a small subset of drivers who engage in very risky habits.
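The mean-vs-median point can be made concrete with a toy distribution (the numbers here are invented purely for illustration, not real crash statistics): a small high-risk minority pulls the mean well above what a typical driver actually faces.

```python
import statistics

# Hypothetical annual crash-risk scores (arbitrary units): most drivers
# are low-risk, but a small reckless subset skews the mean upward.
typical_drivers = [1.0] * 95   # the ordinary majority
risky_drivers = [20.0] * 5     # drunk/reckless minority

risks = typical_drivers + risky_drivers
print(statistics.mean(risks))    # 1.95 -- inflated by the risky tail
print(statistics.median(risks))  # 1.0  -- what a typical driver faces
```

A self-driving system that merely beats the mean (1.95) could still be roughly twice as dangerous as the median driver (1.0), which is the comparison that matters to a careful driver deciding whether to switch.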


Why should they have to pass the median?

> Auto-related fatality stats are heavily skewed by a small subset of drivers who engage in very risky habits.

Right and it would be great if those people used self driving cars instead.


> Why should they have to pass the median?

Because you have to convince people like me to buy a self-driving car, and as long as that car is more likely to get me killed than I am, my family will remain in a car that I drive. I do not drive drunk, I avoid driving in inclement weather when not required, or at night, or when I'm really tired. I don't race, I don't road rage, I am a very defensive driver. I have not had an at-fault accident ever (in 30 years and counting since I got my license) and the only accidents I've ever been in at all were minor fender-benders.

So convince me why I should endanger myself so that you can have an unsafe computer driven auto on the road?

> Right and it would be great if those people used self driving cars instead.

So make a self-driving car for them. You will need to subsidize it, since these types of drivers are more likely than not unable to afford a fancy new toy. When the technology can finally cross the median point, then we can talk again about regular, good drivers hanging up their keys.


>People will get even more upset about self-driving tech causing injuries/wrecks/deaths/endangerment vs what human drivers cause.

Will they?

I mean, for example, tobacco companies lied, and the truth we know today is that smoking is very, very detrimental to smokers' health. It's also detrimental to non-smokers via secondhand smoke and secondary effects like cigarette butt litter. It doesn't even provide any solid utility like transportation does; it just feels good.

Not only do people still smoke today, people _start_ smoking today given all the information we have.

So when I see behavior like that, I'm not confident that people won't want FSD just because it's 'dangerous'.


You're 100% correct. People will want FSD for themselves for sure. That won't stop them from blaming the tech companies when they read articles about the cars killing people. Ralph Nader's _Unsafe At Any Speed_ tanked Corvair sales after publication, although his critiques arguably applied to other cars more than the Corvair. The sales of other, similar contemporary cars weren't affected at all.

FSD will be incredibly convenient, which means humans will always be motivated to come up with a reason, valid or otherwise, that justifies their own use of the tech while letting them condemn others for mishaps incurred doing the exact same thing.

"They didn't maintain it correctly." "They didn't listen to the warnings." "They bought the wrong brand." "They weren't current on software updates."


I doubt it; the more common self-driving deaths become, the less newsworthy they will be.


In fact, we can even subject human drivers to the same tests and compare the results.


> As nice as it may be to think that humans are perfect, it's not like they'd score 100% on this level of testing either.

Someone posted upthread that current fatalities are at something like 1.5 per 100 million miles.

Humans are currently ahead in the safety stats game.


Maybe have variable points per test and a minimum passing point total, so that an important test could fail you on its own.
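That scheme could be sketched roughly as follows (all names and numbers here are illustrative assumptions, not a proposed standard): each test carries a point value, and tests flagged as critical fail the whole run regardless of the total.

```python
# Sketch of a weighted pass/fail scheme: each test is a
# (points, passed, critical) tuple. A failed critical test fails
# the run outright; otherwise the run passes if the summed points
# of passing tests meet the threshold.

def evaluate(results, passing_total):
    score = 0
    for points, passed, critical in results:
        if critical and not passed:
            return False  # one important test can fail you on its own
        if passed:
            score += points
    return score >= passing_total

# Example: a single critical failure sinks an otherwise high score.
suite = [(50, True, False), (50, True, False), (10, False, True)]
print(evaluate(suite, 80))  # False
```

The design choice is that criticality short-circuits before any point arithmetic, so no amount of routine passes can paper over, say, a failure to stop for a pedestrian.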


Yeah we can quibble over the details. The key aspect is adversaries.



