
Autopilot and the FSD beta are not the same. The latter is currently available to maybe a couple dozen testers who are clearly very well informed about the capabilities of the system, as well as the changes in each update. If you don't believe it, watch their hours and hours of (frankly boring) videos analyzing the system's behavior in complex situations while still staying on top of safety.

It remains to be seen how far Tesla will trust the general public with this level of autonomy. As you get closer to the uncanny valley where things just appear to work, you run into the trickier situations that truly befuddle humans and machines alike.

NHTSA scrutinizes crashes involving anything close to Autopilot and FSD quite heavily. Aside from one or two incidents that drew complaints, none have risen to the point where the agency had to put its foot down. Admittedly, Tesla were a big bunch of jerks about how they handled the situation, but still, these were isolated incidents with clear misuse on the driver's part.

I agree with you that Musk is overly optimistic (no kidding, he's been saying this would be ready since 2018, and it's unclear if it will be in 2021). But he's also quite well informed of the facts on the ground, and is clearly aiming for the moonshot winner-take-all prize by skipping lidar and high-precision mapping. That might be a gamble, but it need not be an inherently dangerous one, depending on how Tesla handles the super-grey areas around the uncanny valley, where the system appears to work but really isn't worth risking your life on. To some, it's already there, as you can see from idiots sleeping in their Teslas while on Autopilot. But again, outside of a couple of incidents over years and millions of miles, the rate of catastrophic failure (accidents) has been surprisingly low.



> clearly very well informed

I completely underestimated the role of professional safety drivers for autonomous vehicles. I thought it was "just a guy" sitting in the car for good measure, but it turns out that the majority of drivers are not fit for the job even after lengthy training; see e.g. [1] (a great podcast in general).

Also, all autonomous driving companies employ safety drivers - except one.

> NHTSA scrutinizes crashes that involve anything close to Autopilot and FSD quite heavily.

I wouldn't put too much hope in the NHTSA regulating Autopilot. It took a two-year legal battle to get the data behind their 2017 analysis of Autopilot; the data turned out to have been provided entirely by Tesla, and worse, once confounders were removed, it actually showed a higher crash rate for Autopilot [2].

If you take a non-American view of Autopilot, European agencies did scrutinize the crashes more closely and as a result have restricted its use.

If you are interested in the topic of autonomous driving I recommend the Autonocast podcast.

[1] http://www.autonocast.com/blog/2020/10/29/205-why-teslas-ful...

[2] https://arstechnica.com/cars/2019/02/in-2017-the-feds-said-t...



