Hacker News

Tesla’s FSD is so terrifyingly bad at routine tasks (I used it for six months before giving up) that it’s natural to assume whatever closed track testing they did was ineffective. Or perhaps they did a lot, I don’t know—but it doesn’t feel like many of those lessons learned made it into the “production” system.


I really like AP in general, but for FSD Beta, I tend to agree. I've seen enough mistakes from even the really careful YouTubers that I don't understand why it's still in the field.

And those are the mistakes that they were willing to show! To be clear, I mean situations where the driver needed to take over but either didn't, or didn't in sufficient time to avoid an illegal or dangerous maneuver.


From what I can tell of people posting FSD videos on YouTube, they are actively seeking adversarial conditions with a desire to show where it fails. I’m sure there are some YouTubers that are trying to sugarcoat FSD, but I haven’t seen any.


I partially agree. I mean, I find Dirty Tesla's videos to be pretty fair.

Having said that, he's also been really clear that the video doesn't always make it obvious just how many aspects of FSD are just plain weird. Even when it's not actively failing, it moves in odd ways that are uncomfortable.

He's also said that earlier versions resulted in curbed wheels.


Yep, my mate curbed his front passenger wheel that way. But I dunno how you fix that without LIDAR sensors mapping the terrain around the vehicle.


Agreed, that's one way. The other way is multiple downward facing cameras with stereoscopic vision of the area immediately around the car. Mobileye demonstrated a system like that and it worked really well.

Tesla's cameras have huge blind spots near the car and aren't stereoscopic in all directions. They had ultrasonic sensors at one time, but those had blind spots and major limitations too.
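To illustrate why stereoscopic coverage matters for judging distance to nearby obstacles like curbs: two horizontally offset cameras let you recover depth from the pixel shift (disparity) of a point between the two images. This is a generic textbook sketch, not any vendor's actual pipeline, and all parameter values are made up for the example.

```python
# Minimal depth-from-disparity sketch for a rectified stereo camera pair.
# Relation: depth = focal_length * baseline / disparity.
# The numbers below are illustrative, not real camera parameters.

def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth (metres) of a point seen by two horizontally offset cameras.

    focal_length_px: camera focal length, in pixels
    baseline_m:      distance between the two camera centres, in metres
    disparity_px:    horizontal pixel shift of the point between the images
    """
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: point at infinity or bad match")
    return focal_length_px * baseline_m / disparity_px

# A nearby obstacle (e.g. a curb) produces a large disparity,
# hence a short, well-resolved depth estimate:
print(stereo_depth(focal_length_px=1000.0, baseline_m=0.3, disparity_px=600.0))  # 0.5
```

The same geometry also shows the limitation: depth error grows as disparity shrinks, so stereo works best exactly where curb detection needs it, close to the car.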




