None of the edge cases in Tesla FSD have anything to do with sensors. The vehicles understand quite well where they are in 3D space, as well as the speed and trajectory of other vehicles and pedestrians. The problem is decision making, the same AI-based issues that Waymo has to deal with. These cases are generally only addressable through camera-based input, such as understanding human gestures (traffic cops, construction workers). I don't see how Tesla's "camera only" approach ever works. They need more sensors, which Musk forced them to take out.
And "robotaxis" is not some liquid-gold industry! It's a pretty niche business that doesn't scale, because the taxi market just isn't that big.