We'll know Tesla is serious about robotaxis when it starts hiring remote operators.
Waymo’s safety drivers document each disengagement to help identify flaws in Waymo’s software, but Tesla customers are unlikely to do that.
So if Tesla wants to get into the taxi business, it will need a staff of mobile technicians to rescue stranded Teslas.
The "humans drive with just their eyes so AVs can too" argument does sound logical at first, but the deeper you go, the more skeptical I become. Unfortunately, the weaknesses are in ways LIDAR doesn't necessarily solve.
In addition to your point about the quality of human sight, I actually wonder how much we use other senses. If you pass a car going "whump-whump-whump" from a flat tire, or "skreeeeeee!!" from dragging something metal underneath, you're likely to give it a little more space if you think a tire is about to blow or an axle is about to fall off.
An AV's cameras might see smoke billowing from a truck's brakes, but would it know that this is a potentially riskier vehicle?
Additionally, it's not a traditional sense but at least some humans will use memory while driving like "yikes, that red Tahoe is the one I passed 5 minutes ago that cannot stay in their lane. If I pass them again I'm leaving extra space and doing it quickly!"
Do any AVs "remember" other drivers and/or flag riskier drivers and behave differently around them, or do they just assume every other car on the road is playing by the same set of rules and with the same motivations?
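Whether any production AV remembers specific vehicles isn't public, but the behavior described above is straightforward to sketch. A toy example (all identifiers and thresholds are hypothetical) that counts erratic-driving events per tracked vehicle and widens the following gap for repeat offenders:

```python
from collections import defaultdict

BASE_GAP_M = 30.0            # nominal following gap in meters (made-up value)
EXTRA_GAP_PER_EVENT_M = 10.0 # extra margin per observed erratic event
RISKY_THRESHOLD = 2          # events before a vehicle is flagged

class DriverMemory:
    """Toy per-vehicle risk memory, keyed by a perception tracker ID."""

    def __init__(self):
        self.events = defaultdict(int)

    def record_event(self, vehicle_id: str) -> None:
        """Call when perception observes erratic behavior (e.g. lane drift)."""
        self.events[vehicle_id] += 1

    def is_risky(self, vehicle_id: str) -> bool:
        return self.events[vehicle_id] >= RISKY_THRESHOLD

    def desired_gap_m(self, vehicle_id: str) -> float:
        """Widen the following gap only for flagged repeat offenders."""
        extra = self.events[vehicle_id] * EXTRA_GAP_PER_EVENT_M
        return BASE_GAP_M + (extra if self.is_risky(vehicle_id) else 0.0)

memory = DriverMemory()
memory.record_event("red_tahoe_42")   # drifted over the line once...
memory.record_event("red_tahoe_42")   # ...and again five minutes later
print(memory.is_risky("red_tahoe_42"))       # → True
print(memory.desired_gap_m("red_tahoe_42"))  # → 50.0
```

The hard part in practice isn't this bookkeeping, of course; it's re-identifying "that red Tahoe" across sightings, which is a perception problem, not a memory one.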
Do you drive? What other sensors beside your eyes do you use? Sixth sense?
And if you say your ears, I assume you drive with your radio off?

Equilibrium is used quite a bit.
Drive for 10 minutes in Massachusetts, and you'll know why we're all begging for non-human drivers in cars...

Having driven in Massachusetts, I both understand your sentiment and also feel extra skeptical that self-driving cars will cope with it.
Uh huh, and drivers always follow the code. They don't text behind the wheel, drive drunk, or drive under the influence of drugs, or any of the myriad other reasons why humans often suck at driving.

That also makes me doubt the Tesla approach of training on the data of a set of drivers who happen to own Tesla cars. Much of that driving will be bad or risky, and I very much doubt anyone is going through and labelling that dataset by hand. So you're training your algorithm on all sorts of driving styles, not just safe driving, and you won't get safe driving behaviour out of your algorithm.
Driving is one "edge-case" after another - that's what makes it an intractable problem. If it were just autonomous cars on the road - no other vehicles, pedestrians, animals etc. - things would be simpler by orders of magnitude.
Having driven in Massachusetts, I both understand your sentiment and also feel extra skeptical that self-driving cars will cope with it.

The "made eye contact" point is an excellent one where AVs all fall short - they have no way to communicate with other drivers around them.
I drive in Los Angeles county and feel pretty much the same way.
I was just dropping my daughter off at school, having read this article and half the comments before I left, and was thinking about it as I interacted with people on the road.
The driver trying to make a turn into a funky semi-alley that was blocked by a line of cars at the light for instance. I made eye contact with her and waved to let her know I was going to wait so she could get through. And then wondered how that is supposed to work in a self-driving future.
Then proceeded to navigate the drop-off zone, with parents making dumb turns and decisions and kids crossing the street without any regard for anyone else, and it just feels like we're constantly in situations where your brain has to adapt.
Real world driving isn't a grid of roads and traffic signals.
Real world driving isn't a grid of roads and traffic signals.

Exactly. It gets funkier the further north and east you go, as roads aren't laid out well, but are paved-over cow paths. If you've ever done Boston, you'll know...
Counter-Point: beginner drivers. Maybe AVs need to have a sign similar to those used by driver's schools for other drivers to see.

MB is using turquoise lights. Seems like a good idea.
And "robotaxis" is not some liquid gold industry! It is a pretty niche business that "doesn't scale" because the taxi business just isn't that big.

If "robotaxis" had the skills and reliability of a Secret Service protection detail driver, but with night vision, then car ownership would plummet. Why own a car (or at least as many cars as you currently own) when you could just have a car show up within a few minutes, take you where you want to go, then go away? That's the ultimate (and hopefully real before I'm too old to drive safely) goal of robocars. No more, or minimal, car ownership. Every car on the road is a taxi (if owned by a company and hailed) or a limo (if owned by an individual). It's like Lyft or Uber, but with an obscenely skilled driver who never talks your ear off and isn't a potential safety risk.
I would expect that Tesla receives a subpoena every time one of their vehicles is involved in a collision, and whenever one of their vehicles is known to have observed one.

Given the latest court cases in the news and here at Ars, it is just as likely crash data gets treated:
a) If there was no driver in the driver's seat, it violates the TOS for the car. It is the user/owner's fault. No Problem Found in Tesla's systems, and
b) If there was a driver in the driver's seat and there was any possibility that the driver was incapacitated, it was driver error. No Problem Found in Tesla's systems, and
c) If there was a driver in the driver's seat and there was no possibility that the driver was incapacitated, the driver was inattentive and it was driver error. No Problem Found in Tesla's systems.
This feels like a deeply unserious response.
What part of your body processes all the information taken in by your eyes, and gives instructions to the rest of your body on how to react?
Now divide that by the number of vehicles of each brand on the road. In San Francisco, Waymo has about 100 vehicles on the road at any one time (out of a fleet of 250). There are about 122,000 electric vehicles in San Francisco, most of them Teslas. Quick estimate from your numbers: Teslas are around 80 times safer (incidents per vehicle) than Waymo. Still an order of magnitude safer if you assume that only 10% of Teslas are engaging FSD at any given time.

Except that those Teslas have humans behind the wheel browning their pants at regular intervals and intervening... so it's not a direct comparison between the two as to their relative capabilities, safety, or, really, anything else.
Remember FSD has been coming "later this year" for over a decade now.

It is bad enough that exaggeration is not necessary. The "later this year" lies didn't start until 2017:
https://en.wikipedia.org/wiki/Criticism_of_Tesla,_Inc.#FSD_Predictions
Then proceeded to navigate the drop-off zone, with parents making dumb turns and decisions, kids crossing the street without any regard for anyone else, and it just feels like we're constantly in situations where your brain has to adapt.

I don't think we have solved kiddy drop-off/pick-up for humans. It is a clusterfuck. The best you can do is just nuke the whole thing and walk 3 blocks.
Now divide that by the number of vehicles of each brand on the road. In San Francisco, Waymo has about 100 vehicles on the road at any one time (out of a fleet of 250). There are about 122,000 electric vehicles in San Francisco, most of them Teslas. Quick estimate from your numbers: Teslas are around 80 times safer (incidents per vehicle) than Waymo. Still an order of magnitude safer if you assume that only 10% of Teslas are engaging FSD at any given time.

How many of those Tesla owners paid for FSD? How many use it on a consistent basis?
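Since the per-vehicle estimate above hinges entirely on a guess about how many Teslas are actually running FSD, it may help to see how much that one assumption moves the result. A rough sketch (the incident count of 10 is a pure placeholder, since the source figures aren't quoted here; the vehicle counts come from the comment above):

```python
def incidents_per_active_vehicle(incidents: float, vehicles: float,
                                 active_fraction: float = 1.0) -> float:
    """Normalize an incident count by the vehicles actually using the system."""
    return incidents / (vehicles * active_fraction)

# Placeholder incident count (10); the Waymo vehicle count is from the comment above.
waymo_rate = incidents_per_active_vehicle(incidents=10, vehicles=100)

# The same placeholder count against all ~122,000 SF EVs, then against only the
# assumed 10% running FSD at a given time: the denominator shrinks tenfold,
# so the per-vehicle rate grows tenfold.
tesla_rate_all = incidents_per_active_vehicle(incidents=10, vehicles=122_000)
tesla_rate_fsd = incidents_per_active_vehicle(incidents=10, vehicles=122_000,
                                              active_fraction=0.10)
print(round(tesla_rate_fsd / tesla_rate_all, 6))  # → 10.0
```

The point is only that the engagement-rate assumption by itself swings the comparison by a factor of ten; without real FSD usage data, the "80 times safer" figure is soft.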
If "robotaxis" had the skills and reliability of a Secret Service protection detail driver, but with night vision, then car ownership would plummet. Why own a car (or at least as many cars as you currently own) when you could just have a car show up within a few minutes, take you where you want to go, then go away? That's the ultimate (and hopefully real before I'm too old to drive safely) goal of robocars. No more, or minimal, car ownership. Every car on the road is a taxi (if owned by a company and hailed) or a limo (if owned by an individual). It's like Lyft or Uber, but with an obscenely skilled driver who never talks your ear off and isn't a potential safety risk.

Reasons why you might still want a car:
None of the edge cases in Tesla FSD have anything to do with sensors. The vehicles understand quite well where they are in 3D space, as well as the speed and trajectory of other vehicles and pedestrians.

Counterpoint:
Two seconds later — just before impact — the Tesla’s forward-facing camera captures this image of the truck.
The car does not warn Banner of the obstacle. “According to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car,” the NTSB crash report says.
It is bad enough that exaggeration is not necessary. The "later this year" lies didn't start until 2017:
https://en.wikipedia.org/wiki/Criticism_of_Tesla,_Inc.#FSD_Predictions

Meh, in Oct 2014 he claimed that the next year a Tesla would be capable of self-driving 90% of the miles driven... close enough, but point taken.
I don't think we have solved kiddy drop-off/pick-up for humans. It is a clusterfuck. The best you can do is just nuke the whole thing and walk 3 blocks.

I am not remotely close to being ready to put my kids in robotaxis.
I think robotaxis could improve the situation slightly, by acting less erratically and by not needing to match parents with kids.
I don't disagree, but I'm glad they're expanding only very, very slowly, rather than Tesla's "move fast and kill things" approach.

Is Waymo what a scalable and rapidly expanding business looks like? Since being introduced in Phoenix in 2017, they have taken paid fares in 2 additional cities and are working on a third. I agree Tesla will not be introducing robotaxis any time soon, or ever(?), but Waymo does not have a sustainable business and in a lot of ways gets too much credit. Let's also recognize that remotely operated vehicles come with their own unique set of risks.
Phoenix, AZ: Waymo began testing its autonomous vehicles in Phoenix in 2017. Phoenix became the first city where Waymo offered its fully autonomous ride-hailing service to the public.
San Francisco, CA: Waymo expanded its operations to San Francisco in 2021, offering autonomous rides initially to employees before opening the service to the public.
Los Angeles, CA: In 2024, Waymo started offering rides to the public in Los Angeles. The service began with a free ride phase, transitioning to a paid model.
Austin, TX: Also in 2024, Waymo began its autonomous ride-hailing service in Austin, following a similar phased approach as in Los Angeles.
The risks of remotely operated vehicles:
Latency issues. Delayed communication between the remote operator and the vehicle can lead to accidents and near misses.
Reliance on internet connectivity. Remotely operated vehicles depend on stable and fast internet connections. Loss of signal or connectivity issues can leave a vehicle malfunctioning or stranded.
Cybersecurity risks. Remote operation creates a potential entry point for threat actors, who could compromise the safety and security of the vehicle, its occupants, and other road users.
Operator fatigue and distraction. Remote operators can experience boredom and distraction, leading to reduced attention and slower reaction times and increasing the risk of accidents.
Limited situational awareness. Remote operators viewing the road through screens cannot see the full local environment, traffic conditions, or pedestrian and cyclist behavior, which can lead to mistakes and accidents.
Dependence on human intervention. Relying on human operators doesn't work when all operators are currently engaged and busy.
Scalability limitations. As the number of remotely operated vehicles increases, the need for remote operators and supporting infrastructure grows in step, which becomes costly and logistically challenging.
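To put a number on the latency item above: the distance a car covers between an event and a remote operator's correction taking effect is just speed times round-trip delay. A quick sketch (the latency values are illustrative assumptions, not measurements from any deployed system):

```python
def blind_distance_m(speed_kmh: float, round_trip_latency_s: float) -> float:
    """Distance traveled between an event and the operator's command taking effect."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * round_trip_latency_s

# Illustrative round-trip latencies, from a good link (~50 ms) up to a
# congested cellular connection (~500 ms), at highway speed (100 km/h).
for latency_s in (0.05, 0.25, 0.50):
    print(f"{latency_s * 1000:>4.0f} ms -> {blind_distance_m(100, latency_s):5.1f} m traveled")
```

At the pessimistic end of these assumptions, that's roughly three car lengths covered before any remote correction can land, which is why remote operation is usually pitched as high-level guidance ("route around the obstruction") rather than real-time steering.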
I am not remotely close to being ready to put my kids in robotaxis. It's been enough to come to terms with my daughter's boyfriend being old enough to drive them places.

Well, it makes complete sense to worry about the boyfriend.
Sorry, but until Waymo discloses how often human intervention happens and of what nature, all this analysis is limited. Their actual progress is essentially hypothetical.

Wayt reported last year that Amazon’s technology—like Waymo’s—wasn’t fully automated. Amazon had more than 1,000 workers in India manually verifying customer selections. Wayt says that in mid-2022, “Just Walk Out required about 700 human reviews per 1,000 sales.”
Amazon aimed to reduce this figure to 20 to 50 human reviews per 1,000 items, but the company “repeatedly missed” its performance targets.
Could Waymo have a similar problem? I don’t know, and unsurprisingly, Waymo declined to comment about the frequency of remote interventions.
I frequently encounter Waymo vehicles, always checking to see if there is a driver or not that day (sometimes at night you will have a driver when the vehicle is on the way back to the depot). I've noticed that more and more often there is no driver, and I've quickly become impressed by the unprotected left turns the vehicles have been making in the last 9-12 months in urban settings.
Do you drive? What other sensors beside your eyes do you use? Sixth sense?
And if you say your ears, I assume you drive with your radio off?

As previously noted and apparently ignored subsequently, a human eye is tied to a human brain: an extraordinarily sensitive visual processing system tied to a first-rate capacity for spatial and inferential reasoning, application of judgment, and decision-making. And that human eye has a level of resolution and focusing power that is the envy of many other animal species.
But I predict that when Tesla begins its driverless transition, it will realize that safety requires a Waymo-style incremental rollout.

This seems like a fundamental misunderstanding of the corporation being operated without oversight by Elon Musk.
I don’t see how Tesla’s “camera only” approach ever works. They need more sensors, which Musk forced them to take out.

What's really dumb about it is he could have picked up a couple of solid-state LIDAR startups on the cheap back when Tesla stock was sky-high.
I'm not sure there is a lot you could safely do to tag road infrastructure. The possible issues from bad actors could be enormous. If you can hack the environmental data tags, there is all sorts of mayhem you could create.

The part that is being ignored in these autonomous vehicle setups is the road infrastructure. I'd label it as the backend portion of driving, unconsidered and invisible. Except that it provides many sensory cues to the attentive driver. Roads will have to go beyond high-visibility lines and traffic markers, embedded reflectors, and rumble strips. Road construction and other 'temporary' traffic areas have to be considered and signaled as well.
Possibly such things as standardized RFID traffic tags and other signals that computer systems can easily recognize. But that would assume cooperative auto makers and governments that don't consider highway systems and roadways an afterthought.
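For what it's worth, the usual mitigation for the hacked-tag worry is to authenticate tags rather than trust raw payloads. A minimal sketch using an HMAC; the `AUTHORITY_KEY`, the tag format, and the payload fields are all invented for illustration, and a real deployment would also need key distribution, revocation, and replay protection:

```python
import hashlib
import hmac

AUTHORITY_KEY = b"road-authority-secret"  # hypothetical key held by the road authority

def sign_tag(payload: bytes) -> bytes:
    """Authority side: append an HMAC so vehicles can verify the tag's origin."""
    mac = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest().encode()
    return payload + b"|" + mac

def verify_tag(tag: bytes) -> bool:
    """Vehicle side: accept only tags whose MAC checks out."""
    payload, _, mac = tag.rpartition(b"|")
    expected = hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(mac, expected)

genuine = sign_tag(b"SPEED_LIMIT=50;ZONE=construction")
forged = b"SPEED_LIMIT=120;ZONE=none|" + b"0" * 64  # attacker-planted tag

print(verify_tag(genuine))  # → True
print(verify_tag(forged))   # → False
```

Signing stops forged tags but not the larger problem the comment above raises: a bad actor can still move, hide, or replay legitimately signed tags, so vehicles could never treat tags as more than one advisory input.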
I think the solution to the problem you're describing, urban residents needing to move around within that urban environment, already exists. It's called public transit.

Oh? It does? I must have missed the multi-billion-dollar commuter line that someone has apparently installed between me and work when I wasn't looking.
None of the edge cases in Tesla FSD have anything to do with sensors. The vehicles understand quite well where they are in 3D space, as well as the speed and trajectory of other vehicles and pedestrians. It's all about decision making, which involves the same AI-based issues that Waymo must deal with. These cases are generally only addressable in response to camera-based input, such as understanding human gestures (traffic cops, construction workers).

OODA: Observe, Orient, Decide, Act.