On self-driving, Waymo is playing chess while Tesla plays checkers


powermacguy

Smack-Fu Master, in training
86
I frequently encounter Waymo vehicles, always checking to see if there is a driver or not that day (sometimes at night you will have a driver when the vehicle is on the way back to the depot). I've noticed that more and more often there is no driver, and I've quickly become impressed by the unprotected left turns the vehicles have been making in the last 9-12 months in urban settings.
 
Upvote
263 (264 / -1)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
In one of the recent threads, I commented that Waymo have approached this with orders of magnitude more responsibility, and their use of multiple sensor types means they're not reliant on one modality that is subject to common and very predictable failure modes (eg. heavy inclement weather)... and they're still running into parked cars.

There is zero chance, absolutely ZERO that Tesla releases a safe, reliable robotaxi on 8 August. It's just the usual Musk blather (only this time designed to convince shareholders to again approve the ludicrous compensation package). Remember FSD has been coming "later this year" for over a decade now.
 
Last edited:
Upvote
516 (533 / -17)

Crito

Ars Scholae Palatinae
668
Subscriptor++
From an economics perspective this makes perfect sense: have the low-effort "mundane driving work" performed by computers. At the point where there's an indication of a risky situation (a low-single-digit percentage), either switch to a remote driver or have them supervise and intervene as necessary. Even if your labor cost is twice as expensive because folks need to be computer savvy, and there's significant fixed cost in terms of mandatory wireless and additional hardware, you still win compared to missing the edge case/causing a crash ... and you get more training data in the process.
 
Upvote
73 (81 / -8)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
"This makes sense for other reasons, too. It would give Tesla time to introduce itself to local officials and offer training to local police and fire departments."

I know this isn't a quote, and it's Timothy's words - but, imagine the chutzpah needed to expect local law enforcement and first responders to adjust to your private company's playthings deployed on public roads (before they're ready), and not the other way around (ie. your vehicles need to adjust to them).
 
Upvote
156 (182 / -26)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
From an economics perspective this makes perfect sense: have the low-effort "mundane driving work" performed by computers. At the point where there's an indication of a risky situation (a low-single-digit percentage), either switch to a remote driver or have them supervise and intervene as necessary. Even if your labor cost is twice as expensive because folks need to be computer savvy, and there's significant fixed cost in terms of mandatory wireless and additional hardware, you still win compared to missing the edge case/causing a crash ... and you get more training data in the process.
Remote intervention only works for things like post-crash moving the vehicle out of the way, or where the vehicle is stuck in some situation that it has no reference for and doesn't "know" what to do. It's not a way to avoid crashes and intervene - there's not enough time for that - even a human driver behind the wheel likely doesn't have enough time to intervene most times.
 
Upvote
178 (184 / -6)

Crito

Ars Scholae Palatinae
668
Subscriptor++
I don’t see how Tesla’s “camera only” approach ever works. They need more sensors which Musk forced them to take out.

And “robotaxis” is not some liquid gold industry! It is a pretty niche business that “doesn’t scale” because the taxi business just isn’t that big.

Agree with the first point though a quibble with the second.

Robotaxis scale at the point where they become cheap enough and frictionless enough that it allows semi-urban folks to go from two or three cars to one or two, or urban folks who might have a car for once-a-week trips to drop them altogether. I don't believe we're anywhere near that, but it certainly scales more than the current taxi business.

My back of the envelope math says a taxi driver earns $20/hour and has $0.50/mile in car costs, let's assume 40 miles/hour of driving (which seems high), for $40/hour total expense. If you drop the labor cost there to $1-5/hour and ensure significant reliability ("When I push the button I will get a car within n minutes"), current consumer car behaviors can change.
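Spelling that out (a rough Python sketch using only the made-up figures above, not any real operating data):

```python
# Back-of-the-envelope comparison, using the assumed numbers from this post.
labor_per_hour = 20.0        # human taxi driver wage ($/hour)
car_cost_per_mile = 0.50     # vehicle cost ($/mile)
miles_per_hour = 40          # assumed utilization (probably high)

human_cost = labor_per_hour + car_cost_per_mile * miles_per_hour    # $40/hour
robo_cost_low = 1.0 + car_cost_per_mile * miles_per_hour            # $21/hour
robo_cost_high = 5.0 + car_cost_per_mile * miles_per_hour           # $25/hour

print(f"human-driven: ${human_cost:.0f}/hr")
print(f"robotaxi (remote ops at $1-5/hr): ${robo_cost_low:.0f}-${robo_cost_high:.0f}/hr")
```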
 
Upvote
255 (258 / -3)

solomonrex

Ars Legatus Legionis
13,516
Subscriptor++
The Amazon robostores weren't good at anything, it seems. Hundreds of cameras and they still had that problem rate? They didn't have a good selection or prices. Even with fewer items than usual, they couldn't automate it?

I also don't know why they persisted with that when RFID already exists and does most of the job.

And I think that's telling for Tesla. Even if the stores were trying something that a human literally doesn't do (spy on shoppers from the ceiling with a high accuracy rate), they threw many more cameras at the problem than Tesla and had a simpler, less dangerous task with more scrutiny for tagging/modeling.
 
Upvote
99 (104 / -5)

Tim Lee

Ars Tribunus Militum
1,901
"This makes sense for other reasons, too. It would give Tesla time to introduce itself to local officials and offer training to local police and fire departments."

I know this isn't a quote, and it's Timothy's words - but, imagine the chutzpah needed to expect local law enforcement and first responders to adjust to your private company's playthings deployed on public roads (before they're ready), and not the other way around (ie. your vehicles need to adjust to them).
I don't think it's crazy to expect that new technology will sometimes require first responders to get training. For example, when EVs were new, firefighters needed training on how battery fires differed from the gasoline-powered fires caused by ICE vehicles. Here, police officers need to know how you communicate with a vehicle that doesn't have a driver. Obviously, these companies should work hard to minimize the demands on first responders, but it's impossible to design an AV that works exactly like a human-driven car from the perspective of law enforcement.
 
Upvote
248 (254 / -6)

Tim Lee

Ars Tribunus Militum
1,901
Remote intervention only works for things like post-crash moving the vehicle out of the way, or where the vehicle is stuck in some situation that it has no reference for and doesn't "know" what to do. It's not a way to avoid crashes and intervene - there's not enough time for that - even a human driver behind the wheel likely doesn't have enough time to intervene most times.
It depends how the self-driving system is designed. Waymo's approach is that any time the vehicle thinks there's a risk of a crash it will preemptively come to a stop and wait for remote guidance. If it's sufficiently good at predicting when a crash might happen (which is obviously not trivial) this should be pretty effective at preventing crashes.
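Roughly, the decision logic I'm describing looks like this (a hypothetical sketch, not Waymo's actual software - every name and number here is invented for illustration):

```python
# Hypothetical "stop and ask" policy sketch - not Waymo's actual code.
# All names and the threshold are invented for illustration only.

RISK_THRESHOLD = 0.05  # made-up confidence cutoff

def plan_next_move(scene, risk_estimate, remote_guidance):
    """Keep driving unless the predicted crash risk crosses the threshold."""
    if risk_estimate(scene) > RISK_THRESHOLD:   # "am I at risk of a crash?"
        # Come to a controlled stop first, then ask a human for advice.
        # The remote operator suggests a plan; they don't joystick the car.
        return ("stop_and_wait", remote_guidance(scene))
    return ("continue", None)

# Toy usage: a scene dict and trivial stand-ins for the two callbacks.
print(plan_next_move({"risk": 0.2},
                     lambda s: s["risk"],
                     lambda s: "proceed slowly around obstacle"))
```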
 
Upvote
111 (115 / -4)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
I don't think it's crazy to expect that new technology will sometimes require first responders to get training. For example, when EVs were new, firefighters needed training on how battery fires differed from the gasoline-powered fires caused by ICE vehicles. Here, police officers need to know how you communicate with a vehicle that doesn't have a driver. Obviously, these companies should work hard to minimize the demands on first responders, but it's impossible to design an AV that works exactly like a human-driven car from the perspective of law enforcement.
So, instead of these companies all jumping in feet first, perhaps there should be a standard developed, so first responders and law enforcement and others don't need training from every Tom, Dick and Harry that decides they want to release their vehicles on the roads?

AFAIK, there's no such standard?

EDIT: And it's not just police - firefighters, ambulances, road workers - pretty much anyone who disrupts the "ordinary" flow of traffic.
 
Last edited:
Upvote
32 (54 / -22)

Tim Lee

Ars Tribunus Militum
1,901
I get that Ars loves to crap on Musk (it gets pretty old), but an article like this not even mentioning that Waymo has been under an NHTSA investigation since a week or so ago is a little dishonest
Tesla, Cruise, and Zoox are also under NHTSA investigation. My sense is they are investigating everyone to cover their bases. Why would this be relevant?
 
Upvote
237 (241 / -4)

Tim Lee

Ars Tribunus Militum
1,901
That doesn't make sense - if it can predict a crash is coming, why can't it take steps to avoid the crash? And preemptively stopping may cause a crash (eg. rear ended), causing the very thing they're trying to prevent.

Having said that, it's good that they're being cautious - but if they don't have enough confidence in their systems such that human intervention is required when it thinks a crash is imminent, they don't belong on public roads IMHO. And that's Waymo - who are, by all accounts, lightyears ahead.... I can only shudder when imagining a Tesla robotaxi.
It seems plausible to me that "am I at risk of a crash" would be an easier question to answer than "what's the best set of steps to take to avoid the crash." Anyway I think their results speak for themselves. Waymos have gotten into crashes far less often than human-driven cars on the same roads.
 
Upvote
168 (170 / -2)

Crito

Ars Scholae Palatinae
668
Subscriptor++
Remote intervention only works for things like post-crash moving the vehicle out of the way, or where the vehicle is stuck in some situation that it has no reference for and doesn't "know" what to do. It's not a way to avoid crashes and intervene - there's not enough time for that - even a human driver behind the wheel likely doesn't have enough time to intervene most times.

Agreed it's not an "intervene in the 3 seconds before a crash" approach, but "this driver is currently in a high-risk situation by virtue of the closeness of traffic and speeds involved, let's have a highly-focused human driving remotely instead of a distracted one locally." Very curious if anyone has seen stats re: "what percentage of accidents are preventable"; found some federally-regulated motor carrier rates but they seemed remarkably low (which I'd expect from something carrying tens of thousands of pounds behind it).

Edit: ninja'd by the discussion above with Tim/balthazarr.
 
Upvote
7 (13 / -6)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
It seems plausible to me that "am I at risk of a crash" would be an easier question to answer than "what's the best set of steps to take to avoid the crash." Anyway I think their results speak for themselves. Waymos have gotten into crashes far less often than human-driven cars on the same roads.
Sure, I can see how that could be correct. It just seems like a strange way to go - this approach is probably never going to be suitable for higher-speed driving (like highways or higher-speed major urban routes), but I guess they have to deal with where the technology is at, and it IS good they're being cautious and responsible, unlike others (Uber, Tesla, etc).
 
Upvote
7 (15 / -8)

mobby_6kl

Ars Scholae Palatinae
1,095
That doesn't make sense - if it can predict a crash is coming, why can't it take steps to avoid the crash? And preemptively stopping may cause a crash (eg. rear ended), causing the very thing they're trying to prevent.

Having said that, it's good that they're being cautious - but if they don't have enough confidence in their systems such that human intervention is required when it thinks a crash is imminent, they don't belong on public roads IMHO. And that's Waymo - who are, by all accounts, lightyears ahead.... I can only shudder when imagining a Tesla robotaxi.

It can, it seems. Other than some recent cases where they might've hit some parked cars (haven't looked into those), Waymo seems to at least avoid causing accidents pretty well based on previous safety reviews. I doubt they manage this by just dropping this on human handlers seconds before an impact, so intervention is probably only needed when the car is stuck in a situation it doesn't know how to deal with.
 
Upvote
57 (57 / 0)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
This is not true. Cars behind will maintain an appropriate stopping distance which allows them time to react, as mandated by the highway code.
Uh huh, and drivers always follow the code. They don't text behind the wheel, drive drunk or under the influence of drugs, or do any of the myriad other things that make humans suck at driving.

Driving is one "edge-case" after another - that's what makes it an intractable problem. If it were just autonomous cars on the road - no other vehicles, pedestrians, animals etc. - things would be simpler by orders of magnitude.
 
Upvote
169 (169 / 0)

ip_what

Ars Tribunus Angusticlavius
6,181
There’s a section in here about how Waymo has less data than Tesla, but it’s higher quality because the Waymo safety driver/remote operator documents each disengagement.

I agree with the conclusion, but not necessarily how it got there. Tim assumes Tesla is working with unlabeled disengagement data. I’d assume differently - that every time a Tesla Autopilot disengages, it gets sent up into the cloud and over to India to be labeled. So Tesla probably does have a whole lot more labeled data.

Ok, but how good is Tesla’s labeling? You’d basically have to be able to label the disengagement as accidental or unneeded. Applying that label almost has to be quicker than analyzing a complex traffic scene. So incentives pile up for workers to reinforce the AI system’s confidence that it can handle situations it can’t. The way you fight this is with very good QC. Remind me, does Tesla have a history of good QC?
 
Upvote
94 (100 / -6)

linnen

Ars Tribunus Militum
2,815
Subscriptor
The part that is being ignored in these autonomous vehicle setups is the road infrastructure. I'd call it the backend portion of driving: unconsidered and invisible, yet it provides many sensory cues to the attentive driver. Roads will have to go beyond high-visibility lines and traffic markers, embedded reflectors, and rumble strips. Road construction and other 'temporary' traffic areas have to be considered and signaled as well.

Possibly such things as standardized RFID traffic tags and other signals that computer systems can easily recognize. But that would assume cooperative auto makers and governments that don't consider highway systems and roadways an afterthought.
 
Upvote
38 (45 / -7)

targetnovember

Ars Scholae Palatinae
988
When I read the line "Waymo has remote operators that sometimes provide guidance to its vehicles" I really wanted it to continue with "using the power, low latency, and speed of 5G!" so that some cell company executive can spontaneously climax reading the article. It just sounded like the setup for one of those commercial or marketing scenarios Verizon or AT&T have been talking about forever - 5G was going to be amazing, years ago. Now vehicle connectivity is in an article about something else and barely mentioned.
 
Upvote
48 (50 / -2)

linnen

Ars Tribunus Militum
2,815
Subscriptor
There’s a section in here about how Waymo has less data than Tesla, but it’s higher quality because the Waymo safety driver/remote operator documents each disengagement.

I agree with the conclusion, but not necessarily how it got there. Tim assumes Tesla is working with unlabeled disengagement data. I’d assume differently - that every time a Tesla Autopilot disengages, it gets sent up into the cloud and over to India to be labeled. So Tesla probably does have a whole lot more labeled data.

Ok, but how good is Tesla’s labeling? You’d basically have to be able to label the disengagement as accidental or unneeded. Applying that label almost has to be quicker than analyzing a complex traffic scene. So incentives pile up for workers to reinforce the AI system’s confidence that it can handle situations it can’t. The way you fight this is with very good QC. Remind me, does Tesla have a history of good QC?
Given the latest court cases in the news and here at Ars, it is just as likely crash data gets treated like this:
a) If there was no driver in the driver's seat, it violates the TOS for the car, so it is the user/owner's fault. No Problem Found in Tesla's systems.
b) If there was a driver in the driver's seat and there was any possibility that the driver was incapacitated, it was driver error. No Problem Found in Tesla's systems.
c) If there was a driver in the driver's seat and there was no possibility that the driver was incapacitated, the driver was inattentive and it was driver error. No Problem Found in Tesla's systems.
 
Upvote
82 (86 / -4)
I don't think it's crazy to expect that new technology will sometimes require first responders to get training. For example, when EVs were new, firefighters needed training on how battery fires differed from the gasoline-powered fires caused by ICE vehicles. Here, police officers need to know how you communicate with a vehicle that doesn't have a driver. Obviously, these companies should work hard to minimize the demands on first responders, but it's impossible to design an AV that works exactly like a human-driven car from the perspective of law enforcement.
See, this is where I struggle. It's for the cars to manage the situation more than for first responders to learn how to co-exist with cars. If an officer is dealing with rapidly closing off a road due to a fatal collision that led to serious road hazards, they need to stop cars, make space for incoming emergency vehicles, and do it quickly. For officers to what... learn Tesla Sign Language? If an AV detects emergency lighting on the road, then it knows something is out of the norm. To me it's totally on the Teslas and Waymos of the world to sort this out.
 
Upvote
55 (65 / -10)

Eldorito

Ars Tribunus Angusticlavius
7,929
Subscriptor
I don’t see how Tesla’s “camera only” approach ever works. They need more sensors which Musk forced them to take out.

And “robotaxis” is not some liquid gold industry! It is a pretty niche business that “doesn’t scale” because the taxi business just isn’t that big.

I dunno, Uber's gross revenue was $37b last year. You add up taxis and car rentals and you're looking at a couple of hundred billion. Once you start changing the concept of car ownership to rental it becomes huge.

Of course, there are also general transport industries. Moving things by truck is something like a trillion dollar industry in the US. The cost savings are so ridiculously huge that it'll change markets overnight, which is why there's such a race to be first here.
 
Upvote
61 (64 / -3)