On self-driving, Waymo is playing chess while Tesla plays checkers

Status
Not open for further replies.

aikouka

Ars Scholae Palatinae
1,195
Subscriptor
I got a month of FSD free in my 2021 Model S, and I can say that it is definitely better than when I had FSD on my 2018 Model 3. That said, after using it probably half a dozen times, I... honestly... just don't like it.

Starting with the obvious, it still has a bad habit of driving somewhat badly. I liken it to a relatively new student driver still on their learner's permit. For example, it was making a left turn onto a road that has two lanes on its inlet: one for left turns, one for right turns. Instead of properly turning into the left inlet, it simply turned into the middle. When trying it out in a 2024 Model Y, we were going around a curve that was actually right after the inlet described above. The car just did not take that turn very well, and after the apex, it was on a trajectory that placed it far, far too close to the curb. I pulled it back before it even got to the curb, as I did not trust it. (I had seen a tweet earlier that day about a Tesla owner whose car smacked a curb while in FSD!)

I've also seen instances where it just doesn't turn very well. I was using it on a 35 MPH road and it was taking a decent left turn, but it just kept hugging the inside line tightly. This was actually a problem that I saw a lot on my 2018 Model 3 with FSD v11 where the car would get dangerously close to lane lines during turns.

Now, for the less obvious issue: one of the problems that I have with AI and algorithms in general is that they often lack context or understanding. As a result, their operation can appear robotic and even frustrating, as it doesn't really do what you want. An example that I like to use for this is actually with YouTube. When I go out to lunch by myself, I'll usually pop in my IEMs and watch a video on YouTube; however, I tend to be more selective about what I watch. I look for things that are shorter and that I don't want to watch with someone else. So, I may ignore certain videos not because I'm uninterested, but because I just don't want them right now. Unfortunately, YouTube does not understand that, and if I don't do something like add a video to Watch Later, I may not see it again for a while.

When it comes to FSD, I find that it's like being in a car with someone who is completely new to your area. The car isn't going to understand things like how people love to ride in the left lane around here, so... sometimes you have to do the wrong thing and pass on the right. (It's pretty annoying even without FSD being in the mix.) I guess you could say this is the difference between talking to ChatGPT with history or without. It'd be like having the guy from Memento or Drew Barrymore from 50 First Dates behind the wheel every time. It wouldn't understand that some areas get more congested at certain times. At best, you'd have to hope that traffic data could help, but Tesla's traffic data tends to lag behind Waze. (It's odd, since it's all owned by Google.)

As a side note, one thing that I tend to worry about with it is the really weird edge cases. I've had to dodge ladders and mattresses on the interstate, and honestly... I don't want to find out what FSD would do in that case.

Waymo’s safety drivers document each disengagement to help identify flaws in Waymo’s software, but Tesla customers are unlikely to do that.

Tesla does nag you to provide a voice log whenever you disengage FSD. The recordings can be up to 10 seconds long. Now, that doesn't always guarantee a good log: one time when I was a bit annoyed with its behavior, my response was, "It was being stupid." That was a very accurate report, but not a particularly helpful one.

So if Tesla wants to get into the taxi business, it will need a staff of mobile technicians to rescue stranded Teslas.

Tesla already has mobile technicians. They're actually far more prevalent than their service centers, though they can't do everything a service center can. They should, however, be capable of handling a flat tire.
 
Upvote
30 (30 / 0)

Uncivil Servant

Ars Scholae Palatinae
4,667
Subscriptor
The "humans drive with just their eyes so AVs can too" argument does sound logical at first, but the deeper you dig, the more skeptical I become. Unfortunately, the problems are of a kind that LIDAR doesn't necessarily solve.

In addition to your point about the quality of human sight, I actually wonder how much we use our other senses. If you pass a car going "whump-whump-whump" from a flat tire, or "skreeeeeee!!" from dragging something metal underneath, you're likely to give it a little more space if you think a tire is about to blow or an axle is about to fall off.

An AV's cameras might see smoke billowing from a truck's brakes, but would it know that this is a potentially riskier vehicle?

Additionally, it's not a traditional sense, but at least some humans use memory while driving: "yikes, that red Tahoe is the one I passed 5 minutes ago that cannot stay in its lane. If I pass them again, I'm leaving extra space and doing it quickly!"

Do any AVs "remember" other drivers and/or flag riskier drivers and behave differently around them, or do they just assume every other car on the road is playing by the same set of rules and with the same motivations?
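A toy sketch of what such "memory" could look like (everything here is hypothetical; this reflects no actual AV stack): track IDs from the perception system accumulate a risk score over repeated encounters, and flagged vehicles get extra margin.

```python
# Hypothetical sketch of "remembering" risky drivers. All class names,
# weights, and thresholds are invented for illustration.
from dataclasses import dataclass


@dataclass
class TrackMemory:
    lane_departures: int = 0
    hard_brakes: int = 0

    @property
    def risk_score(self) -> float:
        # Weight lane-keeping failures more heavily than hard braking.
        return 2.0 * self.lane_departures + 1.0 * self.hard_brakes


class DriverMemory:
    """Remembers per-vehicle behavior across encounters within a trip."""

    def __init__(self, risk_threshold: float = 3.0):
        self.tracks: dict[str, TrackMemory] = {}
        self.risk_threshold = risk_threshold

    def observe(self, track_id: str, lane_departure=False, hard_brake=False):
        mem = self.tracks.setdefault(track_id, TrackMemory())
        mem.lane_departures += int(lane_departure)
        mem.hard_brakes += int(hard_brake)

    def extra_margin_m(self, track_id: str) -> float:
        """Extra following distance (meters) for vehicles flagged as risky."""
        mem = self.tracks.get(track_id)
        if mem and mem.risk_score >= self.risk_threshold:
            return 10.0
        return 0.0


mem = DriverMemory()
mem.observe("red_tahoe", lane_departure=True)
mem.observe("red_tahoe", lane_departure=True)
print(mem.extra_margin_m("red_tahoe"))  # 10.0 -- flagged after repeated departures
print(mem.extra_margin_m("never_seen"))  # 0.0 -- unknown vehicles get no extra margin
```

Whether production systems persist anything like this across encounters is, as the question implies, not publicly documented.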

Don't forget the tactile sensation through your feet, fingers, and butt. I've heard F1 drivers talk in interviews about "feeling" their way around a track, but that's a bit different since they're almost sitting on the ground. Still, I wouldn't be surprised if Lewis Hamilton or Fernando Alonso could do Monaco blindfolded.

I'm almost afraid to suggest that, because I think we all know that Fernando will absolutely try it if he hears it.
 
Upvote
20 (20 / 0)

Aurich

Director of Many Things
40,904
Ars Staff
Drive for 10 minutes in Massachusetts, and you'll know why we're all begging for non-human drivers in cars...
Having driven in Massachusetts I both understand your sentiment, and also feel extra skeptical that self-driving cars will cope with it. 😂

I drive in Los Angeles county and feel pretty much the same way.

I was just dropping my daughter off at school, having read this article and half the comments before I left, and was thinking about it as I interacted with people on the road.

The driver trying to make a turn into a funky semi-alley that was blocked by a line of cars at the light for instance. I made eye contact with her and waved to let her know I was going to wait so she could get through. And then wondered how that is supposed to work in a self-driving future.

Then proceeded to navigate the drop-off zone, with parents making dumb turns and decisions, kids crossing the street without any regard for anyone else, and it just feels like we're constantly in situations where your brain has to adapt.

Real world driving isn't a grid of roads and traffic signals.
 
Upvote
32 (32 / 0)

foobacca

Smack-Fu Master, in training
87
Subscriptor++
Uh huh, and drivers always follow the rules of the road. They don't text behind the wheel, drive drunk, or drive under the influence of drugs, to pick just a few of the myriad reasons why humans often suck at driving.

Driving is one "edge-case" after another - that's what makes it an intractable problem. If it were just autonomous cars on the road - no other vehicles, pedestrians, animals etc. - things would be simpler by orders of magnitude.
That also makes me doubt the Tesla approach of training on data from whichever drivers happen to own Tesla cars. Much of that driving will be bad or risky, and I very much doubt anyone is going through and labelling that dataset by hand. So you're training your algorithm on all sorts of driving styles, not just safe driving, which means you won't get safe driving behaviour out of it.
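One mitigation you'd hope for (a sketch; all field names and thresholds are invented, and nothing here reflects what Tesla actually does): score each trip with cheap proxies for risky driving and drop or down-weight it before it enters the training set.

```python
# Hypothetical data-curation step for a fleet-learning pipeline.
# Thresholds and field names are invented for illustration.
def trip_weight(trip: dict) -> float:
    """Return a training weight for a trip; 0.0 excludes it entirely."""
    if trip["hard_brakes_per_100km"] > 5 or trip["speeding_fraction"] > 0.2:
        return 0.0                      # exclude clearly risky driving
    if trip["phone_use_detected"]:
        return 0.0                      # exclude distracted driving
    # Soft down-weight for milder warning signs.
    return 1.0 / (1.0 + trip["lane_departure_warnings"])


fleet = [
    {"hard_brakes_per_100km": 1, "speeding_fraction": 0.0,
     "phone_use_detected": False, "lane_departure_warnings": 0},
    {"hard_brakes_per_100km": 9, "speeding_fraction": 0.3,
     "phone_use_detected": False, "lane_departure_warnings": 2},
]
weights = [trip_weight(t) for t in fleet]
print(weights)  # [1.0, 0.0] -- the risky trip is dropped
```

Even with filtering like this, the proxies only catch gross misbehavior, which is part of the point above: "legal but bad" driving is hard to label automatically.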
 
Upvote
6 (7 / -1)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
Having driven in Massachusetts I both understand your sentiment, and also feel extra skeptical that self-driving cars will cope with it. 😂

I drive in Los Angeles county and feel pretty much the same way.

I was just dropping my daughter off at school, having read this article and half the comments before I left, and was thinking about it as I interacted with people on the road.

The driver trying to make a turn into a funky semi-alley that was blocked by a line of cars at the light for instance. I made eye contact with her and waved to let her know I was going to wait so she could get through. And then wondered how that is supposed to work in a self-driving future.

Then proceeded to navigate the drop-off zone, with parents making dumb turns and decisions, kids crossing the street without any regard for anyone else, and it just feels like we're constantly in situations where your brain has to adapt.

Real world driving isn't a grid of roads and traffic signals.
The "made eye contact" is an excellent point where AVs all fall short - they have no way to communicate with the drivers around them.

There has been some work on having AVs communicate their intentions, but a lot of driving involves getting cues from those around you - a slight nod from a cyclist indicating they've seen you and are aware of your presence, eye contact with a pedestrian about to cross, confirming you've seen them, etc.

A lot of driving is anticipation, and a lot of that anticipation comes from subtle cues that AVs just won't get (including, as mentioned above, audible cues about a vehicle's current state).
 
Upvote
21 (21 / 0)

ZebulonPi

Ars Scholae Palatinae
793
Real world driving isn't a grid of roads and traffic signals.
Exactly. It gets funkier the further north and east you go, as roads aren't laid out well but are paved-over cow paths. If you've ever done Boston, you'll know... :D Plus, what about construction? I can barely determine what they want me to do, where the detours lead, etc.; there's no WAY FSD can handle that without a ton of metadata that it's not going to be getting via vision, or LIDAR for that matter.

Driving is human-based, because humans do it. It's not optimized for machine learning. I think they're coming at this backwards: they need to optimize the CONDITIONS for ML driving, with things like NFC, "smart" roads, and condition monitoring, and then build off THAT, not try to Frankenstein together something based on how humans do it.
 
Upvote
15 (15 / 0)

bigsnake499

Ars Scholae Palatinae
1,079
OK, all of this FSD effort is trying to solve the wrong problem. Even if you don't have to drive the vehicle yourself, you still have a five-seat, very heavy vehicle on the road carrying only one passenger. Traffic is still going to be bad, and you're still going to spend 45 minutes on the road at rush hour. Yeah, you might be able to save a few percentage points of CO2 if it's an EV, but the improvement is not worth the cost and complexity.
 
Upvote
0 (5 / -5)

dr_eeew

Wise, Aged Ars Veteran
116
And “robotaxis” is not some liquid gold industry! It is a pretty niche business that “doesn’t scale” because the taxi business just isn’t that big.
If "robotaxis" had the skills and reliability of a Secret Service protection detail driver, but with night vision, then car ownership would plummet. Why own a car (or at least as many cars as you currently own) when you could just have a car show up within a few minutes, take you where you want to go, then go away? That's the ultimate (and hopefully real before I'm too old to drive safely) goal of robocars. No more, or minimal, car ownership. Every car on the road is a taxi (if owned by a company and hailed) or limo (if owned by an individual). It's like Lyft or Uber, but with an obscenely skilled driver who never talks your ear off and isn't a potential safety risk.
 
Upvote
8 (10 / -2)

Uncivil Servant

Ars Scholae Palatinae
4,667
Subscriptor
Real world driving isn't a grid of roads and traffic signals.

You know who translates abstract intellectual concepts into a workable, feasible solution for the real world?

The policy team.

I'm obviously biased, but I just still can't get over what an immense own-goal that was.
 
Upvote
18 (18 / 0)

Jordan83

Ars Tribunus Angusticlavius
6,098
Do you drive? What other sensors beside your eyes do you use? Sixth sense?

And if you say your ears, I assume you drive with your radio off?

This feels like a deeply unserious response.

What part of your body processes all the information taken in by your eyes, and gives instructions to the rest of your body on how to react?
 
Upvote
30 (30 / 0)

sd70mac

Ars Tribunus Militum
2,600
Subscriptor
Given the latest court cases in the news and here at Ars, it is just as likely crash data gets treated:
a) If there was no driver in the driver's seat, it violates the TOS for the car. It is the user/owner's fault. No Problem Found in Tesla's systems; and
b) If there was a driver in the driver's seat and there was any possibility that the driver was incapacitated, it was driver error. No Problem Found in Tesla's systems; and
c) If there was a driver in the driver's seat and there was no possibility that the driver was incapacitated, the driver was inattentive and it was driver error. No Problem Found in Tesla's systems.
I would expect Tesla to receive a subpoena every time one of their vehicles is involved in a collision, and whenever one of their vehicles is known to have observed one.
 
Upvote
4 (4 / 0)

Uncivil Servant

Ars Scholae Palatinae
4,667
Subscriptor
This feels like a deeply unserious response.

What part of your body processes all the information taken in by your eyes, and gives instructions to the rest of your body on how to react?

There are technically two answers to this, depending on levels of nitric oxide and blood flow...
 
Upvote
2 (5 / -3)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
Now divide that by the number of vehicles of each brand on the road. In San Francisco, Waymo has about 100 vehicles on the road at any one time (out of a fleet of 250). There are about 122,000 electric vehicles in San Francisco, most of them Teslas. A quick estimate from your numbers: Teslas are around 80 times safer (incidents per vehicle) than Waymo. Still an order of magnitude safer if you assume that only 10% of Teslas are engaging FSD at any given time.
Except that those Teslas have humans behind the wheel browning their pants at regular intervals and intervening... so it's not a direct comparison between the two as to their relative capabilities, safety or, really, anything else.
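For what it's worth, the structure of that back-of-envelope math looks like this. The fleet sizes are the quote's own rough figures; the incident counts below are placeholders, not data, chosen only to show how the arithmetic works:

```python
# Reproducing the structure of the quoted per-vehicle estimate.
# Fleet figures come from the quote above; incident counts are made up.
waymo_fleet_active = 100      # Waymo vehicles on the road at once (quoted figure)
sf_teslas = 122_000           # SF EVs, "most of them Teslas" (quoted figure)
fsd_engaged_fraction = 0.10   # the quote's 10%-engagement assumption


def rate_per_vehicle(incidents: float, vehicles: float) -> float:
    return incidents / vehicles


# Placeholder incident counts (NOT data), just to show the shape:
waymo_rate = rate_per_vehicle(10, waymo_fleet_active)
tesla_rate = rate_per_vehicle(15, sf_teslas * fsd_engaged_fraction)
ratio = waymo_rate / tesla_rate
print(round(ratio, 1))  # 81.3 with these made-up counts
```

As the reply notes, "incidents per vehicle" is the wrong denominator anyway when one fleet has a human intervening before incidents happen; per-mile, intervention-adjusted rates would be needed for a real comparison.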
 
Upvote
23 (24 / -1)
Then proceeded to navigate the drop-off zone, with parents making dumb turns and decisions, kids crossing the street without any regard for anyone else, and it just feels like we're constantly in situations where your brain has to adapt.
I don't think we have solved kiddy-drop-off/pick-up for humans. It is a clusterfuck. The best you can do is just nuke the whole thing and walk 3 blocks.

I think robotaxis could improve the situation slightly: by driving less erratically, paying better attention, and not needing to match parents with kids.
 
Upvote
11 (11 / 0)
D

Deleted member 228006

Guest
Now divide that by the number of vehicles of each brand on the road. In San Francisco, Waymo has about 100 vehicles on the road at any one time (out of a fleet of 250). There are about 122,000 electric vehicles in San Francisco, most of them Teslas. A quick estimate from your numbers: Teslas are around 80 times safer (incidents per vehicle) than Waymo. Still an order of magnitude safer if you assume that only 10% of Teslas are engaging FSD at any given time.
How many of those Tesla owners paid for FSD? How many use it on a consistent basis?
 
Upvote
16 (17 / -1)

foobacca

Smack-Fu Master, in training
87
Subscriptor++
If "robotaxis" had the skills and reliability of a Secret Service protection detail driver, but with night vision, then car ownership would plummet. Why own a car (or at least as many cars as you currently own) when you could just have a car show up within a few minutes, take you where you want to go, then go away? That's the ultimate (and hopefully real before I'm too old to drive safely) goal of robocars. No more, or minimal, car ownership. Every car on the road is a taxi (if owned by a company and hailed) or limo (if owned by an individual). It's like Lyft or Uber, but with an obscenely skilled driver who never talks your ear off and isn't a potential safety risk.
Reasons why you might still want a car
  • You have young kids. So you have child seats permanently in your car. And maybe blankets, an organiser on the back of the front seat ... That will be a faff to set up every time a new car turns up.
  • You have back pain so you have something attached to your seat to make you more comfortable.
  • You have a load of stuff for your job permanently in your car.
  • You just love one particular type of car and don't want just whatever turns up
  • Etcetera, lots of things you might want permanently in your car that wouldn't be in a taxi.
I'm generally a bicycle person and would love to have fewer cars parked everywhere, but I have a car with child seats and the rest.

Though I can definitely see the taxi thing working for many people.
 
Upvote
14 (16 / -2)

Aurich

Director of Many Things
40,904
Ars Staff
None of the edge cases in Tesla FSD have anything to do with sensors. The vehicles understand quite well where they are in 3D space, as well as the speed and trajectory of other vehicles and pedestrians.
Counterpoint:

https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/
Two seconds later — just before impact — the Tesla’s forward-facing camera captures this image of the truck.

The car does not warn Banner of the obstacle. “According to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car,” the NTSB crash report says.

Yes, humans drive with their eye cameras, but "a human can do it therefore a computer can do it" isn't a statement that holds up to any level of scrutiny in 2024.
 
Upvote
37 (39 / -2)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
Upvote
15 (15 / 0)

Aurich

Director of Many Things
40,904
Ars Staff
I don't think we have solved kiddy-drop-off/pick-up for humans. It is a clusterfuck. The best you can do is just nuke the whole thing and walk 3 blocks.

I think robotaxis could improve the situation slightly. By behaving less erratically. And by not needing to match parents with kids.
I am not remotely close to being ready to put my kids in robotaxis.

It's been enough to come to terms with my daughter's boyfriend being old enough to drive them places. 😂
 
Upvote
22 (22 / 0)

balthazarr

Ars Tribunus Angusticlavius
6,838
Subscriptor++
Is Waymo what a scalable and rapidly expanding business looks like? Since launching in Phoenix in 2017, they have taken paid fares in two additional cities and are working on a third. I agree Tesla will not be introducing robotaxis any time soon, or ever(?), but Waymo does not have a sustainable business and in a lot of ways gets too much credit. Let's also recognize that remotely operated vehicles come with their own unique set of risks.

Phoenix, AZ: Waymo began testing its autonomous vehicles in Phoenix in 2017. Phoenix became the first city where Waymo offered its fully autonomous ride-hailing service to the public.

San Francisco, CA: Waymo expanded its operations to San Francisco in 2021, offering autonomous rides initially to employees before opening the service to the public.

Los Angeles, CA: In 2024, Waymo started offering rides to the public in Los Angeles. The service began with a free ride phase, transitioning to a paid model.

Austin, TX: Also in 2024, Waymo began its autonomous ride-hailing service in Austin, following a similar phased approach as in Los Angeles.


The risks of remotely operated vehicles:
Latency issues. Delayed communication between the remote operator and the vehicle can lead to accidents and near-misses.

Reliance on internet connectivity. Remotely operated vehicles depend on stable, fast connections; a loss of signal can leave a vehicle malfunctioning or stranded.

Cybersecurity risks. Remote operation creates a potential entry point for threat actors, who could compromise the safety and security of the vehicle, its occupants, and other road users.

Operator fatigue and distraction. Remote operators can get bored and distracted, reducing attention and reaction times and increasing the risk of accidents.

Limited situational awareness. Remote operators viewing the road through screens cannot see the full local environment, traffic conditions, or pedestrian and cyclist behavior, which can lead to mistakes.

Dependence on human intervention. Relying on human operators breaks down when every operator is busy at once.

Scalability limitations. As the number of remotely operated vehicles grows, the need for remote operators and supporting infrastructure grows with it, becoming costly and logistically challenging.
I don't disagree, but I'm glad they're expanding only very, very slowly, rather than Tesla's move fast and kill things approach.

I also suspect the human remote intervention is only a stop-gap measure - will they ever get past that requirement ("ever" meaning in some reasonable timeframe)? 🤷‍♂️ But I trust their go-slow approach much more than Tesla's beta test on all roads everywhere at once approach.
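On the latency/connectivity risk in the quote: the sane design is to make the remote link advisory only, so the vehicle never depends on it. A hypothetical watchdog sketch (all names and timeouts invented; this describes no vendor's actual system):

```python
# Hypothetical heartbeat watchdog: if operator heartbeats stop arriving,
# the vehicle ignores remote commands and plans its own safe stop.
import time


class RemoteLinkWatchdog:
    def __init__(self, timeout_s=2.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        """Called whenever a packet arrives from the remote operator."""
        self.last_heartbeat = time.monotonic()

    def link_ok(self):
        return (time.monotonic() - self.last_heartbeat) < self.timeout_s


def choose_mode(watchdog, remote_command):
    """Only execute remote guidance over a live, recent link."""
    if remote_command and watchdog.link_ok():
        return f"execute:{remote_command}"
    return "autonomous_safe_stop"   # never depend on the link being up


wd = RemoteLinkWatchdog(timeout_s=0.1)
wd.heartbeat()
print(choose_mode(wd, "proceed"))   # execute:proceed
time.sleep(0.2)                     # simulate a dropped link
print(choose_mode(wd, "proceed"))   # autonomous_safe_stop
```

This addresses the stranding failure mode but not the others in the list; a safe stop in a live traffic lane is still a degraded outcome.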
 
Upvote
9 (10 / -1)
I am not remotely close to being ready to put my kids in robotaxis.

It's been enough to come to terms with my daughter's boyfriend being old enough to drive them places. 😂
Well, it makes complete sense to worry about the boyfriend.

Teenagers have about 3x the number of accidents and fatalities per mile. They are unsafe, on average, and it's a bit of a gamble riding with one. They are inexperienced, and impulsive, and they have no track record. (And as you might guess, the rates are a lot worse for boys in particular.)

Robotaxis will be the opposite, even if they don't feel that way right now.
 
Last edited:
Upvote
14 (15 / -1)

brewejon

Ars Scholae Palatinae
1,285
Wayt reported last year that Amazon’s technology—like Waymo’s—wasn’t fully automated. Amazon had more than 1,000 workers in India manually verifying customer selections. Wayt says that in mid-2022, “Just Walk Out required about 700 human reviews per 1,000 sales.”

Amazon aimed to reduce this figure to 20 to 50 human reviews per 1,000 items, but the company “repeatedly missed” its performance targets.

Could Waymo have a similar problem? I don’t know, and unsurprisingly, Waymo declined to comment about the frequency of remote interventions.
Sorry, but until Waymo discloses how often human interventions happen, and of what nature, all this analysis is limited. Their actual progress is essentially hypothetical.

Also, this brings up the old (by now) joke that AI stands for ‘absent Indians’. AI companies are proving time and again that their systems are more mechanical Turk than automated system.
 
Upvote
-8 (3 / -11)
I frequently encounter Waymo vehicles, always checking to see if there is a driver or not that day (sometimes at night you will have a driver when the vehicle is on the way back to the depot). I've noticed that more and more often there is no driver, and I've quickly become impressed by the unprotected left turns the vehicles have been making in the last 9-12 months in urban settings.

I just saw a Waymo swerve out of its lane this morning. I was shocked, because they are usually so timid and precise, but then I realized that another car had looked like it would pull out in front of it; the swerve was the same protective maneuver a good human driver would make.
 
Upvote
15 (15 / 0)

Snark218

Ars Legatus Legionis
36,436
Subscriptor
Do you drive? What other sensors beside your eyes do you use? Sixth sense?

And if you say your ears, I assume you drive with your radio off?
As previously noted and apparently ignored subsequently, a human eye is tied to a human brain, which is an extraordinarily sensitive visual processing system tied to a first-rate capacity for spatial and inferential reasoning, application of judgment, and decision-making. And that human eye has a level of resolution and focusing power that is the envy of many other animal species.

If a system lacks those advantages, and is attempting to accurately navigate a complex environment without the benefit of judgment or reasoning, on camera input from sensors I wouldn't use to take a picture of my junk, you best believe it needs some lidar and ultrasound and some other ways to distinguish objects and obstacles: in conditions of poor visibility, unpredictable behavior by other road users, poorly painted and signed roads, and other situations where a human can use reasoning and prior experience to make the right call.
 
Upvote
11 (12 / -1)

ianmcf

Ars Scholae Palatinae
633
But I predict that when Tesla begins its driverless transition, it will realize that safety requires a Waymo-style incremental rollout.
This seems like a fundamental misunderstanding of the corporation being operated without oversight by Elon Musk.

It hasn't "realized that safety requires" anything. This is why there have been so many NTSB investigations and so much loss of life surrounding this company. Whether it's safe or not is just not a significant consideration for TSLA. Whether it juices the stock price is, and whether the Musk Rat gets to claim he's right, brilliant, essential, regardless of any evidence to the contrary.
 
Upvote
3 (5 / -2)

motytrah

Ars Tribunus Militum
2,942
Subscriptor++
I don’t see how Tesla’s “camera only” approach ever works. They need more sensors which Musk forced them to take out.

And “robotaxis” is not some liquid gold industry! It is a pretty niche business that “doesn’t scale” because the taxi business just isn’t that big.
What's really dumb about it is that he could have picked up a couple of solid-state LIDAR startups on the cheap back when Tesla stock was sky-high.
 
Upvote
0 (2 / -2)

shopsinc

Seniorius Lurkius
15
Subscriptor++
The part that is being ignored in these autonomous vehicle setups is the road infrastructure. I'd label it the backend portion of driving: unconsidered and invisible, except that it provides many sensory cues to the attentive driver. Roads will have to go beyond high-visibility lines and traffic markers, embedded reflectors, and rumble strips. Road construction and other 'temporary' traffic areas have to be considered and signaled as well.

Possibly such things as standardized RFID traffic tags and other signals that computer systems can easily recognize. But that would assume cooperative auto makers and governments that don't consider highway systems and roadways an afterthought.
I'm not sure there is a lot you could safely do to tag road infrastructure. The potential for abuse by bad actors could be enormous: if you can hack the environmental data tags, there's all sorts of mayhem you could create.

I can also see rich communities getting their police to mark the entire area as under construction or a crime scene so no "outside" AVs will come in.

I think a safer, and cheaper, approach might be to have the municipal data available in the cloud and provided by a "trusted" vendor.
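A minimal sketch of that trusted-vendor idea (all names hypothetical; a real deployment would use public-key signatures via a PKI, while the shared-secret HMAC below just keeps the sketch within the standard library): the vehicle only accepts road updates that verify against the vendor's key, which defeats the fake-construction-zone attack described above.

```python
# Hypothetical signed road-data feed. A real system would use asymmetric
# signatures; HMAC-with-shared-secret stands in for illustration only.
import hashlib
import hmac
import json

VENDOR_KEY = b"demo-shared-secret"   # hypothetical; would be real key material


def sign(payload: dict) -> str:
    """Canonicalize the payload and compute the vendor's MAC over it."""
    msg = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(VENDOR_KEY, msg, hashlib.sha256).hexdigest()


def accept_road_update(payload: dict, signature: str) -> bool:
    """Reject unsigned or tampered updates, e.g. a forged 'road closed'."""
    return hmac.compare_digest(sign(payload), signature)


update = {"segment": "elm-st-100block", "status": "construction", "speed_mph": 15}
sig = sign(update)
print(accept_road_update(update, sig))        # True
forged = dict(update, status="closed")
print(accept_road_update(forged, sig))        # False -- tampering detected
```

This handles forgery but not the policy abuse mentioned above: a signed-but-bogus "crime scene" designation from a legitimate authority would still verify, so trust in the vendor matters as much as the crypto.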
 
Upvote
5 (5 / 0)

Rindan

Ars Tribunus Militum
2,239
Subscriptor
I think the solution to the problem you're describing, urban residents needing to move around within that urban environment, already exists. It's called public transit.
Oh? It does? I must have missed the multi-billion dollar commuter line that someone has apparently installed between me and work when I wasn't looking.

Restructuring American cities and towns to rebuild them around public transit might be the better solution, but I unfortunately live in a political reality where that isn't going to happen. Tens of millions of Americans are not going to move and restructure their cities and towns around a shiny new public transit system that connects everything. It's fine to be upset about that reality, as it is in fact upsetting, but that doesn't change the fact that that is the reality we live in.

We are not going to restructure American cities and build a massive public transit network across America that results in most commuters taking rail or a bus to work. We can, however, envision a future where the cars that they take to work are all small electric commuter cars that consume a tiny fraction of the energy a full-sized car does.

I'll take a pretty good solution that we can implement in the near future over a perfect solution that any American political observer with two neurons to rub together can easily predict won't happen anytime soon.
 
Last edited:
Upvote
6 (8 / -2)

linnen

Ars Tribunus Militum
2,815
Subscriptor
None of the edge cases in Tesla FSD have anything to do with sensors. The vehicles understand quite well where they are in 3D space, as well as the speed and trajectory of other vehicles and pedestrians. It's all about decision making, which are the same AI-based issues that Waymo must deal with. These cases are generally only addressable in response to camera-based input, such as understanding human gestures (traffic cops, construction workers).
OODA: Observe, Orient, Decide, Act.
Observe - where we are now. "Is that a painted line? Is that the painted line currently in use? Where does the side of the road start, and can it be used?" That kind of thing.
Orient - even with GPS and inertial tracking, error will still creep into the system. Still, this can be considered as close to solved as any computer problem gets (keeping GIGO in mind).
Decide - again, where we are now. "A being chases an object into the current path. Map a safe path (or a path of minimal damage to everything) to continue, or a safe path to stop."
Act - keeping GIGO edge cases in mind, pretty much solved.
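That breakdown can be sketched as a control loop (all function names are hypothetical stubs; the comments just mark which stages the post identifies as hard versus mature):

```python
# Hypothetical OODA loop for a driving stack, with trivial stubs.
def observe(sensors):
    """Perception: the hard, unsolved stage ("is that the live lane line?")."""
    return {"lane_line": sensors.get("lane_line")}


def orient(observation, world_model):
    """Localization/state estimation: comparatively mature (GPS + inertial)."""
    return {**world_model, **observation}


def decide(state):
    """Planning under uncertainty: the other hard stage."""
    return "continue" if state.get("lane_line") else "safe_stop"


def act(plan):
    """Actuation: largely solved, modulo garbage-in upstream (GIGO)."""
    return plan


def ooda_step(sensors, world_model):
    observation = observe(sensors)
    state = orient(observation, world_model)
    plan = decide(state)
    return act(plan)


print(ooda_step({"lane_line": True}, {}))   # continue
print(ooda_step({}, {}))                    # safe_stop
```

The stubs make the post's point concrete: the loop's plumbing is trivial; all the difficulty lives inside `observe` and `decide`.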
 
Upvote
-1 (0 / -1)