Doubling the voltage: What 800 V architecture really changes in EVs

sword_9mm

Ars Legatus Legionis
25,723
Subscriptor
After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?

Price?

That's what I gleaned from it.

I want infinityV. I want to just pass by the charger at like 10 mph and have a full battery by the time the car fully passes. ;)
 
Upvote
112 (115 / -3)
Wasn't there at least one non-high-end EV that used relays to switch two 400V batteries from parallel (normal operation) to series for 800V charging? Or maybe it was the other way around, which appears far more common. Either way, I'd suspect that a cheap route to fast charging could be an early step, especially if the cost wasn't too high and 800V+ chargers were available.
 
Upvote
23 (23 / 0)
Dr Gitlin
Not only did Matthew mention that in this article, I even included a link to a review of just such an EV in that sentence. "Non-high end" is debatable, since I only really know of that feature on the GM EVs with 200+ kWh battery packs, like the Silverado EV and the Escalade IQ.
Upvote
23 (23 / 0)

numerobis

Ars Tribunus Angusticlavius
50,232
Subscriptor
Since the North American standard has coalesced around the Tesla connector and several other manufacturers are connecting to the Supercharger network, what effect does that have on the shift to 800V?
The standard formerly known as NACS is up to 1000 V.

The vast majority of chargers that we need for a ~100% EV market remain to be built for the first time. The existence of some infrastructure to handle the first ~1% of cars isn't going to be much of an anchor. As cars that can handle 800V come out, the infrastructure for them will start being interesting to build.
 
Upvote
90 (92 / -2)

SetsChaos

Wise, Aged Ars Veteran
133
Not mentioned in the article but worth mentioning: the GM truck EVs use both, sort of. As I understand it, they operate at 400V but can charge at 800V. When hooked up to a 350kW charger, they link their batteries in series and thus can charge at 800V. I still think going full 800V is better for the reasons stated in the article, but it's an interesting transitional solution.
 
Upvote
21 (28 / -7)

t_newt

Ars Praefectus
3,235
Subscriptor++
Vehicles with 800V (or around 800V) batteries cannot be charged directly from 400V chargers. For reference, almost all the Tesla Superchargers are 400V.

So how do the vehicles handle using 400V chargers? Two ways:
1. They include a DC-DC boost stage (sometimes using the electric motor and its inverter as part of the conversion circuit) to up-convert the 400V to 800V.

2. They split the battery connection in half. Normally the two halves are connected in series, making an 800V battery. When charging at 400V, they connect them in parallel, making a 400V battery.

Most 800V vehicles use method #1, which is unfortunate, because the boost stage usually has a low current capability: a car that can charge at 350kW on an 800V charger may manage only a very slow 50kW on a 400V charger. (Some cars have gotten better and can charge at up to 150kW, but that is still well under half of what they are capable of.) So, for example, Hyundai drivers often avoid the more plentiful Tesla Superchargers.

Cars that use method 2 include the Cybertruck, the GM Hummer EV, and several (but not all) Porsche vehicles.
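To put rough numbers on why method #1 hurts (every current limit below is a made-up figure, purely to show the arithmetic):

```python
# Hypothetical current limits, for illustration only: the on-board boost
# path caps current into the pack, while the split-pack path is limited
# mainly by the charger cable.

PACK_VOLTAGE = 800  # nominal "800V" pack

def method1_boost_kw(booster_amp_limit=125):
    """Method #1: up-convert 400V to 800V on board. Power into the
    pack is capped by the boost stage's current limit."""
    return PACK_VOLTAGE * booster_amp_limit / 1000

def method2_split_kw(charger_volts=400, cable_amp_limit=500):
    """Method #2: reconfigure the pack as two 400V halves in parallel
    and take the charger's voltage directly at full cable current."""
    return charger_volts * cable_amp_limit / 1000

print(method1_boost_kw())  # 100.0 kW -- bottlenecked by the boost stage
print(method2_split_kw())  # 200.0 kW -- limited only by the cable
```

Same 400V charger, double the power, purely because the split pack skips the current-limited conversion stage.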
 
Upvote
109 (109 / 0)

Dan Homerick

Ars Praefectus
5,469
Subscriptor++
After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?
I expect it had to do with supply chain considerations. A bit of googling suggests that datacenters were adopting 380V DC in the early 2010s, which suggests a mature ecosystem of ~400V DC-rated parts. The Tesla Roadster used a 375V pack, and came out in 2008.
 
Upvote
65 (66 / -1)

afidel

Ars Legatus Legionis
18,164
Subscriptor
After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?
Cost. Power electronics that can handle >500V were rare and hence expensive. That's actually why new 400V architectures are still coming out today: while the auto industry is fairly large, it's not THAT large in the scheme of the total global economy.
 
Upvote
78 (78 / 0)

TenacityOverAptitude

Ars Centurion
201
Subscriptor++
All the hype about 800v charging is pertinent only when you're paying through the nose for fast DC charging, e.g., when on a long road trip, or if you don't even have 120v charging at home.
(Even our small 20A x 240v charger restores our ~70kWh Niro EV battery overnight.)

If your trips are not so much like those of a transcontinental trombone transporter, 400v delays you a few hours per year, in the worst case. Compare that to the yearly cumulative time to find a decently priced gas station and fill up.
 
Upvote
48 (63 / -15)

jerkyjones

Seniorius Lurkius
15
Subscriptor++
I hadn't realized the economics of the hardware were still a big enough difference between 400/800 to matter.

I suppose Ford going 400V for their UEV platform works for potential EVs hitting the market in the next five years or so. It seems it would make sense for that platform to be more modular and support iterative improvement over time. Hopefully, the drive to software-defined vehicles drives similar approaches to hardware.
 
Upvote
-8 (0 / -8)

brett_x

Ars Centurion
235
Subscriptor
This was a great article. I learned some of this from a video by Aging Wheels and Technology Connections, in which they used a Silverado EV to tow a vehicle halfway across the country. Spoiler: along the way, they lost the ability to do 800V charging due to a bug/defect in the system.
The video was interesting to me because I haven't actually used a fast charger yet (and I had 45 minutes to spare one night). So I got to see what that experience was like for them on a road trip, in the worst possible scenario: towing.
 
Upvote
36 (36 / 0)

jerkyjones

Seniorius Lurkius
15
Subscriptor++
All the hype about 800v charging is pertinent only when you're paying through the nose for fast DC charging, e.g., when on a long road trip, or if you don't even have 120v charging at home.
(Even our small 20A x 240v charger restores our ~70kWh Niro EV battery overnight.)

If your trips are not so much like those of a transcontinental trombone transporter, 400v delays you a few hours per year, in the worst case. Compare that to the yearly cumulative time to find a decently priced gas station and fill up.
For better or worse, range anxiety, and along with it, charging speed, will probably be a big friction point for whatever adoption rates the US will ever experience. Watching/reading anything I can about EVs and what it's like to live with one day-to-day has settled me down on that point, but I don't know if there's enough folks curious enough to do that to move adoption forward meaningfully.
 
Upvote
15 (19 / -4)

numerobis

Ars Tribunus Angusticlavius
50,232
Subscriptor
I hadn't realized the economics of the hardware were still a big enough difference between 400/800 to matter.

I suppose Ford going 400V for their UEV platform works for potential EVs hitting the market in the next 5 years or so. Seems it would make since for that platform to be more modular and support iterative improvement over time. Hopefully, the drive to software defined vehicles drives similar approaches to hardware.
Higher voltage generally reduces materials cost since the insulators generally cost less than the copper.

The increase in components cost is likely mostly temporary; if/when 800V becomes the norm, the components will drop in price. And given the likely trajectory of copper prices, I expect that will happen.
 
Upvote
74 (74 / 0)
One of the significant issues that wasn't discussed in the article is that as voltage goes up, arc danger goes up as well. It's hard to make an exact analogy, but it's very similar to air pressure: imagine a tank at 100 psi versus one at 200 psi. While the total energy of the battery isn't changed, you can expect more and worse unexpected energy-release events during accidents or other mishaps due to the increased voltage.
 
Upvote
57 (59 / -2)

t_newt

Ars Praefectus
3,235
Subscriptor++
Another problem with 800V vehicles is that designing for 800V is hard. It involves new parts, design styles, and methods that aren't yet well hardened. There are a lot of unexpected effects--unexpected because this is all new territory.

One example I've noticed as an engineer is the FET Safe Operating Area (SOA), a voltage/current curve: if you operate the part under this curve, the FET is 'safe' and won't 'blow'. I've noticed that several 100V FET vendors have updated their SOA curves to include temperature effects, so the curve isn't a straight line but gets progressively steeper as the voltage goes up. The 1000V FET datasheets don't do this. If you design to those curves, your FETs will eventually short out (don't ask me how I know this). There could also be unexpected ways that voltage spikes show up, exceeding design limits in other ways.
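For the curious, here's a toy version of what an SOA check looks like. The curve points and the linear temperature derating are invented for illustration; real datasheets give log-log curves per pulse duration, and the whole point of the comment above is that the derating is often missing at higher voltages.

```python
# Toy SOA lookup with a temperature derating factor. All numbers are
# made up; a real check would come straight from the FET datasheet.

import bisect

# (drain-source voltage V, max continuous current A) at 25 degC
SOA_25C = [(10, 100), (100, 20), (400, 4), (800, 1)]

def soa_limit(vds, case_temp_c=25):
    """Interpolate the 25 degC SOA curve, then derate linearly to zero
    at an assumed 150 degC maximum case temperature."""
    volts = [v for v, _ in SOA_25C]
    amps = [i for _, i in SOA_25C]
    if vds <= volts[0]:
        base = amps[0]
    elif vds >= volts[-1]:
        base = amps[-1]
    else:
        k = bisect.bisect_left(volts, vds)
        # linear interpolation between the bracketing curve points
        v0, v1 = volts[k - 1], volts[k]
        i0, i1 = amps[k - 1], amps[k]
        base = i0 + (i1 - i0) * (vds - v0) / (v1 - v0)
    derate = max(0.0, (150 - case_temp_c) / (150 - 25))
    return base * derate

print(soa_limit(400, 25))   # 4.0 A at room temperature
print(soa_limit(400, 100))  # 1.6 A once the case heats up
```

Design only to the 25 °C line and the hot-case limit is the one that bites you.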

As a result, for example, Hyundai/Kia 800V vehicles have had a terrible reliability problem with their ICCUs--often crapping out when you are in the middle of a long trip, leaving you stranded. Even the replacement ICCUs have reliability problems. I've heard the Cybertruck had problems with its ICCU too.
(There are rumors that Hyundai may have fixed the ICCU problem but aren't advertising it yet because they haven't built out enough new parts--just a rumor but I hope it is true).

One thing that may help is that NVidia is starting to build servers/AI farms using 800V to connect them together (because you can transmit more power with less loss over the same copper with the higher voltage--same as with EVs). How will this help? You'll end up with a lot more engineers and parts suppliers with a lot more experience designing at 800V--because these days when NVidia decides to do something, they pour billions of dollars into it.

https://developer.nvidia.com/blog/n...ll-power-the-next-generation-of-ai-factories/
 
Upvote
142 (145 / -3)

OlfactoriusRex

Smack-Fu Master, in training
50
Subscriptor
Going from a 2013 Nissan Leaf to a 2024 Kia EV6 was night and day. From an around-town-only car that got maybe 50mi range on a good, warm day to a roadtrippable EV with 800 V architecture that makes fast charging a breeze. By the time I plug in, pop in for a bathroom break, and return to the car, I'm usually at 80% no matter where I started charging. Consistently pulling in 160 kW or better means I'm usually on my way while other drivers that were there before me are still charging.

The only issues are 1) the battery pre-conditioning is really finicky, making it hard to use that feature and get the even faster charging the car is capable of, and 2) the specter of an ICCU failure haunts me and all other EV6/Ioniq 5/Ioniq 9/EV9 drivers. I really hope that issue gets a permanent fix soon.
 
Upvote
85 (85 / 0)
I hadn't realized the economics of the hardware were still a big enough difference between 400/800 to matter.
Look at Kia’s range of electric cars. Two - the EV6 and EV9 - are 800V. The rest are all 400V.

Then look at the price for all of those. There’s a reason why Kia has opted for a 400V platform for all except their highest end models.

And honestly, having owned an EV6 for over three years, and taken it on road trips… the difference in charging speeds, combined with the difference in charging costs at the fastest chargers, isn’t that much. It’s nice to plug in and watch 220kW pouring into the battery, but for the cost of those electrons, I find myself accepting a slower charge most of the time. (Acknowledging that this is very much a personal choice, and that I nonetheless have the ability to make that choice on a charge by charge basis.)
 
Upvote
59 (59 / 0)

Ryan B.

Ars Praefectus
4,093
Subscriptor++
More important than peak charging rate is the charging curve. Not that you can't have both, of course. But I personally don't think 800V is a make-or-break feature for an EV, especially if the manufacturer puts effort into maintaining peak charging rate for more of the curve.

My personal experience is with a 2018 Model 3 that I've owned since it was new. Its peak charge rate is 250 kW, which it can maintain for a few minutes. I bet a car with a 150 kW peak rate that held that rate all the way up to 80% could beat mine, tortoise-and-the-hare style.
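A quick back-of-envelope integration bears this out. Both charging curves below are invented for illustration: a 250 kW "hare" that tapers hard after 30% state of charge, versus a flat 150 kW "tortoise".

```python
# Invented charging curves: "hare" peaks at 250 kW but tapers hard after
# 30% SoC; "tortoise" holds a flat 150 kW all the way to 80%.

def charge_time_minutes(pack_kwh, power_at_soc, start=0.10, end=0.80, steps=70):
    """Integrate time = energy / power over small state-of-charge slices."""
    dsoc = (end - start) / steps
    minutes = 0.0
    for n in range(steps):
        soc = start + n * dsoc
        minutes += pack_kwh * dsoc / power_at_soc(soc) * 60
    return minutes

def hare(soc):
    return 250 if soc < 0.30 else max(50, 250 - (soc - 0.30) * 500)

def tortoise(soc):
    return 150

print(round(charge_time_minutes(75, hare)))      # 27 min for 10-80%
print(round(charge_time_minutes(75, tortoise)))  # 21 min for 10-80%
```

With these made-up curves, the lower peak wins the 10-80% session by several minutes, despite a headline number 100 kW lower.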
 
Upvote
37 (38 / -1)

smitty825

Ars Centurion
246
Subscriptor++
Why not some intermediate jump, like 600V?

Cars that run at 600V are generally considered to be built on an 800V architecture. Because the battery voltage drops as the state of charge drops, a car doesn't run at a single voltage. So, in general, cars whose battery pack is in the 300V to 500V range when fully charged are considered 400V architecture, and cars in the 600V to 1000V range are considered 800V architecture.

If you're into geeking out on EVs, each one has a different "pack voltage"
  • Chevy Blazer EV ~ 350V
  • Tesla Model 3 ~400V
  • Tesla Model S ~450V
  • Hyundai Ioniq 9 ~610V
  • Hyundai Ioniq 5 ~700V
  • Lucid Air ~925V

So, when you plug your car into a DC fast charger, it has to negotiate the voltage at which to charge the battery and the amount of current to provide the car. (The voltage increases as the battery gets charged.)
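Since the labels are really just buckets, the sorting is trivial; the thresholds below follow the ranges quoted above, and the pack voltages are the approximate full-charge figures from the list:

```python
# Approximate full-charge pack voltages quoted in the list above.
PACKS = {
    "Chevy Blazer EV": 350,
    "Tesla Model 3": 400,
    "Tesla Model S": 450,
    "Hyundai Ioniq 9": 610,
    "Hyundai Ioniq 5": 700,
    "Lucid Air": 925,
}

def architecture(pack_volts):
    """Bucket a full-charge pack voltage into its marketing label:
    300-500V counts as "400V", 600-1000V counts as "800V"."""
    return "800V" if 600 <= pack_volts <= 1000 else "400V"

for car, volts in PACKS.items():
    print(f"{car}: ~{volts}V pack -> {architecture(volts)} architecture")
```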
 
Last edited:
Upvote
69 (69 / 0)

fenris_uy

Ars Tribunus Angusticlavius
9,086
From an EV owner’s perspective, it’s also simply easier to plug in when the charging cable isn’t trying to double as a portable gym workout. Higher-voltage systems allow stations to use lighter cables, making plugging in much less like wrestling a fire hose.

Are that many charging points 800V-only? Because to be able to use a charging cable with thinner wires, you have to accept that you are going to charge 400V vehicles slower than what they can accept (voltage halves, current can't increase because the cables are thinner, so power halves).
 
Upvote
32 (32 / 0)

raxx7

Ars Legatus Legionis
17,079
Subscriptor++
One thing that may help is that NVidia is starting to build servers/AI farms using 800V to connect them together (because you can transmit more power with less loss over the same copper with the higher voltage--same as with EVs). How will this help? You'll end up with a lot more engineers and parts suppliers with a lot more experience designing at 800V--because these days when NVidia decides to do something, they pour billions of dollars into it.

https://developer.nvidia.com/blog/n...ll-power-the-next-generation-of-ai-factories/

TL; DR People have tried ~300 V DC for data centres and it seems to have gone mostly nowhere.

Longer version:
The power architecture of a car and a data centre have very different requirements.
In a data centre, your loads (the CPUs and GPUs) actually run at ~1 volt.
So a data centre will invariably have multiple steps of down-conversion:
Medium-voltage AC (10-60 kV) → common domestic/industrial AC (120-400 V) → 12 V DC → 1 V DC.
In some cases, there's an intermediate step of 48 V DC, which is basically the highest voltage deemed safe enough not to fall under electrical safety regulations.

People have proposed data centres based on 300-400 V DC intermediate voltages but they seem to have gone nowhere.
I doubt that 800 V DC intermediate voltage will be a different story.

Electric cars are quite different, as there are no step-downs for the main load (the motors). These run on AC with actual 400 or 800 V peaks, which makes it very convenient to have 400 or 800 V batteries, which in turn makes it very convenient to charge those batteries directly at 400 or 800 V.
 
Upvote
36 (39 / -3)

raxx7

Ars Legatus Legionis
17,079
Subscriptor++
Are that many charging points 800V only? Because to be able to put a charging cable with thinner wires, you need to accept that you are going to charge 400V vehicles slower than what they can accept (V halves, current can't increase because you have thinner cables, so power halves)

There are no "800 V only" charging points.
But there are "800 V" (actually 920 V) charging points with thin wires which can only provide ~50% of their rated power to "400 V" vehicles.
E.g. charging points with 200 A cables.
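The arithmetic is just P = V × I with the cable's current limit fixed:

```python
# With a fixed cable current limit, deliverable power scales directly
# with the vehicle's pack voltage.

def max_power_kw(pack_volts, cable_amp_limit=200):
    """DC fast charging power, capped by pack voltage x cable current."""
    return pack_volts * cable_amp_limit / 1000

print(max_power_kw(920))  # 184.0 kW for an "800 V" car at full pack voltage
print(max_power_kw(400))  # 80.0 kW for a "400 V" car on the same 200 A cable
```

Hence the ~50% figure: the 400 V car gets less than half the power out of exactly the same hardware.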
 
Upvote
24 (24 / 0)

evan_s

Ars Tribunus Angusticlavius
7,314
Subscriptor
All the cost of changing it for only half the benefit.

Also, "800v" is just a general term. Batteries don't operate at a single voltage; the voltage goes down as the state of charge goes down. An "800v architecture" car might be 900v at full charge, or it might be 700v. At 0% charge, both cars would be at a lower voltage still. Exactly where they land depends on how the manufacturer set up the BMS, the battery pack layout, and the type of cells. That 700v car might be near 600v at a low state of charge.

The same is true for "400v" cars. A Chevy Equinox is in the low 200v range when the state of charge is low because of the way its pack is designed. That typically means it needs a 350kW charger to hit its 150kW charging speed, because it needs lots of current at its relatively low voltage, and an actual 150kW or 175kW charger typically can't provide that much current.

Right now the most obvious difference is that higher-voltage packs allow higher max charging rates. ~250-300kW is about the best you can manage on a 400v architecture; all the 350kW+ charging is higher voltage, because it just becomes so hard to push that much current.
 
Upvote
34 (34 / 0)
My experience with all these DC charging stations in my Hummer is that the big ones have electrical service for around 400-800 amps continuous. The sign may say 350kW at 800 volts, but that's usually half or all of the entire station's output sitting on your plug, at around 440 amps. Then Bob pulls up in his minivan and plugs in, and you're down to 100-150kW if you're lucky.
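A toy model of that experience: the 440 A site budget and the even split between plugs are assumptions, just to show how the sharing plays out.

```python
# Hypothetical site-level current budget, split evenly across
# plugged-in cars; power per car is then pack voltage x shared amps.

SITE_AMP_BUDGET = 440  # assumed continuous amps for the whole site

def per_car_kw(pack_volts_per_car):
    """Power each car gets when the site current is split evenly."""
    amps_each = SITE_AMP_BUDGET / len(pack_volts_per_car)
    return [round(v * amps_each / 1000) for v in pack_volts_per_car]

print(per_car_kw([800]))       # [352] -- roughly the "350 kW" on the label
print(per_car_kw([800, 400]))  # [176, 88] -- Bob's minivan halves your amps
```

Real stations have smarter (and messier) power-sharing schemes, but the basic shape is the same: the site feed, not the label on the dispenser, sets the ceiling.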
 
Upvote
8 (15 / -7)

fenris_uy

Ars Tribunus Angusticlavius
9,086
One of the significant issues that wasn't discussed in the article is that as voltage goes up, arc danger goes up as well. It's hard to make an accurate analogy but it's very similar to air pressure. Image a tank with 100 psi pressure as opposed to one with 200 psi pressure. While the total energy of the batter isn't changed, you can expect more and worse unexpected energy release events during accidents or other unexpected events due to the increased voltage.

The DC lines should be unpowered until ground and both pilots are connected. I would be surprised if the communication standards of CCS and NACS don't require that.
 
Upvote
15 (15 / 0)

raxx7

Ars Legatus Legionis
17,079
Subscriptor++
Why stop at 800V, and keep going to 1600, or 3200?

Increasing the voltage means thinner wires.
But it also means thicker insulation and more expensive or less efficient transistors.
It's a compromise and for a given level of power there's an optimum compromise.

Furthermore for EV charging you have a chicken and egg problem: you need 1600 V cars to take advantage of 1600 V charging stations and you need 1600 V charging stations to charge the 1600 V cars.

In this case, the CCS industry made a big push to make 920 V charging stations the default at a time when there were no "800 V" cars on the road.
 
Upvote
50 (50 / 0)

raxx7

Ars Legatus Legionis
17,079
Subscriptor++
The DC lines should be unpowered until ground and both pilots are connected. I would be surprised if the communication standard of CCS and NCAS don't define that.

The DC lines are actually unpowered until the EVSE runs several isolation tests to make sure they're not leaking anywhere else.
 
Upvote
34 (34 / 0)