After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?
> Since the North American standard has coalesced around the Tesla connector and several other manufacturers are connecting to the Supercharger network, what effect does that have on the shift to 800V?

The standard formerly known as NACS is up to 1000 V.
> After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?

I expect it had to do with supply chain considerations. A bit of googling suggests that datacenters were adopting 380V DC in the early 2010s, which suggests a mature ecosystem of ~400V DC components.
> After reading the article, I was left with the question: if there are so many advantages to 800V, why were EVs all initially designed around 400V architectures?

Cost: power electronics that can handle >500V were rare and hence expensive. That's actually why there are new 400V architectures still coming out today: while the auto industry is fairly large, it's not THAT large in the scheme of the total global economy.
> Why not some intermediate jump, like 600V?

All the cost of changing it for only half the benefit.
> I want infinityV. I want to just pass by the charger at like 10 mph and have a full battery by the time the car fully passes.

Get zapped by a Tesla Coil and you're on your way, which is maybe not far off from his hopes for Wardenclyffe Tower.
> All the hype about 800v charging is pertinent only when you're paying through the nose for fast DC charging, e.g., when on a long road trip, or if you don't even have 120v charging at home.
>
> (Even our small 20A x 240v charger restores our ~70kWh Niro EV battery overnight.)
>
> If your trips are not so much like those of a transcontinental trombone transporter, 400v delays you a few hours per year, in the worst case. Compare that to the yearly cumulative time to find a decently priced gas station and fill up.

For better or worse, range anxiety, and along with it, charging speed, will probably be a big friction point for whatever adoption rates the US will ever experience. Watching/reading anything I can about EVs and what it's like to live with one day-to-day has settled me down on that point, but I don't know if there are enough folks curious enough to do that to move adoption forward meaningfully.
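The "a few hours per year" claim above is easy to sanity-check with a back-of-envelope calculation. The session count and per-stop penalty below are purely illustrative assumptions, not measurements:

```python
# Illustrative back-of-envelope: extra fast-charging time per year on a 400V car
# versus an 800V one. All numbers below are assumptions.

road_trip_fast_charges_per_year = 12   # assumed number of DC fast-charge stops
extra_minutes_per_session = 15         # assumed extra wait per stop at 400V

extra_hours_per_year = road_trip_fast_charges_per_year * extra_minutes_per_session / 60
print(extra_hours_per_year)  # 3.0 -> "a few hours per year"
```

Even doubling both assumptions only gets you to half a day per year, which is the commenter's point.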
> I hadn't realized the economics of the hardware were still a big enough difference between 400/800 to matter.

Higher voltage generally reduces materials cost, since the insulators generally cost less than the copper.
I suppose Ford going 400V for their UEV platform works for potential EVs hitting the market in the next 5 years or so. It seems it would make sense for that platform to be more modular and support iterative improvement over time. Hopefully, the drive toward software-defined vehicles encourages similar approaches to hardware.
> I hadn't realized the economics of the hardware were still a big enough difference between 400/800 to matter.

Look at Kia’s range of electric cars. Two - the EV6 and EV9 - are 800V. The rest are all 400V.
Why not some intermediate jump, like 600V?
From an EV owner’s perspective, it’s also simply easier to plug in when the charging cable isn’t trying to double as a portable gym workout. Higher-voltage systems allow stations to use lighter cables, making plugging in much less like wrestling a fire hose.
One thing that may help is that NVidia is starting to build servers/AI farms using 800V to connect them together (because you can transmit more power with less loss over the same copper with the higher voltage--same as with EVs). How will this help? You'll end up with a lot more engineers and parts suppliers with a lot more experience designing at 800V--because these days when NVidia decides to do something, they pour billions of dollars into it.
https://developer.nvidia.com/blog/n...ll-power-the-next-generation-of-ai-factories/
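The copper argument in the comment above is just Ohm's law: for a fixed delivered power, current scales inversely with voltage, and resistive loss scales with the square of current, so doubling the voltage quarters the loss in the same cable. A quick sketch (cable resistance and power figures are illustrative, not from the article):

```python
# Illustrative only: resistive loss in a fixed cable at the same delivered power
# but different voltages. P_loss = I^2 * R, with I = P / V.

def cable_loss_watts(power_w, voltage_v, resistance_ohm):
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

R = 0.01        # assumed cable resistance in ohms
P = 100_000     # 100 kW delivered

loss_400 = cable_loss_watts(P, 400, R)   # 250 A -> 625 W lost in the cable
loss_800 = cable_loss_watts(P, 800, R)   # 125 A -> 156.25 W lost

print(loss_400, loss_800, loss_400 / loss_800)  # ratio is exactly 4x
```

The same arithmetic applies whether the copper is in a datacenter busbar or an EV charging cable, which is why both industries are pushing voltage up.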
> Why stop at 800V, and keep going to 1600, or 3200?

The voltage is above 9,000!!!!!!
Are that many charging points 800V only? Because to be able to use a charging cable with thinner wires, you have to accept that 400V vehicles will charge slower than what they can accept (V halves, the current can't increase because the cables are thinner, so power halves).
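The power-halving argument above follows directly from P = V × I when the cable's thermal limit caps the current. A minimal sketch, with an assumed (illustrative) current limit for the thinner cable:

```python
# Illustrative: why a thin cable sized for 800V slows down a 400V car.
# The cable's thermal limit caps current; delivered power = voltage * current.

cable_current_limit_a = 250   # assumed thermal limit of the thinner cable, in amps

p_800v_car_w = 800 * cable_current_limit_a   # 200,000 W = 200 kW
p_400v_car_w = 400 * cable_current_limit_a   # 100,000 W = 100 kW

# Same current limit, half the pack voltage -> half the charging power.
print(p_800v_car_w, p_400v_car_w)
```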
> The voltage is above 9,000!!!!!!

Shouldn't you be thinking of more like 11,000V? (Spinal Tap II had us laughing the whole way through.)
One of the significant issues that wasn't discussed in the article is that as voltage goes up, arc danger goes up as well. It's hard to make an accurate analogy, but it's very similar to air pressure: imagine a tank at 100 psi as opposed to one at 200 psi. While the total energy of the battery isn't changed, you can expect more and worse unexpected energy-release events during accidents or other mishaps due to the increased voltage.
Why stop at 800V, and keep going to 1600, or 3200?
The DC lines should be unpowered until ground and both pilots are connected. I would be surprised if the communication standards of CCS and NACS don't define that.
> Worth stressing that what really matters is power.

Please don't make me agree with Jeremy Clarkson.