Google bumps up Q Day estimate to 2029, far sooner than previously thought

Varste

Ars Praetorian
533
Subscriptor
Awesome, can't wait for none of the services I use to update their encryption methods (especially looking at you, local credit union). I also wonder just how bad state-on-state cyberattacks will get, or if quantum computers will be useful in that area too. I will admit I don't understand them at any significant level, so that's a complete guess.
Also will this undermine cryptocurrencies at all? Can they change the way they are encrypted, or will it require forking them?
 
Upvote
78 (79 / -1)

foobacca

Smack-Fu Master, in training
87
Subscriptor++
Does this mean Google expects someone to use a quantum computer to crack keys in 2029? Or that 2029 is their deadline to finish the changeover, with a crack not expected until later? I've read the article and the linked post, and it could be read either way; it sounds like the latter to me, but I'm really not sure.

The Google blog post says (emphasis mine):

Google’s introducing a 2029 timeline to secure the quantum era with post-quantum cryptography (PQC) migration.

It doesn't mention an estimate for Q day.

Edit to add quote.
 
Upvote
75 (75 / 0)

equals42

Ars Scholae Palatinae
1,215
Subscriptor++
We started offering PQC a year or so back. I rarely get more than slight interest from customers even though it’s fairly trivial to implement. There’s no cost to them for it either. This is something that needs more airtime to get IT laymen and security folks more interested. Announcements like this help bring the topic into the conversation. I’d think that having your data-at-rest encrypted using PQC would make your company a less enticing target for nation-state attackers willing to invest in long-term plans.
 
Upvote
107 (107 / 0)
So, someone may be able to claim Satoshi's BTC stash by 2030? That's gonna be a day.
Apparently it's not just the Satoshi stash that's vulnerable, but up to 25% of all BTC ever minted: everything associated with a private key that predates the current practice of generating a new key for each transaction.

Does this mean Google expects someone to use a quantum computer to crack keys in 2029? Or that 2029 is their deadline to finish the changeover, with a crack not expected until later? I've read the article and the linked post, and it could be read either way; it sounds like the latter to me, but I'm really not sure.
It's their internal goal. They don't say when they expect quantum computers to become a risk to traditional encryption.
 
Upvote
69 (69 / 0)
Any time I read a story like this now, I wonder "is this prediction real, or is it an attempt to manipulate either the stock price or the prediction markets?"
Did you read the source blog post from Google? Does that read like a market manipulation strategy to you?
 
Upvote
-17 (8 / -25)

Sarty

Ars Tribunus Angusticlavius
7,814
Any time I read a story like this now, I wonder "is this prediction real, or is it an attempt to manipulate either the stock price or the prediction markets?"
I take no position on the question as you put it--I don't feel remotely qualified--but it's remarkable to think back to how much trust we used to put in Google as an organization, circa 2005 or whatever. Back then, I would have found your question insane. They're the good guys!
 
Upvote
83 (87 / -4)
Apparently it's not just the Satoshi stash that's vulnerable, but up to 25% of all BTC ever minted: everything associated with a private key that predates the current practice of generating a new key for each transaction.
I'd assume active accounts will migrate their holdings to safe wallets, but duly noted it's not only the 1 million of Satoshi coins that may be affected as there will be people who forget to migrate, or died without a plan to transfer those assets, or any of the many ways BTC can end up "lost", including those from the guy who wanted to dig up a trash dump in search of a hard drive and even had venture capital secured to fund the search.
 
Upvote
28 (28 / 0)

motytrah

Ars Tribunus Militum
2,942
Subscriptor++
Apparently it's not just the Satoshi stash that's vulnerable, but up to 25% of all BTC ever minted: everything associated with a private key that predates the current practice of generating a new key for each transaction.
My money is the big names in crypto collude on the dormant coin issue in a way that enriches themselves. They are kind of damned if they do, damned if they don't.
 
Upvote
23 (24 / -1)

Dachannien

Ars Scholae Palatinae
1,132
Subscriptor
Google has a quantum computing division. Implying they're close to some kind of breakthrough could absolutely juice their stock.
Maybe, but they actually explain the point in worrying now: Store-now-decrypt-later attacks can only really be mitigated by migrating systems to PQC. The sooner you do that, the smaller your data vulnerability surface is (in a timewise sense). If you get compromised in the future and your encrypted data gets exfiltrated, you're much better off if that data was protected with PQC. Your future vulnerability without PQC is by definition shorter if you implement now rather than later.

Based on that logic, the reason to pick, say, 2029 as a good must-implement date is because of the naturally decaying value of store-now-decrypt-later data. Even if QC isn't successful until 2039, deploying by 2029 means any vulnerable data would be 10 years old (and 10 years less valuable) by the time it gets cracked. The fact that they didn't pick a date even sooner just speaks to the monumental bulk of the task at hand.
 
Upvote
117 (117 / 0)
Google has a quantum computing division. Implying they're close to some kind of breakthrough could absolutely juice their stock.
Yet again - read the bloody source. There doesn't seem any implication there about some kind of breakthrough - this is media spin about a perfectly rational precaution to a potential threat.
 
Upvote
11 (21 / -10)
Awesome, can't wait for none of the services I use to update their encryption methods (especially looking at you, local credit union).
Depends on how their regulatory agency prioritizes it and what their audits report. And what their board of directors choose to risk accept.

It’s funny because credit unions have more money than banks (two separate entities, CUs pay less taxes and more and more offer the same services as banks with less “membership” requirements) but all that goes to salaries not infrastructure improvements.

Also, you’d be disheartened how wonky the financial cores and providers are, technologically, especially the Federal Reserve that we all have to interact with. Hell, the CU might just be using some other provider’s solution and has no direct control over what that provider offers.

Source: financial IT for like 2 decades and counting…
 
Upvote
18 (18 / 0)
We started offering PQC a year or so back. I rarely get more than slight interest from customers even though it’s fairly trivial to implement. There’s no cost to them for it either. This is something that needs more airtime to get IT laymen and security folks more interested. Announcements like this help bring the topic into the conversation. I’d think that having your data-at-rest encrypted using PQC would make your company a less enticing target for nation-state attackers willing to invest in long-term plans.
Is it a checkbox I can enable on windows server? On the SFTP servers we run that have their own cipher suites? Is this something godaddy lets me check off for my wildcard certificate? Does Veeam offer this for their encrypted backups of VMs? Does my firewall’s VPN offer this as a protocol that can connect to my azure environment?

Ok some random vendor is offering some type of encryption. Now I gotta do third party due diligence on… who exactly? And trust this extra vendor to keep up with any needed patches if issues are found on the encryption they offer?

Internal scans already have me disabling cipher suites and whatever else every so often. Nothing will change until internal scan tools peg these as vulnerabilities and my existing vendors have a solution in place.
 
Upvote
21 (21 / 0)
As someone who continues to use a lot of older—but still technically supported—machines, I'm curious if support for these new encryption methods will require some kind of hardware acceleration block to use? Like, will performance be so terrible or not even possible on a computer from ≥10yrs ago?
 
Upvote
5 (5 / 0)

Varste

Ars Praetorian
533
Subscriptor
Depends on how their regulatory agency prioritizes it and what their audits report. And what their board of directors choose to risk accept.

It’s funny because credit unions have more money than banks (two separate entities, CUs pay less taxes and more and more offer the same services as banks with less “membership” requirements) but all that goes to salaries not infrastructure improvements.

Also, you’d be disheartened how wonky the financial cores and providers are, technologically, especially the Federal Reserve that we all have to interact with. Hell, the CU might just be using some other provider’s solution and has no direct control over what that provider offers.

Source: financial IT for like 2 decades and counting…
Oh I've heard the horrors on what underlies our financial institutions, probably from commenters on this very site! Archaic COBOLian texts, decipherable by only the grayest beards, or something like that. Is there an easy way for us laymen to even get an understanding of what service our banks might be using? Or any way to know they are keeping our info stored properly? AFAIK it's all just built on "trust us".
 
Upvote
7 (7 / 0)
As someone who continues to use a lot of older—but still technically supported—machines, I'm curious if support for these new encryption methods will require some kind of hardware acceleration block to use? Like, will performance be so terrible or not even possible on a computer from ≥10yrs ago?
There won't be much of a performance hit for PCs - the keys and signatures are all much bigger, and there is a processing overhead, but not anything that would cause even a 15 year old PC to struggle. Also, it's only the SSL handshake that's affected - certificate verification and key exchange. Data streaming won't be affected because that uses AES, which is still good against quantum (though to be on the safe side use 256 bit keys).

The real problem is with resource-restricted systems, such as smartcards, which do not (cheaply) support the RAM and I/O required to process the massive signatures efficiently. Coupled with this is the fact that these systems are often deployed in "hostile" environments, with attackers capable of using all kinds of side-channel analysis to crack keys. Due to the immaturity of the new quantum safe algorithms, it is not yet apparent how well they will stand up to side channel analysis, or how they can be hardened against such threats.

Even before Google's new estimate, it was thought that migration to quantum safe should be underway by 2030 - not least because the lifetime of some of the systems released in 2030 might go well beyond 2035, a reasonably conservative timescale for quantum computing to have a significant impact.
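The "bigger keys and signatures" point can be made concrete with the published parameter-set sizes. A quick sketch, using the raw sizes from NIST FIPS 203 (ML-KEM) and FIPS 204 (ML-DSA) next to today's elliptic-curve primitives; these are parameter-set sizes, not wire-format overheads:

```python
# Size comparison (bytes) between current elliptic-curve primitives and
# the NIST post-quantum replacements. Sizes from FIPS 203 / FIPS 204.
sizes = {
    "X25519 public key":     32,
    "ML-KEM-768 encaps key": 1184,
    "ML-KEM-768 ciphertext": 1088,
    "Ed25519 signature":     64,
    "ML-DSA-65 signature":   3309,
    "ML-DSA-87 signature":   4627,
}

for name, n in sizes.items():
    print(f"{name:24s} {n:5d} B")
```

The payloads grow by one to two orders of magnitude, which a desktop CPU shrugs off but a smartcard's RAM and I/O budget may not.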
 
Upvote
18 (18 / 0)
Does this mean Google expects someone to use a quantum computer to crack keys in 2029? Or that 2029 is their deadline to finish the changeover, with a crack not expected until later? I've read the article and the linked post, and it could be read either way; it sounds like the latter to me, but I'm really not sure.

The Google blog post says (emphasis mine):

Google’s introducing a 2029 timeline to secure the quantum era with post-quantum cryptography (PQC) migration.

It doesn't mention an estimate for Q day.

Edit to add quote.
The US Federal government set January 2, 2030 as the date for PQ readiness for all Federal information systems and suppliers of said systems. This date was set by executive order by the previous administration, and was repeated unaltered by the current administration. While the exact determination of why that date was chosen is unknown, there are a lot of factors that may have contributed to it.

  • The complexity of updating systems to PQ algorithms is expensive and time consuming. It can't be done overnight. They've selected a date that's aggressive but achievable.
  • The US date was almost certainly provided by the NSA, an intelligence organization built to not only obtain such information, but also to build exactly this kind of equipment.
  • Other Western nations (Great Britain and others) have set PQ readiness dates between 2030 and 2035.
  • As a global leader in providing backbone computing services, Google may have received private advice from the NSA. It's possible AWS and Azure have also received similar private intel.
  • China's publicly announced spending on quantum computers in 2025 was more than double the publicly announced spending of all other countries combined. They also have a vast number of people with the skills and talents needed to develop quantum computers, and seemingly limitless resources. We (the public) have no way of knowing what China's intelligence agencies have spent in secret, but we can expect the NSA has been watching as carefully as possible.
  • We have no way of knowing when Q-day will actually occur until an academic or commercial researcher publishes an example. One analyst said (in 2024) that there is a 1 in 6 chance it already has happened in secret.
  • Microsoft's Majorana One chip (announced in February 2025) has been undergoing testing to figure out whether what they've built created any Majorana particles (which have so far been purely theoretical). Their 2018 efforts were libeled by a disgruntled researcher and they wasted years struggling with the bad publicity, but their chip is very promising. If so, they may be able to leapfrog from 18 physical qubits on-die to possibly a thousand qubits overnight. (Unfortunately their silence on the topic for the past year has been deafening.)
  • Researchers continue optimizing Shor's algorithm, and various teams have been making non-linear advances in factoring optimization. Estimates for factoring a 1024-bit RSA key have gone from 20 million qubits (a decade ago) to 1 million qubits, to a paper published last year claiming to be able to do it using only 378 qubits plus 10,000 quantum gates.
Given all the unknowns (and the "known unknowns"), making the change by 2029 is not unrealistic.
 
Upvote
26 (27 / -1)

robododo

Smack-Fu Master, in training
87
Subscriptor
1. The 2029 date is really just 1 year early: CNSA 2.0 says PQC is needed for software/firmware signing by 2030. A big chunk of Google's work here is moving their code signing. So the schedule here is not really aggressive, if one cares to continue working with govt entities (or their subcontractors).

2. The new algorithms are chonky. Are you a CA that issues a few thousand certs a day? NBD. Go implement ML-DSA right now. Do you sign 10,000,000 certs a day? Uh, your data will increase by "a bit". ML-DSA signatures are 4627 bytes. P-256 is 64... Merkle trees can help, but they are not a universal fix.
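The issuance-volume point is easy to quantify. A back-of-the-envelope sketch using the signature sizes from the post above (10 million certs/day is the hypothetical from the post, not a real CA's figure):

```python
# Daily signature volume for a hypothetical CA issuing 10M certs/day,
# comparing ECDSA P-256 (64 B) with ML-DSA-87 (4627 B) signatures.
CERTS_PER_DAY = 10_000_000
P256_SIG = 64          # bytes per ECDSA P-256 signature
ML_DSA_87_SIG = 4627   # bytes per ML-DSA-87 signature

p256_daily = CERTS_PER_DAY * P256_SIG
pq_daily = CERTS_PER_DAY * ML_DSA_87_SIG

print(f"P-256:     {p256_daily / 1e9:.2f} GB/day of signatures")
print(f"ML-DSA-87: {pq_daily / 1e9:.2f} GB/day of signatures "
      f"({pq_daily // p256_daily}x)")
```

Roughly 0.6 GB/day becomes roughly 46 GB/day of signature bytes alone, before counting the larger public keys embedded in each cert.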
 
Upvote
16 (16 / 0)
As someone who continues to use a lot of older—but still technically supported—machines, I'm curious if support for these new encryption methods will require some kind of hardware acceleration block to use? Like, will performance be so terrible or not even possible on a computer from ≥10yrs ago?
Many machines, especially memory or processor constrained IoT devices such as cameras, simply cannot be upgraded. The PQ keys are often too big to fit in the flash memory available.

Our internal recommendation has been to prioritize upgrades based on risk. Public facing systems and internal high-risk systems (key servers, PKI signers, etc.) need to be the first to upgrade. Warehouse cameras and other low risk internal-facing-only systems would be on the "low-to-never" upgrade path, and will eventually be cycled out by vendor contracts when purchasing replacements ("all new devices must include PQ support.")

The real trick is finding all of the darn things. Shadow IT is a real problem when it comes to security.
 
Upvote
13 (13 / 0)

nadimz

Smack-Fu Master, in training
15
Subscriptor
We started offering PQC a year or so back. I rarely get more than slight interest from customers even though it’s fairly trivial to implement. There’s no cost to them for it either. This is something that needs more airtime to get IT layman and security folks more interested. Announcements like this help bring the topic into the conversation. I’d think that having your data-at-rest encrypted using PQC would make your company a less enticing target for nation state attackers willing to invest in longterm plans.
Just as a side note because I think it's important.

Current industry standard for block encryption (AES-256) is not vulnerable to quantum attacks. So as long as AES-256 is used and the encryption key is securely generated and never leaves the machine when storing data, then you're good.

Signature algorithms and key encapsulation algorithms (RSA, DSA, ECDSA) on the other hand are vulnerable to quantum attacks. These algorithms are used for authentication and key exchange and are what things like HTTPS are built on. That's where "harvest now, decrypt later" comes in: an adversary can store tons of HTTPS traffic today, then, once a quantum computer is available, break the key exchange (recovering the encapsulated session keys that protect the data sent back and forth) and access the data.
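The symmetric/asymmetric split can be sketched with the conventional rule-of-thumb estimates: Grover's algorithm gives a quadratic speedup against symmetric ciphers (halving their effective security bits), while Shor's algorithm breaks RSA/DH/ECC outright. The numbers below are those conventional estimates, not precise resource counts:

```python
# Rule-of-thumb effective security against a large quantum computer.
# Grover: sqrt speedup -> symmetric security bits are halved.
# Shor: polynomial-time factoring/discrete log -> asymmetric security ~0.
def quantum_security_bits(kind: str, classical_bits: int) -> int:
    if kind == "symmetric":       # AES, ChaCha20, ...
        return classical_bits // 2
    if kind == "asymmetric":      # RSA, DH, ECDSA, ...
        return 0
    raise ValueError(f"unknown kind: {kind}")

print("AES-256 :", quantum_security_bits("symmetric", 256), "bits")  # still strong
print("AES-128 :", quantum_security_bits("symmetric", 128), "bits")  # marginal
print("RSA-2048:", quantum_security_bits("asymmetric", 112), "bits") # broken
```

Which is why the advice above is "bump symmetric keys to 256 bits, replace the asymmetric algorithms entirely."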
 
Upvote
17 (17 / 0)
It’s funny because credit unions have more money than banks (two separate entities, CUs pay less taxes and more and more offer the same services as banks with less “membership” requirements) but all that goes to salaries not infrastructure improvements.
In my experience it mostly goes to better rates. The CUs I've been a member of had lower fees and better rates than banks I used, and about the same level of IT infrastructure. What they don't have are the massive national ad campaigns.
 
Upvote
4 (4 / 0)
Right. Suggesting that the entire Internet could blow up in 2029 is clearly good for Google stock.
Investors don't care about old lines of business anymore, only the newest buzzwords. If they can somehow work "AI" into this alongside "quantum" they'll really be cooking.
 
Upvote
0 (0 / 0)

alxx

Ars Praefectus
4,980
Subscriptor++
Apple added ML-KEM-768 in macOS 26; the only fun thing is that it spits out warnings about most servers you connect to not being PQC safe, especially for SSH.

https://support.apple.com/en-bw/122756
https://developer.apple.com/documentation/cryptokit/mlkem768

The other fun thing is that a lot of Linux distros can't/don't support PQC by default yet. You need to use Ubuntu 26.04 (when it's out) to get PQC support.

Yes, of course you can go and compile and modify the latest OpenSSL versions, but why would I want to do that when it doesn't have enterprise support?
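If you just want to see whether your distro's OpenSSH already offers a post-quantum key exchange, `ssh -Q kex` lists what the local build supports (assuming a reasonably recent OpenSSH; the hybrid `sntrup761x25519` kex shipped in 9.0, `mlkem768x25519-sha512` in 9.9):

```shell
# List the key-exchange algorithms this OpenSSH build supports and
# filter for the hybrid post-quantum ones.
ssh -Q kex | grep -E 'sntrup761|mlkem768' || echo "no PQC key exchange in this build"
```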
 
Upvote
3 (3 / 0)

GFKBill

Ars Tribunus Militum
2,860
Subscriptor
Dumb media spin aside, if we take the 1 million figure as given, current state of the art is at 5k, so "are we there yet".
It took about a decade for CPUs to go from 5k to 1M transistors. Quantum seems a lot harder, but roughly a decade isn't long, and as others have said, that 1M number is potentially not done coming down, either.
 
Upvote
0 (0 / 0)
"elliptic curves and RSA, both of which will be broken"

What does that actually mean exactly? If I can crack an n-bit RSA key today at a cost of C(n), how large an RSA key (likely much more than n bits) could I crack using a quantum computer at the same cost C(n)?
Quantum scales much better than exponentially, but not quite as well as a purely linear function of qubit or quantum-gate count. So if it will take 10k qubits to crack a 1024-bit key in some amount of time, it may take 40k-80k qubits to crack a 2048-bit key in a similar amount of time. Or, instead of a day, it may take over a year* to crack a 2048-bit key using the same 10k-qubit computer.

I can't really compare the costs of a quantum break to a break using classical computers. I do know of one recent example where a 512 bit RSA key was cracked in about a day using roughly $100 worth of AWS resources. But AFAIK nobody has publicly announced breaking a 1024 bit RSA key using a classical computer. And all I know about quantum computing costs is that IBM is renting out time today on a 100 qubit computer for $96/minute (see their Qiskit service.) And 100 qubits is not nearly enough to crack a 1024 bit RSA key using today's algorithms.

IMPORTANT DISCLAIMER: these are very rough examples to show the predicted relationships of keysizes to quantum computer sizes. I was given these examples in a briefing from a cryptography vendor. They are not necessarily representative of the actual numbers of qubits, quantum gates, or durations that will actually be required to crack a specific key.
 
Upvote
1 (1 / 0)

Barleyman

Ars Tribunus Militum
2,221
Subscriptor++
It took about a decade for CPUs to go from 5k to 1M transistors. Quantum seems a lot harder, but roughly a decade isn't long, and as others have said, that 1M number is potentially not done coming down, either.
Oh, it'll happen all right; it's just the media spin of 2029 that seems daft. From what I found out, the current processors tap out at 100-odd qubits, so as things stand they have to network many of those together to get to thousands of usable qubits, and the scope for parallel connections seems fairly limited at the moment.

Perhaps diverting 10% of AI money thrown around to quantum processors instead..? Nah, crazy talk.
 
Upvote
1 (1 / 0)