alt.hn

3/18/2026 at 10:10:01 AM

2025 Turing award given for quantum information science

https://awards.acm.org/about/2025-turing

by srvmshr

3/18/2026 at 9:25:49 PM

I don't want to take anything away from Bennett and Brassard, but I'd like someone to spare a word for poor Stephen Wiesner, who invented the earliest quantum information-distribution protocols as far back as the 1960s and published them before Bennett and Brassard. He also invented Oblivious Transfer (OT), which is required for multi-party computation -- although his was a quantum protocol that demonstrated some of the ideas behind QKD, not the classical protocol we call OT today [1].* Wiesner was an inspiration for Bennett and Brassard, who then realized more useful systems.

While obviously this takes nothing away from BB's many later contributions (and they have extensively credited him), it's just a reminder of the randomness that goes with scientific credit. Since my PhD thesis was on OT, I like to remind people of Wiesner. He deserves a lot more credit than he gets!

* I suppose if you're a real theoretician, since OT implies MPC and MPC implies all cryptography, then perhaps Wiesner's OT implies everything that BB did subsequently. I'm not sure any of that is true (and I've since checked with an LLM: there are some no-go theorems from the 1990s that block it, which is super interesting).

[1] https://dl.acm.org/doi/10.1145/1008908.1008920

by matthewdgreen

3/20/2026 at 12:17:07 AM

Actually, you can't compose quantum crypto protocols the way you can classical ones - the composed protocol needs a new security analysis. Entanglement across protocols often kills the composition!

Interestingly (to me!), it took a while in the 90's/early 00's for the community to realise that there are two distinct questions:

Question A: Does there exist a set of target states and measurements that implement the task?

Question B: Can mistrustful parties find a communication protocol that securely (from their perspective) creates/implements those states/measurements?

An example where the answer to A is “no” is fully secure oblivious transfer. There were a bunch of misguided papers trying to find communication protocols for OT, but they were doomed from the start!

An example where the answer to A is "yes" but to B is "no" is strong coin flipping. And an example where the answer to both is "yes" is weak coin flipping. (See Carlos Mochon's magnum opus, arXiv:0711.4114, for the coin-flipping examples.)

I first articulated the distinction between A and B in quant-ph/0202143, but left the proof about OT and Question A as an exercise to the reader! Roger Colbeck in arXiv:0708.2843 provided a simple proof and elucidated the whole situation a lot, I think.

by Q_is_4_Quantum

3/19/2026 at 12:47:01 AM

Don't forget about William Wootters, who also did significant work in the 1980s on quantum information. Most notably, with Zurek he proved the quantum no-cloning theorem in 1982. This result is at the same foundational level as energy conservation or the constancy of the speed of light.

He was also part of the quantum teleportation discovery in 1993.

by abdullahkhalids

3/19/2026 at 7:20:22 PM

Asher Peres told me that Bill Wootters should be given 99% of the credit for the teleportation discovery - and this in a context where most of us around at the time presumed the majority of the credit should go to Peres and Wootters, who had already been discussing very similar ideas publicly.

by Q_is_4_Quantum

3/20/2026 at 2:13:42 PM

Interesting to hear, but also not surprising. His thinking was so far ahead of its time.

by abdullahkhalids

3/19/2026 at 4:44:37 AM

> and I've since checked with an LLM

but did you recheck it yourself, or are you trusting an unreliable narrator?

by NooneAtAll3

3/19/2026 at 1:46:50 PM

I only checked the abstracts, and they seem consistent. Good LLMs (Claude Opus, ChatGPT Pro) still get things wrong regularly, but lately I've noticed these are mainly the deep details, not easy things like "there is a result that claims X."

by matthewdgreen

3/19/2026 at 4:40:30 AM

Wiesner was quite a character but he died a few years ago, so wouldn't have been eligible for this year's award.

by throwaway81523

3/18/2026 at 7:20:03 PM

As a young grad student, I remember going to a talk by Bennett where he explained how a quantum computer allows manipulation in a 2^N-dimensional Hilbert space, while the output measurements give you only N bits of information. The trick is to somehow encode the result in those final N bits.

I felt this was a much better layman explanation of what a quantum computer does than simply saying a quantum computer runs all possible paths in parallel.
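That asymmetry is easy to see in a toy statevector simulation (my own sketch, not from the talk): the simulator has to track 2^N complex amplitudes, but measuring the register yields only N classical bits.

```python
import random

def uniform_state(n_qubits):
    """Statevector of the uniform superposition: 2**n_qubits amplitudes."""
    dim = 2 ** n_qubits
    return [1 / dim ** 0.5] * dim

def measure(state, n_qubits):
    """Sampling collapses the 2**n_qubits-dimensional state to one n_qubits-bit string."""
    probs = [abs(amp) ** 2 for amp in state]
    index = random.choices(range(len(state)), weights=probs)[0]
    return format(index, f"0{n_qubits}b")

n = 10
state = uniform_state(n)     # 1024 amplitudes to manipulate...
outcome = measure(state, n)  # ...but only 10 bits come back out
print(len(state), len(outcome))  # 1024 10
```

The art of quantum algorithm design is arranging interference so that those N output bits encode the answer with high probability.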

by spot5010

3/18/2026 at 7:34:11 PM

> I felt this was a much better layman explanation of what a quantum computer does than simply saying a quantum computer runs all possible paths in parallel.

Relevant concerning your point:

> "The Talk"

> https://www.smbc-comics.com/comic/the-talk-3

by aleph_minus_one

3/19/2026 at 1:20:58 AM

That comic is great; I understand qubits a bit better now. A qubit has 4 degrees of freedom but can be mapped onto the 2D surface of a sphere, because normalization (the "circle rule") and global phase symmetry each take away one of the four DOF

I need a longer think on the interference/computation connection though
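That DOF counting can be made concrete (a sketch assuming the standard Bloch-sphere convention): start with two arbitrary complex amplitudes (4 real numbers), spend one DOF on normalization and one on fixing the global phase, and the two that remain are the sphere's polar and azimuthal angles.

```python
import cmath
import math

def bloch_coordinates(a, b):
    """Map a qubit state a|0> + b|1> (4 real DOF) to a point on the Bloch sphere (2 DOF)."""
    # DOF 1 removed: normalization (the "circle rule").
    norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
    a, b = a / norm, b / norm
    # DOF 2 removed: global phase -- rotate so that a is real and non-negative.
    if a != 0:
        phase = cmath.exp(-1j * cmath.phase(a))
        a, b = a * phase, b * phase
    # The two surviving DOF are the polar and azimuthal angles.
    theta = 2 * math.acos(min(1.0, abs(a)))
    phi = cmath.phase(b) if b != 0 else 0.0
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    return x, y, z

# Any state, however it is scaled or phased, lands on the unit sphere:
x, y, z = bloch_coordinates(3 + 4j, 1 - 2j)
print(round(x * x + y * y + z * z, 6))  # 1.0
```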

by hammock

3/18/2026 at 8:53:27 PM

Thanks for this! I guess I need to read up on Hilbert spaces.

...and Shor's Algorithm

by dogtimeimmortal

3/18/2026 at 10:12:43 PM

Don't let the terminology intimidate you. The interesting ideas in quantum computing depend far more on a foundation in linear algebra than on one in mathematical analysis.

When I started out, I was under the assumption that I had to understand at least the undergraduate real analysis curriculum before I could grasp quantum algorithms. In reality, for the main QC algorithms you see discussed, you don't need to understand completeness; you can just treat a Hilbert space as a finite-dimensional vector space with a complex inner product.
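As a tiny illustration of that finite-dimensional view (my own example): the only structure you need beyond a vector space is a complex inner product, with the first argument conjugated.

```python
def inner(psi, phi):
    """Complex inner product <psi|phi>; note the conjugate on the first argument."""
    return sum(a.conjugate() * b for a, b in zip(psi, phi))

# The computational basis of a one-qubit (2-dimensional) Hilbert space:
ket0 = [1 + 0j, 0 + 0j]
ket1 = [0 + 0j, 1 + 0j]

print(inner(ket0, ket0))  # (1+0j) -- normalized
print(inner(ket0, ket1))  # 0j     -- orthogonal
```

No completeness, no limits: for finite dimensions, the inner product is all the "Hilbert" you need.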

For those unfamiliar with these linear algebra concepts, there is a playlist [1], often recommended here, which discusses them thoroughly.

[1] https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...

by amemi

3/19/2026 at 1:26:08 AM

Yeah, all the names and terminology really do make it seem harder than it is. Took me a long time and I'm still learning. A 2D Hilbert space is like 2D Euclidean space, except each dimension has two degrees of freedom (real + imaginary). You might even think of it as a 4D space for vector-visualizing purposes, but that would probably be wrong and someone would call you out

by hammock

3/19/2026 at 8:53:02 AM

> ...and Shor's Algorithm

Better to start with Simon's algorithm (solving Simon's problem) [0]; it already contains a lot of the ideas you need to understand Shor's algorithm, without many of the technicalities. Then progress to Shor's algorithm, and then to Kitaev's algorithm [1] (link from [2]). The latter solves the Abelian stabilizer problem, which captures the more abstract mathematical essence of many quantum algorithms.

[0] https://en.wikipedia.org/wiki/Simon%27s_problem

[1] https://arxiv.org/abs/quant-ph/9511026

[2] https://en.wikipedia.org/wiki/Hidden_subgroup_problem#Instan...
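The structure Simon's algorithm exploits can be sketched classically in a few lines (a toy construction of mine; the hidden string s is arbitrary): f is 2-to-1 and satisfies f(x) = f(x XOR s). The quantum algorithm recovers s with O(n) queries, while any classical algorithm provably needs exponentially many.

```python
def make_simon_f(n, s):
    """A 2-to-1 function on n-bit strings with hidden shift s != 0:
    f(x) == f(y) exactly when y == x ^ s (bitwise XOR)."""
    f = {}
    label = 0
    for x in range(2 ** n):
        if x not in f:
            f[x] = f[x ^ s] = label  # each coset {x, x ^ s} shares a label
            label += 1
    return f

n, s = 4, 0b0110
f = make_simon_f(n, s)

# The promise that Simon's (and, by extension, Shor's) algorithm exploits:
print(all(f[x] == f[x ^ s] for x in range(2 ** n)))  # True
```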

by aleph_minus_one

3/18/2026 at 10:10:01 AM

* From the announcement [0]:

ACM has named Charles H. Bennett and Gilles Brassard as the recipients of the 2025 ACM A.M. Turing Award for their essential role in establishing the foundations of quantum information science and transforming secure communication and computing.

* An accessible news excerpt from CNN Science [1]:

Years before emails, internet banking, cloud servers and cryptocurrency wallets, two scientists devised a way to keep secrets perfectly safe and indecipherable to eavesdropping outsiders.

Their 1984 work depended on the hidden, counterintuitive world of quantum physics, which governs the way the world works at the smallest, subatomic scale, rather than complex but theoretically breakable mathematical codes to secure data.

The insights of Charles Bennett, an American physicist who is a fellow at IBM Research, and Gilles Brassard, a Canadian computer scientist and professor at the University of Montreal, have since transformed cryptography and computing. The pair received the A.M. Turing Award on Wednesday for their groundbreaking work on quantum key cryptography.

[0] https://www.acm.org/media-center/2026/march/turing-award-202...

[1] https://edition.cnn.com/2026/03/18/science/quantum-key-crypt...

by srvmshr

3/18/2026 at 6:11:36 PM

> Bennett and Brassard, with Ethan Bernstein and Umesh Vazirani, showed that in the black-box setting, quantum computers would require Ω(sqrt(n)) queries to search n entries, matching Grover's algorithm. For some reason, the popular press rarely covers these results that limit the power of quantum computing.

This is mentioned almost as a footnote, but to (layman) me seems much more important than QKD, especially from a comp sci perspective instead of a physics perspective.

by bawolff

3/18/2026 at 7:07:59 PM

QKD can be sold today.

Quantum computers are not yet large enough to search at an `n` where `O(n)` is not viable but `O(sqrt(n))` is; that's where there's money to be made, especially if viability is defined by short time horizons. So it's a footnote for the future.

by shemnon42

3/18/2026 at 7:16:21 PM

> QKD can be sold today.

It can, but it isn't, largely because classical solutions solve the problem better, and you usually have to resort to classical solutions to handle MITM anyway, AFAIK. However, my point is less about practicality and more that QKD seems like a physics or engineering thing rather than a computer science thing.

After all, this is supposed to be a computer science prize, not a make-money prize, so which is more sellable should be beside the point.

by bawolff

3/18/2026 at 7:50:51 PM

Worth noting that this is a bound on unstructured search; there exist problems with structure (e.g. integer factorization, believed to be in NP and BQP but not in P) for which quantum algorithms are exponentially faster than the best known classical algorithms.

by shorden

3/18/2026 at 7:50:38 PM

That's two Turing Awards for the Université de Montréal in six years. Sadly, I never had either of those two professors during my years there!

I did attend Gilles' lunch talks though; they were really insightful!

by RRRA

3/18/2026 at 6:52:37 PM

The math might be beautiful, but I'm very skeptical that practical quantum computers will ever deliver on their promise.

by DrNosferatu

3/18/2026 at 8:43:29 PM

Forever is a long time, but I agree: people who assert that reality is the model are almost always eventually proven incorrect.

There is some interesting work being done, but it will never match the excessive hype. =3

"The Genius of Computing with Light"

https://www.youtube.com/watch?v=rbxcd9gaims

by Joel_Mckay

3/19/2026 at 7:32:25 PM

I would say optical computing will be practical well before QC is.

Time will tell.

by DrNosferatu

3/18/2026 at 10:57:07 PM

Not sure what is going on in the QC world; with this ACM prize it has become even more murky.

As Sabine Hossenfelder (a theoretical physicist) points out, companies involved in QC are seeing a surge in investment and marketing. It is as if somebody knows something that the "common public" doesn't - https://www.youtube.com/watch?v=gBTS7JZTyZY

I don't know enough about the science/technology to form an opinion but have recently started down the path of trying to understand it - https://news.ycombinator.com/item?id=46599807

by rramadass

3/19/2026 at 6:05:24 AM

> It is as if somebody knows something that the "common public" doesn't

oooorrr - and hear me out - investments are inherently hype-based and irrational, and there is too much money flying around to make actually smart decisions

by NooneAtAll3

3/19/2026 at 4:14:59 PM

Nope.

Quantum Computing (QC) is unlike previous technologies, which were all mostly "logical structures" (i.e. the underlying physics/technologies were well known). For QC, the viability of both the core physics itself and its realization through technology is questioned by some physicists/technologists themselves. But in 2024/2025, many governments and companies started investing heavily in QC. Moreover, the advanced countries have implemented export controls on QC technology, prohibiting the export of quantum computers above 34 qubits.

And now the ACM prize for something done long ago in quantum information.

Finally, note that QC algorithms can be simulated (for small numbers of qubits) on conventional computers, and current AI technologies may also play a part here, i.e. implementing QC algorithms on the "cloud supercomputer" and using AI techniques.

The logical inference is that there has been some technological breakthrough (one or more) in the realization of QC qubit technologies, in QC algorithms running efficiently on the cloud, in AI usage for QC, etc. Nothing else explains all of the above facts.

See also: The Case Against Quantum Computing by Mikhail Dyakonov (Professor of Physics) - https://spectrum.ieee.org/the-case-against-quantum-computing

by rramadass

3/19/2026 at 10:21:23 PM

> Quantum Computing (QC) is unlike previous technologies

aaand you entered the "hype and irrational" territory. I dare you to reread your own comment; it is funny

right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like Google) don't want to lose face

viability of the core physics does not imply the immediate creation of a product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

governments are only investing the same way they do in particle accelerators - in the form of research grants

simulation of QC is both extremely trivial (in an "exponentially slower" way) and existentially impossible (the whole sector would not exist if it were actually possible to use good old normal CPUs fast enough). Bringing in "AI technologies" only shows you as a gullible idiot that still parrots the AI bubble without understanding the exact details

If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.

The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive, in preparation for geopolitics and far-future tech. Error correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain of the base tech

by NooneAtAll3

3/20/2026 at 4:30:08 AM

> aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny

You have not understood the first thing about what I pointed out.

> right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like google) don't want to lose their face

While there has been hype, in the last couple of years things seem to have changed, now culminating in the awarding of the ACM Turing Award. Do you know anything about the physics/mathematics behind qubits (e.g. probabilities/superposition/phase/noise, etc.) and/or how they have been realized via technology (e.g. superconducting/photonics/trapped ions, etc.)? People are looking at "hybrid" quantum computers, i.e. conventional+quantum (e.g. IBM, Fujitsu), and at shuttling qubits on silicon (e.g. Hitachi), which allows existing foundry technology to be used for QC. This is huge.

> viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

Non sequitur.

> governments are only investing the same way as into particle accelerators - in form of research grants

No, governments are actively funding startups in this area and including technology research/transfers in their Free Trade Agreements with other governments.

> simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough).

Simulation of QC is not "extremely trivial" but requires HPC technology; datacenter/cloud technologies are also utilized here. Generally only around 30-50 qubits have been simulated, with 50+ qubits being exponentially prohibitive in terms of compute power/memory.

>Bringing in "AI technologies" only shows you as a gullible idiot that still parrots ai bubble without understanding exact details

To use your own language: this right here shows that you are just a clueless idiot about this domain. AI is a tool applied to various domains, e.g. AlphaFold for protein structures in biology, which solved an almost intractable problem. People are doing the same with QC+AI. There are a bunch of papers on this; for your edification, start with "Quantum Computing and Artificial Intelligence: Status and Perspectives" - https://arxiv.org/abs/2505.23860

> If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.

This demonstrates your gullibility. Since one of the best-studied use cases for QC is cryptography, if there has been a breakthrough in some lab (government/academia/company, all of whom have secrets), the powers-that-be would not want it to be widespread, mainly for security reasons. But hints might have been given and investments encouraged. Almost all QC companies have a government tie-up, and cryptographic technologies have been subject to export controls from the very beginning. Another scenario is defense applications. There are plenty more, but these two are the main ones.

> The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive in preparation for geopolitics and far future tech. Error-correction barely started to be useful at 100 cubits, so 34 makes no sense other than to minimize brain drain with base tech

That is the obvious, superficial take. Given what I have written above, what if semiconductor technology, i.e. the "hybrid" QC+conventional approach, allows one to simulate 100+ qubits easily now? What if there have been breakthroughs from using AI on QC algorithms, both existing and new? Have any formerly intractable problems in physics/chemistry/biology/mathematics been made tractable now due to AI usage? How many of these can be implemented on a QC? Etc., etc.

To summarize; you have to look at the whole complex picture before drawing conclusions. Merely parroting trivialities like "hype" is meaningless.

by rramadass

3/20/2026 at 8:08:37 PM

Commercialization can bring in speculators and hype, and I'd argue that speculation is necessary for accelerating market development. Commercialization brings with it unique forcing functions that don't exist in academic settings, and this historically leads to an acceleration of functional products. The first step is building a quantum computer to learn how to build a quantum computer. That step is done; while research continues in many areas, the remaining commercialization challenges are largely engineering in nature.

I've only seen 34-qubit simulators (e.g. AWS SV1). My understanding is that 34 qubits use 512 GB of RAM, and each additional qubit doubles the RAM requirement. So simulating 50 qubits would require 2^16 times that, about 33.6M GB of RAM.
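The doubling is straightforward to tabulate (a sketch assuming 16 bytes per double-precision complex amplitude; a simulator storing 32 bytes per amplitude or keeping a second working buffer would match a 512 GB figure for 34 qubits):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a full statevector: 2**n_qubits complex amplitudes."""
    return 2 ** n_qubits * bytes_per_amplitude

GiB = 2 ** 30
print(statevector_bytes(34) / GiB)  # 256.0 GiB -- fits on a large host
print(statevector_bytes(50) / GiB)  # 16777216.0 GiB, i.e. 16 PiB
```

Each added qubit doubles the figure, which is why full statevector simulation stalls around 45-50 qubits even on the largest HPC systems.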

100 logical qubits seems to be the minimal threshold for interesting/useful quantum computing, albeit with very limited use cases; classical still beats most. Quantinuum plans to hit that number in 2027, and IonQ (often cited as being a hype machine) expects to have 800 logical qubits in 2027.

The industry is moving out of the NISQ (noisy intermediate-scale quantum) era and into the fault-tolerant QC (FTQC) era. NISQ is experimental; FTQC is commercial (i.e. reliable, repeatable).

by qubob

3/19/2026 at 1:29:56 AM

Check out Eric Weinstein's latest theory about how frontier physics has gone "dark" (take it with a grain of salt; some of the other things he says might tempt you to discount him completely)

by hammock

3/18/2026 at 10:19:37 AM

Well-deserved and much-needed recognition for quantum key cryptography; for once, not a single mention of "AI" anywhere.

Congratulations to Charles Bennett and Gilles Brassard.

by rvz

3/18/2026 at 6:07:15 PM

Really curious, not a critique: apart from the possibility of intrusion detection due to the quantum nature of the communication link, what is special about the protocol mentioned?

by MeteorMarc

3/18/2026 at 8:13:35 PM

Coincidentally, the same people also recently received the Science Fiction Award 2025!

by kleiba

3/18/2026 at 8:25:22 PM

Source?

by aleph_minus_one

3/18/2026 at 8:33:49 PM

I suspect the parent considers quantum computing science fiction.

by layer8

3/19/2026 at 12:33:55 AM

[flagged]

by goatyishere25

3/18/2026 at 9:01:21 PM

[flagged]

by 314crypto58