A Look Back at 2023 and Quantum Predictions for 2024
The Quantum State · January 02, 2024
01:08:42 · 128.29 MB

In this episode of The Quantum State, we're wrapping up the year 2023 and making bold predictions for what 2024 holds in the realm of quantum technology. Host Anastasia Marchenkova, joined by quantum experts Dr. Peter Rohde and Dr. Gavin Brennen, guides us through the latest developments and future prospects in quantum computing.

🔗 Discussion Points:

Quantum Computing in 2023: A Quiet Year with Surprises:

Reflecting on 2023: Our hosts discuss the year's developments in quantum tech, overshadowed by AI but not without its surprises. 🌐

Increasing Qubit Counts and Error Correction:

Qubit Milestones: The recent announcements of 1000 qubit systems by Atom Computing and IBM, and what this means for quantum computing. 🔑

Error Correction Evolution: Exploring advancements in logical qubits, particularly QuEra's 48 logical qubit breakthrough. 🛡️

Quantum Computing and Cryptocurrencies:

Ethereum's Quantum Resistance Roadmap: Discussing Ethereum's approach to quantum-safe cryptography and the challenges for other cryptocurrencies. 💰

The Hybrid Approach in Security: The movement towards hybrid systems in platforms like Signal, Chrome, and SSH. Is this the future for cryptocurrencies too? 🖥️

The RSA-2048 Controversy:

Claimed Cracking of RSA-2048: Addressing the sensational claim and its implications. 🔑

Corporate Engagement in Quantum Technology:

Quantum in the Corporate World: Discussing how companies like Moderna, Foxconn, JPMorgan, and others are integrating quantum computing. ⚠️

General Predictions for 2024:

Quantum Forecast: Speculating on qubit development, potential real-world problem solving, and the investment landscape in quantum technology. 🌐

Join us next time for more explorations into the quantum computing universe with top field specialists!

 🎧 Stay Updated: To delve deeper into the world of quantum technologies and The Quantum State, subscribe and hit the notification bell for insightful future episodes.

[00:00:00] Welcome to The Quantum State, a podcast exploring the latest research and innovation in quantum computing. Join us as we dive into ground-breaking breakthroughs, trends and news shaping the quantum landscape. Hello and welcome back to The Quantum State.

[00:00:30] So today let's close out 2023 and talk about predictions in quantum technologies for 2024. So to me things felt pretty quiet in quantum this year. AI and ChatGPT really took over the hype. But at the end of the year we definitely got a few surprises in quantum technologies.

[00:00:47] So today we're going to go over some of the news stories in quantum tech in 2023 and look forward to those predictions in 2024. And I'm Anastasia Marchenkova and I'm joined by my co-hosts Peter Rohde and Gavin Brennen. Hello. So let's start on the qubit counts.

[00:01:04] We talk about this a lot recently. Atom Computing announced their 1000 qubit system and IBM did as well. Atom Computing is actually a neutral atom system and IBM is superconducting. So first let's talk about what does 1000 qubits really mean here?

[00:01:20] And what I really mean by this is with 1000 perfect qubits we would likely see a lot of real applications today, but we haven't seen that yet. So what are we waiting for and what are the gaps in those systems? Well the gap is exactly what you just said.

[00:01:34] You mentioned 1000 perfect qubits and perfect qubits is a very different reality to what we live in, which is noisy qubits. When people talk about perfect qubits they often call that logical qubits. The ones that you use for computation but the ones that you build

[00:01:50] are not the ones that you actually use for computation. You have to take a multitude of those together to error protect them and use codes to prevent noise. And that overhead means that the actual usable qubits you have

[00:02:01] is a small fraction of your actual number of physical qubits. With 1000 physical qubits today there's probably not one single usable qubit in a scalable error corrected sense. So it's a different benchmark altogether.
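The overhead Peter describes can be sketched with back-of-envelope arithmetic. The sketch below assumes a surface-code-like cost of roughly 2d² physical qubits per logical qubit at code distance d — an illustrative figure for this discussion, not any vendor's actual specification:

```python
def logical_qubits(physical: int, distance: int) -> int:
    """Rough count of logical qubits available from a physical-qubit budget.

    Assumes ~2*d^2 physical qubits per logical qubit at code distance d
    (a surface-code-like ballpark, purely illustrative).
    """
    overhead_per_logical = 2 * distance ** 2
    return physical // overhead_per_logical

# A 1000-physical-qubit machine at increasing code distances:
for d in (3, 7, 17, 25):
    print(f"distance {d:2d}: {logical_qubits(1000, d)} logical qubits")
```

At the small distances where you do get a handful of logical qubits, the error suppression is too weak to be useful; at distances generally considered necessary for deep algorithms, the count drops to zero, which is the sense in which today's 1000-physical-qubit devices have "not one single usable qubit."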

[00:02:17] Yeah but it is more than just being like a kind of a number that sounds big. It looks good on press releases. It is a significant milestone I think just because it shows that the companies have been serious about scaling up.

[00:02:34] There's a lot of things you can do without focusing on improving your architecture with dozens or maybe a couple hundred qubits. But if you really get to where you actually have 1000 qubits and particularly important thing to question is whether they're connected.

[00:02:58] And I will believe these companies that say they are connected. So if you can do that it means that you've probably invested a lot of time and money into getting the architecture to a point where it can actually accommodate all those qubits.

[00:03:15] And so that actually required redesigning and rethinking things beyond what was done in decades previously in experiments. And for example for superconducting qubits it's often thought that you're going to have to make the computer in modules where each module will have maybe a couple thousand qubits

[00:03:41] and that would be cooled by one of these cryogenic coolers to keep the noise down. And then you'll have to connect the modules together. So this sounds to me like getting at the point where you're actually starting to fill out what would be a module

[00:03:55] and for the neutral atom qubits they can probably fit some more but they're also going to have some limits. Yeah that's one of the big things I think in the superconducting side that we are seeing is this

[00:04:07] thousand qubit perhaps limit to how many could fit into one cryostat. There's a lot of hardware wiring just weight on the plates that goes into the superconducting systems. So it's exciting to see that at some point back in the day we thought we could maybe fit 10,000 qubits

[00:04:25] but then some of the other hardware issues started popping up on scaling. But yeah in the modular sense that's also something that I've seen. So IonQ is a trapped Ion company so it's not neutral atoms but even a few years ago

[00:04:39] they were already talking about the modular aspect here where they would say they would need to connect a lot of these systems together to get those logical qubits. Yeah I think maybe that should be more the selling point behind these companies making

[00:04:55] these developments rather than talking about the raw number which can be a very misleading figure and a very meaningless one if it's not properly contextualized. Really the architectural aspect of it is what should be the selling point because that's really what's indicative of

[00:05:12] what the forward potential of a platform is. Yeah and there's still a little interesting gap there so IBM has their own measurement of how good a quantum computer is it's called quantum volume on their end and they talk

a lot about the connectedness, gate fidelity, these sorts of things, but different systems are going to have different benefits. And so while Atom Computing, I believe, doesn't have their own kind of system for saying how good their qubits are, algorithmic qubits is another metric

that was proposed by IonQ, and you know connectedness in a trapped ion system is technically better, it's all-to-all connectivity because you can just shuttle the ions together. I haven't looked specifically here on the atom side to see what they claim on the connectedness

[00:06:00] but it's really interesting right now that we don't have a standard for saying what a good quantum computer is. Well the thing is that there is no single standard and I think maybe

that's the problem, everybody these days likes to reduce things to a single metric or a KPI, but there's no single number that can meaningfully characterize all of the important aspects of a large-scale quantum architecture, and the opinion that I've expressed in the past also is

really the ultimate measure is what's the practical utility of it, and at the moment none of them can say anything meaningful on that side of things, so people resort to

these other measures as an interim kind of approach to arguing what the forward potential is, but ultimately no single number is going to give a meaningful picture of what the potential is.
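As a concrete example of the single-number metrics under discussion: IBM's quantum volume boils down to the largest "square" random circuit (width equal to depth) whose measured heavy-output probability clears a 2/3 threshold. The sketch below is a simplification that glosses over the real protocol's statistical-confidence requirements, and the measurement data is made up for illustration:

```python
def quantum_volume(heavy_output_prob: dict) -> int:
    """Simplified quantum volume: 2**n for the largest width n where every
    square circuit size up to n has heavy-output probability above 2/3.

    heavy_output_prob maps circuit width n -> measured heavy-output
    probability. (IBM's actual protocol also demands statistical
    confidence over many random circuits.)
    """
    qv, n = 1, 1
    while n in heavy_output_prob and heavy_output_prob[n] > 2 / 3:
        qv = 2 ** n
        n += 1
    return qv

# Hypothetical measured data: the 4-qubit test falls below the threshold.
results = {1: 0.84, 2: 0.79, 3: 0.71, 4: 0.66}
print(quantum_volume(results))  # 8, i.e. 2**3
```

Note how the headline number collapses many asymmetric properties (connectivity, gate fidelity, crosstalk) into one figure, which is exactly the criticism being made here.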

[00:06:55] Yeah why do you think companies do that is it more for the investors is it more for talent? Well because the scientists know this right you know all the founders they're not dumb people

[00:07:08] they know exactly what this means. That's exactly correct but at the same time people do like to have benchmarks because whether it's investors or media or whatever the case may be you need to be able to say something numerical or quantitative about your achievements so people

[00:07:25] use these different measures and inevitably everybody who comes up with a competing architecture they'll think okay what measure can we define that makes our architecture look better than everybody else's and so every possible vendor out there will invent their own unique metric

and say we've achieved metric x, we've achieved metric y, none of them are intercompatible or say consistent things, but they're all just defined to optimally sell their own platform. That's maybe another reason why I'm disinclined to believe any kind of numerical KPI at the

[00:07:59] moment because there simply is no consistent way of doing it. Yeah actually I'll make a prediction that we're going to see next year companies start to advertise the number of logical qubits but that will also be a useless number because there'll be logical qubits that'll

[00:08:19] be in very small codes that won't be big enough to actually provide enough error correction for any reasonable or important problem. That's right so it'll be big marketing strategy where everybody jumps on board and says right you all need to turn your attention to the distinction between

physical qubits and logical qubits and everybody will promote that, and then everybody will start talking about their logical qubits, oblivious to the fact, or at least not being forthright about the fact, that there's no single benchmark for logical qubits either, because a logical qubit

also has many different numerical characteristics that relate to how scalable it is, and it'll be the same thing all over again. Yeah right, so that was another thing I was going to ask: how do we define the logical qubit? Because when people look at error correcting codes

in the beginning they see Shor's code, okay, nine qubits for the error correction side, right, so then they're like okay, we 10x, and then I say no no, that's just correcting the one, but then

there's the gates and you have to keep expanding, right, and so it very quickly blows up. So what do you all think they'll mean when they say logical qubit and what would you

[00:09:31] say the definition should be for a logical qubit? So I mean I would say a logical qubit is something that makes use of redundancy just very much like the classical encoded bit you're using many physical qubits to encode a single logical qubit it should be something where

[00:09:57] there's a well-defined code space and in order for you to be able to detect and correct errors errors should take you outside the code space and you should be able to

[00:10:14] you know measure when that happens or do an operation which can bring you back to the code space and you know this can be done in a variety of different systems that we usually think of

things in terms of qubits, each one is a two level quantum system and you have lots of them, but there are other ways to have, you know, kind of encoded qubits. One is the so-called GKP code

named after Gottesman, Kitaev, and Preskill from their 2000 paper, where they showed how you can put a qubit inside a harmonic oscillator, a quantum harmonic oscillator, and in principle a quantum harmonic oscillator has an infinite number of states but you can put a single qubit code in

there and you can do some correction on that. I mean that's the completely correct way to think about what a logical qubit is, but it still doesn't provide a consistent way of comparing different ones. If someone comes along with one platform and says they've implemented the

Shor code that you just mentioned using nine qubits per logical qubit, well, that's a statement that they have a logical qubit. Someone else coming along and making a GKP code, that's a completely different statement that has the same linguistic outcome, that you've got a logical qubit,

but completely different interpretations and completely different and inconsistent implications for how the architectures scale. I'm just generally reluctant to reduce anything to a single number, or even just a small number of numbers, because it all comes down to

[00:11:55] how things interconnect holistically it's not about the number associated with a particular physical or logical qubit it's about how the whole architecture works together to achieve an algorithmic outcome and what algorithmic outcome of practical utility that you're able to achieve

is the only meaningful benchmark in my view. So speaking of that, speaking of linguistics, we're getting deep into that, so recently someone gave me a prediction that they think they will see fault tolerant Shor's next year, and I said, what is fault tolerant,

what does that mean? Oh, that's okay, I'll take that bet, unless they mean like Shor's, as in factoring the number 15, they might, right. So that's another thing people say, like well, Shor's exists but it's only

factored 15 so far, so we're still far away. But yeah, the fault tolerant part, okay, so I mean if they're being precise about that, you know, they're going back to the original idea from von Neumann, who was looking at

how to do fault tolerant classical computing, and this means that you have a procedure which encodes your qubits in these logical code words for logical qubits, and you are able to correct errors and also be able to

accommodate errors during the correction as well as during the processing, and if you can get that correction down to a level where the error rate on your logical qubits is less than the physical error rate, then it means you've improved things by putting in a code,

and then if you make things big enough and get the error rates small enough relative to that threshold number, then you're in the game. Yeah, so if we saw all those things happen,

then yeah, that would be great, but we haven't seen that. But taking all of that back to what I was saying about getting an algorithmic outcome, the computer scientist's sort of interpretation of this would be, first of all, quantum computers give,

generally speaking, probabilistic outcomes, and what you want is that the majority of the time the outcome to your quantum computation should be correct, and if the majority of the time it's

correct, it means that if you repeat it a small number of times, whatever the majority outcome is, that gives you your vote, and it converges to the correct answer if it's majority correct,

[00:14:52] so it's not just about do you make the error rates better than they were that's a necessary condition but it's not sufficient at the end of it all what you want is that the majority of the time the algorithm as a whole probabilistically gives you the correct outcome

[00:15:08] most of the time and then you can efficiently get correct outcomes and that's really what the ultimate benchmark is what's what's the likelihood of the whole thing giving you the correct outcome

[00:15:20] and does that happen most of the time I'm going to challenge you a little bit on that because I really like the definition for like the benchmark but for example and I know this is not how it

works, but what if you could run it 10 times and it only gives you the right answer 10% of the time, so just one of those times it gives you the right one? So it is a minority of the time it gets the right

answer, but it does get the right answer in the end, and if you can quickly test that for correctness, like I'm thinking for Shor's algorithm, right, like you could just multiply back together and be like, it's correct. For practical purposes that could be good enough.
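The two strategies discussed around this point — cheap classical verification for problems like factoring, and majority voting when no cheap check exists — can be illustrated in a few lines. The noisy "quantum" output below is a stand-in simulation, not a call to any real device:

```python
import random
from collections import Counter

def verify_factoring(n: int, factors: tuple) -> bool:
    """Factoring is efficiently verifiable: just multiply back (an NP-style check)."""
    p, q = factors
    return p > 1 and q > 1 and p * q == n

# Even if only a minority of runs is correct, the cheap check filters them:
candidates = [(1, 15), (3, 5), (5, 5), (2, 7)]  # noisy outputs for n = 15
good = [c for c in candidates if verify_factoring(15, c)]
print(good)  # [(3, 5)]

def majority_vote(sample, repeats=101):
    """For problems without a cheap check, repeat and take the majority.
    If each run is correct with probability > 1/2, the vote's failure
    probability falls exponentially with the number of repeats."""
    counts = Counter(sample() for _ in range(repeats))
    return counts.most_common(1)[0][0]

random.seed(0)
# Simulated device: right answer (42) 80% of the time, junk otherwise.
noisy = lambda: 42 if random.random() < 0.8 else random.randrange(100)
print(majority_vote(noisy))  # 42, with overwhelming probability
```

This is the distinction Peter draws next: the first pattern only works when the answer can be efficiently classically verified; otherwise you're left relying on the second.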

[00:15:58] sure no what you just said is completely correct and in the context of the algorithm you mentioned it's a particular class of algorithms called NP problems meaning that you can efficiently verify the solution so sure if you've got a million outcomes to Shor's algorithm even if just one

of them is correct, if you can efficiently classically verify that outcome, which you can for Shor's algorithm by multiplying the numbers together, then you're in business. But not all algorithms have that property, lots of algorithms don't have the property that you can

efficiently classically verify them, and if you don't have that ability then it boils down to having confidence in the majority outcome. Awesome, so keep going. Oh yeah, you even see this in just classical computing, like there are probabilistic classical computers,

you can run them on laptops, it's like Monte Carlo sampling, and yeah, you're not always going to get the right answer, but exactly, as long as it's above 50 percent then, like Peter was

[00:17:06] saying you can do majority voting and the probability you will get the wrong answer after majority voting goes down exponentially fast with the number of repeats so as long as you're above 50 percent then usually computer scientists use two-thirds just to show that

you're actually above 50, then yeah, you're in the game. Okay great, so now that we've talked a little about that, I want to read out the QuEra press release, because this was announced during Q2B this year,

just a couple weeks ago, and it was interesting. So QuEra announced 48 logical qubits during Q2B this month, which was kind of a big thing because I don't think we've actually had a company announce logical qubits before. But they said researchers successfully executed large-scale

[00:17:56] algorithms on an error corrected quantum computer with 48 logical qubits and hundreds of entangling logical operations what do you all think that means 48 logical qubits well okay so we've already entered the era of what I was alluding to earlier which is everybody jumping on the

logical qubit bandwagon without defining any of the details of what that means. Maybe they did, but in the context of this conversation we haven't, so I have no idea how to interpret what it means when they say 48 logical qubits. Doing what? Are they arbitrarily interconnected? Is that logical

qubits when they're acting in isolation? Keeping in mind that the types of errors that things are subject to can be very diverse, it can be the case that usually a single qubit operation will have a very different noise profile than entangling operations, which are

generally harder to implement in most architectures. So logical qubit single qubit gates, no problem, but does that mean it extends to entangling gates? Well, they say entangling operations, but how much entanglement? You need entanglement over many, many qubits in your system, and all of

these things are asymmetric, they're not uniform, they're all individually quantified. I wouldn't know how to interpret 48 logical qubits without further qualification. Yeah, so I mean I do think it is a milestone, in the sense that they've taken seriously the challenge of actually doing

logical operations. So I mean, you know, for many years now different experimental groups and companies have shown how to make a logical qubit, but the processing has been minimal, and really

doing multiple entangling gates on different logical qubits just hasn't been done. So you know, I think you've got to give credit to QuEra for actually taking this seriously and doing the demonstrations, including some mid-circuit measurements. That's something that you don't

always hear about so much, but for most error correction, which is needed to make any quantum computation work, you use some measurements in the middle of the circuit

[00:20:31] to diagnose errors and then allow you to correct them but a lot of companies just haven't shown that they just do some operations and then measure at the very end and the reason they don't is because

the measurements are harmful oftentimes to their systems, they introduce a lot of noise, they're not very high quality. And so QuEra did do some of that, but it's also pretty clear that they would have to demonstrate a lot more in order to see these kinds of

massive entanglement at the logical level that Peter was talking about. Just to expand on that about the mid-circuit measurements, that's an important distinction. For error correction you do mid-circuit measurements, classically post-process the outcomes, and feed forward to change the rest of the circuit to

correct those errors, and then move on to the next stage of the computation. What lots of people have demonstrated is error correcting codes at a small scale in various different platforms. It doesn't necessarily mean that they do it the correct way using mid-circuit measurements; you

[00:21:29] could do a proof of principle demonstration of error correction just by running a circuit in bulk doing all the measurements at the end and then just arguing based on that post-selected result you can make an argument about the error correction but that doesn't mean it would

fit into a framework where you're trying to scale it up in a way that you have these successive layers of error correction. People often use all sorts of cheats and shortcuts in experiments, certainly in my field of optics this has long been the case,

[00:22:00] there are things that are technologically difficult so we just try to work around them and post-select things so you always need to be very very careful about the nuanced details of any statements to this effect what are those nuances how does that affect

ultimately the scalability of the whole system. Yes, so we've talked a little bit now about error correction, so let's talk a little bit about what's the current state of the art for error correction, because I mentioned the Shor code before, which is different than Shor's algorithm,

[00:22:33] for folks listening in it's actually error correction but it has been around for a very long time and that was kind of the beginning that people get into when they look into error correction but now there's much more efficient error correction codes yeah there's certainly art but

there's no single answer to what is the state of the art in error correction, because the correct choice of error correction framework is highly dependent upon what the underlying physical architecture is. If you're building a photonic quantum computer you would be

[00:23:08] gravitated towards choosing completely different types of error correction procedures than you would in some other kind of architecture because there are very unique properties of the underlying qubits with photonic qubits you measure things they get destroyed they also have the property

[00:23:24] that they're moving at the speed of light which means that you have all sorts of issues with having to very quickly process things before you move on to the next stage that's a set of constraints that's different to other physical architectures and would make you

choose different types of error correction techniques. People very commonly talk about surface codes and other kinds of geometrical codes related to that as being state-of-the-art, and they're great codes, but that doesn't mean they're suitable for everything. They're not so useful in optics,

[00:23:57] for example but they might be in other types of platforms so there again there's also no single answer so okay but I will say we were talking about assigning numbers to error correction and performance and so at least with quantum error correction people often use two numbers not

[00:24:19] just one but two which is one is called the code rate which is the ratio of the logical qubits to the physical qubits so having a constant code rate is seen as good

[00:24:35] maybe you know it's like I don't know maybe one over 50 so one logical qubit per 50 physical qubits but that that's considered good as opposed to a rate which would be zero which is what you

[00:24:48] get with the surface code because you only get one logical qubit but the number of physical qubits gets arbitrarily large the other number is code distance which is the minimal weight operation

which performs a logical operation on your code. And for good codes, well, the holy grail would be to have a distance which is linear in the number of physical qubits, because this means if the

distance is big, it means it takes a lot of errors to corrupt your data. And remarkably, there has been progress over the past decade in theory, and now in experiment also, on demonstrating

some codes that reach those properties, so we are kind of in a renaissance in that respect on quantum error correction, in finding ways to do that. It wasn't obvious, people have known

how to do this for a long time in classical codes, but you get obstacles with quantum codes. But people have done that with these low density parity check codes, and my feeling is

that at some point the companies will settle down on a code that's good enough, maybe not optimal, but has good enough properties, and then they will, you know, work with that

and try to optimize control and fabrication in their architectures to make it work on a large scale. And I think once we start to see companies actually settle on a quantum error correction code,

it means that they've really made that decision, and then I think we're going to see rapid advances from that point. Great, so Gavin, this is also one of your favorite papers of the year, from Alice & Bob, as we keep talking about error correction and Shor's and new architectures:

it's the paper Computing 256-bit elliptic curve logarithm in 9 hours with 126,133 cat qubits. So first, why are you excited about this, and what are cat qubits? Yeah, this is an awesome paper

to read, of course it's a theory paper and I'm a theorist, so yeah. But first of all, it is exciting because it's a huge reduction on previous estimates. Like, you know, I looked at estimates

before on how many qubits you'd need to crack elliptic curve cryptography, and the answer was, you know, we were getting estimates in the 10 million range. That had been reduced to like a

couple million maybe two and a half years ago, and then there was this big reduction to just 126,000. And when you look at that, it doesn't actually seem so far away now. It probably

is still, but then, you know, you're starting to think, oh, this isn't something which is maybe going to happen in 2040, this could be something that happens in the early 2030s. And the

fact that they went through the full analysis I think is quite encouraging. Now, they make some assumptions about the error rates, so they do an assumption of what's called a

biased noise model, where errors of one type are more likely to happen than another type, but that's what a lot of experimentalists and companies do anyway, and they also make some assumptions about

connectivity, but you know, okay, it might grow when you have a limited range of interactions between your qubits, but still, that's really quite a notable number. Yeah, and what are these cat qubits? Well, they're called cat qubits because they are manifestations of the idea of

Schrodinger's cat. So you know, Schrodinger's cat is the idea by Schrodinger that you could have a superposition of a dead and alive cat, and he wrote this idea sort of like, you know,

a macabre gedankenexperiment, as a way to question how far can we take quantum mechanics. If you take it to, you know, large-sized systems beyond just single atoms, then you can get, you know, these very

big objects to be in a superposition state, and the ultimate one is being alive and dead. So cat qubits, we should really call them Schrodinger kitten embryo qubits, they're bigger in a sense than having a single atom in a superposition. It involves

a harmonic oscillator mode, which has an infinite number of degrees of freedom, but it's something that can be controlled easily in a laboratory, and the cat qubit just has a zero which is one type of superposition and a one which is another type of superposition,

and with a different sign. And then they showed how these cat qubits work really well in biased noise environments, and they managed to get this result, so I think it's quite

exciting. Yeah, I spoke to the founder at Q2B and he said to come work at the company you have to love cats, and I told him I'm a dog person. But yeah, very exciting, and I think that really

underlines the point about the whole hardware side that we've been talking about: there are still a lot of new discoveries to be made, and this one kind of came out of nowhere, a huge order

of magnitude reduction. And it's starting to get to that really interesting point we've talked about in other episodes, how hardware qubit counts are increasing while the number of qubits needed is decreasing, and it's a really interesting crossing point when we get there. So in the next section I want to talk

a little bit about cryptocurrencies and the security side. So Ethereum released its quantum resistance roadmap this year. I'm personally not surprised that Ethereum was first, since Vitalik Buterin has long been writing about post-quantum encryption and the quantum threat,

but it has felt like in the last couple years everyone was kind of waiting for someone to make a move. So I'm going to read a little bit from that quantum resistance roadmap. They say: the challenge

facing Ethereum developers is that the current proof of stake protocol relies upon a very efficient signature scheme known as BLS to aggregate votes on valid blocks. The signature scheme is broken by quantum computers, but the quantum resistant alternatives are not as efficient.

Also, the KZG commitment schemes used in several places across Ethereum to generate cryptographic secrets are known to be quantum vulnerable. Currently this is circumvented using trusted setups where many users generate randomness that cannot be reverse engineered by a quantum computer.

However, the ideal solution would be simply to incorporate quantum safe cryptography instead. There are two leading approaches that could become efficient replacements for the BLS scheme: STARK based and lattice based signing. These are still being researched and prototyped.
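The efficiency gap the roadmap alludes to is easy to make concrete with rough arithmetic. The signature sizes below are ballpark public figures (BLS signatures on BLS12-381 are 48 bytes; ML-DSA/Dilithium-2 signatures are about 2.4 KB; SPHINCS+-128s about 7.8 KB), and the per-slot signature count is a made-up round number for illustration:

```python
# Back-of-envelope: why signature size matters for blockchain throughput.
# Byte counts are ballpark public figures, not exact parameter-set numbers.
SIG_BYTES = {
    "BLS (BLS12-381)": 48,                 # current Ethereum consensus signatures
    "Lattice (ML-DSA/Dilithium-2)": 2420,  # NIST post-quantum standard
    "Hash-based (SPHINCS+-128s)": 7856,    # conservative PQ alternative
}

VALIDATOR_SIGNATURES_PER_SLOT = 10_000  # hypothetical round number

for name, size in SIG_BYTES.items():
    total_mb = size * VALIDATOR_SIGNATURES_PER_SLOT / 1e6
    print(f"{name:30s} {total_mb:8.2f} MB of signatures per slot")
```

And raw size is only part of it: BLS signatures can be aggregated into a single 48-byte signature, a property most post-quantum schemes lack, which is part of why the roadmap calls the alternatives "not as efficient."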

[00:32:08] so what do you all think about this approach and is it doing enough yeah this one is extremely nuanced there are lots of considerations when it comes to migrating to post quantum cryptography at the most superficial level the black box definition of these cryptographic

[00:32:29] protocols is the same you have a key you have or key pair you have a message or something that you're signing what they achieve is the same but the inner workings are very relevant depending upon the application in the context of a blockchain application the efficiency of the

[00:32:48] implementation of these algorithms is hugely important if you're just signing an email it doesn't matter if you've got a tiny little bit of latency you as the user aren't going to notice it when it's decrypting behind the scenes but if you've got some blockchain

[00:33:02] protocol where behind the scenes there are a huge number of signatures going around and having to be verified and signed any tiny little efficiency difference can manifest itself quite significantly in the overall throughput of a blockchain the rate at which it can

[00:33:21] validate transactions so this is where it becomes a very difficult problem so other places have made similar announcements like the signal messaging protocol that many people use for secure end-to-end encryption they've also made an announcement that they're switching to post

[00:33:37] quantum cryptography but there the consideration is different it's more like what I was saying before the end user isn't going to notice it much if there's a bit of algorithmic efficiency difference for an Ethereum type of blockchain it's very hard to know exactly how these

[00:33:53] differences will manifest themselves but in a potentially hugely significant way the other issue to consider is that when we switch to post quantum cryptography it's not a matter of just switch the old cryptography off and start using the new one first of all you have to remain

[00:34:11] compatible with what you've had in the past and then transition to new cryptography in the future secondly it's not good practice just to do a hard switch the new generation of post-quantum cryptography is effectively untested for large-scale deployment it's being standardized and people

[00:34:29] have a lot of confidence in it but the ultimate test is the test of time where you wait for hackers to hit it with a hammer for many many years and if nobody takes all your money you have

[00:34:40] confidence that things are correct in the interim period people instead double encrypt things using the old generation techniques and the new generation ones so that the signing is at least as secure as either of them that means that there's even more overhead and so all of these trade-offs

[00:34:59] need to be considered and it's a really complex task to get right the current standardization process for post-quantum cryptography being undertaken by NIST it's not just about defining the algorithm

[00:35:13] it's about considering all of the different resource trade-offs what kind of key lengths do you need for a given amount of security how does that affect the implementation of the algorithm its runtime

[00:35:25] and resource usage getting those trade-offs into the best possible regime is a big part of that standardization process and one of the hardest things to get right which can be highly application

[00:35:38] dependent as well so I don't have a black and white answer to the question but those are some of the considerations that make it a very hard thing to get right yeah and uh I think actually

[00:35:54] it's very interesting the fact that we're living in a time where cryptocurrencies are basically accessible around the world and you know bitcoin's been around since 2009 and it's essentially been stable I mean yeah it's fluctuated wildly in value

[00:36:19] but um it hasn't really been hacked you know there have been exchanges that have been hacked but the protocol itself has been robust which means that people are starting to feel like it's you know

[00:36:36] as a protocol not just bitcoin but many other cryptocurrencies as well like Ethereum people see them as good testbeds for new cryptography so I think it's actually very good that cryptocurrencies will be some of the first adopters because

[00:36:57] we know because there's money to be made that hackers will do everything they can to crack these post quantum cryptographic protocols they'll try side-channel attacks they'll try to find every vulnerability possible because they can make money

[00:37:17] but the thing is you will know when you've been attacked because you will have fewer coins in your wallet so I think it's in a sense better that some of the first platforms to trial these post quantum cryptographic standards are cryptocurrencies

[00:37:36] because it's probably better to risk some crypto tokens than say your medical records or nuclear secrets and this could be a testbed where we will see

[00:37:53] live results yeah and probably people will announce if they've been hacked some won't of course because they don't want to jeopardize their other holdings but some people will and then we'll start to see that feedback into the protocols themselves

[00:38:13] so how they're implemented and then I think this will be a good testbed for having them deployed in other areas it's a really good point because blockchains are by their nature public all the information showing the whole history of the ledger

[00:38:31] and all the transaction records and all the signatures it's necessarily public and it's got trillions of dollars potentially attached to it so coming back to the hammer attack test of time from hackers that's your testbed it's hard to fabricate a better testbed

[00:38:51] and especially in the Web3 space there's the notoriety you know there are a lot of hackers that steal stuff and then just give it all back just to say they can so it really is kind

[00:39:01] of the perfect environment so peter you mentioned a little bit about Signal moving to the hybrid system Google Chrome a couple months ago also released their hybrid system where it's like

[00:39:14] you mentioned before the strongest of the two is going to hold up right so even if we crack one there's a stronger one and we're going to be fine

[00:39:24] is there a hybrid approach cryptocurrencies can take well the technique is certainly applicable and in principle it's very simple instead of just doing a digital signature on something you do it twice using two different techniques but the

[00:39:42] problem there is that then you've got all the resource overhead including the latency and the execution time of both of them um and in a blockchain that's very relevant because the goal of blockchains like ethereum is to aim towards a high throughput network that can support

[00:40:02] future financial infrastructure where huge numbers of transactions are taking place every second and Web3 frameworks like you mentioned that require a high speed back end that potentially undermines all of that and holds it back it's very difficult to know what

[00:40:19] the right thing to do there is because there is an inherent contradiction between using double encryption or double signatures using two different algorithms which slows things down while at the same time aiming towards this goal of a high throughput network I don't know how to

[00:40:36] get that balance right but it's certainly correct to do it in a hybrid method but exactly what that implies for Ethereum or other blockchains that's really hard to know yeah and the upgrades are just a total pain sure yeah they're hugely risky things right
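The hybrid idea described here, sign everything with both an old-generation scheme and a post-quantum one and accept only if both verify, can be sketched roughly as follows. This is an illustration rather than any production protocol: the RSA parameters are toy-sized, and the hash-based half uses a Lamport one-time signature as a stand-in for standardized schemes like SPHINCS+.

```python
import hashlib, os

# --- Scheme 1: toy "classical" RSA signing (tiny insecure parameters, illustration only) ---
P, Q = 999983, 1000003             # small known primes; real RSA uses 2048-bit moduli
N = P * Q
E = 65537
D = pow(E, -1, (P - 1) * (Q - 1))  # private exponent

def rsa_sign(msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    return pow(h, D, N)

def rsa_verify(msg: bytes, sig: int) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    return pow(sig, E, N) == h

# --- Scheme 2: Lamport one-time signature (the hash-based, post-quantum flavour) ---
def lamport_keygen():
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def _bits(msg: bytes):
    digest = hashlib.sha256(msg).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def lamport_sign(sk, msg: bytes):
    return [sk[i][b] for i, b in enumerate(_bits(msg))]   # reveal one preimage per bit

def lamport_verify(pk, msg: bytes, sig) -> bool:
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, _bits(msg))))

# --- Hybrid: sign with both; verification fails unless BOTH check out ---
def hybrid_sign(lamport_sk, msg: bytes):
    return rsa_sign(msg), lamport_sign(lamport_sk, msg)

def hybrid_verify(lamport_pk, msg: bytes, sig) -> bool:
    rsa_sig, lam_sig = sig
    return rsa_verify(msg, rsa_sig) and lamport_verify(lamport_pk, msg, lam_sig)

sk, pk = lamport_keygen()
msg = b"transfer 1 ETH"
sig = hybrid_sign(sk, msg)
print(hybrid_verify(pk, msg, sig))          # True
print(hybrid_verify(pk, b"tampered", sig))  # False
```

The overhead mentioned in the discussion is visible even in this toy: the hash-based signature alone is 256 × 32 bytes, far larger than the classical one, which is exactly the throughput cost a high-volume blockchain has to budget for.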

[00:40:56] you can easily go so wrong when you do a blockchain migration like a hard fork or something yeah and then if you do hybrid you might have to do it twice right so with all that in mind do you think we'll see any transitioning cryptocurrencies in 2024

[00:41:12] oh seems inevitable yeah I mean there are already some that are designed from the ground up like the Quantum Resistant Ledger that from inception were using hash based cryptography so will there be more absolutely I can't predict which ones but they're all going to have to

[00:41:32] and the other thing to keep in mind also is that the specifications of blockchains can vary enormously they're all based on the principle of an immutable ledger that you can backward check everything but they also need to maintain forward integrity you don't want them

[00:41:51] to be retrospectively compromised because in principle you can go back to a previous point in the blockchain and try and fork it off you can do that um and you need to remain robust

[00:42:00] against that so it's not just about switching the blockchain from now forward all future transactions to post-quantum cryptography you've got to think well if quantum computers come along that can run Shor's algorithm and crack past cryptography what does that imply retrospectively for the

[00:42:17] previous history of the blockchain does the way it's designed provide retrospective avenues for compromise that's another nuance that can vary enormously between blockchains as well yeah and because we're talking financial assets here there are futures markets

[00:42:35] for cryptocurrencies and those cryptocurrencies that do not transition will have the value of those futures go down so you know that can happen next month even if we don't get quantum computers

[00:42:56] that could be viable candidates to crack it until like the 2030s futures values can respond now i mean on that front then it is also very important that major blockchains clearly state what their quantum resistance roadmap is like as you were saying Ethereum is doing

[00:43:18] if a blockchain doesn't have that kind of forward roadmap and make it clear what its commitment to a pqc migration process is uh people could look at that and say well

[00:43:30] the forward value of this blockchain is going to be nothing because in the future this is going to be compromised because they haven't got a migration roadmap to make it post-quantum secure

[00:43:41] so if the forward value is zero then by definition under a normal discounting economic model the spot price is also zero so um it affects the value of blockchains today what their future integrity is and what confidence you have in their forward security so making these announcements

[00:44:00] about transitions is extremely important and failing to do so completely undermines the present day value of blockchains yeah before we started taping this episode we were talking about prediction markets again and it keeps coming up so i'm pretty sure we should have an

[00:44:18] episode soon where we just sit down on live streams setting up all these prediction markets for the questions that we answer that would be a lot of fun uh speaking on the prediction markets

[00:44:28] and surprising news stories so last month a person on LinkedIn posted that he had cracked 2048-bit RSA i remember i personally saw this news article on my google home page i was half asleep and i just

[00:44:42] kind of chuckled and rolled over to go back to sleep and then to my surprise in the next day or two the media picked it up and they were all over it and all these articles start popping up

[00:44:51] so summary of that he kind of refused to give out the details asked people to DM him for the paper i also heard rumors it was actually embedded with a virus but that was just someone commenting

[00:45:03] on my tiktok videos so i don't know if that's true but obviously nothing has come of it since so yeah i definitely want to be a little more productive than just mocking it a

[00:45:13] little bit but here's the question if you had cracked RSA what would you actually do would you publicly announce it would you publish it would you kind of just run away and hide underground

[00:45:25] from the NSA and the CIA so you stay alive what would you do well i guess that depends on your personality type if you're an honest person and an academic at heart you might

[00:45:37] announce it and publish a scientific result uh otherwise uh you'd obviously just go and use it to steal someone's wallet you'd say which crypto wallet has the biggest amount of money

[00:45:49] i'll crack that one and take all their cash it would be surprising if someone did neither which appears to be the case here there's neither an announcement of how it was done scientifically

[00:46:01] nor any evidence that they just ran off with billions of dollars uh which means uh i'm not actually familiar with this so i'm only speculating but uh there may be some other reason behind

[00:46:11] it were they manipulating markets or attempting to and making money that way you can easily make money that way too it's hard to know yeah so i can tell you that like

[00:46:25] if i was working with a team that actually managed to do this i would tell the intelligence agencies of my country the U.S. you go there first i would yeah and that's

[00:46:42] it's partially out of self-preservation you know like as a scientist my first inclination is always to just let the world know so just take a big composite number and

[00:46:59] write down its prime factors and say look i did this but there are security implications to that suddenly the calculus changes and yeah you're quite right i would do that it is

[00:47:16] somewhat of a bigger issue than how much cash can i make right this has pretty big implications if you can actually go and crack RSA-2048 yeah and you do want to be incredibly intelligent in

[00:47:29] the way and sensitive about the way you handle that if it is actually true right i think that was my big problem when i saw this first that that's not how the announcement would take

[00:47:40] place and also the lack of proof right so one argument someone posted to me was like oh well he doesn't want to reveal trade secrets but i was like that's fine give him a public

[00:47:50] key and have him give you the private key you know the proof can be there without revealing trade secrets so you can do a zero knowledge proof that they have

[00:47:58] the protocol by cracking a number that you give them yeah we could go down that route so yeah i guess i wasn't surprised that the media picked it up.
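The challenge-response verification suggested here is easy to sketch: hand the claimant a modulus whose factors only you know, and accept the claim only if they return the factors. This is not from the episode; the 32-bit parameters below are purely illustrative, and a real challenge would use a fresh 2048-bit modulus.

```python
import random

# Challenge-response check for a claimed RSA-cracking capability.
# The verifier keeps p and q secret, publishes only n = p*q, and
# accepts the claim only if the correct factors come back.

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):              # Miller-Rabin primality test
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits: int) -> int:
    while True:
        candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate):
            return candidate

def make_challenge(bits: int = 32):
    p, q = random_prime(bits), random_prime(bits)
    return p * q, (p, q)                 # publish n, keep (p, q) secret

def check_response(n: int, claimed_p: int, claimed_q: int) -> bool:
    # Accept only nontrivial factors that actually multiply to n.
    return 1 < claimed_p < n and 1 < claimed_q < n and claimed_p * claimed_q == n

n, secret = make_challenge()
print(check_response(n, *secret))        # True: the real factors pass
print(check_response(n, 1, n))           # False: trivial "factors" are rejected
```

Verifying a response takes one multiplication, while producing one requires the claimed capability, which is exactly why this test reveals nothing about how the factoring was done.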

[00:48:08] i was a little disappointed in the end i've stopped being disappointed in things i see in the media i just by default assume the worst unless there's a reason not to great so now i want to talk a little bit about corporate engagement in quantum

[00:48:28] technologies again sometimes there's a little bit of mocking in the field like oh well this company is doing this sort of research on quantum and it's pointless because we're still so far away

[00:48:39] but a lot of companies are starting to engage in the technology and so i wanted to go through some of these and discuss which things will work well on quantum computers and which might not

[00:48:49] so one of the interesting ones and kind of relevant to this time Moderna which no one had heard of before but now everyone knows that name so they said that they're using the combination of advanced formulation discovery with generative AI to create new mRNA drugs using quantum approaches

[00:49:06] will this work on quantum computers so there are a lot of buzzwords in that sentence the combination of generative AI and quantum is one that's not completely clear to me but certainly there's room for quantum algorithms in quantum simulation which is relevant to

[00:49:26] to these sorts of things uh when you're talking about looking at biomolecular interactions that's heavily influenced by quantum phenomena that can't be easily classically simulated so there's certainly the potential for quantum algorithms to advance things in medicine and drug design

[00:49:44] the genetics biotechnology the moment that people start combining it with other buzzwords by saying generative AI quantum techniques biochemistry i'm thinking that there are too many buzzwords here that doesn't just happen on its own that's by design so when i just see

[00:50:02] quantum on its own that's fine when i see too many other buzzwords i immediately start asking what's going on yeah right i mean so okay i guess to sort of give the most optimistic point

[00:50:22] of view um you could say that you know one of the most valuable things that quantum computers could give pharmaceutical industry um is is not necessarily discovering new drugs but being able to accelerate the process of bringing those drugs to market

[00:50:48] so there's an enormous lag time in many countries like the u.s for good reason to bring drugs to market because they have to be tested because they're going to be used on humans um and uh you know

[00:51:01] this can take over a decade and a lot of that lag time is just in testing and so if you could accelerate that process for example by finding flaws quicker um where maybe you know some

[00:51:20] catalysis reaction doesn't happen and you don't get the molecule you thought you were going to get or you know some proposed pharmaceutical is not actually binding in the

[00:51:29] way it should then that could be a huge benefit basically it'll knock out the ones that won't work quicker and so yes maybe there is a way to do that i don't see that

[00:51:47] anytime soon the number of qubits and the number of operations necessary to do things at the level of testing pharmaceuticals or the molecules that are on the scale for these kinds of

[00:52:08] drugs is in the hundreds of millions and it's a very long distant process i think for us getting to that level but i don't know i mean i think

[00:52:28] it's good they're looking into it but i wouldn't hold my breath for you know major demonstrations in that area soon yeah definitely so and one of the things is just because the press release

[00:52:42] has a bunch of buzzwords doesn't mean there's not good fundamental research going on right because as we've talked about in previous episodes simulating chemical reactions is a big hot topic and potentially a near-term application and so there are a lot of companies getting into that space
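A back-of-envelope calculation shows why these chemistry problems "can't be easily classically simulated," as noted earlier: exact simulation stores 2^n complex amplitudes for an n-qubit system. The sketch below assumes 16 bytes per double-precision complex amplitude.

```python
# Why exact classical simulation of quantum chemistry blows up:
# an n-qubit state vector needs 2**n complex amplitudes,
# at 16 bytes each for double-precision complex numbers.

def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 50, 100):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:.3g} GiB")
# 30 qubits fit on a workstation (16 GiB), 50 already need ~16 million GiB,
# and 100 qubits exceed any conceivable classical memory.
```

This exponential wall is the basic argument for quantum simulation, while also being why the "hundreds of millions of qubits" estimate for fault-tolerant drug-scale simulation keeps the application long-term.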

[00:52:58] so the next one i want to talk about is JPMorgan i mean JPMorgan has been in the quantum space for a very long time they're collaborating on financial applications but also Moody's

[00:53:11] analytics if you look at the QuEra press announcement they are involved on the hardware side there and in the same week also put out a press release with Multiverse Computing

[00:53:23] on the simulation side as well so that was interesting to me to see the financial industry moving more and more towards the hardware space as well so again will this work

[00:53:35] on quantum computers or will it not i mean it always comes down to the distinction can it be helpful in principle versus does it help in practice that's hard to answer but certainly there's potential people in many different areas within quantum computing

[00:53:54] have been looking at optimization algorithms that could be applicable to portfolio optimization or optimizing portfolios under the Black-Scholes option pricing model that kind of thing where you get these very complex optimization protocols that are hard to do classically
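For reference, the Black-Scholes model mentioned here has a classical closed form for a European call option, which is the kind of baseline any quantum pricing or optimization method would have to beat; the parameter values below are purely illustrative.

```python
from math import erf, exp, log, sqrt

# Classical closed-form Black-Scholes price for a European call option,
# the baseline that proposed quantum Monte-Carlo / amplitude-estimation
# speedups in finance are typically measured against.

def norm_cdf(x: float) -> float:
    # Standard normal CDF built from the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot: float, strike: float, rate: float, vol: float, t: float) -> float:
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

# e.g. an at-the-money 1-year call: 5% rate, 20% volatility
price = bs_call(spot=100.0, strike=100.0, rate=0.05, vol=0.2, t=1.0)
print(round(price, 2))  # 10.45
```

The closed form itself is instantaneous to evaluate; the quantum speedup claims concern the harder cases, such as path-dependent derivatives and large portfolio optimizations, where classical methods fall back on expensive Monte-Carlo sampling.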

[00:54:14] and people have certainly come up with arguments showing that quantum methods can yield enhancement here whether it amounts to making any dollars in practice uh that's harder to know um especially in an asymmetric market where uh it would initially at least be a very small

[00:54:35] number of players that would have access to quantum optimization while most wouldn't it's hard to know how that would affect the dynamics of the market would it just result in that handful of market participants with quantum technology completely dominating the space

[00:54:49] now or would it just be a much more subtle benefit that may not actually be tantamount to a huge increase in portfolio margins it's really hard to know but there's definitely

[00:55:02] potential at least in theory at an algorithmic level i mean one of the biggest problems in economic modeling isn't so much the algorithms themselves it's that you don't have perfect information about things it's the inability to have full knowledge of the system which

[00:55:23] often limits what you can do same in like weather modeling you've got a chaotic system and it's not that the computers can't run the equations it's that it's hugely sensitive to initial conditions and you can't know them perfectly no matter how hard you try these

[00:55:37] sorts of things are very fundamental and quantum computing doesn't offer a solution to that in any way right one of the option pricing experiments can be done just on a five-qubit superconducting

[00:55:49] chip with a lot of noise it's been around for a while yeah so something you see too is companies using quantum inspired algorithms so for example Multiverse uses tensor network calculations and these tensor networks were originally designed to study

[00:56:16] many body quantum systems like how electrons hop on a surface and it turns out you can apply some of those same methods to studying other problems in finance so even though they're not using a quantum device to speed things up you're using

[00:56:37] computational tools originally developed to study quantum systems to speed up financial predictions so i want to close out by asking about the BTQ and Foxconn collaboration it's been announced that BTQ has entered into a research and collaboration agreement with the Hon Hai Research

[00:57:01] Institute so can you all elaborate a little on that yeah so i should just point out that Foxconn has large offices in taiwan and BTQ has their main research office in taipei

[00:57:23] so they're physically close in that context and yeah they basically signed an agreement to pursue research together on post quantum cryptography and given the fact that you have TSMC the largest chip manufacturer in the world

[00:57:43] also in taiwan it's sort of a strong collaboration i think i've heard that the taiwanese government has made the pursuit of post quantum cryptography a national priority

[00:57:57] so yeah i think it's a good sign you're seeing these different companies start to work together for a common goal and elaborating on taiwan taiwan is in a fairly unique position it's globally a dominating force in semiconductor

[00:58:19] manufacturing especially via TSMC that gavin mentioned now a big aspect of implementing post quantum cryptography it's not just the software algorithms behind it but lots of the cryptography we rely on today has hardware level optimization because these are routines that

[00:58:37] get used so frequently they just put those algorithms directly into the chipset and so there's a huge place in the market for people to be designing hardware optimization units following for example an ARM type of model ARM the company that makes lots of the instruction

[00:58:58] set architectures that we rely on today they have a model where they design how these instruction sets and hardware modules work and license them out there's certainly a lot of space for a similar thing when it comes to hardware level optimization of post quantum cryptographic

[00:59:13] primitives to replace the current generation of hardware acceleration units that exist in our chips so being there in the place where the manufacturers of semiconductor hardware are that are designing all the hardware combined with expertise in post quantum cryptography

[00:59:33] and the ability to find that intersection between hardware and software that's a really important role in the economy yeah and i think one of the really cool parts of that collaboration is actually the work to support the academic community in promoting the standardization

[00:59:48] as well we've talked about before the really interesting part of this is you need cryptographers you need physicists you need a lot of different academics coming together to do all this a lot

[00:59:59] of the hacker types to kind of break through the system so it's good to support that as well yeah it's a hugely intersectional area that as you said requires very specialist expertise from quite a few different areas it's very hard to find that

[01:00:17] definitely so now i want to close out with some rapid fire general predictions so very quick answers how many qubits will we see next year and you can pick your kind of qubit well if we're just talking arbitrary qubits where a company says

[01:00:39] we have this many qubits and that's the end of the statement then the answer is i don't care how big the number is because by implication it probably isn't actually useful if they

[01:00:50] said we've got one error protected qubit that we can do the identity operation on in other words no computation just holding it in memory but maintaining its quantum state effectively over a long timescale that i would find incredibly exciting if someone just had one logical qubit

[01:01:07] that they could keep as a quantum memory for a very long period of time that would excite me yeah well i feel like i should have a shot of scotch right now to make these predictions properly but i will predict we're gonna see

[01:01:26] three to ten thousand qubits announced that are connected qubits maybe with some bottlenecks in the connectivity but it'll be something where in principle there's a pathway from any one qubit to another by gates so i'll put out the number ten

[01:01:49] thousand wow that's nice i know ibm has a roadmap to double every year so we're at least getting the two thousand from them next year because they haven't wavered from that roadmap all right next

[01:02:01] will we solve a real world problem that's a very interesting question maybe the initial question to ask is what's the first real world useful problem we could implement

[01:02:19] where there will be a practical use and even that doesn't have a clear answer because different algorithms can be inherently suited to different types of architectures i don't have a prediction on when we're

[01:02:30] going to have the first utility that i think is perhaps more likely to be a black swan kind of event but i have a feeling it won't be something like cracking a code it'll

[01:02:41] probably be more something in the quantum simulation space where there's a lot more maneuverability on the type of algorithms you could deploy i'm gonna say no i think you know there are already annealers

[01:02:59] like D-Wave which has said that they've used their device for you know routing traffic but i don't really consider that i mean yes it's a real world problem but if you spent all the money

[01:03:13] for that quantum annealer on just more time on a supercomputer you could have done the same thing and probably faster so um uh yeah i don't think we're gonna see uh a speed up on a real world

[01:03:28] problem next year i also say no i think it's more like three years so next one is quantum winter coming can we just define quantum winter whether that's everything being broken or you know funding

[01:03:46] or whatever you think quantum winter is i think certainly uh there'll be a pivot in the market at some point because we're clearly going through a boom stage where everybody's just flooding the market with investment thinking wow this is the next hot thing can't go wrong

[01:04:03] but whenever that happens you reach a point where people realize they've overestimated the potential of certain technologies or someone comes in and qualifies a statement and you realize it's not quite as profitable as you thought there'll be a huge

[01:04:17] correction at some point so using that definition an investment quantum winter there will be one but i don't see that as a bad thing i see it as creative destruction it's the market learning what works and what doesn't and from that learning reallocating investment into the things

[01:04:37] that are more likely to work so quantum winter yes but i don't see it as bad but next year oh possibly but certainly in the next few years i would think i mean how long can the hype last

[01:04:54] yeah i don't think it's going to happen next year uh there's not going to be all kinds of crazy stuff happening like for the election in my country um but uh i don't think we're going to get to see

[01:05:05] quantum winter next year but uh i can see it happening um yeah maybe three years from now interesting so for my take i think quantum winter kind of already passed i thought that was

[01:05:21] the last two years just because there was a lot of skepticism in the market and for me it feels like especially seeing what's come out in the last few months we're kind of on the upswing

[01:05:32] we lost a lot of talent over the last two years Martinis left Google a couple of people left for the AI industry which i think was a big loss for the field but 2024 is going to be huge with NIST

[01:05:44] releasing the standards so i think the PQC side is going to be back on the upswing now with that leading into the next few years of the simulation and the early applications

[01:05:55] interesting so for the winter that passed do you think it was mainly for hardware companies then i think so i think there are a lot of hardware companies that have

[01:06:08] died off and there's consolidation going on there's still going to be a little bit of consolidation but my view is this is the next era of quantum companies in the next few years

[01:06:19] they're actually going to be focusing on applications and they're just starting to come up yeah i wonder if there'll be a winter for software that would be interesting i think that's where i saw the consolidation where people were trying to build software for stuff that doesn't

[01:06:36] exist yet also you know in the venture capital industry there's an issue where you have to exit within 10 years in some way and just looking at those timelines i'm like a software-only company may not get there right all right so any other predictions

[01:06:58] you want to put on the record for next year yes perhaps back to the PQC issue i think in the next year we're going to start seeing mass announcements of PQC migration strategies and

[01:07:18] adoption of PQC at the cryptographic level it's much more predictable i think everybody knows that PQC is necessary and it's not a bet on any particular quantum computing architecture it's a

[01:07:31] bet on quantum computing as a whole and that's a much safer bet because it's a bet on the entire market rather than a particular technology so there i'm very confident we're going to see a very rapid change uh in transitioning to post-quantum cryptography yeah

[01:07:54] i think this is where our prediction market episode will do really well sure the market knows best right well it does sometimes i guess but if it's about money the market is the best sure by definition it is right it's self-defining yeah

[01:08:14] all right so if there's nothing else now we have this on the record so next year we can review it and see what we got right and what we got wrong so thank you so much for joining us on this

[01:08:25] episode of the quantum state make sure if you're on youtube to comment down below if you have any questions or want us to talk more about certain topics happy to do that

[01:08:34] if you are on spotify you can find us there or any platform where you get your podcasts and thank you so much