A milestone in quantum physics, rooted in a student project

What began as an undergraduate thesis at Caltech, later continued in graduate school at MIT, has grown into a collaborative experiment between researchers from MIT, Caltech, Harvard, Fermilab, and Google Quantum AI. Using Google's Sycamore quantum processor, the team simulated traversable wormhole dynamics: a quantum system that behaves analogously to how certain wormholes are predicted to work in theoretical physics.

Here's what they did:
• Implemented two coupled SYK-like quantum systems on the processor, representing black holes in a holographic model.
• Sent a quantum state into one system.
• Applied an effective "negative energy" pulse to make the simulated wormhole traversable.
• Observed the state emerge on the other side, consistent with quantum teleportation.

This wasn't just classical computer modeling: it ran on real qubits, using 164 two-qubit quantum gates across nine qubits.

Why it matters: The results are consistent with the ER=EPR conjecture, which suggests a deep link between quantum entanglement and spacetime geometry. In the holographic picture, patterns of entanglement can be interpreted as wormhole-like "bridges." This experiment shows how quantum processors can begin to probe aspects of quantum gravity in a laboratory setting, complementing astrophysical observations and theoretical work. While no physical wormhole was created, this is a step toward using quantum computers to explore some of the most fundamental questions in physics.

What breakthrough in science excites you most? Share your thoughts below, and let's discuss how quantum computing is reshaping our understanding of reality.

♻️ Repost to help people in your network. And follow me for more posts like this. CC: thebrighterside
Quantum Computing Applications
Explore top LinkedIn content from expert professionals.
-
Many of you will have seen the news about HSBC's world-first application of quantum computing in algorithmic bond trading. Today, I'd like to highlight the technical paper that explains the research behind this milestone.

In collaboration with IBM, our teams investigated how quantum feature maps can enhance statistical learning methods for predicting the likelihood that a trade is filled at a quoted price in the European corporate bond market. Using production-scale, real trading data, we ran quantum circuits on IBM quantum computers to generate transformed data representations. These were then used as inputs to established models including logistic regression, gradient boosting, random forest, and neural networks.

The results:
• Up to 34% improvement in predictive performance over classical baselines.
• Demonstrated on real, production-scale trading data, not synthetic datasets.
• Evidence that quantum-enhanced feature representations can capture complex market patterns beyond those typically learned by classical-only methods.

This marks the first known application of quantum-enhanced statistical learning in algorithmic trading. For full technical details, please see our published paper:
📄 Technical paper: https://lnkd.in/eKBqs3Y7
📰 Press release: https://lnkd.in/euMRbbJG

Congratulations to Philip Intallura Ph.D, Joshua Freeland, and all HSBC colleagues involved, and huge thanks to IBM for their partnership.
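The paper's actual circuits aren't reproduced here, but the general pattern described above (encode classical features into a quantum state, then hand the transformed representation to a classical model) can be sketched with a small statevector simulation. Everything below is an illustrative construction of mine, not HSBC/IBM's feature map: angle-encode each feature with an RY rotation, entangle with a CNOT chain, and use the measurement probabilities as the new features.

```python
import numpy as np

# Illustrative sketch only: a toy "quantum feature map", not the circuits
# from the HSBC/IBM paper. Classical features are angle-encoded with RY
# rotations, entangled with a CNOT chain, and the resulting measurement
# probabilities become transformed features that a classical model
# (logistic regression, boosting, ...) can consume.

def ry(theta):
    """Single-qubit RY rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def cnot(n, control, target):
    """CNOT on an n-qubit register (qubit 0 = most significant bit)."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for basis in range(dim):
        bits = [(basis >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        out = 0
        for b in bits:
            out = (out << 1) | b
        U[out, basis] = 1.0
    return U

def quantum_features(x):
    """Map a feature vector (one angle per qubit) to 2^n measurement probabilities."""
    n = len(x)
    encoder = np.array([[1.0]])
    for theta in x:                      # encoding layer: RY(theta_i) on qubit i
        encoder = np.kron(encoder, ry(theta))
    state = encoder[:, 0]                # circuit applied to |00...0>
    for q in range(n - 1):               # entangling layer: CNOT chain
        state = cnot(n, q, q + 1) @ state
    return state ** 2                    # amplitudes are real here, so |amp|^2

phi = quantum_features(np.array([0.3, 1.2, -0.7]))
print(phi.shape)            # (8,)
print(round(phi.sum(), 6))  # 1.0
```

The point is only the data flow: the 2^n-dimensional probability vector replaces (or augments) the raw features before any classical learner sees them.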
-
Everybody's asking about the killer app for quantum computers. But when a team actually uses one to explore fundamental physics in a way we couldn't before, the silence from the broader community is deafening. Really?

I've talked about using quantum computers for exploring physics before. I get it: it's not the immediate, disruptive application that VCs and market analysts want to hear about. But I find it absolutely amazing that we're finally building tools that allow us to understand our world one layer deeper.

A new paper from Google Quantum AI & collaborators is a perfect case in point. The team tackled a monster of a problem in condensed matter physics: how to simulate systems with disorder. Classically, this is a brute-force nightmare: you have to simulate thousands or even millions of different disorder configurations one by one, which can take an exponential amount of time.

Instead of simulating one configuration at a time, Google used their quantum processor to prepare a state that is a superposition of all possible disorder configurations. Then they gave it a tiny kick of energy in one spot and watched what happened.

The result? The energy stayed put. It refused to spread. This is a phenomenon called Disorder-Free Localization (DFL). Even though the system's evolution and the initial state were perfectly uniform and disorder-free, the underlying superposition over different "backgrounds" caused the system to localize.

It's a stunning demonstration of quantum mechanics at work on a scale that's incredibly difficult for classical computers to handle, especially in 2D. But this isn't just a cool physics experiment. This work carves out a concrete path to quantum advantage. The team proposed an algorithm based on this technique that offers a polynomial speedup for sampling disordered systems.

So yes, let's keep working toward fault-tolerant machines that can break RSA and optimize your portfolio. But let's not ignore the incredible science happening right now.

📸 Credits: Google Quantum AI & Collaborators (arXiv:2410.06557), Pedram Roushan
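The classical bottleneck described above, one simulation per disorder configuration, is easy to make concrete. Here is a hedged toy example of my own (an Anderson-style tight-binding chain, not the model from the Google paper): averaging a "does the energy stay put?" observable over every binary disorder pattern already takes 2^n simulations for only n sites.

```python
import numpy as np

# Classical brute-force disorder averaging for a toy tight-binding chain.
# (Not the model from the Google paper -- just an illustration of why the
# classical route scales exponentially: one simulation per configuration.)

def return_probability(disorder, J=1.0, t=5.0):
    """Probability that an excitation starting on site 0 is still there at time t."""
    n = len(disorder)
    H = np.diag(disorder.astype(float))
    for i in range(n - 1):                # nearest-neighbour hopping
        H[i, i + 1] = H[i + 1, i] = J
    vals, vecs = np.linalg.eigh(H)        # exact time evolution U = exp(-iHt)
    U = vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T
    return abs(U[0, 0]) ** 2

n, W = 8, 4.0
# every binary disorder pattern: each site at +W or -W
configs = [np.array([(c >> i) & 1 for i in range(n)]) * 2 * W - W
           for c in range(2 ** n)]
avg = np.mean([return_probability(d) for d in configs])
print(len(configs))   # 256 separate simulations for a mere 8 sites
```

Doubling the system to 16 sites means 65,536 simulations; the superposition trick in the paper is what removes this enumeration.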
-
The high-energy physics (HEP) community is particularly poised to benefit from quantum computing due to the intrinsic quantum nature of its most complex computational challenges. These include theoretical models that are hard to tackle with classical computers and the complex data analysis required for the interpretation of experiments like those carried out at the Large Hadron Collider. In a collaborative effort led by CERN, DESY, and IBM, a roadmap has been created to outline the current state of quantum computing in the HEP community. This roadmap highlights both theoretical and experimental applications that can be pursued with near-term quantum computers. The work emphasizes the potential of quantum computing to address challenging problems in HEP and aims to encourage continued exploration and development of quantum applications in this field. I look forward to seeing the roadmap overviewed in this paper get closer to fruition, and to the next published paper that will come out of our working groups, pushing for near-term use cases for quantum computing. Read the paper here: https://lnkd.in/eCibpTg2
-
History was made this week in financial markets. HSBC, Europe's largest bank, has proven that quantum isn't just theory... it's a powerful competitive advantage. In partnership with IBM, HSBC's quantum pilot delivered a 34% improvement in predicting bond trade fill rates at quoted prices. In markets where milliseconds move billions, that edge is transformative. By combining quantum and classical computing, HSBC tackled complex pricing algorithms that factor in real-time market conditions and risks. Philip Intallura, HSBC's Group Head of Quantum Technologies, explained: "It means we now have a tangible example of how today's quantum computers could solve a real-world business problem at scale." Why it matters:
• Quantum computing is projected to become a $100B market within a decade (McKinsey).
• Finance is a proving ground where nanoseconds and probabilities drive outcomes.
• HSBC just demonstrated how quantum can deliver measurable results today.
Quantum is still in its early stages, but breakthroughs like this set the benchmarks for what comes next. Which industry do you think will unlock the first trillion-dollar quantum advantage? #QuantumComputing #FinancialMarkets #BondTrading #FinTech #InnovationLeadership #HSBC #IBM
-
This image is from an Amazon Braket slide deck that just did the rounds of all the Deep Tech conferences I've been at recently (this one from Eric Kessler). It's more profound than it might seem. As technical leaders, we're constantly evaluating how emerging technologies will reshape our computational strategies. Quantum computing is prominent in these discussions, but clarity on its practical integration is... emerging. It's becoming clear, however, that the path forward isn't about quantum versus classical, but about how quantum and classical work together. This will be a core theme for the year ahead. As someone now on the implementation partner side of this work, getting the chance to work on specific implementations of quantum-classical hybrid workloads, I think of it this way: Quantum Processing Units (QPUs) are specialised engines capable of tackling calculations that are currently intractable for even the largest supercomputers. That's the "quantum 101" explanation you've heard over and over. What's missing from that usual story is that QPUs require significant classical infrastructure for:
- Control and calibration
- Data preparation and readout
- Error mitigation and correction frameworks
- Executing the parts of algorithms not suited for quantum speedup
Therefore, the near-to-medium-term future involves integrating QPUs as accelerators within a broader classical computing environment. Much like GPUs accelerate specific AI/graphics tasks alongside CPUs, QPUs are a promising resource for accelerating specific quantum-suited operations within larger applications. What does this mean for technical decision-makers? Focus on Integration: Strategic planning should center on identifying how and where quantum capabilities can be integrated into existing or future HPC workflows, not on replacing them entirely.
Identify Target Problems: The key is pinpointing high-value business or research problems where the unique capabilities of quantum computation could provide a substantial advantage. Prepare for Hybrid Architectures: Consider architectures and software platforms designed explicitly to manage these complex hybrid workflows efficiently. PS: Some companies, like Quantum Brilliance, have focused on this space from the hardware side from the outset, working with Pawsey Supercomputing Research Centre and Oak Ridge National Laboratory. On the software side, the likes of Q-CTRL, Classiq Technologies, Haiqu and Strangeworks are all tackling the challenge of managing actual workloads (with different levels of abstraction). Speaking to these teams will give you a good feel for the topic and the approaches. Get to it. #QuantumComputing #HybridComputing #HPC
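The QPU-as-accelerator pattern often takes the concrete form of a variational loop: a classical optimizer repeatedly calls the quantum device to evaluate a cost function. Here is a minimal sketch, with the QPU call mocked by a one-qubit statevector and gradients obtained from the parameter-shift rule; the function names are my own illustration, not any vendor's API.

```python
import numpy as np

# Sketch of the hybrid pattern described above: a classical optimizer driving
# a "QPU" in a loop. The QPU call is simulated with a statevector here; in a
# real stack this function would submit a circuit to hardware via a cloud SDK.

def qpu_expectation(theta):
    """Simulated QPU call: prepare RY(theta)|0> and measure <Z> (= cos(theta))."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)

def hybrid_minimize(steps=100, lr=0.2, shift=np.pi / 2):
    """Classical gradient descent; each gradient costs two QPU evaluations
    via the parameter-shift rule."""
    theta = 0.1
    for _ in range(steps):
        grad = 0.5 * (qpu_expectation(theta + shift) - qpu_expectation(theta - shift))
        theta -= lr * grad
    return theta, qpu_expectation(theta)

theta, energy = hybrid_minimize()
print(round(energy, 4))   # -1.0, the minimum of <Z>
```

All the orchestration (the loop, the optimizer state, deciding when to stop) lives on the classical side; the QPU only answers "what is the expectation value at these parameters?". That is the integration problem the platforms named above are built around.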
-
Researchers at Northwestern University (USA) have made a significant breakthrough in quantum communication by successfully teleporting a quantum state of light (a qubit carried by a photon) through approximately 30 kilometers of optical fiber while simultaneously transmitting high-speed classical data traffic. Key details include:
- The fiber length used was around 30.2 km.
- It carried a classical signal of approximately 400 Gbps in the C-band alongside the quantum channel.
- The quantum channel operated in the O-band, using special filtering and narrow temporal/spectral techniques to shield delicate photons from noise such as spontaneous Raman scattering from the classical channel.
This experiment confirms that quantum teleportation of a quantum state can coexist with classical internet traffic in the same fiber infrastructure. It's important to clarify that "teleportation" in quantum communication does not involve moving the physical photon or "beaming" objects as depicted in science fiction. Instead, it refers to the transfer of the quantum state of a qubit from one location to another using an entanglement-based protocol coupled with classical communication. The original qubit is destroyed during this process and recreated at the destination. While quantum teleportation enables inherently secure quantum communication channels (since measurement disturbs quantum states), practical deployment still faces challenges, including node security, classical channel security, side-channels, and error rates. This marks a significant step toward quantum-secure networks, though it is not yet a complete "unhackable" solution. The experiment also suggests that we may not require entirely separate fiber infrastructure dedicated solely to quantum communications; existing telecom fiber could be effectively utilized. It enhances the feasibility of developing quantum networks and, eventually, a "quantum internet" that integrates with classical infrastructure.
From a security and cyber perspective, it supports the architecture of quantum-secure communications, including quantum key distribution and entanglement-based signaling. Overall, this represents a major technological milestone in photonics, quantum information science, and telecom integration.
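The entanglement-plus-classical-bits protocol described above can be sketched end to end in a few lines of linear algebra. This is an idealized, noiseless statevector model, with none of the fiber loss, filtering, or O-band/C-band engineering of the real experiment.

```python
import numpy as np

# Idealized statevector sketch of the teleportation protocol: a shared
# entangled pair plus two classical bits moves an unknown qubit state from
# Alice to Bob. Alice's copy is destroyed by her measurement.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                      # Alice's unknown qubit

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # entangled pair: q1 Alice, q2 Bob
state = np.kron(psi, bell).reshape(2, 2, 2)     # axes (q0, q1, q2)

# Alice entangles her qubit with her half of the pair: CNOT(q0 -> q1), H on q0
state = np.where(np.arange(2)[:, None, None] == 1, state[:, ::-1, :], state)
state = np.einsum('ij,jkl->ikl', H, state)

for m1 in (0, 1):                               # every possible 2-bit message
    for m2 in (0, 1):
        bob = state[m1, m2, :].copy()           # Bob's qubit in this branch
        bob /= np.linalg.norm(bob)
        if m2:
            bob = X @ bob                       # Pauli corrections selected
        if m1:
            bob = Z @ bob                       # by the two classical bits
        fidelity = abs(np.vdot(psi, bob)) ** 2
        assert fidelity > 1 - 1e-9              # Bob now holds |psi>
print("teleported in all four branches")
```

Note how the two classical bits are essential: without them Bob cannot choose the right correction, which is also why teleportation cannot transmit information faster than light.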
-
Microsoft Azure Quantum and Quantinuum announced today a new milestone in the race to practical quantum computing. Some time ago, vendors changed direction from having more qubits to having better qubits. As with many things in life, useful quantum computers are a matter of quality rather than quantity when it comes to qubits. Having high-quality qubits and the ability to manipulate them with low error rates is a requirement for creating any useful quantum circuit, the equivalent of a program in a classical computer. In April, both partners announced having set up 4 logical qubits (i.e., high-quality, error-corrected qubits) from the combination of 30 physical qubits on a Quantinuum H-series machine. (https://lnkd.in/ePp5MC7t) Now they claim to have achieved 12 logical qubits on a 56-qubit Quantinuum H2. All 12 logical qubits were entangled in a GHZ state with a circuit error rate of 0.0011. Scaling to error rates on the order of 10^-3 is great news. Practical quantum computers able to address the complex problems waiting for them will require much more than that, but we are on the way. https://lnkd.in/e6e6v8Tk
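For intuition, the GHZ benchmark itself is simple to write down in a noiseless simulation: one Hadamard followed by a CNOT chain. The hard part, which the announcement is actually about, is executing these as logical operations on encoded qubits with a total circuit error of only 0.0011; the sketch below shows only the idealized math.

```python
import numpy as np

# Noiseless statevector illustration of the benchmark above: a 12-qubit GHZ
# state from one Hadamard and a CNOT chain. On the real machine these are
# *logical* gates on error-corrected qubits (12 logical encoded in 56
# physical); the 0.0011 circuit error rate is what the encoding leaves behind.

n = 12
state = np.zeros(2 ** n)

# H on qubit 0 (most significant bit): (|00...0> + |10...0>)/sqrt(2)
state[0] = state[2 ** (n - 1)] = 1 / np.sqrt(2)

# CNOT chain q0 -> q1 -> ... -> q11, applied directly to basis-state indices
for q in range(n - 1):
    new = np.zeros_like(state)
    for idx in np.nonzero(state)[0]:
        if (idx >> (n - 1 - q)) & 1:                     # control qubit set:
            new[idx ^ (1 << (n - 2 - q))] = state[idx]   # flip the target bit
        else:
            new[idx] = state[idx]
    state = new

# GHZ state: equal superposition of all-zeros and all-ones
print(np.count_nonzero(state))                          # 2
print(round(float(state[0] ** 2 + state[-1] ** 2), 6))  # 1.0
```

GHZ states make good benchmarks precisely because all n qubits must stay coherent at once: a single error anywhere destroys the superposition, so the measured GHZ fidelity is a sensitive summary of the whole stack.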
-
Woke up today thinking about how atomic particles carry information: a shift that could redefine computing and communication. We typically think of information transfer through wires and circuits. But at the smallest scales, individual particles (photons, electrons, even atoms) are changing how things could work.
1 / Qubits in Quantum Computing
In quantum systems, particles like photons and electrons store information as qubits. Unlike traditional bits, qubits use superposition and entanglement to process certain problems exponentially faster, transforming fields like cryptography and complex optimization.
2 / Photonic Communication (bullish here)
Photons transmit data in fiber optics, but in quantum communication, single photons enable secure data transfer. Quantum key distribution (QKD) leverages photons to detect interception attempts, creating highly secure networks.
3 / Spintronics for Data Storage
Electron spin, rather than charge, is used in spintronics, leading to faster, energy-efficient storage technologies like MRAM. This approach could revolutionize data density and durability, key for next-gen devices.
4 / Atomic Computing
At the experimental edge, atoms themselves are being explored as data carriers. Single-atom transistors demonstrate the potential for ultra-compact processing power, hinting at a new frontier in computing miniaturization.
Atomic-scale information transfer is reshaping tech, moving us beyond circuits to a new paradigm where particles drive performance. Thoughts?
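Point 2's claim, that QKD lets you detect interception, can be illustrated with a toy BB84 simulation. This uses plain classical bookkeeping (one bit and one basis per photon, a deliberate simplification of the real optics): an eavesdropper who measures in random bases pushes the error rate on basis-matched bits from 0% to roughly 25%, which Alice and Bob spot by comparing a sample of their key.

```python
import random

# Toy BB84 sketch: why measuring photons is detectable. Classical
# bookkeeping only (one bit + one basis per photon), a deliberate
# simplification of the real single-photon optics.

def bb84_error_rate(n, eavesdrop, rng):
    """Error rate on sifted bits (rounds where Alice's and Bob's bases match)."""
    errors = sifted = 0
    for _ in range(n):
        alice_bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)
        bit, basis = alice_bit, alice_basis        # the photon in flight
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:                 # wrong-basis measurement gives
                bit = rng.randint(0, 1)            # a random outcome and re-prepares
                basis = eve_basis                  # the photon in Eve's basis
        bob_basis = rng.randint(0, 1)
        result = bit if bob_basis == basis else rng.randint(0, 1)
        if bob_basis == alice_basis:               # sifting step
            sifted += 1
            errors += int(result != alice_bit)
    return errors / sifted

rng = random.Random(42)
clean = bb84_error_rate(20000, eavesdrop=False, rng=rng)
tapped = bb84_error_rate(20000, eavesdrop=True, rng=rng)
print(clean)    # 0.0 -- no eavesdropper, no errors on sifted bits
print(tapped)   # close to 0.25 -- Eve's presence shows up as errors
```

The security argument is exactly this asymmetry: Eve cannot measure without sometimes choosing the wrong basis, and every wrong choice leaves statistical fingerprints.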
-
Three weeks ago, our Devsinc security architect walked into my office with a chilling demonstration. Using quantum simulation software, she showed how RSA-2048 encryption, the same standard protecting billions of transactions daily, could theoretically be cracked in just 24 hours by a sufficiently powerful quantum computer. What would take her classical computer billions of years to attempt, quantum algorithms could solve before tomorrow's sunrise. That moment crystallized a truth I've been grappling with: we're not just approaching a technological evolution; we're racing toward a cryptographic apocalypse. The quantum computing market tells a story of inevitable disruption, surging from $1.44 billion in 2025 to an expected $16.22 billion by 2034, a staggering 30.88% CAGR that signals more than market enthusiasm. Research shows a 17-34% probability that cryptographically relevant quantum computers will exist by 2034, climbing to 79% by 2044. But here's what keeps me awake at night: adversaries are already employing "harvest now, decrypt later" strategies, collecting our encrypted data today to unlock tomorrow. For my fellow CTOs and CIOs: the U.S. National Security Memorandum 10 mandates full migration to post-quantum cryptography by 2035, with some agencies required to transition by 2030. This isn't optional. Ninety-five percent of cybersecurity experts rate quantum's threat to current systems as "very high," yet only 25% of organizations are actively addressing this in their risk management strategies. To the brilliant minds entering our industry: this represents the greatest cybersecurity challenge and opportunity of our generation. While quantum computing promises revolutionary advances in drug discovery, optimization, and AI, it simultaneously threatens the cryptographic foundation of our digital world. The demand for quantum-safe solutions will create entirely new career paths and industries. What moves me most is the democratizing potential of this challenge. Whether you're building solutions in Silicon Valley or Lahore, the quantum threat affects us all equally, and so does the opportunity to solve it. Post-quantum cryptography isn't just about surviving disruption; it's about architecting the secure digital infrastructure that will power humanity's next chapter. The countdown has begun. The question isn't whether quantum will break our current security; it's whether we'll be ready when it does.
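For context on why quantum computers threaten RSA specifically: Shor's algorithm reduces factoring N to finding the period of a^x mod N. The sketch below performs that period-finding classically for N = 15, which is exactly the step that is exponentially hard classically and efficient on a quantum computer; this is an illustration of the reduction, not a quantum algorithm.

```python
from math import gcd

# Why quantum breaks RSA: Shor's algorithm reduces factoring N to finding the
# period r of f(x) = a^x mod N. Below, the period is found by brute force --
# the exponentially hard part that a quantum computer replaces with the
# quantum Fourier transform. Fine for N = 15; hopeless for RSA-2048.

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N) -- classical brute force."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Factor N using the order-finding reduction at the heart of Shor."""
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)        # lucky: a already shares a factor
    r = find_period(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                             # bad choice of a; retry in real Shor
    p = gcd(pow(a, r // 2) - 1, N)              # a^(r/2) -+ 1 share factors with N
    return p, N // p

print(shor_classical(15, 7))   # (3, 5)
```

Post-quantum schemes (lattice-based cryptography and friends) are built on problems with no known reduction of this kind, which is why the migration the memorandum mandates targets the algorithms, not just key sizes.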