This story appears in the November issue of Utah Business.
Last month, Utah Business partnered with Foley & Lardner to host a roundtable featuring Utah's quantum and cloud computing experts. This conversation was moderated by Dr. Taylor Sparks, director of ReUSE REU and professor of materials science and engineering at the University of Utah.
What are the latest developments in cloud computing?
Adam Frisbee | Adjunct Professor, OIS Operations & Info Systems | University of Utah
I've seen the rise of specialty clouds, especially platforms. AWS [Amazon Web Services], Azure and GCP [Google Cloud Platform] are traditionally infrastructure providers, but we've seen the rise of platform providers like Salesforce or Workday. These are cloud platforms that serve a special purpose. And that reduces the risks and costs associated with building and maintaining a lot of the infrastructure. … The edge is an exciting frontier for the cloud. I've read that it's called "the fog," not clouds in the sky anymore, clouds among us. The fog is where we are starting to do more computing.
Troy Rydman | Chief Information Security Officer | Amazon Web Services
At AWS, we are looking at how to help our customers get out of the data center business. Our customers don't want to be in the data center business and maintaining infrastructure. Two things we're doing are 1) a service architecture where you move to services and adopt them without having to spin up something that's referred to as a bare metal system. You simply execute the service you want and utilize it. You don't have to maintain anything with it. … And 2) is around very custom processor sets. We are now creating systems and processors and CPUs. There are specific use cases that significantly reduce energy costs by 80 to 90 percent and increase output.
Tony Kanell | Senior Engineering Manager | NVIDIA
There's a massive retargeting of workforce skill sets where, with serverless technologies, you have to think about things in a highly parallel state that you're not used to designing for. Engineers who are used to building things in a certain way suddenly have to relearn what "good" looks like in the system and how to architect things accordingly. … We're using data to drive these decisions as opposed to feelings. We're making clear trade-offs. … If we are sacrificing some difficulty and deployability or development time, we see clear benefits in performance or scalability.
Jeremy Fillingim | Co-Founder & CTO | PassiveLogic
As the computing power available at the edge becomes more and more substantial, … it becomes really important, I think, that all the computing becomes more the same, which puts a lot of stress and emphasis on networking technology. It's really interesting to watch the pendulum swing back and forth between cloud and edge. … We have to be able to take advantage of all of the technologies in both places.
Dr. Barclay Burns | Assistant Dean, Applied AI | Utah Valley University, Smith College of Engineering & Technology
I'll give you a use case. I'm an advisor for Intermountain Health on a project around neurodiverse kids. … We're building a network of parents who can support each other … [using] the best knowledge we can get from research papers and run a model on it. That has to be private, specialized and secure. We can't ever have anything reach into a healthcare system that's not utterly secure. We have to know exactly what words have been in there to train it. We're able to train these models, and we're figuring out ways in which to interpret [the research] so parents can actually engage with scientific literature and understand what's happening with their child. They're not becoming the doctor or the therapist, but when they meet with the doctor or the therapist, they're more effective and more efficient. What's going to be really essential to this is the edge. … We'll have a version of the AI modeling on the phone that the parent can talk to.
Peter Bookman | Founder & CEO | GUARDDOG AI
The argument has been made not to fear the current version of AI because, in a nutshell, it can't be freed. There is no free thinking. You have to compile it. Therefore, it'll always do what it's instructed, every single time. There is no "it breaks free" because the hardware won't let it, and neither will the software. It can't. But quantum can. I am thinking, subconsciously, 11 million thoughts a second. I'm aware of 40 of them; those are all in the frontal lobe. Everything we've done so far is frontal lobe stuff, but how do we think subconsciously? When we put that all together, we get to this: I do my best thinking when I'm not thinking.
What is quantum computing? What can the general public use quantum computing for?
Dr. Massood Tabib-Azar | USTAR Professor, ECE | University of Utah, Department of Electrical & Computer Engineering
Classical computing is based on bits, zero and one. Zero is usually represented by some voltage level, let's say zero volts. Then one is represented by some other voltage, maybe three volts or four volts. What will happen if you try to superimpose them on each other? Usually, zero wins. It's going to short out the one. You cannot have a superposition of many bits. … That's the power you get in quantum computing: Qubits [quantum bits] enable you to form a superposition of zeros and ones. It enables you to add a whole bunch of quantum zeros and quantum ones. Once you have those, you have superposition, and you can process them in parallel. It gives you an exponential speed-up.
The second thing is entanglement. Entanglement enables you to take two quantum bits and cause a correlation between them. The qubits themselves don't know what state they are in, but once you perform a measurement on one of them, the other one's state is fixed.
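Superposition and entanglement as described above can be sketched with ordinary linear algebra. The following is a minimal, illustrative state-vector simulation (not how a real quantum computer works internally, but the standard mathematical model): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles two qubits into a Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as 2-component state vectors.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ zero

# Measurement probabilities are the squared amplitudes: 50/50 here.
probs = superposed ** 2

# Two-qubit entanglement: Hadamard on the first qubit, then CNOT,
# producing the Bell state (|00> + |11>) / sqrt(2).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(H @ zero, zero)

# Only |00> and |11> have nonzero amplitude (≈ 0.707 each), so
# measuring one qubit fixes the other's outcome, as described above.
print(np.round(bell, 3))
```

Note that simulating n qubits this way takes a vector of 2^n amplitudes, which is exactly why classical simulation hits a wall and real qubits offer a speed-up.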
Kamyar Maserrat | Senior Counsel | Foley & Lardner
Most of us would not need a quantum computer. It's a really heavy load of computation that needs a quantum computer. IBM allows 15 minutes free on their quantum computer, and I played around with it. … If I had a special kind of machine learning model, it would probably train it faster or execute it faster, but nothing really for the general population.
What will it take for quantum computing to be everywhere? Will it ever be consumer-grade?
Dr. Vikram Deshpande | Associate Professor, Physics & Astronomy | University of Utah, Department of Physics & Astronomy
That moment has probably already arrived. Many may be familiar with the term quantum advantage or quantum supremacy. It basically means that if a quantum computer can do something better than the strongest, biggest classical computer in the world, you say you have quantum supremacy. … In 2019, Google had 53 quantum bits and said they had achieved quantum supremacy in a particular problem. … Then, a few months later, IBM showed that the strongest supercomputer in the world could actually best that. Now, a few years later, both Google and IBM are at about 500 quantum bits. At that point, it is beyond doubt that there is quantum advantage, quantum supremacy. It's there. Going back to the question, what is the application here? I don't think the application is people carrying around quantum computers. But for certain niche applications, there are already quantum computers available to people on the cloud.
Peter Bookman | Founder & CEO | GUARDDOG AI
Encryption is probably one of the most documented, well-understood [examples]. Everybody seems to understand that quantum computing will break current encryption. There is a known scientific, proven answer to that, except for one thing: Today's quantum computing, with as many qubits as it has, is also known to be rather slow. … By the time we need to solve the decryption problem, the encryption problem will also be solved because the horsepower will be there.
Dr. Sujatha Sampath | Senior Lead Scientist | Booz Allen Hamilton
*Any views on the topics I share are my own and do not represent my current or past employers. I don't think, and a lot of the community agrees, that we are going to have quantum desktops in our homes and offices anytime soon. That's not what it's going to be. The current state of quantum computing is an expensive process. Only where there is a lot of funding, from governments or huge industry players, are there even proof-of-concept devices right now. There's a lot of development going on in academia and the industry, but ubiquitous compared to … a phone? That's not going to happen soon.
Troy Rydman | Chief Information Security Officer | Amazon Web Services
From a cloud perspective, there are more people interested because the technology is at their fingertips. Previously, if you wanted to test something, … you'd have to own a server. You'd have to rack it in the data center. You'd have to own this technology. You'd have to pay for the infrastructure. Then, you'd have to understand how to set it up and maintain it just to experiment with it. Now it's all virtual and for pennies on the dollar.
How have you seen cloud computing change your field?
Adam Frisbee | Adjunct Professor, OIS Operations & Info Systems | University of Utah
In my classes at the University of Utah, we can do labs that are very low-cost or sometimes no-cost for enterprise-grade architectures. Ten years ago, students would never have that. They might have access to it if the university had provided it, maybe at the high-performance computing center. But with the cloud, … I have students building enterprise-ready technologies. … I think it's a really powerful thing to have the cloud. I like to say it democratizes technology.
Whit Johnson | Partner | Foley & Lardner
It's interesting that Troy is saying the cloud has made technology ubiquitous, and we're saying quantum computing isn't going to be ubiquitous for a long time. If it's on the cloud and accessible and there to understand, I think with a little bit of lowering of the decoherence, solving the energy issue, and resolving some of the material challenges of the hardware, the ChatGPT moment for quantum computing could be at any moment.
Dr. Massood Tabib-Azar | USTAR Professor, ECE | University of Utah, Department of Electrical & Computer Engineering
If you look at the way sensors are evolving, kind of parallel to quantum computing are quantum sensors that enable you to sense things that are very, very minor: small changes in my biology or blood pressure. Those are coming. These quantum sensors are going to be for biology, and armies are interested in that. To access and analyze this data, you need very powerful computers. So, quantum sensors and quantum computers will go hand-in-hand in solving really difficult problems in biology, human health and global health.
Dr. Barclay Burns | Assistant Dean, Applied AI | Utah Valley University, Smith College of Engineering & Technology
If we start to make some statements, you have a governor, you have the Governor's Office of Economic Opportunity, you have the World Trade Center, you have legislators who will sit in a room with this group and rustle through these issues and start to make policy decisions and investment decisions. They just aren't having these opportunities. This is the kind of group that can convene this kind of conversation. … I've just seen how it played out with the AI policy with lawmakers and executive directors in government, higher education and industry. You need all the people at the same table thinking about this and engaging in the conversation, and people actually move.
Dr. Sujatha Sampath | Senior Lead Scientist | Booz Allen Hamilton
One of the advantages of quantum power is the way the operations are performed. For a classical computer, it all works in Boolean space. In quantum space, it has the advantage of mathematically mapping or scoping parallel scenarios using vectors and matrices. That is very akin to how machine learning and deep learning work. Utah has a sizable presence in deep learning and machine learning companies. That could be one selling point to influence policy.
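The kinship with machine learning mentioned above comes down to a shared primitive: multiplying a matrix by a vector. A rough illustration, with values chosen only for demonstration: an ML layer applies an arbitrary weight matrix to a real-valued vector, while a quantum gate applies a unitary matrix to a vector of complex amplitudes.

```python
import numpy as np

# Classical ML layer: y = W @ x, a weight matrix acting on a feature vector.
x = np.array([0.2, 0.8])
W = np.array([[0.5, 1.0],
              [1.0, -0.5]])
y = W @ x

# Quantum gate: the same matrix-vector primitive, but the matrix must be
# unitary. The Pauli-X gate (quantum NOT) flips |0> into |1>.
X = np.array([[0, 1],
              [1, 0]])
state = np.array([1.0, 0.0])   # |0>
flipped = X @ state            # |1>

print(flipped)  # [0. 1.]
```

The constraint that separates the two worlds is unitarity (X†X = I), which is what keeps the total measurement probability equal to one after every gate.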
Dr. Aurora Clark | Professor, Chemistry | University of Utah, Department of Chemistry
I was at an RPE workshop on quantum computing earlier this spring with 15-20 hardware companies and 10 different software companies. … It was evident there's been a huge amount of investment already in both the hardware and the software side. Everyone is trying to find the right application. So much of it is partnerships and bringing people together so that your particular platform and the way you've constructed your hardware with this particular software will work well with this application. … It's very crowded because there's been so much investment at the government level; there are tons of startups available. At some point, the money is going to be going away, which means everyone's going to be clawing for it. I think the partnership aspect could be a very fruitful way to approach it.
What are the implications of quantum or cloud computing being used in physical science simulations?
Jeremy Fillingim | Co-Founder & CTO | PassiveLogic
The very high-level view is that we build a model of the building and all the equipment. … You don't get to train one model and then run it [multiple times]. The composability of our models became very important. We want to be able to reuse components. … The computing power available can become a limiting factor. We've taken a different approach than traditional AI. In the building space, we're simulating physical processes. We're basically running thermodynamics equations. The more things like that can be accelerated, the better it is for us.
Tony Kanell | Senior Engineering Manager | NVIDIA
Along those same lines, cloud technologies have allowed us to take all of these systems that have existed separately for so many years … and simulate them together for the first time. We had a manufacturing customer who did all of these different simulations in 10 different pieces of software, and then they built the factory. Unfortunately, the software that programmed the robot arm didn't talk to the architecture software. The first time they turned the arm on, it went up and slammed into the gantry. That's incredibly costly to have to go and redesign after it's been built. When you bring all these together in the cloud and you build a digital twin that knows about everything in there, it's physically based and you can simulate it, you can save that cost upfront.
Dr. Aurora Clark | Professor, Chemistry | University of Utah, Department of Chemistry
There are simulations I would love to do that are just not practical. If I wanted to simulate, for example, all of the chemical reactions occurring in the valley right now that are leading to the air quality, I cannot do that on a classical machine. But you could, in principle, on a quantum computer very effectively. A lot of fundamental work needs to be done toward that because that kind of information at a chemical level needs to be accounted for.
What materials and hardware limitations are currently holding back quantum computing?
Dr. Vikram Deshpande | Associate Professor, Physics & Astronomy | University of Utah, Department of Physics & Astronomy
The biggest issue holding up the scaling of qubits in quantum computers is decoherence; even temperature is an issue. … Google's and IBM's qubits need to be cold, but even there, they are only coherent in the sense that they're only quantum for a certain period of time, which is on the order of tens of microseconds. The second is the error rate. With a few hundred qubits, if you have a bunch of them giving wrong answers, that can't work. Essentially, there is this whole idea of error correction. … You have a whole infrastructure that is trying to correct the errors resulting from a given qubit. All of this is preventing scale-up.
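The error-correction infrastructure described above can be motivated by its simplest classical ancestor, the 3-bit repetition code: encode each logical bit as three copies and decode by majority vote, so a single flip is outvoted. This is only an analogy; real quantum codes (such as surface codes) are far more involved because qubits cannot simply be copied, but the spirit of trading many noisy physical units for one reliable logical unit is the same. The error probability and trial count below are arbitrary demonstration values.

```python
import random

def encode(bit):
    # Encode one logical bit as three identical physical bits.
    return [bit, bit, bit]

def decode(bits):
    # Majority vote: a single flipped bit is outvoted by the other two.
    return 1 if sum(bits) >= 2 else 0

def noisy(bits, p, rng):
    # A channel that flips each physical bit independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
p = 0.05          # per-bit flip probability (demonstration value)
trials = 10_000

# Error rate without coding: each raw bit fails with probability p.
raw_errors = sum(rng.random() < p for _ in range(trials))

# Error rate with the repetition code: decoding fails only if 2+ bits flip.
coded_errors = sum(decode(noisy(encode(0), p, rng)) != 0 for _ in range(trials))

print(raw_errors / trials, coded_errors / trials)
```

With p = 0.05, the coded failure rate lands near 3p², well under one percent, which is the same leverage quantum error correction seeks: many imperfect physical qubits standing in for one long-lived logical qubit.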