Will quantum computing ever be available off-the-shelf?
It goes without saying that quantum computing is complex. But people buy extraordinarily complex things through simple processes every day. After all, few smartphone buyers know how their devices work. Even a humble bar of soap arrives on the shelf only after the raw materials are extracted, refined, manufactured, packaged, shipped, and stocked.
The question is: will quantum computing’s complexity ever be contained to the point where end users can buy it "off-the-shelf"? The answer is: that depends on what you mean by off-the-shelf.
If you’re imagining something like a MacBook Quantum on the shelves at Best Buy, that is unlikely ever to happen. For most users, it may never be worth having your own quantum computer on-premises. Unless you’re a large, well-funded organization uniquely positioned to benefit from exclusive access to a quantum device, such as a government entity or a major financial institution, quantum computing resources will always be more practical to access via the cloud.
On the other hand, if you’re imagining a digital marketplace of quantum-powered applications, that could very well come to fruition within the next two to five years. But just because you can download some quantum software doesn’t mean it will instantly provide an advantage over classical-only computing, or even be useful at all.
Some readers may have already seen this play out with "off-the-shelf" AI solutions. While many commercial AI solutions are available on the market, they do not provide an advantage without some level of customization. For example, consider the sameness of all the automated chatbots you encounter while browsing the web. These chatbots have become table stakes, not advantages.
Generally, the less customization required for an off-the-shelf solution to work, the less likely it is to deliver an advantage that a competitor could not just as easily install. Every organization will have a unique combination of data, IT infrastructure, teams, and problems to solve. Any useful algorithm will need to be tailored to that unique environment to make an impact. This is true for AI, and it’s even more true for quantum computing.
Quantum Applications Demand Dedicated Expertise
Now, some quantum use cases will lend themselves more easily to off-the-shelf applications than others. There are already several classical optimization solutions available off-the-shelf, such as Gurobi and CPLEX, and it’s not a stretch to imagine quantum-powered versions in the future. Although optimization use cases vary widely, they can all be mapped to well-known mathematical formulations, such as mixed-integer programming. However, it still takes a domain expert to understand which variables or constraints need to be prioritized. It also takes a technical expert to map enterprise problems into mathematical problems that a software solution can solve, and then tune the software to obtain the best performance.
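To make the mapping concrete, here is a minimal, purely illustrative sketch of what "mapping an enterprise problem to a mathematical formulation" looks like. The scenario, costs, and values below are all invented for illustration: a binary integer program for choosing which projects to fund under a budget. This is the same shape of formulation that solvers like Gurobi or CPLEX (and, hypothetically, future quantum-powered optimizers) consume at far larger scale.

```python
from itertools import product

# Hypothetical data: four candidate projects, each with an assumed
# cost and an assumed business value. The decision variables are
# binary: fund the project (1) or skip it (0).
costs = [40, 60, 30, 50]
values = [70, 80, 40, 60]
budget = 100

best_value, best_choice = 0, None
# Brute force is fine at this toy size; production solvers use
# techniques such as branch-and-bound to handle thousands of variables.
for x in product([0, 1], repeat=len(costs)):
    cost = sum(c * xi for c, xi in zip(costs, x))
    value = sum(v * xi for v, xi in zip(values, x))
    if cost <= budget and value > best_value:
        best_value, best_choice = value, x

print(best_choice, best_value)  # funds projects 1 and 2 for value 150
```

The mathematical skeleton (maximize a linear objective subject to linear constraints over integer variables) is generic; deciding what counts as "cost," "value," and "budget" for a real business is exactly the domain-expert work described above.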
Any advantage from off-the-shelf quantum software will depend on having a specialized team that can adapt the software to an enterprise’s unique problems. This includes both quantum computing experts and experts who deeply understand the business problems. It may seem like you can wait until the software is fully realized to start hiring quantum talent, but unfortunately the talent pool is rapidly dwindling. In our recent survey on enterprise quantum adoption, we found that 69 percent of enterprises have started on the path to quantum adoption, and 51 percent of these organizations have already started assembling their quantum teams. If you wait too long, the brightest minds will be long gone.
You will also want to foster relationships with external consultants. The executives we surveyed agreed: 96 percent said they could not successfully adopt quantum computing without external help. Outside consultants can save you time and energy by helping to identify use cases, anticipate hurdles, and build the software infrastructure you will need to effectively leverage quantum computing.
Building the Infrastructure for Quantum Computing
Quantum computing will never exist in a vacuum, and to add value, quantum computing components need to be seamlessly integrated with the rest of the enterprise technology stack. This includes HPC clusters, ETL processes, data warehouses, S3 buckets, security policies, etc. Data will need to be processed by classical computers both before and after it runs through the quantum algorithms.
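The classical-quantum-classical flow described above can be sketched as a simple pipeline. Everything here is hypothetical: in a real stack, the preprocessing step might be an ETL job pulling from a data warehouse, the middle step would dispatch a circuit to a cloud quantum backend, and the postprocessing step would write results back into enterprise systems.

```python
def preprocess(raw_records):
    """Classical step: clean and encode enterprise data for the solver."""
    return [r for r in raw_records if r is not None]

def run_quantum(problem_data):
    """Placeholder for a quantum (or quantum-inspired) solver call.

    A trivial classical stand-in is used here so the sketch runs.
    """
    return sorted(problem_data)

def postprocess(result):
    """Classical step: turn raw solver output into a business answer."""
    return {"solution": result, "best": result[0]}

# Hypothetical raw data with a missing record to be cleaned out.
raw = [3, None, 1, 2]
answer = postprocess(run_quantum(preprocess(raw)))
print(answer)  # {'solution': [1, 2, 3], 'best': 1}
```

The point of the sketch is structural: the quantum call sits between two classical stages, so any weakness in those stages caps the value of whatever happens in the middle.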
This infrastructure is important: any speedup from quantum computing can easily be offset by mundane problems like disorganized data warehousing and sub-optimal ETL processes. Expecting a quantum algorithm to deliver an advantage with a shoddy classical infrastructure around it is like expecting a flight to save you time when you don’t have a car to take you to and from the airport.
These same infrastructure issues often arise in many present-day machine learning (ML) use cases. There may be many off-the-shelf tools available, but any useful ML application will ultimately be unique to the model’s objective and the data used to train it. You need a streamlined process to prepare and clean the data, ensure the data is compliant with privacy and governance policies, track and correct drifts in the model, and of course, make sure the model does what you want it to do.
As enterprise ML users know, maintaining these applications is an ongoing process. Ideally, you’d have a development environment for prototyping, a staging environment for testing, and a production environment to scale the model up for enterprise use, leveraging HPC and cloud resources. The complexity of building and deploying ML applications in production gave rise to the field of MLOps (also referred to as AIOps).
The complexity only multiplies when you add in quantum computing, which calls for a similar "QuantumOps" process to manage it in production. Quantum hardware is evolving rapidly, and to keep up, you’ll want a way to benchmark the performance of new quantum hardware backends as they come out to make sure you have the best configuration for your problem. The last thing you want is to invest millions into developing a quantum application, only to have a new device or software component render your work obsolete. It will be critical to have an environment that gives you the flexibility to fine-tune your models, try out different configurations, track and compare changes, and iterate quickly.
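A benchmarking loop of the kind described above can be sketched in a few lines. The "backends" below are plain Python callables standing in for real hardware or simulator endpoints, and the delays are invented; the pattern is simply to run the same workload everywhere and record which configuration wins.

```python
import time

def make_backend(delay):
    """Build a mock backend with an assumed queue-plus-execution delay."""
    def backend(problem):
        time.sleep(delay)     # stand-in for device latency
        return sum(problem)   # stand-in for the computed result
    return backend

# Hypothetical backend registry; real entries would be cloud endpoints.
backends = {
    "simulator_v1": make_backend(0.01),
    "simulator_v2": make_backend(0.001),
}

problem = [1, 2, 3]
results = {}
for name, backend in backends.items():
    start = time.perf_counter()
    answer = backend(problem)
    results[name] = {"answer": answer,
                     "seconds": time.perf_counter() - start}

fastest = min(results, key=lambda n: results[n]["seconds"])
print(fastest)  # simulator_v2, given the assumed delays
```

In practice the recorded metrics would include result quality and cost as well as wall-clock time, and the history of runs would be tracked so that a new backend release can be compared against the current configuration before switching.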
An Off-the-Shelf Future?
In the future, quantum computing may be as invisible as the processor running the device you’re reading this on now. Quantum applications may be as easily accessible as your internet browser app or email app.
But accessible is not the same as useful.
To gain any meaningful advantage from quantum computing, you need to lay the groundwork by building the required team and infrastructure. Although fault-tolerant quantum devices are still years away, enterprises can build their workflows well in advance and swap in these more powerful backend devices once they come online.
Ultimately, every business will have unique challenges requiring unique quantum applications. Applications between businesses may be similar, but any quantum advantage will depend on tailoring the quantum application to the needs and capabilities of the business. This stands in direct contrast with the idea of an off-the-shelf quantum application, as appealing as that may sound.
Jhonathan Romero Fontalvo is Founder & Director of Professional Services at Zapata Computing.