Highlight

Training Machine Learning Models with Adiabatic Quantum Computers

Training linear regression, support vector machine (SVM), and k-means clustering models on adiabatic quantum computers.
2021: Q3

Achievement

Paper titled "QUBO Formulations for Training Machine Learning Models" published in Scientific Reports, a Nature Portfolio journal.

Significance and Impact

Machine learning is widely used across many scientific disciplines to learn from large-scale, complex data. However, machine learning models take a long time to train. Furthermore, training machine learning models using current practices will become infeasible in the post-Moore era, because the computation speed of classical computers will stagnate once Moore's law ends, while the data used for training continues to grow in scale and complexity. To this end, it is important to explore non-conventional computing paradigms such as quantum computing to train machine learning models efficiently. Quantum computers operate in exponentially large tensor product spaces and can outperform classical computers on several problems. We show that they can also be an order of magnitude faster than classical computers for training certain machine learning models. This line of research showcases novel quantum machine learning techniques that can significantly accelerate the training of machine learning models, which will eventually enable analysis of large-scale, complex data and lead to significant scientific breakthroughs.

Research Details

In this paper, we propose adiabatic quantum computing approaches for training three machine learning models: linear regression, support vector machine (SVM), and k-means clustering. We convert the training problems of these models into quadratic unconstrained binary optimization (QUBO) problems, which can be solved efficiently on adiabatic quantum computers. We derive the computational complexities of our quantum approaches and compare them to those of the corresponding state-of-the-art classical approaches. We show that, using our quantum approaches, the SVM and k-means clustering models can be trained an order of magnitude faster than with the classical approaches, while the computational complexities for linear regression are equivalent in both cases.
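
To make the idea of a QUBO formulation concrete, the sketch below follows the general recipe for linear regression: substitute a fixed-precision binary encoding of the weights into the least-squares objective and collect the result into a single QUBO matrix. The specific precision vector, the toy data, and the brute-force loop (a classical stand-in for the adiabatic quantum annealer) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from itertools import product

# Toy data: X is an (N x 3) design matrix with a bias column, y the targets.
rng = np.random.default_rng(0)
N, d = 20, 2
X = np.hstack([rng.normal(size=(N, d)), np.ones((N, 1))])
w_true = np.array([1.5, -0.5, 0.25])
y = X @ w_true + 0.01 * rng.normal(size=N)

# Hypothetical fixed-precision encoding: each weight w_j = sum_k p_k * b_{j,k},
# with b_{j,k} in {0, 1}, so w = P @ b for a block-structured precision matrix P.
precision = np.array([-2.0, 1.0, 0.5, 0.25])   # assumed precision vector
P = np.kron(np.eye(X.shape[1]), precision)      # maps 12 binary variables to 3 weights

# Substituting w = P b into ||X w - y||^2 gives the QUBO
#   E(b) = b^T (P^T X^T X P) b - 2 (P^T X^T y)^T b  (+ constant),
# with the linear term folded onto the diagonal (since b_i^2 = b_i).
Q = P.T @ X.T @ X @ P
Q[np.diag_indices_from(Q)] -= 2.0 * (P.T @ X.T @ y)

# Brute-force the small QUBO as a classical stand-in for an adiabatic annealer.
best_b, best_e = None, np.inf
for bits in product([0, 1], repeat=Q.shape[0]):
    b = np.array(bits)
    e = b @ Q @ b
    if e < best_e:
        best_b, best_e = b, e

w_hat = P @ best_b
print("recovered weights:", np.round(w_hat, 3))   # close to w_true = [1.5, -0.5, 0.25]
```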

Overview

Training machine learning models on classical computers is usually a time- and compute-intensive process. With Moore's law nearing its inevitable end and an ever-increasing demand for large-scale data analysis using machine learning, we must leverage non-conventional computing paradigms like quantum computing to train machine learning models efficiently. Adiabatic quantum computers can approximately solve NP-hard problems, such as quadratic unconstrained binary optimization (QUBO), faster than classical computers. Since many machine learning problems are also NP-hard, we believe adiabatic quantum computers might be instrumental in training machine learning models efficiently in the post-Moore era. To solve problems on adiabatic quantum computers, however, they must first be formulated as QUBO problems, which is itself a challenging task. In this paper, we formulate the training problems of three machine learning models, namely linear regression, support vector machine (SVM), and balanced k-means clustering, as QUBO problems, making them amenable to training on adiabatic quantum computers. We also analyze the computational complexities of our formulations and compare them to those of the corresponding state-of-the-art classical approaches. We show that the time and space complexities of our formulations are better than (in the case of SVM and balanced k-means clustering) or equivalent to (in the case of linear regression) those of their classical counterparts.
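
For readers unfamiliar with the QUBO form itself, the following sketch shows a tiny QUBO instance and a simple simulated-annealing solver that plays the role an adiabatic quantum annealer would play in practice. The function name solve_qubo_sa, the cooling schedule, and the one-hot toy instance are hypothetical illustrations, not the hardware workflow used in the paper.

```python
import numpy as np

def solve_qubo_sa(Q, n_sweeps=2000, T0=2.0, T1=1e-3, seed=0):
    """Minimize E(b) = b^T Q b over b in {0,1}^n with simulated annealing.

    A purely classical stand-in for an adiabatic quantum annealer: both
    consume the same QUBO matrix Q and return a low-energy bit assignment.
    """
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    Qsym = Q + Q.T                                  # symmetric couplings for flip deltas
    b = rng.integers(0, 2, size=n)
    for sweep in range(n_sweeps):
        T = T0 * (T1 / T0) ** (sweep / max(n_sweeps - 1, 1))   # geometric cooling
        for i in rng.permutation(n):
            s = 1 - 2 * b[i]                        # +1 if flipping 0 -> 1, -1 if 1 -> 0
            delta = s * (Qsym[i] @ b) + Q[i, i]     # exact energy change of the flip
            if delta <= 0 or rng.random() < np.exp(-delta / T):
                b[i] += s                           # accept the flip
    return b, b @ Q @ b

# Toy QUBO: energy is minimized when exactly one of the three bits is set.
Q = np.array([[-1.0, 2.0, 2.0],
              [ 0.0, -1.0, 2.0],
              [ 0.0,  0.0, -1.0]])
bits, energy = solve_qubo_sa(Q)
print(bits, energy)   # e.g. [1 0 0] with energy -1.0
```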

Last Updated: May 11, 2021 - 4:20 pm