1. Lower Bounds for Quantum-inspired Classical Algorithms via Communication Complexity

Nikhil S. Mande, Changpeng Shao (Feb 27, 2024).

Abstract: Quantum-inspired classical algorithms provide a new way to understand the computational power of quantum computers for practically relevant problems, especially in machine learning. In the past several years, numerous efficient algorithms for various tasks have been found, while a corresponding analysis of lower bounds has been missing. In this work, we propose the first method, based on communication complexity, to prove lower bounds for these tasks. We mainly focus on lower bounds for solving linear regression, supervised clustering, principal component analysis, recommendation systems, and Hamiltonian simulation. More precisely, we show that for linear regression, in the row-sparse case, the lower bound is quadratic in the Frobenius norm of the underlying matrix, which is tight. In the dense case, under an extra assumption on the accuracy, we obtain a lower bound that is quartic in the Frobenius norm, matching the upper bound. For supervised clustering, we obtain a tight lower bound that is quartic in the Frobenius norm. For the other three tasks, we obtain a lower bound that is quadratic in the Frobenius norm, while the known upper bound is quartic in the Frobenius norm. Through this research, we find that large quantum speedups can exist for sparse, high-rank, well-conditioned matrix problems. Finally, we extend our method to the lower-bound analysis of quantum query algorithms for matrix-related problems, and give some applications.

Arxiv: https://arxiv.org/abs/2402.15686