Quantum Algorithms for Large Language Models on Noisy Intermediate-Scale Quantum Computers
Timo Aukusti Laine
Abstract
We present a systematic methodology for developing and validating quantum algorithms for Large Language Models (LLMs). This methodology includes partition function-based transformations that map LLM embedding similarity values into a range suitable for quantum computation, and the design of a shallow-circuit quantum algorithm for estimating this transformed similarity measure on near-term quantum computers. We rigorously evaluate our approach through quantum simulations and experiments on real quantum hardware, demonstrating the feasibility of using quantum computing for LLM embedding analysis. Our results highlight the potential of quantum-inspired techniques for LLM tasks and demonstrate practical strategies for achieving reliable results on noisy quantum hardware.
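To make the partition-function idea concrete, the following is a minimal sketch, not the paper's actual construction: raw embedding similarities are exponentiated with an inverse-temperature parameter `beta` and normalized by the partition function Z, yielding non-negative weights that sum to one and can serve as squared amplitudes of a quantum state. The function name and `beta` parameter are illustrative assumptions.

```python
import math

def partition_transform(sims, beta=1.0):
    """Map raw similarity scores into (0, 1) via Boltzmann weights.

    sims : list of float, e.g. cosine similarities in [-1, 1]
    beta : illustrative inverse-temperature parameter (assumption)
    Returns weights w_i = exp(beta * s_i) / Z with Z = sum_j exp(beta * s_j).
    """
    # Subtract the max for numerical stability before exponentiating.
    m = max(sims)
    weights = [math.exp(beta * (s - m)) for s in sims]
    z = sum(weights)  # partition function Z
    return [w / z for w in weights]

# Example: three similarity scores mapped into a normalized distribution.
probs = partition_transform([0.9, 0.1, -0.5], beta=2.0)
```

Because the outputs are a valid probability distribution, their square roots could in principle be encoded as amplitudes of a shallow state-preparation circuit; the paper's precise encoding may differ.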

