Abstract: We reformulate the continuous-space Schrödinger equation in terms of spin Hamiltonians. For the kinetic energy operator, the critical concept facilitating the reduction in model complexity is the idea of position encoding. A binary encoding of position produces a Heisenberg-like model and yields an exponential improvement in space complexity when compared to classical computing. Encoding with a binary reflected Gray code (BRGC) or a Hamming distance 2 Gray code (H2GC) reduces the model complexity to the XZ and transverse Ising models, respectively. For A qubits, BRGC yields 2^A positions and is reduced to its 2-local form with O(A) ancillary qubits, while H2GC yields 2^(A/2+1) positions with O(A^2) 3-local penalty terms. We also identify the bijective mapping between diagonal unitaries and the Walsh series, producing a mapping of any real potential to a series of k-local Ising models through the fast Walsh transform. Finally, in a finite volume, we provide numerical evidence to support the claim that the total time needed for adiabatic evolution is protected by the infrared cutoff of the system. As a result, initial state preparation from a free-field wavefunction to an interacting system is expected to exhibit polynomial time complexity with volume and constant scaling with respect to lattice discretization for all encodings. For H2GC, if the evolution starts with the transverse Hamiltonian due to hardware restrictions, then penalties are dynamically introduced such that the low-lying spectrum reproduces the energy levels of the Laplacian. The adiabatic evolution of the penalty Hamiltonian is therefore sensitive to the ultraviolet scale; it is expected to exhibit polynomial time complexity with lattice discretization, or exponential time complexity with respect to the number of qubits at fixed volume.
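Two ingredients from the abstract are easy to make concrete: the BRGC, whose adjacent codewords differ by a single bit (the property that simplifies the kinetic term), and the fast Walsh transform, which maps a sampled potential to the coefficients of a k-local Ising series. The sketch below is illustrative only and is not the authors' implementation; the harmonic potential used as input is an assumed example.

```python
def brgc(i):
    """Binary reflected Gray code of integer i; adjacent codes differ in one bit."""
    return i ^ (i >> 1)

def fwht(f):
    """Fast Walsh-Hadamard transform of a length-2^A sequence.
    Returns normalized Walsh coefficients, i.e. the weights of the
    k-local Ising (tensor-product-of-Z) terms encoding a diagonal potential."""
    a = list(f)
    n = len(a)
    h = 1
    while h < n:
        # butterfly stage: combine elements h apart
        for start in range(0, n, 2 * h):
            for j in range(start, start + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return [c / n for c in a]

A = 3                                   # A qubits -> 2^A lattice positions
codes = [brgc(i) for i in range(2 ** A)]
# neighboring positions map to codewords at Hamming distance 1
assert all(bin(codes[i] ^ codes[i + 1]).count("1") == 1
           for i in range(len(codes) - 1))

# assumed example: a harmonic potential V(x) sampled on the 2^A sites
V = [(x - 3.5) ** 2 for x in range(2 ** A)]
walsh_coeffs = fwht(V)                  # Ising-series coefficients of V
```

The O(n log n) butterfly structure of `fwht` is what makes the potential-to-Ising mapping cheap relative to a naive O(n^2) transform.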
Speaker’s Bio: Jason is currently a Research Scientist at RIKEN and a Visiting Scholar at UC Berkeley, and leads the QIS effort in the nuclear theory department. He received his Ph.D. from the University of Illinois at Urbana-Champaign under Aida El-Khadra, where he calculated the hadronic matrix elements relevant to neutral D-meson mixing using methods of lattice quantum chromodynamics. He then moved to Berkeley Lab for his postdoctoral research, working with Andre Walker-Loud, where they calculated, among other things, the first lattice QCD determination of the nucleon axial coupling to a precision of 1%. Much of this lattice QCD work is computationally expensive, made possible only through computing resources at OLCF. In his current position at RIKEN, he began thinking about how one may use quantum annealers and quantum computers to tackle near- and long-term challenges from the perspective of lattice QCD and many-body nuclear physics. The work he presents today is made possible by the DOE Office of Nuclear Physics Quantum Horizons grant, which provided the resources to bring together scientists from a broad range of expertise, including nuclear physics, statistical mechanics, and industry experts in digital circuit design.
Last Updated: July 2, 2021 - 9:09 am