Simulating Sparse Hamiltonians with Star Decompositions

  • Andrew M. Childs
  • Robin Kothari
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6519)

Abstract

We present an efficient algorithm for simulating the time evolution due to a sparse Hamiltonian. In terms of the maximum degree d and dimension N of the space on which the Hamiltonian H acts for time t, this algorithm uses (d^2 (d + log* N) ∥Ht∥)^{1 + o(1)} queries. This improves the complexity of the sparse Hamiltonian simulation algorithm of Berry, Ahokas, Cleve, and Sanders, which scales like (d^4 (log* N) ∥Ht∥)^{1 + o(1)}. To achieve this, we decompose a general sparse Hamiltonian into a small sum of Hamiltonians whose graphs of non-zero entries have the property that every connected component is a star, and we efficiently simulate each of these pieces.
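The decomposition described above can be illustrated with a toy sketch. The code below is not the paper's decomposition (which uses roughly d^2 pieces); it is a weaker, self-contained stand-in based on greedy proper edge coloring, which splits a graph of maximum degree d into at most 2d − 1 matchings. A matching is trivially a union of stars, so each color class qualifies as a "star piece". All function names here are hypothetical.

```python
from collections import defaultdict

def greedy_star_pieces(edges):
    """Partition the edges of an undirected graph into pieces in which
    every connected component is a star.

    Illustrative sketch only: a greedy proper edge coloring gives each
    edge the smallest color unused at both of its endpoints, so each
    color class is a matching -- a trivial star forest.  For a graph of
    maximum degree d this uses at most 2d - 1 pieces, weaker than the
    decomposition in the paper.
    """
    used = defaultdict(set)     # vertex -> colors already on incident edges
    pieces = defaultdict(list)  # color  -> edges assigned that color
    for u, v in edges:
        c = 0
        while c in used[u] or c in used[v]:
            c += 1
        used[u].add(c)
        used[v].add(c)
        pieces[c].append((u, v))
    return [pieces[c] for c in sorted(pieces)]

# Example: a 4-cycle (max degree 2) splits into two matchings.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(greedy_star_pieces(cycle))  # two pieces, each a vertex-disjoint set of edges
```

Applied to the graph of non-zero entries of a sparse Hamiltonian, each piece corresponds to a Hamiltonian whose graph is a union of stars, which can then be simulated separately as in the paper's high-level strategy.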


Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Andrew M. Childs (1, 3)
  • Robin Kothari (2, 3)

  1. Department of Combinatorics & Optimization, University of Waterloo
  2. David R. Cheriton School of Computer Science, University of Waterloo
  3. Institute for Quantum Computing, University of Waterloo
