The paper introduces Generative Error Mitigation (GEM), a method that uses deep generative models to predict error-free expectation values from noisy quantum measurements. Compared with existing mitigation techniques, GEM reduces sampling overhead, removes the need for precise noise-model characterization, handles non-stationary and correlated noise, and integrates with existing NISQ algorithms.
Key findings
GEM uses a conditional diffusion model trained on pairs of noisy and ideal quantum circuit executions.
The approach reduces sampling overhead compared to Probabilistic Error Cancellation (PEC).
GEM eliminates the need for precise noise model characterization.
The method handles non-stationary and correlated noise naturally.
GEM integrates seamlessly with existing NISQ algorithms, including the Variational Quantum Eigensolver (VQE) and the Quantum Approximate Optimization Algorithm (QAOA).
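The training setup in the first finding can be illustrated with a toy sketch. The code below fabricates pairs of noisy and ideal expectation values (in the paper these would come from hardware runs and ideal simulations of the same circuits) and fits a map from noisy to ideal values. A least-squares linear regressor stands in for the paper's conditional diffusion model; all names, the depolarizing-style noise model, and the parameter values are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for GEM's training data: pairs of (noisy, ideal)
# expectation values for the same circuits. Here the noisy value is
# generated by a depolarizing-style damping plus shot noise:
#   noisy = (1 - p) * ideal + gaussian noise   (illustrative model)
n_pairs, p = 200, 0.3
ideal = rng.uniform(-1.0, 1.0, size=n_pairs)
noisy = (1.0 - p) * ideal + rng.normal(0.0, 0.02, size=n_pairs)

# Stand-in for the conditional generative model: fit ideal ~ a*noisy + b
# by least squares. (GEM uses a conditional diffusion model; a linear
# map is enough to show the noisy -> ideal learning setup.)
A = np.column_stack([noisy, np.ones(n_pairs)])
coef, *_ = np.linalg.lstsq(A, ideal, rcond=None)

# "Mitigate" a fresh noisy measurement of a circuit with ideal value 0.5.
new_noisy = (1.0 - p) * 0.5
mitigated = coef[0] * new_noisy + coef[1]
print(f"mitigated estimate: {mitigated:.3f}")
```

The key point the sketch preserves is that the model is trained purely on example pairs, so no explicit characterization of the noise channel is required.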
Limitations & open questions
The paper does not discuss the computational complexity of training the generative models.
The scalability of GEM for quantum devices with more than 100 qubits is not addressed.