This paper introduces PhysFormer-Theory, an analytical framework for physics-informed generative architectures. It unifies neural operator theory with convergence analysis, establishing bounds on both approximation error and optimization dynamics, and proving linear convergence rates under conditions tied to the spectrum of the physical operator.
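As a minimal illustration of what a linear rate means here (generic symbols, not the paper's notation): if the physics-informed training loss $\mathcal{L}$ is $L$-smooth and satisfies a Polyak-Łojasiewicz-type inequality with constant $\mu$, then gradient descent with step size $\eta \le 1/L$ obeys

$$\mathcal{L}(\theta_t) - \mathcal{L}^\star \le (1 - \eta\mu)^t \left(\mathcal{L}(\theta_0) - \mathcal{L}^\star\right),$$

where $\mu$ is typically controlled by the smallest singular value of the (linearized) physical operator; this is one standard route by which the rate inherits the operator's spectral properties.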
Key findings
PhysFormer-Theory provides a comprehensive analytical framework for characterizing convergence in physics-informed generative architectures.
Linear convergence rates are proven for physics-informed generative models, with rates governed by the spectral properties of the physical operator and the representational capacity of the neural architecture.
The framework offers practical guidelines for architecture design, loss-weighting strategies, and training procedures that promote stable convergence (a sketch of one common weighting heuristic follows this list).
Extensive experiments validate the theoretical predictions, showing faster and more stable convergence than standard physics-informed neural networks (PINNs) and unconstrained generative models.
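To make the loss-weighting guideline concrete, here is a minimal sketch of a gradient-norm balancing heuristic commonly used in physics-informed training. The names (`weighted_loss`, `update_weight`, `lam_phys`, `pde_residual`) are illustrative assumptions, not the paper's API, and the update rule is a standard heuristic rather than PhysFormer-Theory's exact prescription.

```python
import torch

def weighted_loss(model, x_data, y_data, x_colloc, pde_residual, lam_phys):
    """Combine a data-fit term and a physics-residual term with weight lam_phys.

    pde_residual is assumed to evaluate the PDE residual of the model's
    output at collocation points x_colloc (hypothetical helper).
    """
    data_term = torch.mean((model(x_data) - y_data) ** 2)
    phys_term = torch.mean(pde_residual(model, x_colloc) ** 2)
    return data_term + lam_phys * phys_term, data_term, phys_term

def update_weight(model, data_term, phys_term, lam_phys, alpha=0.9):
    """Nudge lam_phys toward the ratio of the two terms' gradient norms,
    so neither term dominates the parameter updates (a common heuristic)."""
    g_data = torch.autograd.grad(data_term, list(model.parameters()),
                                 retain_graph=True, allow_unused=True)
    g_phys = torch.autograd.grad(phys_term, list(model.parameters()),
                                 retain_graph=True, allow_unused=True)

    def norm(grads):
        return torch.sqrt(sum((g ** 2).sum() for g in grads if g is not None))

    target = (norm(g_data) / (norm(g_phys) + 1e-12)).item()
    # Exponential moving average keeps the weight from oscillating.
    return alpha * lam_phys + (1 - alpha) * target
```

In practice, a rebalancing step like `update_weight` would run every few optimizer iterations; keeping the two gradient norms comparable is one way to prevent the physics-residual term from swamping the data term early in training.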
Limitations & open questions
The analysis primarily addresses deterministic forward problems, so the fully stochastic generative settings the framework ultimately targets remain largely unexplored.