This paper proposes a learned Markov context model for adaptive entropy coding of chain codes, using a lightweight neural network to predict symbol probabilities. The model aims for state-of-the-art compression ratios while keeping computational complexity linear in the number of contour symbols.
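For context, a chain code represents a contour as a sequence of discrete directional symbols (e.g., the classic Freeman 8-direction code), and an adaptive entropy coder spends roughly -log2 p(symbol | context) bits per symbol. The sketch below, which is illustrative background rather than the paper's method, converts a contour to Freeman symbols and estimates the ideal arithmetic-coding cost under a simple adaptive order-k Markov model with Laplace smoothing (the count-based baseline the proposed neural model would replace):

```python
import math
from collections import defaultdict

# Freeman 8-direction chain code: map a step (dx, dy) between adjacent
# contour pixels to one of 8 directional symbols (0..7).
STEP_TO_SYMBOL = {
    (1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
    (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7,
}

def chain_code(points):
    """Convert a sequence of 8-connected contour points to Freeman symbols."""
    return [STEP_TO_SYMBOL[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

def adaptive_code_length_bits(symbols, order=1, alphabet=8):
    """Ideal arithmetic-coding cost in bits under an adaptive order-k
    Markov model with Laplace (add-one) smoothing: each symbol costs
    -log2 p(symbol | context), and counts are updated after coding."""
    counts = defaultdict(lambda: [1] * alphabet)  # Laplace prior per context
    bits = 0.0
    for i, s in enumerate(symbols):
        ctx = tuple(symbols[max(0, i - order):i])
        c = counts[ctx]
        bits += -math.log2(c[s] / sum(c))
        c[s] += 1  # adapt: the decoder can mirror this update
    return bits
```

On a long contour with repetitive structure, this adaptive model codes well below the fixed-length baseline of 3 bits per symbol; the sum of -log2 p terms matches a real arithmetic coder's output up to a small constant overhead.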
Key contributions
A novel neural architecture for variable-order Markov context modeling that adapts to local contour complexity.
An efficient probability-estimation network for discrete directional symbols.
A comprehensive experimental framework for evaluating compression performance across diverse datasets.
Theoretical analysis of computational complexity and compression bounds.
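The paper does not specify the network architecture, so the following is a minimal sketch of what a "lightweight probability-estimation network over discrete directional symbols" could look like: a one-hidden-layer MLP that maps a one-hot encoding of the last k Freeman symbols to a softmax distribution over the 8 directions. All names, sizes, and the fixed context order are assumptions for illustration; per-symbol cost is constant, consistent with the claimed linear overall complexity.

```python
import numpy as np

rng = np.random.default_rng(0)

class ContextMLP:
    """Hypothetical lightweight estimator (not the paper's architecture):
    one-hot k-symbol context -> hidden ReLU layer -> softmax over the
    8 Freeman directions, usable as the probability model of an
    adaptive arithmetic coder."""

    def __init__(self, order=3, alphabet=8, hidden=16):
        self.order, self.alphabet = order, alphabet
        in_dim = order * alphabet
        # Small random weights; in practice these would be trained to
        # minimize cross-entropy (i.e., coded bits) on contour data.
        self.W1 = rng.normal(0.0, 0.1, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (hidden, alphabet))
        self.b2 = np.zeros(alphabet)

    def predict(self, context):
        """context: recent symbols (0..7); padded with 0 if too short."""
        ctx = ([0] * self.order + list(context))[-self.order:]
        x = np.zeros(self.order * self.alphabet)
        for i, s in enumerate(ctx):
            x[i * self.alphabet + s] = 1.0  # one-hot per context slot
        h = np.maximum(0.0, x @ self.W1 + self.b1)  # ReLU hidden layer
        logits = h @ self.W2 + self.b2
        z = np.exp(logits - logits.max())           # stable softmax
        return z / z.sum()
```

A variable-order variant, as the contribution list suggests, might switch `order` based on local contour complexity; here the order is fixed for simplicity.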
Limitations & open questions
The paper is a research proposal and thus includes no experimental results or empirical validation; its claimed compression and complexity advantages remain to be demonstrated.