This paper proposes CinemaWorld, a framework for real-time, user-controllable AR effect intensity in interactive streaming applications. It combines an adaptive neural rendering pipeline, a federated learning-based personalization system, and a quality-aware streaming protocol. Users can dial AR effect intensity from subtle enhancements to dramatic transformations, and the authors report real-time performance on consumer devices while maintaining high visual quality and supporting per-user personalization.
Key findings
CinemaWorld enables real-time, user-controllable AR effect intensity in streaming applications.
The framework combines adaptive neural rendering, federated learning, and quality-aware streaming.
Users can adjust AR effects from subtle enhancements to dramatic transformations.
Preliminary analysis shows real-time performance on consumer devices with high visual quality and user personalization.
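The paper summary does not specify how the intensity control is implemented; a minimal sketch, assuming a simple per-frame linear blend between the original frame and the AR-rendered frame (the function name `blend_ar_frame` and the 0-to-1 intensity scale are hypothetical, not from the paper):

```python
import numpy as np

def blend_ar_frame(base_frame: np.ndarray, ar_frame: np.ndarray,
                   intensity: float) -> np.ndarray:
    """Linearly blend the original frame with the AR-rendered frame.

    intensity = 0.0 keeps the original frame (subtle end of the dial);
    intensity = 1.0 shows the full AR transformation (dramatic end).
    Values outside [0, 1] are clamped.
    """
    alpha = float(np.clip(intensity, 0.0, 1.0))
    blended = (1.0 - alpha) * base_frame + alpha * ar_frame
    return blended.astype(base_frame.dtype)

# Toy example: a 2x2 grayscale frame and a hypothetical AR-stylized version.
base = np.zeros((2, 2), dtype=np.float32)
stylized = np.ones((2, 2), dtype=np.float32)

half = blend_ar_frame(base, stylized, 0.5)  # midway between subtle and dramatic
```

A real pipeline would more likely condition the neural renderer on the intensity value rather than alpha-blend its output, but the blend illustrates the user-facing contract: one scalar control mapping smoothly between the unmodified stream and the full AR effect.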
Limitations & open questions
The paper does not discuss how the framework scales to large deployments.
The evaluation metrics for AR-enhanced streaming quality of experience (QoE) are novel and may require further validation.