Real-world AI systems increasingly operate on continuous, high-velocity data streams where distributions evolve over time and temporal dependencies are present. Traditional "train-once, deploy-forever" paradigms break down, motivating learning systems that adapt continuously under strict computational and memory constraints.
Streaming Continual Learning (SCL) is emerging as a unifying framework bridging Streaming Machine Learning (SML) and Continual Learning (CL), combining rapid online adaptation with selective knowledge retention. Interest in SCL is growing rapidly, building on community events such as the ESANN 2025 special session and the AAAI SCL bridge, both of which attracted strong participation from the CL, SML, and related communities.
The workshop solicits contributions on learning paradigms (CL, SML, online/OCL), adaptation under drift, temporal and sequential learning, evaluation and benchmarking, and emerging directions such as continuous adaptation of foundation models, reinforcement learning, and resource-constrained/edge learning.
Through oral and poster presentations, invited talks, and open discussion sessions, the workshop fosters interaction and exchange among researchers working on dynamic environments and evolving data streams. Uniting these communities provides a shared forum to identify open challenges, promising research directions, and practical solutions for Streaming Continual Learning.
— Paper submission deadline: May 31st, 2026 —
The Streaming Continual Learning (SCL) Workshop is supported by VICI & C. S.p.A.