AI agents are increasingly capable of reasoning and performing autonomous work over long periods. However, as agents take on more complex, longer-horizon tasks, keeping them supplied with the right information becomes the core engineering challenge. The industry is moving away from pre-loading context upfront toward a model where agents dynamically navigate and retrieve the data they need, when they need it.
Redis approaches context management with a context engine, an architecture built on four pillars: on-demand context retrieval, always-current data, fast retrieval, and a memory layer that improves over time. In practice, this means building materialized views of data with a semantic layer on top, rather than giving agents direct access to production databases. A memory system sits alongside this, extracting and compacting information asynchronously as the agent works.
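To make the shape of this architecture concrete, here is a minimal, hypothetical sketch of the ideas described above: a materialized view refreshed from upstream data, on-demand retrieval, and a memory layer that extracts and compacts facts over time. The class and method names are illustrative assumptions, not Redis APIs, and a real system would use vector-based semantic search and truly asynchronous extraction.

```python
# Toy "context engine" sketch. All names here are hypothetical illustrations
# of the architecture described above, not actual Redis interfaces.
from dataclasses import dataclass, field

@dataclass
class ContextEngine:
    # Materialized view: a precomputed, always-current snapshot of source
    # data, so the agent never queries production databases directly.
    view: dict = field(default_factory=dict)
    # Memory layer: compacted facts extracted from past agent work.
    memory: list = field(default_factory=list)

    def refresh(self, source_rows):
        """Rebuild the materialized view from upstream data (keeps it current)."""
        self.view = {row["key"]: row["value"] for row in source_rows}

    def retrieve(self, query):
        """On-demand retrieval: match view entries and memory against a query.
        (A real semantic layer would use embeddings, not substring matching.)"""
        hits = [v for k, v in self.view.items() if query in k]
        return hits + [m for m in self.memory if query in m]

    def extract_memory(self, transcript):
        """Stand-in for async extraction: pull 'fact:' lines from agent output."""
        self.memory.extend(
            line.removeprefix("fact: ")
            for line in transcript.splitlines()
            if line.startswith("fact: ")
        )

    def compact(self, max_items=5):
        """Compaction: bound memory size by keeping only the newest facts."""
        self.memory = self.memory[-max_items:]
```

For example, after `refresh()` with a row keyed `user:prefs` and `extract_memory()` on a transcript containing `fact: user prefers concise answers`, a call to `retrieve("user")` returns both the view entry and the remembered fact.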
Simba Khadder leads AI strategy at Redis, and he previously co-founded the feature store platform FeatureForm, which was acquired by Redis in 2025. In this episode, Simba joins Kevin Ball to discuss why context has become the defining challenge in agentic AI, how context engines differ from traditional RAG architectures, how materialized views underpin reliable agent data pipelines, how memory systems can improve through async extraction and compaction, and how engineering teams need to adapt their practices as AI-driven development accelerates.
Full Disclosure: This episode is sponsored by Redis.
Kevin Ball, or KBall, is the vice president of engineering at Mento and an independent coach for engineers and engineering leaders. He co-founded and served as CTO for two companies, founded the San Diego JavaScript meetup, and organizes the AI in Action discussion group through Latent Space.
Please click here to see the transcript of this episode.

