Talks

AI agents are becoming first‑class API consumers, yet enterprises already operate large REST-based microservice ecosystems that should not be rebuilt for agentic workflows. To enable LLM agents without burdening hundreds of teams, we built a thin MCP (Model Context Protocol) server integrated with Envoy's ext_proc filter. It maps agent tool calls onto existing OpenAPI-defined REST endpoints, bridging LLMs to an existing microservices architecture with no code changes. We're open‑sourcing MCP‑on‑Envoy for others to adopt.
Jens Kat
ING
Jens Kat is a senior engineer at ING, where he leads the platform team responsible for ING's API service mesh. His work focuses on large‑scale API architectures, spanning both the control plane and the data plane, and on enabling new capabilities, such as LLM agent integrations, across thousands of microservices in highly regulated environments.