
Kong Releases Volcano: A TypeScript, MCP-Native SDK for Building Production-Ready AI Agents with LLM Reasoning and Real-World Actions

Kong has open-sourced Volcano, a TypeScript SDK that composes multi-step agent workflows across multiple LLM providers with native Model Context Protocol (MCP) tool use. The launch coincides with broader MCP capabilities in Kong AI Gateway and Konnect, positioning Volcano as the developer SDK in an MCP-governed control plane.

  • Why Volcano SDK? Because 9 lines of code are faster to write and easier to maintain than 100+.
  • Without Volcano SDK, you'd need 100+ lines handling tool schemas, context management, provider switching, error handling, and HTTP clients.
  • With Volcano SDK: 9 lines.
import { agent, llmOpenAI, llmAnthropic, mcp } from "volcano-ai";


// Setup: two LLMs, two MCP servers
const planner = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const executor = llmAnthropic({ model: "claude-4.5-sonnet", apiKey: process.env.ANTHROPIC_API_KEY! });
const database = mcp("https://api.company.com/database/mcp");
const slack = mcp("https://api.company.com/slack/mcp");


// One workflow
await agent({ llm: planner })
 .then({
   prompt: "Analyze last week's sales data",
   mcps: [database]  // Auto-discovers and calls the right tools
 })
 .then({
   llm: executor,  // Switch to Claude
   prompt: "Write an executive summary"
 })
 .then({
   prompt: "Post the summary to #executives",
   mcps: [slack]
 })
 .run();

What Volcano provides

Volcano exposes a compact, chainable .then(...).run() API that passes intermediate context between steps while allowing the LLM to be switched per step (e.g., plan with one model, execute with another). It treats MCP as a first-class interface: developers hand Volcano a list of MCP servers, and the SDK performs tool discovery and invocation automatically. Production features include automatic retries, per-step timeouts, connection pooling for MCP servers, OAuth 2.1 authentication, and OpenTelemetry traces/metrics for distributed observability. The project is released under the Apache-2.0 license.
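The retry and timeout behavior described above amounts to wrapping each step in deadline and retry logic. The following dependency-free sketch illustrates that pattern; the helper names withTimeout and withRetry are illustrative, not Volcano's internals:

```typescript
// Illustrative sketch of per-step timeout + retry wrapping; not Volcano's
// actual implementation. withTimeout races a step against a deadline, and
// withRetry re-runs it with a fresh deadline on each attempt.

function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`step timed out after ${ms}ms`)), ms);
  });
  // Clear the timer so the losing branch never fires later.
  return Promise.race([p, deadline]).finally(() => clearTimeout(timer));
}

async function withRetry<T>(
  step: () => Promise<T>,
  attempts: number,
  timeoutMs: number,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await withTimeout(step(), timeoutMs); // fresh deadline per attempt
    } catch (err) {
      lastError = err; // transient failure: fall through and retry
    }
  }
  throw lastError; // all attempts exhausted
}
```

In Volcano these concerns are configured per step rather than hand-rolled; the sketch only shows what the SDK is automating on the developer's behalf.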

Here are the Key Features of the Volcano SDK:

  • Chainable API: Build multi-step workflows with a concise .then(...).run() pattern; context flows between steps.
  • MCP-native tool use: Pass MCP servers; the SDK auto-discovers and invokes the right tools at each step.
  • Multi-provider LLM support: Mix models (e.g., planning with one, executing with another) within one workflow.
  • Streaming of intermediate and final results for responsive agent interactions.
  • Retries and timeouts configurable per step for reliability under real-world failures.
  • Hooks (before/after each step) to customize behavior and instrumentation.
  • Typed error handling to surface actionable failures during agent execution.
  • Parallel execution, branching, and loops to express complex control flow.
  • Observability via OpenTelemetry for tracing and metrics across steps and tool calls.
  • OAuth support and connection pooling for secure, efficient access to MCP servers.
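Several of these features (the chainable API, context flowing between steps, and before/after hooks) can be pictured with a small dependency-free sketch. The MiniAgent class below is illustrative only and is not the Volcano API:

```typescript
// Dependency-free sketch of a chainable .then(...).run() pattern with
// before/after hooks and context threaded between steps. MiniAgent is
// illustrative; it is not the Volcano API.

type Step = (context: string) => Promise<string>;
type Hook = (stepIndex: number, context: string) => void;

class MiniAgent {
  private steps: Step[] = [];
  constructor(private before?: Hook, private after?: Hook) {}

  then(step: Step): this {
    this.steps.push(step); // queue the step; chainable like agent().then().then()
    return this;
  }

  async run(initial = ""): Promise<string> {
    let ctx = initial;
    for (let i = 0; i < this.steps.length; i++) {
      this.before?.(i, ctx); // hook: instrument before the step runs
      ctx = await this.steps[i](ctx); // each step sees the prior step's output
      this.after?.(i, ctx); // hook: inspect the step's result
    }
    return ctx;
  }
}

async function demo(): Promise<string> {
  // Two steps standing in for LLM calls; hooks log progress.
  return new MiniAgent(
    (i) => console.log(`before step ${i}`),
    (i) => console.log(`after step ${i}`),
  )
    .then(async (ctx) => ctx + "analysis;")
    .then(async (ctx) => ctx + "summary")
    .run();
}

demo().then((result) => console.log(result)); // logs "analysis;summary"
```

The real SDK layers tool discovery, provider switching, and error handling onto this same chaining shape, which is what keeps the earlier example at 9 lines.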

Where it fits in Kong's MCP architecture

Kong's Konnect platform offers several MCP governance and access layers that complement Volcano's SDK surface:

  • AI Gateway adds MCP gateway features such as server autogeneration from Kong-managed APIs, centralized OAuth 2.1 for MCP servers, and observability over tools, workflows, and prompts in Konnect dashboards. These provide uniform policy and analytics for MCP traffic.
  • The Konnect Developer Portal can be turned into an MCP server so AI coding tools and agents can discover APIs, request access, and consume endpoints programmatically, reducing manual credential workflows and making API catalogs accessible through MCP.
  • Kong's team also previewed MCP Composer and MCP Runner to design, generate, and operate MCP servers and integrations.

Key Takeaways

  • Volcano is an open-source TypeScript SDK for building multi-step AI agents with first-class MCP tool use.
  • The SDK provides production features (retries, timeouts, connection pooling, OAuth, and OpenTelemetry tracing/metrics) for MCP workflows.
  • Volcano composes multi-LLM planning and execution and auto-discovers/invokes MCP servers and tools, minimizing custom glue code.
  • Kong paired the SDK with platform controls: AI Gateway and Konnect add MCP server autogeneration, centralized OAuth 2.1, and observability.

Editorial Comments

Kong's Volcano SDK is a pragmatic addition to the MCP ecosystem: a TypeScript-first agent framework that aligns developer workflow with enterprise controls (OAuth 2.1, OpenTelemetry) delivered via AI Gateway and Konnect. The pairing closes a common gap in agent stacks (tool discovery, auth, and observability) without inventing new interfaces beyond MCP. This design prioritizes protocol-native MCP integration over bespoke glue, reducing operational drift and closing auditing gaps as internal agents scale.


Check out the GitHub Repo and technical details.

The post Kong Releases Volcano: A TypeScript, MCP-Native SDK for Building Production-Ready AI Agents with LLM Reasoning and Real-World Actions appeared first on MarkTechPost.
