LangChain Launches Deploy CLI for One-Command AI Agent Deployment



Tony Kim
Mar 16, 2026 17:42

LangChain releases new deploy CLI commands enabling developers to ship LangGraph agents to production with a single command, streamlining CI/CD integration.





LangChain has released a new command-line interface for deploying AI agents to production, cutting what was previously a multi-step infrastructure setup down to a single command.

The deploy CLI, announced March 16, 2026, ships as part of the langgraph-cli package. Running langgraph deploy builds a Docker image from your local project and automatically provisions the supporting infrastructure—Postgres for persistence, Redis for message streaming—without manual configuration.

What the CLI Actually Does

The tool targets a specific pain point: getting LangGraph agents from development into production environments. Rather than manually configuring servers, databases, and message queues, developers can now integrate deployment directly into existing CI/CD pipelines through GitHub Actions, GitLab CI, or Bitbucket Pipelines.
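As a concrete illustration of that CI/CD integration, a GitHub Actions workflow could wrap the deploy command in a single step. This is a hypothetical sketch, not configuration from LangChain's documentation: the workflow name, trigger, and the `LANGSMITH_API_KEY` secret name are assumptions, and the supported authentication mechanism may differ.

```yaml
# Hypothetical GitHub Actions workflow (not from LangChain's docs):
# deploys a LangGraph agent on every push to main.
name: deploy-agent
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install uv
        uses: astral-sh/setup-uv@v5
      - name: Deploy LangGraph agent
        # Assumes the CLI reads credentials from the environment;
        # LANGSMITH_API_KEY is an assumed variable name.
        env:
          LANGSMITH_API_KEY: ${{ secrets.LANGSMITH_API_KEY }}
        run: uvx --from langgraph-cli langgraph deploy
```

The same one-command step translates directly to GitLab CI or Bitbucket Pipelines, since the CLI carries no pipeline-specific dependencies.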

Beyond the core deploy command, the CLI includes several management utilities:

langgraph deploy list — view all deployments in your workspace
langgraph deploy logs — access deployment logs
langgraph deploy delete — remove deployments

The infrastructure provisioned connects to LangSmith Deployment, LangChain’s hosted environment for running production agents.

Context Matters Here

This release builds on LangGraph 1.0, which shipped in October 2025 with a focus on production readiness. That version introduced durable execution—agents that can persist through failures and resume where they left off—along with comprehensive memory management and human-in-the-loop oversight capabilities.

LangGraph handles the complex, stateful workflows that simpler LLM chaining can’t manage: multi-agent coordination, self-correcting loops, and long-running processes that need to maintain context across sessions.

LangChain also dropped two new starter templates alongside the CLI: a “deep agent” template for complex workflows and a “simple agent” template for lighter use cases. Both generate via langgraph new.
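Going from template to production could then look like the following command sequence. This is a sketch: the announcement confirms the langgraph new and langgraph deploy commands but does not document their flags, so template selection may be interactive and the project-name argument is an assumption.

```shell
# Scaffold a new agent project from a starter template
# (template selection may be an interactive prompt; flags
# are not specified in the announcement)
uvx --from langgraph-cli langgraph new my-agent
cd my-agent

# Ship it to LangSmith Deployment with one command
uvx --from langgraph-cli langgraph deploy
```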

Getting Started

The CLI is available now through uvx:

uvx --from langgraph-cli langgraph deploy

For teams already running LangGraph agents in development, this removes a meaningful barrier to production deployment. Whether it changes the calculus for teams evaluating the framework against alternatives like AutoGen or CrewAI depends largely on how much they value integrated tooling versus flexibility.

Documentation lives at docs.langchain.com/langsmith/cli#deploy.
