LangChain v1 is under active development and is not recommended for production use.
  • APIs may change without notice
  • The docs are incomplete and subject to change
For the latest stable version, see the v0 LangChain docs.
LangChain is the easiest way to start building agents and applications powered by LLMs. With under 10 lines of code, you can connect to OpenAI, Anthropic, Google, and more. LangChain provides pre-built agent architectures and model integrations so you can get started quickly and seamlessly incorporate LLMs into your agents and applications.

We recommend LangChain for common agents that run a loop of LLM and tool calls, such as SQL agents, RAG and document analysis, and simple customer support chatbots. Use LangGraph, our low-level agent orchestration framework and runtime, when you have more advanced needs that combine deterministic and agentic workflows, heavy customization, and carefully controlled latency.

LangChain agents are built on top of LangGraph, which provides durable execution, streaming, human-in-the-loop, persistence, and more. You do not need to know LangGraph for basic LangChain agent usage.

Install

npm install langchain@next
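
The example below calls an Anthropic model, so you will also want the provider package and an API key in your environment. A minimal sketch, assuming the Anthropic provider from the quickstart (swap in whichever provider you use; ANTHROPIC_API_KEY is the environment variable the Anthropic integration conventionally reads):

npm install @langchain/anthropic
export ANTHROPIC_API_KEY="sk-..."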

Create an agent

import { z } from "zod";
// npm install @langchain/anthropic to call the model
import { createAgent, tool } from "langchain";

// Define a tool the model can call, with a Zod schema describing its input
const getWeather = tool(({ city }) => `It's always sunny in ${city}!`, {
    name: "get_weather",
    description: "Get the weather for a given city",
    schema: z.object({
        city: z.string(),
    }),
});

// Create an agent that loops model calls and tool calls until it has an answer
const agent = createAgent({
    model: "anthropic:claude-3-7-sonnet-latest",
    tools: [getWeather],
});

console.log(
    await agent.invoke({
        messages: [{ role: "user", content: "What's the weather in Tokyo?" }],
    })
);
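
The invoke call resolves to the agent's final state, which includes the running message history. A minimal sketch of reading just the final reply, assuming the default { messages } result shape returned by createAgent:

const result = await agent.invoke({
    messages: [{ role: "user", content: "What's the weather in Tokyo?" }],
});
// The last message in the array is the model's final answer
const finalMessage = result.messages.at(-1);
console.log(finalMessage?.content);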

Core benefits
