docs.langchain.com
Graphs. At its core, LangGraph models agent workflows as graphs. You define the behavior of your agents using three key components. State: a shared data structure that represents the current snapshot of your application. It can be any data type, but is typically defined using a shared state schema. Nodes: functions that encode the logic of your agents. They receive the current state as input, perform some computation or side effects, and return an updated state. Edges: functions that determine which node to execute next based on the current state; they can be fixed transitions or conditional branches.
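The State/Nodes/Edges model above can be sketched in plain Python without the LangGraph library itself. All names below are illustrative; LangGraph's real API (`StateGraph`, `add_node`, `add_edge`) is richer and also handles persistence, streaming, and interrupts.

```python
# Minimal, dependency-free sketch of the State / Nodes / Edges model.

def node_increment(state: dict) -> dict:
    # A node receives the current state and returns an updated state.
    return {**state, "count": state["count"] + 1}

def node_done(state: dict) -> dict:
    return {**state, "done": True}

def route(state: dict) -> str:
    # A conditional edge: choose the next node from the current state.
    return "increment" if state["count"] < 3 else "done"

nodes = {"increment": node_increment, "done": node_done}

def run_graph(state: dict) -> dict:
    # Drive the graph until the terminal node has run.
    while True:
        name = route(state)
        state = nodes[name](state)
        if name == "done":
            return state

final = run_graph({"count": 0, "done": False})
print(final)  # {'count': 3, 'done': True}
```

The loop is the whole trick: edges read state, nodes write state, and the runtime just alternates between the two.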
Overview. In this tutorial we will build a retrieval agent using LangGraph. LangChain offers built-in agent implementations built on LangGraph primitives; if deeper customization is required, agents can be implemented directly in LangGraph. This guide demonstrates an example implementation of a retrieval agent. Retrieval agents are useful when you want an LLM to make a decision about whether to retrieve context from an index or respond to the user directly.
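The decision step that defines a retrieval agent can be sketched as a simple router. In a real agent the LLM makes this call itself, typically by choosing whether to invoke a retriever tool; the keyword heuristic below is only a stand-in to show the control flow.

```python
# Sketch of the retrieval-agent routing decision (heuristic stand-in).

def needs_retrieval(question: str) -> bool:
    # Stand-in policy: retrieve only for questions about the indexed topic.
    return "langgraph" in question.lower()

def answer(question: str) -> str:
    if needs_retrieval(question):
        return "retrieve: search the index, then answer from context"
    return "respond: answer directly from the model"

print(answer("What is LangGraph?"))
print(answer("Hello!"))
```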
LangChain offers an extensive ecosystem with 1000+ integrations across chat & embedding models, tools & toolkits, document loaders, vector stores, and more. A provider is a company or platform that hosts AI models and exposes them through an API (e.g., OpenAI, Anthropic, Google). Many providers have a dedicated langchain-<provider> package that implements one or more of LangChain’s standard interfaces, such as chat models, embeddings, and vector stores.
Trusted by companies shaping the future of agents, including Klarna, Uber, J.P. Morgan, and more, LangGraph is a low-level orchestration framework and runtime for building, managing, and deploying long-running, stateful agents. LangGraph is very low-level and focused entirely on agent orchestration. Before using LangGraph, we recommend you familiarize yourself with some of the components used to build agents.
This page covers all LangChain integrations with Google Gemini, Google Cloud, and other Google products (such as Google Maps, YouTube, and more). Unified SDK & package consolidation: as of langchain-google-genai 4.0.0, this package uses the consolidated google-genai SDK and now supports both the Gemini Developer API and Vertex AI backends. The langchain-google-vertexai package remains supported for Vertex AI-specific integrations.
LangSmith is a framework-agnostic platform for building, debugging, and deploying AI agents and LLM applications. Trace requests, evaluate outputs, test prompts, and manage deployments, all in one place, whatever your agent stack.
LLMs are powerful AI tools that can interpret and generate text like humans. They’re versatile enough to write content, translate languages, summarize, and answer questions without needing specialized training for each task. In addition to text generation, many models support: Tool calling: calling external tools (like database queries or API calls) and using the results in their responses. Structured output: responding in a requested format, such as JSON matching a given schema.
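Tool calling is a loop between the application and the model: the model asks for a tool, the application runs it and feeds the result back, and the model then answers. The sketch below mocks the model with a plain function so the loop itself is visible; `fake_model`, `get_weather`, and the message shapes are all illustrative, not a real chat-model API.

```python
# Sketch of the tool-calling loop, with the chat model mocked out.

def get_weather(city: str) -> str:
    # A tool the model may call; here it's a canned lookup.
    return {"Paris": "18°C, cloudy"}.get(city, "unknown")

TOOLS = {"get_weather": get_weather}

def fake_model(messages):
    # Stand-in for a chat model: requests a tool on the first turn,
    # then answers once a tool result is present in the history.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_weather", "args": {"city": "Paris"}}}
    result = next(m["content"] for m in messages if m["role"] == "tool")
    return {"content": f"The weather in Paris is {result}."}

def run_agent(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        reply = fake_model(messages)
        if "tool_call" not in reply:
            return reply["content"]
        call = reply["tool_call"]
        # Execute the requested tool and append its result to the history.
        messages.append({"role": "tool", "content": TOOLS[call["name"]](**call["args"])})

print(run_agent("What's the weather in Paris?"))
# The weather in Paris is 18°C, cloudy.
```

With a real model, the only change is that `fake_model` becomes an API call and the tool schema is sent along with the messages.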
Overview. One of the most powerful applications enabled by LLMs is sophisticated question-answering (Q&A) chatbots. These are applications that can answer questions about specific source information. These applications use a technique known as Retrieval Augmented Generation, or RAG. This tutorial will show how to build a simple Q&A application over an unstructured text data source. We will demonstrate how to index a data source, retrieve relevant passages for a question, and generate an answer grounded in them.
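The RAG pattern, stripped to its skeleton, is retrieve-then-prompt. The toy below scores passages by word overlap and assembles a grounded prompt; a real application would use embeddings, a vector store, and an LLM, so every name here is an assumption for illustration.

```python
# Toy RAG pipeline: keyword-overlap retrieval plus prompt assembly.

DOCS = [
    "LangGraph models agent workflows as graphs of nodes and edges.",
    "LangSmith traces requests and evaluates agent outputs.",
]

def retrieve(question: str) -> str:
    # Score each passage by how many question words it shares.
    q = set(question.lower().split())
    return max(DOCS, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(question: str) -> str:
    context = retrieve(question)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How does LangGraph model agent workflows?")
print(prompt)
```

Swapping `retrieve` for a similarity search over embedded chunks, and sending `prompt` to a chat model, turns this skeleton into the application the tutorial builds.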
LangChain is an open source framework with a prebuilt agent architecture and integrations for any model or tool, so you can build agents that adapt as fast as the ecosystem evolves. LangChain is the easy way to start building completely custom agents and applications powered by LLMs. With under 10 lines of code, you can connect to OpenAI, Anthropic, Google, and more. LangChain provides a prebuilt agent architecture and model integrations to get you building quickly.