Initialize a ChatModel from the model name and provider. The integration package corresponding to the model provider must be installed.

model: The name of the model, e.g. "gpt-4", "claude-3-opus-20240229".

fields: Additional configuration options.

fields.modelProvider: The model provider. Supported values include:

  • openai (@langchain/openai)
  • anthropic (@langchain/anthropic)
  • azure_openai (@langchain/openai)
  • google_vertexai (@langchain/google-vertexai)
  • google_genai (@langchain/google-genai)
  • bedrock (@langchain/aws)
  • cohere (@langchain/cohere)
  • fireworks (@langchain/community/chat_models/fireworks)
  • together (@langchain/community/chat_models/togetherai)
  • mistralai (@langchain/mistralai)
  • groq (@langchain/groq)
  • ollama (@langchain/ollama)
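The provider-to-package mapping above can be sketched as a plain lookup table. This is an illustrative helper, not the library's internals — useful, for example, as a pre-flight check before calling initChatModel:

```typescript
// Illustrative sketch: the supported providers and their integration
// packages, as listed above. Not the library's actual implementation.
const PROVIDER_PACKAGES: Record<string, string> = {
  openai: "@langchain/openai",
  anthropic: "@langchain/anthropic",
  azure_openai: "@langchain/openai",
  google_vertexai: "@langchain/google-vertexai",
  google_genai: "@langchain/google-genai",
  bedrock: "@langchain/aws",
  cohere: "@langchain/cohere",
  fireworks: "@langchain/community/chat_models/fireworks",
  together: "@langchain/community/chat_models/togetherai",
  mistralai: "@langchain/mistralai",
  groq: "@langchain/groq",
  ollama: "@langchain/ollama",
};

// Returns the integration package for a provider, or throws for
// unsupported providers (mirroring the error behavior documented below).
function packageForProvider(provider: string): string {
  const pkg = PROVIDER_PACKAGES[provider];
  if (pkg === undefined) {
    throw new Error(`Unsupported model provider: ${provider}`);
  }
  return pkg;
}
```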

fields.configurableFields: Which model parameters are configurable:

  • undefined: No configurable fields.
  • "any": All fields are configurable. (See Security Note in description)
  • string[]: Specified fields are configurable.
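The three-valued semantics above can be sketched as a small filter over runtime configuration keys. This is a conceptual illustration of the behavior described, not the library's implementation:

```typescript
// Illustrative sketch of the configurableFields semantics described above:
// undefined blocks all runtime overrides, "any" passes everything through,
// and a string[] whitelists only the named fields.
type ConfigurableFieldsChoice = undefined | "any" | string[];

function filterConfigurable(
  fields: ConfigurableFieldsChoice,
  runtimeConfig: Record<string, unknown>
): Record<string, unknown> {
  if (fields === undefined) return {}; // no configurable fields
  if (fields === "any") return { ...runtimeConfig }; // everything configurable
  // only the explicitly enumerated fields are configurable
  return Object.fromEntries(
    Object.entries(runtimeConfig).filter(([key]) => fields.includes(key))
  );
}
```

Enumerating fields explicitly, as in the string[] case, is the safer choice when runtime configuration may come from untrusted sources (see the Security Note below).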

fields.configPrefix: Prefix for configurable fields at runtime.
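With a prefix such as "foo", runtime keys arrive as "foo_model", "foo_temperature", etc. (as in the examples below), and the prefix is stripped before the values reach the model. A minimal sketch of that stripping, assuming this simple underscore convention (not the library's code):

```typescript
// Illustrative sketch: strip a configPrefix from runtime configuration
// keys, e.g. { foo_model: "..." } with prefix "foo" becomes { model: "..." }.
// Keys without the prefix are ignored.
function stripConfigPrefix(
  prefix: string,
  config: Record<string, unknown>
): Record<string, unknown> {
  const tag = `${prefix}_`;
  return Object.fromEntries(
    Object.entries(config)
      .filter(([key]) => key.startsWith(tag))
      .map(([key, value]) => [key.slice(tag.length), value])
  );
}
```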

fields.kwargs: Additional keyword args to pass to the ChatModel constructor.

Returns: A class which extends BaseChatModelWithBindTools and implements bindTools for proper typing.

Throws: If modelProvider cannot be inferred or isn't supported.

Throws: If the model provider integration package is not installed.

import { initChatModel } from "langchain/chat_models";

const gpt4 = await initChatModel("gpt-4", {
  modelProvider: "openai",
  kwargs: { temperature: 0 },
});
const claude = await initChatModel("claude-3-opus-20240229", {
  modelProvider: "anthropic",
  kwargs: { temperature: 0 },
});

// Create a partially configurable model with no default model:
const configurableModel = await initChatModel(undefined, {
  kwargs: { temperature: 0 },
});

// Create a fully configurable model with a default model and a config prefix:
const configurableModelWithDefault = await initChatModel("gpt-4", {
  modelProvider: "openai",
  configurableFields: "any",
  configPrefix: "foo",
  kwargs: { temperature: 0 },
});
import { initChatModel } from "langchain/chat_models";

const configurableModel = await initChatModel(undefined, {
  kwargs: { temperature: 0 },
});

await configurableModel.invoke("what's your name", {
  configurable: { model: "gpt-4" },
});
// GPT-4 response

await configurableModel.invoke("what's your name", {
  configurable: { model: "claude-3-5-sonnet-20240620" },
});
// Claude 3.5 Sonnet response
import { initChatModel } from "langchain/chat_models";

const configurableModelWithDefault = await initChatModel("gpt-4", {
  modelProvider: "openai",
  configurableFields: "any",
  configPrefix: "foo",
  kwargs: { temperature: 0 },
});

await configurableModelWithDefault.invoke("what's your name");
// GPT-4 response with temperature 0

await configurableModelWithDefault.invoke("what's your name", {
  configurable: {
    foo_model: "claude-3-5-sonnet-20240620",
    foo_modelProvider: "anthropic",
    foo_temperature: 0.6,
  },
});
// Claude 3.5 Sonnet response with temperature 0.6
import { initChatModel } from "langchain/chat_models";
import { z } from "zod";
import { tool } from "@langchain/core/tools";

const GetWeather = z
  .object({
    location: z
      .string()
      .describe("The city and state, e.g. San Francisco, CA"),
  })
  .describe("Get the current weather in a given location");

const getWeatherTool = tool(
  (input) => {
    // Do something with the input and return a result
    return `The weather in ${input.location} is sunny.`;
  },
  {
    schema: GetWeather,
    name: "GetWeather",
    description: "Get the current weather in a given location",
  }
);

const GetPopulation = z
  .object({
    location: z
      .string()
      .describe("The city and state, e.g. San Francisco, CA"),
  })
  .describe("Get the current population in a given location");

const getPopulationTool = tool(
  (input) => {
    // Do something with the input and return a result
    return `The population of ${input.location} is large.`;
  },
  {
    schema: GetPopulation,
    name: "GetPopulation",
    description: "Get the current population in a given location",
  }
);

const configurableModel = await initChatModel("gpt-4", {
  configurableFields: ["model", "modelProvider"],
  kwargs: { temperature: 0 },
});

const configurableModelWithTools = configurableModel.bindTools([
  getWeatherTool,
  getPopulationTool,
]);

await configurableModelWithTools.invoke(
  "Which city is hotter today and which is bigger: LA or NY?"
);
// GPT-4 response with tool calls

await configurableModelWithTools.invoke(
  "Which city is hotter today and which is bigger: LA or NY?",
  { configurable: { model: "claude-3-5-sonnet-20240620" } }
);
// Claude 3.5 Sonnet response with tool calls

This function initializes a ChatModel based on the provided model name and provider. It supports various model providers and allows for runtime configuration of model parameters.

Security Note: Setting configurableFields to "any" means fields like apiKey, baseUrl, etc. can be altered at runtime, potentially redirecting model requests to a different service or user. If you're accepting untrusted configurations, make sure you enumerate the configurableFields explicitly.

The function will attempt to infer the model provider from the model name if not specified. Certain model name prefixes are associated with specific providers:

  • gpt-3... or gpt-4... -> openai
  • claude... -> anthropic
  • amazon.... -> bedrock
  • gemini... -> google_vertexai
  • command... -> cohere
  • accounts/fireworks... -> fireworks
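The prefix rules above can be sketched as a simple function — an illustrative rendering of the mapping listed, not the library's exact inference logic:

```typescript
// Illustrative sketch of the prefix-based provider inference described
// above. Returns undefined when no prefix matches (in which case
// initChatModel would throw).
function inferModelProvider(modelName: string): string | undefined {
  if (modelName.startsWith("gpt-3") || modelName.startsWith("gpt-4")) {
    return "openai";
  }
  if (modelName.startsWith("claude")) return "anthropic";
  if (modelName.startsWith("amazon.")) return "bedrock";
  if (modelName.startsWith("gemini")) return "google_vertexai";
  if (modelName.startsWith("command")) return "cohere";
  if (modelName.startsWith("accounts/fireworks")) return "fireworks";
  return undefined;
}
```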

0.2.7

0.2.8

  • Parameters

    • model: string
    • Optional fields: Partial<Record<string, any>> & {
          configPrefix?: string;
          configurableFields?: undefined;
          modelProvider?: string;
      }

    Returns Promise<BaseChatModelWithBindTools>

  • Parameters

    • model: never
    • Optional options: Partial<Record<string, any>> & {
          configPrefix?: string;
          configurableFields?: undefined;
          modelProvider?: string;
      }

    Returns Promise<_ConfigurableModel>

  • Parameters

    • Optional model: string
    • Optional options: Partial<Record<string, any>> & {
          configPrefix?: string;
          configurableFields?: ConfigurableFields;
          modelProvider?: string;
      }

    Returns Promise<_ConfigurableModel>