OpenAI Function calling
Function calling is a useful way to get structured output from an LLM for a wide range of purposes. By providing schemas for "functions", the LLM will choose one and attempt to output a response matching that schema.
Though the name implies that the LLM is actually running code and calling a function, it is more accurate to say that the LLM is populating parameters that match the schema for the arguments a hypothetical function would take. We can use these structured responses for whatever we'd like!
Function calling serves as a building block for several other popular features in LangChain, including the OpenAI Functions agent
and structured output chain
. In addition to these more specific use cases, you can also attach function parameters
directly to the model and call it, as shown below.
Usage
There are two main ways to apply functions to your OpenAI calls.
The first and simplest is to pass your function definitions directly as call options to the .invoke({})
method:
/* Define your function schema */
const extractionFunctionSchema = {...}
/* Instantiate ChatOpenAI class */
const model = new ChatOpenAI({ modelName: "gpt-4" });
/**
* Call the .invoke method on the model, directly passing
* the function arguments as call args.
*/
const result = await model.invoke([new HumanMessage("What a beautiful day!")], {
functions: [extractionFunctionSchema],
function_call: { name: "extractor" },
});
console.log({ result });
The second way is to bind the functions directly to your model. Binding function arguments to your model is useful when you want to reuse the same functions across multiple calls.
Calling the .bind({})
method attaches the given call arguments to all future calls to the model.
/* Define your function schema */
const extractionFunctionSchema = {...}
/* Instantiate ChatOpenAI class and bind function arguments to the model */
const model = new ChatOpenAI({ modelName: "gpt-4" }).bind({
functions: [extractionFunctionSchema],
function_call: { name: "extractor" },
});
/* Now we can call the model without having to pass the function arguments in again */
const result = await model.invoke([new HumanMessage("What a beautiful day!")]);
console.log({ result });
OpenAI requires parameter schemas in the format below, where parameters
must be JSON Schema.
When adding call arguments to your model, specifying the function_call
argument will force the model to return a response using the specified function.
If you instead provide multiple schemas and omit function_call, the model will choose which function (if any) to call.
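To make the two modes concrete, here is a minimal sketch of the call-argument shapes. The schemas below are placeholder examples, not part of the API:

```typescript
// Placeholder schemas for illustration only.
const weatherSchema = {
  name: "get_weather",
  description: "Gets the current weather for a location.",
  parameters: { type: "object", properties: {}, required: [] },
};
const extractorSchema = {
  name: "extractor",
  description: "Extracts fields from the input.",
  parameters: { type: "object", properties: {}, required: [] },
};

// Forced: the model must respond via the "extractor" function.
const forcedArgs = {
  functions: [weatherSchema, extractorSchema],
  function_call: { name: "extractor" },
};

// Unforced: omit function_call, and the model decides whether
// (and which) function to call, or answers with plain text.
const openArgs = { functions: [weatherSchema, extractorSchema] };
```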
Example function schema:
const extractionFunctionSchema = {
name: "extractor",
description: "Extracts fields from the input.",
parameters: {
type: "object",
properties: {
tone: {
type: "string",
enum: ["positive", "negative"],
description: "The overall tone of the input",
},
word_count: {
type: "number",
description: "The number of words in the input",
},
chat_response: {
type: "string",
description: "A response to the human's input",
},
},
required: ["tone", "word_count", "chat_response"],
},
};
Now to put it all together:
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";
const extractionFunctionSchema = {
name: "extractor",
description: "Extracts fields from the input.",
parameters: {
type: "object",
properties: {
tone: {
type: "string",
enum: ["positive", "negative"],
description: "The overall tone of the input",
},
word_count: {
type: "number",
description: "The number of words in the input",
},
chat_response: {
type: "string",
description: "A response to the human's input",
},
},
required: ["tone", "word_count", "chat_response"],
},
};
const model = new ChatOpenAI({
modelName: "gpt-4",
}).bind({
functions: [extractionFunctionSchema],
function_call: { name: "extractor" },
});
const result = await model.invoke([new HumanMessage("What a beautiful day!")]);
console.log(result);
/*
AIMessage {
lc_serializable: true,
lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
lc_namespace: [ 'langchain', 'schema' ],
content: '',
name: undefined,
additional_kwargs: {
function_call: {
name: 'extractor',
arguments: '{\n' +
' "tone": "positive",\n' +
' "word_count": 4,\n' +
` "chat_response": "I'm glad you're enjoying the day! What makes it so beautiful for you?"\n` +
'}'
}
}
}
*/
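Note that the arguments field in the output comes back as a JSON string rather than an object, so you will typically JSON.parse it before use. A minimal sketch, using a hard-coded stand-in for the function_call shown in the output above:

```typescript
// Stand-in for result.additional_kwargs.function_call from the example above.
const functionCall = {
  name: "extractor",
  arguments:
    '{"tone": "positive", "word_count": 4, "chat_response": "I\'m glad you\'re enjoying the day!"}',
};

// The arguments field is a JSON string; parse it to access individual fields.
const parsed = JSON.parse(functionCall.arguments);
console.log(parsed.tone); // "positive"
console.log(parsed.word_count); // 4
```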
API Reference:
- ChatOpenAI from
langchain/chat_models/openai
- HumanMessage from
langchain/schema
Usage with Zod
An alternative way to declare a function schema is to use the Zod schema library together with the zod-to-json-schema utility package, which translates a Zod schema into JSON Schema:
- npm: npm install zod zod-to-json-schema
- Yarn: yarn add zod zod-to-json-schema
- pnpm: pnpm add zod zod-to-json-schema
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";
const extractionFunctionSchema = {
name: "extractor",
description: "Extracts fields from the input.",
parameters: zodToJsonSchema(
z.object({
tone: z
.enum(["positive", "negative"])
.describe("The overall tone of the input"),
entity: z.string().describe("The entity mentioned in the input"),
word_count: z.number().describe("The number of words in the input"),
chat_response: z.string().describe("A response to the human's input"),
final_punctuation: z
.optional(z.string())
.describe("The final punctuation mark in the input, if any."),
})
),
};
const model = new ChatOpenAI({
modelName: "gpt-4",
}).bind({
functions: [extractionFunctionSchema],
function_call: { name: "extractor" },
});
const result = await model.invoke([new HumanMessage("What a beautiful day!")]);
console.log(result);
/*
AIMessage {
lc_serializable: true,
lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
lc_namespace: [ 'langchain', 'schema' ],
content: '',
name: undefined,
additional_kwargs: {
function_call: {
name: 'extractor',
arguments: '{\n' +
'"tone": "positive",\n' +
'"entity": "day",\n' +
'"word_count": 4,\n' +
`"chat_response": "I'm glad you're enjoying the day!",\n` +
'"final_punctuation": "!"\n' +
'}'
}
}
}
*/
API Reference:
- ChatOpenAI from
langchain/chat_models/openai
- HumanMessage from
langchain/schema