# Anthropic Functions
LangChain offers an experimental wrapper around Anthropic models that gives them the same API as OpenAI Functions.
## Setup
First, you'll need to install the popular `fast-xml-parser` package as a peer dependency:

- npm

  ```bash
  npm install fast-xml-parser
  ```

- Yarn

  ```bash
  yarn add fast-xml-parser
  ```

- pnpm

  ```bash
  pnpm add fast-xml-parser
  ```
## Initialize model
You can initialize this wrapper the same way you'd initialize a standard `ChatAnthropic` instance:

```typescript
import { AnthropicFunctions } from "langchain/experimental/chat_models/anthropic_functions";

const model = new AnthropicFunctions({
  temperature: 0.1,
  anthropicApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.ANTHROPIC_API_KEY
});
```
## Passing in functions

You can now pass in functions the same way as with OpenAI:
```typescript
import { AnthropicFunctions } from "langchain/experimental/chat_models/anthropic_functions";
import { HumanMessage } from "langchain/schema";

const model = new AnthropicFunctions({
  temperature: 0.1,
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
  // You can set the `function_call` arg to force the model to use a function
  function_call: {
    name: "get_current_weather",
  },
});

const response = await model.invoke([
  new HumanMessage({
    content: "What's the weather in Boston?",
  }),
]);

console.log(response);

/*
  AIMessage {
    content: '',
    additional_kwargs: {
      function_call: {
        name: 'get_current_weather',
        arguments: '{"location":"Boston, MA","unit":"fahrenheit"}'
      }
    }
  }
*/
```
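Note that the `arguments` field of the returned `function_call` is a JSON string, not an object. As a minimal sketch of consuming it (using a hand-built object matching the output shape above, since invoking the model requires an API key):

```typescript
// A hand-built object matching the AIMessage output shape shown above.
const response = {
  content: "",
  additional_kwargs: {
    function_call: {
      name: "get_current_weather",
      arguments: '{"location":"Boston, MA","unit":"fahrenheit"}',
    },
  },
};

// `arguments` arrives as a JSON string, so parse it before use.
const functionCall = response.additional_kwargs.function_call;
const args = JSON.parse(functionCall.arguments);
console.log(args.location); // "Boston, MA"
console.log(args.unit); // "fahrenheit"
```

From here you would typically call your own `get_current_weather` implementation with the parsed arguments.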
API Reference:

- `AnthropicFunctions` from `langchain/experimental/chat_models/anthropic_functions`
- `HumanMessage` from `langchain/schema`
## Using for extraction

You can also use this wrapper for structured extraction, combining a prompt, the bound function, and an output parser into a chain:
```typescript
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";
import { AnthropicFunctions } from "langchain/experimental/chat_models/anthropic_functions";
import { PromptTemplate } from "langchain/prompts";
import { JsonOutputFunctionsParser } from "langchain/output_parsers";

const EXTRACTION_TEMPLATE = `Extract and save the relevant entities mentioned in the following passage together with their properties.

Passage:
{input}
`;

const prompt = PromptTemplate.fromTemplate(EXTRACTION_TEMPLATE);

// Use Zod for easier schema declaration
const schema = z.object({
  people: z.array(
    z.object({
      name: z.string().describe("The name of a person"),
      height: z.number().describe("The person's height"),
      hairColor: z.optional(z.string()).describe("The person's hair color"),
    })
  ),
});

const model = new AnthropicFunctions({
  temperature: 0.1,
}).bind({
  functions: [
    {
      name: "information_extraction",
      description: "Extracts the relevant information from the passage.",
      parameters: zodToJsonSchema(schema),
    },
  ],
  function_call: {
    name: "information_extraction",
  },
});

// Use a JsonOutputFunctionsParser to get the parsed JSON response directly.
const chain = prompt.pipe(model).pipe(new JsonOutputFunctionsParser());

const response = await chain.invoke({
  input:
    "Alex is 5 feet tall. Claudia is 1 foot taller than Alex and jumps higher than him. Claudia is a brunette and Alex is blonde.",
});

console.log(response);

/*
  {
    people: [
      { name: 'Alex', height: 5, hairColor: 'blonde' },
      { name: 'Claudia', height: 6, hairColor: 'brunette' }
    ]
  }
*/
```
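Conceptually, `JsonOutputFunctionsParser` pulls the `function_call.arguments` string out of the model's message and parses it into an object. A simplified sketch of that behavior (a hypothetical helper, not the actual implementation):

```typescript
type FunctionsMessage = {
  additional_kwargs: {
    function_call?: { name: string; arguments: string };
  };
};

// Hypothetical helper mimicking what a functions-output parser does:
// extract the function call's arguments string and JSON-parse it.
function parseFunctionsOutput(message: FunctionsMessage): unknown {
  const call = message.additional_kwargs.function_call;
  if (!call) {
    throw new Error("Message contains no function_call to parse");
  }
  return JSON.parse(call.arguments);
}

// Example with a message matching the extraction output shape above.
const result = parseFunctionsOutput({
  additional_kwargs: {
    function_call: {
      name: "information_extraction",
      arguments: '{"people":[{"name":"Alex","height":5,"hairColor":"blonde"}]}',
    },
  },
}) as { people: Array<{ name: string; height: number; hairColor?: string }> };

console.log(result.people[0].name); // "Alex"
```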
API Reference:

- `AnthropicFunctions` from `langchain/experimental/chat_models/anthropic_functions`
- `PromptTemplate` from `langchain/prompts`
- `JsonOutputFunctionsParser` from `langchain/output_parsers`