Prompt selectors
Prompt selectors are useful when you want to programmatically select a prompt based on the type of model you are using in a chain. This is especially relevant when swapping between chat models and LLMs, which often expect differently formatted prompts.
The interface for prompt selectors is quite simple:
```typescript
abstract class BasePromptSelector {
  abstract getPrompt(llm: BaseLanguageModel): BasePromptTemplate;
}
```
The `getPrompt` method takes in a language model and returns an appropriate prompt template.
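If the built-in selectors don't fit your use case, you can subclass this interface yourself. Below is a minimal sketch of a custom selector that ignores the model and always returns a single prompt; the class name `StaticPromptSelector` is hypothetical, and the import paths are assumptions that may differ between LangChain versions.

```typescript
import type { BaseLanguageModel } from "langchain/base_language";
import { BasePromptTemplate } from "langchain/prompts";
// Entrypoint for BasePromptSelector may vary by LangChain version.
import { BasePromptSelector } from "langchain/chains/prompt_selector";

// Hypothetical selector: returns the same prompt regardless of model type.
class StaticPromptSelector extends BasePromptSelector {
  constructor(private prompt: BasePromptTemplate) {
    super();
  }

  getPrompt(_llm: BaseLanguageModel): BasePromptTemplate {
    return this.prompt;
  }
}
```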
We currently offer a `ConditionalPromptSelector` that allows you to specify a set of conditions and prompt templates. The first condition that evaluates to true will be used to select the prompt template.
```typescript
const QA_PROMPT_SELECTOR = new ConditionalPromptSelector(DEFAULT_QA_PROMPT, [
  [isChatModel, CHAT_PROMPT],
]);
```
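For reference, the two prompt templates above could be defined along these lines. The prompt wording below is an illustrative stand-in rather than the exact text shipped with the library, and the import paths may differ by version.

```typescript
import {
  PromptTemplate,
  ChatPromptTemplate,
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
} from "langchain/prompts";

// Plain-text prompt used for standard LLMs (stand-in wording).
const DEFAULT_QA_PROMPT = new PromptTemplate({
  template:
    "Use the following context to answer the question.\n\n{context}\n\nQuestion: {question}\nHelpful answer:",
  inputVariables: ["context", "question"],
});

// Message-based prompt used for chat models (stand-in wording).
const CHAT_PROMPT = ChatPromptTemplate.fromPromptMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "Use the following context to answer the user's question.\n\n{context}"
  ),
  HumanMessagePromptTemplate.fromTemplate("{question}"),
]);
```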
This selector will return `DEFAULT_QA_PROMPT` if the model is not a chat model, and `CHAT_PROMPT` if it is.
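To see the selection in action, you can call `getPrompt` directly with different model types. This is a sketch assuming the `OpenAI` and `ChatOpenAI` wrappers and their usual import paths; any LLM and chat model pair would behave the same way.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { ChatOpenAI } from "langchain/chat_models/openai";

// A plain LLM does not satisfy isChatModel, so the default prompt is used.
const llmPrompt = QA_PROMPT_SELECTOR.getPrompt(new OpenAI({ temperature: 0 }));

// A chat model matches the [isChatModel, CHAT_PROMPT] condition.
const chatPrompt = QA_PROMPT_SELECTOR.getPrompt(
  new ChatOpenAI({ temperature: 0 })
);
```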
The example below shows how to use a prompt selector when loading a chain:
```typescript
const loadQAStuffChain = (
  llm: BaseLanguageModel,
  params: StuffQAChainParams = {}
) => {
  // Use a caller-supplied prompt if given; otherwise let the selector
  // pick the prompt that matches the model type.
  const { prompt = QA_PROMPT_SELECTOR.getPrompt(llm) } = params;
  const llmChain = new LLMChain({ prompt, llm });
  const chain = new StuffDocumentsChain({ llmChain });
  return chain;
};
```
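Here is a sketch of how such a loader might be used; the `ChatOpenAI` model and the document contents are placeholders. Because the prompt is resolved through the selector, the same call works unchanged with a non-chat LLM.

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { Document } from "langchain/document";

const chain = loadQAStuffChain(new ChatOpenAI({ temperature: 0 }));

const res = await chain.call({
  input_documents: [
    new Document({
      pageContent: "Prompt selectors pick a prompt based on the model type.",
    }),
  ],
  question: "What do prompt selectors do?",
});
console.log(res.text);
```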