Developing LLM Applications with LangChain
Jonathan Bennion
AI Engineer & LangChain Contributor
PromptTemplate
ChatPromptTemplate
FewShotPromptTemplate
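Before diving into LangChain's classes, it helps to see what a chat prompt template does conceptually: it pairs roles with message templates and fills in variables at invoke time. A minimal pure-Python sketch (not the LangChain API, just an illustration):

```python
# Sketch of the idea behind ChatPromptTemplate: format a list of
# (role, template) pairs into role-tagged messages. Illustrative only.
def format_chat_prompt(messages, **variables):
    # str.format ignores unused keyword arguments, so each template
    # picks out only the variables it mentions
    return [(role, template.format(**variables)) for role, template in messages]

template = [
    ("system", "You are a helpful {profession}."),
    ("human", "Explain {concept} in one sentence."),
]

print(format_chat_prompt(template, profession="geographer", concept="latitude"))
```

LangChain's real templates add validation, partials, and message classes on top, but the fill-in-the-variables core is the same.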
examples = [
    {
        "question": "...",
        "answer": "..."
    },
    ...
]
examples = [
    {
        "question": "Does Henry Campbell have any pets?",
        "answer": "Henry Campbell has a dog called Pluto."
    },
    ...
]
# Convert pandas DataFrame to list of dicts
examples = df.to_dict(orient="records")
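If the examples already live in a DataFrame with `question` and `answer` columns, `orient="records"` produces exactly the list-of-dicts shape shown above, one dict per row. A quick check (the data here is illustrative):

```python
import pandas as pd

# A one-row DataFrame with the same columns as the examples above
df = pd.DataFrame({
    "question": ["Does Henry Campbell have any pets?"],
    "answer": ["Henry Campbell has a dog called Pluto."],
})

# orient="records" yields [{"question": ..., "answer": ...}, ...]
examples = df.to_dict(orient="records")
print(examples)
```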
from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate
example_prompt = PromptTemplate.from_template("Question: {question}\n{answer}")
prompt = example_prompt.invoke({"question": "What is the capital of Italy?",
                                "answer": "Rome"})
print(prompt.text)
Question: What is the capital of Italy?
Rome
prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"]
)
examples: the list of dicts
example_prompt: the formatted template
suffix: suffix to add to the input
input_variables: variables expected by the final prompt
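Putting those parameters together: each example dict is rendered with the example prompt, and the formatted suffix is appended at the end. A minimal sketch of that assembly in plain Python (illustrative, not the LangChain implementation):

```python
# Same example data as above
examples = [
    {"question": "Does Henry Campbell have any pets?",
     "answer": "Henry Campbell has a dog called Pluto."},
]

def build_few_shot_prompt(examples, suffix, **inputs):
    # Render each example with the example template,
    # then append the suffix formatted with the user input
    parts = ["Question: {question}\n{answer}".format(**ex) for ex in examples]
    parts.append(suffix.format(**inputs))
    return "\n\n".join(parts)

print(build_few_shot_prompt(
    examples,
    "Question: {input}",
    input="What is the name of Henry Campbell's dog?",
))
```

The output matches the structure printed by `prompt_template.invoke()` below: rendered examples first, then the new question.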
prompt = prompt_template.invoke({"input": "What is the name of Henry Campbell's dog?"})
print(prompt.text)
Question: Does Henry Campbell have any pets?
Henry Campbell has a dog called Pluto.
...
Question: What is the name of Henry Campbell's dog?
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", api_key="...")
llm_chain = prompt_template | llm
response = llm_chain.invoke({"input": "What is the name of Henry Campbell's dog?"})
print(response.content)
The name of Henry Campbell's dog is Pluto.
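The `|` operator above is LangChain Expression Language (LCEL) chaining: each component exposes an `invoke()` method, and `|` composes them left to right, feeding one component's output into the next. A toy sketch of that mechanism (illustrative only, not LangChain's internals):

```python
# Minimal sketch of LCEL-style piping: `|` builds a new runnable that
# invokes the left component, then passes the result to the right one.
class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __or__(self, other):
        # Compose left-to-right: self first, then other
        return Runnable(lambda x: other.invoke(self.invoke(x)))

# Stand-ins for a prompt template and a model
prompt = Runnable(lambda d: "Question: " + d["input"])
fake_llm = Runnable(lambda p: p.upper())

chain = prompt | fake_llm
print(chain.invoke({"input": "what is 2+2?"}))  # → QUESTION: WHAT IS 2+2?
```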