Prompt templates

Developing LLM Applications with LangChain

Jonathan Bennion

AI Engineer & LangChain Contributor

Prompt templates

  • Recipes for defining prompts for LLMs
  • Can contain: instructions, examples, and additional context

A prompt template with placeholders for input variables.


Prompt templates

from langchain_core.prompts import PromptTemplate


template = "Expain this concept simply and concisely: {concept}"
prompt_template = PromptTemplate.from_template( template=template )
prompt = prompt_template.invoke({"concept": "Prompting LLMs"}) print(prompt)
text='Expain this concept simply and concisely: Prompting LLMs'
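The object returned by invoke() is a PromptValue rather than a plain string (hence the text=... representation above); as a minimal sketch, it can be converted explicitly:

# Convert the PromptValue into a plain string
print(prompt.to_string())
# Explain this concept simply and concisely: Prompting LLMs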
from langchain_huggingface import HuggingFacePipeline

llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Llama-3.3-70B-Instruct",
    task="text-generation"
)

llm_chain = prompt_template | llm
concept = "Prompting LLMs" print(llm_chain.invoke({"concept": concept}))
Prompting LLMs (Large Language Models) refers to the process of giving a model a
specific input or question to generate a response.
  • LangChain Expression Language (LCEL): the | (pipe) operator (see the sketch after these bullets)
  • Chain: connects calls to different components
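As a rough sketch of LCEL composition (the output parser is an optional addition, not part of the example above), more than two components can be piped together:

from langchain_core.output_parsers import StrOutputParser

# Prompt -> LLM -> parser: StrOutputParser returns the model output as a plain string
llm_chain = prompt_template | llm | StrOutputParser()
print(llm_chain.invoke({"concept": "Prompting LLMs"}))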

Chat models

  • Chat roles: system, human, ai
from langchain_core.prompts import ChatPromptTemplate


template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a calculator that responds with math."),
        ("human", "Answer this math question: What is two plus two?"),
        ("ai", "2+2=4"),
        ("human", "Answer this math question: {math}")
    ]
)
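As an optional check (not part of the course code), the chat template can be invoked on its own to inspect the formatted messages before any model is called:

# Fill the {math} placeholder and view the resulting system/human/ai messages
messages = template.invoke({"math": "What is five times five?"})
print(messages.to_messages())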

Integrating ChatPromptTemplate

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", api_key='<OPENAI_API_TOKEN>')


llm_chain = template | llm
math = 'What is five times five?'
response = llm_chain.invoke({"math": math})
print(response.content)
5x5=25
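Note that the chat model returns an AIMessage object, which is why the generated text is accessed via response.content rather than by printing the response directly.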

Let's practice!

