Developing LLM Applications with LangChain
Jonathan Bennion
AI Engineer & LangChain Contributor
The brand guidelines to consider, stored as LangChain Document objects:
from langchain_core.documents import Document

docs = [
    Document(
        page_content="In all marketing copy, TechStack should always be written with the T and S capitalized. Incorrect: techstack, Techstack, etc.",
        metadata={"guideline": "brand-capitalization"}
    ),
    Document(
        page_content="Our users should be referred to as techies in both internal and external communications.",
        metadata={"guideline": "referring-to-users"}
    )
]
from langchain_openai import OpenAIEmbeddings
from langchain_chroma import Chroma

embedding_function = OpenAIEmbeddings(api_key=openai_api_key, model='text-embedding-3-small')
vectorstore = Chroma.from_documents(
    docs,
    embedding=embedding_function,
    persist_directory="path/to/directory"
)

retriever = vectorstore.as_retriever(
    search_type="similarity",
    search_kwargs={"k": 2}
)
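As a quick sanity check (a minimal sketch; the query string is illustrative), the retriever can be invoked directly to inspect which guideline documents it returns:

# Illustrative query; returns the two most similar guideline documents
retrieved = retriever.invoke("Here at techstack, our users are the best!")
for doc in retrieved:
    print(doc.metadata["guideline"])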
from langchain_core.prompts import ChatPromptTemplate
message = """
Review and fix the following TechStack marketing copy, taking these guidelines into consideration:
Guidelines:
{guidelines}
Copy:
{copy}
Fixed Copy:
"""
prompt_template = ChatPromptTemplate.from_messages([("human", message)])
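The chain below references a chat model bound to llm, which is not defined above; a minimal sketch, assuming an OpenAI chat model (the model name is illustrative):

from langchain_openai import ChatOpenAI

# Assumed chat model; the specific model choice here is illustrative
llm = ChatOpenAI(api_key=openai_api_key, model="gpt-4o-mini")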
from langchain_core.runnables import RunnablePassthrough

rag_chain = (
    {"guidelines": retriever, "copy": RunnablePassthrough()}
    | prompt_template
    | llm
)
response = rag_chain.invoke("Here at techstack, our users are the best in the world!")
print(response.content)
Here at TechStack, our techies are the best in the world!
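Optionally, appending a string output parser to the chain returns the fixed copy directly as a string, so there is no need to access .content; a minimal sketch of that variant:

from langchain_core.output_parsers import StrOutputParser

# Same chain as above, with the model's message parsed into a plain string
rag_chain = (
    {"guidelines": retriever, "copy": RunnablePassthrough()}
    | prompt_template
    | llm
    | StrOutputParser()
)
print(rag_chain.invoke("Here at techstack, our users are the best in the world!"))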