This directory covers loading and uploading of prompts. Each sub-directory covers a different use case, and contains not only prompts relevant to that use case but also a README file describing how best to use those prompts.
All prompts can be loaded from LangChain by specifying the desired path and adding the `lc://` prefix. The path should be relative to the langchain-hub repo. For example, to load the prompt at `langchain-hub/prompts/qa/stuff/basic/prompt.yaml`, the path you want to specify is `lc://prompts/qa/stuff/basic/prompt.yaml`.
Once you have that path, you can load it in the following manner:

```python
from langchain.prompts import load_prompt

prompt = load_prompt('lc://prompts/qa/stuff/basic/prompt.yaml')
```
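Once loaded, the prompt behaves like any other LangChain prompt template. As a quick sketch continuing from the block above (the exact input variables depend on the template you loaded, so they are inspected rather than assumed):

```python
# Inspect which input variables the loaded template expects, then fill each
# one with a placeholder value to see the rendered prompt.
print(prompt.input_variables)
print(prompt.format(**{var: "..." for var in prompt.input_variables}))
```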
To upload a prompt to the LangChainHub, you must upload 2 files:

1. The prompt itself. There are three supported file formats for prompts: `json`, `yaml`, and `python`. The suggested options are `json` and `yaml`, but `python` is provided as an option for more flexibility. Please see the sections below for instructions on uploading each format.
2. An associated README file describing the prompt and how to use it.

The prompts on the hub are organized by use case. The use cases are reflected in the directory structure and names: each directory represents a different use case. You should upload your prompt file to a folder in the appropriate use case section.
If adding a prompt to an existing use case folder, make sure the prompt genuinely belongs to that use case. A litmus test: the existing README file for that folder should also apply to the new prompt being added.
## json

To get a properly formatted `json` file, if you have a prompt in memory in Python you can run:

```python
prompt.save("file_name.json")
```

Replace `"file_name"` with the desired name of the file.
## yaml

To get a properly formatted `yaml` file, if you have a prompt in memory in Python you can run:

```python
prompt.save("file_name.yaml")
```

Replace `"file_name"` with the desired name of the file.
## python

To get a properly formatted Python file, you should upload a Python file that exposes a `PROMPT` variable. This is the variable that will be loaded. This variable should be an instance of a subclass of `BasePromptTemplate` in LangChain.
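As a rough sketch, such a Python file might look like the following; the file name, template text, and input variable are made up, and the only requirements stated above are the `PROMPT` variable name and that it be an instance of a `BasePromptTemplate` subclass (such as `PromptTemplate`):

```python
# my_prompt.py (hypothetical file name)
from langchain.prompts import PromptTemplate

# PROMPT is the variable that will be loaded; it must be an instance of a
# subclass of BasePromptTemplate. PromptTemplate is one such subclass.
PROMPT = PromptTemplate(
    input_variables=["question"],
    template="Answer the following question as concisely as you can:\n\n{question}",
)
```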