DeepInfra
DeepInfra lets you run the latest machine learning models with ease. DeepInfra takes care of all the heavy lifting related to running, scaling, and monitoring the models, so you can focus on your application and integrate the models with simple REST API calls.
DeepInfra provides examples of integration with LangChain.
This page covers how to use the DeepInfra ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific DeepInfra wrappers.
Installation and Setup
- Get your DeepInfra API key and set it as an environment variable (DEEPINFRA_API_TOKEN).
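The environment variable can also be set from Python before the wrappers are used; the token value below is a placeholder, not a real credential:

```python
import os

# Placeholder for illustration; replace with your real DeepInfra API token.
os.environ.setdefault("DEEPINFRA_API_TOKEN", "your-deepinfra-api-token")
```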
Available Models
DeepInfra provides a range of open-source LLMs ready for deployment.
You can see the supported models for text generation and embeddings.
You can also view a list of request and response parameters.
Chat models follow the OpenAI API format.
LLM
See a usage example.
from langchain_community.llms import DeepInfra
Embeddings
See a usage example.
from langchain_community.embeddings import DeepInfraEmbeddings
Chat Models
See a usage example.
from langchain_community.chat_models import ChatDeepInfra