To integrate the DeepSeek API with LangChain, follow these steps to set up the environment, configure authentication, and implement a custom model in your code.
Setting Up the Environment
1. Install Required Packages: You need to install the `langchain-deepseek-official` package. Note that the example below also imports `langchain-core`, `python-dotenv`, and the `chat_deepseek_api` client library, so make sure those are available in your environment as well. Use the following command:
pip install -U langchain-deepseek-official
2. Set Environment Variables: Create a `.env` file in your project directory to securely store your DeepSeek credentials. The file should contain:
DEEPSEEK_EMAIL=your_email
DEEPSEEK_PASSWORD=your_password
DEEPSEEK_DEVICE_ID=your_device_id
DEEPSEEK_COOKIES=your_cookies
DEEPSEEK_DS_POW_RESPONSE=your_ds_pow_response
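Once the `.env` file is in place, you can quickly confirm that the credentials are picked up by loading them with `python-dotenv` (the same library the main example uses) and checking that none of the variables defined above are missing:
```python
import os

from dotenv import load_dotenv

# Load the variables from .env into the process environment.
load_dotenv()

REQUIRED_VARS = [
    "DEEPSEEK_EMAIL",
    "DEEPSEEK_PASSWORD",
    "DEEPSEEK_DEVICE_ID",
    "DEEPSEEK_COOKIES",
    "DEEPSEEK_DS_POW_RESPONSE",
]

missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing DeepSeek credentials: {', '.join(missing)}")
print("All DeepSeek credentials are set.")
```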
Authentication Setup
Before you can call the API, the client must authenticate with the credentials stored in your environment variables. The custom model shown later reads them at construction time and uses them to establish an authenticated session with the DeepSeek API on the first request.
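If you want to verify the credentials before wiring them into LangChain, the following minimal sketch performs the same login the full example does. It assumes the `chat_deepseek_api` client and the `DeepseekAPI.create` arguments shown in the code below:
```python
import asyncio
import os

from dotenv import load_dotenv
from chat_deepseek_api import DeepseekAPI


async def check_login() -> None:
    # Authenticate once with the credentials from the .env file.
    app = await DeepseekAPI.create(
        email=os.getenv("DEEPSEEK_EMAIL"),
        password=os.getenv("DEEPSEEK_PASSWORD"),
        save_login=True,
        device_id=os.getenv("DEEPSEEK_DEVICE_ID"),
        custom_headers={
            "cookie": os.getenv("DEEPSEEK_COOKIES"),
            "x-ds-pow-response": os.getenv("DEEPSEEK_DS_POW_RESPONSE"),
        },
    )
    # Opening a new chat session confirms the login worked.
    session_id = await app.new_chat()
    print(f"Authenticated; new chat session: {session_id}")


if __name__ == "__main__":
    load_dotenv()
    asyncio.run(check_login())
```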
Code Implementation
Below is a sample implementation that wraps the DeepSeek chat API in a custom LangChain `LLM` class:
```python
import asyncio
import os
from typing import Any, List, Optional

from dotenv import load_dotenv
from langchain_core.callbacks.manager import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM

from chat_deepseek_api import DeepseekAPI


class ChatDeepSeekApiLLM(LLM):
    # Credentials and session state, declared as fields so the pydantic-based
    # LLM base class accepts them.
    email: Optional[str] = None
    password: Optional[str] = None
    device_id: Optional[str] = None
    cookies: Optional[str] = None
    ds_pow_response: Optional[str] = None
    app: Any = None
    chat_session_id: Any = None
    message_id: int = 0

    def __init__(self, email: str, password: str, device_id: str, cookies: str, ds_pow_response: str):
        super().__init__()
        self.email = email
        self.password = password
        self.device_id = device_id
        self.cookies = cookies
        self.ds_pow_response = ds_pow_response

    @property
    def _llm_type(self) -> str:
        # Identifier required by the LangChain LLM interface.
        return "chat-deepseek-api"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, run_manager: Optional[CallbackManagerForLLMRun] = None, **kwargs: Any) -> str:
        self._verify_config()
        # Join the streamed chunks into a single response string.
        return "".join(self._generate_message(prompt))

    def _generate_message(self, prompt: str) -> List[str]:
        # Drain the asynchronous generator from LangChain's synchronous _call.
        async def collect() -> List[str]:
            return [chunk async for chunk in self._async_generate_message(prompt)]

        return asyncio.run(collect())

    async def _async_generate_message(self, prompt: str):
        # Authenticate on first use and reuse the client afterwards.
        if not self.app:
            self.app = await DeepseekAPI.create(
                email=self.email,
                password=self.password,
                save_login=True,
                device_id=self.device_id,
                custom_headers={
                    "cookie": self.cookies,
                    "x-ds-pow-response": self.ds_pow_response,
                },
            )
        # Reuse one chat session so follow-up prompts keep their context.
        if not self.chat_session_id:
            self.chat_session_id = await self.app.new_chat()
        async for chunk in self.app.chat(message=prompt, id=self.chat_session_id):
            yield chunk.choices.delta.content

    def _verify_config(self) -> None:
        if not all([self.email, self.password, self.device_id, self.cookies, self.ds_pow_response]):
            raise ValueError("All credentials must be provided.")


if __name__ == "__main__":
    load_dotenv()
    model = ChatDeepSeekApiLLM(
        email=os.getenv("DEEPSEEK_EMAIL"),
        password=os.getenv("DEEPSEEK_PASSWORD"),
        device_id=os.getenv("DEEPSEEK_DEVICE_ID"),
        cookies=os.getenv("DEEPSEEK_COOKIES"),
        ds_pow_response=os.getenv("DEEPSEEK_DS_POW_RESPONSE"),
    )
    result = model.invoke("Who are you?")
    print(result)
```
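When you run this script, the first `invoke` call authenticates with the stored credentials, opens a new chat session, and streams the reply in chunks, which `_call` joins into a single string before printing it.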
Key Components Explained
- Authentication: The `ChatDeepSeekApiLLM` class is initialized with user credentials to authenticate with the DeepSeek API.
- Message Handling: The `_async_generate_message` method processes prompts and retrieves responses asynchronously.
- Session Management: A chat session ID is created on the first request and reused for subsequent calls, so conversational context is preserved within that session.
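Because `ChatDeepSeekApiLLM` implements LangChain's standard `LLM` interface, it composes with the rest of the framework. The sketch below (the prompt text is only illustrative) pipes a `PromptTemplate` into the model using the LangChain expression language:
```python
from langchain_core.prompts import PromptTemplate

# Assumes `model` is the ChatDeepSeekApiLLM instance created in the example above.
prompt = PromptTemplate.from_template(
    "Summarize the following text in one sentence:\n\n{text}"
)

# PromptTemplate | LLM builds a runnable chain; invoke() renders the template
# and sends the resulting prompt through the custom DeepSeek-backed model.
chain = prompt | model
print(chain.invoke({"text": "LangChain lets you compose prompts, models, and tools."}))
```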
By following these steps and using the provided code, you can integrate the DeepSeek API with LangChain and build LLM-powered applications on top of DeepSeek's models.