To use the DeepSeek-R1 model for content generation, you can follow the sample code provided below. These examples demonstrate how to make a chat completion request against DeepInfra's OpenAI-compatible endpoint, which hosts the model.
## Sample Code for Content Generation

### Using cURL

You can use the following cURL command to send a request to the API:
```bash
curl "https://api.deepinfra.com/v1/openai/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPINFRA_TOKEN" \
  -d '{
        "model": "deepseek-ai/DeepSeek-R1",
        "messages": [
          {
            "role": "system",
            "content": "Respond like a Michelin-starred chef."
          },
          {
            "role": "user",
            "content": "Can you name at least two different techniques to cook lamb?"
          }
        ]
      }'
```
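A successful call returns a JSON body in the OpenAI-compatible chat completion format. The shape below is illustrative only; the ID, text, and token counts are placeholders, not real output:

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "deepseek-ai/DeepSeek-R1",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Two techniques that work beautifully for lamb are slow braising and ..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 30,
    "completion_tokens": 120,
    "total_tokens": 150
  }
}
```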
### Using Python

Here's how you can implement the same functionality in Python using the `requests` library:

```python
import requests

# Replace with your actual DeepInfra API token
YOUR_DEEPINFRA_TOKEN = "your_api_token_here"

url = "https://api.deepinfra.com/v1/openai/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {YOUR_DEEPINFRA_TOKEN}",
}
data = {
    "model": "deepseek-ai/DeepSeek-R1",
    "messages": [
        {"role": "system", "content": "Respond like a Michelin-starred chef."},
        {"role": "user", "content": "Can you name at least two different techniques to cook lamb?"},
    ],
}

response = requests.post(url, headers=headers, json=data)
print(response.json())
```
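Because the response follows the OpenAI-compatible chat completion format, the generated text typically sits under `choices[0].message.content`. Here is a minimal sketch of pulling it out of the `response` object above, assuming a successful request and the standard response layout:

```python
result = response.json()

# The assistant's reply is nested inside the first choice
# (assumes the standard OpenAI-compatible response layout).
reply = result["choices"][0]["message"]["content"]
print(reply)

# Token counts, if present, are useful for tracking usage and cost.
usage = result.get("usage", {})
print(usage.get("prompt_tokens"), usage.get("completion_tokens"))
```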
## Explanation

- **Authorization**: In the cURL example, set the `DEEPINFRA_TOKEN` environment variable; in the Python example, replace `YOUR_DEEPINFRA_TOKEN` with your actual API token.
- **Model**: The model specified in the request is `deepseek-ai/DeepSeek-R1`, a reasoning model served through the chat completions endpoint.
- **Messages**: The `messages` array contains a system message that sets the context and a user message that poses the question.
## Streaming Responses

If you want to receive streaming responses, add `"stream": true` to the JSON request body:

```json
"stream": true,
```

This lets you handle the output as it is generated, which is useful for displaying partial results during longer responses.
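Below is a minimal Python sketch of consuming the stream, assuming the endpoint emits OpenAI-style server-sent events (`data: {...}` lines terminated by `data: [DONE]`); adjust the parsing if the provider's streaming format differs:

```python
import json
import os

import requests

url = "https://api.deepinfra.com/v1/openai/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {os.environ['DEEPINFRA_TOKEN']}",  # token read from an env var
}
data = {
    "model": "deepseek-ai/DeepSeek-R1",
    "stream": True,  # request incremental chunks instead of one final response
    "messages": [
        {"role": "system", "content": "Respond like a Michelin-starred chef."},
        {"role": "user", "content": "Can you name at least two different techniques to cook lamb?"},
    ],
}

with requests.post(url, headers=headers, json=data, stream=True) as response:
    for raw_line in response.iter_lines():
        if not raw_line:
            continue  # skip keep-alive blank lines
        line = raw_line.decode("utf-8")
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        print(delta.get("content") or "", end="", flush=True)  # print each fragment as it arrives
print()
```

Each `data:` line carries only a partial delta rather than the full message, so the client accumulates or displays the fragments as they arrive.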
## Additional Notes
- Ensure that you have installed any necessary libraries (like `requests` for Python).
- The API has limitations on context size and response time, which may vary based on the complexity of the conversation and the model used[1][2][4].
Citations:
[1] https://deepinfra.com/deepseek-ai/DeepSeek-R1/api
[2] https://api-docs.deepseek.com
[3] https://github.com/deepseek-ai/deepseek-coder/?tab=readme-ov-file
[4] https://developers.cloudflare.com/workers-ai/tutorials/explore-code-generation-using-deepseek-coder-models/
[5] https://www.youtube.com/watch?v=_IKz7sDQeVg
[6] https://www.datacamp.com/tutorial/deepseek-v3
[7] https://www.youtube.com/watch?v=WbKa-gxVybA
[8] https://api-docs.deepseek.com/api/create-chat-completion