How does session management work when integrating DeepSeek R1 with Together.ai's API


Integrating DeepSeek R1 with Together.ai's API requires deliberate session management, which is crucial for maintaining context and continuity across interactions. Together.ai's API does not natively support session management through session IDs the way some other platforms do; instead, the client maintains continuity by passing the entire conversation context with each request. Here's a detailed overview of how session management works in this context:

Authentication and Setup

To start using the DeepSeek R1 model with Together.ai, you need to set up your environment with the necessary credentials. This typically involves storing your API key securely, often in environment variables. For example, you would set `TOGETHER_API_KEY` in your `.env` file[4].
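
As a minimal sketch, loading that key in Python might look like the following (this assumes the `python-dotenv` package is installed; it is not required by Together.ai itself):

```python
# Minimal sketch: load the Together.ai API key from a .env file.
# Assumes python-dotenv is installed (pip install python-dotenv)
# and that .env contains a line like: TOGETHER_API_KEY=...
import os

from dotenv import load_dotenv

load_dotenv()  # copies variables from .env into the process environment
api_key = os.environ["TOGETHER_API_KEY"]
```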

Creating a Session Context

Since Together.ai does not provide a traditional session ID for managing conversations, you must handle the conversation context yourself: store the conversation history and pass it in full with each request so the model retains continuity. Here's how you can do it:

1. Initial Request: When you first interact with the model, you send a message without any prior context.

2. Subsequent Requests: For each subsequent message, you include the entire conversation history (previous messages and responses) in the request. This allows the model to understand the context and respond accordingly.

API Usage Example

To integrate DeepSeek R1 with Together.ai's API, you can use the following approach:

```python
import requests

# Set API endpoint and credentials
api_endpoint = "https://api.together.xyz/v1/chat/completions"
api_key = "YOUR_TOGETHER_API_KEY"

# Initial message
messages = [{"role": "user", "content": "Hello, how are you?"}]

# Send the initial request
response = requests.post(
    api_endpoint,
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    json={"model": "deepseek-ai/DeepSeek-R1", "messages": messages}
)

# Store the response and add it to the conversation history
conversation_history = messages + [{"role": "assistant", "content": response.json()["choices"][0]["message"]["content"]}]

# For subsequent requests, include the conversation history
next_message = [{"role": "user", "content": "What's the weather like today?"}]
conversation_history += next_message

# Send the next request with the updated conversation history
response = requests.post(
    api_endpoint,
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    json={"model": "deepseek-ai/DeepSeek-R1", "messages": conversation_history}
)
```
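
Because the conversation state lives entirely on the client, it is often convenient to wrap this pattern in a small helper that appends each user message and assistant reply to the history automatically. The `ChatSession` class below is an illustrative sketch built on the same `requests` calls shown above; the class and method names are not part of Together.ai's API:

```python
import requests

class ChatSession:
    """Keeps the conversation history client-side and resends it with every request."""

    def __init__(self, api_key, model="deepseek-ai/DeepSeek-R1",
                 endpoint="https://api.together.xyz/v1/chat/completions"):
        self.api_key = api_key
        self.model = model
        self.endpoint = endpoint
        self.history = []  # full list of {"role": ..., "content": ...} messages

    def send(self, user_content):
        # Append the new user message, then send the entire history.
        self.history.append({"role": "user", "content": user_content})
        response = requests.post(
            self.endpoint,
            headers={"Authorization": f"Bearer {self.api_key}",
                     "Content-Type": "application/json"},
            json={"model": self.model, "messages": self.history},
        )
        response.raise_for_status()
        reply = response.json()["choices"][0]["message"]["content"]
        # Store the assistant reply so the next request carries the full context.
        self.history.append({"role": "assistant", "content": reply})
        return reply

# Usage:
# session = ChatSession(api_key="YOUR_TOGETHER_API_KEY")
# print(session.send("Hello, how are you?"))
# print(session.send("What's the weather like today?"))
```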

Using Together.ai's TypeScript Library

Alternatively, you can use Together.ai's TypeScript library for more convenient integration:

```typescript
import Together from 'together-ai';

const together = new Together({ apiKey: process.env['TOGETHER_API_KEY'] });

// Initial message
const messages = [{"role": "user", "content": "Hello, how are you?"}];

// Send the initial request
const response = await together.chat.completions.create({
    messages: messages,
    model: "deepseek-ai/DeepSeek-R1"
});

// Store the response and add it to the conversation history
const conversationHistory = messages.concat([{"role": "assistant", "content": response.choices[0].message.content}]);

// For subsequent requests, include the conversation history
const nextMessage = [{"role": "user", "content": "What's the weather like today?"}];
conversationHistory.push(...nextMessage);

// Send the next request with the updated conversation history
const nextResponse = await together.chat.completions.create({
    messages: conversationHistory,
    model: "deepseek-ai/DeepSeek-R1"
});
```

Conclusion

While Together.ai's API does not support traditional session management through session IDs, conversations can be managed effectively by passing the entire context with each request. This approach ensures that the model can understand and respond based on the full conversation history, providing a seamless interaction experience.

Citations:
[1] https://www.together.ai/models/deepseek-r1
[2] https://github.com/togethercomputer/together-typescript
[3] https://codingmall.com/knowledge-base/25-global/245531-how-does-session-management-work-when-integrating-deepseek-api-with-langchain
[4] https://thinktank.ottomator.ai/t/together-ai-offering-100-free-credits-and-deepseek-r1/4892
[5] https://community.openai.com/t/how-to-keep-session-with-gpt-3-5-turbo-api/81029
[6] https://www.bombaysoftwares.com/blog/deepseek-r1-comprehensive-guide
[7] https://www.together.ai/blog/deploy-deepseek-r1-at-scale-fast-secure-serverless-apis-and-large-scale-together-reasoning-clusters
[8] https://www.together.ai
[9] https://community.appsmith.com/content/guide/building-chat-app-deepseek-r1-and-togetherai-under-5-minutes