Anthropic Package for Go


The Anthropic package for Go is an unofficial client library designed to provide convenient access to the Anthropic API, a set of artificial intelligence language model services offered by Anthropic. This package facilitates interaction with Anthropic's language models, enabling Go developers to integrate AI-driven conversational agents and text generation capabilities into their applications seamlessly.

Overview and Purpose

The Go Anthropic SDK is inspired by the unofficial OpenAI Go SDK and offers tools to communicate with the Anthropic API endpoints. Primarily, it supports sending messages to Anthropic models and receiving generated completions based on input prompts. This aligns with the use of AI models like Claude, developed by Anthropic, for natural language understanding and generation tasks. The package is unofficial but leverages the REST API exposed by Anthropic.

Installation

The package can be installed with the Go toolchain using the command:


go get github.com/adamchol/go-anthropic-sdk

or, for another community-maintained variant:


go get github.com/liushuangls/go-anthropic/v2

The official Anthropic Go SDK can be installed using:


go get -u 'github.com/anthropics/anthropic-sdk-go@v1.9.1'

Each command fetches the corresponding SDK for integrating the Anthropic API into a Go project.

Requirements

The SDK typically requires recent Go versions, often Go 1.18 or higher, with some variants specifying Go 1.21 or greater. The environment must have proper API keys from Anthropic, usually set via environment variables like `ANTHROPIC_API_KEY`.
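
A minimal sketch of reading the key from the environment rather than hard-coding it, using the unofficial client constructor shown in the examples below (the variable names and error handling here are illustrative, not part of the SDK):

go
package main

import (
    "log"
    "os"

    anthropic "github.com/adamchol/go-anthropic-sdk"
)

func main() {
    // Read the API key from the environment instead of hard-coding it.
    apiKey := os.Getenv("ANTHROPIC_API_KEY")
    if apiKey == "" {
        log.Fatal("ANTHROPIC_API_KEY is not set")
    }

    // Construct the client with the key (constructor as used in the examples below).
    client := anthropic.NewClient(apiKey)
    _ = client // use the client to send requests
}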

Basic Usage

To start using the package, the developer creates a client instance by providing the API key. The client is then used to send message requests to specific models. For example, a basic interaction with the Claude 3.5 Sonnet model involves constructing a request with user messages and invoking the API to get the AI-generated response.

Example snippet:

go
package main

import (
    "context"
    "fmt"
    anthropic "github.com/adamchol/go-anthropic-sdk"
)

func main() {
    // Create a client authenticated with your Anthropic API key.
    client := anthropic.NewClient("your-token")

    // Send a single user message to the Claude 3.5 Sonnet model.
    resp, err := client.CreateMessage(context.Background(), anthropic.MessageRequest{
        Model: anthropic.Claude35SonnetModel,
        Messages: []anthropic.InputMessage{
            {
                Role: anthropic.MessageRoleUser,
                Content: "Hello, how are you?",
            },
        },
        MaxTokens: 4096,
    })

    if err != nil {
        fmt.Println(err)
        return
    }
    // Print the text of the first content block in the response.
    fmt.Println(resp.Content[0].Text)
}

This example creates a client, sends a message with the user role, and prints the generated text from the Anthropic AI model.

Advanced Features

The package supports several advanced interactions beyond simple text completion:

- Image and Text Input: Some versions support sending images encoded in base64 alongside text messages for multimodal input handling. For instance, sending an image file as base64 and querying if there's a living organism in the image.

- Streaming Completions: The library supports streaming the response from the Anthropic API, allowing real-time readout of the generated content as it is produced by the model.

- Message Batching: The Message Batches API is supported, allowing many message requests to be submitted together and processed asynchronously, which can reduce cost and overhead for large workloads.

- Tool Use Integration: Certain SDKs allow defining and invoking external tools via the conversational interface, extending the AI's capabilities to perform tasks like geolocation lookup or data retrieval.

- Prompt Caching and Token Counting: These are utilities for managing prompt efficiency, reusing prompts intelligently, and monitoring token usage to control cost and response length; a token-counting sketch follows this list.
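
As a sketch of token counting with the official SDK described later in this article, the call might look like the following. The method and parameter names here mirror the public /v1/messages/count_tokens endpoint and are assumptions to verify against the SDK reference:

go
// Sketch only: assumes the official SDK exposes Messages.CountTokens for
// the /v1/messages/count_tokens endpoint; check the SDK docs for exact types.
count, err := client.Messages.CountTokens(context.TODO(), anthropic.MessageCountTokensParams{
    Model: anthropic.ModelClaudeSonnet4_20250514,
    Messages: []anthropic.MessageParam{
        anthropic.NewUserMessage(anthropic.NewTextBlock("How many tokens is this prompt?")),
    },
})
if err != nil {
    panic(err)
}
fmt.Printf("input tokens: %d\n", count.InputTokens)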

Example of Image and Text Input Handling

go
package main

import (
    "context"
    "encoding/base64"
    "fmt"
    "log"
    "os"
    "github.com/adamchol/go-anthropic-sdk"
)

func main() {
    client := anthropic.NewClient("your-token")

    imageBytes, err := os.ReadFile("ant.jpg")
    if err != nil {
        log.Fatalf("Failed to read image file: %v", err)
    }

    imgData := base64.StdEncoding.EncodeToString(imageBytes) // Encoding the image into base64

    resp, err := client.CreateMessage(context.Background(), anthropic.MessageRequest{
        Model: anthropic.Claude35SonnetModel,
        Messages: []anthropic.InputMessage{
            {
                Role: "user",
                ContentBlocks: []anthropic.ContentBlock{
                    {
                        Type: "text",
                        Text: "Is there a living organism on this image?",
                    },
                    {
                        Type: "image",
                        Source: anthropic.ImageSource{
                            Type: anthropic.ImageSourceType,    // "base64"
                            MediaType: anthropic.ImageJPEGMediaType, // "image/jpeg"
                            Data: imgData,
                        },
                    },
                },
            },
        },
        MaxTokens: 1000,
    })

    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(resp.Content[0].Text)
}

Usage of the Official Anthropic Go SDK

The official `anthropic-sdk-go` package from Anthropic offers a structured approach with rich types, functional options, and error-handling capabilities. This SDK provides a cleaner integration experience that follows Go idioms.

Example usage:

go
package main

import (
    "context"
    "fmt"

    "github.com/anthropics/anthropic-sdk-go"
    "github.com/anthropics/anthropic-sdk-go/option"
)

func main() {
    client := anthropic.NewClient(
        option.WithAPIKey("my-anthropic-api-key"), // falls back to the ANTHROPIC_API_KEY environment variable when omitted
    )
    message, err := client.Messages.New(context.TODO(), anthropic.MessageNewParams{
        MaxTokens: 1024,
        Messages: []anthropic.MessageParam{
            anthropic.NewUserMessage(anthropic.NewTextBlock("What is a quaternion?")),
        },
        Model: anthropic.ModelClaudeSonnet4_20250514,
    })
    if err != nil {
        panic(err.Error())
    }
    fmt.Printf("%+v\n", message.Content)
}

This structure separates message content into blocks, supports creation of various message roles, and allows system-level prompts to be added for specific context or behavioral control. The API also supports streaming responses for progressive reading and advanced conversational features like tool orchestration.
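
The error handling mentioned above can be sketched as follows, assuming the official SDK wraps non-success responses in an *anthropic.Error value that can be unwrapped with errors.As (verify the exact type against the SDK's README):

go
// Sketch: inspect an API error returned by the official SDK.
message, err := client.Messages.New(context.TODO(), anthropic.MessageNewParams{
    MaxTokens: 1024,
    Messages: []anthropic.MessageParam{
        anthropic.NewUserMessage(anthropic.NewTextBlock("Hello")),
    },
    Model: anthropic.ModelClaudeSonnet4_20250514,
})
if err != nil {
    var apierr *anthropic.Error
    if errors.As(err, &apierr) {
        // Dump the underlying HTTP request and response for debugging.
        fmt.Println(string(apierr.DumpRequest(true)))
        fmt.Println(string(apierr.DumpResponse(true)))
    }
    panic(err)
}
fmt.Printf("%+v\n", message.Content)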

Conversations Handling

The SDK allows maintaining conversational contexts by appending prior responses to the message stack:

go
messages := []anthropic.MessageParam{
    anthropic.NewUserMessage(anthropic.NewTextBlock("What is my first name?")),
}

message, err := client.Messages.New(context.TODO(), anthropic.MessageNewParams{
    Model: anthropic.ModelClaudeSonnet4_20250514,
    Messages: messages,
    MaxTokens: 1024,
})
if err != nil {
    panic(err)
}

fmt.Printf("%+v\n", message.Content)

messages = append(messages, message.ToParam())
messages = append(messages, anthropic.NewUserMessage(
    anthropic.NewTextBlock("My full name is John Doe"),
))

message, err = client.Messages.New(context.TODO(), anthropic.MessageNewParams{
    Model: anthropic.ModelClaudeSonnet4_20250514,
    Messages: messages,
    MaxTokens: 1024,
})
if err != nil {
    panic(err)
}

fmt.Printf("%+v\n", message.Content)

This methodology preserves context, enabling the AI to give responses grounded in past exchanges.

System Prompts

The Go SDK supports system prompt blocks to instruct or guide the AI's personality or style at the start of conversations:

go
message, err := client.Messages.New(context.TODO(), anthropic.MessageNewParams{
    Model: anthropic.ModelClaudeSonnet4_20250514,
    MaxTokens: 1024,
    System: []anthropic.TextBlockParam{
        {Text: "Be very serious at all times."},
    },
    Messages: messages,
})
if err != nil {
    panic(err)
}
fmt.Printf("%+v\n", message.Content)

This feature is useful for tuning output tone or behavior, reflecting Anthropic's focus on safe and steerable AI.

Streaming Responses

For real-time applications, the package offers streaming endpoints where responses are yielded incrementally:

go
content := "What is a quaternion?"

stream := client.Messages.NewStreaming(context.TODO(), anthropic.MessageNewParams{
    Model: anthropic.ModelClaudeSonnet4_20250514,
    MaxTokens: 1024,
    Messages: []anthropic.MessageParam{
        anthropic.NewUserMessage(anthropic.NewTextBlock(content)),
    },
})

message := anthropic.Message{}
for stream.Next() {
    event := stream.Current()
    err := message.Accumulate(event)
    if err != nil {
        panic(err)
    }
    switch eventVariant := event.AsAny().(type) {
    case anthropic.ContentBlockDeltaEvent:
        switch deltaVariant := eventVariant.Delta.AsAny().(type) {
        case anthropic.TextDelta:
            print(deltaVariant.Text)
        }
    }
}

if stream.Err() != nil {
    panic(stream.Err())
}

This code allows a client to process partial content as it arrives, useful for UI updates or interactive applications.

Tool Use Integration

The SDK permits embedding tools or functions callable by the AI agent during conversations. A tool is defined with a name, description, and input schema, which the AI can invoke to perform tasks beyond text generation.

Example tool usage in conversation loop:

go
toolParams := []anthropic.ToolParam{
    {
        Name: "get_coordinates",
        Description: anthropic.String("Accepts a place as an address, then returns the latitude and longitude coordinates."),
        InputSchema: GetCoordinatesInputSchema,
    },
}
tools := make([]anthropic.ToolUnionParam, len(toolParams))
for i, toolParam := range toolParams {
    tools[i] = anthropic.ToolUnionParam{OfTool: &toolParam}
}

for {
    message, err := client.Messages.New(context.TODO(), anthropic.MessageNewParams{
        Model: anthropic.ModelClaudeSonnet4_20250514,
        MaxTokens: 1024,
        Messages: messages,
        Tools: tools,
    })
    if err != nil {
        panic(err)
    }

    for _, block := range message.Content {
        switch block := block.AsAny().(type) {
        case anthropic.TextBlock:
            println(block.Text)
        case anthropic.ToolUseBlock:
            inputJSON, _ := json.Marshal(block.Input)
            println(block.Name + ": " + string(inputJSON))
        }
    }

    // Handle tool calls and responses...

    messages = append(messages, message.ToParam())
}

This integration enables the creation of AI-assisted workflows that hybridize language processing and external functionalities.
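
The GetCoordinatesInputSchema value referenced above is not defined in the snippet. One way to build it, assuming the third-party github.com/invopop/jsonschema package is used to reflect a Go struct into the tool's input schema (the official SDK's tool-use examples follow a similar pattern), is:

go
// Assumes: import "github.com/invopop/jsonschema"
// Hypothetical input struct for the get_coordinates tool shown above.
type GetCoordinatesInput struct {
    Location string `json:"location" jsonschema_description:"Address or place name to geocode."`
}

var GetCoordinatesInputSchema = GenerateSchema[GetCoordinatesInput]()

// GenerateSchema reflects a Go struct into the SDK's tool input schema type.
func GenerateSchema[T any]() anthropic.ToolInputSchemaParam {
    reflector := jsonschema.Reflector{
        AllowAdditionalProperties: false,
        DoNotReference:            true,
    }
    var v T
    schema := reflector.Reflect(v)
    return anthropic.ToolInputSchemaParam{
        Properties: schema.Properties,
    }
}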

Request Field Semantics

The request fields in the SDK use advanced Go serialization semantics introduced in Go 1.24, such as "omitzero" tags that control the inclusion of zero values in JSON requests. Required fields are always serialized, even when zero-valued, while optional fields can be omitted if they have zero values. This distinction allows precise control of API payload content for different scenarios.
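
As an illustration, an omitted optional field simply stays out of the JSON payload, while required fields are always serialized. The Temperature field below is assumed to be an optional parameter set through a helper constructor analogous to the anthropic.String helper used earlier:

go
// Optional fields are left out of the request body unless explicitly set.
params := anthropic.MessageNewParams{
    Model:     anthropic.ModelClaudeSonnet4_20250514, // required: always serialized
    MaxTokens: 1024,                                  // required: always serialized
    Messages: []anthropic.MessageParam{
        anthropic.NewUserMessage(anthropic.NewTextBlock("Hello")),
    },
    // Optional field set via a helper constructor (assumed); if this line is
    // removed, "temperature" is omitted from the JSON request entirely.
    Temperature: anthropic.Float(0.2),
}
message, err := client.Messages.New(context.TODO(), params)
if err != nil {
    panic(err)
}
fmt.Printf("%+v\n", message.Content)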

Conclusion

The Anthropic package for Go is a powerful and flexible tool for integrating Anthropic's AI language models in Go applications. It supports a wide range of functionality from simple text completions to more complex workflows involving streaming, images, tool use, and conversational state management. It is suitable for developers aiming to build advanced AI-powered conversational agents, research applications, or interactive systems with Anthropic's Claude language models. The library is under active development with expanding features and serves as a key means for Go programmers to leverage Anthropic's AI technologies.

All usage patterns require secure API key management and familiarity with Go programming principles and idioms. The package documentation and example code provide a good starting point for adoption and experimentation with Anthropic AI services in Go environments.

This detailed overview captures the essence, installation, core usage, advanced features, and integration patterns of the Anthropic Go SDK ecosystem as it stands based on current public resources and official SDK references.