Swift interface for X.AI's Grok models, part of the LangTools framework. The X.AI module uses OpenAI's request and response formats, allowing for easy integration with existing OpenAI-style code.
- Model Selection:

```swift
// Available X.AI models
.grok        // "grok-2-1212"
.grokVision  // "grok-2-vision-1212"
```
- Initialization:

```swift
let xai = XAI(apiKey: "your-api-key")
```
- Base URL:
  - Default base URL is `https://api.x.ai/v1/`
  - Uses X.AI's API infrastructure
- Request Limitations:
  - Some OpenAI-specific parameters may not be available
  - Vision features are only available with the grokVision model (see the vision sketch after the request example below)
Since X.AI uses OpenAI's request format, you can use the same request structure:
```swift
let request = OpenAI.ChatCompletionRequest(
    model: XAI.Model.grok, // Use Grok model
    messages: [
        Message(role: .user, content: "Tell me about AI.")
    ]
)

let response = try await xai.perform(request: request)
```
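Because vision input is restricted to grokVision, an image-based request only needs the model swapped. The sketch below assumes the same request structure as above; the image itself would be attached using the OpenAI module's message content format, which is left as a comment here rather than guessing at its exact types.

```swift
// Minimal sketch of a vision-oriented request (model choice only).
// grokVision is required for image input; grok is text-only.
let visionRequest = OpenAI.ChatCompletionRequest(
    model: XAI.Model.grokVision, // required for vision features
    messages: [
        Message(role: .user, content: "Describe the attached image.")
        // An image would be added here as a content part, using the
        // message content types documented in the OpenAI module.
    ]
)

let visionResponse = try await xai.perform(request: visionRequest)
```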
X.AI errors have their own type:
```swift
do {
    let response = try await xai.perform(request: request)
} catch let error as XAIErrorResponse {
    print("X.AI API error:", error.error.message)
}
```
Refer to the OpenAI Module Documentation for detailed information about:
- Chat completion requests
- Streaming
- Message formats
- Response handling
- Best practices
The same patterns apply to X.AI; just substitute the Grok models and the XAI initialization shown above.
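As a quick end-to-end sketch combining the pieces shown above (initialization, a chat completion request, and error handling), assuming an async context such as the hypothetical `askGrok()` function below:

```swift
import LangTools // import the OpenAI/XAI products of the LangTools package as your project requires

// Hypothetical helper tying the examples above together.
func askGrok() async {
    let xai = XAI(apiKey: "your-api-key")

    let request = OpenAI.ChatCompletionRequest(
        model: XAI.Model.grok,
        messages: [Message(role: .user, content: "Tell me about AI.")]
    )

    do {
        let response = try await xai.perform(request: request)
        // Inspect the response using the OpenAI module's response types
        // (see the OpenAI Module Documentation referenced above).
        print(response)
    } catch let error as XAIErrorResponse {
        print("X.AI API error:", error.error.message)
    } catch {
        print("Unexpected error:", error)
    }
}
```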
See also: