
[REQUEST] Support Multiple API Keys for Inference #79

@strikeoncmputrz

First, thank you for the awesome tool!

Describe the solution you'd like
I would like to let a small number of users use the chat and chat/completions endpoints directly, giving each of them a different API key for access control and monitoring. The wording "Share these keys with guests to your API." implies this is possible, but I couldn't create additional keys or find any other reference to multiple keys in the documentation.
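For illustration, here is a rough sketch of the behaviour I'm after (not tabbyAPI's actual auth code; the header name, config layout, and function names are assumptions): any key from a configured list is accepted, and the request is attributed to that key's owner.

```python
# Hypothetical sketch: accept any of several configured API keys and
# remember which key was used, so requests can be attributed per key.
from fastapi import Header, HTTPException

# Imagined mapping of key -> owner, e.g. loaded from the config file.
API_KEYS = {
    "key-for-user-1": "user1",
    "key-for-user-2": "user2",
}

async def authenticate(x_api_key: str = Header(...)) -> str:
    """Return the owner of the presented key, or reject the request."""
    owner = API_KEYS.get(x_api_key)
    if owner is None:
        raise HTTPException(status_code=401, detail="Invalid API key")
    return owner
```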

Describe alternatives you've considered
I tried adding multiple API keys, but tabbyAPI only appears to accept the last one in the list; the other keys were treated as invalid. I've also considered proxying the traffic through nginx, but that would add unnecessary complexity.
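For what it's worth, this might just be how the config is parsed rather than an intentional limit: if extra keys are added by repeating the same field, a YAML loader typically keeps only the last value. A quick illustration of that assumption with PyYAML (I haven't confirmed this against tabbyAPI's loader):

```python
# Assumption: duplicate mapping keys in YAML collapse to the last value,
# which would explain why only the final key authenticates.
import yaml

doc = "api_key: first-key\napi_key: second-key\n"
print(yaml.safe_load(doc))  # {'api_key': 'second-key'} -- earlier keys are dropped
```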

Why should this feature be added?
This would enable tabbyAPI to support multiple users with per-key access. It could also pave the way for additional role-based access control.

Examples
User 1 submits inference requests, while User 2 submits 10x more. Because each user has a different API key, the admin can easily identify which one is responsible for the extra load.
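To make the monitoring side concrete, here is a minimal sketch (names are hypothetical) of per-key usage counting once requests are attributed to a key:

```python
# Hypothetical sketch of per-key usage tracking: once each request is
# attributed to an API key, a simple counter is enough to spot the
# heavier user from the example above.
from collections import Counter

usage = Counter()

def record_request(key_owner: str) -> None:
    """Increment the request count for whichever key made the call."""
    usage[key_owner] += 1

# Simulate the example: user2 sends 10x the traffic of user1.
record_request("user1")
for _ in range(10):
    record_request("user2")

print(usage.most_common())  # [('user2', 10), ('user1', 1)]
```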


    Labels

    enhancement (New feature or request), low priority (Good issue, but isn't necessary to add right away)
