AI Credits
SetGet AI uses a credit-based system on cloud-hosted instances to manage usage fairly across all users. Each AI interaction consumes a certain number of credits depending on the complexity of the request. Self-hosted instances bypass the credit system entirely by using your own LLM API key.
How credits work
Every time you interact with SetGet AI — whether asking a question, requesting an analysis, or proposing an action — the system consumes credits from your workspace's allocation.
Credits are consumed at the workspace level, meaning all members of a workspace share the same credit pool. This encourages teams to use AI efficiently while ensuring fair access.
Credit consumption model
Credits are consumed based on the type and complexity of the interaction:
| Interaction type | Approximate credit cost | Description |
|---|---|---|
| Simple question | 1 credit | Short factual query (e.g., "How many open items?") |
| Data analysis | 2-3 credits | Queries that scan multiple items or compare data |
| Summarization | 2-4 credits | Generating summaries of cycles, projects, or progress |
| Action proposal (single item) | 1-2 credits | Creating or updating one work item |
| Action proposal (bulk) | 3-5 credits | Operations that affect multiple items |
| Complex multi-turn analysis | 3-6 credits | Follow-up questions in a long thread with heavy context |
TIP
Credit costs are approximate and depend on the amount of data retrieved and the length of the AI response. Simpler prompts with focused scope consume fewer credits.
What counts as a credit charge
- Each message you send to the AI consumes credits.
- The AI's response is part of the same credit charge — you are not charged separately for the response.
- Confirming or rejecting a proposed action does not consume additional credits. The credit was already consumed when the action was proposed.
- Viewing thread history does not consume credits. Only new messages incur a charge.
What does NOT consume credits
- Browsing the AI panel
- Opening or closing threads
- Reviewing past conversations
- Canceling a proposed action
Credit balance tracking
Workspace administrators can monitor credit usage from the workspace settings.
Viewing your balance
1. Go to Settings in the sidebar.
2. Navigate to AI (or Billing > AI Usage on cloud plans).
3. View the current credit balance, usage history, and renewal date.
The dashboard shows:
| Metric | Description |
|---|---|
| Credits remaining | How many credits are left in the current period |
| Credits used | Total credits consumed in the current billing cycle |
| Usage by member | Breakdown of credit consumption per workspace member |
| Usage by day | Daily credit consumption chart |
| Renewal date | When the credit balance resets |
Usage alerts
Workspace admins receive notifications when:
- 75% of credits have been consumed — an early warning to manage usage.
- 90% of credits have been consumed — a more urgent alert.
- 100% of credits have been consumed — AI features are paused until renewal.
WARNING
When credits are fully consumed, AI Chat and AI-powered features become unavailable until the next billing cycle or until credits are manually replenished (if your plan supports it).
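The alert thresholds above can be expressed as a simple check. This is an illustrative sketch only — the helper below is hypothetical and not part of any SetGet API:

```python
def alerts_crossed(credits_used, monthly_allocation):
    """Return which usage-alert thresholds (75%, 90%, 100%) have been crossed.

    Hypothetical helper illustrating the alert thresholds described above;
    not part of SetGet's actual code.
    """
    ratio = credits_used / monthly_allocation
    return [t for t in (0.75, 0.90, 1.00) if ratio >= t]
```

For example, a Pro workspace (500 monthly credits) that has consumed 460 credits has crossed the 75% and 90% thresholds, so its admins would have received both the early warning and the urgent alert, but AI features remain available.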
Credit limits by plan
Credit allocations vary by plan tier. All credit counts below are per workspace per month:
| Plan | Monthly credits | Overage option | Notes |
|---|---|---|---|
| Free | 50 | Not available | Basic AI access for evaluation |
| Pro | 500 | Purchase additional | Suitable for small to medium teams |
| Business | 2,000 | Purchase additional | Priority processing, higher rate limits |
| Enterprise | Custom | Custom agreement | Dedicated allocation, SLA included |
Purchasing additional credits
On Pro and Business plans, workspace admins can purchase additional credit packs:
1. Go to Settings > Billing > AI Usage.
2. Click Purchase Credits.
3. Select a credit pack size.
4. Complete the purchase.
Additional credits are available immediately and do not expire at the end of the billing cycle. They are consumed only after the monthly allocation is fully used.
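The consumption order described above — monthly allocation first, purchased packs second — can be sketched as follows (an illustrative model, not SetGet's actual implementation):

```python
def deduct_credits(monthly_remaining, pack_remaining, cost):
    """Deduct a credit charge, drawing from the monthly allocation first
    and from purchased credit packs only once the allocation is exhausted.

    Illustrative sketch of the documented consumption order;
    not SetGet's actual code.
    """
    from_monthly = min(monthly_remaining, cost)
    from_pack = cost - from_monthly
    if from_pack > pack_remaining:
        raise ValueError("insufficient credits: AI features are paused")
    return monthly_remaining - from_monthly, pack_remaining - from_pack
```

For example, with 3 monthly credits left and a 100-credit pack, a 5-credit charge exhausts the monthly allocation and draws the remaining 2 credits from the pack, leaving (0, 98).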
Plan upgrade considerations
If you consistently run out of credits before the end of the billing cycle, consider:
- Upgrading your plan — Higher-tier plans include significantly more credits.
- Purchasing credit packs — A good option if overages are occasional.
- Optimizing usage — Review the efficiency tips below to reduce credit consumption per interaction.
- Moving to self-hosted — Self-hosted instances have no credit limits.
Self-hosted instances
If you run SetGet on your own infrastructure, the credit system does not apply. Instead, you configure your own LLM API key, and usage is billed directly by your LLM provider.
Configuring your API key
1. Go to Settings > AI in your self-hosted instance.
2. Enter your LLM provider API key.
3. Select the model to use (if your provider offers multiple models).
4. Save the configuration.
Supported providers
Self-hosted SetGet supports connecting to several LLM backends, including any OpenAI-compatible API endpoint:
| Provider | Configuration |
|---|---|
| OpenAI | Standard API key, model selection (GPT-4, etc.) |
| Azure OpenAI | Endpoint URL, API key, deployment name |
| Anthropic | API key, model selection (Claude, etc.) |
| Self-hosted LLMs | Any OpenAI-compatible endpoint (e.g., vLLM, Ollama) |
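Providers in the last row expose the standard OpenAI chat-completions route, so a client only needs a base URL, an API key, and a model name. The sketch below shows the request shape; the endpoint and model values are illustrative examples (here, a local Ollama server with its OpenAI-compatible API), not values SetGet ships with:

```python
import json

def build_chat_request(base_url, api_key, model, prompt):
    """Assemble (but do not send) an OpenAI-compatible chat-completions request."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Illustrative values: a local Ollama server exposing the OpenAI-compatible API.
url, headers, body = build_chat_request(
    "http://localhost:11434", "unused-key", "llama3", "Summarize Sprint 12 progress"
)
```

Any endpoint that accepts this request shape — vLLM, Ollama, or a hosted provider's compatibility layer — can serve as the backend.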
Advantages of self-hosted AI
- No credit limits — Use AI as much as you need, limited only by your provider's rate limits.
- Data sovereignty — Your workspace data is sent only to your chosen LLM provider.
- Model choice — Select the model that best fits your needs and budget.
- Cost control — Pay your LLM provider directly at their published rates.
TIP
For self-hosted instances, we recommend setting up usage monitoring with your LLM provider to track costs. SetGet logs all AI interactions in the admin panel for auditability.
Managing credit usage efficiently
Whether you are on a credit-limited cloud plan or managing LLM costs on a self-hosted instance, these practices help optimize your AI usage:
- Be specific — Focused prompts consume fewer tokens and therefore fewer credits than vague, open-ended questions.
- Use context — Navigate to the relevant project or cycle before chatting so the AI does not need to search across the entire workspace.
- Batch related questions — Ask multiple related questions in the same thread rather than starting new threads for each one.
- Use filters first — Apply view filters to narrow down data before asking the AI to analyze it.
- Prefer summaries over listings — Asking "Summarize sprint progress" is usually cheaper than "List every item in this sprint with all details."
Credit usage examples
Here are concrete examples of how credits are consumed for typical interactions:
| Prompt | Credits | Why |
|---|---|---|
| "How many open items are in the WEB project?" | 1 | Simple count query, minimal data retrieval |
| "List all high-priority bugs assigned to me" | 1-2 | Filtered query with moderate data |
| "Summarize Sprint 12 progress with completion breakdown" | 3 | Multi-item scan and summary generation |
| "Create a work item titled 'Fix header layout' in the WEB project" | 1 | Single action proposal |
| "Compare velocity across the last 4 sprints" | 4 | Multi-cycle data retrieval and analysis |
| "Update all Todo items in API project to In Progress" | 4 | Bulk action scanning multiple items |
| "Write a detailed sprint retrospective for Sprint 11" | 5 | Heavy data scan and long-form generation |
Understanding token consumption
Under the hood, credits map to the number of tokens processed by the LLM. Tokens include:
- Input tokens — Your message, the system context, and retrieved workspace data.
- Output tokens — The AI's response text and any action proposals.
Larger prompts with broader context (e.g., querying across all projects) consume more input tokens. Longer responses consume more output tokens. Both contribute to the total credit cost.
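To make that mapping concrete, assume a hypothetical rate of one credit per 1,000 combined tokens — the actual rate is internal to SetGet and may differ:

```python
import math

def estimate_credit_cost(input_tokens, output_tokens, tokens_per_credit=1000):
    """Estimate a credit charge from token counts.

    The 1-credit-per-1,000-tokens rate is a hypothetical assumption for
    illustration only; SetGet's actual token-to-credit mapping is internal.
    """
    total = input_tokens + output_tokens
    return max(1, math.ceil(total / tokens_per_credit))
```

Under this assumed rate, a focused question (roughly 600 input + 200 output tokens) costs 1 credit, while a broad cross-project analysis (roughly 3,000 input + 800 output tokens) costs 4 — which is why narrowing the scope of a prompt lowers its cost.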
Frequently asked questions
Do credits roll over to the next month?
Monthly allocation credits do not roll over. Purchased additional credit packs do persist until used.
Can I transfer credits between workspaces?
No. Credits are allocated per workspace and cannot be transferred.
What happens when I run out of credits mid-conversation?
The current thread remains accessible for reading, but you cannot send new messages until credits are available.
Can individual members have their own credit limits?
Not currently. Credits are shared at the workspace level. Admins can monitor per-member usage but cannot set individual caps.
Related pages
- SetGet AI Overview — General AI capabilities and architecture
- AI Chat — Using the thread-based AI conversation interface
- AI Actions — Understanding action proposals and confirmations
- Admin Settings — Workspace administration and billing