Quick Start
Get started with Demeterics in 2 minutes. This guide shows you how to make your first LLM API call using Groq—with automatic usage tracking, credit billing, and BigQuery analytics.
1. Get Your API Key
- Sign in to demeterics.com
- Navigate to API Keys in the left sidebar
- Click Create API Key
- Copy your key (it starts with dmt_)
Important: Your API key will only be shown once. Store it securely!
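The examples below hardcode the key for clarity. In real code, a common pattern is to read it from an environment variable instead. A minimal sketch; DEMETERICS_API_KEY is an illustrative variable name, not one Demeterics requires:

import os

# Read the key from the environment instead of hardcoding it in source.
# DEMETERICS_API_KEY is an illustrative name, not one required by Demeterics.
api_key = os.environ["DEMETERICS_API_KEY"]
assert api_key.startswith("dmt_"), "Expected a Demeterics API key (dmt_...)"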
2. Make Your First API Call
Demeterics provides OpenAI-compatible reverse proxy endpoints. Use your Demeterics API key, and we'll handle the rest—vendor authentication, usage tracking, and credit billing.
Example: Chat with Groq's Llama 3.3
curl -X POST https://api.demeterics.com/groq/v1/chat/completions \
  -H "Authorization: Bearer dmt_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.3-70b-versatile",
    "messages": [
      {
        "role": "user",
        "content": "What are three interesting facts about quantum computing?"
      }
    ]
  }'
Response
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1234567890,
  "model": "llama-3.3-70b-versatile",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Here are three interesting facts about quantum computing:\n\n1. **Superposition**: Quantum computers can process multiple possibilities simultaneously...\n2. **Quantum Entanglement**: Qubits can be linked in ways that classical bits cannot...\n3. **Exponential Speedup**: For certain problems, quantum computers can solve in seconds what would take classical computers millennia..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 15,
    "completion_tokens": 120,
    "total_tokens": 135
  }
}
That's it! Your interaction is now:
- ✅ Tracked in your Demeterics dashboard
- ✅ Stored in BigQuery for analytics
- ✅ Billed from your Stripe credit balance
- ✅ Available for export and reporting
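The token counts in the usage block are what per-token billing is based on, so they are worth reading programmatically. A minimal sketch that hits the same Groq route as the curl example above using plain HTTP; the requests library is an assumption here, any HTTP client works:

import requests

# Call the same Groq route as above and inspect the usage block in the response.
resp = requests.post(
    "https://api.demeterics.com/groq/v1/chat/completions",
    headers={"Authorization": "Bearer dmt_your_api_key_here"},
    json={
        "model": "llama-3.3-70b-versatile",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
)
resp.raise_for_status()
usage = resp.json()["usage"]
print(usage["prompt_tokens"], usage["completion_tokens"], usage["total_tokens"])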
3. View Your Interactions
- Go to demeterics.com/interactions
- See your API call with full request/response details
- Filter by model, date range, or user
- Export to JSON/CSV for analysis
4. Use with Your Favorite SDK
Demeterics is compatible with all OpenAI SDKs. Just change the base URL:
Python (OpenAI SDK)
from openai import OpenAI

# Initialize client with Demeterics endpoint
client = OpenAI(
    base_url="https://api.demeterics.com/groq/v1",
    api_key="dmt_your_api_key_here"
)

# Make a chat completion request
response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",
    messages=[
        {"role": "user", "content": "Explain machine learning in simple terms"}
    ]
)

print(response.choices[0].message.content)
Node.js (OpenAI SDK)
import OpenAI from 'openai';

// Point the OpenAI SDK at the Demeterics Groq route
const client = new OpenAI({
  baseURL: 'https://api.demeterics.com/groq/v1',
  apiKey: 'dmt_your_api_key_here'
});

const response = await client.chat.completions.create({
  model: 'llama-3.3-70b-versatile',
  messages: [
    { role: 'user', content: 'What is the best way to learn programming?' }
  ]
});

console.log(response.choices[0].message.content);
Go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func main() {
	// Point the OpenAI Go SDK at the Demeterics Groq route.
	client := openai.NewClient(
		option.WithBaseURL("https://api.demeterics.com/groq/v1"),
		option.WithAPIKey("dmt_your_api_key_here"),
	)

	response, err := client.Chat.Completions.New(context.Background(), openai.ChatCompletionNewParams{
		Model: openai.F("llama-3.3-70b-versatile"),
		Messages: openai.F([]openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("What is Docker?"),
		}),
	})
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(response.Choices[0].Message.Content)
}
5. Try Other Providers
Demeterics supports Groq, OpenAI, Anthropic, and Gemini. Just change the base URL:
| Provider | Base URL | Example Model |
|---|---|---|
| Groq | https://api.demeterics.com/groq/v1 | llama-3.3-70b-versatile |
| OpenAI | https://api.demeterics.com/openai/v1 | gpt-4o, gpt-4o-mini |
| Anthropic | https://api.demeterics.com/anthropic/v1 | anthropic/claude-sonnet-4.5 |
| Gemini | https://api.demeterics.com/gemini/v1 | gemini-2.0-flash-exp |
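Because the routes differ only in the provider segment of the path, a small helper can switch providers without touching your request code. A minimal sketch with the OpenAI Python SDK, assuming the OpenAI-compatible chat completions shape shown above for the Groq, OpenAI, and Gemini routes (the Anthropic example below uses its native Messages endpoint instead):

from openai import OpenAI

DEMETERICS_KEY = "dmt_your_api_key_here"  # placeholder

def demeterics_client(provider: str) -> OpenAI:
    """Build an OpenAI-SDK client for one of the Demeterics provider routes."""
    return OpenAI(
        base_url=f"https://api.demeterics.com/{provider}/v1",
        api_key=DEMETERICS_KEY,
    )

# Same Demeterics key, different upstream provider and model.
groq = demeterics_client("groq")
gemini = demeterics_client("gemini")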
Example: Switch to Claude
curl -X POST https://api.demeterics.com/anthropic/v1/messages \
  -H "Authorization: Bearer dmt_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4.5",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Write a haiku about coding"}
    ]
  }'
6. Bring Your Own Key (BYOK)
Don't want to use Demeterics credits? Store your own vendor API keys:
- Go to Settings → API Keys
- Add your Groq/OpenAI/Anthropic/Gemini API key
- Make API calls as usual—Demeterics will use your key instead of billing credits
BYOK Benefits:
- ✅ No credit charges (use your vendor billing)
- ✅ Full usage tracking and analytics
- ✅ BigQuery storage for compliance
- ✅ Export and reporting tools
Next Steps
- API Reference - Full endpoint documentation
- Dashboard - View analytics and usage metrics
- Credits - Manage your Stripe credit balance
- AI Chat Widget - Embed chat on your website
Common Questions
What's the difference between my Demeterics API key and a vendor API key?
- Demeterics API key (dmt_...): Used for all API calls. Demeterics handles vendor authentication, billing, and tracking.
- Vendor API key (Groq/OpenAI/etc.): Optional. Store in Settings → API Keys for BYOK mode (zero billing).
How are credits charged?
Demeterics charges per token based on the model's pricing:
- Groq: ~$0.000001/token (input), ~$0.000002/token (output)
- OpenAI: ~$0.00003/token (GPT-4o input), ~$0.00006/token (output)
- Anthropic: ~$0.000003/token (Claude input), ~$0.000015/token (output)
See Pricing for exact rates. You can also use BYOK mode for zero billing.
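As a rough illustration using the approximate Groq rates above, the 135-token call from step 2 (15 prompt tokens, 120 completion tokens) would cost about 15 × $0.000001 + 120 × $0.000002 ≈ $0.000255:

# Back-of-the-envelope estimate with the approximate Groq rates listed above.
prompt_tokens, completion_tokens = 15, 120
input_rate, output_rate = 0.000001, 0.000002  # USD per token, approximate
print(prompt_tokens * input_rate + completion_tokens * output_rate)  # ~0.000255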
Can I use streaming?
Yes! All providers support streaming. Add "stream": true to your request:
curl -X POST https://api.demeterics.com/groq/v1/chat/completions \
  -H "Authorization: Bearer dmt_your_api_key_here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.3-70b-versatile",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
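With the OpenAI Python SDK, the same flag turns the response into an iterator of chunks; the generated text arrives in each chunk's delta. A minimal sketch:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.demeterics.com/groq/v1",
    api_key="dmt_your_api_key_here",
)

# stream=True yields chunks as they are generated; print each delta as it arrives.
stream = client.chat.completions.create(
    model="llama-3.3-70b-versatile",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)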
Where is my data stored?
- Interactions: BigQuery (your GCP project or Demeterics-managed)
- API Keys: Google Cloud Datastore (encrypted with KMS)
- Credit Balance: Stripe (PCI-compliant)
All data is encrypted at rest and in transit. See Security for details.
How do I monitor my usage?
- Dashboard: Real-time metrics at demeterics.com/dashboard
- Interactions: Detailed logs at demeterics.com/interactions
- Exports: Download JSON/CSV via /api/v1/exports (see the sketch below)
- BigQuery: Direct SQL queries for advanced analytics
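For exports, something like the following may work; the host, the Bearer auth, and the format parameter are assumptions rather than documented behavior, so check the API Reference for the exact contract:

import requests

# Illustrative only: the host, auth header, and "format" parameter are assumptions,
# not confirmed by this guide; /api/v1/exports is the path referenced above.
resp = requests.get(
    "https://demeterics.com/api/v1/exports",
    headers={"Authorization": "Bearer dmt_your_api_key_here"},
    params={"format": "csv"},
)
resp.raise_for_status()
with open("interactions.csv", "wb") as f:
    f.write(resp.content)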
Ready to build? Grab your API key and start tracking LLM interactions today! 🚀