To ensure platform stability and fair usage, the Givebutter API implements rate limiting. Rate limits prevent any single integration from consuming excessive resources and ensure reliable API access for all users.

Rate Limit Tiers

Different rate limits apply based on your account tier:
Tier       | Requests per Minute | Requests per Hour | Requests per Day
-----------|---------------------|-------------------|-----------------
Standard   | 60                  | 3,000             | 50,000
Enterprise | 300                 | 15,000            | 250,000
Need higher limits? Contact [email protected] to discuss Enterprise tier options with increased rate limits.

How Rate Limiting Works

Rate limits are applied per API key and are calculated using a sliding window:
1. Request Counted: Each API request is counted against your rate limit.
2. Window Check: The API checks whether you’ve exceeded the limit in the current time window (per minute, hour, or day).
3. Request Processed or Rejected: If you are within the limit, the request is processed normally. If you have exceeded it, the request is rejected with a 429 status code.
4. Window Slides: As time passes, older requests fall out of the window and your available quota increases.
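
The sliding-window model can be pictured with a small sketch. The class below is illustrative only (it is not Givebutter's actual server-side implementation); it keeps timestamps of recent requests and rejects a new one once the count in the current window reaches the limit:

class SlidingWindowCounter {
  constructor(limit, windowMs) {
    this.limit = limit;       // e.g. 60 requests
    this.windowMs = windowMs; // e.g. 60 * 1000 ms for the per-minute window
    this.timestamps = [];     // timestamps of requests already counted
  }

  // Returns true if the request is within the limit (processed),
  // false if it would be rejected with a 429.
  tryRequest(now = Date.now()) {
    // Older requests fall out of the window as time passes
    this.timestamps = this.timestamps.filter(t => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) {
      return false;
    }
    this.timestamps.push(now);
    return true;
  }
}

// Example: a 60-requests-per-minute window
const perMinute = new SlidingWindowCounter(60, 60 * 1000);
console.log(perMinute.tryRequest()); // true while under the limit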

Rate Limit Headers

Every API response includes headers showing your current rate limit status:
Header                | Description                                | Example
----------------------|--------------------------------------------|------------
X-RateLimit-Limit     | Maximum requests allowed in current window | 60
X-RateLimit-Remaining | Requests remaining in current window       | 45
X-RateLimit-Reset     | Unix timestamp when the rate limit resets  | 1704067200
curl -i "https://api.givebutter.com/v1/campaigns" \
  -H "Authorization: Bearer YOUR_API_KEY"

# Response headers:
# X-RateLimit-Limit: 60
# X-RateLimit-Remaining: 59
# X-RateLimit-Reset: 1704067200

Rate Limit Exceeded (429 Error)

When you exceed your rate limit, you’ll receive a 429 Too Many Requests response:
{
  "error": {
    "type": "rate_limit_error",
    "message": "Rate limit exceeded. Please wait before making another request.",
    "code": 429
  }
}
The Retry-After header tells you how many seconds to wait before making another request.
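For example, a minimal check might read the header and pause before retrying (a sketch only; a fuller retry pattern with backoff is shown in the next section, and the 60-second fallback is an assumption, not a documented default):
const response = await fetch('https://api.givebutter.com/v1/campaigns', {
  headers: { 'Authorization': 'Bearer YOUR_API_KEY' }
});

if (response.status === 429) {
  // Fall back to 60 seconds if the header is missing (assumed default)
  const retryAfter = parseInt(response.headers.get('Retry-After') || '60', 10);
  console.log(`Rate limited. Retrying in ${retryAfter}s...`);
  await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
}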

Handling Rate Limits

Exponential Backoff

Implement exponential backoff when you hit rate limits:
async function fetchWithRateLimit(url, options, maxRetries = 3) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const response = await fetch(url, options);

    // Success
    if (response.ok) {
      return await response.json();
    }

    // Rate limited
    if (response.status === 429) {
      if (attempt < maxRetries) {
        // Use Retry-After header if available
        const retryAfter = response.headers.get('Retry-After');
        const delay = retryAfter
          ? parseInt(retryAfter) * 1000
          : Math.pow(2, attempt) * 1000; // Exponential: 1s, 2s, 4s

        console.log(`Rate limited. Waiting ${delay / 1000}s...`);
        await new Promise(resolve => setTimeout(resolve, delay));
        continue;
      }
    }

    // Other errors
    const error = await response.json();
    throw new Error(error.error.message);

  }

  throw new Error('Max retries exceeded');
}

// Usage
const data = await fetchWithRateLimit(
  'https://api.givebutter.com/v1/campaigns',
  { headers: { 'Authorization': 'Bearer YOUR_API_KEY' } }
);

Monitor Rate Limit Usage

Track your rate limit usage to avoid hitting limits:
class RateLimitMonitor {
  constructor() {
    this.limit = null;
    this.remaining = null;
    this.resetTime = null;
  }

  updateFromResponse(response) {
    this.limit = parseInt(response.headers.get('X-RateLimit-Limit'));
    this.remaining = parseInt(response.headers.get('X-RateLimit-Remaining'));
    this.resetTime = parseInt(response.headers.get('X-RateLimit-Reset'));
  }

  shouldWait() {
    // If less than 10% remaining, consider waiting
    return this.remaining < this.limit * 0.1;
  }

  getWaitTime() {
    const now = Math.floor(Date.now() / 1000);
    return Math.max(0, this.resetTime - now);
  }

  getStatus() {
    const percentUsed = (((this.limit - this.remaining) / this.limit) * 100).toFixed(1);
    return {
      limit: this.limit,
      remaining: this.remaining,
      percentUsed: `${percentUsed}%`,
      resetsIn: `${this.getWaitTime()}s`,
    };
  }
}

// Usage
const monitor = new RateLimitMonitor();

async function fetchCampaigns() {
  const response = await fetch('https://api.givebutter.com/v1/campaigns', {
    headers: { Authorization: 'Bearer YOUR_API_KEY' },
  });

  monitor.updateFromResponse(response);
  console.log('Rate limit status:', monitor.getStatus());

  if (monitor.shouldWait()) {
    console.warn('Approaching rate limit. Consider slowing down requests.');
  }

  return await response.json();
}

Best Practices

Cache API responses to reduce the number of requests:
const cache = new Map();
const CACHE_TTL = 5 * 60 * 1000; // 5 minutes

async function getCachedCampaigns() {
  const cacheKey = 'campaigns';
  const cached = cache.get(cacheKey);

  // Return cached data if still fresh
  if (cached && Date.now() - cached.timestamp < CACHE_TTL) {
    console.log('Returning cached data');
    return cached.data;
  }

  // Fetch fresh data
  const response = await fetch('https://api.givebutter.com/v1/campaigns', {
    headers: { 'Authorization': 'Bearer YOUR_API_KEY' }
  });
  const data = await response.json();

  // Update cache
  cache.set(cacheKey, {
    data,
    timestamp: Date.now()
  });

  return data;
}
Instead of repeatedly polling endpoints, use webhooks for real-time updates.

❌ Don’t do this (wastes rate limit):
// Polling every 10 seconds = 360 requests per hour
setInterval(async () => {
  const response = await fetch('https://api.givebutter.com/v1/transactions?page=1');
  const data = await response.json();
  checkForNewTransactions(data.data);
}, 10000);
✅ Do this (efficient):
// Set up webhook endpoint to receive real-time notifications
app.post('/webhooks/givebutter', (req, res) => {
  const event = req.body;
  if (event.type === 'transaction.created') {
    handleNewTransaction(event.data);
  }
  res.json({ received: true });
});
When fetching multiple resources, space the requests out so you stay within the per-minute limit:
// Add delay between requests to stay within limits
async function fetchMultipleCampaigns(campaignIds) {
  const results = [];
  const delayMs = 1000; // 1 second = 60 requests per minute max

  for (const id of campaignIds) {
    const response = await fetch(
      `https://api.givebutter.com/v1/campaigns/${id}`,
      { headers: { 'Authorization': 'Bearer YOUR_API_KEY' } }
    );
    results.push(await response.json());

    // Delay before next request (except for last one)
    if (id !== campaignIds[campaignIds.length - 1]) {
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }

  return results;
}
When paginating, use appropriate page sizes to minimize requests:
// ❌ Bad - many small requests
async function getAllCampaigns() {
  let page = 1;
  const campaigns = [];

  while (true) {
    const response = await fetch(
      `https://api.givebutter.com/v1/campaigns?page=${page}&per_page=10`, // Small pages
      { headers: { 'Authorization': 'Bearer YOUR_API_KEY' } }
    );
    const data = await response.json();
    campaigns.push(...data.data);

    if (!data.links.next) break;
    page++;
  }

  return campaigns;
}

// ✅ Good - fewer large requests
async function getAllCampaigns() {
  const campaigns = [];
  let nextUrl = 'https://api.givebutter.com/v1/campaigns?per_page=100'; // Max size

  while (nextUrl) {
    const response = await fetch(nextUrl, {
      headers: { 'Authorization': 'Bearer YOUR_API_KEY' }
    });
    const data = await response.json();
    campaigns.push(...data.data);
    nextUrl = data.links.next;
  }

  return campaigns;
}
Queue requests to automatically stay within rate limits:
class RateLimiter {
  constructor(requestsPerMinute = 60) {
    this.requestsPerMinute = requestsPerMinute;
    this.queue = [];
    this.processing = false;
  }

  async request(url, options) {
    return new Promise((resolve, reject) => {
      this.queue.push({ url, options, resolve, reject });
      this.processQueue();
    });
  }

  async processQueue() {
    if (this.processing || this.queue.length === 0) return;

    this.processing = true;
    const delayMs = (60 * 1000) / this.requestsPerMinute;

    while (this.queue.length > 0) {
      const { url, options, resolve, reject } = this.queue.shift();

      try {
        const response = await fetch(url, options);
        const data = await response.json();
        resolve(data);
      } catch (error) {
        reject(error);
      }

      // Wait before processing next request
      if (this.queue.length > 0) {
        await new Promise(r => setTimeout(r, delayMs));
      }
    }

    this.processing = false;
  }
}

// Usage
const limiter = new RateLimiter(60); // 60 requests per minute

async function fetchCampaign(id) {
  return limiter.request(
    `https://api.givebutter.com/v1/campaigns/${id}`,
    { headers: { 'Authorization': 'Bearer YOUR_API_KEY' } }
  );
}

// All requests automatically queued and rate-limited
const campaigns = await Promise.all([
  fetchCampaign('camp_1'),
  fetchCampaign('camp_2'),
  fetchCampaign('camp_3'),
  // ... more requests
]);

Rate Limit Tips

Spread Requests

Distribute requests evenly over time instead of bursting. This prevents hitting limits and provides more consistent performance.

Monitor Usage

Track your rate limit headers to understand usage patterns and optimize request timing before hitting limits.

Cache Aggressively

Cache responses for as long as acceptable for your use case. Even a 1-minute cache can dramatically reduce API calls.

Use Webhooks

Replace polling with webhooks for real-time updates. This is more efficient and doesn’t consume your rate limit.

Increase Your Rate Limit

Need higher rate limits for your integration?

Enterprise Tier

Upgrade to Enterprise for 5x higher rate limits (300 req/min, 15,000/hour, 250,000/day)

Custom Limits

For unique requirements, contact [email protected] to discuss custom rate limit arrangements
Enterprise tier also includes:

- Priority support
- Dedicated account manager
- Custom integrations assistance
- Early access to new API features

Next Steps