Default Limits

| Endpoint Category | Limit | Window |
|---|---|---|
| Order Creation | 50 req/sec | Rolling |
| Tracking Queries | 100 req/sec | Rolling |
| Label Operations | 20 req/sec | Rolling |
| Manifest Submission | 10 req/sec | Rolling |

Rate limits are applied per API key.

Rate Limit Headers

Every response includes rate limit information:
```
X-RateLimit-Limit: 50
X-RateLimit-Remaining: 47
X-RateLimit-Reset: 1703155200
```
| Header | Description |
|---|---|
| X-RateLimit-Limit | Maximum requests per second |
| X-RateLimit-Remaining | Requests remaining in the current window |
| X-RateLimit-Reset | Unix timestamp (seconds) when the limit resets |
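A small helper makes these headers easier to consume consistently. This is a sketch, not part of the API: it accepts any Headers-like object with a `get` method (such as `response.headers` from `fetch`) and tolerates missing headers, which some error responses may omit.

```javascript
// Parse rate limit state from a Headers-like object (anything with .get()).
// Returns null fields when a header is absent rather than NaN.
function parseRateLimit(headers) {
  const get = (name) => {
    const value = headers.get(name);
    return value === null || value === undefined ? null : parseInt(value, 10);
  };

  const reset = get('X-RateLimit-Reset');
  return {
    limit: get('X-RateLimit-Limit'),
    remaining: get('X-RateLimit-Remaining'),
    resetAt: reset === null ? null : reset * 1000 // Unix seconds → milliseconds
  };
}
```

Call it as `parseRateLimit(response.headers)` after each request to decide whether to pause before the next one.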

Handling Rate Limits

Check Before Hitting Limits

```javascript
// Small helper used below; fetch is global in Node 18+ and browsers.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

class RateLimitedClient {
  constructor(apiKey) {
    this.apiKey = apiKey;
    this.remaining = 50;
    this.resetAt = 0;
  }

  async request(url, options = {}) {
    // Wait if we're out of quota
    if (this.remaining <= 0 && Date.now() < this.resetAt) {
      await sleep(this.resetAt - Date.now());
    }

    const response = await fetch(url, {
      ...options,
      headers: {
        ...options.headers,
        'X-API-Key': this.apiKey
      }
    });

    // Update rate limit state; guard against missing headers
    const remaining = parseInt(response.headers.get('X-RateLimit-Remaining'), 10);
    const reset = parseInt(response.headers.get('X-RateLimit-Reset'), 10);
    if (!Number.isNaN(remaining)) this.remaining = remaining;
    if (!Number.isNaN(reset)) this.resetAt = reset * 1000; // Unix seconds → ms

    return response;
  }
}
```

Handle 429 Responses

```javascript
async function requestWithRetry(url, options, maxRetries = 3) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch(url, options);

    if (response.status === 429) {
      // Retry-After holds seconds to wait; X-RateLimit-Reset is a Unix timestamp,
      // so convert it to a delay relative to now rather than using it directly.
      const retryAfter = response.headers.get('Retry-After');
      const reset = response.headers.get('X-RateLimit-Reset');

      let delay;
      if (retryAfter) {
        delay = parseInt(retryAfter, 10) * 1000;
      } else if (reset) {
        delay = Math.max(0, parseInt(reset, 10) * 1000 - Date.now());
      } else {
        delay = Math.pow(2, attempt) * 1000; // exponential backoff fallback
      }

      console.log(`Rate limited. Waiting ${delay}ms...`);
      await new Promise((resolve) => setTimeout(resolve, delay));
      continue;
    }

    return response;
  }

  throw new Error('Max retries exceeded');
}
```

Batch Operations

Reduce request count with batch endpoints:

Instead of N tracking requests:

```javascript
// ❌ Bad: N requests for N parcels
for (const trackingNumber of trackingNumbers) {
  await fetch(`/api/v1/tracking/${trackingNumber}`);
}
```

Use batch endpoint:

```javascript
// ✅ Good: 1 request for N parcels
await fetch('/api/v1/tracking/batch', {
  method: 'POST',
  body: JSON.stringify({ trackingNumbers })
});
```
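For very large jobs, you can combine both ideas: split the list into fixed-size batches and send one request per batch. A sketch follows; the 100-per-batch cap and the `{ results: [...] }` response shape are assumptions, not documented behavior — check the batch endpoint's actual limits and schema.

```javascript
// Split a large array into fixed-size batches.
// The 100-item cap is an assumed maximum for the batch endpoint.
function chunk(items, size = 100) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Hypothetical helper: one POST per batch instead of one per parcel.
async function trackAll(trackingNumbers) {
  const results = [];
  for (const batch of chunk(trackingNumbers)) {
    const response = await fetch('/api/v1/tracking/batch', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ trackingNumbers: batch })
    });
    const payload = await response.json();
    results.push(...payload.results); // assumed response shape
  }
  return results;
}
```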

Higher Limits

Need higher rate limits? Contact us:
| Plan | Order Creation | Tracking | Support |
|---|---|---|---|
| Starter | 50/sec | 100/sec | Email |
| Professional | 200/sec | 500/sec | Priority |
| Enterprise | Custom | Custom | Dedicated |


Best Practices

- Spread requests evenly rather than bursting; use a queue or rate limiter on your side.
- Cache tracking results for 5-10 minutes; status changes aren’t instant.
- For status updates, webhooks are more efficient than polling.
- Schedule bulk operations during off-peak hours (2-6 AM EST).
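The first point — spacing requests evenly instead of bursting — can be sketched as a minimal client-side pacer. This is an illustration, not a prescribed implementation; the 40 req/sec target is an arbitrary choice that deliberately stays under the 50 req/sec order-creation limit.

```javascript
// Minimal pacer: books evenly spaced send slots instead of bursting.
class Pacer {
  constructor(requestsPerSecond) {
    this.interval = 1000 / requestsPerSecond; // ms between requests
    this.nextSlot = 0;
  }

  // Returns how many ms the caller should wait, and books the next slot.
  reserve(now = Date.now()) {
    const slot = Math.max(now, this.nextSlot);
    this.nextSlot = slot + this.interval;
    return slot - now;
  }

  // Awaitable form: call before each request.
  async throttle() {
    const delay = this.reserve();
    if (delay > 0) await new Promise((resolve) => setTimeout(resolve, delay));
  }
}

// Usage: const pacer = new Pacer(40);
// await pacer.throttle(); await fetch('/api/v1/orders', { ... });
```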