Optimize Lokalise API throughput for translation pipelines by implementing cursor pagination, local caching, batch key operations (500/request), request throttling under the 6 req/s rate limit, and selective language downloads.
Prerequisites:

- `@lokalise/node-api` SDK v9+ (ESM) or REST API access
- `LOKALISE_API_TOKEN` environment variable set

Cursor pagination is significantly faster than offset pagination for projects with 5K+ keys. Offset pagination degrades as page numbers increase because the server must skip rows; cursor pagination resumes from a pointer instead.
```typescript
import { LokaliseApi } from '@lokalise/node-api';

const lok = new LokaliseApi({ apiKey: process.env.LOKALISE_API_TOKEN! });

// Generator that yields all keys using cursor pagination
async function* getAllKeys(projectId: string) {
  let cursor: string | undefined;
  do {
    const result = await lok.keys().list({
      project_id: projectId,
      limit: 500, // Maximum allowed per request
      pagination: 'cursor',
      cursor,
    });
    for (const key of result.items) yield key;
    cursor = result.hasNextCursor() ? result.nextCursor : undefined;
  } while (cursor);
}

// Usage: 10,000 keys = 20 API calls (vs 100 with the default limit=100)
let count = 0;
for await (const key of getAllKeys('PROJECT_ID')) {
  count++;
}
console.log(`Fetched ${count} keys`);
```
Offset pagination comparison (avoid for large projects):
| Keys | Offset (limit=100) | Cursor (limit=500) | Time saved |
|---|---|---|---|
| 1,000 | 10 requests | 2 requests | 80% |
| 10,000 | 100 requests | 20 requests | 80% |
| 50,000 | 500 requests (~84s) | 100 requests (~17s) | 80% |
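The request counts in the table are simply the ceiling of total keys over page size; a tiny helper (illustrative, not part of the SDK) makes that explicit when estimating sync cost:

```typescript
// Number of paginated requests needed to fetch `totalKeys` at `limit` keys per page.
function requestsNeeded(totalKeys: number, limit: number): number {
  return Math.ceil(totalKeys / limit);
}
```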
Translation file downloads are the most expensive Lokalise operation. Cache them locally and use project last_activity timestamps to invalidate.
```typescript
import { LokaliseApi } from '@lokalise/node-api';
import { readFileSync, writeFileSync, existsSync, mkdirSync } from 'fs';

const lok = new LokaliseApi({ apiKey: process.env.LOKALISE_API_TOKEN! });
const CACHE_DIR = '.lokalise-cache';

interface CacheEntry {
  url: string;
  timestamp: string;
  languages: string[];
}

function getCachePath(projectId: string, langIso: string): string {
  // The download endpoint returns a .zip bundle, even for a single language
  return `${CACHE_DIR}/${projectId}/${langIso}.zip`;
}

function getMetaPath(projectId: string): string {
  return `${CACHE_DIR}/${projectId}/meta.json`;
}

async function downloadWithCache(projectId: string, langIso: string, format = 'json'): Promise<Buffer> {
  mkdirSync(`${CACHE_DIR}/${projectId}`, { recursive: true });
  const cachePath = getCachePath(projectId, langIso);
  const metaPath = getMetaPath(projectId);

  // Check whether the project was modified since the last cache write
  const project = await lok.projects().get(projectId);
  const lastActivity = project.statistics?.last_activity ?? project.created_at;

  if (existsSync(metaPath) && existsSync(cachePath)) {
    const meta: CacheEntry = JSON.parse(readFileSync(metaPath, 'utf8'));
    if (meta.timestamp === lastActivity) {
      console.log(`Cache hit: ${langIso} (unchanged since ${lastActivity})`);
      return readFileSync(cachePath);
    }
  }

  // Cache miss — download fresh. bundle_url is a temporary URL to a .zip archive.
  const bundle = await lok.files().download(projectId, {
    format,
    filter_langs: [langIso],
    original_filenames: false,
  });
  const response = await fetch(bundle.bundle_url);
  const data = Buffer.from(await response.arrayBuffer());

  writeFileSync(cachePath, data);
  writeFileSync(metaPath, JSON.stringify({
    url: bundle.bundle_url,
    timestamp: lastActivity,
    languages: [langIso],
  }));

  console.log(`Cache miss: downloaded ${langIso} (${data.byteLength} bytes)`);
  return data;
}
```
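The invalidation rule above boils down to one comparison. Isolating it as a pure predicate (an illustrative sketch, not part of the SDK) makes the cache logic easy to unit-test without hitting the API:

```typescript
interface CacheMeta {
  timestamp: string;
  languages: string[];
}

// A cached bundle is fresh only if the project's last_activity matches the
// timestamp recorded when the bundle was written, and the language was cached.
function isCacheFresh(meta: CacheMeta | undefined, lastActivity: string, langIso: string): boolean {
  if (!meta) return false;
  return meta.timestamp === lastActivity && meta.languages.includes(langIso);
}
```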
Lokalise supports creating, updating, and deleting up to 500 keys per request. Always batch instead of making individual requests.
```typescript
// Bulk create keys — 500 per batch with rate limit awareness.
// Note: in recent @lokalise/node-api versions, the request body and the
// project_id params are separate arguments.
async function createKeysBatched(projectId: string, keys: any[]) {
  const BATCH_SIZE = 500;
  const results = [];
  for (let i = 0; i < keys.length; i += BATCH_SIZE) {
    const batch = keys.slice(i, i + BATCH_SIZE);
    const result = await lok.keys().create(
      { keys: batch },
      { project_id: projectId },
    );
    results.push(...result.items);
    console.log(`Batch ${Math.floor(i / BATCH_SIZE) + 1}: created ${result.items.length} keys`);
    await new Promise(r => setTimeout(r, 200)); // Stay under 6 req/s
  }
  return results;
}

// Bulk update keys — same 500-key batch limit
async function updateKeysBatched(projectId: string, updates: Array<{ key_id: number; [k: string]: any }>) {
  const BATCH_SIZE = 500;
  for (let i = 0; i < updates.length; i += BATCH_SIZE) {
    const batch = updates.slice(i, i + BATCH_SIZE);
    await lok.keys().bulk_update(
      { keys: batch },
      { project_id: projectId },
    );
    await new Promise(r => setTimeout(r, 200));
  }
}

// Bulk delete — up to 500 key IDs per request
async function deleteKeysBatched(projectId: string, keyIds: number[]) {
  const BATCH_SIZE = 500;
  for (let i = 0; i < keyIds.length; i += BATCH_SIZE) {
    const batch = keyIds.slice(i, i + BATCH_SIZE);
    await lok.keys().bulk_delete(batch, { project_id: projectId });
    await new Promise(r => setTimeout(r, 200));
  }
}

// 2,000 keys: 4 batched requests instead of 2,000 individual ones
```
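The three batched functions repeat the same slicing loop. A small generic helper (a sketch, not part of the SDK) keeps the batch size in one place:

```typescript
// Split an array into consecutive chunks of at most `size` elements.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```

Each batched function can then iterate `for (const batch of chunk(keys, 500))` instead of tracking indices by hand.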
A proper request queue prevents 429 Too Many Requests errors and makes your integration resilient under load.
```typescript
import PQueue from 'p-queue';

// Lokalise rate limit: 6 requests/second.
// Use 5 per 1s interval for a safety margin.
const queue = new PQueue({
  concurrency: 5,
  interval: 1000,
  intervalCap: 5,
});

async function throttledRequest<T>(fn: () => Promise<T>): Promise<T> {
  return queue.add(fn) as Promise<T>;
}

// All API calls go through the queue automatically
const project = await throttledRequest(() => lok.projects().get(projectId));
const keys = await throttledRequest(() => lok.keys().list({
  project_id: projectId,
  limit: 500,
  pagination: 'cursor',
}));

// Works for parallel operations too — the queue enforces the rate limit
const projectIds = ['PROJ_1', 'PROJ_2', 'PROJ_3', 'PROJ_4', 'PROJ_5'];
const allProjects = await Promise.all(
  projectIds.map(id => throttledRequest(() => lok.projects().get(id)))
);
```
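The queue smooths steady-state traffic, but a 429 can still slip through during bursts or when other clients share the same token. A common complement is exponential backoff with jitter. This is a sketch: the `retryWithBackoff` helper is illustrative, and how a 429 surfaces on the error object (`status` here) depends on your HTTP layer:

```typescript
// Delay before retry attempt n (0-based): 1s, 2s, 4s, ... capped at 30s,
// plus up to 250 ms of random jitter to avoid synchronized retries.
function backoffDelay(attempt: number, baseMs = 1000, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt) + Math.floor(Math.random() * 250);
}

// Retry a request on 429, backing off between attempts; rethrow anything else.
async function retryWithBackoff<T>(fn: () => Promise<T>, maxAttempts = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      const status = err?.status ?? err?.code;
      if (status !== 429 || attempt + 1 >= maxAttempts) throw err;
      await new Promise(r => setTimeout(r, backoffDelay(attempt)));
    }
  }
}
```

Combined with the queue: `retryWithBackoff(() => throttledRequest(() => lok.projects().get(id)))`.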
File uploads and downloads are processed asynchronously by Lokalise. Instead of polling the process status endpoint, use webhooks to get notified when processing completes.
```bash
set -euo pipefail

# Set up a webhook for file operation events
curl -s -X POST "https://api.lokalise.com/api2/projects/${PROJECT_ID}/webhooks" \
  -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://hooks.company.com/lokalise",
    "events": [
      "project.imported",
      "project.exported",
      "project.keys_added"
    ]
  }' | jq '{webhook_id: .webhook.webhook_id, url: .webhook.url, events: .webhook.events}'
```
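On the receiving side, the handler mainly needs to branch on the payload's event field (event names matching the subscription above). A minimal routing sketch; the action names are placeholders for your own cache and sync logic:

```typescript
type LokaliseWebhook = { event: string; [key: string]: unknown };

// Map each subscribed event to an action; kept pure so it is easy to test.
function routeEvent(payload: LokaliseWebhook): 'invalidate-cache' | 'resync-keys' | 'ignore' {
  switch (payload.event) {
    case 'project.imported':
    case 'project.exported':
      return 'invalidate-cache'; // translation files changed on Lokalise's side
    case 'project.keys_added':
      return 'resync-keys';
    default:
      return 'ignore';
  }
}
```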
If you must poll (no webhook endpoint available):
```typescript
// Note: in recent @lokalise/node-api versions, queuedProcesses().get takes
// the process ID first and the project_id as request params.
async function waitForProcess(projectId: string, processId: string, timeoutMs = 120_000) {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    const proc = await throttledRequest(() =>
      lok.queuedProcesses().get(processId, { project_id: projectId })
    );
    if (proc.status === 'finished') return proc;
    if (proc.status === 'cancelled' || proc.status === 'failed') {
      throw new Error(`Process ${processId} ${proc.status}: ${proc.message}`);
    }
    await new Promise(r => setTimeout(r, 2000)); // Poll every 2s
  }
  throw new Error(`Process ${processId} timed out after ${timeoutMs}ms`);
}
```
Downloading all languages when you only need one wastes bandwidth and API time. Always filter by language and, when possible, by modification timestamp.
```typescript
// List the keys that have translations in a given language.
// Lokalise has no filter_modified_after parameter on the keys endpoint, so a
// true delta sync isn't possible here: download the full file and diff
// locally, or use webhooks for real-time sync.
// Note: filter_translation_lang_ids expects numeric language IDs, not ISO codes.
async function listKeysForLanguage(projectId: string, langId: number) {
  const keys = await lok.keys().list({
    project_id: projectId,
    limit: 500,
    pagination: 'cursor',
    filter_translation_lang_ids: String(langId),
  });
  return keys.items;
}

// Download a single language file instead of all languages
async function downloadSingleLanguage(projectId: string, langIso: string) {
  const result = await lok.files().download(projectId, {
    format: 'json',
    filter_langs: [langIso], // Only this language
    original_filenames: false, // Flat structure
    bundle_structure: '%LANG_ISO%.%FORMAT%', // e.g., fr.json
    export_empty_as: 'base', // Fall back to base language for untranslated keys
    include_tags: ['production'], // Only production-tagged keys
  });
  return result.bundle_url;
}
```
```bash
set -euo pipefail

# Benchmark API response times across endpoints
echo "=== Lokalise API Benchmarks ==="

echo -n "Projects list: "
curl -s -o /dev/null -w "%{time_total}s" \
  -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  "https://api.lokalise.com/api2/projects?limit=10"
echo ""

echo -n "Keys list (limit=500): "
curl -s -o /dev/null -w "%{time_total}s" \
  -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  "https://api.lokalise.com/api2/projects/${PROJECT_ID}/keys?limit=500"
echo ""

echo -n "File download trigger: "
curl -s -o /dev/null -w "%{time_total}s" \
  -X POST -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  -H "Content-Type: application/json" \
  "https://api.lokalise.com/api2/projects/${PROJECT_ID}/files/download" \
  -d '{"format":"json","filter_langs":["en"],"original_filenames":false}'
echo ""

echo "Rate limit headers:"
curl -s -D - -o /dev/null \
  -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  "https://api.lokalise.com/api2/projects?limit=1" \
  | grep -i x-ratelimit
```
| Issue | Cause | Solution |
|---|---|---|
| 429 Too Many Requests | Exceeded the 6 req/s global rate limit | Use PQueue throttling (Step 4) and retry with exponential backoff |
| Slow file downloads | Large project with 50+ languages | Use `filter_langs` to download one language at a time |
| Pagination timeout | Offset pagination on 50K+ key projects | Switch to cursor pagination (Step 1) |
| Bulk create partial failure | Network timeout on large batch | Reduce batch size from 500 to 200 and add per-batch retry logic |
| Cache stale after team edits | `last_activity` not granular enough | Reduce cache TTL or use webhooks to invalidate on `project.translation_updated` |
| `bundle_url` expired | Download URL only valid for ~30 minutes | Fetch the URL and download immediately; do not store URLs for later |
```bash
set -euo pipefail

# See how much rate limit headroom you have right now
curl -s -D - -o /dev/null \
  -H "X-Api-Token: ${LOKALISE_API_TOKEN}" \
  "https://api.lokalise.com/api2/system/languages?limit=1" 2>/dev/null \
  | grep -iE 'x-ratelimit' | while read -r line; do echo "  $line"; done
```
Related skills: `lokalise-debug-bundle`, `lokalise-upgrade-migration`, `lokalise-ci-integration`.