Batch Screenshot Processing: Capture Hundreds of URLs
Published March 14, 2026 -- 12 min read
Need to screenshot 100 product pages? 1,000 competitor websites? 10,000 URLs from a sitemap? This guide covers strategies for capturing screenshots at scale with proper concurrency, error handling, and cost control.
Use Cases for Batch Screenshots
- Link directories: Generate thumbnail previews for thousands of listed websites
- SEO monitoring: Capture weekly snapshots of all your pages to detect visual regressions
- Competitive intelligence: Screenshot competitor pages to track changes over time
- E-commerce catalogs: Capture product page screenshots for documentation or comparison
- Compliance audits: Archive web pages for regulatory requirements
- Content archiving: Create visual backups of websites before redesigns
Strategy 1: Controlled Concurrency
The simplest approach is a worker pool: a fixed number of workers pull URLs from a shared queue. This caps the number of parallel requests, so you never overwhelm the API and always know how much work is in flight.
import fs from 'node:fs';

const API_BASE = 'https://screenshotapi-api-production.up.railway.app';
const API_KEY = 'YOUR_API_KEY';
/**
 * Capture screenshots with controlled concurrency
 * @param {string[]} urls - Array of URLs to capture
 * @param {number} concurrency - Max parallel requests (default: 5)
 */
async function batchCapture(urls, concurrency = 5) {
  const results = [];
  const queue = [...urls];

  async function worker() {
    while (queue.length > 0) {
      const url = queue.shift();
      try {
        const response = await fetch(
          `${API_BASE}/v1/screenshot?url=${encodeURIComponent(url)}&format=png&device=desktop`,
          { headers: { 'Authorization': `Bearer ${API_KEY}` } }
        );
        if (!response.ok) {
          const error = await response.json();
          results.push({ url, success: false, error: error.error });
          continue;
        }
        const buffer = Buffer.from(await response.arrayBuffer());
        const filename = url.replace(/https?:\/\//, '').replace(/[^a-z0-9]/gi, '_');
        fs.writeFileSync(`screenshots/${filename}.png`, buffer);
        results.push({ url, success: true, size: buffer.length });
        console.log(`Captured: ${url} (${buffer.length} bytes)`);
      } catch (err) {
        results.push({ url, success: false, error: err.message });
      }
    }
  }

  // Start workers
  const workers = Array(concurrency).fill(null).map(() => worker());
  await Promise.all(workers);
  return results;
}

// Usage
const urls = fs.readFileSync('urls.txt', 'utf-8').split('\n').filter(Boolean);
console.log(`Processing ${urls.length} URLs with 5 concurrent workers...`);
const results = await batchCapture(urls, 5);
const success = results.filter(r => r.success).length;
console.log(`Done: ${success}/${results.length} succeeded`);

Strategy 2: Async with Webhooks
For large batches (100+ URLs), use the async endpoint. Submit all jobs at once and let the API notify you when each completes.
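On the receiving side, you need an endpoint that accepts the callback POST. Here is a minimal sketch using Node's built-in http module; the payload field names (`id`, `status`, `screenshot_url`) are assumptions for illustration, so check the API's webhook documentation for the actual schema:

```javascript
import http from 'node:http';

// Minimal webhook receiver for screenshot completion callbacks.
// Payload shape (id, status, screenshot_url) is assumed, not guaranteed.
function createWebhookServer(onJobComplete) {
  return http.createServer((req, res) => {
    if (req.method !== 'POST' || req.url !== '/webhook/screenshots') {
      res.writeHead(404).end();
      return;
    }
    let body = '';
    req.on('data', chunk => { body += chunk; });
    req.on('end', () => {
      try {
        const payload = JSON.parse(body);
        onJobComplete(payload); // e.g. download the finished screenshot here
        res.writeHead(200).end('ok');
      } catch {
        res.writeHead(400).end('invalid json');
      }
    });
  });
}
```

Respond 200 quickly and do any heavy work (downloading, storing) after acknowledging; most webhook senders retry on non-2xx responses or timeouts.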
// Submit batch of async jobs
async function submitBatch(urls) {
  const jobs = [];
  for (const url of urls) {
    const response = await fetch(`${API_BASE}/v1/screenshot/async`, {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${API_KEY}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        url,
        format: 'png',
        device: 'desktop',
        callback_url: 'https://your-server.com/webhook/screenshots',
      }),
    });
    const job = await response.json();
    jobs.push({ url, jobId: job.id });

    // Throttle submissions: 10 per second
    if (jobs.length % 10 === 0) {
      await new Promise(r => setTimeout(r, 1000));
    }
  }
  return jobs;
}

Strategy 3: Multi-Device Batch
Capture each URL across multiple devices for responsive design testing.
const devices = ['desktop', 'iphone15', 'ipad', 'galaxy_s24'];
const urls = ['https://example.com', 'https://stripe.com'];
async function multiDeviceBatch(urls, devices, concurrency = 3) {
  const queue = [];
  for (const url of urls) {
    for (const device of devices) queue.push({ url, device });
  }
  console.log(`Total captures: ${queue.length} (${urls.length} URLs x ${devices.length} devices)`);

  const results = [];
  async function worker() {
    while (queue.length > 0) {
      const { url, device } = queue.shift();
      try {
        const response = await fetch(
          `${API_BASE}/v1/screenshot?url=${encodeURIComponent(url)}&format=png&device=${device}`,
          { headers: { 'Authorization': `Bearer ${API_KEY}` } }
        );
        if (!response.ok) {
          results.push({ url, device, success: false, status: response.status });
          continue;
        }
        const buffer = Buffer.from(await response.arrayBuffer());
        const base = url.replace(/https?:\/\//, '').replace(/[^a-z0-9]/gi, '_');
        fs.writeFileSync(`screenshots/${base}_${device}.png`, buffer);
        results.push({ url, device, success: true, size: buffer.length });
      } catch (err) {
        results.push({ url, device, success: false, error: err.message });
      }
    }
  }

  // Lower concurrency than single-device runs, since each URL is captured several times
  await Promise.all(Array(concurrency).fill(null).map(() => worker()));
  return results;
}

Error Handling and Retries
At scale, some captures will fail. Common failure modes and how to handle them:
| Error | Cause | Solution |
|---|---|---|
| 504 Timeout | Slow page | Retry once, then skip |
| 429 Rate Limited | Too many requests | Reduce concurrency, add delay |
| 403 Forbidden | Site blocks bots | Try with ad blocking, skip if persistent |
| DNS failure | Domain not found | Skip, mark as dead URL |
| SSL error | Expired certificate | Log and skip |
async function captureWithRetry(url, maxRetries = 2) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      const response = await fetch(
        `${API_BASE}/v1/screenshot?url=${encodeURIComponent(url)}&format=png`,
        { headers: { 'Authorization': `Bearer ${API_KEY}` } }
      );
      if (response.status === 429) {
        // Rate limited - wait and retry
        const retryAfter = parseInt(response.headers.get('Retry-After'), 10) || 30;
        console.log(`Rate limited. Waiting ${retryAfter}s...`);
        await new Promise(r => setTimeout(r, retryAfter * 1000));
        continue;
      }
      if (response.status === 504 && attempt < maxRetries) {
        // Timeout - retry once
        console.log(`Timeout on ${url}. Retrying...`);
        continue;
      }
      if (!response.ok) {
        return { url, success: false, status: response.status };
      }
      const buffer = Buffer.from(await response.arrayBuffer());
      return { url, success: true, size: buffer.length, data: buffer };
    } catch (err) {
      if (attempt < maxRetries) continue;
      return { url, success: false, error: err.message };
    }
  }
  // All attempts consumed (e.g. by repeated rate limiting)
  return { url, success: false, error: 'max retries exceeded' };
}

Cost Optimization
- Use WebP format: 30-50% smaller files than PNG, faster transfer
- Set output_width: If you only need thumbnails, resize in the API instead of downloading full-size images
- Cache results: Store screenshots and skip URLs captured within the last 24 hours
- Filter URLs first: Validate URLs before sending to the API. Remove duplicates, dead links, and redirects.
- Choose the right plan: Business plan ($99/mo) gives you 100,000 screenshots -- that is less than $0.001 per capture
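The caching tip above amounts to a file-age check before each request. A sketch, assuming the same filename scheme used by batchCapture; the 24-hour window and the 400px output_width are arbitrary values, not API defaults:

```javascript
import fs from 'node:fs';
import path from 'node:path';

// Map a URL to its on-disk screenshot path (same naming scheme as batchCapture)
function screenshotPath(url, dir = 'screenshots') {
  const base = url.replace(/https?:\/\//, '').replace(/[^a-z0-9]/gi, '_');
  return path.join(dir, `${base}.png`);
}

// Skip URLs whose cached screenshot is newer than maxAgeMs
function needsCapture(url, dir = 'screenshots', maxAgeMs = 24 * 60 * 60 * 1000) {
  try {
    const stat = fs.statSync(screenshotPath(url, dir));
    return Date.now() - stat.mtimeMs > maxAgeMs; // stale: recapture
  } catch {
    return true; // no cached file yet: capture
  }
}

// Request a smaller payload up front: WebP plus a capped width for thumbnails
function thumbnailQuery(url, width = 400) {
  return `url=${encodeURIComponent(url)}&format=webp&output_width=${width}`;
}
```

Filter with `urls.filter(u => needsCapture(u))` before handing the list to batchCapture, and you pay only for URLs whose cached copy has expired.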
Real-World Example: Sitemap Crawler
// Screenshot every page in a sitemap
import { parseStringPromise } from 'xml2js';
async function screenshotSitemap(sitemapUrl) {
  // Fetch and parse sitemap
  const res = await fetch(sitemapUrl);
  const xml = await res.text();
  const parsed = await parseStringPromise(xml);
  const urls = parsed.urlset.url.map(u => u.loc[0]);
  console.log(`Found ${urls.length} URLs in sitemap`);

  // Capture all pages
  const results = await batchCapture(urls, 5);

  // Generate report
  const report = {
    total: results.length,
    success: results.filter(r => r.success).length,
    failed: results.filter(r => !r.success).length,
    totalSize: results.filter(r => r.success).reduce((sum, r) => sum + r.size, 0),
  };
  console.log(`Report: ${report.success}/${report.total} captured`);
  console.log(`Total size: ${(report.totalSize / 1024 / 1024).toFixed(1)} MB`);
  return report;
}

await screenshotSitemap('https://example.com/sitemap.xml');

Scale your screenshots
From 100 to 100,000 screenshots per month. Device presets, async processing, and webhooks included on all plans.