How I Built a $0/Month Screenshot + Web-Scraper Pipeline
When you’re bootstrapping or indie hacking, every dollar counts. SaaS subscriptions and cloud bills can eat into your runway fast. That’s why I set out to build a fully serverless screenshot and web-scraper pipeline—with zero monthly cost, using only free tiers.
This article breaks down the architecture, code, and lessons learned from running a production-grade pipeline for screenshots, scraping, and reports—without paying a cent.
👉 Want to see it in action? Check out my Landing Page Analysis Tool at twocents.design
TL;DR
You can build a robust, automated screenshot and scraping pipeline for free using:
- QStash for scheduling and queueing jobs
- Inngest for orchestration and retries
- Puppeteer + Browserless for headless Chrome screenshots
- Vercel Blob for free file storage
- Neon + Prisma for Postgres metadata
All running on Vercel’s Hobby plan, with no always-on servers.
🛠️ Why Serverless? Why Free?
For indie hackers, cost and simplicity are everything:
- No idle servers: Functions only run when needed.
- No patching or maintenance: Providers handle the heavy lifting.
- Free quotas: Each service offers a generous free tier—combine them for a $0 bill.
Architecture Overview
graph TD
  A[QStash request] -->|POST| B["/api/analysis/process"]
  B --> C{Inngest steps}
  C --> D[Vercel Fn → Puppeteer<br>via Browserless]
  D --> E[Vercel Blob Storage<br>desktop.png / mobile.png]
  D --> F[Prisma → Neon<br>capture metadata]
1. Queue & Schedule with QStash
import { Client } from '@upstash/qstash'
const qstash = new Client({ token: process.env.QSTASH_TOKEN })
await qstash.publishJSON({
  topic: 'on-demand-analytics',
  body: { url: 'https://site.dev', email: 'owner@site.dev' },
})
Why QStash instead of DIY cron?
QStash gives you a managed webhook queue with built-in retries and cron scheduling on Upstash's free tier (500 requests per day, far more than a pipeline like this needs). No Redis instance, no CloudWatch rules: just publish JSON and let QStash worry about persistence and back-off.
- Free quota: 500 requests/day + cron at no cost.
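On the receiving end, /api/analysis/process only has to check that the request really came from QStash and hand the payload to Inngest. Here's a minimal sketch of that handler as a Next.js App Router route; it assumes the standard QSTASH_CURRENT_SIGNING_KEY / QSTASH_NEXT_SIGNING_KEY environment variables and the inngest client from section 2, so treat it as a starting point rather than the exact production handler.
import { Receiver } from '@upstash/qstash'
import { inngest } from '@/lib/inngest'

// QStash signs every delivery; reject anything that doesn't verify
const receiver = new Receiver({
  currentSigningKey: process.env.QSTASH_CURRENT_SIGNING_KEY,
  nextSigningKey: process.env.QSTASH_NEXT_SIGNING_KEY,
})

export async function POST(req) {
  const body = await req.text()

  let valid = false
  try {
    valid = await receiver.verify({
      signature: req.headers.get('upstash-signature') ?? '',
      body,
    })
  } catch {
    valid = false
  }
  if (!valid) return new Response('invalid signature', { status: 401 })

  const { url, email } = JSON.parse(body)
  // hand off to Inngest so steps and retries happen there, not in this request
  await inngest.send({ name: 'analysis.triggered', data: { url, email } })
  return new Response('queued', { status: 202 })
}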
2. Event Orchestration with Inngest
import { inngest } from '@/lib/inngest'
import { scrapeSite } from '@/lib/scrape'
import { saveCapture } from '@/lib/db'
export default inngest.createFunction(
  { id: 'scrape' },
  { event: 'analysis.triggered' },
  async ({ event, step }) => {
    const { url, email } = event.data
    // each step.run is persisted and retried independently on failure
    const data = await step.run('scrape-site', () => scrapeSite(url))
    const record = await step.run('store-meta', () =>
      saveCapture(url, data)
    )
    return record
  }
)
Inngest also lets you work around typical serverless timeouts on longer jobs by splitting them into multiple steps, which makes room for advanced scraping and user interactions that need more time to complete. For twocents.design, this enables more complex screenshot analysis without hitting function limits (sketched below).
- Free: 50k runs/month, 5 concurrent.
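To make the timeout point concrete, here's a rough sketch of how the same function could stretch into a longer pipeline. Each step.run is recorded and retried on its own, so the total job can run well past a single function's timeout. The analyzeScreenshots and sendReport helpers are hypothetical placeholders, not part of the real codebase.
import { inngest } from '@/lib/inngest'
import { scrapeSite } from '@/lib/scrape'
import { saveCapture } from '@/lib/db'
// hypothetical helpers, for illustration only
import { analyzeScreenshots } from '@/lib/analyze'
import { sendReport } from '@/lib/email'

export default inngest.createFunction(
  { id: 'scrape-and-report' },
  { event: 'analysis.triggered' },
  async ({ event, step }) => {
    const { url, email } = event.data
    // four short steps instead of one long function invocation
    const shots = await step.run('scrape-site', () => scrapeSite(url))
    const record = await step.run('store-meta', () => saveCapture(url, shots))
    const report = await step.run('analyze', () => analyzeScreenshots(shots))
    await step.run('send-report', () => sendReport(email, report))
    return { captureId: record.id }
  }
)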
3. Scrape & Screenshot with Puppeteer + Browserless
import puppeteer from 'puppeteer-core'
import { put } from '@vercel/blob'
export async function scrapeSite(url) {
  const browser = await puppeteer.connect({
    browserWSEndpoint: `wss://chrome.browserless.io?token=${process.env.BROWSERLESS_TOKEN}`,
  })
  const page = await browser.newPage()

  // desktop capture
  await page.setViewport({ width: 1440, height: 900 })
  await page.goto(url, { waitUntil: 'networkidle0', timeout: 45000 })
  const desktop = await page.screenshot({ fullPage: true })

  // mobile capture in the same session, to stay under the concurrency limit
  await page.setViewport({ width: 390, height: 844, isMobile: true })
  await page.reload({ waitUntil: 'networkidle0' })
  const mobile = await page.screenshot({ fullPage: true })
  await browser.close()

  // store in Vercel Blob (access: 'public' is required and returns a public URL)
  const desktopUrl = (await put(`${Date.now()}-desk.png`, desktop, { access: 'public' })).url
  const mobileUrl = (await put(`${Date.now()}-mob.png`, mobile, { access: 'public' })).url
  return { desktopUrl, mobileUrl }
}
- Browserless free: 1,000 units/month, 1 concurrent session
- Vercel Blob free (Hobby): 1 GB storage, 10 GB egress
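One caveat with full-page PNGs: they eat into the 1 GB Blob quota faster than you'd think. A tiny cleanup job keeps old captures from piling up; the sketch below uses @vercel/blob's list and del helpers, and the 30-day retention is an arbitrary number you'd tune to your own volume.
import { list, del } from '@vercel/blob'

// delete captures older than maxAgeDays so the free 1 GB quota isn't exhausted
export async function pruneOldScreenshots(maxAgeDays = 30) {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000
  const { blobs } = await list()
  const stale = blobs.filter((b) => new Date(b.uploadedAt).getTime() < cutoff)
  if (stale.length) await del(stale.map((b) => b.url))
  return stale.length
}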
4. Metadata into Neon Postgres via Prisma
model Capture {
  id          Int      @id @default(autoincrement())
  url         String
  desktopShot String
  mobileShot  String
  createdAt   DateTime @default(now())
}
import { prisma } from '@/lib/prisma'
export function saveCapture(url, shots) {
  return prisma.capture.create({
    data: {
      url,
      desktopShot: shots.desktopUrl,
      mobileShot: shots.mobileUrl,
    },
  })
}
- Neon free: 0.5 GB storage, ~190 compute hours/month
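Reading the data back out for a report is just as small. For example, the latest captures for a given URL, using the same prisma client (latestCaptures is an illustrative helper, not from the original code):
import { prisma } from '@/lib/prisma'

// newest captures first, for the report view
export function latestCaptures(url, take = 5) {
  return prisma.capture.findMany({
    where: { url },
    orderBy: { createdAt: 'desc' },
    take,
  })
}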
Cost Table @ Vercel Hobby Tier
Service | Free Tier
---|---
Vercel Functions | 100k invocations + 100 GB-hrs/month
Vercel Blob | 1 GB storage, 10 GB egress, 10k ops
Browserless | 1,000 units, 1 concurrent session, 1-min sessions
QStash | 500 messages/day, 100 RPS
Inngest | 50k runs/month, concurrency = 5
Neon | 0.5 GB storage, 191.9 compute-hrs, 5 GB egress
Total monthly bill: $0.
Implementation Notes
You can combine Vercel’s Hobby plan, a free Browserless token, QStash, and Neon DB to handle scraping and screenshots at zero cost. Each service stays within free tiers under typical usage. By connecting Puppeteer to Browserless, storing images on Vercel Blob, and triggering workflows through QStash and Inngest, the entire process runs serverless with minimal setup or maintenance.
Lessons Learned
- Single Browserless session, many tabs → stay under free concurrency.
- Blob Storage beats S3 for DX → put() returns a URL instantly.
- Inngest retries → failures stay isolated; one step re-runs without duplicating emails.
- Prisma + Neon → serverless driver means no “max connections” pain (see the sketch after this list).
- Vercel Hobby is plenty → tasks barely scratch 2% of the compute quota.
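For reference, wiring Prisma to Neon's serverless driver looks roughly like the sketch below. The exact setup depends on your @prisma/adapter-neon version (older Prisma releases also need the driverAdapters preview feature in schema.prisma), so take it as the documented pattern rather than my exact lib/prisma file.
import { Pool } from '@neondatabase/serverless'
import { PrismaNeon } from '@prisma/adapter-neon'
import { PrismaClient } from '@prisma/client'

// Neon's driver talks to Postgres over HTTP/WebSockets,
// so each function invocation doesn't pin a dedicated connection
const pool = new Pool({ connectionString: process.env.DATABASE_URL })
const adapter = new PrismaNeon(pool)

export const prisma = new PrismaClient({ adapter })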
Result
This pipeline now:
- Runs entirely on free‑tier, server‑less services—no idle servers, no surprise bills.
- Captures full‑page desktop and mobile screenshots, stores them on Vercel Blob, and logs metadata in Neon.
- Completes each job in under 30 seconds of function runtime.
Building a new website? Check out SSK, your all-in-one starter kit for SaaS products. 🛠️✨
