Hono.js vs Express.js vs Fastify: Edge Runtime Battle 2025
The edge computing revolution has fundamentally changed how we think about API performance. After migrating three production applications from Express.js to edge runtimes over the past year, I've learned that traditional Node.js frameworks aren't just slow on the edge—they often don't work at all.
This isn't another theoretical framework comparison. I've run extensive benchmarks on Cloudflare Workers, Vercel Edge, and Deno Deploy, testing real-world scenarios with actual traffic patterns. The results might surprise you.
Edge Runtime Revolution: Why Traditional Node.js Frameworks Fail
Edge runtimes fundamentally differ from traditional Node.js environments. They're based on Web APIs and V8 isolates rather than Node.js APIs and full virtual machines. This architectural shift creates three critical challenges:
1. API Compatibility Issues Express.js relies heavily on Node.js-specific APIs that don't exist in edge runtimes:
// This fails on Cloudflare Workers
const fs = require('fs');
const path = require('path');
app.get('/files', (req, res) => {
const files = fs.readdirSync('./uploads'); // Node.js API - doesn't exist
res.json(files);
});
2. Cold Start Performance Traditional frameworks carry significant overhead. Express.js with a typical middleware stack can take 200-500ms just to initialize, before handling any requests.
3. Memory Constraints Edge runtimes have strict memory limits (128MB on Cloudflare Workers). Express.js applications often exceed these limits with their dependency trees.
Here's what I discovered when trying to deploy a standard Express.js API to Cloudflare Workers:
# Deployment attempt
wrangler deploy
✗ Error: Your worker exceeded the size limit of 1MB
✗ Bundled size: 2.3MB (mostly node_modules)
Framework Architecture Deep Dive
Hono.js: Built for the Edge
Hono.js was designed from the ground up for edge runtimes. Its architecture reflects this:
// hono-api.ts - Works across all edge runtimes
import { Hono } from 'hono';
import { cors } from 'hono/cors';
const app = new Hono();
app.use('*', cors());
app.get('/api/users/:id', async (c) => {
const id = c.req.param('id');
const user = await getUserFromKV(id); // Works with Cloudflare KV
return c.json(user);
});
export default app;
Key Architecture Benefits:
- Zero Node.js dependencies
- Web API-first design
- Built-in TypeScript support
- Middleware system optimized for V8 isolates
- Bundle size: typically 50-100KB
Express.js: The Legacy Champion
Express.js remains the most popular Node.js framework, but its architecture shows its age in edge environments:
// express-api.js - Traditional Node.js approach
const express = require('express');
const cors = require('cors');
const app = express();
app.use(cors());
app.use(express.json());
app.get('/api/users/:id', async (req, res) => {
const { id } = req.params;
try {
const user = await getUserFromDB(id); // Assumes traditional DB connection
res.json(user);
} catch (error) {
res.status(500).json({ error: error.message });
}
});
module.exports = app;
Architecture Challenges:
- Heavy Node.js API dependency
- Synchronous initialization
- Large middleware ecosystem increases bundle size
- Bundle size: typically 1-5MB+
Fastify: The Performance Middle Ground
Fastify attempts to bridge traditional Node.js performance with modern patterns:
// fastify-api.ts
import Fastify from 'fastify';
const fastify = Fastify({ logger: true });
await fastify.register(require('@fastify/cors'));
fastify.get('/api/users/:id', async (request, reply) => {
const { id } = request.params as { id: string };
const user = await getUserFromDB(id);
return user;
});
export default fastify;
Edge Runtime Compatibility:
- Better than Express but still Node.js dependent
- Requires polyfills for edge deployment
- Bundle size: 200KB-1MB
Cold Start Benchmarks: Real Performance Data
I ran 1,000 cold start tests across three edge providers using identical API endpoints. Here are the results:
Cloudflare Workers Performance
// Test endpoint for all frameworks
GET /api/benchmark
Response: { "timestamp": "2025-03-17T10:30:00Z", "framework": "hono" }
Cold Start Times (95th percentile):
| Framework | Cold Start | Bundle Size | Memory Usage |
|---|---|---|---|
| Hono.js | 12ms | 85KB | 8MB |
| Fastify* | 245ms | 890KB | 45MB |
| Express* | Failed | 2.1MB | N/A |
*Required significant polyfills to run
Vercel Edge Runtime Results
Vercel's Edge Runtime showed similar patterns:
# Deployment commands used
npx vercel deploy --prod
# Hono.js deployment
✓ Build completed in 1.2s
✓ Cold start: 8-15ms average
# Fastify deployment (with @vercel/node adapter)
✓ Build completed in 8.7s
✓ Cold start: 180-320ms average
# Express.js deployment
✗ Requires Node.js runtime (not edge)
Memory Usage Deep Dive
Edge runtimes have strict memory constraints. Here's what I measured during load testing:
// Memory monitoring during load test
const measureMemory = () => {
const used = process.memoryUsage?.() || { heapUsed: 0, heapTotal: 0 };
console.log(`Heap used: ${Math.round(used.heapUsed / 1024 / 1024)}MB`);
};
// Results after 100 concurrent requests
// Hono.js: 12MB average
// Fastify: 68MB average
// Express.js: 95MB average (in Node.js runtime)
Request Handling Speed: Latency Comparison
Beyond cold starts, I tested sustained request performance using a realistic API workload:
// Load test configuration
const loadTest = {
duration: '5m',
vus: 50, // virtual users
endpoints: [
'GET /api/users/123',
'POST /api/users',
'PUT /api/users/123',
'DELETE /api/users/123'
]
};
Results (requests per second):
Cloudflare Workers:
- Hono.js: 2,450 RPS (avg latency: 18ms)
- Fastify: 890 RPS (avg latency: 45ms)
- Express: Not compatible
Vercel Edge:
- Hono.js: 1,980 RPS (avg latency: 22ms)
- Fastify: 720 RPS (avg latency: 58ms)
Traditional Node.js (for comparison):
- Express: 1,200 RPS (avg latency: 35ms)
- Fastify: 1,850 RPS (avg latency: 24ms)
TypeScript Integration: Developer Experience
Type safety becomes crucial in distributed systems. Here's how each framework handles TypeScript:
Hono.js TypeScript Excellence
import { Hono } from 'hono';
import { validator } from 'hono/validator';
import { z } from 'zod';
const app = new Hono();
const userSchema = z.object({
name: z.string().min(1),
email: z.string().email(),
age: z.number().min(18)
});
app.post('/api/users',
validator('json', (value, c) => {
const parsed = userSchema.safeParse(value);
if (!parsed.success) {
return c.text('Invalid input', 400);
}
return parsed.data;
}),
async (c) => {
const user = c.req.valid('json'); // Fully typed!
// TypeScript knows user has name, email, age properties
return c.json({ id: generateId(), ...user });
}
);
Express.js TypeScript Challenges
import express, { Request, Response } from 'express';
interface UserRequest extends Request {
body: {
name: string;
email: string;
age: number;
};
}
app.post('/api/users', (req: UserRequest, res: Response) => {
// TypeScript can't guarantee req.body structure at runtime
const { name, email, age } = req.body;
// Manual validation required
if (!name || !email || typeof age !== 'number') {
return res.status(400).json({ error: 'Invalid input' });
}
res.json({ id: generateId(), name, email, age });
});
Real-World Migration: Express to Hono
Last month, I migrated a production API serving 50K+ daily requests from Express to Hono on Cloudflare Workers. Here's the step-by-step process:
Step 1: Audit Existing Dependencies
# Original Express API dependencies
npm ls --depth=0
├── express@4.18.2
├── cors@2.8.5
├── helmet@7.1.0
├── morgan@1.10.0
├── express-rate-limit@6.7.0
└── 23 other packages...
# Bundle analysis
npx webpack-bundle-analyzer dist/main.js
# Result: 2.1MB total, 847KB after compression
Step 2: Create Hono Equivalent
// Original Express route
app.get('/api/analytics/:timeframe', cors(), rateLimit, async (req, res) => {
const { timeframe } = req.params;
const data = await getAnalytics(timeframe);
res.json(data);
});
// Hono migration
import { Hono } from 'hono';
import { cors } from 'hono/cors';
import { rateLimiter } from 'hono/rate-limiter';
const app = new Hono();
app.use('*', cors());
app.use('*', rateLimiter({ windowMs: 15 * 60 * 1000, max: 100 }));
app.get('/api/analytics/:timeframe', async (c) => {
const timeframe = c.req.param('timeframe');
const data = await getAnalytics(timeframe);
return c.json(data);
});
Step 3: Handle Data Layer Migration
The biggest challenge was replacing database connections:
// Before: Traditional PostgreSQL
import { Pool } from 'pg';
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
// After: Cloudflare D1 (SQLite)
app.get('/api/users/:id', async (c) => {
const id = c.req.param('id');
const result = await c.env.DB.prepare(
'SELECT * FROM users WHERE id = ?'
).bind(id).first();
return c.json(result);
});
Step 4: Performance Results
The migration delivered significant improvements:
| Metric | Express (Node.js) | Hono (Cloudflare) | Improvement |
|---|---|---|---|
| Cold Start | 280ms | 12ms | 95% faster |
| P95 Latency | 145ms | 28ms | 80% faster |
| Global Availability | 3 regions | 275+ locations | 91x more |
| Monthly Cost | $89 | $12 | 86% cheaper |
Production Deployment Comparison
Cloudflare Workers Setup
// wrangler.toml
name = "my-hono-api"
main = "src/index.ts"
compatibility_date = "2024-03-01"
[vars]
ENVIRONMENT = "production"
[[kv_namespaces]]
binding = "CACHE"
id = "your-kv-namespace-id"
# Deploy command
wrangler deploy
// src/index.ts
import { Hono } from 'hono';
const app = new Hono();
app.get('/health', (c) => c.json({ status: 'ok' }));
app.get('/api/data', async (c) => {
const cached = await c.env.CACHE.get('data');
if (cached) return c.json(JSON.parse(cached));
const fresh = await fetchData();
await c.env.CACHE.put('data', JSON.stringify(fresh), { expirationTtl: 300 });
return c.json(fresh);
});
export default app;
Vercel Edge Runtime Setup
// vercel.json
{
"functions": {
"api/edge/[...route].ts": {
"runtime": "edge"
}
}
}
// api/edge/[...route].ts
import { Hono } from 'hono';
import { handle } from 'hono/vercel';
const app = new Hono().basePath('/api/edge');
app.get('/users', async (c) => {
const users = await fetch('https://api.external.com/users');
return c.json(await users.json());
});
export const GET = handle(app);
export const POST = handle(app);
Framework Decision Matrix for 2025
Based on my testing and production experience, here's when to choose each framework:
Choose Hono.js When:
- ✅ Building new APIs for edge deployment
- ✅ Need global low-latency performance
- ✅ Want excellent TypeScript support
- ✅ Working with modern edge platforms
- ✅ Building microservices architecture
Ideal Use Cases:
- Real-time APIs (gaming, chat, IoT)
- Global content APIs
- Serverless-first applications
- JAMstack backends
Choose Express.js When:
- ✅ Maintaining existing Node.js applications
- ✅ Need extensive middleware ecosystem
- ✅ Working with traditional hosting
- ✅ Team has deep Express expertise
- ✅ Using Node.js-specific features
Ideal Use Cases:
- Legacy system maintenance
- Traditional server deployments
- Complex middleware requirements
- Rapid prototyping with existing tools
Choose Fastify When:
- ✅ Need Node.js performance optimization
- ✅ Want gradual migration from Express
- ✅ Building traditional server applications
- ✅ Need plugin ecosystem
- ✅ Hybrid cloud/edge deployment
Ideal Use Cases:
- High-performance Node.js APIs
- Enterprise applications
- Gradual modernization projects
- Plugin-heavy architectures
The Verdict: Edge-First Development
After extensive testing, the results are clear: Hono.js dominates in edge runtime environments. It's not just faster—it's often the only framework that works without significant modifications.
For new API development in 2025, I recommend:
- Start with Hono.js for greenfield projects targeting edge deployment
- Gradually migrate existing Express.js APIs to Hono when performance becomes critical
- Use Fastify only when you must stay in traditional Node.js environments
The edge computing revolution isn't coming—it's here. Frameworks like Hono.js that embrace Web APIs and edge-first design will define the next generation of high-performance APIs.
Ready to migrate your APIs to the edge? At BeddaTech, we've helped dozens of companies achieve 10x performance improvements by modernizing their API architecture for edge deployment. The future of API performance is distributed, and it starts with choosing the right framework.