Overhead Performance

TL;DR

  • Goal: measure Vovk.ts overhead over native Next.js route handlers (not HTTP stack).
  • Routing: O(1) across 1–10,000 controllers (20,000 endpoints). Median latency ~1.25–1.33 µs even at 10,000 controllers. Throughput ~745k–800k ops/s/core.
  • Cold start: O(n). ~5.7 ms at 1,000 controllers; ~83 ms at 10,000. About 8–10× the cost of no‑op decorators.
  • Notes: Tinybench on Apple M4 Pro. Next.js runtime cost is out of scope. Compiled from real benchmark output with AI assistance and minor edits. See vovk-perf-test repo for scripts.

Reproducing the Tests

Clone the repo:

git clone https://github.com/finom/vovk-perf-test.git
cd vovk-perf-test

Install dependencies via:

npm i

Run performance tests via:

npm run perf-test

The perf suite also runs in CI via GitHub Actions.

Overview

Vovk.ts sits on top of Next.js API routes and generates handlers via decorators applied to procedures:

src/app/api/[[...vovk]]/route.ts
import { initSegment } from "vovk";
// `controllers` is a record of controller classes, assembled elsewhere in the app (elided here)
export const { GET, POST } = initSegment({ controllers });

We measure framework overhead in two dimensions:

  • Request Overhead: per-request routing/handler overhead.
  • Cold Start Overhead: initialization time for controllers/metadata.

Source: test scripts in the vovk-perf-test repository.

Request Overhead

Example controller (N = 1) used in the request-overhead tests:

src/modules/one/a/AController.ts
import { procedure, prefix, get, post, operation } from "vovk";

@prefix("a")
export default class AController {
  @operation({
    summary: "Get A",
  })
  @get()
  static getA = procedure({
    handle: () => {
      return { get: true };
    },
  });

  @operation({
    summary: "Create A",
  })
  @post("{id}")
  static createA = procedure({
    handle: (_req, params: any) => {
      return { post: true, id: params.id };
    },
  });
}

The code above is fetched from the vovk-perf-test GitHub repository.

Methodology (short)

  • Autogenerate N controllers (N ∈ {1, 10, 100, 1,000, 10,000}), each exposing:
    • GET without params.
    • POST with path param “{id}” (pattern match).
  • Minimal handler logic; measure full routing + handler path.
  • Tinybench: at least 100 ms per test, nanosecond-resolution timing; median latency and throughput reported (see the harness sketch below).
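
A minimal sketch of what such a Tinybench harness can look like is shown below. The import path and the way the handlers are invoked (a plain Request plus a Next.js-style params context) are assumptions made for illustration; the actual scripts live in the vovk-perf-test repository.

import { Bench } from "tinybench";
// GET and POST are the handlers produced by initSegment() in route.ts.
// The import path below is an assumption for this sketch.
import { GET, POST } from "../src/app/api/[[...vovk]]/route";

const bench = new Bench({ time: 100 }); // run each task for at least 100 ms

bench.add("GET a (no params)", async () => {
  // Assumes the handler accepts a standard Request plus a context object
  // carrying the catch-all path segments, as Next.js route handlers do.
  await GET(new Request("http://localhost/api/a"), {
    params: { vovk: ["a"] },
  } as any);
});

bench.add("POST a/{id} (path param)", async () => {
  await POST(
    new Request("http://localhost/api/a/42", { method: "POST" }),
    { params: { vovk: ["a", "42"] } } as any
  );
});

await bench.run();
console.table(bench.table()); // median latency and throughput per task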

Results

| Controllers | Endpoints | GET Latency (med) | POST Latency (med) | GET Throughput (med ops/s) | POST Throughput (med ops/s) |
|---|---|---|---|---|---|
| 1 | 2 | 1,250 ns | 1,250 ns | 800,000 | 800,000 |
| 10 | 20 | 1,291 ns | 1,292 ns | 774,593 | 773,994 |
| 100 | 200 | 1,250 ns | 1,292 ns | 800,000 | 773,994 |
| 1,000 | 2,000 | 1,250 ns | 1,291 ns | 800,000 | 774,593 |
| 10,000 | 20,000 | 1,292 ns | 1,333 ns | 773,994 | 750,188 |

Key takeaways:

  • O(1) routing: flat latency from 1 to 10,000 controllers.
  • ≈1.3 µs overhead at typical scales; GET and POST are nearly identical, indicating that path-parameter extraction adds negligible cost. (Throughput is simply the inverse of median latency: 1 / 1.25 µs ≈ 800,000 ops/s per core.)

Cold Start Overhead

Example cold-start benchmark (N = 1) contrasting Vovk.ts vs. no-op decorators:

perf/generated_coldStartPerfTest.ts
bench.add("Cold start for 1 controllers", async () => {
  const controllers: Record<string, Function> = {};

  @prefix("one/0")
  class One0Controller {
    @operation({
      summary: "Create",
    })
    @post("{id}")
    static create = procedure({
      handle: () => null,
    });
  }

  controllers["One0Controller"] = One0Controller;

  initSegment({
    segmentName: "",
    emitSchema: true,
    controllers,
  });
});

bench.add("No-op decorators for 1 classes", async () => {
  const controllers: Record<string, Function> = {};

  @noopClassDecorator()
  class One0Controller {
    @noopDecorator({
      summary: "Create",
    })
    @noopDecorator("{id}")
    static create = (_req: unknown, params: any) => {
      return null;
    };
  }
});

Methodology (short)

For N ∈ {1, 10, 100, 1,000, 10,000} measure:

  • App creation, decorator processing, metadata build, and initSegment().
  • Compare to equivalent classes using no‑op decorators to isolate framework work.

Example no‑op decorators:

// The factories accept arbitrary arguments so they can be called the same way
// as the real Vovk decorators (e.g. noopDecorator({ summary: "Create" })).
function noopDecorator(..._args: any[]) {
  return function (..._decoratorArgs: any[]) {};
}

function noopClassDecorator(..._args: any[]) {
  return function <T extends new (...a: any[]) => any>(c: T) {
    return c;
  };
}

Results

| Controllers | Vovk.ts Init Time (med) | No-op Time (med) | Overhead Ratio | Throughput (ops/s) |
|---|---|---|---|---|
| 1 | 5.125 μs | 0.500 μs | 10.3x | 195,122 |
| 10 | 47.167 μs | 5.208 μs | 9.1x | 21,201 |
| 100 | 526.416 μs | 53.541 μs | 9.8x | 1,900 |
| 1,000 | 5,719.333 μs | 702.896 μs | 8.1x | 175 |
| 10,000 | 82,924.833 μs | 10,370.729 μs | 8.0x | 12 |

Key takeaways:

  • O(n) init: time grows linearly, at a roughly constant per-controller cost (about 5–8 µs each) through 10,000 controllers.
  • Absolute times are small for long-lived services; still acceptable for serverless at typical sizes.

Practical guidance

  • For high-performance workloads: split the app into multiple segments (e.g., separate serverless functions, each built from its own Next.js route.ts file); see the sketch after this list.
  • In theory, with careful segment management and adequate hardware, a single Next.js/Vovk.ts app can host up to ~1,000,000 endpoints. Validate this in your environment; practical limits will be memory, bundle size, cold start budgets, and platform quotas.
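
For example, a second segment can live in its own route file and be initialized independently of the root segment. The sketch below is illustrative only: the file path, module name, and controller are hypothetical, while initSegment() and its segmentName/controllers options are the same ones shown in the cold-start benchmark above.

// Hypothetical file: src/app/api/admin/[[...vovk]]/route.ts
// A separate "admin" segment with its own controller set. It is initialized
// only when this route file is loaded (its own serverless function on
// platforms like Vercel), so its controllers do not add to the cold start of
// other segments.
import { initSegment } from "vovk";
import AdminController from "../../../../modules/admin/AdminController"; // illustrative controller class

export const { GET, POST } = initSegment({
  segmentName: "admin",
  controllers: { AdminController },
});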

Benchmarks: Tinybench on Node.js; hardware: Apple M4 Pro. Numbers can vary by runtime, hardware, and build settings. Scripts and results: https://github.com/finom/vovk-perf-test
