
Introducing Vovk.ts — A Back-End Framework Native to Next.js

For almost three years, I’ve been building a project that started as an attempt to bring the structured back-end experience of NestJS to Next.js API routes. Along the way I shared a few early previews, but the framework never felt quite ready for a proper introduction. Today, that changes.

Vovk.ts is a back-end meta-framework built natively on top of Next.js App Router. It turns Route Handlers into a structured API layer — Controller, Service, Procedure — and automatically generates type-safe RPC clients, OpenAPI specifications, and AI tool definitions from your code.

Conceptually, it distills the best ideas from NestJS, tRPC, ts-rest, and other tools I’ve admired, and combines them into a single, cohesive developer experience.

Why I built this

I’ve been writing code for over 20 years, and clean, intuitive APIs still bring me genuine joy. That instinct shaped every design decision in Vovk.ts:

  • Minimal API surface — a small set of exports that do exactly what you need, so you’re never overwhelmed.
  • Contract-less procedures — validation and type inference are defined in-place, right next to the handler. No separate contract files or DTO classes to keep in sync.
  • RPC meets REST — the ergonomics of an RPC call with proper RESTful semantics underneath, following the Controller–Service pattern with full in-service type inference.
  • AI-native — procedures are automatically derivable as LLM tools, so your API is callable by agents and MCP servers out of the box.
  • Proven performance — O(1) routing overhead across 1 to 10,000 controllers, with median latency under 1.5 µs per request on modern hardware.
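The O(1) figure is plausible because decorator-registered handlers can be resolved with a single hash-map lookup instead of scanning route patterns one by one. Here is a minimal sketch of that idea — not Vovk.ts internals; the `register` and `dispatch` helpers are illustrative only:

```ts
// Illustrative only: constant-time dispatch via a Map keyed on "METHOD /path".
// A real router (including Vovk's) also normalizes dynamic segments like
// '{id}' before lookup; the point here is that lookup cost does not grow
// with the number of registered controllers.
type Handler = (params: Record<string, string>) => unknown;

const routes = new Map<string, Handler>();

function register(method: string, path: string, handler: Handler) {
  routes.set(`${method} ${path}`, handler);
}

function dispatch(method: string, path: string): Handler | undefined {
  // One hash lookup, whether 1 or 10,000 routes are registered.
  return routes.get(`${method} ${path}`);
}

// Registering 10,000 routes does not slow down a single lookup.
for (let i = 0; i < 10_000; i++) {
  register('GET', `/users/${i}`, () => ({ id: i }));
}
```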

Core features

In-place procedure definition. Define validation schemas for body, query, params, and output using Zod/Arktype/Valibot right where the handler lives. Types flow to the generated client automatically — no manual syncing required.

Auto-generated, type-safe RPC clients. The CLI emits a fetch-powered TypeScript client that mirrors your controllers exactly. Jump-to-definition, JSDoc hover, and optional client-side Ajv validation all work out of the box.

AI tool derivation. Call deriveTools() with your controllers or RPC modules and get back a set of LLM-callable tools — complete with names, descriptions, and JSON Schema parameters. The same mechanism powers MCP servers with support for text, JSON, image, and audio outputs, all in a few lines of code.

Local procedure execution. Every procedure exposes a .fn() method that runs in the current context — no HTTP round-trip. This unlocks SSR, PPR, background jobs, and AI agent execution, all using the same handler logic.

Back-end segmentation. Each route.ts compiles into a separate serverless function. You can assign each segment its own runtime (Node.js or Edge) and maxDuration, and since each route.ts compiles independently, only the code it imports ends up in that function’s bundle. The codegen can produce a single composed client or independent per-segment clients, so you control exactly what code ships where.
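The per-segment runtime and maxDuration settings map onto Next.js's standard route segment config exports. A hedged sketch — the file path and segment name are assumptions for illustration, and the Vovk controller wiring is elided:

```ts
// app/api/edge-segment/[[...vovk]]/route.ts — path is illustrative.
// These are standard Next.js route segment config exports: this route.ts
// compiles into its own serverless function, pinned to the Edge runtime
// with a 30-second execution cap.
export const runtime = 'edge';
export const maxDuration = 30;

// Controller registration for this segment would follow here.
```

Because segment config lives in the `route.ts` itself, a heavy Node.js-only segment and a latency-sensitive Edge segment can coexist in one deployment without affecting each other's bundles.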

First-class JSON Lines streaming. Stream data with async function* generators on the server and consume on the client with disposable async iterators. Ideal for LLM token streaming and progressive data loading.
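Independent of Vovk's client API, the JSON Lines wire format itself is simple: each item is one JSON document terminated by a newline. A self-contained illustration — the `encodeJsonLines` and `decodeJsonLines` helpers are hypothetical stand-ins for plumbing that Vovk.ts generates for you:

```ts
// Server side: an async generator yields values one at a time.
async function* tokenStream(): AsyncGenerator<{ token: string }> {
  for (const token of ['Hello', ' ', 'world']) {
    yield { token };
  }
}

// Encode each yielded value as one newline-terminated JSON document.
// (A real server flushes each line to the response as it is produced.)
async function encodeJsonLines(gen: AsyncIterable<unknown>): Promise<string> {
  let out = '';
  for await (const item of gen) {
    out += JSON.stringify(item) + '\n';
  }
  return out;
}

// Client side: split the stream on newlines and parse each line.
function* decodeJsonLines(payload: string): Generator<unknown> {
  for (const line of payload.split('\n')) {
    if (line.trim() !== '') yield JSON.parse(line);
  }
}
```

Because every line is independently parseable, the client can start iterating before the stream completes — which is what makes this shape a good fit for LLM token streaming.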

Full OpenAPI support. Procedures automatically emit an OpenAPI 3.x specification with Scalar-compatible code samples.

Third-party API integration. Import external OpenAPI specs as “mixins” and get the same type-safe client interface for third-party APIs. Your own back-end and external services live side by side in a single, unified client.

Multi-language client generation (experimental). Beyond TypeScript, the codegen can produce clients in Python and Rust — complete with typed models and client-side validation.

Multitenancy support. Built-in subdomain routing lets you serve customer.example.com and admin.example.com from a single Next.js deployment.

What it looks like

A controller with a decorator maps directly to a Route Handler:

```ts
export default class UserController {
  @get('{id}')
  static async getUser(req: NextRequest, { id }: { id: string }) {
    // ...
  }
}
```

With procedure, you add validation and type inference in-place:

```ts
export default class UserController {
  @get('{id}')
  static getUser = procedure({
    params: z.object({
      id: z.string().uuid(),
    }),
  }).handle(async (req, { id }) => {
    // ...
  });
}
```

Services infer their parameter types directly from the controller:

```ts
import type { VovkParams } from 'vovk';
import type UserController from './UserController';

export default class UserService {
  static async getUserById(id: VovkParams<typeof UserController.getUser>['id']) {
    // ...
  }
}
```

…and the controller delegates to the service:

```ts
import UserService from './UserService';

export default class UserController {
  @get('{id}')
  static getUser = procedure({ /* ... */ }).handle(async (req, { id }) => {
    return UserService.getUserById(id);
  });
}
```

The generated client mirrors the controller interface exactly:

```ts
import { UserRPC, PetstoreAPI } from 'vovk-client';

const user = await UserRPC.getUser({ params: { id: '123' } });
const pet = await PetstoreAPI.getPetById({ params: { petId: 1 } });
```

Derive AI tools from controllers and API modules alike:

```ts
const { tools } = deriveTools({ modules: { UserRPC, TaskController, PetstoreAPI } });
console.log(tools); // [{ name, description, parameters, execute }, ...]
```

Execute procedures locally for SSR — no HTTP overhead:

```ts
await UserController.getUser.fn({ params: { id: '123' } });
```

What makes it different

Existing tools each solve part of the problem. tRPC gives you type-safe clients but no OpenAPI. ts-rest requires a shared contract layer. NestJS runs its own HTTP server and doesn’t fit the serverless model natively. Vovk.ts sits at the intersection:

  • Your code is the spec. Define once, generate everywhere — TypeScript clients, OpenAPI docs, AI tools, Python and Rust clients.
  • Next.js native. No separate server process. Every segment is a serverless function with full App Router support, including Edge Runtime.
  • Zero glue code for AI. The same procedures that power your API are directly callable by LLM agents and MCP servers.

Get started

Check out the documentation to get up and running, or jump straight into the GitHub repository. I hope Vovk.ts makes your work a little more enjoyable — it certainly has been a labor of love to build.
