kinetex - v0.0.2

    The universal HTTP client for the modern JavaScript ecosystem.
    Zero runtime dependencies. Every runtime. Every style.

    kinetex is a batteries-included fetch wrapper for modern JavaScript runtimes (Node 18+, browsers, Deno, Bun, Cloudflare Workers). It sits above fetch and optionally undici, adding the things you would otherwise wire up yourself:

    • Built-in auth (Basic, Bearer, OAuth 2.0, AWS SigV4, Digest, API key)
    • Stale-while-revalidate cache with custom store support
    • Retry with exponential back-off
    • Request deduplication
    • SSE streaming with auto-reconnect
    • GraphQL helper, HAR recording, cookie jar, proxy middleware
    • A composable middleware pipeline
    • End-to-end TypeScript generics

    What it is not: a low-level HTTP client. It does not reimplement HTTP/2, connection pooling, or TLS — those come from undici (optional peer dep) or the platform's native fetch. If you need raw socket control or maximum throughput without any abstraction overhead, use undici directly.

    See benchmarks/http-comparison.mjs for an honest performance comparison.


    Kinetex is engineered for zero-cost abstraction. By leveraging a highly optimized middleware pipeline and the undici engine, it provides a "luxury" feature set (Auth, Retries, Cache, Hooks) with performance that rivals or beats native low-level clients.

    Measured on Node.js v22.22.1 (Small JSON payload, 500 measured + 50 warmup, Concurrency: 1)

    Client              Req/s   Mean Latency   p99 Latency   Heap Δ (KB/req)
    Kinetex + Undici    585     1.71 ms        2.73 ms        4.5 KB
    Axios               570     1.75 ms        3.71 ms       16.6 KB
    Node-Fetch          518     1.93 ms        3.92 ms       14.5 KB
    Native Fetch        511     1.96 ms        4.94 ms       21.6 KB
    Kinetex (Standard)  509     1.96 ms        4.86 ms        5.4 KB
    Got                 508     1.97 ms        2.73 ms       18.5 KB
    Ky                  485     2.06 ms        2.61 ms       23.2 KB
    Superagent          386     2.59 ms        4.42 ms       22.9 KB

    The Kinetex pipeline adds only ~0.007ms of overhead per request. You get advanced features like AWS SigV4, automatic retries, and request deduplication for essentially zero CPU tax. Once the V8 JIT compiler optimizes the pipeline, the overhead becomes practically immeasurable.

    Kinetex uses 75% less memory per request than native fetch and 70% less than Axios. In high-traffic production environments, this significantly reduces Garbage Collection (GC) pressure, leading to lower CPU spikes and a smaller infrastructure footprint.

    With a p99 latency of 2.73ms, Kinetex is more predictable under load than native fetch (4.94ms). This prevents the random "lag spikes" often seen in microservices when using standard promise-based wrappers.

    By reducing heap allocation by 15-18KB per request compared to competitors, Kinetex allows your containers to handle more concurrent traffic with less RAM, directly lowering your cloud computing costs.


    Note: While undici (direct) and node:http (raw) are faster, they lack the high-level features (middleware, automatic JSON parsing, easy auth) that Kinetex provides. Kinetex aims to be the fastest feature-complete HTTP client for the Node.js ecosystem.

    Run the benchmarks yourself:

    node --expose-gc benchmarks/http-comparison.mjs
    


    npm install kinetex
    

    Optional peer dependencies: install only what you need — kinetex works without any of them.

    npm install undici              # HTTP/2 + HTTP/3 in Node.js (auto-detected)
    npm install zod                 # Schema validation
    npm install valibot             # Alternative schema validation
    npm install socks-proxy-agent   # SOCKS5 proxy support

    import kinetex from 'kinetex';

    // Simple GET
    const { data } = await kinetex.get<User>('https://api.example.com/users/1');
    console.log(data.name);

    // POST with JSON body
    const { data: created } = await kinetex.post('https://api.example.com/users', {
      json: { name: 'Alice', email: 'alice@example.com' },
    });

    // Fluent chain
    const { data: users } = await kinetex
      .chain('https://api.example.com/users')
      .bearer('my-token')
      .query({ page: 1, limit: 20 })
      .timeout(5000)
      .retry(3);

    kinetex works identically across all major JavaScript runtimes. Import it the same way everywhere.

    import kinetex from 'kinetex';   // ESM ✅

    const kinetex = require('kinetex'); // CJS ✅

    const { data } = await kinetex.get('https://api.example.com/users');

    // Deno
    import kinetex, { create } from 'npm:kinetex';
    // or via JSR:
    import kinetex from 'jsr:@kinetexjs/kinetex';

    const { data } = await kinetex.get('https://api.example.com/users');

    Note: proxyMiddleware() has no effect in Deno — configure proxies at the OS level or use the HTTPS_PROXY environment variable.

    // Bun — identical to Node.js
    import kinetex from 'kinetex';

    // Cloudflare Workers
    import { create, auth } from 'kinetex';

    const baseApi = create({ baseURL: 'https://api.example.com', timeout: 5000 });

    export default {
      async fetch(request: Request, env: Env): Promise<Response> {
        const api = baseApi.extend(auth.bearer(env.API_TOKEN));
        const { data } = await api.get('/users');
        return Response.json(data);
      },
    };

    interface Env { API_TOKEN: string; }

    <!-- unpkg -->
    <script src="https://unpkg.com/kinetex/dist/browser/kinetex.min.js"></script>

    <!-- jsDelivr -->
    <script src="https://cdn.jsdelivr.net/npm/kinetex/dist/browser/kinetex.min.js"></script>

    <script>
      kinetex.get('https://api.example.com/data').then(({ data }) => console.log(data));
    </script>

    ESM in the browser:

    <script type="module">
    import kinetex from 'https://cdn.jsdelivr.net/npm/kinetex/dist/browser/kinetex.esm.min.js';
    const { data } = await kinetex.get('https://api.example.com/posts');
    </script>

    The default style — clean, typed, predictable.

    import kinetex from 'kinetex';

    // GET shorthand (call instance directly)
    const { data } = await kinetex<User[]>('https://api.example.com/users');

    // HTTP verb methods
    const { data } = await kinetex.get<User>('/users/1');
    const { data } = await kinetex.post('/users', { json: { name: 'Alice' } });
    const { data } = await kinetex.put('/users/1', { json: { name: 'Bob' } });
    const { data } = await kinetex.patch('/users/1', { json: { email: 'bob@example.com' } });
    const { data } = await kinetex.delete('/users/1');
    const { data } = await kinetex.head('/users');
    const { data } = await kinetex.options('/users');

    Build requests incrementally. The chain is a Promise; await it when ready.

    const { data } = await kinetex
      .chain('https://api.example.com/users')
      .method('POST')
      .bearer('my-token')
      .header('X-Request-ID', crypto.randomUUID())
      .json({ name: 'Alice' })
      .timeout(5000)
      .retry(3);

    Chain method reference:

    Method               Description
    .method(m)           HTTP method
    .header(k, v)        Add a single header
    .headers(obj)        Merge multiple headers
    .query(params)       URL search params
    .send(body)          Raw body
    .json(data)          JSON body — sets Content-Type: application/json
    .form(data)          URL-encoded form body
    .multipart(data)     multipart/form-data (FormData or plain record)
    .auth(value)         Authorization header verbatim
    .bearer(token)       Authorization: Bearer <token>
    .basic(user, pass)   Authorization: Basic <base64>
    .timeout(ms)         Request timeout
    .retry(n)            Retry count
    .accept(type)        Accept header
    .type(ct)            Content-Type header
    .schema(v)           Validate response with zod / valibot / custom
    .signal(s)           AbortSignal for cancellation
    .cancel()            Returns an AbortController wired to this request — call .abort() on it when you want to cancel
    .onUpload(fn)        Upload progress callback
    .onDownload(fn)      Download progress callback
    .as<U>()             Re-type the response generic

    // Multipart upload
    const fd = new FormData();
    fd.append('file', fileBlob, 'avatar.png');
    await kinetex.chain('/upload').multipart(fd);

    // Or from a plain record (auto-converted to FormData)
    await kinetex.chain('/upload').multipart({ file: fileBlob, caption: 'photo' });

    // Cancel a request — call controller.abort() when you actually want to cancel.
    // cancel() does NOT abort immediately; the request fires lazily on first .then()
    // so you need the controller to exist before it starts.
    const chain = kinetex.chain('https://api.example.com/long-poll');
    const controller = chain.cancel(); // wires the signal — NOT aborted yet
    // ... start the request (first .then() triggers dispatch)
    const promise = chain.then((res) => res);
    // ... later:
    controller.abort(); // now it cancels

    Drop-in compatible with the request package callback style.

    kinetex.callback('https://api.example.com/data', {}, (err, res, data) => {
      if (err) return console.error(err);
      console.log(data);
    });

    Create isolated instances with their own defaults, middleware, and interceptors.

    import { create } from 'kinetex';

    const api = create({
      baseURL: 'https://api.example.com',
      timeout: 10_000,
      retry: { limit: 3, delay: (n) => 100 * 2 ** n },
      headers: { 'X-App': 'my-app/1.0' },
    });

    // Scoped sub-instance (inherits parent defaults)
    const usersApi = api.create({ baseURL: 'https://api.example.com/users' });

    // Extend with middleware — returns a new instance, does not mutate
    const authedApi = api.extend(auth.bearer('secret'));

    // Add middleware in-place — mutates this instance
    api.use(myLoggingMiddleware);

    // Cancel ALL in-flight requests on this instance (e.g. on component unmount)
    api.cancelAll();
    // The instance resets automatically — subsequent requests work normally

    // Simple: single number applies to the whole request lifecycle
    await api.get('/data', { timeout: 5000 });

    // Granular: separate timeouts for connection vs full response
    await api.get('/data', {
      timeout: {
        request: 30_000, // abort if total request takes > 30s
        response: 5_000, // abort if no first byte within 5s (TTFB guard)
      },
    });
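To make the granular timeout concrete, here is a sketch (an assumption about the mechanics, not kinetex's internals) of how a request-wide timer and a TTFB timer can share one AbortController: the "response" timer is cleared as soon as the first byte arrives, while the "request" timer covers the whole lifecycle.

```typescript
// Sketch: wiring { request, response } timeouts onto one AbortSignal.
// Names (createTimeoutSignal, onFirstByte) are illustrative, not kinetex API.
function createTimeoutSignal(timeout: { request?: number; response?: number }) {
  const controller = new AbortController();
  const timers: ReturnType<typeof setTimeout>[] = [];
  if (timeout.request !== undefined) {
    // Total-lifecycle timer: fires regardless of progress
    timers.push(setTimeout(() => controller.abort(new Error('request timeout')), timeout.request));
  }
  let ttfbTimer: ReturnType<typeof setTimeout> | undefined;
  if (timeout.response !== undefined) {
    // TTFB guard: must be cleared when the first chunk arrives
    ttfbTimer = setTimeout(() => controller.abort(new Error('response timeout')), timeout.response);
    timers.push(ttfbTimer);
  }
  return {
    signal: controller.signal,              // pass to fetch()
    onFirstByte: () => clearTimeout(ttfbTimer), // call on first received chunk
    cleanup: () => timers.forEach((t) => clearTimeout(t)),
  };
}
```

The key design point is that both timers abort the same signal, so the transport only ever has to watch one `AbortSignal`.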

    Every request method accepts a RequestConfig object. All fields are optional except url.

    interface RequestConfig<T = unknown> {
      url: string;
      method?: string; // default: 'GET'

      // ── Headers & Body ─────────────────────────────────────────────────────
      headers?: Record<string, string> | [string, string][] | Headers;
      body?: BodyInit;  // raw body
      json?: unknown;   // serialised + Content-Type: application/json
      form?: Record<string, string | number | boolean>; // application/x-www-form-urlencoded
      searchParams?: Record<string, string | number | boolean | string[]> | URLSearchParams | string;

      // ── Response ───────────────────────────────────────────────────────────
      responseType?: 'json' | 'text' | 'blob' | 'arrayBuffer' | 'formData' | 'stream';
      schema?: ZodSchema | ValibotSchema | { parse(d: unknown): T }; // validates body
      throwHttpErrors?: boolean; // default: true
      decompress?: boolean;      // default: true — set false to get raw compressed bytes

      // ── Network ────────────────────────────────────────────────────────────
      baseURL?: string;
      timeout?: number | {
        request?: number;  // total request timeout (ms)
        response?: number; // TTFB timeout — aborts if first byte not received in time
      };
      retry?: number | RetryConfig;
      cache?: CacheConfig | false;
      dedupe?: boolean; // default: true for GET
      signal?: AbortSignal;
      credentials?: RequestCredentials;
      followRedirects?: boolean; // default: true
      maxRedirects?: number;     // default: 10
      transport?: 'fetch' | 'undici'; // force transport

      // ── Callbacks ──────────────────────────────────────────────────────────
      onUploadProgress?: (event: ProgressEvent) => void; // Node + browser
      onDownloadProgress?: (event: ProgressEvent) => void;

      // ── Observability ──────────────────────────────────────────────────────
      hooks?: HookConfig;
      logger?: Logger;
      har?: boolean;
    }

    Every request resolves to a KinetexResponse<T>:

    const res = await kinetex.get<User>('/users/1');

    res.data // T — parsed body
    res.status // number — HTTP status code
    res.statusText // string
    res.headers // Headers — response headers
    res.response // Response — original fetch Response (for streaming)
    res.request // RequestConfig — originating request
    res.fromCache // boolean — true if served from cache
    res.retries // number — retry attempts made
    res.timing // { start, end, duration, ttfb, ... }
    res.harEntry // HarEntry — present when har: true

    import { auth, create } from 'kinetex';

    // Basic auth
    create().extend(auth.basic('username', 'password'));

    // Bearer token (static)
    create().extend(auth.bearer('my-token'));

    // Bearer token (async — refreshed on each request)
    create().extend(auth.bearer(async () => getAccessToken()));

    // API key in a header
    create().extend(auth.apiKey('key-value', { header: 'X-API-Key' }));

    // API key as a query param
    create().extend(auth.apiKey('key-value', { query: 'api_key' }));

    // OAuth 2.0 client credentials — auto-fetches and refreshes tokens
    create().extend(auth.oauth2({
      tokenUrl: 'https://auth.example.com/oauth/token',
      clientId: process.env.CLIENT_ID,
      clientSecret: process.env.CLIENT_SECRET,
      scope: 'read:users write:posts',
      onToken: (token) => console.log('New token, expires in:', token.expires_in),
    }));

    // AWS SigV4 — signed using Web Crypto, no extra dependency
    create().extend(auth.aws({
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      sessionToken: process.env.AWS_SESSION_TOKEN, // optional
      region: 'us-east-1',
      service: 's3',
    }));

    // Digest auth — RFC 2617 compliant MD5 (nginx, Apache compatible)
    create().extend(auth.digest('username', 'password'));
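For reference, the header values that Basic and Bearer auth produce are standard and easy to verify by hand. This sketch shows the format (kinetex builds these for you via auth.basic / auth.bearer; the helper names here are illustrative):

```typescript
// Illustrative only: the Authorization header values behind auth.basic / auth.bearer.
function basicHeader(user: string, pass: string): string {
  // RFC 7617: base64 of "user:pass"
  const encoded = Buffer.from(`${user}:${pass}`, 'utf8').toString('base64');
  return `Basic ${encoded}`;
}

function bearerHeader(token: string): string {
  // RFC 6750: the token is sent verbatim
  return `Bearer ${token}`;
}

console.log(basicHeader('Aladdin', 'open sesame'));
// → Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==
```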

    kinetex integrates with any schema library that exposes .parse() or .safeParse().

    import { z } from 'zod';
    // import * as z from 'zod';
    import kinetex from 'kinetex';

    const UserSchema = z.object({
      id: z.number(),
      name: z.string(),
      email: z.string().email(),
    });

    // data is typed and validated at runtime — throws ValidationError on mismatch
    const { data } = await kinetex.get('https://api.example.com/users/1', {
      schema: UserSchema,
    });

    console.log(data.name); // string ✅

    // valibot works the same way
    import * as v from 'valibot';

    const UserSchema = v.object({ id: v.number(), name: v.string() });
    const { data } = await kinetex.get('/users/1', { schema: UserSchema });

    Any object with .parse() or .safeParse() works:

    const { data } = await kinetex.get('/users/1', {
      schema: {
        parse: (raw) => {
          if (!raw || typeof raw !== 'object') throw new Error('Invalid');
          return raw as User;
        },
      },
    });

    // Schema on the fluent chain
    const { data } = await kinetex
      .chain('/users/1')
      .schema(UserSchema);

    Built-in in-memory cache with stale-while-revalidate support. Works with all response types.

    The default cache store is per-instance — each create() call gets its own isolated store, so cache entries from one instance never bleed into another. To share a cache across instances, pass a custom store explicitly.

    // Cache for 60s, serve stale for up to 5 more minutes while revalidating in the background
    const { data, fromCache } = await kinetex.get('/api/config', {
      cache: { ttl: 60_000, swr: 300_000 },
    });
    console.log('from cache?', fromCache);

    // Opt out for one request
    await api.get('/realtime-data', { cache: false });

    // Custom cache key
    await api.get('/users', {
      cache: {
        ttl: 30_000,
        key: (req) => `v2:${req.url}`,
      },
    });

    // Custom store (e.g. Redis), shared across instances
    import { create } from 'kinetex';

    const api = create({
      cache: {
        ttl: 30_000,
        store: {
          get: (key) => redis.get(key).then((v) => v ? JSON.parse(v) : undefined),
          set: (key, entry) => redis.setex(
            key,
            Math.ceil((entry.expiresAt - Date.now()) / 1000),
            JSON.stringify(entry)
          ),
          delete: (key) => redis.del(key),
          clear: () => redis.flushdb(),
        },
      },
    });
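The ttl/swr windows partition a cached entry's lifetime into three states. This sketch (an assumption about the decision rule, not kinetex source) makes the boundaries explicit: fresh entries are served without a network call, stale entries are served while a background revalidation runs, and expired entries force a fetch.

```typescript
// Sketch of the stale-while-revalidate decision for a cache entry,
// given its creation time and the { ttl, swr } windows from CacheConfig.
type CacheState = 'fresh' | 'stale' | 'expired';

function cacheState(createdAt: number, ttl: number, swr: number, now = Date.now()): CacheState {
  const age = now - createdAt;
  if (age <= ttl) return 'fresh';       // serve from cache, no network
  if (age <= ttl + swr) return 'stale'; // serve from cache, revalidate in background
  return 'expired';                     // must hit the network
}
```

With `{ ttl: 60_000, swr: 300_000 }` an entry is fresh for its first minute, stale-but-servable for the following five minutes, and expired after that.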

    const { data } = await kinetex.get('/api/data', {
      retry: {
        limit: 4,
        statusCodes: [429, 500, 502, 503, 504],
        methods: ['GET', 'POST'],
        delay: (attempt) => Math.min(200 * 2 ** attempt, 10_000), // exponential back-off
        onNetworkError: true,
        onRetry: (attempt, err, req) => console.warn(`Retry ${attempt} for ${req.url}:`, err),
      },
    });

    // Shorthand — retries 3 times with defaults
    const { data } = await kinetex.get('/api/data', { retry: 3 });
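The delay function in the config above produces a capped exponential schedule. Evaluating it directly is a quick way to sanity-check a back-off curve before shipping it:

```typescript
// The capped exponential back-off from the retry config above.
const delay = (attempt: number) => Math.min(200 * 2 ** attempt, 10_000);

// Delays for attempts 0..6 (ms); the cap kicks in at attempt 6.
const schedule = [0, 1, 2, 3, 4, 5, 6].map(delay);
console.log(schedule); // [200, 400, 800, 1600, 3200, 6400, 10000]
```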

    Axios-compatible interceptor API. Interceptors run as part of the request pipeline.

    import { create } from 'kinetex';

    const api = create({ baseURL: 'https://api.example.com' });

    // Request interceptor — add a tracing header to every request
    const reqId = api.interceptors.request.use(async (config) => ({
      ...config,
      headers: { ...config.headers, 'X-Request-ID': crypto.randomUUID() },
    }));

    // Response interceptor — unwrap a nested payload
    const resId = api.interceptors.response.use((response) => ({
      ...response,
      data: response.data?.payload ?? response.data,
    }));

    // Remove individual interceptors
    api.interceptors.request.eject(reqId);
    api.interceptors.response.eject(resId);

    Fine-grained hooks for observability, modification, and error handling. Set per-request or on an instance.

    const { data } = await kinetex.get('/api/data', {
      hooks: {
        // Modify the request before it is sent
        beforeRequest: [
          (config) => ({ ...config, url: config.url + '?source=kinetex' }),
        ],
        // Inspect or transform the response
        afterResponse: [
          (response) => {
            console.log(`${response.status} in ${response.timing.duration}ms`);
            return response;
          },
        ],
        // React to any error (does not suppress it)
        onError: [
          (err) => metrics.increment('http.error', { status: err.response?.status }),
        ],
      },
    });

    The full power of kinetex — compose arbitrary request/response transforms into the pipeline.

    import { create, compose } from 'kinetex';
    import type { Middleware } from 'kinetex';

    // Write a middleware
    const timingMiddleware: Middleware = async (request, next) => {
      const start = Date.now();
      const response = await next(request);
      console.log(`${request.method} ${request.url}: ${Date.now() - start}ms`);
      return response;
    };

    const api = create({ baseURL: 'https://api.example.com' });
    api.use(timingMiddleware);

    // Compose multiple middlewares manually
    const pipeline = compose([timingMiddleware, authMiddleware], coreHandler);

    Server-Sent Events with automatic reconnection, Last-Event-ID tracking, and support for the server-specified retry: interval.

    import { sse } from 'kinetex/plugins';

    const controller = new AbortController();
    setTimeout(() => controller.abort(), 30_000); // 30s limit

    for await (const event of sse('https://api.example.com/events', {
      signal: controller.signal,
      headers: { Authorization: 'Bearer my-token' },
      maxRetries: 10, // Infinity by default
    })) {
      console.log(`[${event.event ?? 'message'}]`, event.data);
      if (event.id) console.log('event id:', event.id);
      if (event.event === 'done') break;
    }

    SSE event fields:

    interface SSEEvent {
      data: string;   // event data
      event?: string; // event type (from "event:" line)
      id?: string;    // event id (from "id:" line)
      retry?: number; // retry interval in ms (from "retry:" line)
    }
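These fields map directly onto the SSE wire format defined in the WHATWG HTML spec. A minimal parser for a single event block (a sketch only; the real plugin also handles chunk boundaries, reconnection, and Last-Event-ID) shows where each field comes from:

```typescript
// Parse one SSE event block (lines separated by \n) into SSEEvent-shaped fields.
function parseSSEBlock(block: string): { data: string; event?: string; id?: string; retry?: number } {
  const out: { data: string; event?: string; id?: string; retry?: number } = { data: '' };
  const dataLines: string[] = [];
  for (const line of block.split('\n')) {
    if (line.startsWith(':')) continue; // comment line, ignored
    const idx = line.indexOf(':');
    const field = idx === -1 ? line : line.slice(0, idx);
    let value = idx === -1 ? '' : line.slice(idx + 1);
    if (value.startsWith(' ')) value = value.slice(1); // spec: strip one leading space
    if (field === 'data') dataLines.push(value);       // multiple data: lines join with \n
    else if (field === 'event') out.event = value;
    else if (field === 'id') out.id = value;
    else if (field === 'retry' && /^\d+$/.test(value)) out.retry = Number(value);
  }
  out.data = dataLines.join('\n');
  return out;
}
```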

    Type-safe GraphQL requests with automatic error detection.

    import { create } from 'kinetex';
    import { graphqlPlugin } from 'kinetex/plugins';

    const api = create({ baseURL: 'https://api.example.com' });

    interface UserData { user: { id: string; name: string } }
    interface UserVars { id: string }

    const config = graphqlPlugin<UserVars, UserData>('/graphql', {
      query: `
        query GetUser($id: ID!) {
          user(id: $id) { id name }
        }
      `,
      variables: { id: '42' },
      operationName: 'GetUser',
    });

    const { data } = await api.post(config.url, config);
    // Throws automatically if data.errors is present
    console.log(data.data?.user.name);

    const form = new FormData();
    form.append('title', 'My Upload');
    form.append('file', fileBlob, 'document.pdf');

    const { data } = await kinetex.post('/upload', { body: form });

    In Node.js, onUploadProgress wraps the request body in a progress-tracking ReadableStream that emits events as undici reads each chunk. In Bun, upload progress uses Bun's native fetch streaming. In the browser, kinetex automatically falls back to XMLHttpRequest to provide real upload progress events.

    import type { ProgressEvent as KinetexProgressEvent } from 'kinetex';

    await kinetex.post('/upload', {
      body: fileBuffer,
      onUploadProgress: (event: KinetexProgressEvent) => {
        console.log(`${event.percent?.toFixed(0)}% — ${(event.bytesPerSecond / 1024).toFixed(1)} KB/s`);
      },
    });

    await kinetex.get('/large-file.zip', {
      responseType: 'blob',
      onDownloadProgress: ({ loaded, total, percent }) => {
        progressBar.value = percent ?? 0;
      },
    });

    ProgressEvent fields:

    interface ProgressEvent {
      loaded: number;              // bytes transferred so far
      total: number | undefined;   // total bytes (undefined if unknown)
      percent: number | undefined; // 0–100 (undefined if total unknown)
      transferredBytes: number;    // same as loaded
      bytesPerSecond: number;      // current transfer speed
    }

    Record all HTTP traffic as a HAR 1.2 file for debugging, testing, or replaying.

    import { create } from 'kinetex';
    import { writeFileSync } from 'node:fs';

    const api = create();

    await api.get('https://api.example.com/users', { har: true });
    await api.post('https://api.example.com/posts', { json: { title: 'test' }, har: true });

    const har = api.exportHAR();
    // har.log.version === '1.2'
    // har.log.entries — array of HarEntry

    writeFileSync('trace.har', JSON.stringify(har, null, 2));
    // Open in Chrome DevTools → Network → Import HAR

    RFC 6265 compliant cookie jar — no dependencies.

    import { create } from 'kinetex';
    import { cookieJar, withCookies } from 'kinetex/plugins';

    const jar = cookieJar();
    const api = create().use(withCookies(jar));

    // Cookies from Set-Cookie headers are stored automatically
    await api.get('https://example.com/login');

    // Stored cookies are sent automatically on subsequent requests
    await api.get('https://example.com/dashboard');

    // Manual jar operations
    const all = jar.getAll(); // all stored cookies
    const header = jar.getCookieString('https://example.com/'); // "name=value; ..."
    jar.delete('session', 'example.com', '/'); // remove one cookie
    jar.clear(); // remove all
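Under RFC 6265, whether a stored cookie is sent depends (among other things) on domain-matching between the request host and the cookie's domain attribute. The core rule is small enough to sketch (illustrative; the jar also checks path, expiry, and Secure):

```typescript
// RFC 6265 §5.1.3 domain-matching sketch: a cookie domain matches the
// request host if they are equal, or the host is a subdomain of it.
function domainMatches(host: string, cookieDomain: string): boolean {
  const h = host.toLowerCase();
  const d = cookieDomain.toLowerCase().replace(/^\./, ''); // leading dot is ignored
  return h === d || h.endsWith('.' + d);
}
```

Note that `badexample.com` does not match `example.com`; the suffix check requires a dot boundary.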

    Node.js and Bun only. Calling proxy middleware in browser, edge, or Deno environments logs a console.warn and falls through — there is no silent failure.

    import { create } from 'kinetex';
    import { proxyMiddleware, envProxy } from 'kinetex/plugins';

    // Explicit HTTP/HTTPS proxy
    const api = create().use(proxyMiddleware({
      url: 'http://proxy.corp.internal:3128',
      auth: { username: 'user', password: 'pass' },
      noProxy: ['internal.corp.com', '.local'],
    }));

    // Read from environment variables (HTTP_PROXY, HTTPS_PROXY, NO_PROXY)
    const api2 = create().use(envProxy());

    // SOCKS5 proxy — requires: npm install socks-proxy-agent
    const api3 = create().use(proxyMiddleware({
      url: 'socks5://proxy.example.com:1080',
      protocol: 'socks5',
    }));

    ProxyConfig fields:

    interface ProxyConfig {
      url?: string; // proxy URL
      protocol?: 'http' | 'https' | 'socks5';
      auth?: { username: string; password: string };
      noProxy?: string[]; // hostnames / patterns to bypass
      headers?: Record<string, string>; // extra headers sent to proxy
    }
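The noProxy list follows the common NO_PROXY convention: bare hostnames match exactly, and entries starting with a dot match any subdomain. A sketch of that matching rule (my assumption about the semantics, matching the example above where .local bypasses the proxy):

```typescript
// Sketch of noProxy matching: '.local' matches any *.local host,
// 'internal.corp.com' matches only that exact host.
function bypassesProxy(host: string, noProxy: string[]): boolean {
  const h = host.toLowerCase();
  return noProxy.some((pattern) => {
    const p = pattern.toLowerCase();
    if (p.startsWith('.')) return h.endsWith(p) || h === p.slice(1); // suffix pattern
    return h === p; // exact host
  });
}
```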

    import { create } from 'kinetex';
    import { concurrencyLimit, rateLimit } from 'kinetex/plugins';

    const api = create({ baseURL: 'https://api.example.com' })
      .use(concurrencyLimit(5))                              // max 5 in-flight requests
      .use(rateLimit({ requestsPerSecond: 10, burst: 20 })); // token-bucket rate limiter

    rateLimit options:

    Option              Type     Description
    requestsPerSecond   number   Sustained rate
    burst               number   Max burst size (default: requestsPerSecond)
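A token bucket with these two knobs works as follows: tokens refill continuously at requestsPerSecond and the bucket holds at most burst tokens, so short bursts up to `burst` are allowed while the long-run rate stays bounded. A minimal sketch (not kinetex's implementation; in the real middleware a depleted bucket delays the request rather than rejecting it):

```typescript
// Token-bucket sketch matching the rateLimit options above.
function tokenBucket(requestsPerSecond: number, burst = requestsPerSecond) {
  let tokens = burst; // start full: an initial burst is allowed
  let last = 0;       // timestamp of the last refill (ms)
  return {
    tryAcquire(now = Date.now()): boolean {
      // Refill proportionally to elapsed time, capped at the burst size
      tokens = Math.min(burst, tokens + ((now - last) / 1000) * requestsPerSecond);
      last = now;
      if (tokens >= 1) { tokens -= 1; return true; }
      return false; // bucket empty: caller should wait and retry
    },
  };
}
```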

    Protect against unexpectedly large responses. Checks Content-Length header first, then falls back to measuring the actual body.

    import { create } from 'kinetex';
    import { responseSizeLimit, ResponseSizeError } from 'kinetex/plugins';

    const api = create().use(responseSizeLimit(5 * 1024 * 1024)); // 5 MB limit

    try {
      const { data } = await api.get('/large-file');
    } catch (err) {
      if (err instanceof ResponseSizeError) {
        console.log(`Too large: ${err.actualBytes} bytes (limit: ${err.maxBytes})`);
      }
    }

    npm install undici
    

    When undici is installed, kinetex automatically uses it in Node.js for HTTP/2 and HTTP/3 support. Bun uses its own optimized native fetch. Falls back to native fetch transparently when unavailable.

    Connections to each origin are pooled via a persistent undici.Pool (10 connections per origin, keep-alive 30s). This means multiple requests to the same host reuse the same TCP/TLS/HTTP2 session — true multiplexing, not just HTTP/2 framing.

    // Force native fetch for one request (e.g. when streaming to a non-undici consumer)
    const { data } = await kinetex.get('/api', { transport: 'fetch' });

    // Force undici for all requests on this instance
    const api = create({ transport: 'undici' });

    // Opt out of automatic response decompression (get raw compressed bytes)
    const { data } = await kinetex.get('/compressed', { decompress: false });

    Note: connection pooling is per-process, not per-instance. All kinetex instances share the same undici.Pool for a given origin.


    All errors extend KinetexError. Catch the specific subclass you care about.

    import { HTTPError, TimeoutError, ValidationError, KinetexError } from 'kinetex';

    try {
      await kinetex.get('https://api.example.com/protected');
    } catch (err) {
      if (err instanceof HTTPError) {
        console.log(err.response?.status); // 401, 404, 429 ...
        console.log(err.response?.data);   // parsed error body
        console.log(err.request.url);      // originating URL
      } else if (err instanceof TimeoutError) {
        console.log('Timed out:', err.message);
      } else if (err instanceof ValidationError) {
        console.log('Schema mismatch:', err.validationError);
      } else if (err instanceof KinetexError) {
        console.log('kinetex error:', err.originalCause);
      }
    }

    // Or disable throwing and branch on the status yourself
    const res = await kinetex.get('/api/data', { throwHttpErrors: false });
    if (res.status === 404) {
      console.log('not found');
    }

    Error types:

    Class               When thrown
    KinetexError        Base class for all errors
    HTTPError           Non-2xx response (when throwHttpErrors: true)
    TimeoutError        Request exceeded timeout
    ValidationError     Response failed schema validation
    ResponseSizeError   Response exceeded responseSizeLimit

    Identical in-flight GET requests on the same instance are collapsed into a single network call. All callers receive the same response.

    const api = create({ baseURL: 'https://api.example.com' });

    // All three fire simultaneously — only one HTTP request is made
    const [r1, r2, r3] = await Promise.all([
      api.get('/config'),
      api.get('/config'),
      api.get('/config'),
    ]);

    // Opt out per-request
    await api.get('/always-fresh', { dedupe: false });

    Deduplication state is instance-scoped — two separate create() calls never share inflight maps.

    Authorization-aware: the dedup key includes the Authorization header value, so two requests to the same URL but with different credentials (e.g. different user tokens in a multi-tenant server) are never collapsed into the same in-flight request.
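Conceptually, this means the in-flight map is keyed on more than the URL. A sketch of an Authorization-aware key (the exact key format is an internal detail of kinetex; this just illustrates the property described above):

```typescript
// Sketch: dedup key covering method, URL, and the Authorization header,
// so requests with different credentials never collapse together.
function dedupeKey(method: string, url: string, authHeader?: string): string {
  // NUL separator avoids accidental collisions between the parts
  return [method.toUpperCase(), url, authHeader ?? ''].join('\u0000');
}
```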


    Attach a logger to an instance or individual request for structured observability.

    import { create } from 'kinetex';
    import type { Logger } from 'kinetex';

    const logger: Logger = {
      request: (config) => console.log(`→ ${config.method} ${config.url}`),
      response: (res) => console.log(`← ${res.status} ${res.timing.duration}ms`),
      error: (err) => console.error(`✗ ${err.message}`),
    };

    // On an instance
    const api = create({ logger });

    // Or per-request
    await kinetex.get('/api/data', { logger });

    git clone https://github.com/kinetexjs/kinetex.git
    cd kinetex && npm install

    npm run typecheck # TypeScript — 0 errors
    npm run lint # ESLint — 0 warnings
    npm run build # ESM + CJS + types + browser bundles

    npm test # ~149 tests, 0 failures (core suite)
    npm run test:all # ~460 tests, 0 failures (full suite)
    npm run test:coverage # coverage report (≥ 95% ESM line coverage)
    npm run bench # benchmark vs native fetch and node:http (add more: npm install --no-save axios got ky undici)

    # Run individual runtimes
    npm run test:bun
    npm run test:deno

    # Generate local API docs
    npm run docs
    open docs/index.html

    # Create a release
    node scripts/release.mjs patch # or minor / major
    git push && git push --tags

    MIT © Qasim Ali