Metrics Middleware for Next.js

I'm using TypeScript and Next.js. I have router functions like POST. What is the best way I could add instrumentation code to emit metrics for things like latency, HTTP status code, etc.?

To instrument your TypeScript and Next.js application to emit metrics like latency and HTTP status codes, you can use middleware to intercept requests and responses. Here's a guide on how to achieve this:

  1. Install Dependencies: First, you'll need a metrics library. Prometheus is a popular choice, and you can use the prom-client library to integrate it with your Next.js application.

    npm install prom-client
    
  2. Create a Metrics Middleware: Create a middleware to collect metrics for each request.

    // lib/metrics.ts
    import { collectDefaultMetrics, Histogram, register } from 'prom-client';
    
    collectDefaultMetrics();
    
    const httpRequestDurationMicroseconds = new Histogram({
      name: 'http_request_duration_seconds',
      help: 'Duration of HTTP requests in seconds',
      labelNames: ['method', 'route', 'status_code'],
      buckets: [0.1, 0.2, 0.5, 1, 2, 5]
    });
    
    export { httpRequestDurationMicroseconds, register };
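
A side note on the `buckets` option above: a Prometheus histogram is cumulative, so one observation increments every bucket whose upper bound (`le`) is at or above the observed value. The sketch below imitates that bookkeeping in plain TypeScript purely for illustration; it is not prom-client code.

```typescript
// Cumulative-bucket semantics of a Prometheus histogram, in miniature.
// `buckets` mirrors the Histogram config above (values in seconds).
const buckets = [0.1, 0.2, 0.5, 1, 2, 5];

// Which `le` counters would be incremented for one observed latency?
function bucketsHit(seconds: number): number[] {
  return buckets.filter((le) => seconds <= le);
}

console.log(bucketsHit(0.35)); // a 350 ms request lands in le=0.5 and above
```

So a 350 ms request increments the `le="0.5"`, `le="1"`, `le="2"`, and `le="5"` counters (plus the implicit `+Inf`), which is what lets Prometheus estimate quantiles later with `histogram_quantile`.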
    
  3. Integrate Middleware in Your Next.js API Routes: Use the middleware in your API routes to measure request metrics.

    // pages/api/_middleware.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { httpRequestDurationMicroseconds } from '../../lib/metrics';
    
    const metricsMiddleware = (req: NextApiRequest, res: NextApiResponse, next: () => void) => {
      const end = httpRequestDurationMicroseconds.startTimer();
      const originalEnd = res.end;
    
      res.end = (...args: any[]) => {
        res.end = originalEnd;
        const response = res.end(...args);
        end({ method: req.method, route: req.url, status_code: res.statusCode });
        return response;
      };
    
      next();
    };
    
    export default metricsMiddleware;
    
  4. Apply Middleware to All Routes: Ensure that the middleware is applied to all your API routes.

    // pages/api/[...all].ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import metricsMiddleware from './_middleware';
    
    const handler = (req: NextApiRequest, res: NextApiResponse) => {
      metricsMiddleware(req, res, () => {
        // Your existing route logic here
        res.status(200).json({ message: 'Hello World' });
      });
    };
    
    export default handler;
    
  5. Expose Metrics Endpoint: Create an endpoint to expose the collected metrics.

    // pages/api/metrics.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { register } from '../../lib/metrics';
    
    export default async function handler(req: NextApiRequest, res: NextApiResponse) {
      res.setHeader('Content-Type', register.contentType);
      res.send(await register.metrics());
    }
    
  6. Start Collecting Metrics: Now, when you make API requests to your Next.js application, metrics will be collected and can be viewed at the /api/metrics endpoint.
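
What does the endpoint actually serve? Prometheus's plain-text exposition format: one sample per line, with labels in braces. The line below is hand-written for illustration (not captured from a real server), and the tiny parser just shows how to read one such line:

```typescript
// Illustrative only: a hand-written sample line in Prometheus's plain-text
// exposition format (the shape of what /api/metrics serves), plus a
// minimal parser for a single sample line.
const sample =
  'http_request_duration_seconds_bucket{method="GET",route="/api/hello",status_code="200",le="0.5"} 3';

function parseSample(line: string): { name: string; labels: Record<string, string>; value: number } {
  const m = line.match(/^(\w+)\{([^}]*)\}\s+(\S+)$/);
  if (!m) throw new Error('unparsable sample line');
  const labels: Record<string, string> = {};
  for (const pair of m[2].split(',')) {
    const [key, raw] = pair.split('=');
    labels[key] = raw.replace(/^"|"$/g, ''); // strip surrounding quotes
  }
  return { name: m[1], labels: labels, value: Number(m[3]) };
}

console.log(parseSample(sample));
```

A real scrape also contains `# HELP`/`# TYPE` comment lines and a `+Inf` bucket, which this toy parser ignores.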

Additional Tips

  • Custom Metrics: You can add more custom metrics as needed using different types of collectors provided by prom-client like Gauge, Counter, etc.
  • Prometheus Server: To visualize and analyze the metrics, set up a Prometheus server to scrape the /api/metrics endpoint.
  • Grafana: For better visualization, use Grafana to create dashboards based on the metrics collected by Prometheus.

By following these steps, you can effectively instrument your Next.js application to collect and emit metrics for monitoring and performance analysis.

NextJs doesn't support middleware

You're right that Next.js doesn't support Express-style middleware with a `next()` callback for API routes. However, you can still achieve instrumentation by wrapping your API route handlers with a higher-order function that measures metrics. Here's how:

  1. Install Dependencies: Make sure you have prom-client installed.

    npm install prom-client
    
  2. Create a Metrics Wrapper: Create a wrapper function to measure metrics for each request.

    // lib/metrics.ts
    import { collectDefaultMetrics, Histogram, register } from 'prom-client';
    
    collectDefaultMetrics();
    
    const httpRequestDurationMicroseconds = new Histogram({
      name: 'http_request_duration_seconds',
      help: 'Duration of HTTP requests in seconds',
      labelNames: ['method', 'route', 'status_code'],
      buckets: [0.1, 0.2, 0.5, 1, 2, 5],
    });
    
    export { httpRequestDurationMicroseconds, register };
    
  3. Create a Wrapper Function for API Routes: Use the wrapper function to measure metrics for your API routes.

    // lib/withMetrics.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { httpRequestDurationMicroseconds } from './metrics';
    
    const withMetrics = (handler: (req: NextApiRequest, res: NextApiResponse) => void) => {
      return async (req: NextApiRequest, res: NextApiResponse) => {
        const end = httpRequestDurationMicroseconds.startTimer();
        const originalEnd = res.end;
    
        res.end = (...args: any[]) => {
          res.end = originalEnd;
          const response = res.end(...args);
          end({ method: req.method, route: req.url, status_code: res.statusCode });
          return response;
        };
    
        await handler(req, res);
      };
    };
    
    export default withMetrics;
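
Before wrapping concrete routes, it may help to see the pattern in isolation. The sketch below strips away Next.js and prom-client so it runs anywhere: a higher-order function takes an async handler and returns a new handler that times each call, reporting through an `onDone` callback (a stand-in for the histogram):

```typescript
// Generic shape of the wrapper: take an async handler, return a new handler
// that times each call and reports the duration, even when the handler throws.
type Handler<A, R> = (arg: A) => Promise<R>;

function withTiming<A, R>(handler: Handler<A, R>, onDone: (ms: number) => void): Handler<A, R> {
  return async (arg: A) => {
    const start = process.hrtime.bigint();
    try {
      return await handler(arg);
    } finally {
      // Runs on both the success and the error path.
      onDone(Number(process.hrtime.bigint() - start) / 1e6);
    }
  };
}

// Usage sketch: wrap once, call many times.
const greet = withTiming(async (name: string) => `hello ${name}`, (ms) => console.log(`took ${ms}ms`));
greet('world').then(console.log); // logs "hello world" after the timing line
```

The `try`/`finally` is the important part: metrics get recorded whether the handler resolves or rejects, which a plain "record after await" would miss.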
    
  4. Wrap Your API Routes: Wrap your API route handlers with the withMetrics function.

    // pages/api/hello.ts
    import type { NextApiRequest, NextApiResponse } from 'next';
    import withMetrics from '../../lib/withMetrics';
    
    const handler = async (req: NextApiRequest, res: NextApiResponse) => {
      res.status(200).json({ message: 'Hello World' });
    };
    
    export default withMetrics(handler);
    
  5. Expose Metrics Endpoint: Create an endpoint to expose the collected metrics.

    // pages/api/metrics.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { register } from '../../lib/metrics';
    
    export default async function handler(req: NextApiRequest, res: NextApiResponse) {
      res.setHeader('Content-Type', register.contentType);
      res.send(await register.metrics());
    }
    
  6. Start Collecting Metrics: Now, when you make API requests to your Next.js application, metrics will be collected and can be viewed at the /api/metrics endpoint.

Summary

By wrapping your API route handlers with the withMetrics function, you can collect metrics for each request, including latency and HTTP status codes. This approach works within the constraints of Next.js and gives you a way to monitor and analyze the performance of your application.

If I have export async function POST(req: Request | NextRequest, res: NextApiResponse) {, how can I wrap that?

To wrap your POST function in Next.js with metrics collection, you can create a higher-order function to handle the instrumentation. Here's how:

  1. Update the Metrics Wrapper: Adjust the metrics wrapper to handle both Request and NextRequest types.

    // lib/withMetrics.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { httpRequestDurationMicroseconds } from './metrics';
    
    const withMetrics = (handler: (req: Request | NextApiRequest, res: NextApiResponse) => void) => {
      return async (req: Request | NextApiRequest, res: NextApiResponse) => {
        const end = httpRequestDurationMicroseconds.startTimer();
        const originalEnd = res.end;
    
        res.end = (...args: any[]) => {
          res.end = originalEnd;
          const response = res.end(...args);
          end({ method: req.method, route: req.url, status_code: res.statusCode });
          return response;
        };
    
        await handler(req, res);
      };
    };
    
    export default withMetrics;
    
  2. Wrap Your API Route: Use the withMetrics function to wrap your POST function.

    // app/api/yourRoute/route.ts
    // `Request` below is the global Fetch API type; no express import is needed.
    import type { NextApiResponse, NextApiRequest } from 'next';
    import withMetrics from '../../lib/withMetrics';
    
    async function postHandler(req: Request | NextApiRequest, res: NextApiResponse) {
      // Your existing logic here
      res.status(200).json({ message: 'Hello World' });
    }
    
    export const POST = withMetrics(postHandler);
    
  3. Metrics Endpoint: Ensure you have a metrics endpoint to expose the collected metrics.

    // pages/api/metrics.ts
    import { NextApiRequest, NextApiResponse } from 'next';
    import { register } from '../../lib/metrics';
    
    export default async function handler(req: NextApiRequest, res: NextApiResponse) {
      res.setHeader('Content-Type', register.contentType);
      res.send(await register.metrics());
    }
    
  4. Using NextRequest: If you need to handle both NextRequest and NextApiRequest, ensure your types are correctly handled.

    // app/api/yourRoute/route.ts
    import type { NextRequest } from 'next/server';
    import type { NextApiRequest, NextApiResponse } from 'next';
    import withMetrics from '../../lib/withMetrics';
    
    async function postHandler(req: Request | NextRequest | NextApiRequest, res: NextApiResponse) {
      // Your existing logic here
      res.status(200).json({ message: 'Hello World' });
    }
    
    export const POST = withMetrics(postHandler);
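
If one wrapper really must accept both request shapes at runtime, a small type guard can discriminate them. The helper below is my own illustration, not a Next.js API; it keys off the fact that fetch-style Request/NextRequest objects expose .json() and a Headers object with .get(), while NextApiRequest does not:

```typescript
// Hypothetical helper: distinguish a fetch-style Request (App Router) from a
// NextApiRequest (Pages Router) at runtime. Fetch requests have a .json()
// method and a Headers object with .get(); NextApiRequest has neither.
function isFetchRequest(req: unknown): req is Request {
  const r = req as Request;
  return typeof r?.json === 'function' && typeof r?.headers?.get === 'function';
}

console.log(isFetchRequest(new Request('http://localhost/api/hello')));
```

Inside the `if (isFetchRequest(req))` branch TypeScript then narrows `req` to `Request`, so `await req.json()` type-checks without casts.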
    

By following these steps, you can instrument your Next.js POST function to collect metrics for latency, HTTP status codes, and more, while accommodating different request types. This method provides a reusable way to add instrumentation across your Next.js API routes.

This is what I've got... route.ts


const handler = async (req: Request | NextRequest, res: NextApiResponse) => {
  const meter = opentelemetry.metrics.getMeter('app-ui', '0.1');
  meter.createCounter('test_counter').add(1, { foo: 'bar'  });
  console.log('creating counter', meter);
  

  try {
    const body = await req.json();
    const { showAnomalies, markInteresting, showIntent } = body;

    setCookie("showAnomalies", showAnomalies);
    setCookie("markInteresting", markInteresting);
    setCookie("showIntent", showIntent);

    return new Response(`"status": "ok"}`, {
      status: 200,
      headers: {
        "Content-Type": "application/json",
      },
    });
  } catch (e) {
    console.log("ERROR settings POST", e);
    return new Response("Error", { status: 500 });
  }
};

export const POST = withMetrics(handler);

with_metrics.ts

import { NextApiRequest, NextApiResponse } from 'next';
// import { httpRequestDurationMicroseconds } from './metrics';

import opentelemetry from '@opentelemetry/api';
import { NextRequest } from 'next/server';

const meter = opentelemetry.metrics.getMeterProvider().getMeter('app-ui');

export const requestLatency = meter.createHistogram('zhttp_request_latency', {
  description: 'Duration of HTTP requests in milliseconds',
});

export const requestCount = meter.createCounter('zhttp_request_count', {
  description: 'Count of HTTP requests',
});

export const errorCount = meter.createCounter('zhttp_request_errors', {
  description: 'Count of HTTP request errors',
});

export const successCount = meter.createCounter('zhttp_request_successes', {
  description: 'Count of successful HTTP requests',
});

export const withMetrics = (handler: (req: Request | NextRequest, res: NextApiResponse) => Promise<Response>) => {
  return async (req: Request | NextRequest, res: NextApiResponse) => {
    const start = process.hrtime();
    // const end = httpRequestDurationMicroseconds.startTimer();
    const originalEnd = res.end;

    res.end = (...args: any[]) => {
      res.end = originalEnd;
      const response = res.end(...args);
      const duration = process.hrtime(start);
      const durationInMillis = duration[0] * 1000 + duration[1] / 1e6;
      
      const route = req.url ? req.url : 'unknown_route';
      const method = req.method;
      const statusCode = res.statusCode;
      requestLatency.record(durationInMillis, {
        route,
        method,
        status_code: statusCode,
      });
  
      // Increment request count
      requestCount.add(1, {
        route,
        method,
        status_code: statusCode,
      });
  
      // Increment error or success count
      if (statusCode >= 400) {
        errorCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      } else {
        successCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      }

      // end({ method: req.method, route: req.url, status_code: res.statusCode });
      return response;
    };

    await handler(req, res);
  };
};

export default withMetrics;

I am getting this error

```
 ⨯ Error: No response is returned from route handler '/Users/ozten/Projects/deductive/app-ui/src/app/api/settings/route.ts'. Ensure you return a `Response` or a `NextResponse` in all branches of your handler.
    at /Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:63974
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async eU.execute (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:53964)
    at async eU.handle (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:65062)
    at async doRender (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:1334:42)
    at async cacheEntry.responseCache.get.routeKind (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:1556:28)
    at async DevServer.renderToResponseWithComponentsImpl (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:1464:28)
    at async DevServer.renderPageComponent (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:1861:24)
    at async DevServer.renderToResponseImpl (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:1899:32)
    at async DevServer.pipeImpl (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:912:25)
    at async NextNodeServer.handleCatchallRenderRequest (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/next-server.js:269:17)
    at async DevServer.handleRequestImpl (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/base-server.js:808:17)
    at async /Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/dev/next-dev-server.js:331:20
    at async Span.traceAsyncFn (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/trace/trace.js:151:20)
    at async DevServer.handleRequest (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
    at async invokeRender (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/lib/router-server.js:136:21)
    at async handleRequest (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/lib/router-server.js:315:24)
    at async requestHandlerImpl (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/lib/router-server.js:339:13)
    at async Server.requestListener (/Users/ozten/Projects/deductive/app-ui/node_modules/next/dist/server/lib/start-server.js:140:13)
```

To address this error, ensure that the `withMetrics` wrapper returns the `Response` produced by the handler: your wrapper awaits `handler(req, res)` but never returns its result, so Next.js sees `undefined`. Here's a revised version of the `withMetrics` function and the API route:

### Revised `withMetrics` Wrapper

1. **Ensure `withMetrics` Returns a `Response`**: Make sure the wrapper function correctly returns the `Response` object from the handler.

```typescript
// lib/withMetrics.ts
import { NextApiResponse } from 'next';
import { NextRequest } from 'next/server';
import opentelemetry from '@opentelemetry/api';

const meter = opentelemetry.metrics.getMeterProvider().getMeter('app-ui');

export const requestLatency = meter.createHistogram('zhttp_request_latency', {
  description: 'Duration of HTTP requests in milliseconds',
});

export const requestCount = meter.createCounter('zhttp_request_count', {
  description: 'Count of HTTP requests',
});

export const errorCount = meter.createCounter('zhttp_request_errors', {
  description: 'Count of HTTP request errors',
});

export const successCount = meter.createCounter('zhttp_request_successes', {
  description: 'Count of successful HTTP requests',
});

export const withMetrics = (
  handler: (req: Request | NextRequest, res: NextApiResponse) => Promise<Response>
) => {
  return async (req: Request | NextRequest, res: NextApiResponse) => {
    const start = process.hrtime();
    const originalEnd = res.end;

    res.end = (...args: any[]) => {
      const response = originalEnd.apply(res, args);
      const duration = process.hrtime(start);
      const durationInMillis = duration[0] * 1000 + duration[1] / 1e6;

      const route = req.url || 'unknown_route';
      const method = req.method || 'UNKNOWN_METHOD';
      const statusCode = res.statusCode;

      requestLatency.record(durationInMillis, {
        route,
        method,
        status_code: statusCode,
      });

      requestCount.add(1, {
        route,
        method,
        status_code: statusCode,
      });

      if (statusCode >= 400) {
        errorCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      } else {
        successCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      }

      return response;
    };

    const response = await handler(req, res);
    return response;
  };
};

export default withMetrics;
```

Revised API Route

  1. Ensure Handler Returns a Response: Make sure your handler returns the Response object.

```typescript
// app/api/settings/route.ts
// `Request` is the global Fetch API type; the express import is not needed.
// `setCookie` is assumed to be your own helper; import it from wherever it lives.
import type { NextApiResponse } from 'next';
import { NextRequest } from 'next/server';
import opentelemetry from '@opentelemetry/api'; // needed for getMeter below
import withMetrics from '../../../lib/withMetrics';

const handler = async (req: Request | NextRequest, res: NextApiResponse) => {
  const meter = opentelemetry.metrics.getMeter('app-ui', '0.1');
  meter.createCounter('test_counter').add(1, { foo: 'bar' });
  console.log('creating counter', meter);

  try {
    const body = await req.json();
    const { showAnomalies, markInteresting, showIntent } = body;

    setCookie('showAnomalies', showAnomalies);
    setCookie('markInteresting', markInteresting);
    setCookie('showIntent', showIntent);

    return new Response(
      JSON.stringify({ status: 'ok' }),
      {
        status: 200,
        headers: {
          'Content-Type': 'application/json',
        },
      }
    );
  } catch (e) {
    console.log('ERROR settings POST', e);
    return new Response('Error', { status: 500 });
  }
};

export const POST = withMetrics(handler);
```

Explanation

  • Returning Response: Ensure that the handler always returns a Response object, which is expected by Next.js.
  • Metrics Recording: The wrapper function withMetrics correctly records metrics and passes control back to the original handler.
  • Types: Properly handle types to ensure compatibility with both Request and NextRequest.

This should resolve the error and allow your route to correctly handle requests and responses while collecting metrics.

To fix the TS errors, I changed it to const response = originalEnd.apply(res, args as [any, BufferEncoding, (() => void) | undefined]);. Sound good?

Yes. Asserting (or declaring) the argument types for res.end resolves the TypeScript errors. Here's the updated version of your withMetrics function, written with a typed rest parameter, which achieves the same thing without a cast:

Updated withMetrics Wrapper

// lib/withMetrics.ts
import { NextApiResponse } from 'next';
import { NextRequest } from 'next/server';
import opentelemetry from '@opentelemetry/api';

const meter = opentelemetry.metrics.getMeterProvider().getMeter('app-ui');

export const requestLatency = meter.createHistogram('zhttp_request_latency', {
  description: 'Duration of HTTP requests in milliseconds',
});

export const requestCount = meter.createCounter('zhttp_request_count', {
  description: 'Count of HTTP requests',
});

export const errorCount = meter.createCounter('zhttp_request_errors', {
  description: 'Count of HTTP request errors',
});

export const successCount = meter.createCounter('zhttp_request_successes', {
  description: 'Count of successful HTTP requests',
});

export const withMetrics = (
  handler: (req: Request | NextRequest, res: NextApiResponse) => Promise<Response>
) => {
  return async (req: Request | NextRequest, res: NextApiResponse) => {
    const start = process.hrtime();
    const originalEnd = res.end;

    res.end = (...args: [any, BufferEncoding?, (() => void)?]) => {
      const response = originalEnd.apply(res, args);
      const duration = process.hrtime(start);
      const durationInMillis = duration[0] * 1000 + duration[1] / 1e6;

      const route = req.url || 'unknown_route';
      const method = req.method || 'UNKNOWN_METHOD';
      const statusCode = res.statusCode;

      requestLatency.record(durationInMillis, {
        route,
        method,
        status_code: statusCode,
      });

      requestCount.add(1, {
        route,
        method,
        status_code: statusCode,
      });

      if (statusCode >= 400) {
        errorCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      } else {
        successCount.add(1, {
          route,
          method,
          status_code: statusCode,
        });
      }

      return response;
    };

    const response = await handler(req, res);
    return response;
  };
};

export default withMetrics;

Updated API Route

Ensure that your handler always returns the Response object:

// app/api/settings/route.ts
// `Request` is the global Fetch API type; the express import is not needed.
import type { NextApiResponse } from 'next';
import { NextRequest } from 'next/server';
import opentelemetry from '@opentelemetry/api'; // needed for getMeter below
import withMetrics from '../../../lib/withMetrics';

const handler = async (req: Request | NextRequest, res: NextApiResponse) => {
  const meter = opentelemetry.metrics.getMeter('app-ui', '0.1');
  meter.createCounter('test_counter').add(1, { foo: 'bar' });
  console.log('creating counter', meter);

  try {
    const body = await req.json();
    const { showAnomalies, markInteresting, showIntent } = body;

    // Replace with your cookie setting logic, e.g., using `setCookie` function
    // setCookie('showAnomalies', showAnomalies);
    // setCookie('markInteresting', markInteresting);
    // setCookie('showIntent', showIntent);

    return new Response(
      JSON.stringify({ status: 'ok' }),
      {
        status: 200,
        headers: {
          'Content-Type': 'application/json',
        },
      }
    );
  } catch (e) {
    console.log('ERROR settings POST', e);
    return new Response('Error', { status: 500 });
  }
};

export const POST = withMetrics(handler);

Explanation

  • Typed Arguments: The rest parameter of the patched res.end is now declared as [any, BufferEncoding?, (() => void)?], so TypeScript knows which argument types are forwarded to the original function via apply.
  • Metrics Recording: The withMetrics function continues to record request metrics and passes control back to the original handler.

This should resolve TypeScript errors and ensure your route correctly handles requests and responses while collecting metrics.
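
One caveat worth noting: in an App Router `route.ts` file, a handler receives only a Request/NextRequest and returns a Response; Next.js never passes it a Node `res`, so the patched `res.end` above may never fire. Below is a hedged sketch of an alternative that avoids patching entirely, under the assumption that you pass in whatever recording function your metrics backend provides (the OpenTelemetry instruments defined above would work):

```typescript
// Sketch (my own, not from the Next.js docs): metrics for an App Router route
// handler. There is no Node `res` to patch; the handler receives a fetch-style
// Request and must return a Response, so we time the call and read the status
// off the returned Response instead.
type RouteHandler = (req: Request) => Promise<Response>;
type Labels = { route: string; method: string; status_code: number };

export function withMetricsAppRouter(
  handler: RouteHandler,
  record: (ms: number, labels: Labels) => void, // e.g. feed requestLatency and the counters
): RouteHandler {
  return async (req: Request) => {
    const start = process.hrtime.bigint();
    let status = 500; // reported if the handler throws before returning
    try {
      const response = await handler(req);
      status = response.status;
      return response;
    } finally {
      // Runs on both the success and the throw path.
      record(Number(process.hrtime.bigint() - start) / 1e6, {
        route: new URL(req.url).pathname,
        method: req.method,
        status_code: status,
      });
    }
  };
}
```

A route file could then export something like `export const POST = withMetricsAppRouter(handler, (ms, labels) => { requestLatency.record(ms, labels); requestCount.add(1, labels); });`, reusing the instruments from with_metrics.ts.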
