
How to Integrate Sentry for Advanced Monitoring in 2026


Introduction

Sentry is a widely adopted application observability platform that goes far beyond basic logs, capturing errors, slowdowns, and user flows in real time. In 2026, with full-stack apps built on Next.js 15+, integrating Sentry is no longer optional: it is essential for debugging production issues without downtime. This advanced tutorial walks you through a complete implementation: frontend, backend, distributed tracing, and custom metrics. Imagine a user clicks, an error occurs, and Sentry captures the stack trace, session context, and API performance metrics in one dashboard. For senior developers, this can sharply reduce MTTR (Mean Time To Resolution). We cover native TypeScript configs, auto-instrumentation, and tips for keeping your issue stream actionable. Ready to turn alerts into actionable insights?

Prerequisites

  • Node.js 20+ and npm/yarn/pnpm
  • Next.js 15+ with App Router (TypeScript)
  • Free Sentry.io account (or self-hosted)
  • Advanced knowledge of React Server Components and middleware
  • Vercel or Node server for deployment

Project Setup and SDK Installation

terminal
npx create-next-app@latest sentry-app --typescript --tailwind --eslint --app --src-dir --import-alias "@/*"
cd sentry-app
npm install @sentry/nextjs@latest
npm install --save-dev @sentry/webpack-plugin@latest
npm run dev

These commands set up a modern Next.js project and install the official Sentry SDK, optimized for the App Router. The Webpack plugin handles automatic sourcemap uploads during builds. Run the dev server to verify everything starts; avoid outdated SDK versions that break tree-shaking.

Basic Project Configuration

Create a Sentry project via the dashboard (Organization > New Project > Next.js). Grab your DSN (Data Source Name) and auth token. Add these env vars to .env.local: NEXT_PUBLIC_SENTRY_DSN=your_dsn and SENTRY_AUTH_TOKEN=your_token. This enables auto-instrumentation for client and server errors. For advanced use, configure the release option to correlate builds with errors.
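Under these assumptions (placeholder values from above, never committed to version control), a minimal .env.local looks like:

```bash
# .env.local -- keep out of version control
NEXT_PUBLIC_SENTRY_DSN=your_dsn   # not secret; exposed to the browser bundle
SENTRY_AUTH_TOKEN=your_token      # build-time only; must stay server-side
```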

Next.js Configuration with Sentry Plugin

next.config.js
const { withSentryConfig } = require('@sentry/nextjs');

const nextConfig = {
  experimental: {
    ppr: true,
  },
  transpilePackages: [],
};

const sentryWebpackPluginOptions = {
  silent: true,
  org: "your-org",
  project: "your-project",
};

module.exports = withSentryConfig(nextConfig, sentryWebpackPluginOptions);

This next.config.js wires the Sentry plugin into the build (npm run build) for sourcemap uploads and automatic releases. Replace org and project with your Sentry values. Pitfall: without silent: true, build logs get noisy. PPR (Partial Prerendering) can stay enabled; it does not conflict with the plugin.

Client-Side Sentry Config (Browser Errors)

src/sentry.client.config.ts
import * as Sentry from '@sentry/nextjs';

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,

  // Trace 100% of transactions (lower this in production)
  tracesSampleRate: 1.0,

  // Profile 20% of the sampled transactions
  profilesSampleRate: 0.2,

  // Profiling and session replay integrations
  integrations: [
    Sentry.browserProfilingIntegration(),
    Sentry.replayIntegration({
      maskAllText: false,
      blockAllMedia: true,
    }),
  ],

  // Adjust tags before a transaction event is sent
  beforeSendTransaction: (event) => {
    event.tags = { ...event.tags, env: 'prod' };
    return event;
  },
});

Initializes Sentry on the client with profiling and session replay. tracesSampleRate: 1.0 traces everything in dev; lower it in prod. beforeSendTransaction customizes events for filtering/tagging. Avoid maskAllText: true if you need text for debugging.
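Instead of a fixed tracesSampleRate, Sentry also accepts a tracesSampler function that decides per transaction. A minimal sketch of such a decision function (the rates and the /healthcheck path are illustrative assumptions, not Sentry defaults):

```typescript
// Decide a sampling rate per transaction: full tracing outside prod,
// drop health-check probes, keep 10% of real traffic in prod.
export function sampleRate(env: string, transactionName: string): number {
  if (env !== 'production') return 1.0;                   // trace everything in dev/staging
  if (transactionName.includes('/healthcheck')) return 0; // never trace health probes
  return 0.1;                                             // 10% of everything else
}

// Wiring it up in Sentry.init (sketch):
//   tracesSampler: (ctx) =>
//     sampleRate(process.env.NODE_ENV ?? 'development', ctx.name ?? '')
```

Keeping the decision logic in a pure function like this makes the sampling policy unit-testable.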

Server Instrumentation and Middleware

Important: you do not need to import these configs manually in layout.tsx. With withSentryConfig, the SDK picks up sentry.client.config.ts automatically, and the server config is loaded through the Next.js instrumentation hook. For distributed tracing, a middleware can capture API spans, propagating traces from client to server; this is especially useful with microservices.
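With Sentry SDK v8+, the server config is typically pulled in through Next.js's instrumentation hook rather than a manual import. A minimal sketch (the edge config file is an assumption, mirroring the server config for the edge runtime):

```typescript
// instrumentation.ts (project root, or src/ when using --src-dir)
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    await import('./sentry.server.config');
  }
  if (process.env.NEXT_RUNTIME === 'edge') {
    // assumes a sentry.edge.config.ts mirroring the server config
    await import('./sentry.edge.config');
  }
}
```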

Server-Side Sentry Config and Tracing Middleware

src/middleware.ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
import * as Sentry from '@sentry/nextjs';

// Note: Sentry.init does not belong here; the middleware (edge) runtime
// is initialized via the Sentry config files.

export function middleware(request: NextRequest) {
  // Wrap the request in a server span to capture API timing
  return Sentry.startSpan(
    { op: 'http.server', name: request.nextUrl.pathname },
    () => {
      // Name the transaction after the route being served
      Sentry.getCurrentScope().setTransactionName(request.nextUrl.pathname);
      return NextResponse.next();
    }
  );
}

export const config = {
  // Exclude static assets from tracing
  matcher: '/((?!_next/static|_next/image|favicon.ico|.*\\.pdf|.*\\.svg).*)',
};

This middleware wraps requests in Sentry spans for end-to-end tracing. startSpan captures timings automatically; the matcher excludes static assets. Initialization stays in the Sentry config files, so the middleware only creates spans and names the transaction. Ideal for spotting API bottlenecks.
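Sentry's startSpan API also works inside route handlers for finer-grained timing of individual steps. A sketch, where fetchUsers is a hypothetical data-access helper:

```typescript
// src/app/api/users/route.ts -- a child span around a slow step
import * as Sentry from '@sentry/nextjs';
import { NextResponse } from 'next/server';

declare function fetchUsers(): Promise<unknown[]>; // hypothetical helper

export async function GET() {
  // The span measures everything inside the callback, including the await
  return Sentry.startSpan({ op: 'db.query', name: 'fetch users' }, async () => {
    const users = await fetchUsers();
    return NextResponse.json(users);
  });
}
```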

Dedicated Server Config

src/sentry.server.config.ts
import * as Sentry from '@sentry/nextjs';

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,
  tracesSampleRate: 1.0,

  // Auto-instrument outgoing HTTP and enrich stacktraces
  integrations: [
    Sentry.httpIntegration(),
    Sentry.contextLinesIntegration(),
    Sentry.localVariablesIntegration(),
  ],

  beforeSend(event) {
    // Scrub tokens that may leak into stacktrace filenames
    event.exception?.values?.forEach((exc) => {
      exc.stacktrace?.frames?.forEach((frame) => {
        frame.filename = frame.filename?.replace(/auth_token=[^\s&]+/, 'auth_token=***');
      });
    });
    return event;
  },
});

Enables HTTP tracing plus richer stacktraces (source context lines, local variables) on the server; the SDK's OpenTelemetry-based auto-instrumentation is applied on top of this. Beware of declaring the integrations key twice in one object: in JavaScript, the second silently overwrites the first. beforeSend scrubs sensitive data like tokens. Avoid overload: keep tracesSampleRate below 0.5 for high-traffic services.

Adding User Context and Breadcrumbs

To contextualize errors, use Sentry.setUser and addBreadcrumb in components. This ties issues to real sessions and can dramatically speed up resolution.

Usage Example with Context in a Page

src/app/page.tsx
import * as Sentry from '@sentry/nextjs';

import { headers } from 'next/headers';

export default async function Home() {
  // headers() is async in Next.js 15
  const headerList = await headers();
  const userId = headerList.get('x-user-id') || 'anonymous';

  // Attach user context
  Sentry.setUser({ id: userId, ip_address: '{{auto}}' });
  Sentry.setTag('feature', 'dashboard');

  // Breadcrumb for user actions
  Sentry.addBreadcrumb({
    category: 'auth',
    message: 'User login attempt',
    level: 'info',
  });

  // Simulated error for testing
  if (Math.random() < 0.1) {
    throw new Error('Test Sentry capture');
  }

  return <div>Welcome! Errors monitored by Sentry.</div>;
}

Injects user context and breadcrumbs into a React Server Component, which runs on the server by default (no 'use server' directive needed; that directive marks Server Actions). Test with the random error; in production, wrap async operations instead. Pitfall: in Next.js 15, headers() returns a promise, so forgetting to await it loses the context.
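Errors thrown during rendering can also be caught at the app boundary and forwarded to Sentry. A sketch of a global error page (the fallback markup is illustrative):

```typescript
// src/app/global-error.tsx -- reports uncaught render errors
'use client';

import * as Sentry from '@sentry/nextjs';
import { useEffect } from 'react';

export default function GlobalError({ error }: { error: Error }) {
  useEffect(() => {
    // Forward the uncaught error to Sentry when the boundary mounts
    Sentry.captureException(error);
  }, [error]);

  // global-error replaces the root layout, so it must render <html>/<body>
  return (
    <html>
      <body>
        <h2>Something went wrong.</h2>
      </body>
    </html>
  );
}
```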

Best Practices

  • Smart sampling: tracesSampleRate: 0.1 in prod, 1.0 in staging; use beforeSend to filter noise.
  • Releases & Sourcemaps: Always tag builds (sentry-cli releases new v1.0) for original stacktraces.
  • Custom alerts: Set up Slack/Teams via Sentry UI for key metrics (LCP >2s).
  • PII Scrubbing: Enable globally + custom beforeSend for GDPR.
  • Profiling: Limit to 20% (profilesSampleRate) to avoid quota limits.
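PII scrubbing from the list above is easiest to get right when written as a pure, unit-testable function. A minimal sketch (the regex and the redaction token are illustrative choices, not Sentry defaults):

```typescript
// Redact email-like strings from an error message before it is sent.
export function scrubEmails(message: string): string {
  return message.replace(/[\w.+-]+@[\w-]+\.[\w.-]+/g, '[redacted-email]');
}

// Usage sketch: call this on event.message (and exception values)
// inside beforeSend, then return the modified event.
```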

Common Errors to Avoid

  • Misplaced secrets: the DSN is not secret and may use the NEXT_PUBLIC_ prefix, but SENTRY_AUTH_TOKEN must never carry it, or the token ships to the browser.
  • No sourcemaps: Build without plugin → unreadable minified stacktraces.
  • Incomplete tracing: Forget middleware → disconnected client/server spans.
  • Over-sampling: 100% traces in prod can burn through your monthly event quota fast.
  • Environment mix-ups: set environment: process.env.NODE_ENV (and debug: false) so dev noise stays out of your prod issue stream.
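The last two points combine into an environment-aware init. A sketch under the same env-var assumptions as earlier sections:

```typescript
import * as Sentry from '@sentry/nextjs';

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,
  environment: process.env.NODE_ENV,              // separates dev/prod issues in the UI
  debug: false,                                   // silence SDK internal logs
  enabled: process.env.NODE_ENV === 'production', // don't send events from local dev
  tracesSampleRate: process.env.NODE_ENV === 'production' ? 0.1 : 1.0,
});
```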

Next Steps

Dive deeper with OpenTelemetry + Sentry for microservices. Integrate Prometheus/Grafana for hybrid metrics. Check out our Learni observability courses: Next.js Expert and Advanced DevOps. Join the Learni Discord community for live Q&A.