feat(nextjs): Use Vercel's waitUntil to defer freezing of Vercel Lambdas #12133

Merged

5 commits merged on May 27, 2024
7 changes: 3 additions & 4 deletions packages/nextjs/src/common/_error.ts
@@ -1,6 +1,7 @@
import { captureException, withScope } from '@sentry/core';
import type { NextPageContext } from 'next';
import { flushQueue } from './utils/responseEnd';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

type ContextOrProps = {
req?: NextPageContext['req'];
@@ -53,7 +54,5 @@ export async function captureUnderscoreErrorException(contextOrProps: ContextOrP
});
});

// In case this is being run as part of a serverless function (as is the case with the server half of nextjs apps
// deployed to vercel), make sure the error gets sent to Sentry before the lambda exits.
await flushQueue();
vercelWaitUntil(flushSafelyWithTimeout());
Member

hmm, could we possibly still encapsulate this in some flushQueue util or so? Feels a bit "dirty" to have all these vercelXXX calls peppered throughout the code base, I think I'd prefer it if we could abstract this away a bit 🤔

Member Author

tbh it feels more dirty to me to mangle the Vercel API with our own intentions. vercelWaitUntil is basically just the vendored waitUntil. Also, it's not like this is unreadable or anything.

Member

We literally call vercelWaitUntil(flushSafelyWithTimeout()); everywhere we call this, as far as I can see? 😅

I think it would be nicer to still just call flushQueue everywhere (which can now literally be: function flushQueue() { vercelWaitUntil(flushSafelyWithTimeout()); }). If we need to adjust this logic later (e.g. it does more or less or whatever), we can encapsulate that change there instead of repeating it everywhere - we call this 9 times now?

But no strong feelings, if you prefer it this way all good for me :)

Member Author

I don't see the upside so I'll leave it as is.

}
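For illustration, the reviewer's suggested encapsulation would look roughly like the sketch below. The helper is hypothetical and was not added in this PR; the import paths mirror the ones used in _error.ts above.

```ts
// Hypothetical helper sketching the review suggestion above (not part of this PR).
// Call sites could keep calling flushQueue() instead of repeating both calls.
import { flushSafelyWithTimeout } from './utils/responseEnd';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

export function flushQueue(): void {
  // Hand the flush promise to Vercel so the lambda stays alive until flushing settles.
  vercelWaitUntil(flushSafelyWithTimeout());
}
```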
5 changes: 0 additions & 5 deletions packages/nextjs/src/common/types.ts
@@ -48,11 +48,6 @@ export type VercelCronsConfig = { path?: string; schedule?: string }[] | undefin
export type NextApiHandler = {
(req: NextApiRequest, res: NextApiResponse): void | Promise<void> | unknown | Promise<unknown>;
__sentry_route__?: string;

/**
* A property we set in our integration tests to simulate running an API route on platforms that don't support streaming.
*/
__sentry_test_doesnt_support_streaming__?: true;
};

export type WrappedNextApiHandler = {
9 changes: 6 additions & 3 deletions packages/nextjs/src/common/utils/edgeWrapperUtils.ts
@@ -12,8 +12,9 @@ import {
import { winterCGRequestToRequestData } from '@sentry/utils';

import type { EdgeRouteHandler } from '../../edge/types';
import { flushQueue } from './responseEnd';
import { flushSafelyWithTimeout } from './responseEnd';
import { commonObjectToIsolationScope, escapeNextjsTracing } from './tracingUtils';
import { vercelWaitUntil } from './vercelWaitUntil';

/**
* Wraps a function on the edge runtime with error and performance monitoring.
@@ -80,9 +81,11 @@ export function withEdgeWrapping<H extends EdgeRouteHandler>(

return handlerResult;
},
).finally(() => flushQueue());
);
},
);
).finally(() => {
vercelWaitUntil(flushSafelyWithTimeout());
});
});
});
};

This file was deleted.

8 changes: 6 additions & 2 deletions packages/nextjs/src/common/utils/responseEnd.ts
@@ -44,10 +44,14 @@ export function finishSpan(span: Span, res: ServerResponse): void {
span.end();
}

/** Flush the event queue to ensure that events get sent to Sentry before the response is finished and the lambda ends */
export async function flushQueue(): Promise<void> {
/**
* Flushes pending Sentry events with a 2 second timeout and in a way that cannot create unhandled promise rejections.
*/
export async function flushSafelyWithTimeout(): Promise<void> {
try {
DEBUG_BUILD && logger.log('Flushing events...');
// We give things that are currently stuck in event processors a tiny bit more time to finish before flushing. 50ms was chosen very unscientifically.
await new Promise(resolve => setTimeout(resolve, 50));
await flush(2000);
DEBUG_BUILD && logger.log('Done flushing events');
} catch (e) {
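Since the hunk above is cut off inside the catch clause, here is a sketch of how the complete helper plausibly reads. The catch body and the exact import paths are assumptions, inferred from the doc comment's guarantee that the function cannot create unhandled rejections and from how neighbouring files in this PR import from @sentry/core and @sentry/utils.

```ts
import { flush } from '@sentry/core';
import { logger } from '@sentry/utils';
import { DEBUG_BUILD } from '../debug-build';

/**
 * Sketch of the full helper: flush with a 2 second timeout and swallow any error
 * so callers can fire-and-forget the returned promise without unhandled rejections.
 */
export async function flushSafelyWithTimeout(): Promise<void> {
  try {
    DEBUG_BUILD && logger.log('Flushing events...');
    // Give event processors a tiny bit more time to finish before flushing.
    await new Promise(resolve => setTimeout(resolve, 50));
    await flush(2000);
    DEBUG_BUILD && logger.log('Done flushing events');
  } catch (e) {
    // Assumption: errors are only logged, never rethrown.
    DEBUG_BUILD && logger.log('Error while flushing events:\n', e);
  }
}
```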
21 changes: 21 additions & 0 deletions packages/nextjs/src/common/utils/vercelWaitUntil.ts
@@ -0,0 +1,21 @@
import { GLOBAL_OBJ } from '@sentry/utils';

interface VercelRequestContextGlobal {
get?(): {
waitUntil?: (task: Promise<unknown>) => void;
};
}

/**
* Function that delays closing of a Vercel lambda until the provided promise is resolved.
*
* Vendored from https://www.npmjs.com/package/@vercel/functions
*/
export function vercelWaitUntil(task: Promise<unknown>): void {
const vercelRequestContextGlobal: VercelRequestContextGlobal | undefined =
// @ts-expect-error This is not typed
GLOBAL_OBJ[Symbol.for('@vercel/request-context')];

const ctx = vercelRequestContextGlobal?.get?.() ?? {};
ctx.waitUntil?.(task);
}
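vercelWaitUntil works by looking up the request context that Vercel's runtime stores on the global object under Symbol.for('@vercel/request-context') and handing it the promise; when that global is absent (for example locally or on other hosts), the optional chaining makes the call a no-op. Below is a minimal sketch of stubbing that global, e.g. in a test, assuming the VercelRequestContextGlobal interface above; the task-collecting array is purely illustrative.

```ts
import { GLOBAL_OBJ } from '@sentry/utils';
import { vercelWaitUntil } from './vercelWaitUntil';

// Promises the stubbed "runtime" was asked to keep the lambda alive for.
const keptAliveTasks: Promise<unknown>[] = [];

// @ts-expect-error -- the symbol-keyed property is not typed on GLOBAL_OBJ
GLOBAL_OBJ[Symbol.for('@vercel/request-context')] = {
  get: () => ({
    waitUntil: (task: Promise<unknown>): void => {
      keptAliveTasks.push(task);
    },
  }),
};

// This promise is now recorded instead of being handed to the real Vercel runtime.
vercelWaitUntil(Promise.resolve('flushed'));
```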
16 changes: 7 additions & 9 deletions packages/nextjs/src/common/utils/wrapperUtils.ts
@@ -15,9 +15,9 @@ import {
import type { Span } from '@sentry/types';
import { isString } from '@sentry/utils';

import { platformSupportsStreaming } from './platformSupportsStreaming';
import { autoEndSpanOnResponseEnd, flushQueue } from './responseEnd';
import { autoEndSpanOnResponseEnd, flushSafelyWithTimeout } from './responseEnd';
import { commonObjectToIsolationScope, escapeNextjsTracing } from './tracingUtils';
import { vercelWaitUntil } from './vercelWaitUntil';

declare module 'http' {
interface IncomingMessage {
@@ -124,15 +124,14 @@ export function withTracedServerSideDataFetcher<F extends (...args: any[]) => Pr
throw e;
} finally {
dataFetcherSpan.end();
if (!platformSupportsStreaming()) {
await flushQueue();
}
}
},
);
});
});
});
}).finally(() => {
vercelWaitUntil(flushSafelyWithTimeout());
});
};
}
@@ -198,10 +197,9 @@ export async function callDataFetcherTraced<F extends (...args: any[]) => Promis
throw e;
} finally {
dataFetcherSpan.end();
if (!platformSupportsStreaming()) {
await flushQueue();
}
}
},
);
).finally(() => {
vercelWaitUntil(flushSafelyWithTimeout());
});
}
15 changes: 3 additions & 12 deletions packages/nextjs/src/common/withServerActionInstrumentation.ts
@@ -9,9 +9,9 @@ import { logger } from '@sentry/utils';

import { DEBUG_BUILD } from './debug-build';
import { isNotFoundNavigationError, isRedirectNavigationError } from './nextNavigationErrorUtils';
import { platformSupportsStreaming } from './utils/platformSupportsStreaming';
import { flushQueue } from './utils/responseEnd';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import { escapeNextjsTracing } from './utils/tracingUtils';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

interface Options {
formData?: FormData;
@@ -131,16 +131,7 @@ async function withServerActionInstrumentationImplementation<A extends (...args:
},
);
} finally {
if (!platformSupportsStreaming()) {
// Lambdas require manual flushing to prevent execution freeze before the event is sent
await flushQueue();
}

if (process.env.NEXT_RUNTIME === 'edge') {
// flushQueue should not throw
// eslint-disable-next-line @typescript-eslint/no-floating-promises
flushQueue();
}
vercelWaitUntil(flushSafelyWithTimeout());
}
},
);
24 changes: 5 additions & 19 deletions packages/nextjs/src/common/wrapApiHandlerWithSentry.ts
@@ -10,9 +10,9 @@ import { consoleSandbox, isString, logger, objectify } from '@sentry/utils';

import { SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN } from '@sentry/core';
import type { AugmentedNextApiRequest, AugmentedNextApiResponse, NextApiHandler } from './types';
import { platformSupportsStreaming } from './utils/platformSupportsStreaming';
import { flushQueue } from './utils/responseEnd';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import { escapeNextjsTracing } from './utils/tracingUtils';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

/**
* Wrap the given API route handler for tracing and error capturing. Thin wrapper around `withSentry`, which only
@@ -83,15 +83,8 @@ export function wrapApiHandlerWithSentry(apiHandler: NextApiHandler, parameteriz
apply(target, thisArg, argArray) {
setHttpStatus(span, res.statusCode);
span.end();
if (platformSupportsStreaming() && !wrappingTarget.__sentry_test_doesnt_support_streaming__) {
target.apply(thisArg, argArray);
} else {
// flushQueue will not reject
// eslint-disable-next-line @typescript-eslint/no-floating-promises
flushQueue().then(() => {
target.apply(thisArg, argArray);
});
}
vercelWaitUntil(flushSafelyWithTimeout());
target.apply(thisArg, argArray);
},
});

@@ -138,14 +131,7 @@
setHttpStatus(span, res.statusCode);
span.end();

// Make sure we have a chance to finish the transaction and flush events to Sentry before the handler errors
// out. (Apps which are deployed on Vercel run their API routes in lambdas, and those lambdas will shut down the
// moment they detect an error, so it's important to get this done before rethrowing the error. Apps not
// deployed serverlessly will run into this cleanup code again in `res.end(), but the transaction will already
// be finished and the queue will already be empty, so effectively it'll just no-op.)
if (platformSupportsStreaming() && !wrappingTarget.__sentry_test_doesnt_support_streaming__) {
await flushQueue();
}
vercelWaitUntil(flushSafelyWithTimeout());

// We rethrow here so that nextjs can do with the error whatever it would normally do. (Sometimes "whatever it
// would normally do" is to allow the error to bubble up to the global handlers - another reason we need to mark
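The hunk above registers the flush with Vercel inside a Proxy around res.end, so the lambda is kept alive for flushing while the response itself finishes immediately. Here is a self-contained sketch of that pattern; the helper name is hypothetical, the import paths follow this PR's layout, and the span/status handling from the real wrapper is omitted.

```ts
import type { ServerResponse } from 'http';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

// Hypothetical helper illustrating the res.end proxy pattern used above.
function proxyResponseEnd(res: ServerResponse): void {
  res.end = new Proxy(res.end, {
    apply(target, thisArg, argArray) {
      // Register the flush with Vercel's waitUntil, then let the original res.end run
      // so the response is not delayed by flushing.
      vercelWaitUntil(flushSafelyWithTimeout());
      return Reflect.apply(target, thisArg, argArray);
    },
  });
}
```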
10 changes: 3 additions & 7 deletions packages/nextjs/src/common/wrapRouteHandlerWithSentry.ts
@@ -13,13 +13,13 @@ import {
import { propagationContextFromHeaders, winterCGHeadersToDict } from '@sentry/utils';
import { isNotFoundNavigationError, isRedirectNavigationError } from './nextNavigationErrorUtils';
import type { RouteHandlerContext } from './types';
import { platformSupportsStreaming } from './utils/platformSupportsStreaming';
import { flushQueue } from './utils/responseEnd';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import {
commonObjectToIsolationScope,
commonObjectToPropagationContext,
escapeNextjsTracing,
} from './utils/tracingUtils';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

/**
* Wraps a Next.js route handler with performance and error instrumentation.
@@ -97,11 +97,7 @@ export function wrapRouteHandlerWithSentry<F extends (...args: any[]) => any>(
},
);
} finally {
if (!platformSupportsStreaming() || process.env.NEXT_RUNTIME === 'edge') {
// 1. Edge transport requires manual flushing
// 2. Lambdas require manual flushing to prevent execution freeze before the event is sent
await flushQueue();
}
vercelWaitUntil(flushSafelyWithTimeout());
}
});
});
8 changes: 3 additions & 5 deletions packages/nextjs/src/common/wrapServerComponentWithSentry.ts
@@ -14,12 +14,13 @@ import { propagationContextFromHeaders, uuid4, winterCGHeadersToDict } from '@se
import { SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN } from '@sentry/core';
import { isNotFoundNavigationError, isRedirectNavigationError } from '../common/nextNavigationErrorUtils';
import type { ServerComponentContext } from '../common/types';
import { flushQueue } from './utils/responseEnd';
import { flushSafelyWithTimeout } from './utils/responseEnd';
import {
commonObjectToIsolationScope,
commonObjectToPropagationContext,
escapeNextjsTracing,
} from './utils/tracingUtils';
import { vercelWaitUntil } from './utils/vercelWaitUntil';

/**
* Wraps an `app` directory server component with Sentry error instrumentation.
@@ -93,10 +94,7 @@ export function wrapServerComponentWithSentry<F extends (...args: any[]) => any>
},
() => {
span.end();

// flushQueue should not throw
// eslint-disable-next-line @typescript-eslint/no-floating-promises
flushQueue();
vercelWaitUntil(flushSafelyWithTimeout());
},
);
},
@@ -6,6 +6,4 @@ const handler = async (_req: NextApiRequest, res: NextApiResponse): Promise<void
res.end();
};

handler.__sentry_test_doesnt_support_streaming__ = true;

export default handler;