
remove anti-pattern dispatcher hooks #2723

Merged
merged 1 commit into next on Feb 10, 2024
Conversation

@ronag ronag (Member) commented Feb 9, 2024

onBodySent and onRequestSent are footguns that easily cause bugs
when implementing logic that will send multiple requests, e.g. redirect
and retry.

To achieve similar functionality, wrap the body into a stream and listen
to its 'data' and 'end' events.

Refs: #2722

@ronag ronag added the semver-major label (Features or fixes that will be included in the next semver major release) on Feb 9, 2024
@metcoder95 metcoder95 (Member) commented

The rationale seems fair to me. I had a bit of trouble understanding its specific use case, but I think we should provide the same lifecycle events in some other way that is more native to undici.

Wrapping the body into a stream and always listening for its 'end' event could be a way to trigger the event; we just need to be clear about its intent.

For the PR, it seems lint is not happy and we need to work on some tests 😅
Shall we point to next?

@ronag ronag (Member, Author) commented Feb 9, 2024

Shall we point to next?

Sorry, what do you mean?

@ronag ronag force-pushed the dispatcher-rm branch 3 times, most recently from 3aa30ca to 03f680f on February 9, 2024 at 09:39
@metcoder95 metcoder95 (Member) commented

Sorry, what do you mean?

Point the PR to the next branch.

@ronag ronag force-pushed the dispatcher-rm branch 3 times, most recently from 1c71f36 to 761bd7e on February 9, 2024 at 09:48
@ronag ronag changed the base branch from main to next February 9, 2024 11:34
onBodySent and onRequestSent are footguns that easily cause bugs
when implementing logic that will send multiple requests, e.g. redirect
and retry.

To achieve similar functionality, wrap the body into a stream and listen
to its 'data' and 'end' events.

Refs: #2722
@mcollina mcollina (Member) left a comment

To achieve similar functionality, wrap the body into a stream and listen
to its 'data' and 'end' events.

This is not the same. The reason those were added was essentially to avoid wrapping the body with a stream, because it adds overhead.

@ronag ronag (Member, Author) commented Feb 9, 2024

To achieve similar functionality, wrap the body into a stream and listen
to its 'data' and 'end' events.

This is not the same. The reason those were added was essentially to avoid wrapping the body with a stream, because it adds overhead.

It is the same, just slower. However, as it stands they cause more harm than good. What happens e.g. during a retry or redirect, when the same body is sent twice, possibly interleaved? The API is badly broken.
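
For illustration, a minimal sketch (URL, path and payload are placeholders) of the kind of handler this breaks: any hook that accumulates state across onBodySent calls gets confused as soon as a retry or redirect layer re-dispatches the same body.

import { Client } from 'undici'

const client = new Client('http://localhost:3000')
let bytesSent = 0

client.dispatch(
  { path: '/upload', method: 'POST', body: 'hello world' },
  {
    onConnect () {},
    // Fires once per chunk, per attempt: a retry/redirect layer that
    // re-dispatches the same body makes this counter double-count.
    onBodySent (chunk) { bytesSent += chunk.length },
    onRequestSent () { console.log('sent', bytesSent, 'bytes') },
    onHeaders () { return true },
    onData () { return true },
    onComplete () { client.close() },
    onError (err) { console.error(err) }
  }
)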

@mcollina mcollina (Member) commented Feb 9, 2024

Can you add a test showing how to achieve the same?

@ronag ronag (Member, Author) commented Feb 9, 2024

A test? Testing what? I can add an example in the docs if that's what you are looking for.

@mcollina mcollina (Member) commented Feb 9, 2024

That would be ok.

@ronag ronag (Member, Author) commented Feb 10, 2024

No idea how/where to put this, but something like this would be equivalent:

import { Readable } from 'node:stream'
import undici from 'undici'

function request (url, opts, onBodySent, onRequestSent) {
  // Reuse the body if it is already a stream, otherwise wrap it in one.
  const body = !opts?.body || typeof opts.body?.pipe === 'function'
    ? opts.body
    : Readable.from([opts.body])
  if (body) {
    body
      .on('data', onBodySent)
      .on('end', onRequestSent)
      .pause() // don't start flowing before undici reads the body
  }
  return undici.request(url, { ...opts, body })
}
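
A minimal usage sketch of the wrapper above (URL, method and payload are placeholders):

const { statusCode, body } = await request(
  'http://localhost:3000/upload',
  { method: 'POST', body: 'hello world' },
  (chunk) => console.log('sent chunk of', chunk.length, 'bytes'), // onBodySent
  () => console.log('request body fully sent')                    // onRequestSent
)
console.log('status:', statusCode)
await body.dump() // discard the response body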

@mcollina mcollina (Member) left a comment

Lgtm

@ronag ronag merged commit f648b9c into next Feb 10, 2024
17 checks passed
ronag added a commit that referenced this pull request Feb 16, 2024
ronag added a commit that referenced this pull request Feb 16, 2024
* remove anti-pattern dispatcher hooks (#2723)

* chore: upgrade llhttp to 9.2.0 (#2705)

* upgrade llhttp

* fix tests

* set version of llhttp 9.2.0

---------

Co-authored-by: Aras Abbasi <aras.abbasi@googlemail.com>
ronag added a commit that referenced this pull request Feb 16, 2024
@Uzlopak Uzlopak deleted the dispatcher-rm branch February 21, 2024 12:41
ronag added a commit that referenced this pull request Feb 26, 2024
KhafraDev pushed a commit to KhafraDev/undici that referenced this pull request Jun 24, 2024
@tjhiggins tjhiggins commented

@ronag what would be the recommended way to get the request body in an interceptor going forward?

Since fetch's requestOptions.body is an AsyncGenerator and difficult to parse:

requestBody = (async function * () {

I'm currently using onBodySent for my LoggingHandler:

// Assumed imports (not shown in the original snippet): DecoratorHandler and
// Dispatcher come from 'undici'; `util.parseHeaders` and the `Log` type are
// helpers provided by the surrounding project.
import { DecoratorHandler, Dispatcher } from 'undici';

export class LoggingHandler extends DecoratorHandler implements Dispatcher.DispatchHandlers {
  private requestBody = '';
  private responseBody = '';
  private responseHeaders: Buffer[] = [];
  private statusCode?: number;
  private statusText?: string;

  constructor(
    protected options: { log: Log },
    protected requestOptions: Dispatcher.DispatchOptions,
    protected handler: Dispatcher.DispatchHandlers,
  ) {
    super(handler);
  }

  onConnect(abort: (err?: Error) => void): void {
    this.requestBody = '';
    this.responseBody = '';
    this.responseHeaders = [];
    this.statusCode = undefined;
    this.statusText = undefined;
    return this.handler.onConnect!(abort);
  }

  onHeaders(statusCode: number, headers: Buffer[], resume: () => void, statusText: string): boolean {
    this.statusCode = statusCode;
    this.statusText = statusText;
    this.responseHeaders = headers;
    return this.handler.onHeaders!(statusCode, headers, resume, statusText);
  }

  onBodySent(chunk: any, totalBytesSent: number) {
    try {
      this.requestBody += Buffer.from(chunk).toString();
    } catch (err) {
      console.debug('Failed to parse request body:', err);
    }
    if (this.handler.onBodySent) {
      this.handler.onBodySent(chunk, totalBytesSent);
    }
  }

  onData(chunk: Buffer): boolean {
    this.responseBody += chunk.toString();
    return this.handler.onData!(chunk);
  }

  onComplete(trailers: string[] | null) {
    const responseHeaders = util.parseHeaders(this.responseHeaders);

    const logData = {
      url: this.requestOptions.path,
      method: this.requestOptions.method || "GET",
      headers: this.requestOptions.headers,
      responseHeaders,
      body: this.requestBody,
      status: this.statusCode,
      statusText: this.statusText,
      data: this.responseBody,
    };

    let severity = 'INFO';
    if (this.statusCode === 404) {
      severity = 'WARNING';
    } else if (!this.statusCode || this.statusCode > 399) {
      severity = 'ERROR';
    }

    const metadata = {
      resource: { type: 'global' },
      // See: https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#logseverity
      severity,
    };

    const entry = this.options.log.entry(metadata, logData);
    this.options.log.write(entry).catch((reason) => console.warn('Failed to write log entry:', reason));

    return this.handler.onComplete!(trailers);
  }

  onError(err: Error) {
    return this.handler.onError!(err);
  }
}
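
For reference, a sketch of one way a handler like this can be wired in, assuming undici's Dispatcher#compose() interceptor API is available (`log` is whatever logger the handler expects):

import { Agent } from 'undici'

// Hypothetical interceptor that wraps every dispatch in the LoggingHandler above.
const loggingInterceptor = (dispatch) => (opts, handler) =>
  dispatch(opts, new LoggingHandler({ log }, opts, handler))

const dispatcher = new Agent().compose(loggingInterceptor)
// Pass `dispatcher` via the `dispatcher` option of undici.request() / fetch().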
