Streaming uploads cause truncated responses in node 16 #4262

Closed · achingbrain opened this issue Jun 18, 2021 · 4 comments
Labels: bug Bug or defect
Milestone: 20.1.5

Comments

@achingbrain commented Jun 18, 2021

Support plan

  • is this issue currently blocking your project? (yes/no): yes
  • is this issue affecting a production system? (yes/no): yes

Context

  • node version: 16
  • module version with issue: 20.1.4
  • last module version without issue: Unsure
  • environment (e.g. node, browser, native): node
  • used with (e.g. hapi application, another framework, standalone, ...): hapi application
  • any other relevant information:

What are you trying to achieve or the steps to reproduce?

Take a look at the following code sample. It starts a server, makes a request to the server, sends some data, and finally prints out the response.

It has two ways of reading the data from the client: one treats request.payload as an async iterator, the other treats it as a readable stream.

const Hapi = require('@hapi/hapi')
const { PassThrough } = require('stream')
const http = require('http')

function receiveDataAsStream (request, stream) {
  request.payload.on('data', () => {
    console.info('receive data')
  })
  request.payload.on('end', () => {
    console.info('end data')

    // send result to client
    stream.end('hello world')
  })
}

async function receiveDataAsAsyncIterator (request, stream) {
  for await (const buf of request.payload) {
    console.info('receive data')
  }

  // send result to client
  stream.end('hello world')
}

async function main () {
  const server = Hapi.server({
    port: 0,
    host: 'localhost'
  })

  server.route({
    method: 'POST',
    path: '/',
    options: {
      payload: {
        output: 'stream'
      }
    },
    handler: (request, h) => {
      const stream = new PassThrough()

      // simulate async work
      process.nextTick(() => {
        // works on node 14 and node 16
        // receiveDataAsStream(request, stream)

        // works on node 14, breaks on node 16
        receiveDataAsAsyncIterator(request, stream).catch(err => console.error(err))
      })

      return h.response(stream)
    }
  })

  await server.start()

  // make a request
  const req = http.request({
    hostname: 'localhost',
    port: server.info.port,
    method: 'POST'
  }, (res) => {
    res.on('data', (chunk) => {
      console.info(chunk.toString())
    })
    res.on('end', () => {
      console.info('response ended')
      server.stop().catch(err => console.error(err))
    })
  })

  console.info('send data')
  req.end('{}')
}

main().catch(err => {
  console.error(err)
  process.exit(1)
})

On node 16 the 'hello world' output is not received by the client; on node 14 it is.

I think this has something to do with how the request payload is consumed: when you switch the receiveDataAsAsyncIterator function out for receiveDataAsStream, it works as expected on both node 14 and node 16.

What was the result you got?

Node 14:

$ node .
send data
receive data
hello world
response ended

Node 16:

$ node .
send data
receive data
response ended

The 'hello world' string is missing from the node 16 output.

What result did you expect?

Node 14:

$ node .
send data
receive data
hello world
response ended

Node 16:

$ node .
send data
receive data
hello world
response ended

The 'hello world' string should be included in the node 16 output.

@achingbrain added the support (Questions, discussions, and general support) label Jun 18, 2021
@kanongil added the bug (Bug or defect) label and removed the support label Jun 18, 2021
@kanongil (Contributor) commented Jun 18, 2021

Thanks for the report! I have investigated the issue and can confirm that there is a bug.

This issue is due to the same change in node behavior that prompted #4225 and #4234. While that PR fixes the issue for the common cases, it does not fix it for stream responses.

What happens is that hapi reacts to the incoming message being closed and assumes that this also applies to the outgoing response. That used to be true in node, but node changed the behaviour in v16 so that the incoming message now closes right after the payload has been delivered.

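A minimal standalone sketch of this timing change, using plain node http with no hapi involved; the node 14 vs node 16 comments below are assumptions based on the behaviour described above:

const http = require('http')

// Standalone sketch (no hapi): observe when the incoming request emits
// 'close' relative to the response being written.
const server = http.createServer(async (req, res) => {
  req.on('close', () => console.info('incoming request closed'))

  // consume the payload as an async iterator, like receiveDataAsAsyncIterator
  for await (const chunk of req) {
    console.info('received', chunk.length, 'bytes')
  }

  // respond slightly later to give the 'close' event a chance to fire
  setImmediate(() => {
    // on node 16 'incoming request closed' is expected to have been logged by
    // now; on node 14 it is not, since the request only closes with the socket
    res.end('hello world')
    server.close()
  })
})

server.listen(0, 'localhost', () => {
  const req = http.request({
    hostname: 'localhost',
    port: server.address().port,
    method: 'POST'
  }, (res) => {
    res.on('data', (chunk) => console.info(chunk.toString()))
  })

  req.end('{}')
})
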
I don't have time to implement a fix, but I can confirm that 'hello world' is delivered if I comment out this line in internals.pipe():

request.raw.req.on('close', close);

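For illustration, a simplified and hypothetical sketch of the pattern being described here; this is not hapi's actual internals.pipe() code, just the general shape of tying the response teardown to the request's 'close' event:

// Hypothetical sketch (not hapi's real code): the outgoing stream response
// is torn down when the *incoming* request emits 'close'.
function pipe (source, request, response) {
  const close = () => {
    // under node 14 this only ran when the connection actually went away;
    // under node 16 the request 'close' event fires once the payload has been
    // consumed, so the source is detached before 'hello world' can be written
    source.unpipe(response)
    response.end()
  }

  request.on('close', close)

  source.pipe(response, { end: true })
}
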
@devinivy (Member)

I started digging into this and a question arose pretty quickly: do we believe that this test does not apply to node v16?

it('stops processing the stream when the request closes', async () => {

@kanongil (Contributor)

@devinivy Hmm, yeah. Given the new semantics, it does not apply as it is currently written.

I have tried to revise the test in a v16-compatible way in #4263.

@devinivy (Member)

Resolved by #4264, published in v20.1.5 👍 Thanks for the report!

@devinivy added this to the 20.1.5 milestone Jun 28, 2021