
HPE_INVALID_HEADER_TOKEN on http requests #27711

Closed
faberyx opened this issue May 15, 2019 · 75 comments
Labels
http Issues or PRs related to the http subsystem.

Comments

@faberyx

faberyx commented May 15, 2019

Upgrading to 12.2.0 broke several HTTP calls with the parse error HPE_INVALID_HEADER_TOKEN; all of these requests worked fine with version 11.10.0, which I had before the upgrade.

I tried the http-parser-js library to patch http, but I still get the same issue:

process.binding('http_parser').HTTPParser = require('http-parser-js').HTTPParser;

@mscdex
Contributor

mscdex commented May 15, 2019

Do you have an example request that reproduces this?

@faberyx
Author

faberyx commented May 15, 2019

Now it has started working; it was probably a network issue. Thanks anyway!

@faberyx faberyx closed this as completed May 15, 2019
@jd4ever

jd4ever commented May 20, 2019

I have the same problem. I'm trying to make a GET request to https://www.bitstamp.net/api/ticker_hour/ with either axios or request, and it fails with the error 'HPE_INVALID_HEADER_TOKEN', reason: 'Invalid header value char'.

const axios = require('axios');

axios({
	method: 'get',
	url: 'https://www.bitstamp.net/api/ticker_hour/',
	responseType: 'json',
}).then(response => {
	console.log(response.data);
}).catch(error => {
	// Fails on Node v12.2.0 with HPE_INVALID_HEADER_TOKEN
	console.error('AXIOS ERROR!', error);
});

The error occurs only on Node 12 (v12.2.0). Downgrading Node to v11.15.0 or v10.15.3 allows the code to work properly; upgrading back to v12.2.0 brings the error back.

Running curl -I 'https://www.bitstamp.net/api/ticker_hour/' gives the following

HTTP/1.1 200 OK
Access-Control-Allow-Headers: x-requested-with, Content-Type, origin, accept, cache-control
Access-Control-Allow-Methods: POST, GET
Access-Control-Allow-Origin: *
Cache-Control: max-age=0
Content-Language: en
Content-Security-Policy-Report-Only: default-src 'self' 'unsafe-inline' 'report-sample'; connect-src 'self' wss://ws.pusherapp.com wss://ws.bitstamp.net *.pusher.com *.trackjs.com *.google-analytics.com stats.g.doubleclick.net ; font-src 'self' data: fonts.gstatic.com www.google.com ; frame-ancestors 'self'; frame-src 'self' pixel-a.basis.net 8720977.fls.doubleclick.net *.ledgerwallet.com *.google.com www.googletagmanager.com pixel.sitescout.com ctpe.net ; img-src * data:; report-uri /api/report-csp/; script-src 'self' 'unsafe-inline' js-agent.newrelic.com *.google-analytics.com *.pusher.com d3dy5gmtp8yhk7.cloudfront.net www.googleadservices.com www.googletagmanager.com www.gstatic.com www.recaptcha.net code.highcharts.com/stock/highstock.js bam.nr-data.net ; style-src 'self' 'unsafe-inline' fonts.googleapis.com
Content-Type: application/json
Date: Mon, 20 May 2019 14:43:35 GMT
Expires: Mon, 20 May 2019 14:43:35 GMT
Last-Modified: Mon, 20 May 2019 14:43:35 GMT
Server: Apache
Strict-Transport-Security: max-age=63072000; includeSubDomains
Vary: Authorization,Accept-Language
X-Frame-Options: SAMEORIGIN
Connection: keep-alive
Set-Cookie: visid_incap_99025=28UPHgTcT9qg3fHswfuOIxa94lwAAAAAQUIPAAAAAADTKGHFgMP/lqNDNSEn5+xH; expires=Tue, 19 May 2020 11:56:21 GMT; path=/; Domain=.bitstamp.net
Set-Cookie: nlbi_99025=Z1FDFXOvq0sr44HdSF244gAAAADIWTNaa1w4Wl1Teiv195T7; path=/; Domain=.bitstamp.net
Set-Cookie: incap_ses_149_99025=VIfCRMT0pkpYn47mv2ARAha94lwAAAAAq5D2+1275GJwPNJugT3sRA==; path=/; Domain=.bitstamp.net
Set-Cookie: ___utmvmyFumyLc=yoPzQoEBNWo; path=/; Max-Age=900
Set-Cookie: ___utmvayFumyLc=nLzMMyg; path=/; Max-Age=900
Set-Cookie: ___utmvbyFumyLc=tZO
    XeHOlala: ltY; path=/; Max-Age=900
X-Iinfo: 10-550845-550848 NNNN CT(0 0 0) RT(1558363414843 40) q(0 0 0 -1) r(1 1) U6
X-CDN: Incapsula

Perhaps the headers above are malformed, but the fact that Node 10 and 11 handle these headers correctly while only Node 12 has problems leads me to believe that this ticket should be reopened.

@mscdex
Contributor

mscdex commented May 20, 2019

@jd4ever That response is malformed. According to the HTTP spec, spaces are not permitted in header field names. So this is a case of the old HTTP parser being less strict than the new parser that is on by default in Node v12.

You can still use the old HTTP parser for now in Node v12 by passing --http-parser=legacy on the node command line.
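
For example (a minimal sketch; app.js stands in for your own entry point):

node --http-parser=legacy app.js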

@jd4ever

jd4ever commented May 24, 2019

@mscdex thanks for looking into this! Using --http-parser=legacy worked! Is the XeHOlala: header the culprit that is causing the trouble?

It does look like the new parser in Node 12 is stricter. However, since the error comes from a remote server that is entirely beyond our control, a stricter parser seems like a disadvantage rather than an advantage here.

Do you know whether the legacy option will be removed in the future? I worry that removing it will cause a lot of problems that can't be fixed from our end.

@damien-git

I am also seeing this error after moving a site behind a WAF using Incapsula, so I suspect the issue reported by jd4ever was not (or not only) the spaces in the XeHOlala header.
I see this on two sites when doing a simple HEAD request. Here are the headers I get with curl -I for each:

HTTP/2 200 
content-type: text/html;charset=UTF-8
content-language: en-US
server: 
set-cookie: MXP_TRACKINGID=996E8B7B%2D05EA%2D10E7%2D9BDC30E99E2BDB37; Expires=Wed, 02-Jun-2049 17:58:57 GMT; Path=/; Secure; HttpOnly
set-cookie: mobileFormat=false; Expires=Wed, 02-Jun-2049 17:58:57 GMT; Path=/; Secure; HttpOnly
set-cookie: CFID=38155015;expires=Thu, 10-Jun-2049 17:58:57 GMT;path=/;secure;HttpOnly;
set-cookie: CFTOKEN=d6bec94893b176f3-996E8B62-E7BA-23FB-D8D6BCE93620BBAB;expires=Thu, 10-Jun-2049 17:58:57 GMT;path=/;secure;HttpOnly;
strict-transport-security: max-age=1200
generator: NatSci
date: Mon, 10 Jun 2019 17:58:58 GMT
set-cookie: visid_incap_2082027=D53nzqFsQau7uFL/731Q9Gma/lwAAAAAQUIPAAAAAABwQQIDDgbEKHFczttJG1tg; expires=Tue, 09 Jun 2020 06:36:42 GMT; path=/; Domain=.msu.edu
set-cookie: incap_ses_104_2082027=BvvaKTGlymrI17Zf9aRxAWma/lwAAAAAUfh8ji07KSAVjCTuxS+ajQ==; path=/; Domain=.msu.edu
x-iinfo: 7-34621156-34621157 NNNN CT(0 0 0) RT(1560189544440 0) q(0 0 0 -1) r(8 8) U6
x-cdn: Incapsula

HTTP/2 200
cache-control: private,no-cache, no-store
pragma: no-cache
content-length: 22654
content-type: text/html; charset=utf-8
expires: -1
set-cookie: ASP.NET_SessionId=yqkwjubyu3x0qtkatx3yqxoq; path=/; HttpOnly
x-aspnet-version: 4.0.30319
x-frame-options: DENY
x-xss-protection: 1; mode=block
x-content-type-options: nosniff
date: Mon, 10 Jun 2019 17:36:50 GMT
set-cookie: BIGipServer~USYS~cstat.msu.edu_80_304734_pool=1844709667.20480.0000; path=/; Httponly; Secure
strict-transport-security: max-age=15552000
set-cookie: visid_incap_2067681=p6Fh6nR4TfajPBCtyxIvGDKV/lwAAAAAQUIPAAAAAABQsuDZ/v9AJT/b0+ArHwkF; expires=Tue, 09 Jun 2020 09:07:37 GMT; path=/; Domain=.msu.edu
set-cookie: incap_ses_529_2067681=1z+vCvtzXBoZJeEIi2NXBzKV/lwAAAAAQCOf0wRICmscmaEMl2NDag==; path=/; Domain=.msu.edu
x-iinfo: 2-37631979-37631980 NNNN CT(0 0 0) RT(1560188210238 0) q(0 0 0 -1) r(0 0) U6
x-cdn: Incapsula

I have not had luck with --http-parser=legacy, but maybe I'm doing it wrong (I'm not sure where to put it in nodemon server.js --exec babel-node).
It would be nice to have an option for lenient header parsing.

@mscdex
Contributor

mscdex commented Jun 10, 2019

@damien-git The --http-parser command line argument controls the parser for HTTP/1.x, not HTTP/2.

@damien-git

Actually, I was using node-fetch, which still does not support HTTP/2, so it was using HTTP/1.1, and these sites send a bad header over HTTP/1.1 similar to the one described by jd4ever (visible with curl -I --http1.1).
I had a hard time setting that node option, but when I put it in the NODE_OPTIONS environment variable it started working.
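
For reference, a minimal sketch of that approach (assuming a Node 12.x runtime, where the legacy parser still exists; server.js is a placeholder):

NODE_OPTIONS="--http-parser=legacy" nodemon server.js --exec babel-node

NODE_OPTIONS is handy here because wrappers like nodemon and babel-node do not always forward extra flags to the underlying node process.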

@jd4ever

jd4ever commented Aug 6, 2019

I have not had luck with --http-parser=legacy, but maybe I'm doing it wrong (I'm not sure where to put it in nodemon server.js --exec babel-node).

Try the following for nodemon and pm2, assuming your code resides in lib/index.js:
nodemon --http-parser=legacy lib/index.js

pm2 start lib/index.js -e pm2.log -o pm2.log --node-args="--http-parser=legacy"

@WilliamConnatser
Contributor

WilliamConnatser commented Sep 4, 2019

I was getting the same error, ironically also while hitting a Bitstamp endpoint via axios (not sure if related).

I used --http-parser=legacy in my nodemon command and it seems to have done the trick.

I don't understand what's causing the error. After inspecting the headers from the request that is erroring out, nothing looks out of place: no spaces in the header names or anything, as mentioned up-thread.

@theninj4

I've just stumbled over this issue too. It looks like it's limited to the Incapsula CDN. They support both HTTP/1.1 and HTTP/2; their responses via HTTP/2 are fine, but their cookie headers over HTTP/1.1 contain garbage. @WilliamConnatser you can force HTTP/1.1 via the --http1.1 curl flag, as per one of the comments above.

It looks like their HTTP/1.1 responses are attaching or manipulating cookies leaving some rogue formatting:

< Set-Cookie: ___utmvaaBuOpzY=iEy�uObw; path=/; Max-Age=900
< Set-Cookie: ___utmvbaBuOpzY=xZy
<     XKyOwalH: htn; path=/; Max-Age=900

It looks like the \n XKyOwalH: has been fiddled in by something in an attempt to fix the headers, which I'm guessing should have looked something like this:

< Set-Cookie: ___utmvaaBuOpzY=iEy�uObw; path=/; Max-Age=900
< Set-Cookie: ___utmvbaBuOpzY=xZy�htn; path=/; Max-Age=900

This is clearly an issue with Incapsula; however, as someone pointed out above, it's incredibly inconvenient that we're unable to parse the response. Is there any scope for dropping the invalid header and parsing the rest? How many scenarios are there in which discarding the entire response is the desirable behaviour? What do other browsers / parsers do in these scenarios?

@fakirpic

fakirpic commented Oct 2, 2019

I am having the same problem with an Incapsula server, and I have no idea how to pass the "--http-parser=legacy" argument to my NW.js app.

@mazamachi

mazamachi commented Oct 24, 2019

2019/10/25 update:
Sorry, I misunderstood the problem. llparse can parse obs-fold correctly.
The problem I faced was that a cookie contained an invalid character (\001), and it is correct to throw HPE_INVALID_HEADER_TOKEN in that case.
The text below is not valid 🙇.


I agree with @theninj4. I think it would be better to add an option that determines whether or not to parse obs-fold headers.
This is because a client must correctly parse obs-fold headers when http-parser or llhttp is used in a user agent, such as a scraper.

As stated in RFC 7230, a user agent must replace each obs-fold with one or more spaces, joining the folded lines:

A user agent that receives an obs-fold in a response message that is
not within a message/http container MUST replace each received
obs-fold with one or more SP octets prior to interpreting the field
value.
https://tools.ietf.org/html/rfc7230#section-3.2.4
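
For illustration, a made-up sketch (not from this thread) of what an obs-fold looks like on the wire; the second line is a continuation because it begins with whitespace:

Set-Cookie: ___utmvb=xZy
    htn; path=/; Max-Age=900

which a user agent following the rule above would unfold to:

Set-Cookie: ___utmvb=xZy htn; path=/; Max-Age=900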

And the RFC says that a server MUST either reject such a message with a 400 (Bad Request) or parse it the same way a user agent does:

A server that receives an obs-fold in a request message that is not
within a message/http container MUST either reject the message by
sending a 400 (Bad Request), preferably with a representation
explaining that obsolete line folding is unacceptable, or replace
each received obs-fold with one or more SP octets prior to
interpreting the field value or forwarding the message downstream.
https://tools.ietf.org/html/rfc7230#section-3.2.4

Should I create another issue for llhttp?

@voxsoftware

voxsoftware commented Nov 6, 2019

Is there an environment variable to do this instead of a node parameter?
Also, I don't understand why it generates an exception instead of ignoring the bad part. Why does the legacy parser work where the new one doesn't? In any case, why not just add a parameter to ignore this type of error?

@ghost

ghost commented Nov 8, 2019

I don't understand the reasoning behind that decision. It makes Node v13 totally useless when trying to fetch data from a server (that is beyond your control), since you also removed --http-parser=legacy.

Why not just try to ignore or circumvent ill-formed header lines by giving the developer an option to handle those headers on their own?

It's as if a browser tried to parse an ill-formed HTML page and told the user: "This page does not conform to the W3C specification, bad luck!"

@sharedprophet

sharedprophet commented Nov 12, 2019

At my company we just encountered this and opened a support ticket with Incapsula (we use their CDN). Here is their response:

Incapsula uses malformed cookies in order to inspect the way clients handle these cookies, as part of our classification process - as browsers are able to handle these cookies, and most bots can't.

As stated, we can implement a back-end configuration to stop using these cookies, please let us know if you wish us to set this configuration.

The white space is part of the classification cookie.

So while they provide an option to disable this, they are not going to change it altogether, and if you are not in control of the endpoint you are trying to reach behind their CDN, you can't upgrade to Node 13+ unless something changes...

@voxsoftware

The problem is not only with Incapsula. I have an app that needs to communicate with government web services (beyond our control), and I have the same problem with an invalid header. This restriction really makes Node.js unusable; there should be an option to ignore this error, or something similar.

@jehy

jehy commented Nov 14, 2019

Just migrated to Node 12, and it's really inconvenient to add --tls-min-v1.0 --http-parser=legacy to every node launch script. And if Node 13 does not have those flags, it will be impossible to migrate; you know, I can't just change another company's server settings and code, and I have to work with them.

@Legogris

I'd also vouch for this being addressed. The way things stand, many production systems will be stuck on Node v12 (once they figure out the --http-parser=legacy flag). As I see it, please consider one of these options (in order of how I personally would prefer them):

  1. Try to support malformed headers (like in legacy)
  2. Simply ignore malformed headers
  3. Allow users to extend handling to resolve it themselves in some way
  4. Bring back --http-parser=legacy

@sam-github
Contributor

Re-opening; this seems to have been closed prematurely. A workaround is mentioned, but that workaround is no longer available, since the legacy http_parser was removed in 13.x.

This report lacks detail about exactly what is malformed; I don't see any trace of the actual HTTP headers returned by the server. There also isn't a standalone reproduction (which would allow us to do the trace).

@tniessen tniessen added the http Issues or PRs related to the http subsystem. label Nov 16, 2019
@theninj4

Here's an example of how to see a malformed cookie header in the wild:

$ curl --http1.1 -vvv https://www.dezeen.com/robots.txt
...
< Set-Cookie: ___utmvmYEuOpzY=KiRnYRnqdnR; path=/; Max-Age=900
< Set-Cookie: ___utmvaYEuOpzY=Nit�SgCC; path=/; Max-Age=900
< Set-Cookie: ___utmvbYEuOpzY=oZn
<     XRtOLalF: Ftd; path=/; Max-Age=900
< Strict-Transport-Security: max-age=31536000

Running this in the Node.js REPL shows the issue we're facing:

require('https').get('https://www.dezeen.com/robots.txt', (res) => {
  console.log('yay')
}).on('error', (e) => {
  console.error(e)
})
> Error: Parse Error: Invalid header value char
    at TLSSocket.socketOnData (_http_client.js:454:22)
    at TLSSocket.emit (events.js:210:5)
    at TLSSocket.EventEmitter.emit (domain.js:540:23)
    at addChunk (_stream_readable.js:326:12)
    at readableAddChunk (_stream_readable.js:301:11)
    at TLSSocket.Readable.push (_stream_readable.js:235:10)
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:182:23) {
  bytesParsed: 945,
  code: 'HPE_INVALID_HEADER_TOKEN',
  reason: 'Invalid header value char',
  rawPacket: <Buffer 48 54 54 50 2f 31 2e 31 20 32 30 30 20 4f 4b 0d 0a 53 65 72 76 65 72 3a 20 6e 67 69 6e 78 0d 0a 44 61 74 65 3a 20 53 61 74 2c 20 31 36 20 4e 6f 76 20 ... 1220 more bytes>
}

@lukiano

lukiano commented Jul 6, 2020

I'm getting this error trying to parse a filename in a Content-Disposition header. I found out that it fails with the character ÿ (\u00FF): values below it are parsed correctly, and values above it the library I'm using escapes, setting the result in filename* (because they are outside ISO-8859-1).

@voxsoftware

Is there any way to set the insecureHTTPParser option globally?
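
As far as I know there is no documented global switch, only the process-wide --insecure-http-parser flag; on recent Node releases the option can also be passed per request. A minimal sketch (the URL is a placeholder):

const https = require('https');

// Opt in to lenient header parsing for this one request.
// This accepts malformed headers, so it disables checks that
// exist for security reasons; use with care.
https.get('https://example.com/', { insecureHTTPParser: true }, (res) => {
  console.log(res.statusCode);
});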

@kailashyogeshwar85
Contributor

Can anyone provide the steps to reproduce it? I have tried with the request module on Node v8.14.0 and v10.14.2, but it doesn't give the error.

@lukiano

lukiano commented Aug 25, 2020

const http = require('http');

http.createServer((req, res) => {
  // 0xFF (ÿ) in a header value triggers HPE_INVALID_HEADER_TOKEN in the client parser
  res.setHeader('xx', 'ÿ');
  res.end();
}).listen(3001, () => {
  http.get('http://localhost:3001', (res) => {
    console.log(res.headers.xx);
  });
});

on v12.18.3 fails for me with

events.js:292
      throw er; // Unhandled 'error' event
      ^

Error: Parse Error: Invalid header value char
    at Socket.socketOnData (_http_client.js:469:22)
    at Socket.emit (events.js:315:20)
    at addChunk (_stream_readable.js:295:12)
    at readableAddChunk (_stream_readable.js:271:9)
    at Socket.Readable.push (_stream_readable.js:212:10)
    at TCP.onStreamRead (internal/stream_base_commons.js:186:23)
Emitted 'error' event on ClientRequest instance at:
    at Socket.socketOnData (_http_client.js:476:9)
    at Socket.emit (events.js:315:20)
    [... lines matching original stack trace ...]
    at TCP.onStreamRead (internal/stream_base_commons.js:186:23) {
  bytesParsed: 21,
  code: 'HPE_INVALID_HEADER_TOKEN',
  reason: 'Invalid header value char',
  rawPacket: Buffer(101) [Uint8Array] [
     72,  84,  84,  80,  47,  49,  46,  49,  32,  50,  48,  48,
     32,  79,  75,  13,  10, 120, 120,  58,  32, 255,  13,  10,
     68,  97, 116, 101,  58,  32,  84, 117, 101,  44,  32,  50,
     53,  32,  65, 117, 103,  32,  50,  48,  50,  48,  32,  48,
     50,  58,  51,  53,  58,  51,  51,  32,  71,  77,  84,  13,
     10,  67, 111, 110, 110, 101,  99, 116, 105, 111, 110,  58,
     32,  99, 108, 111, 115, 101,  13,  10,  67, 111, 110, 116,
    101, 110, 116,  45,  76, 101, 110, 103, 116, 104,  58,  32,
     48,  13,  10,  13,
    ... 1 more item
  ]
}

With --insecure-http-parser it returns

(node:27324) Warning: Using insecure HTTP parsing
ÿ

With --http-parser=legacy it returns

ÿ

However, by adding res.flushHeaders(); before res.end(); it returns

ÿ

under all three configurations: --insecure-http-parser, --http-parser=legacy, and nothing set.
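
For clarity, this is the variant that parsed successfully in all three configurations (the same repro as above with the one added line):

const http = require('http');

http.createServer((req, res) => {
  res.setHeader('xx', 'ÿ');
  res.flushHeaders(); // write the headers out immediately; per the results above, the client then accepts them
  res.end();
}).listen(3001, () => {
  http.get('http://localhost:3001', (res) => {
    console.log(res.headers.xx); // prints ÿ
  });
});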

@lukiano

lukiano commented Aug 25, 2020

I should add that the raw packet above decodes to

HTTP/1.1 200 OK
xx: ÿ
Date: Tue, 25 Aug 2020 02:35:33 GMT
Connection: close
Content-Length: 0

BaochengSu added a commit to BaochengSu/node that referenced this issue Oct 21, 2020
Ported from
OpenSUSE:nodejs8-8.17.0-lp152.147.1:CVE-2019-15605.patch

Original commit message:

commit e2c8f89
Author: Sam Roberts <vieuxtech@gmail.com>
Date:   Thu Jan 16 11:55:52 2020 -0800

    test: using TE to smuggle reqs is not possible

    See: https://hackerone.com/reports/735748

    PR-URL: https://github.com/nodejs-private/node-private/pull/192
    Reviewed-By: Beth Griggs <Bethany.Griggs@uk.ibm.com>

commit 49f4220
Author: Sam Roberts <vieuxtech@gmail.com>
Date:   Tue Feb 4 10:36:57 2020 -0800

    deps: upgrade http-parser to v2.9.3

    PR-URL: https://github.com/nodejs-private/http-parser-private/pull/4
    Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
    Reviewed-By: James M Snell <jasnell@gmail.com>
    Reviewed-By: Sam Roberts <vieuxtech@gmail.com>

commit d616722
Author: Sam Roberts <vieuxtech@gmail.com>
Date:   Tue Jan 7 14:24:54 2020 -0800

    test: check that --insecure-http-parser works

    Test that using --insecure-http-parser will disable validation of
    invalid characters in HTTP headers.

    See:
    - nodejs#30567

    Backport-PR-URL: nodejs#30471
    PR-URL: nodejs#31253
    Reviewed-By: Richard Lau <riclau@uk.ibm.com>
    Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de>

commit a9849c0
Author: Sam Roberts <vieuxtech@gmail.com>
Date:   Wed Nov 20 11:48:58 2019 -0800

    http: opt-in insecure HTTP header parsing

    Allow insecure HTTP header parsing. Make clear it is insecure.

    See:
    - nodejs#30553
    - nodejs#27711 (comment)
    - nodejs#30515

    Backport-PR-URL: nodejs#30471
    PR-URL: nodejs#30567
    Reviewed-By: Fedor Indutny <fedor.indutny@gmail.com>
    Reviewed-By: Anna Henningsen <anna@addaleax.net>
    Reviewed-By: Denys Otrishko <shishugi@gmail.com>
    Reviewed-By: James M Snell <jasnell@gmail.com>

commit a28e5cc
Author: Sam Roberts <vieuxtech@gmail.com>
Date:   Wed Nov 13 10:05:38 2019 -0800

    deps: upgrade http-parser to v2.9.1

    PR-URL: nodejs#30471
    Reviewed-By: James M Snell <jasnell@gmail.com>
    Reviewed-By: Jiawen Geng <technicalcute@gmail.com>
    Reviewed-By: Richard Lau <riclau@uk.ibm.com>
    Reviewed-By: Beth Griggs <Bethany.Griggs@uk.ibm.com>

Signed-off-by: Su Baocheng <baocheng.su@siemens.com>
@balram-mitta

balram-mitta commented Dec 21, 2020

I am getting this error from an axios REST API call in a Node.js application. I have tried all the mentioned workarounds, but none of them worked. Can someone help me out here? The other thing is, I keep getting this intermittently, and it blocks me for hours.

Parse Error: Invalid header value char
code: HPE_INVALID_HEADER_TOKEN

@ghost

ghost commented Dec 26, 2020

I have the same issue with the Incapsula CDN despite nothing being visibly wrong!

So they removed --http-parser=legacy in Node 13?? So that's why it doesn't work. So I have to use Node 12, or might as well use Node 10 so I don't need the flag.

@7c

7c commented Jan 30, 2021

Node: 10.21.0
macOS 10.15.7 (19H2)

require('https').get('https://rdap.nic.brussels/domain/actiris.brussels', (res) => {
  console.log(res.data)
}).on('error', (e) => {
  console.error(e)
})
{ Error: Parse Error
    at TLSSocket.socketOnData (_http_client.js:451:20)
    at TLSSocket.emit (events.js:198:13)
    at addChunk (_stream_readable.js:288:12)
    at readableAddChunk (_stream_readable.js:269:11)
    at TLSSocket.Readable.push (_stream_readable.js:224:10)
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:94:17) bytesParsed: 940, code: 'HPE_INVALID_HEADER_TOKEN' }

The same applies to axios, and the same problem occurs with Node v14.15.4 on the same macOS.

The raw response is:
[screenshot omitted]

I believe the '42' after the headers might be a compliance issue.

Another raw response:
[screenshot omitted]

@ghost

ghost commented Jan 30, 2021

I think it's the spaces before XGVORalE and X-CDN.

@indutny
Member

indutny commented Jan 30, 2021

@7c this seems to be working just fine with v15.7.0. Have you checked it?

@7c

7c commented Jan 30, 2021

@7c this seems to be working just fine with v15.7.0. Have you checked it?

No, but really, I do not want to switch to a higher version if it is not a necessity. This might be an issue with the server and backend these sites are using; they might not comply with the standards, while Node.js probably does.

@mscdex
Contributor

mscdex commented Jan 31, 2021

@7c Yes, the first raw response is definitely not valid because of the '42'. Are you saying the second raw response results in a parse error as well?

@furkanmustafa

Hitting similar issues with Node.js and the stupid Imperva WAF now.

@mscdex the 42 there is the chunk length indicator for an HTTP chunked response, I think. There should have been an empty line before it, though; that might be a copy-and-paste issue.
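
For reference, in chunked transfer coding each chunk size appears as a hexadecimal number on its own line after the blank line that terminates the headers, so a well-formed response would look roughly like this (sketch; the payload is made up):

HTTP/1.1 200 OK
Transfer-Encoding: chunked

42
...66 bytes (0x42) of body data...
0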

BaochengSu added a commit to BaochengSu/node that referenced this issue Jul 14, 2022
@jaiganesh1991

For anyone coming here from a Spring Boot app with an Istio service mesh: check your headers for a trailing space in the header name, like "application-name ". Rename the header to "application-name" and it will work.
