Reported at: https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=61620
Detailed Report: https://oss-fuzz.com/testcase?key=4886037758083072
Project: caddy
Fuzzing Engine: libFuzzer
Fuzz Target: fuzz-tokenize
Job Type: libfuzzer_asan_caddy
Platform Id: linux
Crash Type: Slice bounds out of range
Crash Address:
Crash State:
  caddyfile.(*lexer).next
  caddyfile.Tokenize
Sanitizer: address (ASAN)
Regressed: https://oss-fuzz.com/revisions?job=libfuzzer_asan_caddy&range=202308190620:202308200626
Reproducer Testcase: https://oss-fuzz.com/download?testcase_id=4886037758083072
Stack trace:
goroutine 17 [running, locked to thread]:
github.com/caddyserver/caddy/v2/caddyconfig/caddyfile.(*lexer).next(0x10c00007ad40)
	github.com/caddyserver/caddy/v2/caddyconfig/caddyfile/lexer.go:158 +0x1cf5
github.com/caddyserver/caddy/v2/caddyconfig/caddyfile.Tokenize({0x6020000000b0, 0x3, 0x3}, {0xeffca7, 0x9})
	github.com/caddyserver/caddy/v2/caddyconfig/caddyfile/lexer.go:63 +0x1b1
github.com/caddyserver/caddy/v2/caddyconfig/caddyfile.FuzzTokenize({0x6020000000b0?, 0x0?, 0x1?})
	github.com/caddyserver/caddy/v2/caddyconfig/caddyfile/lexer_fuzz.go:20 +0x4b
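For context, a minimal sketch of this crash class (this is not Caddy's actual lexer code; the `lookahead` helper and its inputs are hypothetical): a lexer's `next()` that slices the input for multi-byte lookahead will panic with "slice bounds out of range" when fewer bytes remain than the lookahead window, unless the upper bound is clamped to `len(input)`.

```go
package main

import "fmt"

// lookahead is a hypothetical helper illustrating the bug class.
// An unguarded input[pos : pos+n] panics when pos+n > len(input);
// clamping the end index to len(input) makes the slice safe.
func lookahead(input []byte, pos, n int) []byte {
	end := pos + n
	if end > len(input) {
		end = len(input) // clamp: without this, short inputs panic
	}
	return input[pos:end]
}

func main() {
	// A short, truncated token such as a minimized fuzz testcase might contain.
	input := []byte(`"\`)
	// Unguarded input[0:3] would panic here; the clamped slice returns
	// whatever bytes remain.
	fmt.Printf("%q\n", lookahead(input, 0, 3))
}
```

A fuzz target like fuzz-tokenize is effective at finding exactly this pattern, since minimization drives the input down to the shortest byte sequence that still reaches the unguarded slice expression.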
clusterfuzz-testcase-minimized-fuzz-tokenize-4886037758083072.txt