rekor-cli loginfo returns firstSize error #1234

Closed
zosocanuck opened this issue Dec 9, 2022 · 3 comments · Fixed by #1290

Labels
bug Something isn't working

Comments

zosocanuck commented Dec 9, 2022

Hi, I’m running a local rekor instance (latest main branch or v1.0.1) for testing purposes. I performed a cosign (v1.13.1) sign operation pointing at the local rekor instance, and the log entry exists. However, I get the following error when running loginfo:

rekor-cli --rekor_server http://localhost:3000/ loginfo
[GET /api/v1/log/proof][422] getLogProof default &{Code:609 Message:firstSize in query should be greater than or equal to 1}

Log output from rekor-server:

2022-12-09T10:18:36.576-0800    ERROR   restapi/configure_rekor_server.go:276   validation failure list:
firstSize in query should be greater than or equal to 1 {"requestID": "07WKSMAC150894.local/eJxvA7R1Lo-000011"}
github.com/sigstore/rekor/pkg/generated/restapi.logAndServeError
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:276
github.com/go-openapi/runtime/middleware.(*Context).Respond
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/context.go:527
github.com/sigstore/rekor/pkg/generated/restapi/operations/tlog.(*GetLogProof).ServeHTTP
        /Users/ivan.wallis/rekor/pkg/generated/restapi/operations/tlog/get_log_proof.go:67
zosocanuck added the bug label Dec 9, 2022
bobcallaway (Member) commented:

Can you confirm that this only happens when you first run rekor-cli loginfo while the log is empty, and then run it again after the log has at least one record?

zosocanuck (Author) commented:

@bobcallaway rekor-cli loginfo appears to report correctly after starting up a fresh instance:

No previous log state stored, unable to prove consistency
Verification Successful!
Active Tree Size:       0
Total Tree Size:        0
Root Hash:              e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
Timestamp:              2022-12-12T02:50:33Z
TreeID:                 2757109831971889720
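
(As a sanity check, that root hash for a tree of size 0 is the RFC 6962 Merkle tree hash of an empty log, i.e. the SHA-256 of the empty string, so the fresh instance is behaving as expected. A quick way to confirm this in Go:)

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

func main() {
	// RFC 6962: the Merkle tree hash of an empty list is the hash of the
	// empty string, which is exactly what loginfo reports for tree size 0.
	emptyRoot := sha256.Sum256(nil)
	fmt.Println(hex.EncodeToString(emptyRoot[:]))
	// prints e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
}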

I then upload a new artifact:

rekor-cli upload --artifact ../../tests/test_file.txt --public-key ../../tests/test_public_key.key --signature ../../tests/test_file.sig --rekor_server http://localhost:3000
Created entry at index 0, available at: http://localhost:3000/api/v1/log/entries/264338c8cd6be238d2f305428d7c222d7b77f56453dd4b6e6851752ecacc78e5992779c8f9b61dd9
./rekor-cli --rekor_server http://localhost:3000 loginfo
[GET /api/v1/log/proof][422] getLogProof default  &{Code:609 Message:firstSize in query should be greater than or equal to 1}

rekor-server:

2022-12-11T18:53:17.117-0800    ERROR   restapi/configure_rekor_server.go:276   validation failure list:
firstSize in query should be greater than or equal to 1 {"requestID": "07WKSMAC150894.local/ese9zrPgdG-000012"}
github.com/sigstore/rekor/pkg/generated/restapi.logAndServeError
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:276
github.com/go-openapi/runtime/middleware.(*Context).Respond
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/context.go:527
github.com/sigstore/rekor/pkg/generated/restapi/operations/tlog.(*GetLogProof).ServeHTTP
        /Users/ivan.wallis/rekor/pkg/generated/restapi/operations/tlog/get_log_proof.go:67
github.com/go-chi/chi/middleware.NoCache.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-chi/chi@v4.1.2+incompatible/middleware/nocache.go:54
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/sigstore/rekor/pkg/generated/restapi.recordMetricsForAPI.func1.1
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:222
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-openapi/runtime/middleware.NewOperationExecutor.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/operation.go:28
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-openapi/runtime/middleware.NewRouter.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/router.go:78
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-openapi/runtime/middleware.Redoc.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/redoc.go:72
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-openapi/runtime/middleware.Spec.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-openapi/runtime@v0.24.1/middleware/spec.go:46
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-chi/chi/middleware.RequestLogger.func1.1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-chi/chi@v4.1.2+incompatible/middleware/logger.go:46
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-chi/chi/middleware.Recoverer.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-chi/chi@v4.1.2+incompatible/middleware/recoverer.go:37
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-chi/chi/middleware.Heartbeat.func1.1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-chi/chi@v4.1.2+incompatible/middleware/heartbeat.go:21
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/sigstore/rekor/pkg/generated/restapi.serveStaticContent.func1
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:296
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/rs/cors.(*Cors).Handler.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/rs/cors@v1.8.2/cors.go:231
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/sigstore/rekor/pkg/generated/restapi.wrapMetrics.func1
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:266
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/sigstore/rekor/pkg/generated/restapi.setupGlobalMiddleware.func1
        /Users/ivan.wallis/rekor/pkg/generated/restapi/configure_rekor_server.go:207
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
github.com/go-chi/chi/middleware.RequestID.func1
        /Users/ivan.wallis/go/pkg/mod/github.com/go-chi/chi@v4.1.2+incompatible/middleware/request_id.go:76
net/http.HandlerFunc.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2109
net/http.serverHandler.ServeHTTP
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:2947
net/http.(*conn).serve
        /usr/local/Cellar/go/1.19.3/libexec/src/net/http/server.go:1991

2022-12-11T18:53:17.117-0800    DEBUG   restapi/configure_rekor_server.go:280   map[Body:0xc002e965c0 Cancel:<nil> Close:false ContentLength:0 Form:map[] GetBody:<nil> Header:map[Accept:[application/json] Accept-Encoding:[gzip] User-Agent:[rekor-cli/devel (darwin; amd64)]] Host:localhost:3000 Method:GET MultipartForm:<nil> PostForm:map[] Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 RemoteAddr:127.0.0.1:53886 RequestURI:/api/v1/log/proof?firstSize=0&lastSize=1&treeID=2757109831971889720 Response:<nil> TLS:<nil> Trailer:map[] TransferEncoding:[] URL:/api/v1/log/proof?firstSize=0&lastSize=1&treeID=2757109831971889720] {"requestID": "07WKSMAC150894.local/ese9zrPgdG-000012"}
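
The DEBUG entry above shows the request that fails: the CLI calls /api/v1/log/proof with firstSize=0 and lastSize=1, and the server's parameter validation rejects any firstSize below 1. For illustration only (this program is not part of rekor), replaying that request against the local instance makes the trigger easy to see; the host, treeID, and lastSize values are copied from the log output above, and firstSize=1 is expected to at least pass the validation step.

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Values taken from the DEBUG log above.
	const base = "http://localhost:3000/api/v1/log/proof"
	const treeID = "2757109831971889720"

	for _, firstSize := range []int{0, 1} {
		url := fmt.Sprintf("%s?firstSize=%d&lastSize=1&treeID=%s", base, firstSize, treeID)
		resp, err := http.Get(url)
		if err != nil {
			fmt.Println("request error:", err)
			continue
		}
		body, _ := io.ReadAll(resp.Body)
		resp.Body.Close()
		// firstSize=0 should come back as 422 with the code 609 message,
		// while firstSize=1 should pass the parameter validation.
		fmt.Printf("firstSize=%d -> HTTP %d: %s\n", firstSize, resp.StatusCode, body)
	}
}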

bobcallaway (Member) commented:

Right. If you start with an empty log, do the insert, and then call loginfo, it should work correctly (I was able to reproduce this issue locally).

I see the bug, but wanted to make sure I was reproducing the same scenario before proposing a fix.

Thanks for confirming!
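
For anyone reading along before the fix lands: the failing request is a consistency-proof call whose older tree size is 0. Below is a minimal sketch of the kind of guard a client needs, with a stand-in fetchProof callback in place of the generated GetLogProof client call; it is an assumption about the general shape of a fix, not the actual change in #1290.

package loginfo

import "fmt"

// verifyConsistency is a hypothetical helper: it only asks the log for a
// consistency proof when there is a previously persisted, non-empty tree
// size to compare against.
func verifyConsistency(oldSize, newSize int64, fetchProof func(first, last int64) error) error {
	if oldSize == 0 {
		// Nothing was persisted before (empty tree), so there is nothing to
		// prove consistency against; issuing the request anyway is what sends
		// firstSize=0 and trips the server-side validation.
		return nil
	}
	if newSize < oldSize {
		return fmt.Errorf("current tree size %d is smaller than stored size %d", newSize, oldSize)
	}
	// fetchProof stands in for the GET /api/v1/log/proof call.
	return fetchProof(oldSize, newSize)
}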
