Peer-to-peer Signet repos are sufficient for distributing attestations when they state positive facts about a resource ("this resource is safe to use!"), but when negative signals are necessary ("this resource has a huge security hole!"), an attack becomes possible: a man-in-the-middle (MITM) attacker could send you an old version of a repository from before the negative attestation was made.
While Signet is starting out, most useful attestations will be positive, and in some use cases there may be other factors that prevent such an attack. However, it is valuable to provide a solution to this problem, because it is only a matter of time before attestations need to be amended/revoked.
To solve this problem, we need a way to determine that the information in a repository is fresh. The `repo.json` in a repository is wrapped in an attestation, which already contains a timestamp embedded within the GPG signature. For individual repositories uploaded to webservers, though, we shouldn't expect that signature to be very fresh; it probably only gets updated intermittently, when new attestations are created. We need a signature and timestamp that update much more frequently, on the order of hours or days.
The best solution I can think of is to introduce servers to the Signet ecosystem. Sig servers should accept published attestations and, when queried, return a fresh attestation of their contents made with a server keypair. This solves a problem similar to the ones addressed by OCSP stapling and TUF's timestamp role. In addition, servers could implement search queries over their set of attestations, which will be necessary for scaling Signet as the global set of signatures grows large: each signature is currently ~0.5 KB, so one million signatures is about 500 MB.
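To make the fresh-attestation idea concrete, here is a minimal sketch of the server's query response and the client's freshness check. The HMAC key and field names are illustrative assumptions, not Signet's actual format; a real sig server would sign with its keypair (e.g. GPG) rather than a shared secret.

```python
import hashlib
import hmac
import json
import time

# Hypothetical server key: a real sig-server would hold a private signing
# key; HMAC is used only as a stand-in so this sketch is self-contained.
SERVER_KEY = b"demo-server-key"

def fresh_attestation(attestations, now=None):
    """Wrap stored attestations in a freshly timestamped, signed envelope."""
    if now is None:
        now = time.time()
    payload = json.dumps(
        {"timestamp": int(now), "attestations": attestations},
        sort_keys=True,
    ).encode()
    sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "signature": sig}

def verify_envelope(envelope, max_age=86400, now=None):
    """Client side: check the server signature and reject stale envelopes."""
    payload = envelope["payload"].encode()
    expected = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["signature"]):
        return False
    if now is None:
        now = time.time()
    return (now - json.loads(payload)["timestamp"]) <= max_age
```

The key property is that the timestamp sits inside the signed payload, so a MITM replaying an old envelope fails the staleness check rather than the signature check.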
In general, servers should only need to be trusted to return all of the information sent to them -- that is, not to drop anything -- since verifying the actual attestations happens via the web of trust. Perhaps the end goal is a set of volunteer servers that propagate their sets of attestations via gossip. Organizations could also stand up their own servers to offer their own attestations reliably.
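One reason gossip fits this trust model: if each store is keyed by a content hash, merging is a commutative, idempotent set union, so propagation can only add attestations, never mutate or silently drop them. A sketch under that assumption (the hashing scheme is illustrative, not Signet's):

```python
import hashlib
import json

def content_key(attestation):
    """Derive a stable key from an attestation's canonicalized content."""
    blob = json.dumps(attestation, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def merge(store, incoming):
    """Union an attestation store (dict keyed by content hash) with
    attestations received from a gossip peer. Duplicates are no-ops."""
    merged = dict(store)
    for att in incoming:
        merged.setdefault(content_key(att), att)
    return merged
```

Because merge order doesn't matter and re-delivery is harmless, volunteer servers can exchange feeds opportunistically without coordination.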
Again, not all use cases will require servers or the assurances they provide, but they are a necessary component for the network to support negative attestations. We'll need to give thought to managing this added complexity without overly complicating the UX of `sig`.
### sig-server

- Accept and store publishes from `sig` clients
- Query endpoint for looking up attestations by `keyid` and `identifier`
- Timestamp data by returning an attestation in response to fetches/queries
- IP ratelimit on publishes
- Max size limit on attestations
- Feed of recent changes for replication/gossip?
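The ratelimit item above could be as simple as a per-IP token bucket in front of the publish endpoint. A sketch, with purely illustrative capacity and refill values:

```python
import time

class PublishRateLimiter:
    """Per-IP token bucket: each publish costs one token; tokens refill
    continuously up to a fixed capacity. Values here are placeholders."""

    def __init__(self, capacity=10, refill_per_second=1.0):
        self.capacity = capacity
        self.refill = refill_per_second
        self.buckets = {}  # ip -> (tokens, last_update)

    def allow(self, ip, now=None):
        """Return True if this IP may publish now, consuming one token."""
        if now is None:
            now = time.monotonic()
        tokens, last = self.buckets.get(ip, (self.capacity, now))
        tokens = min(self.capacity, tokens + (now - last) * self.refill)
        if tokens >= 1:
            self.buckets[ip] = (tokens - 1, now)
            return True
        self.buckets[ip] = (tokens, now)
        return False
```

Combined with the max-size limit, this bounds how much storage a single abusive client can consume per unit time.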
### sig

- Implement `sig publish` to send local attestations to a server
- `sig verify --remote` command to consult servers
- Add freshness requirement policy settings for repos
- Implement sensible defaults for repo freshness
- Offer to add a server if one doesn't exist
- Warn users when they don't have a server repo set up
- Abort `sig verify` if no repos are fresher than 1 day? (configurable?)
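The abort-if-stale default in the last item might reduce to a check like the following before verification proceeds. The one-day threshold is the default floated above, and `last_timestamp` is a hypothetical field name for the most recent server-attestation time recorded for a repo:

```python
import time

DEFAULT_MAX_AGE = 24 * 60 * 60  # one day in seconds; configurable per policy

def any_repo_fresh(repos, max_age=DEFAULT_MAX_AGE, now=None):
    """Return True if at least one repo's recorded server-attestation
    timestamp ("last_timestamp", an assumed field) is within max_age."""
    if now is None:
        now = time.time()
    return any((now - repo["last_timestamp"]) <= max_age for repo in repos)
```

`sig verify` would call this first and abort (or warn, depending on policy) when it returns False, since stale repos are exactly the window a rollback MITM needs.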