Deal Observer: initial implementation (deal activation events) #172

Closed · Tracked by #144
bajtos opened this issue Oct 7, 2024 · 2 comments
bajtos commented Oct 7, 2024

The initial implementation of the design created in #171.

This task includes implementing non-functional requirements, like observability, Sentry error reporting, automated tests & CI checks, automated deployment to Fly.io, and so on.

Remaining tasks:

bajtos changed the title from "Deal Observer: deal activation events" to "Deal Observer: initial implementation (deal activation events)" on Oct 7, 2024
bajtos mentioned this issue on Oct 7, 2024
This was referenced on Jan 6, 2025
bajtos commented Jan 6, 2025

For Spark v1.5's scope, we need to find the simplest solution to get started. We don't have enough time to fully research & design-spec a solution that can handle all events like deal expiration and slashing.

I propose the following design to get us started:

  • Observe claim events only.
  • Calculate the deal expiration time (expires_at) as term-start (epoch) + term-min (duration in epochs). This will often be inaccurate, and we will stop testing deals that are still active, but it gives us an easy place to start (a sketch follows after the notes below).
  • Getting the rest of the deal metadata from the event should be trivial:
  Spark column   Event metadata
  ------------   ----------------------------
  client_id      'f0' + metadata.client
  miner_id       'f0' + metadata.provider
  piece_cid      metadata['piece-cid']
  piece_size     metadata['piece-size']
  expires_at     see above
  payload_cid    NULL (will be filled later)

Notes:

  • We can store the miner & client ID in the numeric form and let the component backfilling Spark v1 eligible deals handle the conversion to the f0 address.
  • We can also store the raw term fields (term-start, term-min and term-max) and let the component backfilling Spark v1 eligible deals calculate the expires_at value for each deal.
  • We will almost certainly need to look up eligible deals by sector ID to handle sector slashing & expiration and also when troubleshooting. Therefore, I strongly suggest including sector in the initial database schema.
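A minimal sketch of the proposed mapping in JavaScript (the stack this service runs on), assuming the claim event has already been flattened into a plain object keyed by the metadata entry names. The helper names are mine; the genesis timestamp (unix 1598306400) and 30-second epoch duration are Filecoin mainnet network constants:

// Filecoin mainnet constants: genesis at 2020-08-24T22:00:00Z, 30s epochs.
const GENESIS_UNIX_SECONDS = 1598306400
const EPOCH_DURATION_SECONDS = 30

// Convert a chain epoch to a JavaScript Date.
const epochToDate = (epoch) =>
  new Date((GENESIS_UNIX_SECONDS + epoch * EPOCH_DURATION_SECONDS) * 1000)

// Map a flattened claim event to the proposed Spark columns.
const toSparkDeal = (claim) => ({
  client_id: `f0${claim['client']}`,
  miner_id: `f0${claim['provider']}`,
  piece_cid: claim['piece-cid']['/'],
  piece_size: BigInt(claim['piece-size']),
  // Approximation described above: assume the deal ends after term-min.
  expires_at: epochToDate(claim['term-start'] + claim['term-min']),
  payload_cid: null // will be filled later
})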

References

Example transaction activating DDO (non-f05) deals:
https://beryx.io/fil/mainnet/txs/bafy2bzacebtvjy6fkzso6zlwj7g3rpz5qd2hfncbidw2wth7jjfynxdtlcrb2?tab=events

Example implementation of a Node.js service monitoring builtin-actor events:
https://github.com/rvagg/spacemon/
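As a rough sketch, such a service could poll a Lotus node (v1.26+, with the actor events APIs enabled) for claim events over JSON-RPC. The endpoint URL below is hypothetical, and the filter encoding is my assumption: the value is the base64-encoded CBOR string "claim" with codec 0x51 (CBOR). Note the returned events still carry CBOR-encoded entry values that need decoding:

const LOTUS_RPC_URL = 'http://127.0.0.1:1234/rpc/v1' // hypothetical endpoint

const getClaimEvents = async (fromHeight, toHeight) => {
  const res = await fetch(LOTUS_RPC_URL, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({
      jsonrpc: '2.0',
      id: 1,
      method: 'Filecoin.GetActorEventsRaw',
      params: [{
        fromHeight,
        toHeight,
        // Match events whose indexed "$type" entry equals the CBOR
        // string "claim" ('ZWNsYWlt' is base64 of CBOR-encoded "claim").
        fields: { $type: [{ codec: 0x51, value: 'ZWNsYWlt' }] }
      }]
    })
  })
  const { result, error } = await res.json()
  if (error) throw new Error(error.message)
  return result ?? []
}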

Example claim event metadata:

{
  "height": 4131302,
  "tipset_cid": "bafy2bzaceant5slrmah4q7w4tul6rpz7go7db4hksq4wtz5loq7xxzsbtixs6",
  "id": "e7388ac6-44a8-53d8-a2c7-2808b50f458a",
  "tx_cid": "bafy2bzacebtvjy6fkzso6zlwj7g3rpz5qd2hfncbidw2wth7jjfynxdtlcrb2",
  "log_index": 177,
  "emitter": "f06",
  "type": "native",
  "selector_id": "claim",
  "selector_sig": "",
  "reverted": false,
  "metadata": {
    "0": {
      "flags": 3,
      "key": "$type",
      "value": "claim"
    },
    "1": {
      "flags": 3,
      "key": "id",
      "value": 69956823
    },
    "2": {
      "flags": 3,
      "key": "client",
      "value": 3138382
    },
    "3": {
      "flags": 3,
      "key": "provider",
      "value": 3072985
    },
    "4": {
      "flags": 3,
      "key": "piece-cid",
      "value": {
        "/": "baga6ea4seaqeaboqmwmwm7i6nr5alwtdi6xy43p6iay2xs7qee5ghyakpwelofy"
      }
    },
    "5": {
      "flags": 1,
      "key": "piece-size",
      "value": "34359738368"
    },
    "6": {
      "flags": 1,
      "key": "term-min",
      "value": 1051200
    },
    "7": {
      "flags": 1,
      "key": "term-max",
      "value": 5256000
    },
    "8": {
      "flags": 1,
      "key": "term-start",
      "value": 4131302
    },
    "9": {
      "flags": 3,
      "key": "sector",
      "value": 37710
    }
  },
  "canonical": true,
  "search_id": "NDEzMTMwMi9iYWZ5MmJ6YWNlYW50NXNscm1haDRxN3c0dHVsNnJwejdnbzdkYjRoa3NxNHd0ejVsb3E3eHh6c2J0aXhzNi9iYWZ5MmJ6YWNlYnR2ank2Zmt6c282emx3ajdnM3JwejVxZDJoZm5jYmlkdzJ3dGg3ampmeW54ZHRsY3JiMi9uYXRpdmUvMTc3L2U3Mzg4YWM2LTQ0YTgtNTNkOC1hMmM3LTI4MDhiNTBmNDU4YQ=="
}
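The numbered metadata entries in this shape can be flattened into a plain key → value object before applying the column mapping above; a small sketch:

// Flatten Beryx-style metadata ({ "0": { flags, key, value }, ... })
// into a plain object keyed by each entry's name.
const flattenMetadata = (metadata) =>
  Object.fromEntries(
    Object.values(metadata).map(({ key, value }) => [key, value])
  )

// flattenMetadata(event.metadata) for the event above yields:
// { $type: 'claim', id: 69956823, client: 3138382, provider: 3072985,
//   'piece-cid': { '/': 'baga6ea4…' }, 'piece-size': '34359738368', … }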


bajtos commented Jan 31, 2025

Closing as done. I also opened a follow-up issue to fix what we discovered in real data today: CheckerNetwork/spark-deal-observer#67

bajtos closed this as completed on Jan 31, 2025