Dump zone functionality #41
base: production-namebase
Conversation
Co-authored-by: James Stevens <github@jrcs.net>
Co-authored-by: Mark Tyneway <mark.tyneway@gmail.com>
Nice thought to stream it to S3; some questions, but nothing blocking.
@@ -21,6 +21,8 @@ const Claim = require('../primitives/claim');
 const Address = require('../primitives/address');
 const Network = require('../protocol/network');
 const pkg = require('../pkg');
+const AWS = require('aws-sdk');
What are some other options besides requiring this package for the whole full node? Not blocking but not ideal since it's a big dependency and would probably never make it into upstream.
Could this be an optional peer dependency or something?
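Just as a sketch of one possible shape (not code from this PR): list aws-sdk under optionalDependencies in package.json and only require it lazily when the dump endpoint is actually hit, so a node without the package installed still runs. The getS3 helper name here is illustrative.

// Sketch only: lazily load the S3 client so aws-sdk stays optional.
// If it isn't installed, the dump feature is reported as unavailable
// instead of crashing the full node at startup.
let S3 = null;

function getS3() {
  if (!S3) {
    try {
      S3 = require('aws-sdk/clients/s3');
    } catch (e) {
      throw new Error('aws-sdk is not installed; zone dump to S3 is disabled.');
    }
  }
  return new S3();
}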
I'm not sure how peer dependencies work, exactly. I'd considered adding it as a separate plugin.
I believe AWS SDK v2 lets you import only the services you are planning to use as well; that would trim down the loaded lib size a bit.
var S3 = require('aws-sdk/clients/s3');
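For what it's worth, a minimal sketch of using that scoped require in place of the full AWS namespace (the apiVersion pin is optional):

// The scoped import loads only the S3 service client rather than the
// entire SDK surface, which is what trims the loaded lib size.
const S3 = require('aws-sdk/clients/s3');
const s3 = new S3({apiVersion: '2006-03-01'});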
> I'm not sure how peer dependencies work, exactly. I'd considered adding it as a separate plugin.
Peer dependency is the wrong word since it means something really specific in Node.js/npm. I trust aws-sdk, but it feels bad adding a 50 MB dependency for this one feature. Do you have a clear idea of how this dependency could live in a plugin so hsd stays clean?
A plugin would just need access to the Chain, which is itself a plugin that it could take a dependency on, I believe. So a dump-zone-to-s3 plugin would pretty much be this code, plus some boilerplate to create the plugin itself, and we'd need some way of triggering it other than the HTTP endpoint on the Node plugin interface.
Since it assumes AWS anyway, we could use a queue. If that's too much, we could just put up another HTTP interface on a third port. If we wanted to move more of our custom functionality into plugins, it'd be easy to extend that HTTP interface to cover all of them with a namebase meta-plugin that aggregates the calls.
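To make that concrete, here's a rough sketch of what such a plugin might look like, assuming the usual hsd plugin shape (an id plus an init(node) factory returning an object with open()/close()) and that the node exposes its chain as node.chain. dumpzone.readableStream is the helper used in this diff; the bucket/key parameters and the DumpZonePlugin name are placeholders.

'use strict';

const S3 = require('aws-sdk/clients/s3');
const dumpzone = require('./dumpzone'); // helper from this branch

class DumpZonePlugin {
  constructor(node) {
    this.node = node;
    this.chain = node.chain;
    this.s3 = new S3();
  }

  async open() {
    // Triggering could live here instead of the Node HTTP endpoint:
    // a timer, a queue consumer, or a separate HTTP listener.
  }

  async close() {}

  dump(bucket, key) {
    // Stream the zone dump straight to S3, as the PR does today.
    return this.s3.upload({
      Bucket: bucket,
      Key: key,
      Body: dumpzone.readableStream(this.chain)
    }).promise();
  }
}

exports.id = 'dump-zone-s3';
exports.init = node => new DumpZonePlugin(node);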
Bump @turbomaze @rozaydin
Replying in Slack.
  Key: this.options.s3DumpConfig.key,
  Body: dumpzone.readableStream(this.chain)
}, (err, data) => {
  // TODO - capture status, do a rename?
Not blocking/not requesting this, but it might be nice for the key to include the timestamp and then to rename one to "current", etc.
👍 I'll verify how rename works in S3
There's actually no way to rename something in S3; we would have to do a copy then delete. An alternative could be to push files with the timestamp and have some other process reap ones older than a certain age, but that would require the consumer to list the objects in the bucket and choose the latest one.
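If we did want a stable "current" key, a sketch of the copy-then-delete workaround (bucket and key names are placeholders):

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

async function promoteToCurrent(bucket, timestampedKey) {
  // Copy the freshly uploaded dump over the well-known "current" key...
  await s3.copyObject({
    Bucket: bucket,
    CopySource: `${bucket}/${timestampedKey}`,
    Key: 'zones/current'
  }).promise();

  // ...then remove the timestamped original (or keep it for history).
  await s3.deleteObject({
    Bucket: bucket,
    Key: timestampedKey
  }).promise();
}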
One other alternative might be enabling versioning on the S3 bucket; by default you will get the timestamp and management of old versions provided by AWS.
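For reference, enabling versioning is a one-time call with aws-sdk v2 (bucket name is a placeholder); after that, uploads to a fixed key keep the older dumps around as prior versions:

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

// Turn on versioning once for the bucket; subsequent uploads to the same
// key create new versions instead of overwriting the previous dump.
s3.putBucketVersioning({
  Bucket: 'my-zone-dumps',
  VersioningConfiguration: { Status: 'Enabled' }
}, (err, data) => {
  if (err)
    throw err;
});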
Versioning sounds good 👍
Have refactored zippy's branch to use streams (because it will make uploading to S3 easier) and filtered out TXT records.
In progress: