Node app for receiving a set of keys and URLs, storing the content behind those URLs on S3, and returning the set of keys with S3 (or CloudFront) URLs.
The API will return 200 OK, but errors may still occur during a request. The reason is that we start sending data to the client right away, to keep the connection open and stop Heroku from killing the request. Only later do we know whether some URLs failed, and that status is serialized in the JSON response. It will be either "ok", "error", or "timeout".
API request sanity validations return 422, as they happen at the very beginning of each request.
A request to the API behaves in a transactional manner: either all given URLs are successfully uploaded, or none will be stored on S3. If some files fail, we will try to clean up any files already uploaded to S3.
In production all requests must be sent over https, since credentials are passed around. See the ENV variables `REQUIRE_SSL`, which should be true in production, and `BEHIND_PROXY`, which you should set if you are for instance deploying on Heroku. You should also set `BASIC_AUTH_USER` and `BASIC_AUTH_PASSWORD` to restrict access to your API.
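As a rough sketch, a production-like environment could be configured along these lines (the values are placeholders, only the variable names come from the description above):

```sh
# Example environment (values are placeholders)
export REQUIRE_SSL=true               # enforce https in production
export BEHIND_PROXY=true              # set when running behind a proxy, e.g. on Heroku
export BASIC_AUTH_USER=someuser       # credentials protecting the API
export BASIC_AUTH_PASSWORD=somesecret
```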
- Give key-value pairs of URLs to download, store on S3 and return URLs for.
- Available options:
  - `awsAccessKeyId`: AWS access key
  - `awsSecretAccessKey`: AWS access secret
  - `s3Bucket`: AWS bucket you want files uploaded to
  - `s3Region`: AWS region you want files uploaded to
  - `cloudfrontHost`: AWS CloudFront host, if any
- Available HTTP headers:
  - `Tag-Logs-With`: A string you want this request to be tagged with. For instance `iweb prod asset-123` will log as `[iweb] [prod] [asset-123]`.
```
{
  "urls": {
    "thumb": "http://www.filepicker.com/api/XXX/convert/thumb",
    "monitor": "http://www.filepicker.com/api/XXX/convert/monitor"
  },
  "options": {
    "awsAccessKeyId": "xxx",
    "awsSecretAccessKey": "xxx",
    "s3Bucket": "xxx",
    "s3Region": "xxx",
    "cloudfrontHost": "xxx" # Optional
  }
}
```
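As a rough sketch, a request like the one above could be sent with curl along these lines. The root path for the store action, the host, and the `Content-Type` header are assumptions made for the sake of the example; adjust them to your deployment:

```sh
# Hypothetical example: host, path and credentials are placeholders
curl -X POST "https://your-app.example.com/" \
  -u "$BASIC_AUTH_USER:$BASIC_AUTH_PASSWORD" \
  -H "Content-Type: application/json" \
  -H "Tag-Logs-With: iweb prod asset-123" \
  -d @request.json
```

Here `request.json` contains the payload shown above. Remember that the HTTP status is always `200 OK`; the `status` field in the response body tells you whether the uploads actually succeeded.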
- Status is `ok`: all URLs are swapped out for stored URLs.
```
{
  "status": "ok",
  "urls": {
    "thumb": "http://s3.com/sha1-of-thumb-url",
    "monitor": "http://s3.com/sha1-of-monitor-url"
  }
}
```
- Status is `error`:
  - Keys with `null` were fine, but are cleaned from S3 because another key failed.
  - Keys with an object include information about the download response.
```
{
  "status": "error",
  "urls": {
    "thumb": null,
    "monitor": {
      "downloadResponse": {
        "status": 502,
        "body": "Bad Gateway"
      }
    }
  }
}
```
- Status is `error`:
  - Keys with `null` were fine, but are cleaned from S3 because another key failed.
  - Keys with an object include information about the S3 error.
```
{
  "status": "error",
  "urls": {
    "thumb": null,
    "monitor": {
      "s3": "Some message or object(!) from s3 when we tried to upload this file"
    }
  }
}
```
- Status is `timeout` due to the max keep-alive time being exceeded. See `lib/middleware/keep_alive.coffee` and the ENV variables `KEEP_ALIVE_WAIT_SECONDS` and `KEEP_ALIVE_MAX_ITERATIONS`.
- Any uploads to S3 we have done will be cleaned up.
```
{
  "status": "timeout"
}
```
- The `/delete` action is more of a convenience action. Your application language probably has an AWS SDK available, and you could use that directly. If you feel it is just as easy to make a DELETE call to the S3 storage API, feel free to do so.
- Give an array of URLs to delete.
- Available options:
  - `awsAccessKeyId`: AWS access key
  - `awsSecretAccessKey`: AWS access secret
  - `s3Bucket`: AWS bucket you want files uploaded to
  - `s3Region`: AWS region you want files uploaded to
- Available HTTP headers:
  - `Tag-Logs-With`: A string you want this request to be tagged with. For instance `iweb prod asset-123` will log as `[iweb] [prod] [asset-123]`.
```
{
  "urls": [
    "http://file.in.your.s3.bucket.com/object1",
    "http://file.in.your.s3.bucket.com/object2"
  ],
  "options": {
    "awsAccessKeyId": "xxx",
    "awsSecretAccessKey": "xxx",
    "s3Bucket": "xxx",
    "s3Region": "xxx"
  }
}
```
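A corresponding sketch for the `/delete` action; the POST method, host, and credentials are assumptions here as well:

```sh
# Hypothetical example: host, method and credentials are placeholders
curl -X POST "https://your-app.example.com/delete" \
  -u "$BASIC_AUTH_USER:$BASIC_AUTH_PASSWORD" \
  -H "Content-Type: application/json" \
  -d @delete.json
```

Here `delete.json` contains the payload shown above.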
- Status is `ok`.
```
{
  "status": "ok"
}
```
- Status is `error`.
```
{
  "status": "error",
  "description": "Some explanation of the error."
}
```
Errors aren't likely to happen, as we do not actually check whether the bucket contains the given URLs/objects. We just make a `deleteObjects` call to S3 and expect S3 to remove the given object keys.
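If you prefer to talk to S3 directly instead of using this endpoint, a roughly equivalent `deleteObjects` call with the AWS CLI looks like this (bucket name and object keys are placeholders):

```sh
# Delete the same objects straight from S3 with the AWS CLI
aws s3api delete-objects \
  --bucket your-bucket \
  --delete '{"Objects": [{"Key": "object1"}, {"Key": "object2"}]}'
```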
Install dependencies and run the app:

```sh
npm install
nodemon --exec coffee bin/www
```
Tests are written using Mocha and Chai's expect syntax. We use Sinon for test utilities and SuperTest for integration tests.

Run `npm test` when you want to run all tests. Run `npm run test-unit` to only run the unit tests, and `npm run test-integration` to only run the integration tests. You can also run `mocha path/to/test` if you want to run a specific test.

In our tests some ENV variables are important. They all start with `TEST_*` and you will find examples in `.envrc.example`. You need to create and configure your own bucket for integration testing.
It may be deployed on Heroku. Do the normal `git push heroku master`, or deploy to other servers you feel comfortable with.
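On Heroku you can set the ENV variables described above with `heroku config:set`, for example:

```sh
# Example: configure the app on Heroku (values are placeholders)
heroku config:set REQUIRE_SSL=true BEHIND_PROXY=true \
  BASIC_AUTH_USER=someuser BASIC_AUTH_PASSWORD=somesecret
```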