Make the package compatible with Node.js and other runtimes #30

Closed
wants to merge 13 commits into from
1 change: 1 addition & 0 deletions .gitignore
@@ -0,0 +1 @@
npm
70 changes: 70 additions & 0 deletions build_npm.ts
@@ -0,0 +1,70 @@
import { build, emptyDir } from "https://deno.land/x/dnt@0.40.0/mod.ts";

const version = Deno.args[0];

if (!version) {
  throw new Error("Please specify a version.");
}

await emptyDir("./npm");

await build({
  entryPoints: ["./mod.ts"], // Replace with your actual entry point
  outDir: "./npm",
  testPattern: "**/*(*.test|integration).{ts,tsx,js,mjs,jsx}",
  shims: {
    deno: {
      test: "dev",
    },
  },
  compilerOptions: {
    lib: ["ESNext", "DOM"],
  },
  mappings: {
    "node:stream/web": {
      name: "node:stream/web",
    },
  },
  package: {
    name: "s3-lite-client",
    version: version,
    description: "This is a lightweight S3 client for Node.js and Deno.",
    license: "MIT",
    repository: {
      type: "git",
      url: "git+https://github.com/bradenmacdonald/deno-s3-lite-client.git",
    },
    bugs: {
      url: "https://github.com/bradenmacdonald/deno-s3-lite-client/issues",
    },
    engines: {
      "node": ">=20",
    },
    author: {
      "name": "Braden MacDonald",
      "url": "https://github.com/bradenmacdonald",
    },
    contributors: [
      "Martin Donadieu <martindonadieu@gmail.com> (https://martin.solos.ventures/)",
    ],
    devDependencies: {
      "@types/node": "^20.11.1",
    },
    keywords: [
      "api",
      "lite",
      "amazon",
      "minio",
      "cloud",
      "s3",
      "storage",
    ],
  },
  postBuild() {
    // Copy additional files to the npm directory if needed
    Deno.copyFileSync("LICENSE", "npm/LICENSE");
    Deno.copyFileSync("README.md", "npm/README.md");
  },
});

console.log("Build complete. Run `cd npm && npm publish && cd ..`.");
14 changes: 12 additions & 2 deletions client.ts
@@ -648,10 +648,20 @@ export class Client {
     if (typeof streamOrData === "string") {
       // Convert to binary using UTF-8
       const binaryData = new TextEncoder().encode(streamOrData);
-      stream = ReadableStream.from([binaryData]);
+      stream = new ReadableStream({
+        start(controller) {
+          controller.enqueue(binaryData);
+          controller.close();
+        },
+      });
       size = binaryData.length;
     } else if (streamOrData instanceof Uint8Array) {
-      stream = ReadableStream.from([streamOrData]);
+      stream = new ReadableStream({
+        start(controller) {
+          controller.enqueue(streamOrData);
+          controller.close();
+        },
+      });
       size = streamOrData.byteLength;
     } else if (streamOrData instanceof ReadableStream) {
       stream = streamOrData;
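The change above avoids the static `ReadableStream.from()` method, which is not available in every runtime the PR targets, in favor of the `ReadableStream` constructor, which is. A minimal sketch of that pattern as a reusable helper (the `streamFromChunks` name is hypothetical, not part of the PR):

```typescript
// Sketch (not part of the PR): build a ReadableStream from in-memory chunks
// using only the ReadableStream constructor, which is available in Deno,
// Node.js 18+, and Workers runtimes, unlike the static ReadableStream.from().
function streamFromChunks(chunks: Uint8Array[]): ReadableStream<Uint8Array> {
  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        controller.enqueue(chunk); // Push each chunk into the stream's queue
      }
      controller.close(); // Signal that no more chunks will arrive
    },
  });
}
```

For example, `streamFromChunks([new TextEncoder().encode("hello")])` produces the same stream shape the diff builds inline for the string case.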
13 changes: 11 additions & 2 deletions deno.jsonc
@@ -1,6 +1,15 @@
 {
   "fmt": {
-    "lineWidth": 120
+    "lineWidth": 120,
+    "exclude": ["npm/"]
   },
-  "lock": false
+  "lint": {
+    "exclude": ["npm/"]
+  },
+  "lock": false,
+  "tasks": {
+    "test-integration": "deno test --allow-net integration.ts",
+    "npm-build": "deno run -A --no-check build_npm.ts",
+    "npm-publish": "cd npm && npm publish && cd .."
+  }
 }
13 changes: 8 additions & 5 deletions integration.ts
@@ -91,11 +91,14 @@ Deno.test({
   name: "putObject() can stream a large file upload",
   fn: async () => {
     // First generate a 32MiB file in memory, 1 MiB at a time, as a stream
-    const dataStream = ReadableStream.from(async function* () {
-      for (let i = 0; i < 32; i++) {
-        yield new Uint8Array(1024 * 1024).fill(i % 256); // Yield 1MB of data
-      }
-    }());
+    const dataStream = new ReadableStream({
+      start(controller) {
+        for (let i = 0; i < 32; i++) {
+          controller.enqueue(new Uint8Array(1024 * 1024).fill(i % 256)); // Yield 1MB of data
+        }
+        controller.close();
+      },
+    });

     // Upload the 32MB stream data as 7 5MB parts. The client doesn't know in advance how big the stream is.
     const key = "test-32m.dat";
2 changes: 2 additions & 0 deletions mod.ts
@@ -1,2 +1,4 @@
+import "./node_shims.ts";
+
 export { Client as S3Client } from "./client.ts";
 export * as S3Errors from "./errors.ts";
25 changes: 25 additions & 0 deletions node_shims.ts
@@ -0,0 +1,25 @@
if (!("ReadableStream" in globalThis) || !("TransformStream" in globalThis) || !("WritableStream" in globalThis)) {
  (async () => {
    const { ReadableStream, TransformStream, WritableStream } = await import("node:stream/web");
    Object.defineProperties(globalThis, {
      "ReadableStream": {
        value: ReadableStream,
        writable: true,
        enumerable: false,
        configurable: true,
      },
      "TransformStream": {
        value: TransformStream,
        writable: true,
        enumerable: false,
        configurable: true,
      },
      "WritableStream": {
        value: WritableStream,
        writable: true,
        enumerable: false,
        configurable: true,
      },
    });
  })();
}
Owner:
I think this ^ merging into globals will be done automatically by dnt if you specify it in the DNT config shims.custom:

  shims: {
    deno: {
      test: "dev",
    },
    custom: [{
      package: {
        name: "node:stream/web",
      },
      globalNames: [
        "ReadableStream",
        "WritableStream",
        "TransformStream",
      ],
    }],
  },

See denoland/dnt#362 and this integration test which checks that this approach works.

Author:

I did it here, and it does work in the Node.js environment, but it breaks in the Workers environment.
That's why I implemented it the way the oak project does (made by a Deno maintainer).

11 changes: 5 additions & 6 deletions signing.ts
@@ -153,13 +153,12 @@ function getHeadersToSign(headers: Headers): string[] {
     "content-type",
     "user-agent",
   ];
-  const headersToSign = [];
-  for (const key of headers.keys()) {
-    if (ignoredHeaders.includes(key.toLowerCase())) {
-      continue; // Ignore this header
-    }
-    headersToSign.push(key);
-  }
+  const headersToSign: string[] = [];
+  headers.forEach((_value, key) => {
+    if (!ignoredHeaders.includes(key.toLowerCase())) {
+      headersToSign.push(key);
+    }
+  });
   headersToSign.sort();
   return headersToSign;
 }
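This diff swaps the `headers.keys()` iterator, which some runtimes' Headers implementations lack, for `Headers.prototype.forEach`, which is more widely supported. A self-contained sketch of the revised logic (the ignored-header list here is abbreviated for illustration; see the full list in the diff):

```typescript
// Sketch of the revised filtering logic: collect non-ignored header names
// via Headers.forEach instead of the keys() iterator, then sort them for
// canonical signing order. Note that Headers normalizes names to lowercase.
function getHeadersToSign(headers: Headers): string[] {
  const ignoredHeaders = ["authorization", "content-length", "content-type", "user-agent"];
  const headersToSign: string[] = [];
  headers.forEach((_value, key) => {
    if (!ignoredHeaders.includes(key.toLowerCase())) {
      headersToSign.push(key);
    }
  });
  headersToSign.sort();
  return headersToSign;
}
```

For example, a request with `Host`, `X-Amz-Date`, and `Content-Type` headers signs only `host` and `x-amz-date`.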
2 changes: 1 addition & 1 deletion transform-chunk-sizes.test.ts
@@ -9,7 +9,7 @@ import { TransformChunkSizes } from "./transform-chunk-sizes.ts";
  */
 class NumberSource extends ReadableStream<Uint8Array> {
   constructor(delayMs: number, chunksCount: number, bytesPerChunk = 1) {
-    let intervalTimer: number;
+    let intervalTimer: ReturnType<typeof setTimeout>;
     let i = 0;
     super({
       start(controller) {
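The one-line type change above matters because the timer functions return `number` in Deno and browsers but a `Timeout` object in Node.js, so hard-coding `number` fails type-checking under Node types. A minimal sketch of the portable pattern (using `setInterval` here for illustration; the diff applies the same idea with `ReturnType<typeof setTimeout>`):

```typescript
// Sketch: infer the timer handle type from the runtime's own declaration
// instead of assuming `number`, which only holds in Deno and browsers.
let timer: ReturnType<typeof setInterval>;
timer = setInterval(() => {
  // periodic work would go here
}, 1000);
clearInterval(timer); // clearInterval accepts the inferred handle type on every runtime
```

This keeps the same source compiling cleanly against both Deno's and Node.js's type definitions.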