chore: update readme
pin705 committed Jul 17, 2024
1 parent d8b0ef2 commit e587226
Showing 6 changed files with 390 additions and 475 deletions.
18 changes: 9 additions & 9 deletions README.md
@@ -7,7 +7,7 @@

<!-- /automd -->

-This is my package description.
+A simple tool to scrape Cloudflare clearance cookies (cf_clearance) from websites protected by Cloudflare challenges.

## Usage

@@ -36,24 +36,24 @@ bun install @pinjs/cf-scraper-bypass

Import:

-<!-- automd:jsimport cjs cdn name="pkg" -->
+<!-- automd:jsimport cjs cdn name="@pinjs/cf-scraper-bypass" -->

**ESM** (Node.js, Bun)

```js
-import {} from "pkg";
+import {} from "@pinjs/cf-scraper-bypass";
```

**CommonJS** (Legacy Node.js)

```js
-const {} = require("pkg");
+const {} = require("@pinjs/cf-scraper-bypass");
```

**CDN** (Deno, Bun and Browsers)

```js
-import {} from "https://esm.sh/pkg";
+import {} from "https://esm.sh/@pinjs/cf-scraper-bypass";
```

<!-- /automd -->
@@ -76,11 +76,11 @@ import {} from "https://esm.sh/pkg";

<!-- automd:contributors license=MIT -->

-Published under the [MIT](https://github.com/unjs/@pinjs/cf-scraper-bypass/blob/main/LICENSE) license.
-Made by [community](https://github.com/unjs/@pinjs/cf-scraper-bypass/graphs/contributors) 💛
+Published under the [MIT](https://github.com/pin705/cf-scraper-bypass/blob/main/LICENSE) license.
+Made by [community](https://github.com/pin705/cf-scraper-bypass/graphs/contributors) 💛
<br><br>
-<a href="https://github.com/unjs/@pinjs/cf-scraper-bypass/graphs/contributors">
-  <img src="https://contrib.rocks/image?repo=unjs/@pinjs/cf-scraper-bypass" />
+<a href="https://github.com/pin705/cf-scraper-bypass/graphs/contributors">
+  <img src="https://contrib.rocks/image?repo=pin705/cf-scraper-bypass" />
</a>

<!-- /automd -->
10 changes: 6 additions & 4 deletions package.json
@@ -41,13 +41,15 @@
"eslint": "^9.1.1",
"eslint-config-unjs": "^0.3.0-rc.7",
"jiti": "^1.21.0",
-    "puppeteer": "^22.13.0",
-    "puppeteer-extra": "^3.3.6",
-    "puppeteer-extra-plugin-stealth": "^2.11.2",
-    "tough-cookie": "^4.1.4",
"typescript": "^5.4.5",
"unbuild": "^2.0.0",
"vitest": "^1.5.3"
},
+  "dependencies": {
+    "puppeteer": "^22.13.0",
+    "puppeteer-extra": "^3.3.6",
+    "puppeteer-extra-plugin-stealth": "^2.11.2",
+    "tough-cookie": "^4.1.4"
+  },
"packageManager": "pnpm@9.0.6"
}
21 changes: 15 additions & 6 deletions playground/index.ts
@@ -1,10 +1,19 @@
import Scraper from "../src";

-const scraper = new Scraper(false, false, "/usr/bin/chromium-browser", false);
-runScraper()
+const scraper = new Scraper({
+  headless: false,
+  skip_chromium_download: false,
+  chromium_path: "/usr/bin/chromium-browser",
+  wait_for_network_idle: false,
+  PUP_TIMEOUT: 16_000,
+});
+
+runScraper();
function runScraper() {
-  scraper.proxy("https://google.com", {
-    query: { foo: "bar" },
-    headers: { "User-Agent": "Mozilla/5.0" },
-  }).then((res) => console.log(res));
+  scraper
+    .proxy("https://google.com", {
+      query: { foo: "bar" },
+      headers: { "User-Agent": "Mozilla/5.0" },
+    })
+    .then((res) => console.log(res));
}
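The playground change above replaces the old positional `Scraper` constructor arguments with a single options object. A minimal sketch of that options shape, inferred only from the fields used in the diff (the interface name `ScraperOptions` and the optionality of each field are assumptions, as is the meaning of `PUP_TIMEOUT` as a Puppeteer timeout in milliseconds):

```typescript
// Hypothetical shape of the options object now accepted by the Scraper
// constructor, reconstructed from the fields used in playground/index.ts.
interface ScraperOptions {
  headless?: boolean;               // run Chromium without a visible window
  skip_chromium_download?: boolean; // reuse an existing browser binary
  chromium_path?: string;           // path to the Chromium executable
  wait_for_network_idle?: boolean;  // wait for network idle before resolving
  PUP_TIMEOUT?: number;             // assumed: Puppeteer timeout in ms
}

// The exact configuration used in the playground example:
const options: ScraperOptions = {
  headless: false,
  skip_chromium_download: false,
  chromium_path: "/usr/bin/chromium-browser",
  wait_for_network_idle: false,
  PUP_TIMEOUT: 16_000,
};

console.log(options.PUP_TIMEOUT); // 16000
```

Named options make each flag self-documenting at the call site, which the old `new Scraper(false, false, "...", false)` signature did not.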
