Commit

chore(scrape): fix visited_links storing
j-mendez committed Aug 14, 2024
1 parent 4788c96 commit 89d3ddd
Showing 7 changed files with 25 additions and 21 deletions.
6 changes: 3 additions & 3 deletions Cargo.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion spider/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "spider"
-version = "2.0.2"
+version = "2.0.3"
authors = [
"j-mendez <jeff@a11ywatch.com>"
]
24 changes: 12 additions & 12 deletions spider/README.md
@@ -16,7 +16,7 @@ This is a basic async example crawling a web page, add spider to your `Cargo.toml`:

```toml
[dependencies]
-spider = "2.0.2"
+spider = "2.0.3"
```

And then the code:
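
The example body is collapsed in this view. A minimal sketch of the basic crawl, assuming the crate's `Website::new`, `crawl`, and `get_links` API:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    website.crawl().await;

    // list every link discovered during the crawl
    for link in website.get_links() {
        println!("- {:?}", link.as_ref());
    }
}
```
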
@@ -93,7 +93,7 @@ We have the following optional feature flags.

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["regex", "ua_generator"] }
+spider = { version = "2.0.3", features = ["regex", "ua_generator"] }
```

1. `ua_generator`: Enables auto generating a random real User-Agent.
@@ -139,7 +139,7 @@ Move processing to a worker, drastically increases performance even if worker is

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["decentralized"] }
+spider = { version = "2.0.3", features = ["decentralized"] }
```

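The shell instructions for starting the worker are collapsed in this view. As a sketch of the client side, assuming the worker is addressed through the `SPIDER_WORKER` environment variable described in the spider_worker docs, the crawl code itself stays the same:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    // run with e.g.:
    //   SPIDER_WORKER=http://127.0.0.1:3030 cargo run --features decentralized
    // requests are then forwarded to the worker instead of fetched locally
    let mut website: Website = Website::new("https://choosealicense.com");
    website.crawl().await;
}
```
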
@@ -170,7 +170,7 @@ Use the subscribe method to get a broadcast channel.

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["sync"] }
+spider = { version = "2.0.3", features = ["sync"] }
```

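The example body is collapsed in this view. A minimal sketch of consuming the broadcast channel, assuming `subscribe` takes a channel capacity and returns an `Option` holding a receiver of pages:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // 16 is the broadcast channel capacity
    let mut rx2 = website.subscribe(16).unwrap();

    // print each page URL as it is crawled
    tokio::spawn(async move {
        while let Ok(page) = rx2.recv().await {
            println!("{:?}", page.get_url());
        }
    });

    website.crawl().await;
}
```
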
@@ -201,7 +201,7 @@ Allow regex for blacklisting routes

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["regex"] }
+spider = { version = "2.0.3", features = ["regex"] }
```

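The example body is collapsed in this view. A minimal sketch of blacklisting a route, assuming entries in `configuration.blacklist_url` are treated as regex patterns when the `regex` feature is enabled:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // with the `regex` feature, this entry is matched as a pattern
    website
        .configuration
        .blacklist_url
        .insert(Default::default())
        .push("/licenses/".into());
    website.crawl().await;

    for link in website.get_links() {
        println!("- {:?}", link.as_ref());
    }
}
```
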
@@ -228,7 +228,7 @@ If you are performing large workloads you may need to control the crawler by ena

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["control"] }
+spider = { version = "2.0.3", features = ["control"] }
```

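The example body is collapsed in this view. A sketch of pausing and resuming a crawl, assuming the `pause`/`resume` helpers in `spider::utils` that the `control` feature provides:

```rust
use spider::tokio;
use spider::utils::{pause, resume};
use spider::website::Website;

#[tokio::main]
async fn main() {
    let url = "https://choosealicense.com/";
    let mut website: Website = Website::new(url);

    tokio::spawn(async move {
        pause(url).await;
        // resume the crawl after 5 seconds
        tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;
        resume(url).await;
    });

    website.crawl().await;
}
```
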
@@ -298,7 +298,7 @@ Use cron jobs to run crawls continuously at anytime.

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["sync", "cron"] }
+spider = { version = "2.0.3", features = ["sync", "cron"] }
```

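The example body is collapsed in this view. A sketch of a cron-driven crawl, assuming the `cron_str` field and `run_cron` entry point from the crate docs:

```rust
use spider::tokio;
use spider::website::{run_cron, Website};
use std::time::Duration;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // run every 5 seconds
    website.cron_str = "1/5 * * * * *".into();

    let mut rx2 = website.subscribe(16).unwrap();
    let join_handle = tokio::spawn(async move {
        while let Ok(res) = rx2.recv().await {
            println!("{:?}", res.get_url());
        }
    });

    // `run_cron` takes ownership of the website and returns a stoppable runner
    let runner = run_cron(website).await;
    tokio::time::sleep(Duration::from_secs(15)).await;
    let _ = tokio::join!(runner.stop(), join_handle);
}
```
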
@@ -337,7 +337,7 @@ the feature flag [`chrome_intercept`] to possibly speed up request using Network

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["chrome", "chrome_intercept"] }
+spider = { version = "2.0.3", features = ["chrome", "chrome_intercept"] }
```

You can use `website.crawl_concurrent_raw` to perform a crawl without chromium when needed. Use the feature flag `chrome_headed` to enable headful browser usage if needed to debug.
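
For reference, a minimal sketch, assuming the same crawl API drives headless Chromium once the `chrome` feature is compiled in:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // with the `chrome` feature, `crawl` renders pages in a headless browser;
    // `crawl_concurrent_raw` (mentioned above) skips chromium when not needed
    website.crawl().await;
    println!("visited: {}", website.get_links().len());
}
```
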
@@ -367,7 +367,7 @@ Enabling HTTP cache can be done with the feature flag [`cache`] or [`cache_mem`]

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["cache"] }
+spider = { version = "2.0.3", features = ["cache"] }
```

You need to set `website.cache` to true to enable as well.
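
For reference, a minimal sketch of that toggle:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // per the note above, the feature flag alone is not enough;
    // caching must also be switched on at runtime
    website.cache = true;
    website.crawl().await;
    // a repeat crawl can now be served from the HTTP cache
    website.crawl().await;
}
```
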
@@ -398,7 +398,7 @@ Intelligently run crawls using HTTP and JavaScript Rendering when needed. The be

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["smart"] }
+spider = { version = "2.0.3", features = ["smart"] }
```

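The example body is collapsed in this view. A minimal sketch, assuming the `smart` feature exposes `crawl_smart`:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // plain HTTP first; falls back to headless rendering
    // when a page needs JavaScript to produce its content
    website.crawl_smart().await;
}
```
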
@@ -424,7 +424,7 @@ Use OpenAI to generate dynamic scripts to drive the browser done with the featur

```toml
[dependencies]
-spider = { version = "2.0.2", features = ["openai"] }
+spider = { version = "2.0.3", features = ["openai"] }
```

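The example body is collapsed in this view. A sketch, assuming the `GPTConfigs` builder in `spider::configuration`; the model name, prompt, and token budget are illustrative values, and `OPENAI_API_KEY` must be set in the environment:

```rust
use spider::configuration::GPTConfigs;
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    // illustrative model, prompt, and token budget
    let gpt = GPTConfigs::new("gpt-4o", "Extract the page title.", 512);
    let mut website: Website = Website::new("https://choosealicense.com");
    website.with_openai(Some(gpt));
    website.crawl().await;
}
```
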
@@ -450,7 +450,7 @@ Set a depth limit to prevent forwarding.

```toml
[dependencies]
-spider = { version = "2.0.2" }
+spider = { version = "2.0.3" }
```

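The example body is collapsed in this view. A minimal sketch, assuming the `with_depth` builder:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    // stop following links more than two hops from the start page
    website.with_depth(2);
    website.crawl().await;
}
```
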
4 changes: 4 additions & 0 deletions spider/src/website.rs
@@ -2109,9 +2109,12 @@ impl Website {

        match self.pages.as_mut() {
            Some(p) => {
+
                while let Ok(res) = rx2.recv().await {
+                    self.links_visited.insert(res.get_url().into());
                    p.push(res);
                }
+
            }
            _ => (),
        };
@@ -2133,6 +2136,7 @@
        match self.pages.as_mut() {
            Some(p) => {
                while let Ok(res) = rx2.recv().await {
+                    self.links_visited.insert(res.get_url().into());
                    p.push(res);
                }
            }
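
What the change does: the scrape receive loops now record each page's URL in `links_visited` while draining the subscription channel, instead of only pushing the page. A sketch of the observable effect, assuming the public `scrape` and `get_links` API:

```rust
use spider::tokio;
use spider::website::Website;

#[tokio::main]
async fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    website.scrape().await;
    // before this commit the scrape path only collected pages, leaving the
    // visited set empty; now every received page URL is recorded as visited
    println!("visited {} links", website.get_links().len());
}
```
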
4 changes: 2 additions & 2 deletions spider_cli/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "spider_cli"
-version = "2.0.2"
+version = "2.0.3"
authors = [
"j-mendez <jeff@a11ywatch.com>"
]
@@ -28,7 +28,7 @@ quote = "1"
failure_derive = "0.1.8"

[dependencies.spider]
-version = "2.0.2"
+version = "2.0.3"
path = "../spider"

[[bin]]
2 changes: 1 addition & 1 deletion spider_utils/Cargo.toml
@@ -17,7 +17,7 @@ edition = "2018"
indexmap = { version = "1", optional = true }

[dependencies.spider]
-version = "2.0.2"
+version = "2.0.3"
path = "../spider"

[features]
4 changes: 2 additions & 2 deletions spider_worker/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "spider_worker"
-version = "2.0.2"
+version = "2.0.3"
authors = [
"j-mendez <jeff@a11ywatch.com>"
]
@@ -24,7 +24,7 @@ lazy_static = "1.4.0"
env_logger = "0.11.3"

[dependencies.spider]
-version = "2.0.2"
+version = "2.0.3"
path = "../spider"
features = ["serde", "flexbuffers"]

