
Refactoring of robotparser-rs #20

Merged
merged 3 commits into from
Jan 31, 2020

Conversation

svmk
Contributor

@svmk svmk commented Oct 2, 2019

Separated models.
Wrote documentation & unit tests.

There are a lot of changes. First of all, the models are separated:
there is no longer a single model responsible both for downloading the robots.txt file and for checking whether a rule applies.
Unfortunately, these changes are backward-incompatible.
I was also going to update the url dependency to 2.0, but this is not yet possible because of the reqwest dependency.
Either we wait for the reqwest update or make a separate crate, robotparser-rs-reqwest.

Code-review is required.

@messense messense requested review from messense and spk October 11, 2019 01:50
Collaborator

@spk spk left a comment


Thanks for the updates!

I was also going to update the url dependency to 2.0, but this is not yet possible because of the reqwest dependency.

Since this is a complete rewrite, maybe we can follow the latest reqwest version v0.10.0-alpha.1 and also update the url crate to v2 and see if everything goes well?

Just thinking while looking at ParseResult: maybe we could add some parse error tests, wdyt?

Looking forward to seeing this merged <3
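A parse-error test along the lines spk suggests could look like this std-only sketch. `ParseError` and `parse_line` are hypothetical illustrations, not the crate's actual `ParseResult` API.

```rust
// Hypothetical sketch of a parse-error test; ParseError and parse_line
// are illustrative names, not the crate's real types.

#[derive(Debug, PartialEq)]
enum ParseError {
    MissingColon,
}

/// Parse a single "Key: value" robots.txt line, failing on malformed input.
fn parse_line(line: &str) -> Result<(String, String), ParseError> {
    match line.split_once(':') {
        Some((key, value)) => Ok((key.trim().to_string(), value.trim().to_string())),
        None => Err(ParseError::MissingColon),
    }
}

fn main() {
    // Well-formed directive parses into a key/value pair.
    assert_eq!(
        parse_line("User-agent: *"),
        Ok(("User-agent".to_string(), "*".to_string()))
    );
    // Malformed input surfaces a typed error instead of being silently dropped.
    assert_eq!(parse_line("garbage without separator"), Err(ParseError::MissingColon));
    println!("ok");
}
```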

@@ -8,7 +8,8 @@ license = "MIT"
 name = "robotparser"
 readme = "README.md"
 repository = "https://github.com/messense/robotparser-rs"
-version = "0.10.2"
+version = "0.11.0"
Collaborator


Usually I let the maintainer decide the next version release.

src/lib.rs Outdated
None
}
}
// TODO: 3. Unit tests.
Collaborator


Do you plan to fix those, or are they leftovers?

type Result = RobotsTxtResponse;
fn fetch_robots_txt(&self, origin: Origin) -> Self::Result {
let url = format!("{}/robots.txt", origin.unicode_serialization());
let url = Url::parse(&url).expect("Unable to parse robots.txt url");
Collaborator


I know this is not currently tested, but maybe we can add a test for that, wdyt?
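One way to make the `expect` in `fetch_robots_txt` testable is to pull the URL construction into a fallible helper. The sketch below uses only std (a simple scheme check stands in for `Url::parse` from the url crate); `Origin` and `robots_txt_url` are simplified illustrations, not the crate's real types.

```rust
// Sketch: make robots.txt URL construction fallible instead of panicking.
// `Origin` and the scheme check are simplified stand-ins for the real code,
// which uses url::Origin and Url::parse.

struct Origin(String); // e.g. "https://example.com"

fn robots_txt_url(origin: &Origin) -> Result<String, String> {
    let url = format!("{}/robots.txt", origin.0.trim_end_matches('/'));
    // Minimal sanity check standing in for Url::parse.
    if url.starts_with("http://") || url.starts_with("https://") {
        Ok(url)
    } else {
        Err(format!("invalid robots.txt url: {}", url))
    }
}

fn main() {
    // The happy path yields the expected robots.txt location...
    let ok = robots_txt_url(&Origin("https://example.com".to_string()));
    assert_eq!(ok, Ok("https://example.com/robots.txt".to_string()));
    // ...while a bad origin returns an error instead of panicking.
    assert!(robots_txt_url(&Origin("not a url".to_string())).is_err());
    println!("ok");
}
```

Returning a `Result` lets both branches be asserted in unit tests, whereas the current `expect` can only be exercised through a panic.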

@svmk
Contributor Author

svmk commented Jan 29, 2020

Updates pushed.

@messense messense merged commit 2d19755 into messense:master Jan 31, 2020
This was referenced Jan 31, 2020