Refactoring of robotparser-rs #20
Conversation
Thanks for the updates!
I was also going to update the url dependency to 2.0, but this is not yet possible because of the reqwest dependency.
Since this is a complete rewrite, maybe we can follow the latest reqwest version v0.10.0-alpha.1, also update the url crate to v2, and see if everything goes well?
Just thinking while looking at ParseResult: maybe we could add some parse error tests, wdyt?
Looking forward to seeing this merged <3
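To illustrate what parse-error tests could look like: the sketch below uses a minimal, self-contained stand-in parser, not the actual robotparser-rs API. `parse_directive` and `ParseError` are hypothetical names invented for this example.

```rust
// Hypothetical sketch of parse-error testing for a robots.txt line parser.
// `parse_directive` and `ParseError` are illustrative stand-ins, not the
// real robotparser-rs API.
#[derive(Debug, PartialEq)]
enum ParseError {
    MissingColon,
    UnknownDirective,
}

/// Parse a single robots.txt line into a (key, value) pair.
fn parse_directive(line: &str) -> Result<(String, String), ParseError> {
    // A directive must contain a colon separating key from value.
    let (key, value) = line.split_once(':').ok_or(ParseError::MissingColon)?;
    let key = key.trim().to_ascii_lowercase();
    match key.as_str() {
        "user-agent" | "allow" | "disallow" | "crawl-delay" | "sitemap" => {
            Ok((key, value.trim().to_string()))
        }
        _ => Err(ParseError::UnknownDirective),
    }
}

fn main() {
    // Well-formed directive parses successfully.
    assert_eq!(
        parse_directive("Disallow: /private"),
        Ok(("disallow".to_string(), "/private".to_string()))
    );
    // Error cases are exactly what the suggested tests would cover.
    assert_eq!(parse_directive("no colon here"), Err(ParseError::MissingColon));
    assert_eq!(parse_directive("frobnicate: x"), Err(ParseError::UnknownDirective));
}
```

The point is that a `Result`-returning parse API makes malformed-input behavior directly assertable in unit tests.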
@@ -8,7 +8,8 @@ license = "MIT"
 name = "robotparser"
 readme = "README.md"
 repository = "https://github.com/messense/robotparser-rs"
-version = "0.10.2"
+version = "0.11.0"
Usually I let the maintainer decide the next version release
src/lib.rs
Outdated
        None
    }
}
// TODO: 3. Unit tests.
Do you plan to fix these, or are they leftovers?
type Result = RobotsTxtResponse;

fn fetch_robots_txt(&self, origin: Origin) -> Self::Result {
    let url = format!("{}/robots.txt", origin.unicode_serialization());
    let url = Url::parse(&url).expect("Unable to parse robots.txt url");
I know this is not currently tested, but maybe we can add a test for that, wdyt?
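One way to make the `expect` call testable is to factor the URL construction into a small fallible helper. The sketch below is dependency-free: `build_robots_txt_url` is a hypothetical helper, and the trivial scheme check stands in for `url::Url::parse` only for illustration.

```rust
// Sketch of making robots.txt URL construction testable instead of
// panicking via `expect`. `build_robots_txt_url` is a hypothetical name;
// the real code uses url::Url::parse, stubbed here with a simple scheme
// check to keep the example self-contained.
fn build_robots_txt_url(origin: &str) -> Result<String, String> {
    let url = format!("{}/robots.txt", origin);
    // Stand-in validation: accept only http(s) origins.
    if url.starts_with("http://") || url.starts_with("https://") {
        Ok(url)
    } else {
        Err(format!("unable to parse robots.txt url: {}", url))
    }
}

fn main() {
    // A valid origin yields the expected robots.txt URL.
    assert_eq!(
        build_robots_txt_url("https://example.com"),
        Ok("https://example.com/robots.txt".to_string())
    );
    // An invalid origin surfaces as an Err instead of a panic.
    assert!(build_robots_txt_url("not a url").is_err());
}
```

Returning `Result` lets a unit test exercise the failure path directly, which an `expect`-based implementation cannot do without catching a panic.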
Updates pushed.
Separated models.
Wrote documentation & unit tests.
There are a lot of changes. First of all, the models have been separated: there is no longer a single model that both downloads the robots.txt file and checks whether a rule applies.
Unfortunately, these changes are backward-incompatible.
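The described split can be sketched as follows. This is a simplified illustration of the design, not the crate's real API: the trait, struct, and method names here are invented stand-ins (the actual code uses types like `Origin` and `RobotsTxtResponse`).

```rust
// Hedged sketch of the model separation: fetching robots.txt is one
// concern (a trait), while rule checking lives on the parsed model.
// All names are illustrative, not robotparser-rs's real API.
trait FetchRobotsTxt {
    fn fetch_robots_txt(&self, origin: &str) -> String;
}

struct RobotsTxt {
    disallowed: Vec<String>,
}

impl RobotsTxt {
    /// Very simplified parser: collect `Disallow:` path prefixes.
    fn parse(body: &str) -> Self {
        let disallowed = body
            .lines()
            .filter_map(|l| l.strip_prefix("Disallow:"))
            .map(|p| p.trim().to_string())
            .collect();
        RobotsTxt { disallowed }
    }

    /// Rule validity check belongs to the parsed model, not the fetcher.
    fn can_fetch(&self, path: &str) -> bool {
        !self.disallowed.iter().any(|p| path.starts_with(p.as_str()))
    }
}

fn main() {
    let robots = RobotsTxt::parse("User-agent: *\nDisallow: /private\n");
    assert!(robots.can_fetch("/public/page"));
    assert!(!robots.can_fetch("/private/data"));
}
```

Separating the two concerns like this is what makes the parser usable without an HTTP client, at the cost of the backward incompatibility mentioned above.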
I was also going to update the url dependency to 2.0, but this is not yet possible because of the reqwest dependency. We can either wait for the reqwest update or make a separate crate, robotparser-rs-reqwest.
Code review is required.