
Allow the user to specify a maximum drive speed when ripping #225

Closed · wants to merge 3 commits

Conversation

@calumchisholm (Contributor)

Implements #204.

Changed:

  • Adds the command-line parameter --speed (or -s).
  • Adds max_read_speed as a per-drive setting in whipper.conf. If set, the value is reported when running whipper drive list. (An illustrative config entry is sketched after this list.)
  • Modifies the calculation used to determine the speed of a rip. Previously, this was simply the duration of the track (in seconds) divided by the time taken to read it from disc (also in seconds). The speed is now calculated as a multiple of the single-speed CD-ROM data rate [1]. This produces marginally different values from before, but is perhaps more closely aligned with cd-paranoia's --force-read-speed option. I'm not precious about this change, though, so feel free to revert.
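For illustration, a per-drive entry in whipper.conf could look something like the sketch below. This is an assumption about the final shape, not the PR's actual diff: the section naming is schematic, read_offset and defeats_cache are existing whipper per-drive keys shown for context, and max_read_speed is the key this PR adds.

    # Schematic whipper.conf drive section (section name simplified)
    [drive:VENDOR%3AMODEL%3ARELEASE]
    read_offset = 6
    defeats_cache = True
    # Added by this PR: cap this drive's read speed; the value is
    # passed through to cd-paranoia's --force-read-speed.
    max_read_speed = 8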

Unchanged:

  • This setting is ignored when running whipper offset find; it is applied only when ripping / verifying. Is there any benefit to throttling the drive at that stage of the workflow?

Notes:

  • This speed value is passed straight through to cd-paranoia's --force-read-speed option. cd-paranoia appears to accept only integer values, but I haven't enforced this in my code.
  • This argument is only a request that the read speed be throttled; some drive models will ignore it.
  • I'm still new to Python, so code review comments are welcome. Most of this is copied from the existing drive-offset config code, so hopefully it's not too broken.

[1] A single-speed CD-ROM drive reads data at 153,600 bytes per second (150 KiB/s), so speed can be calculated as: .wav file size in bytes / (read time in seconds × 153,600).
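As a minimal sketch of that calculation (a hypothetical helper for illustration, not the PR's actual code):

    import os

    # 1x CD-ROM data rate in bytes per second (150 KiB/s), as in [1].
    SINGLE_SPEED_BYTES_PER_SEC = 153600

    def rip_speed(wav_path, read_seconds):
        """Return the effective rip speed as a multiple of a 1x drive."""
        return os.path.getsize(wav_path) / (
            read_seconds * SINGLE_SPEED_BYTES_PER_SEC)

    # Example: a 40 MiB track read in 10 seconds ripped at roughly 27x.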

A contributor left a review comment on these docstring lines in the diff:

    :param model:
    :param release:
    :param offset:
    :param vendor: Drive vendor

I think it'd be worth simply deleting the :param vendor: etc lines wholesale. The meaning of the parameters is already quite clear from context.

@@ -115,6 +115,22 @@ def getReadOffset(self, vendor, model, release):
raise KeyError("Could not find read_offset for %s/%s/%s" % (
vendor, model, release))

def getMaxReadSpeed(self, vendor, model, release):
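The hunk above truncates the body of the new method; presumably it mirrors the getReadOffset pattern visible in the context lines. A sketch under that assumption (the section-lookup helper name is hypothetical):

    import configparser

    def getMaxReadSpeed(self, vendor, model, release):
        """Return the configured max_read_speed for the given drive."""
        # Hypothetical helper: locate the [drive:...] section for this drive.
        section = self._findDriveSection(vendor, model, release)
        try:
            return int(self.get(section, 'max_read_speed'))
        except (configparser.NoSectionError, configparser.NoOptionError):
            raise KeyError("Could not find max_read_speed for %s/%s/%s" % (
                vendor, model, release))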
A contributor left a review comment on getMaxReadSpeed:

You don't need to define a custom method here to get the parameter out of the config file: the generic read-CLI-options-from-the-config code will automatically search for speed in the config file and use it as the default for the -s argument if it's found.
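The mechanism being described follows a common pattern: the config value, when present, becomes the default for the CLI flag, so an explicit -s on the command line still wins. A generic illustration of the technique, not whipper's actual plumbing (the section name is made up):

    import argparse
    import configparser

    config = configparser.ConfigParser()
    config.read('whipper.conf')

    parser = argparse.ArgumentParser()
    # Fall back to the config file's value (if any) as the default;
    # a speed given on the command line overrides it.
    parser.add_argument(
        '-s', '--speed', type=int,
        default=config.getint('rip.cd.rip', 'speed', fallback=None))

    args = parser.parse_args()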

@RecursiveForest (Contributor)

Overall: NAK. I don't think adding another parameter to whipper to configure an underlying program is worth it. I think the correct solution is to add a configuration option for each executable whipper calls to allow setting options directly. See #244

@calumchisholm (Contributor, Author)

In general I agree that adding explicit support for each and every available config option isn't a road we want to go down.

I think an important distinction here is that this is a per-drive setting. A user may quite reasonably want to limit their read speed when using their (difficult/expensive to replace) slot-loading drive, but happily thrash a cheap OEM USB device. This unfortunately means that workarounds like adding a flag into the user's local copy of the code aren't really a viable solution.

Is support for device-specific settings going to be within the scope of #244? I can imagine it may not be something that's straightforward to implement cleanly.

@RecursiveForest (Contributor)

I hadn't thought of it, but per-drive options are totally in the scope of #244. Should be very easy to implement.

@JoeLametta (Collaborator)

Closed because of #244.
