Speed-up for __contains__ in linear codes #18607
Comments
Branch: u/dlucas/speedup_in_contains
Commit: 6319903
Author: David Lucas
comment:3
An obvious alternative is to use the existing implementation but cache the space. Now, your tests are not very convincing: you are testing only one set of parameters over one field. Furthermore, you are testing only for speed. I quickly wrote a more comprehensive set of tests. Here are some timings:
As can be seen, "sug" does much better than "new" when we are testing codewords, except that the first call is sometimes exorbitantly expensive (for high-rate codes). On non-codewords, "new" seems to beat "sug" more or less always, though less so on low-rate codes.
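The "new" approach under discussion, checking membership directly rather than comparing against an enclosing set of codewords, can be sketched as a syndrome test: a word belongs to a linear code iff its syndrome with respect to a parity-check matrix is zero. The sketch below is illustrative only (plain Python over GF(2), not Sage's actual implementation), and the function names are hypothetical.

```python
# Illustrative sketch (not Sage's actual code): membership test for a binary
# linear code via the syndrome H*v over GF(2), using plain Python ints mod 2.

def syndrome(H, v):
    """Compute H*v over GF(2); H is a list of rows, v a list of bits."""
    return [sum(h * x for h, x in zip(row, v)) % 2 for row in H]

def contains(H, v):
    """A word v is a codeword iff every coordinate of its syndrome is zero."""
    return all(s == 0 for s in syndrome(H, v))

# Parity-check matrix of the [7,4] Hamming code, a standard example.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

print(contains(H, [0, 0, 0, 0, 0, 0, 0]))  # True: the zero word is always a codeword
print(contains(H, [1, 0, 0, 0, 0, 0, 0]))  # False: a weight-1 word is not a codeword here
```

The cost per call is one matrix-vector product, independent of how many previous calls were made, which is consistent with the flat timings reported for "new" above.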
comment:4
All in all, I think I vote for "new", i.e. the current ticket's solution. The typical use case for codes would be to often check codeword membership that fails. Also, I'm quite concerned about the exorbitant price of the first call with "sug".
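The "exorbitant first call" pattern described above is characteristic of a cache built lazily on first use: the first membership test pays to build the cached structure (which grows like 2^k for a binary code of dimension k, hence the blow-up for high-rate codes), and later calls merely look it up. A toy sketch of that trade-off, with hypothetical names and no relation to Sage's actual API:

```python
# Toy sketch of a "cache on first use" membership test: the first call builds
# the full set of codewords (exponential in the dimension k), later calls are
# cheap set lookups. Names are hypothetical; this is not Sage's implementation.
from itertools import product

class ToyCode:
    def __init__(self, generator_rows):
        self.generator_rows = generator_rows  # generator matrix over GF(2), as rows
        self._codewords = None                # built lazily on the first lookup

    def _all_codewords(self):
        if self._codewords is None:
            k = len(self.generator_rows)
            n = len(self.generator_rows[0])
            words = set()
            # Enumerate all 2^k information words: expensive for high-rate codes.
            for coeffs in product((0, 1), repeat=k):
                word = tuple(
                    sum(c * row[j] for c, row in zip(coeffs, self.generator_rows)) % 2
                    for j in range(n)
                )
                words.add(word)
            self._codewords = words
        return self._codewords

    def __contains__(self, v):
        return tuple(v) in self._all_codewords()

code = ToyCode([[1, 1, 1]])        # length-3 binary repetition code
print([1, 1, 1] in code)           # True: triggers the expensive build, then looks up
print([1, 0, 1] in code)           # False: reuses the cached set
```

This also makes the trade-off concrete: amortized lookups are fast once the cache exists, but a workload dominated by failing membership checks never recoups the first-call cost, matching the argument for "new" above.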
comment:5
I think it's better to write
comment:6
So if the author has not changed his mind as a consequence of my comments, I give this the green light. All tests pass and the documentation builds. (I fixed the bug in the test code.)
comment:7
Reviewer name missing

Reviewer: Johan Sebastian Rosenkilde Nielsen
Changed branch from u/dlucas/speedup_in_contains to 6319903
CC: @johanrosenkilde
Component: coding theory
Author: David Lucas
Branch/Commit: 6319903
Reviewer: Johan Sebastian Rosenkilde Nielsen
Issue created by migration from https://trac.sagemath.org/ticket/18607