Improve complexity of CompactAssignments::unique_targets #8314
Conversation
The original implementation was O(n**2); the current implementation is O(n log n). I avoided the originally proposed mitigation because it does not retain the de-duplicating property present in the original implementation. This implementation does a little more work, but retains that property.
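For intuition, here is a minimal standalone sketch contrasting the two approaches (hypothetical free functions over plain integers, not the actual macro-generated code; note the set-based version also returns targets in sorted rather than insertion order):

```rust
use std::collections::BTreeSet;

// O(n**2): each push scans everything collected so far.
fn unique_targets_quadratic(targets: &[u32]) -> Vec<u32> {
    let mut all: Vec<u32> = Vec::with_capacity(targets.len());
    for &t in targets {
        if !all.contains(&t) { // O(n) linear scan per element
            all.push(t);
        }
    }
    all
}

// O(n log n): BTreeSet insertion is O(log n) per element and
// de-duplicates as it goes, retaining the original's property.
fn unique_targets_sorted(targets: &[u32]) -> Vec<u32> {
    let all: BTreeSet<u32> = targets.iter().copied().collect();
    all.into_iter().collect()
}

fn main() {
    let input = [3u32, 1, 3, 2, 1];
    assert_eq!(unique_targets_quadratic(&input), vec![3, 1, 2]);
    assert_eq!(unique_targets_sorted(&input), vec![1, 2, 3]);
}
```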
LGTM!
The deduplication is absolutely important, and I believe we should have tests for it. If not, this is the time to add one. I see that while using …

If we had access to a deterministic …
I was wrong; the imports need to be sorted.
Ensures that the macro still works if someone uses it in a context in which sp_std is not imported or is renamed.
```diff
-let mut all_targets: Vec<Self::Target> = Vec::with_capacity(self.average_edge_count());
+use _npos::sp_std::collections::btree_set::BTreeSet;
+
+let mut all_targets: BTreeSet<Self::Target> = BTreeSet::new();
```
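A minimal sketch of the idea (using a hypothetical `dedup!` macro; `$crate` stands in for the `_npos` re-export alias used by the real macro): the expansion spells out full paths, so it compiles even when the call site has not imported, or has renamed, the collection types:

```rust
pub mod reexports {
    // Re-export so the macro can name the type through a stable path.
    pub use std::collections::BTreeSet;
}

#[macro_export]
macro_rules! dedup {
    ($iter:expr) => {{
        // Fully qualified paths: this still compiles even if the
        // caller never imported BTreeSet, or imported something
        // else under that name.
        let set: $crate::reexports::BTreeSet<_> = $iter.collect();
        set.into_iter().collect::<::std::vec::Vec<_>>()
    }};
}

fn main() {
    // Note: no `use std::collections::BTreeSet;` at the call site.
    let deduped: Vec<u32> = dedup!(vec![3u32, 1, 3, 2, 1].into_iter());
    assert_eq!(deduped, vec![1, 2, 3]);
}
```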
I don't remember: we do have a proper test for this `unique_targets`, right?
No, AFAICT this crate's macros don't generate any test functions, and we don't test them with an example case. Would you like me to add that?
hmm let me see. There must be some tests in the root-level crate, otherwise I've been a very bad coder.
There are some instances of `unique_targets` there; I presume that back then I found them enough, but it would be good if you ensure they are safe and sound again.
Oh cool! I'll look over them there.
IMO `unique_targets_len_edge_count_works` sufficiently exercises `unique_targets`: in two different instances, it provides non-unique inputs and validates that `unique_targets` unifies them.
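For reference, a sketch of what such a check can look like in isolation (a hypothetical standalone `unique_targets` over `(voter, targets)` pairs, not the macro-generated method that the real `unique_targets_len_edge_count_works` exercises):

```rust
use std::collections::BTreeSet;

// Hypothetical stand-in for the macro-generated method: collect every
// target across all assignments into a BTreeSet so duplicates collapse,
// then return them as a (sorted) Vec.
fn unique_targets(assignments: &[(u32, Vec<u16>)]) -> Vec<u16> {
    let mut all: BTreeSet<u16> = BTreeSet::new();
    for (_voter, targets) in assignments {
        all.extend(targets.iter().copied());
    }
    all.into_iter().collect()
}

#[test]
fn unique_targets_deduplicates() {
    // Non-unique inputs: targets 10 and 20 appear under several voters.
    let assignments = vec![(1, vec![10, 20]), (2, vec![20, 30]), (3, vec![10])];
    assert_eq!(unique_targets(&assignments), vec![10, 20, 30]);
}
```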
bot merge

Trying merge.

What was the audit conclusion here?

AFAIU you wrote the best formulation of the policy: "generally it should be audited again, except such cases where the change is trivial as well." The heart of this PR is +5/-6 which just swaps a vector for a BTreeSet; I think that counts as trivial.
Improve complexity of CompactAssignments::unique_targets (#8314)

* Improve complexity of CompactAssignments::unique_targets

Original implementation was O(n**2). Current impl is O(n log n).
Avoided the original proposed mitigation because it does not retain
the de-duplicating property present in the original implementation.
This implementation does a little more work, but retains that property.

Mitigates https://github.com/paritytech/srlabs_findings/issues/58.

* Explicitly choose sp_std Vec and BTreeSet

Ensures that the macro still works if someone uses it in a context in which sp_std is not imported or is renamed.

* explicitly use sp_std vectors throughout compact macro