
Remove duplicate blocks from changes_list before import #2777

Merged
merged 2 commits into master from pp-remove-duplicate-blocks-before-insertion
Nov 25, 2019

Conversation

pasqu4le
Contributor

Motivation

Solves an error (mostly triggered by the block_uncle fetcher) that arises when the same block is present twice in the same changes_list.
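
For context, a minimal sketch of the failure mode, assuming the import batches rows through Ecto's Repo.insert_all/3 with an :on_conflict upsert (the exact call site is not shown in this PR, so Repo and Block below are placeholders):

# Hypothetical illustration only: the same block appears twice in one batch.
changes_list = [
  %{hash: "0xabc", number: 1, consensus: true},
  %{hash: "0xabc", number: 1, consensus: true}
]

# Under an upsert, Postgres rejects a statement that touches the same
# conflict target twice ("ON CONFLICT DO UPDATE command cannot affect
# row a second time"), so the whole import fails.
Repo.insert_all(Block, changes_list,
  on_conflict: :replace_all,
  conflict_target: :hash
)

De-duplicating changes_list before the insert avoids that case.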

Checklist for your PR

@pasqu4le pasqu4le self-assigned this Oct 15, 2019
@pasqu4le pasqu4le requested review from vbaranov and ayrat555 October 15, 2019 10:04
@pasqu4le pasqu4le added the "ready for review" label Oct 15, 2019
@ayrat555
Contributor

are you sure this is only happening with uncles?

@coveralls

Pull Request Test Coverage Report for Build e9e300a7-1bec-4456-b257-49e8d1b7bfa7

  • 3 of 3 (100.0%) changed or added relevant lines in 1 file are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage increased (+0.2%) to 77.755%

Totals Coverage Status
  • Change from base Build ed1f6586-be28-4ce7-aad1-e283ea55d6a6: +0.2%
  • Covered Lines: 5341
  • Relevant Lines: 6869

💛 - Coveralls

@pasqu4le
Contributor Author

@ayrat555 it looks like it, uncles is the only one where this is happening (from the logs I have). The fix modifies the runner, so if this is happening with other block-inserting fetchers as well, they will be de-duplicated too.

I am looking into why this fetcher is the only one affected, since the others should have the same filtering before fetching and should not end up with duplicates during importing.

- ordered_changes_list = Enum.sort_by(changes_list, & &1.hash)
+ ordered_changes_list =
+   changes_list
+   |> Enum.sort_by(& &1.hash)
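
Only the sort is visible in the excerpt above. As a sketch of what the full pipeline presumably looks like once the de-duplication step is appended (my inference from the discussion below, not a quote of the merged code):

# Sort first so equal hashes become adjacent, then drop consecutive
# duplicates. Enum.dedup_by/2 only removes neighbouring repeats, which
# is why the sort has to come first.
ordered_changes_list =
  changes_list
  |> Enum.sort_by(& &1.hash)
  |> Enum.dedup_by(& &1.hash)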
Contributor


Enum.uniq_by?

Contributor Author


That would work for the duplicates, but they also need to be sorted
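
To make the difference concrete, a small comparison on toy data (not from the PR): Enum.uniq_by/2 removes duplicates but keeps the original order, while sorting and then dropping consecutive duplicates yields a list that is both unique and ordered by hash, which is what the import step needs:

blocks = [%{hash: "0xc"}, %{hash: "0xa"}, %{hash: "0xc"}, %{hash: "0xb"}]

# Removes duplicates, keeps insertion order:
Enum.uniq_by(blocks, & &1.hash)
#=> [%{hash: "0xc"}, %{hash: "0xa"}, %{hash: "0xb"}]

# Sort, then drop consecutive duplicates: unique and ordered by hash.
blocks |> Enum.sort_by(& &1.hash) |> Enum.dedup_by(& &1.hash)
#=> [%{hash: "0xa"}, %{hash: "0xb"}, %{hash: "0xc"}]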

@vbaranov vbaranov merged commit 824cc33 into master Nov 25, 2019
@vbaranov vbaranov deleted the pp-remove-duplicate-blocks-before-insertion branch November 25, 2019 13:52