
compact/planner: fix issue 6775 #7334

Merged (2 commits, May 17, 2024)

Conversation

@GiedriusS (Member) commented May 3, 2024

It doesn't make sense to vertically compact downsampled blocks, so mark them with the no-compact marker if downsampled blocks are detected in the plan. The Planner seems like the best place for this logic; I just repeated the pattern previously used for the large index file filter.

Closes #6775.
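The approach can be sketched roughly as follows. This is a minimal illustration, not the actual Thanos code: the `Meta` type, field names, and `markDownsampledNoCompact` are all hypothetical stand-ins for the real planner and metadata types.

```go
package main

import "fmt"

// Meta is a hypothetical stand-in for block metadata.
type Meta struct {
	ULID       string
	Resolution int64 // 0 means raw data; >0 means the block is downsampled
}

// markDownsampledNoCompact splits the planned blocks into raw blocks to keep
// planning with and downsampled blocks that should receive a no-compact
// marker (in Thanos, a marker file written to object storage).
func markDownsampledNoCompact(metas []Meta) (keep, mark []Meta) {
	for _, m := range metas {
		if m.Resolution > 0 {
			mark = append(mark, m) // would get a no-compact marker
			continue
		}
		keep = append(keep, m)
	}
	return keep, mark
}

func main() {
	metas := []Meta{
		{ULID: "01A", Resolution: 0},
		{ULID: "01B", Resolution: 300000}, // 5m downsampled
		{ULID: "01C", Resolution: 0},
	}
	keep, mark := markDownsampledNoCompact(metas)
	fmt.Println(len(keep), len(mark)) // prints "2 1"
}
```

The point of doing this in the planner (rather than in the downsampler) is that the plan is the last place where downsampled blocks could still slip into a vertical compaction.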

Commit: compact/planner: fix issue 6775
Signed-off-by: Giedrius Statkevičius <giedrius.statkevicius@vinted.com>
@GiedriusS GiedriusS marked this pull request as ready for review May 3, 2024 13:07
@yeya24 (Contributor) left a comment

Instead of creating a new planner and excluding downsampled blocks at planning time, why can't we filter out downsampled blocks from metasByMinTime when we pass blocks to the planner?

I understand that compaction should only happen for raw blocks. Is there a use case for compacting downsampled blocks, or for compacting downsampled blocks together with raw blocks? If we did that, downsampled blocks would turn into raw blocks again, and we would need to downsample them again. I feel this is not ideal.

@GiedriusS (Member, Author)

> Instead of creating a new planner and excluding downsampled blocks at planning time, why can't we filter out downsampled blocks from metasByMinTime when we pass blocks to the planner?
>
> I understand that compaction should only happen for raw blocks. Is there a use case for compacting downsampled blocks, or for compacting downsampled blocks together with raw blocks? If we did that, downsampled blocks would turn into raw blocks again, and we would need to downsample them again. I feel this is not ideal.

I am inclined to agree, but I am afraid there is some weird use case out there that depends on compacting downsampled blocks. Maybe we could go with the current change as-is for the next release, and then in $nextrelease+1 I will do as you've suggested?

@GiedriusS (Member, Author)

I deployed this in prod and it works. 😄

@nicolastakashi (Contributor)

I also deployed this in prod and it's working like a charm 🥳

yeya24 previously approved these changes May 9, 2024

@yeya24 (Contributor) left a comment

I am OK with having the no-compact marker first.

@yeya24 (Contributor) commented May 9, 2024

> I am inclined to agree, but I am afraid there is some weird use case out there that depends on compacting downsampled blocks

Let me think about this more. Downsampled blocks can only be compacted with downsampled blocks of the same resolution, since they are in the same compaction group. So it does seem possible for the current planner to create a plan that compacts downsampled blocks together. But I would only expect blocks to get compacted into a raw block, with that raw block then getting downsampled again.

When two downsampled blocks overlap with each other, it is probably valid to do a vertical compaction and merge them into one block. Is it possible to create a downsampled block directly, rather than compacting a raw block first and then downsampling it again?

@GiedriusS (Member, Author)

> I am inclined to agree, but I am afraid there is some weird use case out there that depends on compacting downsampled blocks
>
> Let me think about this more. Downsampled blocks can only be compacted with downsampled blocks of the same resolution, since they are in the same compaction group. So it does seem possible for the current planner to create a plan that compacts downsampled blocks together. But I would only expect blocks to get compacted into a raw block, with that raw block then getting downsampled again.
>
> When two downsampled blocks overlap with each other, it is probably valid to do a vertical compaction and merge them into one block. Is it possible to create a downsampled block directly, rather than compacting a raw block first and then downsampling it again?

Unless two downsampled, vertically overlapping blocks are exactly identical, it is impossible to do vertical compaction on them. For example, it would be impossible to correctly calculate AggrCount from two non-identical values.

I agree with you, and I'm not 100% certain how this could happen, but Filip had an idea that it may be related to the "index size too big" filter: by filtering some blocks out of compaction, it can leave "gaps" in the sequence of blocks. The downsampling step then probably picks up those blocks that were left out of compaction, and we end up in this situation.
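The AggrCount problem can be illustrated with a toy example (the numbers and function are purely illustrative, not the Thanos aggregation code). Two overlapping downsampled chunks each carry a pre-aggregated sample count for the same 5m window; with the raw samples gone, there is no way to know how many samples the two counts had in common, so the true merged count is only bounded, not determined:

```go
package main

import "fmt"

// mergedCountBounds returns the only thing that can be said about the merged
// AggrCount of two overlapping downsampled windows built from possibly
// non-identical raw samples: it lies between max(a, b) (the raw samples fully
// overlapped) and a+b (they were completely disjoint).
func mergedCountBounds(a, b int) (lower, upper int) {
	lower = a
	if b > a {
		lower = b
	}
	upper = a + b
	return lower, upper
}

func main() {
	// e.g. two replicas that each missed a few scrapes in the same window
	lo, hi := mergedCountBounds(57, 60)
	fmt.Printf("true AggrCount is somewhere in [%d, %d]\n", lo, hi) // [60, 117]
}
```

Any single value the compactor picked from that range would be a guess, which is why marking such blocks no-compact is safer than merging them.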

@yeya24 (Contributor) commented May 13, 2024

> Then, the downsampling part probably picks up those blocks that had been left out from compaction. And we end up with this situation.

We can merge this PR first and think about not compacting downsampled blocks at all in the next release.

@yeya24 (Contributor) commented May 13, 2024

@GiedriusS Let's add a changelog entry?

Signed-off-by: Giedrius Statkevičius <giedrius.statkevicius@vinted.com>
@GiedriusS (Member, Author)

@yeya24 added

@GiedriusS GiedriusS requested a review from yeya24 May 17, 2024 07:02
@lu-xiansheng

Can we filter out the blocks that are already fully compacted and only then downsample them? That way downsampled blocks would never be compacted again. There shouldn't be any other problems with this, right?
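That suggestion could be sketched roughly like this. Everything here is hypothetical: the `Meta` type, the `CompactionLevel` field, and the `maxLevel` constant are illustrative stand-ins, and in practice the final compaction level depends on the configured compaction ranges.

```go
package main

import "fmt"

// Meta is a hypothetical stand-in for block metadata.
type Meta struct {
	Resolution      int64 // 0 means raw data
	CompactionLevel int   // how many compaction passes produced this block
}

// maxLevel is an assumption for this sketch; the real value would follow
// from the configured compaction ranges.
const maxLevel = 4

// eligibleForDownsampling gates the downsampler on fully compacted raw
// blocks, so a downsampled block can never re-enter compaction.
func eligibleForDownsampling(m Meta) bool {
	return m.Resolution == 0 && m.CompactionLevel >= maxLevel
}

func main() {
	fmt.Println(eligibleForDownsampling(Meta{Resolution: 0, CompactionLevel: 4}))      // true
	fmt.Println(eligibleForDownsampling(Meta{Resolution: 300000, CompactionLevel: 4})) // false
}
```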

hczhu-db pushed a commit to databricks/thanos that referenced this pull request Aug 22, 2024
* compact/planner: fix issue 6775

It doesn't make sense to vertically compact downsampled blocks so mark
them with the no compact marker if downsampled blocks were detected in
the plan. Seems like the Planner is the best place for this logic - I
just repeated the previous pattern with the large index file filter.

Signed-off-by: Giedrius Statkevičius <giedrius.statkevicius@vinted.com>

* CHANGELOG: add item

Signed-off-by: Giedrius Statkevičius <giedrius.statkevicius@vinted.com>

---------

Signed-off-by: Giedrius Statkevičius <giedrius.statkevicius@vinted.com>
jnyi pushed a commit to jnyi/thanos that referenced this pull request Oct 17, 2024

Successfully merging this pull request may close these issues.

[Compactor] panic: unexpected seriesToChunkEncoder lack of iterations
4 participants