
avoid tiflash crash when query is killed (#3434) #3448

Merged

Conversation

ti-chi-bot
Member

This is an automated cherry-pick of #3434

What problem does this PR solve?

Issue Number: close #3401

Problem Summary:

What is changed and how it works?

Proposal: xxx

What's Changed:

Root cause:

  1. NPE by memory tracker
2021.11.12 12:36:14.389127 [ 112803 ] <Error> BaseDaemon: ########################################                                                                                                 
2021.11.12 12:36:14.393874 [ 112803 ] <Error> BaseDaemon: (from thread 112614) Received signal Segmentation fault (11).
2021.11.12 12:36:14.393924 [ 112803 ] <Error> BaseDaemon: Address: NULL pointer.
2021.11.12 12:36:14.393959 [ 112803 ] <Error> BaseDaemon: Access: read.
2021.11.12 12:36:14.393980 [ 112803 ] <Error> BaseDaemon: Unknown si_code.
2021.11.12 12:36:14.444986 [ 112803 ] <Error> BaseDaemon: 0. /data1/xufei/grafana_test_51x/tiflash1/tiflash(MemoryTracker::free(long)+0x22) [0x3701782]
2021.11.12 12:36:14.445027 [ 112803 ] <Error> BaseDaemon: 1. /data1/xufei/grafana_test_51x/tiflash1/tiflash(MemoryTracker::alloc(long)+0x776) [0x3701fe6]
2021.11.12 12:36:14.445059 [ 112803 ] <Error> BaseDaemon: 2. /data1/xufei/grafana_test_51x/tiflash1/tiflash(Allocator<true>::realloc(void*, unsigned long, unsigned long, unsigned long)+0x100) [0x
2021.11.12 12:36:14.445085 [ 112803 ] <Error> BaseDaemon: 3. /data1/xufei/grafana_test_51x/tiflash1/tiflash() [0x7659dd2]
2021.11.12 12:36:14.445105 [ 112803 ] <Error> BaseDaemon: 4. /data1/xufei/grafana_test_51x/tiflash1/tiflash() [0x767a81a]
2021.11.12 12:36:14.445142 [ 112803 ] <Error> BaseDaemon: 5. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::Join::insertFromBlockInternal(DB::Block*, unsigned long)+0x14ad) [0x7686d4d]
2021.11.12 12:36:14.445165 [ 112803 ] <Error> BaseDaemon: 6. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::Join::insertFromBlock(DB::Block const&, unsigned long)+0x4a1) [0x76885f1]
2021.11.12 12:36:14.445185 [ 112803 ] <Error> BaseDaemon: 7. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::HashJoinBuildBlockInputStream::readImpl()+0x3c) [0x798be8c]
2021.11.12 12:36:14.445227 [ 112803 ] <Error> BaseDaemon: 8. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<fals
2021.11.12 12:36:14.445248 [ 112803 ] <Error> BaseDaemon: 9. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::IProfilingBlockInputStream::read()+0x17) [0x6902ff7]
2021.11.12 12:36:14.445274 [ 112803 ] <Error> BaseDaemon: 10. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::ParallelInputsProcessor<DB::UnionBlockInputStream<(DB::StreamUnionMode)0>::Handler
2021.11.12 12:36:14.445300 [ 112803 ] <Error> BaseDaemon: 11. /data1/xufei/grafana_test_51x/tiflash1/tiflash(DB::ParallelInputsProcessor<DB::UnionBlockInputStream<(DB::StreamUnionMode)0>::Handler
2021.11.12 12:36:14.445321 [ 112803 ] <Error> BaseDaemon: 12. /data1/xufei/grafana_test_51x/tiflash1/tiflash() [0x873e10f]
2021.11.12 12:36:14.445342 [ 112803 ] <Error> BaseDaemon: 13. /usr/lib64/libpthread.so.0(+0x6e63) [0x7f6344890e63]

The memory tracker is destructed before it is used.
2. NPE by join

2021.11.12 23:06:59.506368 [ 843819 ] <Error> BaseDaemon: ########################################
2021.11.12 23:06:59.516163 [ 843819 ] <Error> BaseDaemon: (from thread 843816) Received signal Segmentation fault (11).
2021.11.12 23:06:59.516198 [ 843819 ] <Error> BaseDaemon: Address: NULL pointer.
2021.11.12 23:06:59.516217 [ 843819 ] <Error> BaseDaemon: Access: read.
2021.11.12 23:06:59.516234 [ 843819 ] <Error> BaseDaemon: Address not mapped to object. 
2021.11.12 23:06:59.623789 [ 843819 ] <Error> BaseDaemon: 0. /data1/xufei/grafana_test_51x/tiflash0/tiflash() [0x76cdb60]
2021.11.12 23:06:59.623876 [ 843819 ] <Error> BaseDaemon: 1. /data1/xufei/grafana_test_51x/tiflash0/tiflash(void DB::Join::joinBlockImpl<(DB::ASTTableJoin::Kind)0, (DB::ASTTableJoin::Strictness)2, DB::Join::MapsTemplate<DB::Join::WithUsedFlag<false, DB::Join::RowRefList> > >(DB::Block&, DB::Join::MapsTemplate<DB::Join::WithUsedFlag<false, DB::Join::RowRefList> > const&) const+0xed4) [0x76d0ae4]
2021.11.12 23:06:59.623908 [ 843819 ] <Error> BaseDaemon: 2. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::Join::joinBlock(DB::Block&) const+0x252) [0x766b2a2]
2021.11.12 23:06:59.623939 [ 843819 ] <Error> BaseDaemon: 3. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionAction::execute(DB::Block&) const+0x9f) [0x761357f]
2021.11.12 23:06:59.623964 [ 843819 ] <Error> BaseDaemon: 4. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionActions::execute(DB::Block&) const+0x6a) [0x7619aba]
2021.11.12 23:06:59.623988 [ 843819 ] <Error> BaseDaemon: 5. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionBlockInputStream::readImpl()+0x34) [0x74da8b4]
2021.11.12 23:06:59.624018 [ 843819 ] <Error> BaseDaemon: 6. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<false>, 15ul, 16ul>*&, bool)+0x3bd) [0x6902dfd]
2021.11.12 23:06:59.624042 [ 843819 ] <Error> BaseDaemon: 7. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read()+0x17) [0x6902fe7]                               
2021.11.12 23:06:59.624067 [ 843819 ] <Error> BaseDaemon: 8. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionBlockInputStream::readImpl()+0x1b) [0x74da89b]
2021.11.12 23:06:59.624101 [ 843819 ] <Error> BaseDaemon: 9. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<false>, 15ul, 16ul>*&, bool)+0x3bd) [0x6902dfd]
2021.11.12 23:06:59.624126 [ 843819 ] <Error> BaseDaemon: 10. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read()+0x17) [0x6902fe7]
2021.11.12 23:06:59.624150 [ 843819 ] <Error> BaseDaemon: 11. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionBlockInputStream::readImpl()+0x1b) [0x74da89b]
2021.11.12 23:06:59.624178 [ 843819 ] <Error> BaseDaemon: 12. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<false>, 15ul, 16ul>*&, bool)+0x3bd) [0x6902dfd]
2021.11.12 23:06:59.624217 [ 843819 ] <Error> BaseDaemon: 13. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read()+0x17) [0x6902fe7]
2021.11.12 23:06:59.624242 [ 843819 ] <Error> BaseDaemon: 14. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionBlockInputStream::readImpl()+0x1b) [0x74da89b]
2021.11.12 23:06:59.624270 [ 843819 ] <Error> BaseDaemon: 15. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<false>, 15ul, 16ul>*&, bool)+0x3bd) [0x6902dfd]
2021.11.12 23:06:59.624294 [ 843819 ] <Error> BaseDaemon: 16. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read()+0x17) [0x6902fe7]
2021.11.12 23:06:59.624317 [ 843819 ] <Error> BaseDaemon: 17. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::ExpressionBlockInputStream::readImpl()+0x1b) [0x74da89b]
2021.11.12 23:06:59.624344 [ 843819 ] <Error> BaseDaemon: 18. /data1/xufei/grafana_test_51x/tiflash0/tiflash(DB::IProfilingBlockInputStream::read(DB::PODArray<unsigned char, 4096ul, Allocator<false>, 15ul, 16ul>*&, bool)+0x3bd) [0x6902dfd]

The exception thrown during the hash table build stage is lost, so the hash table probe begins while the hash table is still being built.

Related changes

  • PR to update pingcap/docs/pingcap/docs-cn:
  • Need to cherry-pick to the release branch:

Check List

Tests

  • Manual test (add detailed scripts or steps below)

Side effects

Release note

Fix a random TiFlash crash when an MPP query is killed.

@ti-chi-bot
Member Author

ti-chi-bot commented Nov 16, 2021

[REVIEW NOTIFICATION]

This pull request has been approved by:

  • SchrodingerZhu

To complete the pull request process, please ask the reviewers in the list to review by filling /cc @reviewer in the comment.
After your PR has acquired the required number of LGTMs, you can assign this pull request to the committer in the list by filling /assign @committer in the comment to help you merge this pull request.

The full list of commands accepted by this bot can be found here.

Reviewer can indicate their review by submitting an approval review.
Reviewer can cancel approval by submitting a request changes review.

@ti-chi-bot ti-chi-bot added do-not-merge/cherry-pick-not-approved release-note Denotes a PR that will be considered when it comes time to generate release notes. labels Nov 16, 2021
@ti-chi-bot ti-chi-bot added size/S Denotes a PR that changes 10-29 lines, ignoring generated files. type/cherry-pick-for-release-5.0 labels Nov 16, 2021
@ti-chi-bot ti-chi-bot added size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files. and removed size/S Denotes a PR that changes 10-29 lines, ignoring generated files. labels Nov 16, 2021
@windtalker windtalker force-pushed the cherry-pick-3434-to-release-5.0 branch from 1f6f77b to 3ebbe29 on November 16, 2021 08:52
@ti-chi-bot ti-chi-bot added size/S Denotes a PR that changes 10-29 lines, ignoring generated files. and removed size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files. labels Nov 16, 2021
@ti-chi-bot ti-chi-bot added the status/LGT1 Indicates that a PR has LGTM 1. label Nov 17, 2021
@JaySon-Huang JaySon-Huang added this to the v5.0.6 milestone Dec 1, 2021
Contributor

@windtalker windtalker left a comment


LGTM

@zhouqiang-cl zhouqiang-cl added the cherry-pick-approved Cherry pick PR approved by release team. label Dec 18, 2021
@zhouqiang-cl

/merge

@ti-chi-bot
Member Author

@zhouqiang-cl: It seems you want to merge this PR, I will help you trigger all the tests:

/run-all-tests

You only need to trigger /merge once, and if the CI test fails, you just re-trigger the test that failed and the bot will merge the PR for you after the CI passes.

If you have any questions about the PR merge process, please refer to pr process.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the ti-community-infra/tichi repository.

@ti-chi-bot
Member Author

This pull request has been accepted and is ready to merge.

Commit hash: 3ebbe29

@ti-chi-bot ti-chi-bot added the status/can-merge Indicates a PR has been approved by a committer. label Dec 18, 2021
@ti-chi-bot
Member Author

@ti-chi-bot: Your PR was out of date, I have automatically updated it for you.

At the same time I will also trigger all tests for you:

/run-all-tests

If the CI test fails, you just re-trigger the test that failed and the bot will merge the PR for you after the CI passes.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the ti-community-infra/tichi repository.

@ti-chi-bot ti-chi-bot merged commit 21b08f0 into pingcap:release-5.0 Dec 18, 2021