get error Information schema is changed. #10282
Thanks @z2665.
@winkyao Thanks for your answer.
Do you mean something like this?
I have tested the statement above: executing `truncate table` concurrently from 24 coroutines gives roughly the same total execution time. Importing the data with mysqldump takes about 1m40s. How do I make it faster?
No. Could you run the DDL in one goroutine? @z2665
I tried running the DDL in one goroutine. There are two results:
@z2665 Could you use master tidb to test your workload? #10170 fixed one issue that made creating tables and databases slow. But one thing you should know is that DDL jobs in TiDB are currently handled serially. If you run them in parallel, you will get many transaction conflicts, caused by each TiDB session trying to enqueue its job into the DDL job queue. So if you want to finish all your DDL jobs quickly, you'd better run them one by one, including `truncate table`.
We are working on running DDL concurrently, but for now the optimized way is to run DDL statements one by one. @z2665
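As a minimal illustration of this advice (a sketch under assumptions, not project code): run every `truncate table` from a single goroutine over a MySQL-protocol connection, here using Go's `database/sql` with the `go-sql-driver/mysql` driver. The DSN and table names are placeholders.

```go
package main

import (
	"database/sql"
	"log"

	// TiDB speaks the MySQL wire protocol, so the standard MySQL driver works.
	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// Placeholder DSN: adjust host, port (TiDB listens on 4000 by default) and credentials.
	db, err := sql.Open("mysql", "root:@tcp(127.0.0.1:4000)/test")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Hypothetical table list; it could instead be built from
	// information_schema.tables for the databases being cleaned.
	tables := []string{"db01.t1", "db01.t2", "db02.t1"}

	// Issue each DDL statement from this single goroutine, one after another,
	// so only one job at a time is submitted to TiDB's DDL job queue.
	for _, t := range tables {
		if _, err := db.Exec("TRUNCATE TABLE " + t); err != nil {
			log.Printf("truncate %s failed: %v", t, err)
		}
	}
}
```

Keeping the loop in one goroutine means at most one DDL job is being enqueued at any moment, which avoids the transaction conflicts described above.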
@winkyao Thanks for your help.
I have a TiDB v2.1.8 cluster on three eight-core virtual machines.
I use TiDB as a backup database, so I created about 21 databases. Before I import data, I clean the tables with `truncate table`.
I started 30 coroutines to execute the statement.
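For context, an illustrative sketch (not the original code) of the fan-out described above: 30 goroutines all issuing `truncate table` concurrently, with a placeholder DSN and table names. Since TiDB executes DDL jobs serially, these workers mostly contend while enqueueing jobs rather than speeding anything up.

```go
package main

import (
	"database/sql"
	"log"
	"sync"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// Placeholder DSN; adjust for your cluster.
	db, err := sql.Open("mysql", "root:@tcp(127.0.0.1:4000)/test")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	jobs := make(chan string)
	var wg sync.WaitGroup

	// 30 workers, each pulling table names and running TRUNCATE concurrently.
	for i := 0; i < 30; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for t := range jobs {
				if _, err := db.Exec("TRUNCATE TABLE " + t); err != nil {
					log.Printf("truncate %s: %v", t, err)
				}
			}
		}()
	}

	// Placeholder table names standing in for the tables of the ~21 databases.
	for _, t := range []string{"db01.t1", "db01.t2", "db02.t1"} {
		jobs <- t
	}
	close(jobs)
	wg.Wait()
}
```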
Sometimes I get errors like `Information schema is changed`, so I have some questions: