Optimization of importer execution logic and output information. #285
Labels: type/feature req (feature request)
Comments
@GangLiCN Thanks for your advice.
Re points 2 & 3: I think maintainers need to plan the size of the disk and expand it when necessary. In addition, the importer cannot know the disk usage, and there is no suitable algorithm to estimate how much disk space is needed.
@MuYiYong What do you think?
QingZ11 changed the title from "Importer执行逻辑和输出信息优化" to "Optimization of importer execution logic and output information." on Sep 18, 2023
Is your feature request related to a problem? Please describe.
Describe the solution you'd like
Currently the console output only shows how many records have been imported so far and the network latency. What users probably want to see is a throughput figure such as records imported per second, similar to tpmC-style performance metrics.
Suggested improvements:
1. Report the import rate (records per second), or add progress information, for example: how many CSV files there are in total, which CSV file is currently being processed, how many records the current file contains, how many records have been imported so far, and the estimated time remaining.
2. Before importing a test dataset, check the available disk capacity. If the remaining disk space is less than the estimated requirement, fail with an error saying the import cannot proceed, and output a specific error message. The capacity estimate should take the underlying storage into account: for example, if the storage layer uses RocksDB, write amplification can cause more disk space to be consumed, so the estimate should lean toward the upper bound.
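A minimal sketch of such a pre-flight check, assuming a hypothetical upper-bound estimate of raw CSV size times a write-amplification factor (the factor and function names are illustrative, not part of the importer). The free-space query uses the Linux `syscall.Statfs` call; other platforms would need their own variant:

```go
package main

import (
	"fmt"
	"syscall"
)

// estimatedBytes is a hypothetical upper-bound estimate: raw CSV size
// multiplied by an assumed write-amplification factor for
// RocksDB-backed storage.
func estimatedBytes(csvBytes int64, writeAmp float64) int64 {
	return int64(float64(csvBytes) * writeAmp)
}

// freeBytes returns the free space on the filesystem holding path
// (Linux-only syscall).
func freeBytes(path string) (int64, error) {
	var st syscall.Statfs_t
	if err := syscall.Statfs(path, &st); err != nil {
		return 0, err
	}
	return int64(st.Bavail) * int64(st.Bsize), nil
}

func main() {
	need := estimatedBytes(10<<30, 3.0) // 10 GiB of CSV, assumed 3x amplification
	free, err := freeBytes("/")
	if err != nil {
		panic(err)
	}
	if free < need {
		fmt.Printf("error: import needs ~%d bytes but only %d are free; refusing to import\n", need, free)
		return
	}
	fmt.Printf("disk check passed: %d bytes free, ~%d needed\n", free, need)
}
```

The real amplification factor would depend on the storage engine's compaction behavior, which is why the issue suggests estimating toward the upper bound.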
3. Resume from a checkpoint: for example, with 20 CSV files, if 10 have already finished and the import is interrupted on the 11th file because the disk runs out of space, could the next run of the importer start from the 11th file instead of re-importing the files that have already completed?
Describe alternatives you've considered
Additional context