transforms: # data transforms and augmentation
  - type: RandomPaddingCrop # random crop applied to the image and label map
    crop_size: [1024, 512]
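A random padding crop pads the image when it is smaller than the target size and then takes a random crop window. The following NumPy sketch illustrates the idea; the function name, padding value, and bottom/right padding choice are assumptions for illustration, not PaddleSeg's actual implementation:

```python
import numpy as np

def random_padding_crop(img, crop_size, pad_value=0, rng=None):
    """Pad an (H, W, C) image if needed, then take a random crop.

    crop_size follows the YAML convention above: [width, height].
    """
    rng = rng or np.random.default_rng()
    crop_w, crop_h = crop_size
    h, w, _ = img.shape
    # Pad bottom/right if the image is smaller than the crop target.
    pad_h, pad_w = max(crop_h - h, 0), max(crop_w - w, 0)
    if pad_h or pad_w:
        img = np.pad(img, ((0, pad_h), (0, pad_w), (0, 0)),
                     constant_values=pad_value)
        h, w = img.shape[:2]
    # Pick a random top-left corner for the crop window.
    top = int(rng.integers(0, h - crop_h + 1))
    left = int(rng.integers(0, w - crop_w + 1))
    return img[top:top + crop_h, left:left + crop_w]

img = np.zeros((1280, 2560, 3), dtype=np.uint8)
crop = random_padding_crop(img, [1024, 512])
print(crop.shape)  # (512, 1024, 3)
```

Note that the output array is height-first (`(512, 1024, 3)`), even though the config lists `crop_size` width-first.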
My images are 2560x1280, and I want the trained model to take input of shape 1x3x320x640. Is crop_size: [640, 320] the right setting? And is input_shape: [1, 3, 320, 640] the right setting?
train:
  base_default:
    # training schedule type: 0: Epochs; 1: Iters
    epochs_iters_type: 0
    # number of epochs/iterations
    epochs_iters: 3600
    batch_size: 32
    learning_rate: 0.01
    device: GPU
    gpu_num: 1 # set according to the number of devices in the environment
  # recorded configuration
  base_current:
    epochs_iters_type: 0
    epochs_iters: 3600
    batch_size: 32
    learning_rate: 0.01
    device: GPU
    gpu_num: 1 # set according to the number of devices in the environment
  # advanced configuration
  advance_default:
    # custom weight paths, empty by default
    resume_weight_path:
    pretrain_weight_path:
    # may be O1, O2, or null (disabled)
    amp:
    # dynamic-to-static training: true/false
    dy2st: false
    # config file details
    uapi_config_detail: '{}'
  # recorded configuration
  advance_current:
    resume_weight_path:
    pretrain_weight_path: output/iter_1200/model.pdparams
    amp:
    dy2st: false
    uapi_config_detail: '{}'
  # model output directory, default value
  output_dir: ./output
  output_dir_user: ./output
  crop_size: [640, 320]
evaluate:
  # model directory; the full path of every .pdparams file under it is shown in "Select model"
  model_dir: ./output
  model_path: best_model/best.pdparams
  # batch test output path:
  output_dir: ./output/eval
  # JSON file for test results
  output_json: ./tmp/evaluate.json
  # may be O1, O2, or None (disabled)
  device: GPU
  gpu_num: 1 # set according to the number of devices in the environment
  amp:
  input_shape: [1, 3, 320, 640]
crop_size represents [width, height].
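So crop_size: [640, 320] ([width, height]) yields a crop 320 pixels high and 640 wide, which after the usual HWC-to-CHW transpose and batching matches input_shape: [1, 3, 320, 640] ([N, C, H, W]). A quick NumPy check of this axis ordering:

```python
import numpy as np

crop_size = [640, 320]   # [width, height], as in the config
crop_w, crop_h = crop_size

# A cropped HWC image: 320 rows (height) x 640 columns (width) x 3 channels.
img = np.zeros((crop_h, crop_w, 3), dtype=np.float32)

# Transpose HWC -> CHW and add a batch dimension to get NCHW.
tensor = img.transpose(2, 0, 1)[np.newaxis, ...]
print(tensor.shape)  # (1, 3, 320, 640), matching input_shape [1, 3, 320, 640]
```

Both settings in the question are therefore consistent with each other.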
The answer above fully addresses the question. If you have new questions, feel free to open a new issue or keep replying under this one. We have also launched an ISSUE campaign for the PaddlePaddle suites; interested developers are welcome to join: PaddlePaddle/PaddleOCR#10223
Asthestarsfalll