
generate protostr files with current config files #363

Merged 11 commits on Nov 7, 2016
46 changes: 23 additions & 23 deletions python/paddle/trainer_config_helpers/tests/configs/check.md5
@@ -1,23 +1,23 @@
86c0815275a9d5eb902e23c6a592f58a img_layers.protostr
a5d9259ff1fd7ca23d0ef090052cb1f2 last_first_seq.protostr
9c038249ec8ff719753a746cdb04c026 layer_activations.protostr
5913f87b39cee3b2701fa158270aca26 projections.protostr
7334ba0a4544f0623231330fc51d390d shared_fc.protostr
8b8b6bb128a7dfcc937be86145f53e2f shared_lstm.protostr
6b39e34beea8dfb782bee9bd3dea9eb5 simple_rnn_layers.protostr
4e78f0ded79f6fefb58ca0c104b57c79 test_bi_grumemory.protostr
0fc1409600f1a3301da994ab9d28b0bf test_cost_layers.protostr
6cd5f28a3416344f20120698470e0a4c test_cost_layers_with_weight.protostr
144bc6d3a509de74115fa623741797ed test_expand_layer.protostr
2378518bdb71e8c6e888b1842923df58 test_fc.protostr
8bb44e1e5072d0c261572307e7672bda test_grumemory_layer.protostr
1f3510672dce7a9ed25317fc58579ac7 test_hsigmoid.protostr
d350bd91a0dc13e854b1364c3d9339c6 test_lstmemory_layer.protostr
5433ed33d4e7414eaf658f2a55946186 test_maxout.protostr
251a948ba41c1071afcd3d9cf9c233f7 test_ntm_layers.protostr
e6ff04e70aea27c7b06d808cc49c9497 test_print_layer.protostr
2a75dd33b640c49a8821c2da6e574577 test_rnn_group.protostr
67d6fde3afb54f389d0ce4ff14726fe1 test_sequence_pooling.protostr
f586a548ef4350ba1ed47a81859a64cb unused_layers.protostr
8122477f4f65244580cec09edc590041 util_layers.protostr
dcd76bebb5f9c755f481c26192917818 math_ops.protostr
86c0815275a9d5eb902e23c6a592f58a ./protostr/img_layers.protostr.unitest
Contributor:

My understanding is that we should not put a string of digits like 86c0815275a9d5eb902e23c6a592f58a here. Instead, the md5 values of ./protostr/img_layers.protostr and ./protostr/img_layers.protostr.unitest should be compared directly. Since this string of digits was itself obtained earlier by running md5sum on ./protostr/img_layers.protostr, the trainer_config_helpers/tests/configs/run_tests.sh script can be modified to compare the md5 values of the two protostr files directly.
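The suggestion above might look roughly like the following sketch. This is an illustration, not the actual run_tests.sh: `check_protostr` is a hypothetical name, and the md5sum output format (hash, then filename) is assumed from GNU coreutils.

```shell
#!/bin/bash
# Hedged sketch (not the actual script): compare the MD5 of one
# checked-in protostr file against its freshly generated .unitest twin,
# instead of keeping precomputed hashes in check.md5.
check_protostr() {
  local base=$1
  local new=${base}.unitest
  if [ ! -f "$new" ]; then
    echo "WARNING: $new not found"
    return 1
  fi
  local old_md5 new_md5
  # md5sum prints "<hash>  <filename>"; keep only the hash field.
  old_md5=$(md5sum "$base" | awk '{print $1}')
  new_md5=$(md5sum "$new" | awk '{print $1}')
  if [ "$old_md5" = "$new_md5" ]; then
    echo "OK: $(basename "$base")"
  else
    echo "FAILED: $(basename "$base")"
    return 1
  fi
}

# Example: check_protostr ./protostr/img_layers.protostr
```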

Contributor Author:

@luotao1 Understood. I will change it later to directly compare the MD5 values of the newly generated files against the protostr files in the repository.

Contributor Author:

@luotao1 The image shows the code after the modification, but it does not feel as concise as the original.
Please check it first; if there is no problem I will submit a pull request.
Thanks.

Contributor:

So the script means: check.md5 can list just img_layers.protostr (without the ./protostr directory), and the script automatically looks for img_layers.protostr.unitest? I think that is good: check.md5 becomes more concise and the script is easy to follow. Line 14 of the script is a conditional; if new_protostr does not exist, should it report an error or handle it some other way?

Contributor Author:

If the modification shown in the image is applied, the check.md5 file is no longer needed. The protostr directory was created to hold the xxx.protostr and xxx.protostr.unitest files (there are currently 23 xxx.protostr files), so that the generated files are kept separate from the files in configs and are easier to manage.
Line 5 specifies the directory of the reference files.
Line 6 finds the xxx.protostr files already generated in the repository; since running run_tests.sh multiple times may leave behind xxx.protostr.unitest files, those are filtered out so that only the xxx.protostr files serve as references.
Lines 10 to 23 iterate over the reference files and the generated xxx.protostr.unitest files and compare their MD5 values: OK if they match, FAILED otherwise.
The conditional on line 14 falls through to the else branch when new_protostr does not exist and prints a warning that the newly generated file may be missing; that part is not shown in the image, but it has already been added to my code.
I will open a pull request shortly.
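The workflow described above (select the checked-in xxx.protostr files as references, loop over them, compare MD5s pairwise, and warn when a generated file is missing) can be sketched roughly as follows. The function name and layout are illustrative, not the actual script.

```shell
#!/bin/bash
# Illustrative sketch of the described check loop (not the actual script):
# iterate over checked-in xxx.protostr reference files in a directory and
# compare each against its generated xxx.protostr.unitest counterpart.
run_protostr_checks() {
  local dir=$1 old new status=0
  # The *.protostr glob naturally excludes *.protostr.unitest leftovers.
  for old in "$dir"/*.protostr; do
    new=${old}.unitest
    if [ -f "$new" ]; then
      if [ "$(md5sum "$old" | awk '{print $1}')" = "$(md5sum "$new" | awk '{print $1}')" ]; then
        echo "OK: $(basename "$old")"
      else
        echo "FAILED: $(basename "$old")"
        status=1
      fi
    else
      # The else branch mentioned above: the generated file may be missing.
      echo "WARNING: $new not found"
      status=1
    fi
  done
  return $status
}
```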

a5d9259ff1fd7ca23d0ef090052cb1f2 ./protostr/last_first_seq.protostr.unitest
9c038249ec8ff719753a746cdb04c026 ./protostr/layer_activations.protostr.unitest
5913f87b39cee3b2701fa158270aca26 ./protostr/projections.protostr.unitest
7334ba0a4544f0623231330fc51d390d ./protostr/shared_fc.protostr.unitest
8b8b6bb128a7dfcc937be86145f53e2f ./protostr/shared_lstm.protostr.unitest
Collaborator:

Also, since diff is used now, this md5 file is no longer useful. Please delete it.
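A diff-based check, as suggested, needs no stored hashes at all: diff exits non-zero on any difference. A minimal sketch (function name illustrative):

```shell
#!/bin/bash
# Sketch of the diff-based alternative: no md5 bookkeeping file needed,
# since diff itself reports whether the two protostr files differ.
compare_protostr() {
  if diff -q "$1" "${1}.unitest" > /dev/null; then
    echo "OK: $(basename "$1")"
  else
    echo "FAILED: $(basename "$1")"
    return 1
  fi
}
```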

6b39e34beea8dfb782bee9bd3dea9eb5 ./protostr/simple_rnn_layers.protostr.unitest
4e78f0ded79f6fefb58ca0c104b57c79 ./protostr/test_bi_grumemory.protostr.unitest
0fc1409600f1a3301da994ab9d28b0bf ./protostr/test_cost_layers.protostr.unitest
6cd5f28a3416344f20120698470e0a4c ./protostr/test_cost_layers_with_weight.protostr.unitest
144bc6d3a509de74115fa623741797ed ./protostr/test_expand_layer.protostr.unitest
2378518bdb71e8c6e888b1842923df58 ./protostr/test_fc.protostr.unitest
8bb44e1e5072d0c261572307e7672bda ./protostr/test_grumemory_layer.protostr.unitest
1f3510672dce7a9ed25317fc58579ac7 ./protostr/test_hsigmoid.protostr.unitest
d350bd91a0dc13e854b1364c3d9339c6 ./protostr/test_lstmemory_layer.protostr.unitest
5433ed33d4e7414eaf658f2a55946186 ./protostr/test_maxout.protostr.unitest
251a948ba41c1071afcd3d9cf9c233f7 ./protostr/test_ntm_layers.protostr.unitest
e6ff04e70aea27c7b06d808cc49c9497 ./protostr/test_print_layer.protostr.unitest
2a75dd33b640c49a8821c2da6e574577 ./protostr/test_rnn_group.protostr.unitest
67d6fde3afb54f389d0ce4ff14726fe1 ./protostr/test_sequence_pooling.protostr.unitest
f586a548ef4350ba1ed47a81859a64cb ./protostr/unused_layers.protostr.unitest
8122477f4f65244580cec09edc590041 ./protostr/util_layers.protostr.unitest
dcd76bebb5f9c755f481c26192917818 ./protostr/math_ops.protostr.unitest
@@ -4,6 +4,8 @@ set -e
cd `dirname $0`
export PYTHONPATH=$PWD/../../../../

protostr=$PWD/protostr

configs=(test_fc layer_activations projections test_print_layer
test_sequence_pooling test_lstmemory_layer test_grumemory_layer
last_first_seq test_expand_layer test_ntm_layers test_hsigmoid
@@ -15,5 +17,5 @@ test_maxout test_bi_grumemory math_ops)
for conf in ${configs[*]}
do
echo "Generating " $conf
python -m paddle.utils.dump_config $conf.py > $conf.protostr
python -m paddle.utils.dump_config $conf.py > $protostr/$conf.protostr.unitest
done
@@ -0,0 +1,176 @@
type: "nn"
layers {
name: "image"
type: "data"
size: 65536
active_type: ""
}
layers {
name: "__conv_0__"
type: "exconv"
size: 3297856
active_type: ""
inputs {
input_layer_name: "image"
input_parameter_name: "___conv_0__.w0"
conv_conf {
filter_size: 32
channels: 1
stride: 1
padding: 1
groups: 1
filter_channels: 1
output_x: 227
img_size: 256
caffe_mode: true
filter_size_y: 32
padding_y: 1
stride_y: 1
}
}
bias_parameter_name: "___conv_0__.wbias"
num_filters: 64
shared_biases: true
}
layers {
name: "__batch_norm_0__"
type: "batch_norm"
size: 3297856
active_type: "relu"
inputs {
input_layer_name: "__conv_0__"
input_parameter_name: "___batch_norm_0__.w0"
image_conf {
channels: 64
img_size: 227
}
}
inputs {
input_layer_name: "__conv_0__"
input_parameter_name: "___batch_norm_0__.w1"
}
inputs {
input_layer_name: "__conv_0__"
input_parameter_name: "___batch_norm_0__.w2"
}
bias_parameter_name: "___batch_norm_0__.wbias"
moving_average_fraction: 0.9
}
layers {
name: "__crmnorm_0__"
type: "norm"
size: 3297856
active_type: ""
inputs {
input_layer_name: "__batch_norm_0__"
norm_conf {
norm_type: "cmrnorm-projection"
channels: 64
size: 32
scale: 0.0004
pow: 0.75
output_x: 227
img_size: 227
blocked: false
}
}
}
layers {
name: "__pool_0__"
type: "pool"
size: 2458624
active_type: ""
inputs {
input_layer_name: "__conv_0__"
pool_conf {
pool_type: "max-projection"
channels: 64
size_x: 32
stride: 1
output_x: 196
img_size: 227
padding: 0
size_y: 32
stride_y: 1
output_y: 196
img_size_y: 227
padding_y: 0
}
}
}
parameters {
name: "___conv_0__.w0"
size: 65536
initial_mean: 0.0
initial_std: 0.0441941738242
initial_strategy: 0
initial_smart: false
}
parameters {
name: "___conv_0__.wbias"
size: 64
initial_mean: 0.0
initial_std: 0.0
dims: 64
dims: 1
initial_strategy: 0
initial_smart: false
}
parameters {
name: "___batch_norm_0__.w0"
size: 64
initial_mean: 1.0
initial_std: 0.0
initial_strategy: 0
initial_smart: false
}
parameters {
name: "___batch_norm_0__.w1"
size: 64
initial_mean: 0.0
initial_std: 0.0
dims: 1
dims: 64
initial_strategy: 0
initial_smart: false
is_static: true
is_shared: true
}
parameters {
name: "___batch_norm_0__.w2"
size: 64
initial_mean: 0.0
initial_std: 0.0
dims: 1
dims: 64
initial_strategy: 0
initial_smart: false
is_static: true
is_shared: true
}
parameters {
name: "___batch_norm_0__.wbias"
size: 64
initial_mean: 0.0
initial_std: 0.0
dims: 1
dims: 64
initial_strategy: 0
initial_smart: false
}
input_layer_names: "image"
output_layer_names: "__pool_0__"
output_layer_names: "__crmnorm_0__"
sub_models {
name: "root"
layer_names: "image"
layer_names: "__conv_0__"
layer_names: "__batch_norm_0__"
layer_names: "__crmnorm_0__"
layer_names: "__pool_0__"
input_layer_names: "image"
output_layer_names: "__pool_0__"
output_layer_names: "__crmnorm_0__"
is_recurrent_layer_group: false
}

@@ -0,0 +1,69 @@
type: "nn"
layers {
name: "data"
type: "data"
size: 30
active_type: ""
}
layers {
name: "__first_seq_0__"
type: "seqlastins"
size: 30
active_type: "linear"
inputs {
input_layer_name: "data"
}
select_first: true
trans_type: "seq"
}
layers {
name: "__first_seq_1__"
type: "seqlastins"
size: 30
active_type: "linear"
inputs {
input_layer_name: "data"
}
select_first: true
trans_type: "non-seq"
}
layers {
name: "__last_seq_0__"
type: "seqlastins"
size: 30
active_type: "linear"
inputs {
input_layer_name: "data"
}
trans_type: "seq"
}
layers {
name: "__last_seq_1__"
type: "seqlastins"
size: 30
active_type: "linear"
inputs {
input_layer_name: "data"
}
trans_type: "non-seq"
}
input_layer_names: "data"
output_layer_names: "__first_seq_0__"
output_layer_names: "__first_seq_1__"
output_layer_names: "__last_seq_0__"
output_layer_names: "__last_seq_1__"
sub_models {
name: "root"
layer_names: "data"
layer_names: "__first_seq_0__"
layer_names: "__first_seq_1__"
layer_names: "__last_seq_0__"
layer_names: "__last_seq_1__"
input_layer_names: "data"
output_layer_names: "__first_seq_0__"
output_layer_names: "__first_seq_1__"
output_layer_names: "__last_seq_0__"
output_layer_names: "__last_seq_1__"
is_recurrent_layer_group: false
}
