
lrn implementation is different from definition from onnx website. #127

Closed · scxiao opened this issue Nov 19, 2019 · 5 comments

scxiao commented Nov 19, 2019

In the file test/lrn_test.cpp, the code is as follows:

    auto radius = (lrn.GetN() - 1) / 2;
    par_ford(n_batch, height, width)([&](int b, int h, int w) {
        double scale = 0;
        ford(channels)([&](int c) {
            auto start = (c - radius) < 0 ? 0 : (c - radius);
            auto end   = (c + radius) > channels ? channels : (c + radius);

            for(auto k = start; k < end; k++)
            {
                scale += std::pow(input(b, k, h, w), 2);
            }
But the formula for this operator, per the ONNX specification, sums over the channel window

    max(0, c - floor((size - 1) / 2)) <= i <= min(C - 1, c + ceil((size - 1) / 2))

which means end should instead be computed as:

    auto radius_upper = lrn.GetN() / 2 + 1; // ceil((size - 1) / 2) + 1, since end is exclusive
    auto end          = (c + radius_upper) > channels ? channels : (c + radius_upper);
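
Putting it together, a minimal sketch of the corrected window computation, assuming the same variables (lrn.GetN(), channels, input, scale, b, c, h, w) as the snippet quoted above and keeping its exclusive k < end convention:

    auto radius_lower = (lrn.GetN() - 1) / 2; // floor((size - 1) / 2)
    auto radius_upper = lrn.GetN() / 2 + 1;   // ceil((size - 1) / 2) + 1 (exclusive end)
    auto start = (c - radius_lower) < 0 ? 0 : (c - radius_lower);
    auto end   = (c + radius_upper) > channels ? channels : (c + radius_upper);

    for(auto k = start; k < end; k++)
    {
        scale += std::pow(input(b, k, h, w), 2);
    }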

scxiao commented Dec 3, 2019

The following test case can trigger this bug:

data = 1.24726, 0.282953, 0.692061, 1.58436, 1.32044, 0.691029, 0.671802, 0.107024, 0.072982, 0.290336, 0.101854, 1.02826, 0.538117, 0.086037, 0.313857, 0.235939, 1.0564, 0.145062, 0.221358, 0.603927, 1.80503, 0.462163, 0.611235, 0.524762, 1.60482, 1.02829, 0.032356, 0.132356, 1.08368, 1.0389, 0.109562, 0.267397, 0.62775, 1.40053, 0.246009, 0.110555, 0.869892, 0.850173, 0.273867, 0.213759, 0.741422, 1.00399, 1.20645, 1.3793, 0.360373, 0.309728, 0.88583, 0.549744, 1.14111, 0.150417, 0.633155, 0.273577, 0.36119, 0.83478, 0.0199271, 0.302868, 0.047034, 0.213876, 0.448483, 0.256486, 0.475469, 0.25423, 0.574961, 1.22058, 0.125457, 0.284095, 1.15261, 0.327219, 0.641497, 0.0367307, 0.515534, 0.0912023, 0.80561, 0.873093, 0.284497, 0.484195, 0.615719, 0.409293, 0.220292, 0.039714, 0.823812, 0.63697, 0.329268, 1.08617, 1.05227, 1.34043, 0.833491, 0.127223, 0.757038, 0.745595, 0.285085, 0.864326, 0.147272, 0.690563, 0.251985, 0.499613, 0.00742448, 1.2627, 0.0897355, 0.284246, 1.3315, 0.952969, 0.898339, 0.685455, 0.829504, 1.37423, 0.292468, 0.528524, 1.35959, 1.04681, 1.32041, 0.640647, 0.608971, 1.35051, 0.189503, 0.567415, 0.669782, 0.109609, 0.43421, 0.652088, 0.26617, 0.777378, 0.210875, 0.937869, 0.491128, 0.105808, 0.307684, 1.30754, 0.475375, 0.288118, 0.544403, 0.381297, 0.476812, 0.0225071, 0.449607, 0.478305, 0.407684, 0.147287, 0.280018, 0.772877, 1.05443, 0.310691, 0.117851, 0.449021, 1.68498, 0.667828, 0.645453, 0.789824, 0.930454, 0.326383, 0.0482537, 1.21144, 0.526582, 0.584367, 0.0696162, 0.469141, 0.796606, 0.763601, 0.811298, 0.30958, 0.352143, 1.36427, 0.671331, 0.0619073, 0.86644, 0.597009, 0.707244, 1.09227, 0.840019, 0.22409, 0.651126, 0.225369, 0.605844, 0.460292, 0.731283, 0.481955, 0.568062, 0.487567, 0.322103, 0.0123595, 0.250298, 0.972186, 0.455091, 1.57202, 0.442101, 1.13276, 0.780851, 0.0368853, 0.522943, 1.09099, 0.914149, 0.18883, 0.027776, 0.825858, 0.37001, 0.121298, 0.545731, 0.58229, 1.5295, 0.944986, 0.261048, 0.169265, 0.777554, 0.463334, 0.452638, 1.14327, 0.0172006, 0.521856, 0.197919, 0.0694008, 0.643551, 0.224294, 0.555972, 0.329803, 0.667783, 0.289938, 0.0120352, 0.2681, 1.5974, 0.0298801, 0.675913, 0.244638, 0.327801, 0.340427, 1.08939, 0.0447329, 0.110667, 0.164173, 0.422361, 0.168235, 1.00689, 0.348828, 0.383856, 0.294191, 0.817526, 0.552377, 1.05672, 1.46358, 0.301408, 0.478635, 0.450732, 0.280913, 0.0939605, 0.210552, 0.218505, 1.18504, 0.814802, 0.763388, 0.575127, 1.03684, 0.368446, 0.40714, 0.100376, 0.225783, 0.488953, 0.491258, 0.513061, 0.978147, 1.11922, 0.4316, 0.840625, 0.358365, 0.421645, 0.0371704, 1.36903, 0.133486, 0.370444, 0.0625183, 0.219825, 0.0688723, 0.282162, 1.96027, 1.38294, 0.275836, 0.461318, 0.276436, 0.349109, 0.0820978, 1.4358, 1.45969, 0.0781631, 0.721331, 0.489319, 1.08629, 0.202474, 0.430505, 0.739075, 0.856365, 0.487767, 0.920459, 0.444122, 0.340135, 1.62897, 0.749529, 0.0961309, 0.803877, 0.0690924, 0.412181, 0.282448, 0.261661, 0.923824, 1.17242, 0.0835475, 0.480918, 0.47116, 0.325774, 0.943397, 0.952227, 0.490537, 0.112822, 0.0945401, 0.762012, 0.796747, 0.516654, 0.27213, 0.0667149, 0.0298195, 0.20284, 0.0435746, 0.075876, 0.508821, 0.574865, 0.194111, 0.629959, 0.818352, 0.220815, 0.111478, 1.59556, 0.498286, 0.666967, 0.528333, 0.840619, 0.546748, 0.837101, 1.88008, 0.428726, 1.24151, 0.318846, 0.48366, 1.17342, 0.755535, 0.320591, 0.486369, 0.85846, 0.311772, 0.198239, 0.25787, 0.110805, 0.409068, 0.247237, 0.540325, 1.01664, 0.964758, 0.487506, 0.461231, 0.368531, 1.30314, 0.337974, 0.339159, 0.438605, 0.493877, 
0.00266628, 0.658905, 0.240389, 0.0110884, 0.113791, 0.134812, 0.279198, 0.189311, 0.797605, 0.198302, 0.702231, 0.595116, 0.176393, 0.0349982, 0.349192, 0.454879, 1.11054, 0.146303, 0.62234, 1.20068, 0.273845, 1.59479, 0.723009, 0.0273156, 1.17138, 0.696843, 1.04071, 1.16535, 0.116121, 0.40113, 0.157455, 0.249913, 1.14293, 0.20635, 0.538443, 0.606633, 0.806804, 1.03699, 0.602838, 0.423308, 0.789025, 0.542017, 0.251935, 1.25046, 0.25135, 0.575925, 0.0416626, 0.13085, 0.571079, 1.02277, 0.565875, 0.218569, 0.165076, 1.22516, 0.484008, 0.262206, 0.100453, 1.07471, 1.21586, 0.657243, 0.411686, 1.48094, 0.0874806, 0.0919984, 0.0664333, 0.666803, 1.93698, 0.402561, 0.190841, 0.330085, 1.00186, 0.614393, 0.195778, 0.686656, 0.2226, 0.580927, 0.00374245, 0.566078, 0.0553353, 0.279439, 0.819801, 0.0607616, 0.137368, 0.619268, 0.0813866, 0.323437, 0.682022, 0.553392, 0.078053, 0.745709, 0.579971, 0.327436, 0.197345, 0.23964, 1.42898, 0.331519, 1.55649, 0.140926, 0.0357811, 0.365896, 0.692092, 0.310551, 0.128224, 0.355543, 1.70566, 0.679152, 0.560801, 1.61811, 0.177802, 1.42572, 0.381446, 0.194924, 0.501835, 1.22951, 0.703118, 0.932733, 0.623954, 0.798019, 0.350724, 0.545424, 0.727908, 0.642532, 0.300037, 0.60994, 1.87758, 1.07003, 0.39112, 0.0323176, 0.155922, 0.728191, 0.247443, 0.777995, 0.917734, 1.90629, 0.0522691, 0.465662, 0.363616, 0.719846, 0.0550488, 0.270628, 0.0242126, 0.775193, 0.1656, 0.245681, 0.410982, 1.15439, 1.10851, 0.833771, 0.920198, 0.633033, 0.972215, 0.941987, 1.39194, 0.466727, 0.124323, 0.352617, 0.741014, 0.201008, 1.23219, 0.157405, 0.645623, 1.18862, 0.628575, 0.171202, 0.62841, 0.662367, 0.998599, 1.67539, 0.610964, 1.58345, 0.283863, 0.866057, 0.0458577, 0.904818, 0.413927, 0.185005, 0.128863, 0.143453, 0.0776976, 0.150953, 0.854565, 0.171129, 1.07353, 0.27197, 0.313837, 0.762333, 1.80933, 0.835322, 0.446817, 0.115913, 0.0681077, 0.66638, 0.1892, 0.479417, 0.917555, 1.67155, 0.0143778, 0.953088, 0.538491, 1.42206, 0.0315337, 0.137934, 1.25968, 0.515428, 0.138986, 0.250844, 0.436201, 0.00610049, 0.372645, 0.320869, 1.29372, 0.0261641, 0.542963, 0.417103, 0.257276, 0.569656, 0.790726, 0.092669, 0.801159, 1.38, 0.466567, 0.805908, 0.555026, 0.391949, 0.332751, 0.153402, 0.314937, 0.277442, 2.1536, 0.384176, 0.310448, 0.155238, 0.766493, 0.248735, 0.268156, 0.332343, 0.153252, 0.657703, 0.12628, 1.09627, 0.295085, 0.667746, 0.168364, 0.994112, 0.417205, 0.0781272, 1.17421, 0.081421, 0.268096, 1.23196, 0.921505, 0.42788, 0.633244, 0.093259, 0.286208, 0.158281, 0.233078, 0.909285, 1.06557, 0.478325, 0.270119, 0.158575, 0.213719, 0.265268

data shape = (5, 5, 5, 5), alpha=0.0001, beta=0.75, bias=1, size=3
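
For anyone reproducing this, below is a minimal standalone reference LRN in C++ following the ONNX definition quoted above. The flat NCHW std::vector layout and the function name lrn_reference are illustrative assumptions, not MIOpen's API; running it on the data above with the listed parameters gives the values the test output can be compared against.

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // x is a flat NCHW buffer of shape (n, c, h, w).
    std::vector<double> lrn_reference(const std::vector<double>& x,
                                      int n, int c, int h, int w,
                                      int size, double alpha, double beta, double bias)
    {
        auto idx = [=](int bi, int ci, int hi, int wi) {
            return ((static_cast<std::size_t>(bi) * c + ci) * h + hi) * w + wi;
        };
        std::vector<double> y(x.size());
        const int radius_lower = (size - 1) / 2;          // floor((size - 1) / 2)
        const int radius_upper = size - 1 - radius_lower; // ceil((size - 1) / 2)
        for(int bi = 0; bi < n; ++bi)
            for(int ci = 0; ci < c; ++ci)
                for(int hi = 0; hi < h; ++hi)
                    for(int wi = 0; wi < w; ++wi)
                    {
                        const int start = std::max(0, ci - radius_lower);
                        const int end   = std::min(c - 1, ci + radius_upper); // inclusive
                        double square_sum = 0;
                        for(int k = start; k <= end; ++k)
                        {
                            const double v = x[idx(bi, k, hi, wi)];
                            square_sum += v * v;
                        }
                        y[idx(bi, ci, hi, wi)] =
                            x[idx(bi, ci, hi, wi)] /
                            std::pow(bias + alpha / size * square_sum, beta);
                    }
        return y;
    }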


ce1adon commented Dec 5, 2019

I see your concern.
The lower and upper radii should be distinguished, since they differ when size is even (see the sketch below).
Moreover, the element corresponding to end should be included in the calculation; the current for-loop excludes it.
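
To illustrate (a hypothetical standalone check, not code from the repo), computing both radii per the ONNX window definition:

    #include <iostream>

    int main()
    {
        for(int size : {2, 3, 4, 5})
        {
            int radius_lower = (size - 1) / 2;          // floor((size - 1) / 2)
            int radius_upper = size - 1 - radius_lower; // ceil((size - 1) / 2)
            std::cout << "size=" << size << " lower=" << radius_lower
                      << " upper=" << radius_upper << "\n";
        }
        return 0;
    }

This prints lower=0 upper=1 for size=2 and lower=1 upper=2 for size=4, while odd sizes give equal radii.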


ce1adon commented Dec 7, 2019

It looks like the current ctest config for lrn is not valid.
The current ctest chooses beta=0, which turns the test into a simple memory-copy case that exercises none of lrn's features (see the worked formula below).
When beta is set to any non-zero value, the ctest fails.
Working on both using the correct verification method and modifying the kernels.
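
For reference, this follows directly from the ONNX formula: the output is

    y = x / (bias + alpha / size * square_sum)^beta

and with beta = 0 the denominator is (...)^0 = 1 for any input, alpha, and bias, so y = x and the test passes even when the summation window is wrong.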


ce1adon commented Dec 13, 2019

@scxiao
Hey Shucai, please review the solution here:
https://github.com/AMDComputeLibraries/MLOpen/pull/2305

If you still encounter issues with the test, please let us know.

daniellowell commented

Fixed in 2.2.1
