completed l2pool2d #68

Merged 8 commits into webmachinelearning:main on Apr 16, 2024

Conversation

@mei1127 (Contributor) commented Jan 10, 2024

@BruceDai @huningxin @fdwr PTAL, thanks!

src/reduce.js Outdated
@@ -143,6 +143,20 @@ export function reduceL1(input, options = {}) {
return reduceSum(abs(input), options);
}

/* The l2 reducer */
export function l2Reducer(previousValue, currentValue, currentIndex, array) {
if (currentIndex== 1) {

[nit] if (currentIndex == 1) {

src/reduce.js Outdated
/* The l2 reducer */
export function l2Reducer(previousValue, currentValue, currentIndex, array) {
if (currentIndex== 1) {
const sumOfSquares = previousValue*previousValue + currentValue * currentValue;

[nit] ... = previousValue * previousValue ...

src/reduce.js Outdated
/* The l2 reducer */
export function l2Reducer(previousValue, currentValue, currentIndex, array) {
if (currentIndex== 1) {
const sumOfSquares = previousValue*previousValue + currentValue * currentValue;

@fdwr commented Jan 12, 2024

Hmm, if this is the first index, wouldn't the previousValue be ignorable?
const sumOfSquares = currentValue * currentValue;

Also, does this work correctly if there is only one element being reduced? Otherwise the final sqrt isn't applied. Let's be sure to include a test case with exactly one element @BruceDai. I would have expected:

if (currentIndex == 0)
    previousValue = 0;

const sumOfSquares = previousValue + currentValue * currentValue;
return (currentIndex === array.length - 1) ? Math.sqrt(sumOfSquares) : sumOfSquares;
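
For reference, a minimal standalone sketch (not part of this PR) of why a one-element reduction deserves a test: Array.prototype.reduce called without an initial value on a single-element array returns that element without ever invoking the reducer, so neither the squaring nor the final sqrt would run.

// Sketch only: the callback is never called for a one-element array
// when reduce() is given no initial value.
const single = [3];
const result = single.reduce((previousValue, currentValue) => {
  throw new Error('never reached for a one-element array without an initial value');
});
console.log(result); // 3 (the element itself, returned untouched)
// For non-negative values this coincides with Math.sqrt(3 * 3), but a
// negative element would come back with the wrong sign.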

@mei1127 (Contributor, Author)

In this function, currentIndex starts from 1 by default: Array.prototype.reduce is called without an initial value, so the element at index 0 becomes the first previousValue. That's why I handle the first step separately:
if (currentIndex == 1) {
  const sumOfSquares = previousValue * previousValue + currentValue * currentValue;
}

E.g.:
const x = new Tensor([1, 1, 4, 4], [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]);
const windowDimensions = [3, 3];
log:
currentIndex: 1 previousValue: 1 currentValue: 2
currentIndex: 2 previousValue: 5 currentValue: 3
currentIndex: 3 previousValue: 14 currentValue: 5
currentIndex: 4 previousValue: 39 currentValue: 6
currentIndex: 5 previousValue: 75 currentValue: 7
currentIndex: 6 previousValue: 124 currentValue: 9
currentIndex: 7 previousValue: 205 currentValue: 10
currentIndex: 8 previousValue: 305 currentValue: 11
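
For reference, a minimal sketch (not from the PR) of the default reduce behavior the log above reflects: without an initial value, array[0] becomes the first previousValue and the callback's first call has currentIndex === 1.

// Minimal sketch of Array.prototype.reduce without an initial value.
[1, 2, 3].reduce((previousValue, currentValue, currentIndex) => {
  console.log(`currentIndex: ${currentIndex} previousValue: ${previousValue} currentValue: ${currentValue}`);
  return previousValue + currentValue;
});
// currentIndex: 1 previousValue: 1 currentValue: 2
// currentIndex: 2 previousValue: 3 currentValue: 3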

currentIndex starts from 1 by default

Huh, why doesn't the first reduced element start at 0? Don't JS arrays start at 0?

Btw, just to catch any interesting boundary cases, I'd include one test where the values don't start at 1 (e.g. [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16] -> [16, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1]).
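
For what it's worth, a hypothetical helper (not part of this PR) could generate the expected values for such a reversed-input case with a plain reference loop, assuming a single 4x4 plane, a 3x3 window, stride 1, and no padding:

// Hypothetical helper: brute-force l2 pooling over one 4x4 plane to produce
// expected values for a reversed-input test (stride 1, no padding).
const input = [16, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1];
const [height, width] = [4, 4];
const [windowHeight, windowWidth] = [3, 3];
const expected = [];
for (let oh = 0; oh + windowHeight <= height; ++oh) {
  for (let ow = 0; ow + windowWidth <= width; ++ow) {
    let sumOfSquares = 0;
    for (let kh = 0; kh < windowHeight; ++kh) {
      for (let kw = 0; kw < windowWidth; ++kw) {
        const value = input[(oh + kh) * width + (ow + kw)];
        sumOfSquares += value * value;
      }
    }
    expected.push(Math.sqrt(sumOfSquares));
  }
}
console.log(expected); // 2x2 output, flattened row-major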

@mei1127 (Contributor, Author)

it('l2Pool2d nhwc', function() {
const x = new Tensor([1, 4, 4, 1], [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]);
const windowDimensions = [3, 3];
const layout = 'nhwc';

@fdwr commented Jan 12, 2024

(update) I found a use case for normalizing across batches and spatial dimensions - this may inform future decisions: https://ieeexplore.ieee.org/document/8648206

🤔 I've often wondered why pooling in ML doesn't have variations that can pool across batches or channel dimensions too, not only the "spatial" dimensions. e.g. One could generalize pooling consistently like reduction (which can apply from all dimensions to none), and pass windowDimensions of rank 4 with [3, 1, 3, 3] to pool along batches too, or just [1, 3, 1, 1] for only channels. Also it would eliminate the then redundant layout parameter. Now, DML can achieve the latter via explicit strides (to map the first dimension to the last dimensions), but there's no way to access it via WebNN, and I've never heard of a need for it anyway. 🤷 I suppose if a researcher really wanted it, they could transpose the batch dimension to the X dimension, pool, then transpose back; and if there has been such a use case, that's probably exactly what they did.

*No action expected - just musing after noticing this.

14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25,
]);
const windowDimensions = [5, 5];
const autoPad = 'same-upper';

Oh, we're still using autoPad for pooling.

@huningxin, you opened webmachinelearning/webnn#326 where we talked about removing autoPad from conv2d, and @Honry removed it from the WebNN EP in microsoft/onnxruntime#18688, but for consistency, wouldn't we want to be explicit with pooling too?

Contributor

but for consistency, wouldn't we want to be explicit with pooling too?

Yes, we should be consistent and only do explicit padding for pooling.
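
As an aside on what "explicit" would mean for this test, here is a hedged sketch (not from this PR) of computing the beginning/ending padding that 'same-upper' implies for one axis, following the convention that any extra padding goes at the end:

// Sketch: explicit padding equivalent to autoPad 'same-upper' for one axis;
// when the total padding is odd, the extra unit goes at the end ("upper").
function sameUpperPadding(inputSize, windowSize, stride = 1, dilation = 1) {
  const effectiveWindow = (windowSize - 1) * dilation + 1;
  const outputSize = Math.ceil(inputSize / stride);
  const total = Math.max(0, (outputSize - 1) * stride + effectiveWindow - inputSize);
  const begin = Math.floor(total / 2);
  return [begin, total - begin];
}
// E.g., assuming a spatial size of 5 with the 5x5 window above:
console.log(sameUpperPadding(5, 5)); // [2, 2]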

32.4037034920393,
];
utils.checkValue(y, expected);
});

@fdwr commented Jan 12, 2024

Shall we add a single element test case here (e.g. const x = new Tensor([1, 1, 1, 1], ...)), or wait for the WPT?

@mei1127 (Contributor, Author)

Okay, I have added a single element test case.
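
A single-element case might look something like the sketch below (hypothetical; the test actually added may differ), assuming the l2Pool2d(input, options) signature and the Tensor/utils helpers used in the surrounding tests:

it('l2Pool2d single element', function() {
  // Hypothetical sketch: sqrt(2 * 2) === 2, so the output equals the lone value.
  const x = new Tensor([1, 1, 1, 1], [2]);
  const windowDimensions = [1, 1];
  const y = l2Pool2d(x, {windowDimensions});
  utils.checkValue(y, [2]);
});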

@fdwr left a comment

I have a question, but assuming it's correct, and assuming we add a single element test case here or in the WPT, then LGTM.

@fdwr left a comment

Thanks Mei.

@BruceDai mentioned this pull request on Apr 9, 2024
@huningxin merged commit a9f2ded into webmachinelearning:main on Apr 16, 2024
3 checks passed