convolve expects both data and weights to have the exact same types #17

rkshthrmsh opened this issue Aug 6, 2023 · 1 comment

@rkshthrmsh

I am trying to convolve over the 2D arrays in an image cube and noticed the following mismatched types error.

// Assumed imports for this snippet (not shown in the original report):
use ndarray::{arr2, Array3};
use ndarray::parallel::prelude::*; // requires ndarray's "rayon" feature
use ndarray_ndimage::BorderMode;

pub fn denoise(hsi: &mut Array3<f32>) {
    let corner_kernel = arr2(&[[0.25f32, 0., 0.25], [0., 0., 0.], [0.25, 0., 0.25]]);
    // `bnd` is an `ArrayViewMut2<f32>` here, while `corner_kernel.view()`
    // is an `ArrayView2<f32>`, hence the mismatch below.
    hsi.outer_iter_mut().into_par_iter().for_each(|bnd| {
        let corner_average =
            ndarray_ndimage::convolve(&bnd, &corner_kernel.view(), BorderMode::Mirror, ORIGIN);
    });
// --snip-- (`ORIGIN` is a constant defined elsewhere in the caller's crate)
}
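A possible workaround on the caller's side (an untested sketch, not from the issue itself): reborrow the mutable view as a shared view, so both arguments end up with storage `ViewRepr<&f32>`:

```rust
// Untested sketch: `bnd.view()` reborrows the `ArrayViewMut2<f32>` as an
// `ArrayView2<f32>`, which matches the type of `corner_kernel.view()`.
let corner_average =
    ndarray_ndimage::convolve(&bnd.view(), &corner_kernel.view(), BorderMode::Mirror, ORIGIN);
```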

Error:

error[E0308]: mismatched types
   --> src/lib.rs:93:45
    |
93  |             ndarray_ndimage::convolve(&bnd, &corner_kernel.view(), BorderMode::Mirror, ORIGIN);
    |             -------------------------       ^^^^^^^^^^^^^^^^^^^^^ types differ in mutability
    |             |
    |             arguments to this function are incorrect
    |
    = note: expected reference `&ArrayBase<ViewRepr<&mut f32>, Dim<[usize; 2]>>`
               found reference `&ArrayBase<ViewRepr<&f32>, Dim<[usize; 2]>>`

I believe this is because of convolve's signature. Since data and weights share the same generic storage parameter S, the compiler requires both arguments to have exactly the same type, which, I suspect, is probably not necessary?

pub fn convolve<S, A, D>(
    data: &ArrayBase<S, D>,
    weights: &ArrayBase<S, D>,
    mode: BorderMode<A>,
    origin: isize
) -> Array<A, D>
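The constraint can be reproduced without ndarray. The stand-alone sketch below (stand-in `Owned`/`View` types and hypothetical `dot_same`/`dot_mixed` functions, all invented for illustration) shows how a single storage generic forces both arguments to the same type, and how a second, independent generic parameter relaxes it. Presumably convolve could be changed analogously to take `data: &ArrayBase<S, D>` and `weights: &ArrayBase<S2, D>`:

```rust
// `Owned` and `View` are stand-ins for ndarray's `OwnedRepr` / `ViewRepr`.
trait Data {
    type Elem;
    fn get(&self, i: usize) -> Self::Elem;
}

struct Owned(Vec<f32>);
struct View<'a>(&'a [f32]);

impl Data for Owned {
    type Elem = f32;
    fn get(&self, i: usize) -> f32 { self.0[i] }
}
impl<'a> Data for View<'a> {
    type Elem = f32;
    fn get(&self, i: usize) -> f32 { self.0[i] }
}

// Like the current `convolve<S, A, D>`: one storage parameter, so both
// arguments must have the *same* storage type.
#[allow(dead_code)]
fn dot_same<S: Data<Elem = f32>>(data: &S, weights: &S, n: usize) -> f32 {
    (0..n).map(|i| data.get(i) * weights.get(i)).sum()
}

// Relaxed version: independent parameters `S` and `S2`, so an owned
// container and a view can be mixed freely.
fn dot_mixed<S, S2>(data: &S, weights: &S2, n: usize) -> f32
where
    S: Data<Elem = f32>,
    S2: Data<Elem = f32>,
{
    (0..n).map(|i| data.get(i) * weights.get(i)).sum()
}

fn main() {
    let owned = Owned(vec![1.0, 2.0, 3.0]);
    let backing = [0.25, 0.5, 0.25];
    let view = View(&backing);

    // dot_same(&owned, &view, 3); // would not compile: `Owned` != `View`
    let r = dot_mixed(&owned, &view, 3); // 0.25 + 1.0 + 0.75
    println!("{r}");
}
```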
@nilgoyette
Contributor

You're right, it should not be necessary for both arrays to have the exact same type. I could accept mut/not-mut or owned/view without any problem. I'm allocating and copying a new weights anyway because it needs to be reversed (which is totally useless for the symmetric kernel you're using... I guess that could be optimized!)

It's probably defined that way because I simply copied data's definition without thinking about this case.
