Best practices for "carrying through" type-specific operations to wrapped types #6

jthielen opened this issue Sep 24, 2021 · 10 comments


@jthielen

In many use cases, we can have "multiply nested" duck arrays rather than just two interacting duck array types. This creates a couple of issues:

Note: this issue is mostly relevant to array types higher on the DAG, as they are the ones handling the respective interactions.

Current State

Many of the libraries higher on the DAG (e.g., xarray and pint) have some form of attribute fallback to their wrapped data. However, more complex methods/operations often require direct handling (such as rechunking an underlying dask array contained in an xarray data structure) that each library has to implement itself. This approach quickly becomes impractical as more array types become involved (e.g., how can xarray do all of the pint, dask, and CuPy operations on an xarray(pint(dask(cupy))) nested array?), especially if there is no standard way of carrying these operations through "down the stack."
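
To make the "down the stack" problem concrete: when a wrapper does not implement such handling itself, the user is left doing the unwrap/rewrap dance by hand. A rough sketch for rechunking the dask array inside an xarray(pint(dask)) stack (attribute paths like .data and .magnitude follow current xarray/pint conventions; treat the exact calls as illustrative rather than canonical):

import pint
import xarray as xr

# `da` is assumed to be an xr.DataArray wrapping a pint.Quantity wrapping a dask array
dask_arr = da.data.magnitude            # unwrap: DataArray -> Quantity -> dask
rechunked = dask_arr.rechunk({0: 10})   # the dask-specific operation
da_rechunked = xr.DataArray(            # rewrap by hand, layer by layer
    pint.Quantity(rechunked, da.data.units),
    dims=da.dims, coords=da.coords, attrs=da.attrs,
)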

Specific Goals

  • Be able to seamlessly (or as seamlessly as possible) perform wrapped-type-specific operations on wrapper arrays
    • e.g., dask-specific things on an xarray(pint(dask(numpy))) nested array, or sparse-specific things on a pint(dask(sparse)) nested array

Suggested Paths Forward

Discussion

  • Is there guidance (or is there the prospect of some kind of standardization/"best practice" document) on how to perform operations specific to a wrapped array type through a wrapper array type in a way that nests well?
@SimonHeybrock

SimonHeybrock commented Sep 21, 2022

Context

I was working on a prototype for duck arrays supporting vector-valued data on bin edges for pydata/xarray#5775. This exercise escalated into a potential solution for this (#6) as well as the "layer removal" in #5.

A solution that most likely does not scale is adding a __getattr__ that forwards to wrapped duck arrays. As the wrapped array may have many properties, this would likely result in a big mess. Furthermore, it requires users to know the order of the layers of the duck-array wrapping. Finally, this approach drops properties of the current or higher layers.
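
For reference, the naive forwarding approach dismissed above might look something like this minimal sketch (class and attribute names made up for illustration):

class NaiveWrapper:
    """Forwards every unknown attribute to the wrapped array: the wrapper
    picks up an unbounded attribute surface, results are never re-wrapped,
    and name clashes between layers are resolved by wrapping order rather
    than by intent."""

    def __init__(self, wrapped):
        self._wrapped = wrapped

    def __getattr__(self, name):
        return getattr(self._wrapped, name)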

Introducing __array_property__

__array_property__ is a working title; we may also want to support setting attributes and call it, e.g., __array_getattr__ and __array_setattr__.

Example:

class BinEdgeArray(numpy.lib.mixins.NDArrayOperatorsMixin):
    def __array_ufunc__(...):  # BinEdgeArray is a duck-array
        ...
    def __array_property__(self, name, wrap):
        if name == 'left':
            return wrap(self.left)  # plain property: wrap the value in the outer layers
        if name == 'right':
            return wrap(self.right)
        if name == 'center':
            # center is a method; wrap_result (a prototype helper) wraps the
            # method's return value rather than the bound method itself
            return wrap_result(wrap)(self.center)
        if hasattr(self._values, '__array_property__'):
            # name not handled here: forward down, extending the re-wrap closure by one layer
            return self._values.__array_property__(
                name, wrap=lambda x: wrap(self.__class__(x)))
        raise AttributeError(f"{self.__class__} object has no attribute '{name}'")

Every duck array may/should define __array_property__ to expose a (small) subset of its properties. In our top-level duck-array, we will likely want to define __getattr__:

class DataArray:
    def __getattr__(self, name: str):
        # Top-level, wrap is no-op
        return self.__array_property__(name, wrap=lambda x: x)

I can then do this:

vectors = sx.VectorArray(np.arange(15).reshape(3, 5), ['vx', 'vy', 'vz'])
edges = sx.BinEdgeArray(vectors)
data = Quantity(edges, 'meter/second')
da = xr.DataArray(dims=('x', ), data=data, coords={'x': np.arange(4)})

da.units  # Unit('m/s')
da.left.units  # Unit('m/s')
da.fields['vx'].units  # Unit('m/s')
da.fields['vx'].left.units  # Unit('m/s')
da.left.fields['vx'].units  # Unit('m/s')
da.left  # DataArray[Quantity[VectorArray[np.ndarray]]]
da.left.fields['vx']  # DataArray[Quantity[np.ndarray]]
da.fields['vx'].left  # DataArray[Quantity[np.ndarray]], order does not matter
da.fields['vx'].left.units
da.fields['vx'].left.coords['x']
da.fields['vx'].magnitude.left
da.center()  # DataArray[Quantity[VectorArray[np.ndarray]]], works with methods
da.center().units  # Unit('m/s')
da.center().fields['vy'].units
da.magnitude.center()  # DataArray[np.ndarray]
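
To spell out how the wrap callbacks compose in the example above, resolving da.left proceeds roughly as follows (the closure names are of course illustrative):

# da is DataArray[Quantity[BinEdgeArray[VectorArray[np.ndarray]]]]
#
# DataArray.__getattr__('left')
#   -> DataArray.__array_property__('left', wrap=identity)
#      no 'left' here, so forward down with wrap1 = lambda x: DataArray(x, coords=...)
#   -> Quantity.__array_property__('left', wrap=wrap1)
#      no 'left' here either, forward down with wrap2 = lambda x: wrap1(Quantity(x, units))
#   -> BinEdgeArray.__array_property__('left', wrap=wrap2)
#      'left' is handled here: return wrap2(self.left)
#
# Result: DataArray(Quantity(VectorArray(...), 'm/s'), ...). Every layer that
# forwarded the lookup re-wraps the result, which is why da.left comes back as
# DataArray[Quantity[VectorArray[np.ndarray]]] in the list above.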

There is a working prototype in https://github.com/scipp/scippx, in particular https://github.com/scipp/scippx/blob/main/tests/array_property_test.py.

Notes

  • The duck-array layers are "searched" from top down, i.e., if multiple layers define the same property, the highest layer takes priority.
  • One could do this via an accessor, but chaining is cumbersome: da.props.left.props.fields['vx'].
  • Layer removal is essentially handled by the layer to be removed. For example, pint.Quantity.__array_property__ provides magnitude, which returns a non-quantity, i.e., the Quantity layer is removed by da.magnitude.
  • Each layer decides whether a property's return value needs wrapping or not. An example of the latter is the units property.

In almost all the examples I have considered so far, this mechanism allows for encapsulation of behavior within a specific duck array. That is, other layers do not need to know about specific other duck arrays or how to handle them. One special case may be a mask array: if we have DataArray[Quantity[MaskedArray]] and access da.mask, we want to remove the Quantity layer as well, as the unit applies only to the data. But maybe this indicates that Quantity[MaskedArray] is conceptually wrong, and it should be MaskedArray[Quantity] (and the effort to move units into the NumPy dtype system may confirm that suspicion)?

What other examples are there?

@jthielen

jthielen commented Sep 21, 2022

@SimonHeybrock Thanks for sharing your idea on this! In case it is helpful for anyone else, I needed to mock this up in order to wrap my head around it... some very rough code below:

import numpy as np

class HighLevelWrappingArray:
    def __init__(self, array, attrs=None):
        """An array with attrs dict and automatic exposure of these attrs through getattr.
        
        A minimally viable mock of xarray DataArray for testing __array_property__.
        """
        self._array = array
        self._attrs = attrs if attrs is not None else {}

    def __repr__(self):
        return f"{self.__class__.__name__}(\n{self._array}\n{self._attrs}\n)"

    def __array_property__(self, name, wrap):
        if hasattr(self._array, '__array_property__'):
            return self._array.__array_property__(
                name, wrap=lambda x: wrap(self.__class__(x, self._attrs))
            )
        raise AttributeError(f"{self.__class__} has no array attribute '{name}'")
            
    def __getattr__(self, name):
        if name == "data":
            return self._array
        elif name in self._attrs:
            return self._attrs[name]
        else:
            return self.__array_property__(name, wrap=lambda x: x)
        
class MidLevelWrappingArray:
    def __init__(self, array, units=None):
        """An array with units attribute.
        
        A minimally viable mock of pint Quantity for testing __array_property__.
        """
        self._magnitude = array
        self._units = units if units is not None else "dimensionless"

    def __repr__(self):
        return f"{self.__class__.__name__}(\n{self.magnitude}\n{self.units}\n)"
    
    def __array_property__(self, name, wrap):
        if name == "magnitude":
            return wrap(self._magnitude)
        if name == "units":
            return self._units
        if hasattr(self._magnitude, '__array_property__'):
            return self._magnitude.__array_property__(
                name, wrap=lambda x: wrap(self.__class__(x, self._units))
            )
        raise AttributeError(f"{self.__class__} has no array attribute '{name}'")
            
    def __getattr__(self, name):
        if name == "magnitude":
            return self._magnitude
        if name == "units":
            return self._units
        return self.__array_property__(name, wrap=lambda x: x)   
    
class LowLevelWrappingArray:
    def __init__(self, array, named_segments=None):
        """An array with special properties related to labeled segments.
        
        A minimally viable mock of Dask Array for testing __array_property__.
        """
        self._array = array
        self._segments = named_segments if named_segments is not None else {}

    def __repr__(self):
        return f"{self.__class__.__name__}(\n{self._array}\n{self._segments}\n)"
    
    def as_full_array(self):
        return self._array
    
    def get_segment(self, segment_name):
        return self._array[self._segments[segment_name]]
    
    def __array_property__(self, name, wrap):
        if name == "get_segment":
            return lambda x: wrap(self.get_segment(x))
        if name == "as_full_array":
            return lambda: wrap(self.as_full_array())
        if hasattr(self._array, '__array_property__'):
            return self._array.__array_property__(
                name, wrap=lambda x: wrap(self.__class__(x))
            )
        raise AttributeError(f"{self.__class__} has no array attribute '{name}'")    
            
    def __getattr__(self, name):
        return self.__array_property__(name, wrap=lambda x: x)
        

class NDArrayWithArrayProperty(np.lib.mixins.NDArrayOperatorsMixin):
    __array_priority__ = 20

    def __init__(self, array):
        self._array = array

    def __repr__(self):
        return f"{self.__class__.__name__}(\n{self._array}\n)"

    def __array__(self):
        return np.asarray(self._array)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        if method == '__call__':
            inputs = [arg._array if isinstance(arg, self.__class__) else arg
                      for arg in inputs]
            return self.__class__(ufunc(*inputs, **kwargs))
        else:
            return NotImplemented
    
    def __getitem__(self, key):
        return self._array[key]

    def __array_property__(self, name, wrap):
        if hasattr(self._array, name):
            return getattr(self._array, name)
        raise AttributeError(f"{self._array.__class__} has no array attribute '{name}'")

There is a fair bit of complexity here, so my points below may be the result of misinterpreting this concept. If so, please let me know where I misunderstood something!

As written/proposed, I'd be concerned about its fragility (i.e., needing every array layer to implement the __getattr__ fallback and __array_property__ layer descent/inheritance properly without recursion, or else things break) and its high degree of boilerplate code; but if many of the re-usable components were encapsulated into a helper package (as mixins, decorators, or functions), then this would be much less problematic. The other concern I had (which I'm not sure we could avoid with this kind of protocol) is that it leads to a lot of extra work for duck array libraries (including NumPy itself), as essentially their entire API would need to be handled property-by-property, or at least annotated into groups of

  • non-callable properties that return an instance of self and are wrappable
  • (maybe) non-callable properties that return an instance of self and are not intended to be wrapped
  • non-callable properties that return what they wrap (which is wrappable)
  • non-callable properties that are not duck arrays
  • methods, likewise, for each of the above

and then handled groupwise.

That all being said, my hunch is that with type annotations, there could be enough information for standardized tooling to parse all this and set up the __array_property__ internals "auto-magically." So, with that in mind, would proposing an alternative that maintains a very similar internal wrapping/deferral pattern to the suggested __array_property__, but operates through an external toolkit (with decorators/mixins/functions), make sense? Perhaps something that looks like

from typing import Any

from wrapping_arrays import RegisterWrappingArrayType, get_wrapped_property

@RegisterWrappingArrayType
class HighLevelWrappingArray:
    ...

    def __getattr__(self, name: str) -> Any:
        # any built-in __getattr__
        return get_wrapped_property(self._rewrap_contents, self._wrapped_array, name)

A nice (potential) benefit of not enforcing a protocol all the way to the bottom of the type casting hierarchy in order to make this work out is that only arrays that wrap other arrays would need to implement this (e.g., xarray, pint, dask; not cupy, numpy, etc.). Instead, at the lowest level (e.g., when a wrapped type shows up that is not registered as a wrapping type), the property behavior can simply be determined by the Array API.

Also, given that a "Duck Array DAG Library" was already proposed in #3, these property utilities would have a natural home there.

@SimonHeybrock

SimonHeybrock commented Sep 22, 2022

As written/proposed, I'd be concerned about its fragility (i.e., needing every array layer to implement the getattr fall back and array_property layer descent/inheritance properly without recursion, or else things break) and high degree of boilerplate code, but if many of the re-usable components were encapsulated into a helper package (as mixins, decorators, or functions), then this is much less problematic.

I absolutely share your concern about complexity, but doesn't this decoupled approach actually reduce the overall complexity? Individual packages need to know less about each other, and the need for helper packages like pint-xarray may be reduced?

More specifically, implementing __getattr__ on all levels is not necessary; in principle it is sufficient to have it at the top level (though for UX, having it in every layer may be an advantage). Yes, I have also considered mixins. Another option may be to provide decorators so that duck-array implementers can decorate/mark individual properties or methods in their array class as "array-properties" to be accessible on all levels.

The other concern I had (which I'm not sure if we could avoid with this kind of protocol) is that this leads to a lot of extra work for duck array libraries (including NumPy itself), as essentially their entire API would need to be handled property-by-property, or at least annotated into groups of [...]

Not sure I understand this one. Several comments:

  • Unless NumPy would wrap other array implementations (can numpy.ma do that?) there is no need for change in NumPy.
  • I am not suggesting to provide access to the entire API via __array_property__. If that is what we want then we can just implement __getattr__ to forward to the wrapped array.
  • To avoid a complete mess of properties on higher levels (with multiple duck-array layers) we have to severely limit the number of properties/methods that are exposed by each layer. If, e.g., pint.Quantity would list most of its interface in __array_property__ we'd quickly be in trouble. Instead, what I am proposing is to provide a minimal but sufficient set of methods/properties. This might be something the community agrees on.

For example, for Pint this might simply be units and magnitude, because this is conceptually what a quantity is and thus a DataArray should have those properties --- from a point of view of a user who does not know/care about layers of duck arrays and may instead think of everything in a single class.

So, with that in mind, would proposing an alternative that maintains a very similar internal wrapping/deferral pattern to the suggestion of array_property, but operates through an external toolkit (with decorators/mixins/functions) make sense? Perhaps something that looks like [...]

I am not sure I understand your suggestion. It sounds like just forwarding to any attribute found on lower levels (plus the rewrap)? How does this address the issue of drowning in a sea of attributes?

A nice (potential) benefit of not enforcing a protocol all the way to the bottom of the type casting hierarchy in order to make this work out is that only arrays that wrap other arrays would need to implement this (e.g., xarray, pint, dask; not cupy, numpy, etc.).

Neither does my suggestion. There are no changes/patches to NumPy in my prototype.

@jthielen

jthielen commented Sep 22, 2022

Thanks for the thorough reply! Unfortunately, though, I think we both have not quite been understanding each other's points. My apologies if any of that was due to a lack of clarity and/or diligence on my part; I'll try to do better below. Hopefully the back-and-forth on these points of discussion will help us work towards a better and more mutually understood solution!

As written/proposed, I'd be concerned about its fragility (i.e., needing every array layer to implement the getattr fall back and array_property layer descent/inheritance properly without recursion, or else things break) and high degree of boilerplate code, but if many of the re-usable components were encapsulated into a helper package (as mixins, decorators, or functions), then this is much less problematic.

I absolutely share your concern about complexity, but doesn't this decoupled approach actually reduce the overall complexity?

My latter idea about how to implement this "carrying through" of properties should be decoupled (from the viewpoint of direct array-library-to-array-library interactions) in the same fashion as the former protocol. It is of course still coupled to a common library, but that should be lightweight from an implementation perspective and would most likely already exist to some extent due to #3. Also, I was originally referring to the complexity of code that needs to exist within any given array wrapping library, or equivalently, what we would need to instruct libraries to implement to support this. Overall, yes, having a separate package encapsulate these details would be more complex, but that complexity would exist within a single shared library, which I believe would lead to fewer points of failure.

Individual packages need to know less about each other, and the need for helper packages like pint-xarray may be reduced?

Yes, either approach suggested so far (and indeed, any full solution to this GitHub issue) should reduce the need for such helper packages, as well as support things that take way too much manual unwrapping and rewrapping at the moment.

More specifically, implementing __getattr__ on all levels is not necessary, in principle it is sufficient to have it at the top level (but for UX having it in every layer may be an advantage).

Unfortunately, there is no such thing as the top level in this context; any array type that wraps other arrays could end up as the outermost layer if the higher-level types are not in use. So, for example, we need to fully support situations of pint Quantities wrapping, say, Sparse arrays just the same as xarray DataArrays wrapping pint Quantities wrapping anything else. If attribute access is the way forward for this, then we'd need to work through __getattr__ on all wrapping array types.

Yes, I have also considered mixins. Another option may be to provide decorators to duck-array implementers can decorate/mark individual properties or methods in their array class as "array-properties" to be accessible on all levels.

Yes indeed! In the partial-API case, that kind of decorator was exactly what I was suggesting. Though, depending on how much information could be parsed from the property signature / type annotation, there may need to be several variants of such decorators, since some properties will need to be rewrapped, some exposed as-is, and others handled as callables (with or without rewrapping the output, rather than the callable itself).
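
For concreteness, such a decorator/mixin pair might look roughly like the following (all names, including the _wrapped/_rewrap hooks, are hypothetical; this sketch only covers non-callable properties and ignores the callable cases mentioned above):

def array_property(rewrap=True):
    """Mark a property getter as exposed through __array_property__."""
    def mark(func):
        func._array_property = {'rewrap': rewrap}
        return func
    return mark


class ArrayPropertyMixin:
    """Generic __array_property__ driven by the marks, instead of a
    hand-written if/elif chain in every wrapping library."""

    def __array_property__(self, name, wrap):
        attr = getattr(type(self), name, None)
        getter = attr.fget if isinstance(attr, property) else attr
        info = getattr(getter, '_array_property', None)
        if info is not None:
            value = getattr(self, name)
            return wrap(value) if info['rewrap'] else value
        inner = self._wrapped  # hypothetical: each wrapper stores its inner array here
        if hasattr(inner, '__array_property__'):
            return inner.__array_property__(
                name, wrap=lambda x: wrap(self._rewrap(x)))  # hypothetical re-wrap hook
        raise AttributeError(f"{type(self).__name__} has no array attribute '{name}'")


# Marked up on a pint-like layer, this would read:
class MarkedQuantity(ArrayPropertyMixin):
    def __init__(self, magnitude, units):
        self._wrapped = magnitude
        self._units = units

    def _rewrap(self, x):
        return MarkedQuantity(x, self._units)

    @property
    @array_property(rewrap=False)
    def units(self):
        return self._units

    @property
    @array_property(rewrap=True)
    def magnitude(self):
        return self._wrapped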

The other concern I had (which I'm not sure if we could avoid with this kind of protocol) is that this leads to a lot of extra work for duck array libraries (including NumPy itself), as essentially their entire API would need to be handled property-by-property, or at least annotated into groups of [...]

Not sure I understand this one. Several comments:

  • Unless NumPy would wrap other array implementations (can numpy.ma do that?) there is no need for change in NumPy.

As far as I understood the __array_property__ hand-off to lower types, the lowest type in the stack of wrapped arrays (e.g., NumPy, CuPy) still needs to have __array_property__ implemented in order for their properties to be handled through the protocol. For example, how are properties that could exist on the lowest-level array like shape, T, or mean() handled through __array_property__?

  • I am not suggesting to provide access to the entire API via __array_property__. If that is what we want then we can just implement __getattr__ to forward to the wrapped array.
  • To avoid a complete mess of properties on higher levels (with multiple duck-array layers) we have to severely limit the number of properties/methods that are exposed by each layer. If, e.g., pint.Quantity would list most of its interface in __array_property__ we'd quickly be in trouble. Instead, what I am proposing is to provide a minimal but sufficient set of methods/properties. This might be something the community agrees on.
    For example, for Pint this might simply be units and magnitude, because this is conceptually what a quantity is and thus a DataArray should have those properties --- from a point of view of a user who does not know/care about layers of duck arrays and may instead think of everything in a single class.

I think this was the key misunderstanding I had about your initial idea and likely the fulcrum of our diverging perspectives on this topic. I also apologize for my exaggerated and poor phrasing of "essentially their entire API"...in retrospect I should have said something more like "a significant portion of their API", as many portions of the API are of course either not relevant to be user-exposed in this way (e.g., format conversions, alternate constructors) or consist of operations handled through other means (e.g., math operations).

I would still disagree though with severely limiting the properties exposed through this mechanism. When using multiply nested arrays, my use cases may demand any number of different aspects of the APIs of each array layer. So, if only a small portion of the API were to be carried through to higher level types, then I am no better off than I am at present (needing to manually unwrap and rewrap). To borrow your example, say pint.Quantity only exposed magnitude and units. How would a user of an xarray-wrapped pint.Quantity then use pint methods/properties like .to(), .to_base_units(), .m_as(), and .dimensionality? I don't think it would be realistic to satisfy the user base in this wide community by picking and choosing what is or is not sufficient for them.

So, given that the breadth of user expectations likely demands that substantial portions of every layer's API are exposed, but (as you said) also having the need to avoid a mess of properties on higher levels, I think our back-and-forth has revealed the need for some kind of namespace to be a core consideration. How to name these "wrapped array property namespaces" and how they relate to/differ from something like xarray's accessors (or even whether a separate user interface, rather than attached method/property chaining, needs to be considered) would definitely need to be further thought out.

So, with that in mind, would proposing an alternative that maintains a very similar internal wrapping/deferral pattern to the suggestion of array_property, but operates through an external toolkit (with decorators/mixins/functions) make sense? Perhaps something that looks like [...]

I am not sure I understand your suggestion. It sounds like just forwarding to any attribute found on lower levels (plus the rewrap)? How does this address the issue of drowning in a sea of attributes?

This would indeed forward to any registered attribute found on lower levels plus conditionally (based on what kind of attribute) rewrapping. Given that such collections of registered attributes would likely still need to be large (see prior point), this indeed does not address the issue of drowning in a sea of attributes, so we'd need something like namespaces to avoid that.

A nice (potential) benefit of not enforcing a protocol all the way to the bottom of the type casting hierarchy in order to make this work out is that only arrays that wrap other arrays would need to implement this (e.g., xarray, pint, dask; not cupy, numpy, etc.).

Neither does my suggestion. There are no changes/patches to NumPy in my prototype.

I think I'm missing something here...as alluded to previously, how are potentially low-level attributes (like .T, .shape, .max()) that exist on the innermost array layer handled through the protocol?

@SimonHeybrock

As far as I understood the array_property hand-off to lower types, the lowest type in the stack of wrapped arrays (e.g., NumPy, CuPy) still needs to have array_property implemented in order for their properties to be handled through the protocol. For example, how are properties that could exist on the lowest-level array like shape, T, or mean() handled through array_property?

Maybe this is the key misunderstanding in our discussion: I had considered these properties as something that most array implementations provide anyway? If we assume that array implementers stick close to the Python Array API there is no need to handle any of those properties with the proposed mechanism. __array_property__ would only be used for additional properties (such as units or dims or masks).
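
A small sketch of that division of labour (the class is illustrative, not any particular library): standard Array API members are implemented directly on each wrapper, so only the layer-specific extras go through the protocol:

class UnitArray:
    def __init__(self, magnitude, units):
        self._magnitude = magnitude
        self._units = units

    # Array API surface: implemented directly on the wrapper (shape, dtype,
    # indexing, arithmetic, ...), never routed through __array_property__.
    @property
    def shape(self):
        return self._magnitude.shape

    def __getitem__(self, key):
        return UnitArray(self._magnitude[key], self._units)

    # Extras the Array API knows nothing about: exposed via the protocol so
    # outer layers (e.g. a DataArray) can surface them.
    def __array_property__(self, name, wrap):
        if name == 'units':
            return self._units
        if name == 'magnitude':
            return wrap(self._magnitude)
        if hasattr(self._magnitude, '__array_property__'):
            return self._magnitude.__array_property__(
                name, wrap=lambda x: wrap(UnitArray(x, self._units)))
        raise AttributeError(name)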

@jthielen

jthielen commented Sep 22, 2022

As far as I understood the array_property hand-off to lower types, the lowest type in the stack of wrapped arrays (e.g., NumPy, CuPy) still needs to have array_property implemented in order for their properties to be handled through the protocol. For example, how are properties that could exist on the lowest-level array like shape, T, or mean() handled through array_property?

Maybe this is the key misunderstanding in our discussion: I had considered these properties as something that most array implementations provide anyway? If we assume that array implementers stick close to the Python Array API there is no need to handle any of those properties with the proposed mechanism. __array_property__ would only be used for additional properties (such as units or dims or masks).

Yes, that is another key misunderstanding, thank you for clarifying! Coming from a pint-informed mindset where Array-API-type things are implemented through this kind of attribute deferral, it had not occurred to me that Array API details should be handled separately within array wrapping types rather than deferred. However, declaring that "wrapping array implementors should ensure they implement the Array API apart from this attribute deferral protocol" seems like a reasonable and clean-cut stance to take!

@SimonHeybrock

To borrow your example, say pint.Quantity only exposed magnitude and units. How would a user of an xarray-wrapped pint.Quantity then use pint methods/properties like .to(), .to_base_units(), .m_as(), and .dimensionality? I don't think it would be realistic to satisfy the user base in this wide community by picking and choosing what is or is not sufficient for them.

Indeed, I am still struggling with that one. One option I am considering is providing an accessor, not with Xarray's mechanism (which does not work with wrapping) but via __array_property__:

da.pint.to('mm')  # `pint` property found via `__array_property__`

The helper object returned by the pint property could then perform the required re-wrapping.
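
Sketching that out (the helper class and its placement are hypothetical; the actual prototype may do this differently): the pint name handled in Quantity.__array_property__ would return a small object that closes over both the bare quantity and the wrap callback, so any of its methods can re-wrap their result in all outer layers:

class PintAccessor:
    def __init__(self, quantity, wrap):
        self._quantity = quantity
        self._wrap = wrap

    def to(self, units):
        # convert on the bare Quantity, then re-wrap into DataArray/etc.
        return self._wrap(self._quantity.to(units))


# inside Quantity.__array_property__(self, name, wrap):
#     if name == 'pint':
#         return PintAccessor(self, wrap)

With that, da.pint.to('mm') would come back as a DataArray wrapping the converted Quantity.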

So, given that the breath of user expectations likely demands that substantial portions of every layer's API is exposed, but (as you said) also having the need to avoid a mess of properties on higher levels, I think our discussion back-and-forth has revealed the need for some kind of namespace to be a core consideration. How to name these "wrapped array property namespaces" and how they relate to/differ from something like xarray's accessors (or even if a separate, rather than attached method/property chaining, user interface needs to be considered) would definitely need to be further thought out.

Your suggestion sounds similar to the pint property I mention above. I think we need to keep usability in mind as well: a user may not know or care that they are using pint for physical units. All they may want is to have a units property on the data array. Of course this has limitations, i.e., we will still need things that are explicitly namespaced, but I feel there is a need for standardizing basic concepts (similar to how the Python Array API standard does) that all implementing libraries might have (though to be honest I am not sure there are many good examples beyond a units property).

@SimonHeybrock

I slightly extended my prototype to support dask:

array = dask.array.arange(15).reshape(3, 5)
vectors = sx.VectorArray(array, ['vx', 'vy', 'vz'])
edges = sx.BinEdgeArray(vectors)
data = Quantity(edges, 'meter/second')
masked = sx.MultiMaskArray(data,
                           masks={'mask1': np.array([False, False, True, False])})
da = xr.DataArray(dims=('x', ), data=masked, coords={'x': np.arange(4)})
result = da.xcompute()  # named xcompute here to avoid calling xarray.compute

The thing to note here is that xr.DataArray (or any of the other layers) is completely unaware of dask! That is, one could have a simple DataArray implementation that does not implement the dask collections interface and this would still work.

@jthielen

@SimonHeybrock Thanks for the follow-ups! In the coming days, I'll be writing up a thorough post for the Scientific Python discourse on the topics in this repo, and I plan on using what you have here for __array_property__ as a first-draft proposal for addressing type-specific operations between wrapped array layers (leaving my helper-package ideas as utilities to make implementing the protocol easier and/or with cleaner code). If you come up with any new insights on the protocol (particularly with regard to namespaces/type-grouping), please do mention them here! Otherwise, I'll contextualize those as an area in need of further work in that discourse write-up.

@SimonHeybrock

Thanks. I'll keep looking into this as well. I think I am finding more problems, such as NEP-35 (which says it fixes shortcomings of NEP-18) still having shortcomings (or maybe just its implementation?): the like argument is not forwarded to the __array_function__ of the type of the like argument. This means that np.empty and friends cannot recreate nested duck-array hierarchies.
