Is your feature request related to a problem? Please describe.
I tried defining a `Compose` nested within another `Compose`, with a different `map_items` argument, but the value from the parent `Compose` does not get overridden. It only worked once I defined the parent `Compose` as a torchvision one instead of a MONAI one, because the torchvision `Compose` has no `map_items` argument to pass down.
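A minimal sketch of the two cases in plain Python (the toy list data and the wrapping lambda are just for illustration, not my actual pipeline):

```python
from monai.transforms import Compose, Lambda
import torchvision.transforms

# Child Compose that should receive the whole list at once and wrap it into a dict.
inner = Compose(
    [Lambda(func=lambda x: {"input": x})],
    map_items=False,
)

# MONAI parent: its map_items=True is used when calling `inner`, so `inner` only
# ever sees individual elements and its own map_items=False has no effect.
monai_parent = Compose([inner], map_items=True)
print(monai_parent([1, 2, 3]))
# -> [{'input': 1}, {'input': 2}, {'input': 3}]

# torchvision parent: it simply calls each child on the full input, so the child's
# map_items=False is honoured. This is the workaround described above.
tv_parent = torchvision.transforms.Compose([inner])
print(tv_parent([1, 2, 3]))
# -> {'input': [1, 2, 3]}
```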
Describe the solution you'd like
A nested `Compose` should override the parent `Compose`'s arguments.
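Referring to the sketch above, the behaviour I would like (not what currently happens) is:

```python
# Desired: the child's map_items=False takes precedence over the parent's
# map_items=True, so the MONAI parent behaves like the torchvision one here.
monai_parent = Compose([inner], map_items=True)
print(monai_parent([1, 2, 3]))
# desired -> {'input': [1, 2, 3]}
# current -> [{'input': 1}, {'input': 2}, {'input': 3}]
```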
Describe alternatives you've considered
Additional context
A MONAI Bundle Config style example of what I'm doing is below. In this example, I need to wrap the final list into a dict to indicate that it is an input only, and not an input and a target.
system#datasets#train#transform:
  # Parent torchvision Compose so that the monai Compose children can have different map_items.
  # A parent monai Compose would force all children to have its map_items.
  _target_: torchvision.transforms.Compose
  transforms:
    - _target_: monai.transforms.Compose
      # `map_items=True` - When input to a Transform is a list, apply the transform to each element of the list.
      map_items: True
      transforms:
        - _target_: monai.transforms.LoadImage
          image_only: True
        - _target_: monai.transforms.EnsureChannelFirst
        - _target_: monai.transforms.Orientation
          axcodes: SPL
          lazy: True
        - _target_: monai.transforms.Spacing
          pixdim: "%CONSTANTS#SPACING"
          mode: bilinear
          lazy: True
        - _target_: monai.transforms.CropForeground
          lazy: True
        - _target_: monai.transforms.SpatialPad
          spatial_size: "%CONSTANTS#PATCH_SIZE_VOXELS"
          value: -1024
          lazy: True
        - _target_: monai.transforms.ScaleIntensityRange
          a_min: -1024
          a_max: 2048
          b_min: 0
          b_max: 1
          clip: True
        # Turn into a dict of views
        - _target_: monai.transforms.Lambda
          func: '$lambda x: {"view_0": x}'
        # From this point it's not a Tensor anymore, but a list of NUM_CONTRASTIVE_CROPS Tensors
        - _target_: monai.transforms.RandSpatialCropSamplesd
          keys: view_0
          roi_size: "%CONSTANTS#PATCH_SIZE_VOXELS"
          num_samples: "%CONSTANTS#NUM_CONTRASTIVE_CROPS"
          lazy: True
        # Duplicate the view
        - _target_: monai.transforms.CopyItemsd
          keys: view_0
          names: view_1
        - _target_: project.transforms.ssl.DictifyTransform
          keys: view_0
          transform:
            _target_: project.transforms.ssl.RandomResizedCrop
            size: "%CONSTANTS#PATCH_SIZE_VOXELS"
            scale: [0.33, 1.0]
        - _target_: project.transforms.ssl.DictifyTransform
          keys: view_1
          transform:
            _target_: project.transforms.ssl.RandomResizedCrop
            size: "%CONSTANTS#PATCH_SIZE_VOXELS"
            scale: [0.33, 1.0]
        - _target_: monai.transforms.RandFlipd
          keys: view_0
          spatial_axis: [0]
          prob: 0.5
          lazy: True
        - _target_: monai.transforms.RandFlipd
          keys: view_1
          spatial_axis: [0]
          prob: 0.5
          lazy: True
        - _target_: monai.transforms.RandHistogramShiftd
          keys: view_0
          prob: 0.5
          num_control_points: 5 # Makes distortion stronger
        - _target_: monai.transforms.RandHistogramShiftd
          keys: view_1
          prob: 0.5
          num_control_points: 5 # Makes distortion stronger
        - _target_: monai.transforms.RandGaussianSmoothd
          keys: view_0
          prob: 0.5
        - _target_: monai.transforms.RandGaussianSmoothd
          keys: view_1
          prob: 0.5
        - _target_: monai.transforms.ToTensord
          keys: ["view_0", "view_1"]
          track_meta: False
        - _target_: monai.transforms.Lambda
          func: '$lambda x: (x["view_0"], x["view_1"])'
    # Disable map_items so that the whole list of NUM_CONTRASTIVE_CROPS Tensors is wrapped into input instead of its elements
    - _target_: monai.transforms.Compose
      map_items: False
      transforms:
        - _target_: monai.transforms.Lambda
          func: '$lambda x: {"input": x}'
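For completeness, this is roughly how the transform is instantiated from the config; the file name, the `CONSTANTS` section, and storing the section under the id `system#datasets#train#transform` are assumptions for this sketch, not my exact project setup:

```python
from monai.bundle import ConfigParser

parser = ConfigParser()
# The YAML file is assumed to contain the section above plus a CONSTANTS section
# defining SPACING, PATCH_SIZE_VOXELS and NUM_CONTRASTIVE_CROPS.
parser.read_config("config.yaml")

# Resolves the "%CONSTANTS#..." references and the _target_ entries, returning the
# instantiated torchvision Compose with its two MONAI Compose children.
transform = parser.get_parsed_content("system#datasets#train#transform")
```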