[Metrics] Add Depth Loss #5392
@SkafteNicki I am kind of at a loss here. The depth loss as defined in the paper is given as

L(y, ŷ) = λ · L_depth(y, ŷ) + L_grad(y, ŷ) + L_SSIM(y, ŷ)

where L_depth is the pointwise L1 loss between the prediction and the ground truth. My question is: is depth loss a bit too specific to be included out of the box?
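For concreteness, a rough sketch of that composition in plain PyTorch (the `ssim_loss` callable and the `lam` default are placeholders, and `gradient_loss` is a simple finite-difference stand-in rather than necessarily the paper's exact formulation):

import torch
import torch.nn.functional as F


def gradient_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # L1 distance between finite-difference image gradients along H and W
    dy_p = pred[..., 1:, :] - pred[..., :-1, :]
    dx_p = pred[..., :, 1:] - pred[..., :, :-1]
    dy_t = target[..., 1:, :] - target[..., :-1, :]
    dx_t = target[..., :, 1:] - target[..., :, :-1]
    return F.l1_loss(dy_p, dy_t) + F.l1_loss(dx_p, dx_t)


def depth_loss(pred, target, ssim_loss, lam=0.1):
    # L(y, y_hat) = lam * L_depth(y, y_hat) + L_grad(y, y_hat) + L_SSIM(y, y_hat)
    return lam * F.l1_loss(pred, target) + gradient_loss(pred, target) + ssim_loss(pred, target)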
Hey @pranjaldatta, I would say it might be too specific. If @SkafteNicki feels the same way, let's close this issue. Otherwise, would you mind making a contribution? Best,
Sure, that works fine! Waiting for @SkafteNicki's POV
@pranjaldatta so I agree that the metric would be too specific to implement. However, I talked with @justusschock at some point about our metrics supporting composition, such that they can be added together, multiplied by constants, etc. In this way the metric in question here could be constructed by composing simpler metrics, along the lines of the sketch below.
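A hypothetical one-liner, with `lam`, `l1_metric`, `grad_metric`, and `ssim_metric` as placeholder names for a scalar weight and existing metric instances:

depth_metric = lam * l1_metric + grad_metric + ssim_metric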
However, we have hesitated to implement this because we needed good use cases, but the above constitutes a very good one IMO.
@SkafteNicki That's great to hear! If it's okay, I would like to work on a PR for this.
@pranjaldatta feel free to send a PR :]
Hi @pranjaldatta,

from numbers import Number
from typing import Optional

from pytorch_lightning.metrics import Metric


class CompositionalMetric(Metric):
    def __init__(
        self,
        metric_A: Optional[Metric] = None,
        metric_B: Optional[Metric] = None,
        metric_C: Optional[Metric] = None,
        metric_D: Optional[Metric] = None,
        alpha: float = 1.0,
        beta: float = 1.0,
    ):
        """Implements composition of metrics of the following form:

            alpha * A * B + beta * C / D

        where A, B, C, D are metrics and alpha, beta are scalars.

        Addition:
            CompositionalMetric(A, None, B, None)
        Multiplication by metrics:
            CompositionalMetric(A, B, None, None)
        Multiplication by scalar:
            CompositionalMetric(A, None, None, None, alpha)
        Division of metrics:
            CompositionalMetric(None, None, A, B)
        """
        super().__init__()
        if metric_A is None and metric_B is None and metric_C is None and metric_D is None:
            raise ValueError('At least one metric should be provided')
        # always assign (possibly None) so update/compute/reset can check for None
        self.metric_A = metric_A
        self.metric_B = metric_B
        self.metric_C = metric_C
        self.metric_D = metric_D
        self.alpha = alpha
        self.beta = beta

    def update(self, *args, **kwargs):
        if self.metric_A is not None:
            self.metric_A.update(*args, **kwargs)
        if self.metric_B is not None:
            self.metric_B.update(*args, **kwargs)
        if self.metric_C is not None:
            self.metric_C.update(*args, **kwargs)
        if self.metric_D is not None:
            self.metric_D.update(*args, **kwargs)

    def compute(self):
        if self.metric_A is not None:
            val_a = self.metric_A.compute()
        else:
            val_a = 1.0
        if self.metric_B is not None:
            val_b = self.metric_B.compute()
        else:
            val_b = 1.0
        # if neither metric_A nor metric_B is defined, the first term is 0
        if self.metric_A is None and self.metric_B is None:
            val_a = val_b = 0.0
        if self.metric_C is not None:
            val_c = self.metric_C.compute()
        else:
            val_c = 1.0
        if self.metric_D is not None:
            val_d = self.metric_D.compute()
            if val_d == 0.0:
                raise ValueError('Division by 0')
        else:
            val_d = 1.0
        # if neither metric_C nor metric_D is defined, the second term is 0
        if self.metric_C is None and self.metric_D is None:
            val_c = 0.0
            val_d = 1.0
        return self.alpha * val_a * val_b + self.beta * val_c / val_d

    def reset(self):
        if self.metric_A is not None:
            self.metric_A.reset()
        if self.metric_B is not None:
            self.metric_B.reset()
        if self.metric_C is not None:
            self.metric_C.reset()
        if self.metric_D is not None:
            self.metric_D.reset()

And then implement the different operators (presumably on the `Metric` base class, so that any two metrics compose):
def __add__(self, other):  # addition from left
    if isinstance(other, Metric):
        return CompositionalMetric(metric_A=self, metric_C=other)
    elif isinstance(other, Number):
        # NOTE: alpha * A * B + beta * C / D has no slot for a constant addend,
        # so this scalar case would need extra handling in compute()
        return CompositionalMetric(metric_A=self, beta=other)
    else:
        raise ValueError('invalid addition')

def __radd__(self, other):  # addition from right
    if isinstance(other, Metric):
        return CompositionalMetric(metric_A=other, metric_C=self)
    elif isinstance(other, Number):
        return CompositionalMetric(metric_A=self, beta=other)
    else:
        raise ValueError('invalid addition')

def __mul__(self, other):  # multiplication from left
    if isinstance(other, Metric):
        return CompositionalMetric(metric_A=self, metric_B=other)
    else:
        return CompositionalMetric(metric_A=self, alpha=other)

def __rmul__(self, other):  # multiplication from right
    if isinstance(other, Metric):
        return CompositionalMetric(metric_A=other, metric_B=self)
    else:
        return CompositionalMetric(metric_A=self, alpha=other)
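With these operators in place, a hypothetical usage mirroring the depth loss above (with `l1_metric`, `grad_metric`, and `ssim_metric` as placeholder metric instances) could be:

lam = 0.1
depth_metric = lam * l1_metric + grad_metric + ssim_metric
depth_metric.update(preds, target)
value = depth_metric.compute()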
Hey @SkafteNicki, that's a great idea! I do have a couple of questions regarding the same.
@SkafteNicki I would only add a compositional metric that takes two metrics and an operator. For more advanced things, you would then compose multiple compositional metrics. The interface would then look like this:

from typing import Callable, Union

import torch

from pytorch_lightning.metrics import Metric


class CompositionalMetric(Metric):
    def __init__(
        self,
        operator: Callable,
        metric_a: Union[Metric, int, float, torch.Tensor],
        metric_b: Union[Metric, int, float, torch.Tensor],
    ):
        super().__init__()
        self.op = operator
        # tensors are registered as buffers so they follow the metric across devices
        if isinstance(metric_a, torch.Tensor):
            self.register_buffer('metric_a', metric_a)
        else:
            self.metric_a = metric_a
        if isinstance(metric_b, torch.Tensor):
            self.register_buffer('metric_b', metric_b)
        else:
            self.metric_b = metric_b

    def update(self, *args, **kwargs):
        # maybe add some parsing for kwargs as in metric collection?
        if isinstance(self.metric_a, Metric):
            self.metric_a.update(*args, **kwargs)
        if isinstance(self.metric_b, Metric):
            self.metric_b.update(*args, **kwargs)

    def compute(self, *args, **kwargs):
        # also some parsing for kwargs?
        if isinstance(self.metric_a, Metric):
            val_a = self.metric_a.compute(*args, **kwargs)
        else:
            val_a = self.metric_a
        if isinstance(self.metric_b, Metric):
            val_b = self.metric_b.compute(*args, **kwargs)
        else:
            val_b = self.metric_b
        return self.op(val_a, val_b)

    def reset(self):
        if isinstance(self.metric_a, Metric):
            self.metric_a.reset()
        if isinstance(self.metric_b, Metric):
            self.metric_b.reset()

and the implementation of `__add__` would look like this:

def __add__(self, other: Union[Metric, int, float, torch.Tensor]):
    return CompositionalMetric(torch.add, self, other)

That way you could also add some more complex operators if necessary, and it supports all types that the torch operators support out of the box.
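A quick usage sketch under the same assumptions (`l1_metric` and `ssim_metric` are hypothetical metric instances; composition nests into a small tree of CompositionalMetric nodes):

scaled = CompositionalMetric(torch.mul, torch.tensor(0.1), l1_metric)  # 0.1 * L1
combined = scaled + ssim_metric  # CompositionalMetric(torch.add, scaled, ssim_metric)
combined.update(preds, target)
print(combined.compute())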
@justusschock your proposed solution is so much better than mine. I propose that you create the PR for this feature :]
Gotcha!
Closing this as PR #5464 now allows for composition of such complex metrics. |
🚀 Feature
Add depth loss to PT-Lightning, since image gradients are being added. As discussed here.
Motivation
Depth loss is one of the metrics often used in depth-estimation papers (DenseDepth, etc.). Adding depth loss to PT-Lightning out of the box would greatly help developers and researchers.
Pitch
Make depth loss available out of the box in PT-Lightning.
Additional context
This is a continuation of PR #5056