[Metrics] Add Depth Loss #5392

Closed
pranjaldatta opened this issue Jan 7, 2021 · 12 comments · Fixed by #5464
@pranjaldatta (Contributor) commented Jan 7, 2021

🚀 Feature

Add depth loss to PyTorch Lightning, since image gradients are being added. As discussed here.

Motivation

Depth loss is one of the metrics often used in depth-estimation papers (DenseDepth, etc.). Adding depth loss to PyTorch Lightning out of the box would greatly help developers and researchers.

Pitch

Make depth loss available out of the box in PyTorch Lightning.

Additional context

This is a continuation of PR #5056

pranjaldatta added the feature (Is an improvement or enhancement) and help wanted (Open to be worked on) labels Jan 7, 2021
SkafteNicki added this to the 1.2 milestone Jan 7, 2021
@pranjaldatta (Contributor, Author)

@SkafteNicki I am kind of at a loss here. The depth loss, as defined in the paper, is given as follows:

L(y, ŷ) = λ·L_depth(y, ŷ) + L_grad(y, ŷ) + L_SSIM(y, ŷ)

where:
- L_depth is the pointwise L1 loss between the prediction and the ground truth,
- L_grad is the pointwise L1 loss between the image gradients of the prediction and the ground truth,
- L_SSIM is the SSIM-based loss between the prediction and the ground truth.

My question is, is depth-loss a bit too specific to be included out of the box?
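
For concreteness, a minimal plain-PyTorch sketch of this composite loss (my own sketch, not the paper's code; the `ssim` argument is assumed to be a user-supplied callable returning a scalar similarity tensor, and the gradient term uses simple finite differences):

import torch
import torch.nn.functional as F

def image_gradients(img):
    # finite-difference gradients along height and width
    dy = img[..., 1:, :] - img[..., :-1, :]
    dx = img[..., :, 1:] - img[..., :, :-1]
    return dy, dx

def depth_loss(pred, target, ssim, lam=0.1):
    l_depth = F.l1_loss(pred, target)
    dy_p, dx_p = image_gradients(pred)
    dy_t, dx_t = image_gradients(target)
    l_grad = F.l1_loss(dy_p, dy_t) + F.l1_loss(dx_p, dx_t)
    l_ssim = torch.clamp((1 - ssim(pred, target)) / 2, 0, 1)  # similarity -> loss
    return lam * l_depth + l_grad + l_ssim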

@tchaton (Contributor) commented Jan 7, 2021

Hey @pranjaldatta,

I would say it might be too specific. If @SkafteNicki feels the same way, let's close this issue. Otherwise, would you mind making a contribution?

Best,
T.C

@pranjaldatta (Contributor, Author)

Sure, that works fine! Waiting for @SkafteNicki's POV.

@SkafteNicki (Member)

@pranjaldatta so I agree that the metric would be too specific to implement. However, I talked with @justusschock at some point about our metrics supporting composition, such that they can be added together, multiplied by constants, etc. That way, the metric in question here could be constructed as

new_metric = lam * pl.metrics.MeanAbsoluteError() + pl.metrics.GradientLoss() + pl.metrics.SSIM()

(with `lam` a scalar weight; `lambda` itself is a reserved word in Python). However, we have hesitated to implement this because we needed good use cases, and the above constitutes a very good one IMO.
Therefore, let's keep the issue open for now, so we don't forget about this feature.
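
A sketch of how such a composed metric would behave once composition is implemented (`GradientLoss` is hypothetical; `MeanAbsoluteError` and `SSIM` already exist in `pl.metrics`):

import torch
import pytorch_lightning as pl

lam = 0.1  # scalar weight for the L1 term
new_metric = lam * pl.metrics.MeanAbsoluteError() \
    + pl.metrics.GradientLoss() \
    + pl.metrics.SSIM()

preds = torch.rand(2, 1, 64, 64)
target = torch.rand(2, 1, 64, 64)

# behaves like any other metric: update() accumulates state in each child,
# compute() evaluates the whole expression
new_metric.update(preds, target)
value = new_metric.compute()
new_metric.reset()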

@pranjaldatta (Contributor, Author)

@SkafteNicki That's great to hear. If it's okay, I would like to work on a PR for this.

@SkafteNicki (Member)

@pranjaldatta feel free to send a PR :]

@SkafteNicki (Member)

Hi @pranjaldatta,
Just played around with it a bit. I would propose we add something like the following class:

from typing import Optional

from pytorch_lightning.metrics import Metric


class CompositionalMetric(Metric):
    def __init__(
            self,
            metric_A: Optional[Metric] = None,
            metric_B: Optional[Metric] = None,
            metric_C: Optional[Metric] = None,
            metric_D: Optional[Metric] = None,
            alpha: float = 1.0,
            beta: float = 1.0
    ):
        """ Implements composition of metrics of the following form:
                alpha * A * B  + beta * C / D
            where A, B, C, D are metrics and alpha, beta are scalars.
            
            Addition:
                CompositionalMetric(A, None, B, None)
            Multiplication by metrics
                CompositionalMetric(A, B, None, None)
            Multiplication by scalar
                CompositionalMetric(A, None, None, None, alpha)
            Division of metrics
                CompositionalMetric(None, None, A, B)
        """
        super().__init__()
        if metric_A is None and metric_B is None and metric_C is None and metric_D is None:
            raise ValueError('At least one metric should be provided')
        # assign unconditionally so the `is not None` checks in update/compute
        # never hit an AttributeError
        self.metric_A = metric_A
        self.metric_B = metric_B
        self.metric_C = metric_C
        self.metric_D = metric_D

        self.alpha = alpha
        self.beta = beta
        
    def update(self, *args, **kwargs):
        if self.metric_A is not None:
            self.metric_A.update(*args, **kwargs)
        if self.metric_B is not None:
            self.metric_B.update(*args, **kwargs)
        if self.metric_C is not None:
            self.metric_C.update(*args, **kwargs)
        if self.metric_D is not None:
            self.metric_D.update(*args, **kwargs)
        
    def compute(self):
        if self.metric_A is not None:
            val_a = self.metric_A.compute()
        else:
            val_a = 1.0
        if self.metric_B is not None:
            val_b = self.metric_B.compute()
        else:
            val_b = 1.0
        # If neither metric_A nor metric_B is defined, zero out the first term
        if self.metric_A is None and self.metric_B is None:
            val_a = val_b = 0.0
        if self.metric_C is not None:
            val_c = self.metric_C.compute()
        else:
            val_c = 1.0
        if self.metric_D is not None:
            val_d = self.metric_D.compute()
            if val_d == 0.0:
                raise ValueError('Division by 0')
        else:
            val_d = 1.0
        # If neither metric_C nor metric_D is defined, zero out the second term
        if self.metric_C is None and self.metric_D is None:
            val_c = 0.0
            val_d = 1.0
            
        return self.alpha * val_a * val_b + self.beta * val_c / val_d
    
    def reset(self):
        if self.metric_A is not None:
            self.metric_A.reset()
        if self.metric_B is not None:
            self.metric_B.reset()
        if self.metric_C is not None:
            self.metric_C.reset()
        if self.metric_D is not None:
            self.metric_D.reset()

And then implement the different `__add__`, `__mul__`, etc. methods in the base metric class as:

    def __add__(self, other): # addition from left
        if isinstance(other, Metric):
            return CompositionalMetric(metric_A = self, 
                                       metric_C = other)
        elif isinstance(other, Number):
            return CompositionalMetric(metric_A = self,
                                       beta = other)
        else:
            raise ValueError('invalid addition')
    
    def __radd__(self, other): # addition from right
        if isinstance(other, Metric):
            return CompositionalMetric(metric_A = other,
                                       metric_C = self)
        elif isinstance(other, Number):
            return CompositionalMetric(metric_A = self,
                                       beta = other)
        else:
            raise ValueError('invalid addition')
    
    def __mul__(self, other): # multiplication from left
        if isinstance(other, Metric):
            return CompositionalMetric(metric_A = self,
                                       metric_B = other)
        else:
            return CompositionalMetric(metric_A = self,
                                       alpha = other)
    
    def __rmul__(self, other): # multiplication from right
        if isinstance(other, Metric):
            return CompositionalMetric(metric_A = other,
                                       metric_B = self)
        else:
            return CompositionalMetric(metric_A = self, 
                                       alpha = other)
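
With the four-slot form above, chained expressions would expand into nested CompositionalMetrics, roughly like this (a sketch; `GradientLoss` is hypothetical):

import pytorch_lightning as pl

mae = pl.metrics.MeanAbsoluteError()
grad = pl.metrics.GradientLoss()  # hypothetical
ssim = pl.metrics.SSIM()

# 0.1 * mae    -> CompositionalMetric(metric_A=mae, alpha=0.1)        via __rmul__
# prev + grad  -> CompositionalMetric(metric_A=prev, metric_C=grad)   via __add__
# prev + ssim  -> CompositionalMetric(metric_A=prev, metric_C=ssim)   via __add__
depth_loss = 0.1 * mae + grad + ssim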

@pranjaldatta (Contributor, Author)

Hey @SkafteNicki, that's a great idea! I do have a couple of questions:

  1. Just to be clear, do you propose this as a general feature that lets users build compositional metrics, or do you propose this only for this specific loss function?

  2. Regarding the skeleton metric equation `alpha * A * B + beta * C / D`, wouldn't it be better if the skeleton equation were something along the lines of `alpha * MetricA + beta * MetricB + gamma * MetricC + ...`? I say this because isn't it more likely that compositional metrics would most commonly be composed via weighted addition (like `L(y, ŷ) = λ·L_depth(y, ŷ) + L_grad(y, ŷ) + L_SSIM(y, ŷ)`) rather than by multiplication or division of resultant metric values? I may be wrong though.

@justusschock (Member)

@SkafteNicki I would only add a compositional metric that gets two metrics and an operation. For more advanced things, you would then compose multiple compositional metrics. The interface would look like this:

from typing import Union

import torch

from pytorch_lightning.metrics import Metric


class CompositionalMetric(Metric):
    def __init__(
        self,
        operator,
        metric_a: Union[Metric, int, float, torch.Tensor],
        metric_b: Union[Metric, int, float, torch.Tensor],
    ):
        super().__init__()

        self.op = operator

        if isinstance(metric_a, torch.Tensor):
            self.register_buffer('metric_a', metric_a)
        else:
            self.metric_a = metric_a

        if isinstance(metric_b, torch.Tensor):
            self.register_buffer('metric_b', metric_b)
        else:
            self.metric_b = metric_b

    def update(self, *args, **kwargs):

        # maybe add some parsing for kwargs as in metric collection?
        if isinstance(self.metric_a, Metric):
            self.metric_a.update(*args, **kwargs)

        if isinstance(self.metric_b, Metric):
            self.metric_b.update(*args, **kwargs)

    def compute(self, *args, **kwargs):
    
        # also some parsing for kwargs?
        if isinstance(self.metric_a, Metric):
            val_a = self.metric_a.compute(*args, **kwargs)
        else:
            val_a = self.metric_a

        if isinstance(self.metric_b, Metric):
            val_b = self.metric_b.compute(*args, **kwargs)
        else:
            val_b = self.metric_b

        return self.op(val_a, val_b)


    def reset(self):
        if isinstance(self.metric_a, Metric):
            self.metric_a.reset()

        if isinstance(self.metric_b, Metric):
            self.metric_b.reset()

and the implementation of `__add__` would look like this:

def __add__(self, other: Union[Metric, int, float, torch.Tensor]):
    return CompositionalMetric(torch.add, self, other)

That way you could also add some more complex operators if necessary, and it supports all types the torch operators support out of the box.
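
To illustrate (a sketch under the same assumptions, with `GradientLoss` hypothetical): the operator overloads would then build a tree of nested two-operand CompositionalMetrics, e.g.

import torch
import pytorch_lightning as pl

mae = pl.metrics.MeanAbsoluteError()
grad = pl.metrics.GradientLoss()  # hypothetical
ssim = pl.metrics.SSIM()

# what `0.1 * mae + grad + ssim` would desugar to:
depth_loss = CompositionalMetric(
    torch.add,
    CompositionalMetric(
        torch.add,
        CompositionalMetric(torch.mul, torch.tensor(0.1), mae),
        grad,
    ),
    ssim,
)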

@SkafteNicki (Member)

> Hey @SkafteNicki, that's a great idea! I do have a couple of questions:
>
> 1. Just to be clear, do you propose this as a general feature that lets users build compositional metrics, or do you propose this only for this specific loss function?
>
> 2. Regarding the skeleton metric equation `alpha * A * B + beta * C / D`, wouldn't it be better if the skeleton equation were something along the lines of `alpha * MetricA + beta * MetricB + gamma * MetricC + ...`? I say this because isn't it more likely that compositional metrics would most commonly be composed via weighted addition (like `L(y, ŷ) = λ·L_depth(y, ŷ) + L_grad(y, ŷ) + L_SSIM(y, ŷ)`) rather than by multiplication or division of resultant metric values? I may be wrong though.

  1. General feature.
  2. The idea would be that the user would not have to use CompositionalMetric themselves, since we would implement `__add__`, `__mul__`, etc. in the base class, allowing users to simply add metrics together: `metric = metric_1 + metric_2`.

@justusschock your proposed solution is so much better than mine. I propose that you create the PR for this feature :]

@pranjaldatta (Contributor, Author)

> 1. General feature.
> 2. The idea would be that the user would not have to use CompositionalMetric themselves, since we would implement `__add__`, `__mul__`, etc. in the base class, allowing users to simply add metrics together: `metric = metric_1 + metric_2`.

Gotcha!

justusschock mentioned this issue Jan 11, 2021
SkafteNicki linked a pull request Jan 12, 2021 that will close this issue
@SkafteNicki (Member)

Closing this as PR #5464 now allows for composition of such complex metrics.
