squeeze last dim of ratios
bkmi committed Apr 9, 2024
1 parent 3b8afae · commit 0e62d37
Showing 1 changed file with 2 additions and 1 deletion: sbi/neural_nets/ratio_estimators.py
```diff
@@ -64,6 +64,7 @@ def unnormalized_log_ratio(self, theta: Tensor, x: Tensor, **kwargs) -> Tensor:
         Returns:
             Sample-wise unnormalized log ratios.
+            Just like log_prob, the last dimension should be squeezed.
         """
 
         raise NotImplementedError
@@ -104,4 +105,4 @@ def unnormalized_log_ratio(self, theta: Tensor, x: Tensor) -> Tensor:
         z = self.combine_embedded_theta_and_x(
             self.embedding_net_theta(theta), self.embedding_net_x(x)
         )
-        return self.net(z)
+        return self.net(z).squeeze(-1)
```
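The change aligns the ratio network's output shape with the convention `log_prob` follows: one scalar per sample, with no trailing singleton dimension. A minimal sketch of the effect, using a stand-in linear layer (`net`, `z`, and the shapes here are hypothetical illustrations, not the sbi API):

```python
import torch
from torch import nn

# Minimal sketch of the shape fix (stand-in names, not the sbi API):
# a ratio network maps combined embeddings to one scalar per sample,
# so its raw output carries a trailing singleton dimension.
net = nn.Linear(16, 1)   # stand-in for self.net
z = torch.randn(32, 16)  # stand-in for the combined embeddings

raw = net(z)                  # shape: (32, 1)
log_ratios = raw.squeeze(-1)  # shape: (32,), matching log_prob's convention

assert raw.shape == (32, 1)
assert log_ratios.shape == (32,)
```

Dropping the trailing dimension matters because broadcasting a `(batch, 1)` tensor against a `(batch,)` tensor silently yields a `(batch, batch)` result, which can corrupt downstream computations without raising an error.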
