
🧨 💎 Add literal interaction and outline for tests #245

Merged · 39 commits merged into master on

Conversation

@cthoyt (Member) commented Jan 22, 2021

Closes #244. This PR is part of the break-up effort on #107.

This PR implements literal interaction modules, which wrap another interaction module together with a "combination" function. The combination takes the representations for the entities and the representations for the literals and produces a combined representation; the implementation is up to the user, the simplest choice being to concatenate them.

There's a bit of goofiness to the typing, since it's hard to express these types dynamically. This makes it difficult to support any interaction function beyond the simple ones that use a single embedding each for head, relation, and tail.
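
To make the intended shape of the API concrete, here is a minimal sketch of such a wrapper, assuming head and tail arrive as (embedding, literal features) pairs; the class name and signature are illustrative, not the PR's actual code:

import torch
from torch import nn

class LiteralInteractionSketch(nn.Module):
    """Hypothetical wrapper: combine (embedding, literals) pairs, then delegate."""

    def __init__(self, base: nn.Module, combination: nn.Module):
        super().__init__()
        self.base = base                # the wrapped interaction module
        self.combination = combination  # merges entity and literal representations

    def forward(self, h, r, t):
        # h and t are pairs (embedding, literal features); r is a plain embedding.
        h = self.combination(*h)
        t = self.combination(*t)
        return self.base(h=h, r=r, t=t)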

Before Merge

  • Make tests work!
  • Do we need regularizers for LiteralE in general, or for DistMultLiteral or ComplExLiteral?
  • Should we reorganize the Combination class to take the unconcatenated representations?

input_dropout: float = 0.0,
):
linear = nn.Linear(embedding_dim + num_of_literals, embedding_dim)
dropout = nn.Dropout(input_dropout)
@cthoyt (Member, Author) commented:

@mberr note the dropout comes after the linear layer for DistMult but before it for ComplEx. Was this correct?
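
For reference, the two orderings under discussion look roughly like this (a sketch with example hyperparameters, not the PR's code):

from torch import nn

embedding_dim, num_of_literals, input_dropout = 50, 10, 0.1  # example values

# DistMultLiteral: dropout applied after the linear projection
distmult_combination = nn.Sequential(
    nn.Linear(embedding_dim + num_of_literals, embedding_dim),
    nn.Dropout(input_dropout),
)

# ComplExLiteral: dropout applied before the linear projection
complex_combination = nn.Sequential(
    nn.Dropout(input_dropout),
    nn.Linear(embedding_dim + num_of_literals, embedding_dim),
)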

mberr and others added 3 commits March 16, 2021 15:06
…tion-modules

# Conflicts:
#	src/pykeen/models/multimodal/complex_literal.py
#	src/pykeen/models/multimodal/distmult_literal.py
embedding_dim=embedding_dim,
initializer=nn.init.xavier_normal_,
dtype=torch.complex64,
# TODO: verify
@cthoyt (Member, Author) commented:

This needs to be written up a bit differently. No regularization was reported for this model, right? So it's time to remove it.

@mberr (Member) commented Mar 16, 2021

Maybe we should use a mixin here? Maybe something like this:

from typing import Callable

import torch
from torch import nn

from pykeen.nn.modules import DistMultInteraction, Interaction

class EntityCombinationMixin(Interaction):
    def __init__(self, combination: Callable[..., torch.Tensor], *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.combination = combination

    def forward(self, h, r, t):
        # h and t arrive as pairs (embedding, literals); combine each into one tensor.
        h = self.combination(h)
        t = self.combination(t)
        return super().forward(h=h, r=r, t=t)

class LinearProjectionCombination(EntityCombinationMixin):
    def __init__(self, embedding_dim, num_of_literals, *args, **kwargs):
        # Call super().__init__() first: nn.Module attributes can only be assigned
        # after nn.Module.__init__() has run. The lambda looks up self.projection
        # lazily at call time, so this ordering is safe.
        super().__init__(combination=lambda x: self.projection(torch.cat(x, dim=-1)), *args, **kwargs)
        self.projection = nn.Linear(embedding_dim + num_of_literals, embedding_dim)

class DistMultLiteral(EntityCombinationMixin, DistMultInteraction):
    pass

@cthoyt (Member, Author) commented Mar 16, 2021

> Maybe we should use a mixin here? Maybe something like this: […]

Cool idea, but let's make sure the typing is done right. The h = cat(h) line is super non-intuitive if you don't realize that h is a pair of embeddings before the combination is applied.

@mberr (Member) commented Mar 16, 2021

> Cool idea, but let's make sure the typing is done right. The h = cat(h) line is super non-intuitive if you don't realize that h is a pair of embeddings before the combination is applied.

I am not sure whether the cat should be part of the "interface", or whether we should keep it like above, where linear(cat(...)) is one of the possible combination choices.
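
To make the two interface options concrete, here is a sketch (names illustrative): in the first, the combination receives the unconcatenated representations and cat-then-project is just one possible implementation; in the second, concatenation is fixed in the interface and the combination only ever sees the merged tensor.

import torch
from torch import nn

class CombineUnconcatenated(nn.Module):
    """Option A: the combination receives the raw pair and merges it itself."""

    def __init__(self, embedding_dim: int, num_of_literals: int):
        super().__init__()
        self.projection = nn.Linear(embedding_dim + num_of_literals, embedding_dim)

    def forward(self, x: torch.Tensor, literals: torch.Tensor) -> torch.Tensor:
        # Concatenation is an internal detail; convolution etc. could go here instead.
        return self.projection(torch.cat([x, literals], dim=-1))

class CombinePreconcatenated(nn.Module):
    """Option B: concatenation lives in the interface; only the projection remains."""

    def __init__(self, embedding_dim: int, num_of_literals: int):
        super().__init__()
        self.projection = nn.Linear(embedding_dim + num_of_literals, embedding_dim)

    def forward(self, x_cat: torch.Tensor) -> torch.Tensor:
        return self.projection(x_cat)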

@cthoyt (Member, Author) commented Mar 16, 2021

@mberr you're thinking of convolution too, right? I'm not the only one who thinks that would be cool (but it's probably ultimately not helpful 😆).

@cthoyt cthoyt marked this pull request as ready for review March 23, 2021 17:58
@cthoyt cthoyt added the enhancement New feature or request label Apr 12, 2021
@cthoyt cthoyt requested a review from mberr April 23, 2021 16:49
super().__init__()
self.base = interaction_resolver.make(base, base_kwargs)
self.combination = combination
self.entity_shape = tuple(self.base.entity_shape) + ("e",)
@mberr (Member) commented:

Do we need to make sure that "e" is unused?

@cthoyt (Member, Author) replied:

Yes, there needs to be an extra dimension floating around here, which gets added in by the literal model.
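
For context, a rough illustration of what appending "e" to the symbolic entity shape means (the exact shape semantics are assumed from the surrounding discussion, not taken from the PR):

# The base interaction declares one symbolic shape per entity representation,
# e.g. ("d",) for a single embedding of dimension d. Appending "e" declares a
# second, separately sized representation per entity: the literal feature vector.
base_entity_shape = ("d",)
literal_entity_shape = base_entity_shape + ("e",)  # -> ("d", "e")
# The literal model then supplies (embedding, literals) per entity, and the
# combination merges the pair before the base interaction scores the triple.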

@cthoyt (Member, Author) added:

Added more docs for this in c7b9bc1

scores_f = self.cls.func(**kwargs).view(-1)
else:
kwargs = dict(h=h, r=r, t=t)
scores_f = self.instance(h=h, r=r, t=t)
@mberr (Member) commented:

no need for view(-1) here?

@cthoyt (Member, Author) replied:

it works without it... sooooooo... you tell me

@mberr (Member) left a review:

Besides that, looks good to me! Great to have them integrated into testing.

@mberr mberr merged commit 07e3c29 into master Apr 25, 2021
@mberr mberr deleted the add-literal-interaction-modules branch April 25, 2021 12:32
Labels: enhancement (New feature or request)
Linked issues: 🧨 💎 Add literal interaction modules
2 participants