It'd be helpful if the encoder and decoder tree models took the programming language's grammar into account. This matters most in the decoder, where it would force the model to construct only grammatically correct programs; it could also be interesting for the encoder, since it may lead to better learned representations. To do this, each non-terminal in the grammar would need its own tree LSTM cell in the decoder, and each cell would need a softmax over that non-terminal's production rules to decide how to expand its children. The dimensionality of the softmax would be determined by the number of ways the non-terminal can be expanded. The encoder tree LSTM would still need one cell per non-terminal, but no softmaxes should be necessary.
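A minimal sketch of what the decoder side could look like, assuming PyTorch; the class name, `grammar_rule_counts`, and the rule counts are all hypothetical and only meant to illustrate one cell per non-terminal with a softmax sized by that non-terminal's expansions:

```python
# Hypothetical sketch (PyTorch assumed): one decoder cell per non-terminal,
# each with a softmax over that non-terminal's production rules.
import torch
import torch.nn as nn


class NonTerminalDecoderCell(nn.Module):
    def __init__(self, hidden_size, num_rules):
        super().__init__()
        # LSTM cell holding this non-terminal's decoder state.
        self.cell = nn.LSTMCell(hidden_size, hidden_size)
        # Softmax dimensionality = number of ways this non-terminal can expand.
        self.rule_scorer = nn.Linear(hidden_size, num_rules)

    def forward(self, x, state):
        h, c = self.cell(x, state)
        # Distribution over production rules; the sampled/argmax rule
        # determines which child non-terminals to expand next.
        rule_probs = torch.softmax(self.rule_scorer(h), dim=-1)
        return rule_probs, (h, c)


# One cell per non-terminal in the grammar (rule counts here are illustrative).
grammar_rule_counts = {"stmt": 5, "expr": 8, "term": 3}
decoder_cells = nn.ModuleDict({
    nt: NonTerminalDecoderCell(hidden_size=128, num_rules=n)
    for nt, n in grammar_rule_counts.items()
})
```

The encoder would use the same per-non-terminal cells but drop the `rule_scorer`, since it only needs to compose representations, not choose expansions.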