
--marginal output does not respect --invert_hash #3496

Closed
jackgerrits opened this issue Dec 1, 2021 · 2 comments · Fixed by #3554
Labels: Bug (Bug in learning semantics, critical by default), Good First Issue
jackgerrits (Member) commented on Dec 1, 2021:

../build/vowpalwabbit/vw --marginal m --noconstant --initial_numerator 0.5 --initial_denominator 1.0 --decay 0.001 --readable_model readable_model.txt

Data:

0.5 |m constant id1
1.0 |m constant id2
0.25 |m constant id3
0.4 |m constant id1
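For reference, the marginal values that appear in the model dumps below follow directly from this data: each feature's numerator starts at --initial_numerator and accumulates the labels, while its denominator starts at --initial_denominator and counts the examples. A simplified sketch of that arithmetic (this ignores --decay for clarity and is not VW's actual implementation):

```python
# Simplified sketch (NOT VW's code): accumulate per-feature marginal
# statistics the way the readable model reports them, ignoring --decay.
from collections import defaultdict

def marginal_stats(examples, initial_numerator=0.5, initial_denominator=1.0):
    # examples: list of (label, feature_name) pairs
    stats = defaultdict(lambda: [initial_numerator, initial_denominator])
    for label, feature in examples:
        stats[feature][0] += label  # numerator accumulates labels
        stats[feature][1] += 1.0    # denominator counts examples
    return {f: (n, d) for f, (n, d) in stats.items()}

examples = [(0.5, "id1"), (1.0, "id2"), (0.25, "id3"), (0.4, "id1")]
print(marginal_stats(examples))
# id1 -> (1.4, 3.0), id2 -> (1.5, 2.0), id3 -> (0.75, 2.0),
# matching the id1:1.4:3, id2:1.5:2, id3:0.75:2 lines in the dumps below
```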

Observed invert hash model:

Version 8.11.0
Id 
Min label:0
Max label:1
bits:18
lda:0
0 ngram:
0 skip:
options: --marginal m
Checksum: 1964076403
marginals size = 3
262109:0.75:2
134578:1.5:2
251020:1.4:3
:0
m^constant:6788:0.877014

Expected invert hash model:

Version 8.11.0
Id 
Min label:0
Max label:1
bits:18
lda:0
0 ngram:
0 skip:
options: --marginal m
Checksum: 1964076403
marginals size = 3
id3:0.75:2
id2:1.5:2
id1:1.4:3
:0
m^constant:6788:0.877014

The marginals section contains feature hashes instead of the corresponding feature names (`id1`, `id2`, `id3`), even though the model was written with hash inversion enabled.
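Conceptually, the fix is to resolve each marginal's feature hash back to its name through the invert-hash table before writing the line, falling back to the raw hash when no name is known. A hypothetical sketch of that output step (function and table names invented here, not taken from VW's source):

```python
# Hypothetical sketch (names invented, not VW's actual code): format the
# marginals section using an invert-hash table to map hashes to names.
def format_marginals(marginals, hash_to_name):
    # marginals: {feature_hash: (numerator, denominator)}
    lines = []
    for h, (num, den) in marginals.items():
        name = hash_to_name.get(h, str(h))  # fall back to the raw hash
        lines.append(f"{name}:{num:g}:{den:g}")
    return lines

# Values taken from the observed/expected dumps above.
hash_to_name = {262109: "id3", 134578: "id2", 251020: "id1"}
marginals = {262109: (0.75, 2), 134578: (1.5, 2), 251020: (1.4, 3)}
print("\n".join(format_marginals(marginals, hash_to_name)))
# id3:0.75:2
# id2:1.5:2
# id1:1.4:3
```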

jackgerrits added the Bug (Bug in learning semantics, critical by default) and Good First Issue labels on Dec 1, 2021.
varunbankar commented: Hello! Can I work on this issue?

jackgerrits (Member, Author) commented: Sure thing!
