
There are no code files in this repo? #4

Open
yangxudong opened this issue Feb 28, 2019 · 17 comments

Comments

@yangxudong

Why is this repo empty?

@tiandunx

I've already implemented SV-X-Softmax and it works fine!

@LaviLiu

LaviLiu commented Mar 11, 2019

@tiandunx Would you like to share your code on GitHub?

@tiandunx

@LaviLiu Sorry for the late reply. Of course I'm willing to share my implementation.

@ysc703

ysc703 commented Mar 12, 2019

@tiandunx We are waiting for you! Thanks^_^

@twmht

twmht commented Mar 15, 2019

@tiandunx

any update?

@tiandunx

Hi all, I'm so sorry for keeping you waiting. I shall release the code after ICCV. Sorry again. Thanks for keeping an eye on this paper.

@luameows

@tiandunx Would you mind telling me your LFW accuracy? I implemented it on MNIST, and the classification accuracy was not very good. As for face recognition, the code is still running, so I don't have results yet.

@tiandunx

@luameows Sure, on LFW, I achieved 99.866% using MS only.

@ysc703

ysc703 commented Mar 15, 2019

Hi all, I'm so sorry for keeping you waiting. I shall release the code after ICCV. Sorry again. Thanks for keeping an eye upon this paper.

@tiandunx After Nov 3, 2019? 😲😲😲

@tiandunx

At the end of this month.

@wjgaas

wjgaas commented Apr 1, 2019

@tiandunx Which backbone network did you use to achieve 99.866% on LFW? ResNet-50 or Attention-56?

@twmht

twmht commented Apr 2, 2019

@tiandunx

What is the value of t you used? And what is the margin loss function you used?

@tiandunx

tiandunx commented Apr 4, 2019

@twmht t = 1.2, and the margin loss is additive margin softmax (AM-Softmax).

@DevilCat

DevilCat commented Apr 9, 2019

At the end of this month.

Can the code be shared now? I'd really appreciate it.

@tiandunx

tiandunx commented Apr 9, 2019

The code is available now.

@twmht

twmht commented Apr 10, 2019

@tiandunx

thanks for sharing.

So you use all the default values when setting is_am=True?

def __init__(self, feat_dim, num_class, is_am, margin=0.45, mask=1.12, scale=32):
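For readers following along, here is a minimal NumPy sketch of what an SV-AM-Softmax head with that signature presumably computes. This is an assumption-laden reconstruction, not the repo's code: it assumes `mask` plays the role of the paper's t, that the positive logit uses the additive margin cos θ_y − m, and that a negative class k is re-weighted as t·cos θ_k + t − 1 whenever cos θ_k > cos θ_y − m (i.e., it is a "support vector" for that sample).

```python
import numpy as np

def sv_am_softmax_logits(cos_theta, labels, margin=0.45, t=1.2, scale=32.0):
    """Sketch of SV-AM-Softmax logit computation (hypothetical helper).

    cos_theta: (N, C) cosine similarities between normalized features
               and normalized class weights.
    labels:    (N,) ground-truth class indices.
    Returns scaled logits to feed into a standard cross-entropy loss.
    """
    logits = cos_theta.copy()
    rows = np.arange(len(labels))
    # AM-Softmax positive logit: cos(theta_y) - m
    target = cos_theta[rows, labels] - margin
    # Support-vector (hard) negatives: cos(theta_k) > cos(theta_y) - m
    hard = cos_theta > target[:, None]
    hard[rows, labels] = False
    # Re-weight hard negatives: t * cos(theta_k) + t - 1
    logits[hard] = t * cos_theta[hard] + t - 1.0
    logits[rows, labels] = target
    return scale * logits

# Example: class 1 (cos 0.5) exceeds 0.8 - 0.45 = 0.35, so it is re-weighted.
out = sv_am_softmax_logits(np.array([[0.8, 0.5, 0.2]]), np.array([0]))
# -> [[11.2, 25.6, 6.4]]
```

With the defaults above, only negatives that beat the margin-adjusted target logit are penalized harder; easy negatives pass through unchanged, which matches the thread's reported settings (t = 1.2, AM margin).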

@wjgaas

wjgaas commented Apr 11, 2019

@tiandunx Hi, which backbone network did you use to achieve 99.866% on LFW? ResNet-50 or Attention-56?
