I tested your model in real life against real and spoofed photos and videos of myself and some of my friends.
The rate of false negatives for the real class (predicting spoof where it should predict real) was quite astonishing.
This might be because the CelebA-Spoof dataset is biased toward white people in general, or because the model is too weak to focus on actual spoofing cues. A pattern-based model might fare better for people of other ethnicities.
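To make the "false negatives for real" claim concrete, here is a minimal sketch (not from the original report) of the metric being described: the fraction of genuinely real faces that the model labels as spoof. The label convention `1 = real, 0 = spoof` is an assumption for illustration; the actual model may encode classes differently.

```python
# Hypothetical sketch: false-negative rate for the "real" class,
# i.e. how often genuine faces are misclassified as spoof.
# Label convention assumed here: 1 = real, 0 = spoof.

def real_false_negative_rate(y_true, y_pred):
    """Fraction of truly-real samples that the model called spoof."""
    real_total = sum(1 for t in y_true if t == 1)
    if real_total == 0:
        return 0.0
    missed = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return missed / real_total

# Example: 4 real faces, model flags 3 of them as spoof
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [0, 0, 0, 1, 0, 0]
print(real_false_negative_rate(y_true, y_pred))  # 0.75
```

Reporting this per demographic group (rather than one aggregate number) is what would surface the kind of bias described above.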
imr555 changed the title from "Model Classifies all brown or dark people as spoof." to "Model Classifies all brown or dark people as spoof" on Oct 18, 2021