
This is a lucky demo: when I change the data generation process, the variance prediction is wrong #12

Open
conan-ux opened this issue Nov 26, 2020 · 2 comments


conan-ux commented Nov 26, 2020

I think this is a lucky demo. When I change the data generation code, the optimization is guided wrongly and the variance prediction is wrong.

So I think this uncertainty method only works in situations where the residuals are on the same scale as the precision and log-variance terms. If their values are at different scales, the method breaks.

```python
def gen_data(N):
    # Q, D1, D2 are defined globally in the demo's setup
    X = np.random.randn(N, Q)

    w1 = 2. * 1e2
    b1 = 8. * 1e2
    sigma1 = 10  # ground truth
    Y1 = X.dot(w1) + b1 + sigma1 * np.random.randn(N, D1)

    w2 = 3 * 1e2
    b2 = 3 * 1e2
    sigma2 = 1 * 1e2  # ground truth
    Y2 = X.dot(w2) + b2 + sigma2 * np.random.randn(N, D2)

    return X, Y1, Y2
```
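For context, the per-task objective the issue seems to refer to (a Kendall-and-Gal-style uncertainty weighting) has the form exp(-s) · r + s, with s = log σ² and r the task's mean squared residual. Analytically its stationary point is s* = log r at any scale, which suggests the problem is in the optimization dynamics rather than the objective itself. A minimal sketch of that closed-form check (the names `task_loss`, `r`, `s_star` are mine, not from the demo):

```python
import numpy as np

def task_loss(r, s):
    # Uncertainty-weighted loss for one task (sketch):
    # exp(-s) * r + s, where s = log sigma^2 and r is the
    # task's mean squared residual.
    return np.exp(-s) * r + s

# Stationary point: d/ds [exp(-s) * r + s] = 1 - exp(-s) * r = 0
# => s* = log(r), i.e. the true variance is recoverable in principle.
r = 1e4              # residual scale of task 2 here (sigma2 = 100)
s_star = np.log(r)   # ~9.21
print(task_loss(r, s_star))
```

So even with sigma2 = 100 the objective itself still has its minimum at the right log variance; the question is whether gradient descent reaches it.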


ichxw commented Jan 14, 2021

@conan-ux I ran into a similar issue. Have you tried tuning the learning rate?

@RuixuanDai

Try increasing the number of epochs or tuning the learning rate.
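The learning-rate suggestion fits the shape of the loss: near s = 0 the gradient of exp(-s) · r + s is 1 - exp(-s) · r, so with r ≈ 1e4 (sigma2 = 100) the initial gradient is about -1e4, and an ordinary step size makes s overshoot far past the optimum before slowly creeping back. A small sketch of that gradient (assuming scalar SGD on the log-variance alone; `grad_s` is my name, not the demo's):

```python
import numpy as np

def grad_s(r, s):
    # d/ds [exp(-s) * r + s]
    return 1.0 - np.exp(-s) * r

r = 1e4
print(grad_s(r, 0.0))        # -9999.0: a step size of 1e-2 would move s by ~100
print(grad_s(r, np.log(r)))  # ~0 at the optimum s* = log(r) ~ 9.21
```

That is why a smaller learning rate (or more epochs after the overshoot) can rescue the variance estimate even though the objective is unchanged.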
