
Initialization of global weights is suboptimal #169

Closed
luisfabib opened this issue Apr 30, 2021 · 0 comments · Fixed by #174
Labels
bug Something isn't working enhancement New feature or request
Milestone

Comments


When doing global fits, all datasets are weighted equally (weights=[1,1,...]) and all scaled to the range [0,1]. The scaling prevents signals with a larger amplitude V0 from having a larger influence on the fit simply because their contribution to the LSQ residual is larger in absolute terms.
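As a toy illustration of why the [0,1] scaling matters (plain NumPy, independent of DeerLab): the sum-of-squares contribution of a residual vector grows with the square of the signal amplitude, so an unscaled dataset with 10x the amplitude would contribute 100x as much to the objective.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=300)  # residual vector of a unit-scale signal

# The same residual pattern at 10x the amplitude contributes
# (10**2) = 100x more to the least-squares objective
rss_small = np.sum(x**2)
rss_large = np.sum((10 * x)**2)
ratio = rss_large / rss_small  # exactly 100, since sum((a*x)^2) = a^2 * sum(x^2)
```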

However, DeerLab's fit functions do not take the variance of the different signals into account. Signals with larger noise contribute more to the LSQ residual than lower-noise signals do. This is the inverse of the expected behaviour: the noisy, low-quality datasets dominate the fit instead of the higher-quality ones.
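A quick NumPy sketch (not DeerLab code) makes the imbalance concrete: for pure noise residuals, the expected sum of squares is N*sigma^2, so with the sigma1=0.01 and sigma2=0.5 noise levels used below, the noisy dataset contributes roughly (0.5/0.01)^2 = 2500x more to the unweighted objective.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
sigma1, sigma2 = 0.01, 0.5

# Residuals of a perfectly fitted model are just the noise itself
res1 = rng.normal(0, sigma1, N)  # low-noise dataset
res2 = rng.normal(0, sigma2, N)  # high-noise dataset

rss1 = np.sum(res1**2)  # ~ N*sigma1**2
rss2 = np.sum(res2**2)  # ~ N*sigma2**2
ratio = rss2 / rss1     # ~ (sigma2/sigma1)**2 = 2500
```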

Instead of being initialized equally, the default global-fitting weights should be inversely proportional to each signal's estimated noise level. This would solve the issue.
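Continuing the toy NumPy sketch: multiplying each residual vector by 1/sigma before squaring normalizes both datasets to unit noise variance, so each contributes approximately N to the weighted objective regardless of its noise level.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
sigma1, sigma2 = 0.01, 0.5

res1 = rng.normal(0, sigma1, N)  # low-noise residuals
res2 = rng.normal(0, sigma2, N)  # high-noise residuals

# Inverse-noise weights rescale each residual to unit variance,
# so both datasets contribute ~N to the weighted LSQ objective
w1, w2 = 1 / sigma1, 1 / sigma2
wrss1 = np.sum((w1 * res1)**2)
wrss2 = np.sum((w2 * res2)**2)
```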

Here is an example illustrating the problem and the (potential) solution:

import numpy as np 
import matplotlib.pyplot as plt
import deerlab as dl 

# Simulation
t = np.linspace(0,5,300)
r = np.linspace(2,4.5,600)
P = dl.dd_gauss(r,[3,0.2])
K = dl.dipolarkernel(t,r)

# Two datasets of the same distance distribution, differing only in noise level
sigma1 = 0.01
V1 = K@P + dl.whitegaussnoise(t,sigma1,seed=1)
sigma2 = 0.5
V2 = K@P + dl.whitegaussnoise(t,sigma2,seed=1)

# Default behavior: both datasets weighted equally

fit = dl.fitparamodel([V1,V2],lambda p: [K@dl.dd_gauss(r,p)]*2,[3.5,0.5],[2,0.1],[20,5])

plt.figure(figsize=[6,6])
plt.subplot(221)
plt.fill(r,P,alpha=0.3)
plt.plot(r,dl.dd_gauss(r,fit.param),'k')
plt.title('Default')
plt.xlabel('r [nm]')
plt.subplot(223)
plt.plot(t,V1,'.',t,fit.model[0],'k',t,V2+2,t,fit.model[1]+2,'k')
plt.xlabel('t [us]')

# Expected behavior: weights inversely proportional to the noise levels

fit = dl.fitparamodel([V1,V2],lambda p: [K@dl.dd_gauss(r,p)]*2,[3.5,0.5],[2,0.1],[20,5],weights=[1/sigma1,1/sigma2])

plt.subplot(222)
plt.fill(r,P,alpha=0.3)
plt.plot(r,dl.dd_gauss(r,fit.param),'k')
plt.title('Expected')
plt.xlabel('r [nm]')

plt.subplot(224)
plt.plot(t,V1,'.',t,fit.model[0],'k',t,V2+2,t,fit.model[1]+2,'k')
plt.xlabel('t [us]')

plt.tight_layout()
plt.show()

