When using chirped pulses, it is often helpful to widen the bound on the refocusing time when importing an experiment model into `dl.dipolarmodel`. Currently this bound is hard-coded to ±50 ns, which is too narrow when the chirped pulses are over 100 ns long.
In my personal fork I have modified the `dl.dipolarmodel` function to take an additional parameter that lets the ± uncertainty be adjusted. I think this should become standard.
I agree, this should be an option exposed to the user.
A more intuitive way (for users) to implement this could be to add an optional keyword argument to the experiment model functions (e.g. `ex_4pdeer`) that lets users specify the longest pulse length used in the experiment. From that value the program could automatically set the boundaries on the refocusing times, e.g. reftime ± 3*max(pulselength). For example, standard 16 ns rectangular pulses would give reftime ± 48 ns (similar to what is now hard-coded), and 100 ns shaped pulses would give reftime ± 300 ns.
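A minimal sketch of the proposed bound calculation, assuming times in nanoseconds. The helper name `reftime_bounds` and the `pulselength` keyword are purely illustrative; they are not part of DeerLab's current API.

```python
# Illustrative sketch only; not DeerLab's actual API.
# All times are in nanoseconds.

def reftime_bounds(reftime, pulselength=16):
    """Return (lower, upper) bounds on a refocusing time.

    The half-width scales with the longest pulse used in the
    experiment: reftime +- 3*pulselength.
    """
    halfwidth = 3 * pulselength
    return reftime - halfwidth, reftime + halfwidth

# 16 ns rectangular pulses: +-48 ns, close to the current
# hard-coded +-50 ns.
print(reftime_bounds(400, 16))   # (352, 448)

# 100 ns chirped pulse: bounds widen to +-300 ns.
print(reftime_bounds(400, 100))  # (100, 700)
```

This keeps the current behaviour as the default (16 ns pulses reproduce roughly the existing ±50 ns window) while letting users of long shaped pulses widen the bounds without touching the model internals.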