Clarification of tdwidth in find_doppler; get rid of FITS terminology, e.g. NAXIS1 #98
Comments
Does this comment suggest something to you?
Does the shoulder concept come into play when "averaging across channels for large drift rates" (issue #76)? Agreed about the use of NAXISn variables.
I've looked at find_doppler.py, data_handler.py, and file_writer.py quite a bit lately. It looks like those FITS header fields are a holdover from porting from some other project at some point back in time. DATAH5 object instantiation, in effect, creates a second header and several data object fields. Several of these fields either (a) duplicate the filterbank header or (b) do nothing. Of course, some of them are important in search_coarse_channel processing.
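As a rough illustration of the renaming being proposed (the attribute names and mapping below are hypothetical, not the existing DATAH5 fields), the FITS-style keys could be translated to descriptive names once, at construction time, so downstream code never touches NAXIS1 or NAXIS2 directly:

```python
# Hypothetical sketch: map FITS-style header keys to descriptive names.
FITS_TO_DESCRIPTIVE = {
    "NAXIS1": "fftlen",        # number of fine frequency channels in the coarse channel
    "NAXIS2": "n_time_steps",  # number of time integrations (spectra)
}

def descriptive_header(fits_like_header: dict) -> dict:
    """Return a copy of the header with FITS keywords replaced by descriptive names."""
    return {FITS_TO_DESCRIPTIVE.get(key, key): value
            for key, value in fits_like_header.items()}

# Example:
# descriptive_header({"NAXIS1": 1048576, "NAXIS2": 16})
# -> {"fftlen": 1048576, "n_time_steps": 16}
```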
tsteps vs tsteps_valid is interesting.
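My reading of the distinction (an assumption, not confirmed by the code quoted here) is that tsteps_valid is the actual number of time integrations in the data, while tsteps is that count rounded up to the next power of two so the tree de-doppler stage can work on 2**k time steps. A minimal sketch:

```python
import math

def tsteps_from_valid(tsteps_valid: int) -> int:
    """Round the actual number of time integrations up to the next power of two.

    Assumption: the de-doppler stage wants 2**k time steps, so the data is
    padded from tsteps_valid up to tsteps.
    """
    return 2 ** math.ceil(math.log2(tsteps_valid))

# Example: 60 time integrations would be padded up to 64.
# tsteps_from_valid(60) -> 64, tsteps_from_valid(16) -> 16
```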
Having looked at this code recently, this will be a somewhat tedious and methodical process. It seems like the original author wanted to handle FITS input files as well as .h5 and .fil files.
I believe that the original code author was attacking issue #76 in a different way. Even if this is true, there are still no excuses for the use of FITS metadata labels! Headers upon headers, ugh. Yesterday, I made another pass at cleanup (issue #274). I am certain that more is needed. If you believe that a
tdwidth is a value used in the code whose meaning is unclear. NAXIS1 is not required as a parameter in general; a descriptive keyword should be used instead of a FITS keyword. In data_handler, there is code that computes tdwidth, and it is unclear why the FFT length is not just used.
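For illustration only (a sketch of what the calculation appears to do, not the actual data_handler snippet; the names, default factor, and structure are assumptions): tdwidth seems to be the FFT length plus extra "shoulder" padding, which would explain why it differs from the plain FFT length and how it connects to the shoulder question above.

```python
def compute_tdwidth(fftlen: int, tsteps: int, shoulder_size: int = 8) -> int:
    """Sketch of a tdwidth-style calculation (names and factor are assumptions).

    fftlen        : number of fine channels in the coarse channel (the FFT length)
    tsteps        : number of (padded) time steps used by the de-doppler stage
    shoulder_size : channels of padding per time step, so a signal drifting
                    past the coarse-channel edge still has room to move

    The result is wider than fftlen; the extra width is the "shoulder".
    """
    return fftlen + shoulder_size * tsteps

# Example: a 2**20-channel coarse channel with 16 time steps
# compute_tdwidth(1048576, 16) -> 1048704
```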