
Implement color dependent galaxy shapes #17

Open · dkirkby opened this issue Aug 9, 2016 · 30 comments

dkirkby commented Aug 9, 2016

The catalog bulge/disk/AGN normalizations are specified at one wavelength but multiply different SEDs, so this issue is to combine the normalization parameters (fluxnorm_bulge/disk/agn) with the appropriate SEDs (sedname_bulge/disk/agn) to calculate the correct bulge/disk/agn proportions in each band.

As a cross check, try to reproduce the extinction corrected AB apparent magnitudes in the input catalog, which will require using the av_b/d, rv_b/d and ext_model_b/d catalog params (@danielsf why no corresponding AGN params?).

@jmeyers314 I remember you saying you had already done a similar check. How good was the agreement? Did you check the catalog magnitudes, colors or both?

The galaxy input catalog schema is documented here (@danielsf is this still current?)
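
Roughly what I have in mind for the proportion calculation, as a minimal sketch — flux_in_band is a hypothetical helper that would load the named SED, scale it by the catalog fluxnorm, and integrate it through a band's throughput:

    # Sketch only: flux_in_band is a placeholder for the SED-loading and
    # bandpass-integration step worked out further down in this thread.
    def component_proportions(row, band):
        fluxes = {
            comp: flux_in_band(row['sedname_' + comp], row['fluxnorm_' + comp], band)
            for comp in ('bulge', 'disk', 'agn')
        }
        total = sum(fluxes.values())
        return {comp: f / total for comp, f in fluxes.items()}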


danielsf commented Aug 9, 2016

Yes. That schema is still current.

As to why there are no dust parameters for the AGN: I'm not sure. These tables were constructed before my time. I will ask around. It is possible that we decided it did not matter since literally every AGN in our catalog has the same SED (or, put another way, we only have one AGN SED in our SED library), so whatever we do isn't going to be very realistic anyway. I will get back to you.


danielsf commented Aug 9, 2016

Apparently, we matched our AGN distribution to the distribution in this reference

Y. Q. Xue, B. Luo, W. N. Brandt, F. E. Bauer, B. D. Lehmer, P. S. Broos, D. P. Schneider, D. M. Alexander, M. Brusa, A. Comastri, A. C. Fabian, R. Gilli, G. Hasinger, A. E. Hornschemeier, A. Koekemoer, T. Liu, V. Mainieri, M. Paolillo, D. A. Rafferty, P. Rosati, O. Shemmer, J. D. Silverman, I. Smail, P. Tozzi, and C. Vignali. The Chandra Deep Field-South Survey: 4 Ms Source Catalogs. Astrophysical Journal Supplement, 195:10, July 2011. doi: 10.1088/0067-0049/195/1/10.

The AGN SED is taken from vandenBerg (let me know if that doesn't make any sense to you and I can try to find a more detailed citation). The normalization of the SED (which does vary from AGN to AGN) was done to try to empirically match observed magnitudes (so there ought to be some dust implicitly folded in there). The variability parameters were also tuned to the observed distribution.

Does this help?


dkirkby commented Aug 9, 2016

Thanks @danielsf ! How literally should I take the schema descriptions, e.g. for fluxnorm_bulge:

Multiplicative scaling factor to apply to the bulge SED

Do I really just multiply the raw SED (from here?) by this scaling factor? I guess the result is then in erg/cm2/s/A, based on the comment line in the individual SED files?


danielsf commented Aug 9, 2016

Yes, you can just multiply by the fluxnorm factor appropriate to the component.

Note: the LSST bandpass throughputs have evolved since the database was created. To get near-exact agreement with the catalog, you will have to use the throughputs as they exist at commit 3ed95bd of this GitHub repository:

https://github.com/lsst/throughputs
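
A minimal numpy sketch of the calculation (not tested against the catalog; the unit conventions — wavelength in Angstroms and flambda in erg/cm2/s/A for the SED files, wavelength in nm for the throughput files — are assumptions to verify against the file headers):

    import numpy as np

    def ab_mag(sed_file, fluxnorm, throughput_file):
        # SED file: two columns, wavelength [Angstrom] and flambda [erg/cm^2/s/A]
        # (an assumption based on the SED file headers mentioned above).
        wlen, flambda = np.loadtxt(sed_file, unpack=True)
        flambda = flambda * fluxnorm  # apply the catalog fluxnorm

        # Throughput file: two columns, wavelength [nm] and dimensionless
        # throughput (the convention in the lsst/throughputs baseline files).
        tw, thru = np.loadtxt(throughput_file, unpack=True)
        thru = np.interp(wlen, tw * 10.0, thru, left=0.0, right=0.0)  # nm -> Angstrom

        # Convert to f_nu [erg/cm^2/s/Hz]: f_nu = flambda * lambda^2 / c,
        # with c in Angstrom/s.
        fnu = flambda * wlen ** 2 / 2.998e18

        # Photon-counting AB magnitude: weight by throughput, integrate in d(ln lambda).
        # The numerator is also the band flux you would use for the
        # bulge/disk/AGN proportions sketched earlier in the thread.
        num = np.trapz(fnu * thru / wlen, wlen)
        den = np.trapz(thru / wlen, wlen)
        return -2.5 * np.log10(num / den) - 48.6

Extinction (host-galaxy and, if desired, Milky Way) would be applied to flambda before this integration.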


danielsf commented Aug 9, 2016

Also note that the [u,g,r,i,z,y]_ab magnitudes in the database neglect the AGN contribution (this is just based on a similar validation I did last week)

@jmeyers314

I looked at reproducing the CatSim magnitudes from the SEDs/extinction/normalizations a few years ago. The agreement at the time is shown in the following figure:

[figure: compare_galaxy_mags]

Pretty good, but not exact. At the time I chalked this up to evolution of the bandpasses since the catalog was created.

However, I seem to remember an earlier offline conversation with @dkirkby suggesting that the SED library may also have evolved (at least for stars) since I last queried CatSim.


danielsf commented Aug 9, 2016

Stars have evolved (mostly we changed the wavelength grid at which we keep the SEDs). Galaxies should not have been touched in the last two or three years.


danielsf commented Aug 9, 2016

@dkirkby Am I correct that you ultimately want a catalog of galaxies with individual bulge, disk, and agn magnitudes in each of the LSST bands?


dkirkby commented Aug 9, 2016

That's the near-term goal, yes, to improve on my current assumption that the bulge/disk/agn proportions are the same in all bands. However, I ultimately want to be able to calculate magnitudes of all components in arbitrary bands, e.g., WFIRST. I already have the SED to magnitude part under control (using specsim) so I just need to be able to calculate SEDs for each component.
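
For the component-SED part, something along these lines is what I have in mind — a sketch only; apply_ccm_dust is a hypothetical helper (see the extinction discussion below), and the order of operations (fluxnorm, then host-galaxy dust, then redshift, with any cosmological dimming assumed to be folded into fluxnorm) is my reading rather than something I've verified against CatSim:

    def component_sed(wlen, flambda, fluxnorm, a_v, r_v, redshift):
        # Sketch: build the observed-frame SED for one component.
        flambda = flambda * fluxnorm                       # catalog normalization
        flambda = apply_ccm_dust(wlen, flambda, a_v, r_v)  # host-galaxy dust (see below)
        # Redshift: wavelengths stretch by (1+z) and flambda drops by (1+z)
        # because the same flux is spread over a wider wavelength interval;
        # cosmological dimming is assumed to be already included in fluxnorm.
        return wlen * (1.0 + redshift), flambda / (1.0 + redshift)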


dkirkby commented Aug 9, 2016

On the topic of extinction, the schema says for, e.g. ext_model_b:

Extinction model identifier ('ccm')

I guess ccm refers to Cardelli, Clayton & Mathis (ApJ, 345, 245, 1989), but this appears to be Milky Way (not host galaxy) extinction, so shouldn't it depend on the (ra,dec) of an observation, which isn't fixed for a catalog object because of the tiling?


danielsf commented Aug 9, 2016

Okay. We already have many tools in CatSim for querying our database, loading the SEDs, integrating them over bandpasses (Bryce Kalmbach recently added WFIRST, Hipparcos, and PanStarrs filters to our throughputs repository). If you'd rather not deal with the trouble of querying our database and parsing its contents "by hand," let me know, and I can show you how to do all of this in the CatSim framework.


danielsf commented Aug 9, 2016

The rv_d, rv_b, etc. parameters are for defining a dust extinction model (that is CCM-like) inside the host galaxy. If you want to further apply Milky Way extinction on top of that, you need to calculate the relevant dust parameters based on the galaxy's RA, Dec.
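
For concreteness, a sketch of applying a CCM-like host-galaxy law with the catalog's av_b/d and rv_b/d parameters, in the rest frame before redshifting. The extinction.ccm89 call assumes the interface of the standalone extinction Python package; swap in your own implementation of the CCM a(x), b(x) polynomials if you prefer:

    import numpy as np
    import extinction  # assumed: extinction.ccm89(wave_angstrom, a_v, r_v) -> A_lambda [mag]

    def apply_ccm_dust(wlen, flambda, a_v, r_v):
        # Dim a rest-frame SED with a CCM (Cardelli, Clayton & Mathis 1989) law.
        a_lambda = extinction.ccm89(np.asarray(wlen, dtype=float), a_v, r_v)
        return flambda * 10.0 ** (-0.4 * a_lambda)

The same machinery would handle Milky Way extinction, except that A_V would then come from a dust map evaluated at the observed (RA, Dec) rather than from the catalog row.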


dkirkby commented Aug 9, 2016

I am using a preprocessed 1 sq. deg. of CatSim to insulate users entirely from the database layer, but I am willing to try out the CatSim framework if you think it is ready for wider distribution. What dependencies would that pull in? Where should I start in the docs? Will you be in Tucson next week?

@danielsf

I will be in Tucson.

The tools I describe exist entirely in the code that you get when you install the LSST Simulations stack.

https://confluence.lsstcorp.org/display/SIM/Catalogs+and+MAF

The best tutorial on how CatSim works is here

https://github.com/uwssg/LSST-Tutorials/blob/master/CatSim/CatSimTutorial_SimulationsAHM_1503.ipynb

I will draw up an iPython notebook explaining how to do what you want and send it to you tomorrow.


dkirkby commented Aug 10, 2016

That's 28 packages, including boost, just to read the SED text files in sims_sed_library! I'm interested in learning how to do this, but I'm not sure I want to pass this on to users.

I think my next step should be to replace the current dbquery script with equivalent code using the LSST Sims Stack, since users do not normally run this.


dkirkby commented Aug 10, 2016

@danielsf What is the minimal set of LSST Sims packages required to query the UW CatSim db? Looking at the names and descriptions given here, I would guess:

base, utils, pymssql, sims_catalogs_generation, sims_catUtils

Do you expect that some small set of packages like this will be sufficient, or am I going to end up needing everything under lsst-apps and lsst-sims?

@danielsf

Unfortunately, installing sims_catUtils means installing basically the whole of lsst_sims. sims_catUtils includes functions that map (RA, Dec) to pixel positions on the LSST camera, which implies dependence on afw (DM's "applications framework"). And, yes, sims_catUtils is where the interface to the CatSim database is defined.

You do not need lsst_apps to run any of the simulations code.


danielsf commented Aug 10, 2016

Have you tried the conda installation of lsst_sims? It takes about 30 minutes to install on account of the SED library and dust maps, but it is very straightforward and not nearly as prone to failure as eups distrib install was (and even eups distrib install is much more stable than it used to be)


dkirkby commented Aug 11, 2016

For the record, the installation consumed 7 GB and took about an hour (on my laptop).


dkirkby commented Aug 11, 2016

Should I be concerned that the lsst-sims installation replaced a bunch of standard packages with "nomkl" versions? Will that interfere with other packages that depend on these standard packages?

The following packages will be UPDATED:

    ...
    numexpr:                       2.4.4-np110py27_0                                --> 2.6.0-np110py27_nomkl_0  [nomkl]
    numpy:                         1.10.4-py27_2                                    --> 1.10.4-py27_nomkl_2      [nomkl]
    scikit-learn:                  0.17.1-np110py27_1                               --> 0.17.1-np110py27_nomkl_2 [nomkl]
    scipy:                         0.16.0-np110py27_1                               --> 0.17.1-np110py27_nomkl_1 [nomkl]

Is MKL actually incompatible with the LSST stack? Otherwise, it sounds like a good thing.

@danielsf

As I understand it: the MKL packages were implemented in such a way that they contain methods with the same names as methods provided by fftw (which the stack also uses), so there was a lot of incorrect linking going on. Mario Juric has figured out a way to hack fftw so that this is no longer a problem, and we should be able to build against MKL in future versions of the sims stack.

@cwwalter

Apparently this is now in the development versions, but the latest conda releases don't have the fix yet, so right now when you do the lsst-sims update it downgrades them.

BTW, David, there is a separate conda channel you can use that has a rolling sims update release schedule. The official DM one is frozen.


dkirkby commented Aug 11, 2016

I believe the sims stack uses a different channel than DM (with more frequent updates?):

conda config --add channels http://conda.lsst.codes/sims

None of my standard packages were actually downgraded when I installed lsst-sims.

@cwwalter

Yes, that is the channel.

Oh, do you mean you redid it with the sims channel and the packages weren't downgraded to nomkl?

That's good. That means Josh H. remade the packages.


dkirkby commented Aug 11, 2016

They were replaced with nomkl versions but not downgraded in the version-number sense, e.g.

numpy:                         1.10.4-py27_2                                    --> 1.10.4-py27_nomkl_2      [nomkl]

@cwwalter

Ah. OK. That is the same. That should change the next time new binaries are released.


dkirkby commented Aug 11, 2016

@danielsf A suggestion for these instructions: add options -fN to the ssh command line in step (1) so it puts the tunnel process in the background instead of starting an interactive remote shell session.

Also, step (2) seems to be redundant with the latest lsst-sims, since $SIMS_CATUTILS_DIR/config/db.py already has the correct config.


dkirkby commented Aug 12, 2016

For the record, I just got burned by this DM stack nomkl business when I tried to update a non-conda package (lmfit) using pip, which triggered an update of numpy and left me with a broken numpy. I was eventually able to recover from this (but unable to upgrade lmfit) with:

conda install -f nomkl numpy=1.10.4

@joezuntz

Side question about the stack: has anyone seen this error before when trying to run things?

    OpenBLAS: pthread_creat error in blas_thread_init function. Error code:11

@danielsf

@joezuntz I have seen this problem before (secondhand; I haven't run into it myself). In the past, it has been resolved by reducing the number of cores and/or threads you are trying to use. Perhaps try setting OMP_NUM_THREADS to something like 25 or lower.
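
For what it's worth, the limit has to be in place before OpenBLAS initializes its thread pool, so either export it in the shell or set it at the very top of the script, before any numpy/scipy imports — a minimal sketch (the value is just an example):

    import os

    # Must be set before numpy/scipy (and hence OpenBLAS) are first imported;
    # 16 is only an example value -- tune it to your machine.
    os.environ["OMP_NUM_THREADS"] = "16"
    os.environ["OPENBLAS_NUM_THREADS"] = "16"

    import numpy as np  # OpenBLAS now starts with the reduced thread count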
