Hi all,
I noticed today that the result of the "contamination_fraction" calculation is inconsistent across different load-modes.
Currently, "contamination_fraction" looks into the loaded particles, find the minimum mass of those and calculate the fraction of particles more massive than this minimum mass.
In a load-mode in which the entire snapshot is loaded, the minimum mass is always the overall minimum mass in the simulation (the zoom resolution), so the calculation returns 1.0 for unzoomed halos, 0.X for contaminated halos and 0.0 for uncontaminated zoomed halos.
In a load-mode in which only halo data is loaded (e.g. server), the minimum mass is calculated locally, without necessarily knowing that a zoom region exists elsewhere in the simulation. For example, a pure unzoomed halo will have minimum_mass 64 and zero particles above that value, while a pure zoomed halo will have minimum_mass 8.0 and zero particles above it. The calculation then returns 0.0 for unzoomed halos, 0.X for contaminated halos and 0.0 for uncontaminated zoomed halos.
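For concreteness, here is a minimal sketch of the calculation as I understand it (hypothetical names, not the actual tangos internals), reproducing the two behaviours above:

```python
import numpy as np

# Sketch of the current behaviour -- not the actual tangos code.
def contamination_fraction(halo_masses, loaded_masses):
    # loaded_masses is whatever the load-mode provides: the whole snapshot
    # in full-load mode, or just the halo's own particles in server mode.
    min_mass = loaded_masses.min()
    return float(np.mean(halo_masses > min_mass))

unzoomed_halo = np.full(100, 64.0)                                  # pure low-res halo
full_snapshot = np.concatenate([unzoomed_halo, np.full(100, 8.0)])  # zoom region elsewhere

print(contamination_fraction(unzoomed_halo, full_snapshot))   # 1.0 (full-snapshot load)
print(contamination_fraction(unzoomed_halo, unzoomed_halo))   # 0.0 (server load)
```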
I am not sure which one should be the correct behaviour (the first, in my view), but the fact that it is inconsistent between load-modes is highly confusing.
Hope this is clear
Martin
Yes, the problem is clear, but I am not immediately sure how to solve it. I agree the result should be 1.0 for fully contaminated halos, regardless of load-mode. Any bright ideas?
Not really, tbh, I couldn't see an obvious solution either. Maybe by storing the mass resolution of the simulation higher up in the handler, similarly to how it is done for "approx_resolution_kpc"?
Yes, I guess that would work. If you implement it, it would be good to make it transparent, in the sense that the code will "do its best" if the appropriate value is not stored at the simulation level.
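Something along these lines, perhaps (a rough sketch only; `resolution_mass` and the `min_particle_mass` attribute are hypothetical names, not existing tangos API):

```python
import numpy as np

def resolution_mass(handler, loaded_masses):
    # Prefer a simulation-level value, analogous to approx_resolution_kpc;
    # the attribute name here is a placeholder, not an existing one.
    stored = getattr(handler, "min_particle_mass", None)
    if stored is not None:
        return stored
    # Otherwise "do its best": fall back to the minimum of whatever particles
    # happen to be loaded, i.e. the current load-mode-dependent behaviour.
    return float(np.min(loaded_masses))
```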