SDL_Surface differs slightly from Bitmap in that it keeps one logical palette per instance (instead of always interpreting the pixel values in terms of the physical palette).
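For reference, a minimal sketch of how that distinction looks in the SDL 1.2 API (assuming that's the SDL version in play here; the helper names are made up):

```c
#include <SDL.h>

/* Update only the logical palette: changes how this surface's pixel
 * values are interpreted when blitting or converting. */
static void set_logical_palette(SDL_Surface *surface, SDL_Color *colors, int ncolors)
{
    SDL_SetPalette(surface, SDL_LOGPAL, colors, 0, ncolors);
}

/* Update only the physical palette: changes what the hardware displays
 * without touching the blit mapping (only meaningful for the display surface). */
static void set_physical_palette(SDL_Surface *screen, SDL_Color *colors, int ncolors)
{
    SDL_SetPalette(screen, SDL_PHYSPAL, colors, 0, ncolors);
}

/* SDL_SetColors(surface, colors, 0, ncolors) is shorthand for setting
 * both SDL_LOGPAL and SDL_PHYSPAL at once. */
```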
The thing is, when blitting from one surface to another, the pixel values are mapped from the source palette to the destination palette. So we need to keep adjusting the source surface's palette while blitting during a fade; otherwise all the pixels get mapped to dark color entries, and the image doesn't "fix itself" when the fade is over.
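A rough sketch of what that adjustment could look like (SDL 1.2, hypothetical names; `faded_colors` stands for the display palette at the current fade step):

```c
#include <SDL.h>

/* Keep a source surface's logical palette in step with the (darkened)
 * display palette, so the blit's colour mapping stays index-to-index and
 * the pixels aren't permanently remapped to dark entries. */
static void sync_source_palette(SDL_Surface *source,
                                SDL_Color *faded_colors, int ncolors)
{
    SDL_SetPalette(source, SDL_LOGPAL, faded_colors, 0, ncolors);
}

/* Called once per fade step, before blitting `source` onto `screen`. */
static void blit_during_fade(SDL_Surface *source, SDL_Surface *screen,
                             SDL_Color *faded_colors, int ncolors)
{
    sync_source_palette(source, faded_colors, ncolors);
    SDL_BlitSurface(source, NULL, screen, NULL);
}
```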
If we kept the logical palette of the display surface stable and played only with the physical palette for the fading effect, this would be much simpler: source surfaces (of which there seems to be only one at a time on most screens? yet to be verified) would need to be adjusted only when the logical palette actually changed. In fact, it seems we usually already do it the other way around in the first place (setting the display palette from the background image, for example).
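A sketch of that alternative, again assuming SDL 1.2 (the `level` convention is made up): fade by touching only the display's physical palette, so every surface's logical palette keeps the true colours and the blit mapping never shifts.

```c
#include <SDL.h>

/* One fade step: `level` runs from 0 (black) to 256 (full brightness).
 * Only the physical palette is changed, so blits keep mapping index i
 * to index i, and restoring brightness is just another call at 256. */
static void fade_step(SDL_Surface *screen, const SDL_Color *true_colors,
                      int ncolors, int level)
{
    SDL_Color dimmed[256];
    int i;

    for (i = 0; i < ncolors && i < 256; i++) {
        dimmed[i].r = (Uint8)((true_colors[i].r * level) / 256);
        dimmed[i].g = (Uint8)((true_colors[i].g * level) / 256);
        dimmed[i].b = (Uint8)((true_colors[i].b * level) / 256);
    }

    /* SDL_PHYSPAL changes only what the hardware shows; the logical
     * palette (and hence the blit colour mapping) is left alone. */
    SDL_SetPalette(screen, SDL_PHYSPAL, dimmed, 0, ncolors);
}
```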
Palettes in general are really starting to annoy me, though. :-)