This repository has been archived by the owner on Feb 11, 2021. It is now read-only.
In SDL 1.3 (revision 5508 from SVN), the method used to calculate the bits per pixel from an "int format" differs between SDL_ListModes (which always uses the SDL_BITSPERPIXEL macro) and SDL_PixelFormatEnumToMasks (which uses either SDL_BITSPERPIXEL or SDL_BYTESPERPIXEL * 8, depending on the value of SDL_BYTESPERPIXEL). Because the two values are later compared in SDL_ListModes, some valid video modes may not be returned. In my case the only mode reported by SDL_GetNumDisplayModes was dismissed and NULL was returned. (This left the calling application sticking its head in the sand.)

The attached patch copies the calculation used in SDL_PixelFormatEnumToMasks into SDL_ListModes. This solved the problem for me, though I don't fully understand the method used by SDL_PixelFormatEnumToMasks.