Fixed the logical size for rendering to texture, thanks to Mason Wheeler.
Mason Wheeler:
The SDL_RenderGetLogicalSize function should always return the number of pixels currently available for rendering to. But after updating to the latest SDL2, I started getting crashes because it was returning (0, 0) as the logical size! After a bit of debugging, I tracked it down to the following code in SDL_SetRenderTarget:
if (texture) {
    renderer->viewport.x = 0;
    renderer->viewport.y = 0;
    renderer->viewport.w = texture->w;
    renderer->viewport.h = texture->h;
    renderer->scale.x = 1.0f;
    renderer->scale.y = 1.0f;
    renderer->logical_w = 0;
    renderer->logical_h = 0;
}
This is clearly wrong; a logical size of 0 is never correct for a valid render target. Those last two lines should read:

    renderer->logical_w = texture->w;
    renderer->logical_h = texture->h;
--- a/src/render/SDL_render.c	Sat Jun 29 22:08:38 2013 +0200
+++ b/src/render/SDL_render.c	Sat Jun 29 14:40:55 2013 -0700
@@ -950,8 +950,8 @@
         renderer->viewport.h = texture->h;
         renderer->scale.x = 1.0f;
         renderer->scale.y = 1.0f;
-        renderer->logical_w = 0;
-        renderer->logical_h = 0;
+        renderer->logical_w = texture->w;
+        renderer->logical_h = texture->h;
     } else {
         renderer->viewport = renderer->viewport_backup;
         renderer->clip_rect = renderer->clip_rect_backup;