I've just pushed a change to the luxe alpha branch that removes the distinction between the rendering size on high DPI (retina) devices and the window size. See this issue for a brief discussion, and this post for some details.

In summary, you no longer have to worry about high DPI screens changing the size of your game: it will look the same at the same window size on both types of screens.

If you were handling this type of sizing yourself and want to keep doing so, define luxe_no_device_pixel_scaling in your project file to revert to the previous default behaviour.


If you've used luxe and given a build to a friend, and their high DPI monitor made your game look tiny, you ran into the default behaviour of the alpha versions ( <= alpha 3.1 ), where the screen size in the engine is represented by the rendering size, not the window size.

To illustrate what happens more plainly, have a look at this diagram I drew. Most people have an intuitive understanding of what's happening here:

The black frame is the device pixels and the frame behind is the renderable pixels. We'll come back to that.

The device pixels are also known as the "window size" because the device (your computer or phone) gives you the window resolution in those terms - in its own device units - which is why I call them "device pixels".

However, in this example, when we request a window of 960x640 and the device is high DPI enabled, what you get is an added scaling factor on the backing frame - not the front frame. This is the difference between renderable pixels and device pixels in practice: they are two different coordinate spaces, whereas before they were 1:1.

The renderable pixels are quite literally as if you took an image in an image editor and zoomed right into the pixels - the frame behind the window is called a "frame buffer", and it is essentially a 2D image, and frequently represented by a literal texture.

So the effect is that your units now differ between these two spaces: what you see (in renderable pixels) and what you measure (in device pixels). At 2x scale, 32 becomes 64.

When it changes your game

If you've read this earlier discussion on handling higher resolution screens for your game, I cover there the one important distinction to make beyond the window size when talking about your game.

That post says similar things about the above two coordinate spaces, device and renderable pixels, but there is a third coordinate space - the game world.

Many people learning to make games don't pay attention to the distinction between window size and world size, so they build their game according to the size of the window. This is often fine, and a useful way to make content visually match your expectations.

But, if you create a 32x32 window and place a 32x32 sprite, you would imagine the sprite covers the entire window? You are always defining the sprite size in world units. Collapsing the two coordinate spaces in your mind creates problems when that expectation no longer holds true.

If you did imagine this would cover the full window, you would be surprised to find that on a high DPI screen with 2x scaling (the above diagram in action), the sprite only covers a quarter of the window. This is because your window size no longer matches your world size: what you see is more pixels, so you also see more of your game world units.

The original disconnect comes from treating your world size and your window size as the same coordinate space - so how do we avoid this issue in practice?

The rendering viewport

The solution is to keep your world units matching the window units, but to stretch the viewport (the black frame) onto the backing frame behind it. To do this, you usually use OpenGL's glViewport and your camera's orthographic matrix.

The orthographic camera size will tell the camera what size area of the world you want to see.

The viewport size will tell OpenGL the size you want to render at.

Looking at the sizes above, if we want to keep window size == world size, we set the camera size to the window size without device scaling, and the rendering viewport to the window size with device scaling.

To get this we take the following steps:
world area = window size
camera area = world area

Now we know how much of the world we want to see, in world units.
This is the black frame; to take this to the renderable size:

viewport = camera viewport in world units * scaling

So, if our viewport was the full black frame area, we just stretch it directly onto the back buffer, and we see more pixels but the same world units. This makes our game look the same everywhere, and it renders everything with more pixels, so at a higher resolution.

Then, you account for the distinction in the assets, by having higher resolution images and using SDF fonts for clear, clean rendering at higher resolutions.

What you need to do


Previously, the behaviour shown in the diagram above was the default; now the behaviour just described is the default.

The window size and visible game world units are now consistent regardless of device scale. This is handled internally by the renderer.

You will notice all the tests and demos (when built with newer code) work correctly across high DPI without any changes; your game should too.

Hope that helps!

Oh and as you'll see in the coming dev logs, the final luxe design handles this nicely as well.

For feedback or questions about this post, see this discussion.