Lo-Fi Games Seen Right

Or: Low Resolution and Low Frame-rate as Integral Aspects of the Art-style of Low-Fidelity Graphics

When one puts on one's NostalgiaVision (TM), one recalls how the games of years past seemed to wow with their graphical prowess. How life-like they seemed at the time! Or rather, how they provided a compelling and consistent simulation that was easy to become absorbed in. When one returns to such games, one is amazed at how bad they look; it seems we are incapable of correctly remembering their low fidelity. However, we are rarely actually returning to these games properly. Games intended for low-resolution CRT screens are instead played on high-resolution LCDs, and where possible the rendering resolutions and frame-rates are increased. Behind these changes is the implicit assumption that improving the clarity of the image can only be beneficial, that clarity is separate from, and not intrinsic to, the art. Here I would like to argue that this is misguided, and that the aesthetic of old games is only properly appreciated when they are experienced in their rightful context.

Take frame-rate. A lot of early 3D games suffered from terribly low frame-rates, sometimes bordering on unplayable by today's standards. While it is perhaps rare that the game-play systems work best within the context of a lower frame-rate, it is frequently the case that the visual presentation does. To be sure, a higher frame-rate image is smoother, easier to discern, and clearer. But this clarity brings the facsimile into sharp contrast with the reality it strives to imitate. The higher the frame-rate, the less interpolation our brains must perform, and therefore the less our own imagination enters the picture to fill in the gaps. This phenomenon was experienced by many viewers of Peter Jackson's The Hobbit when it was shown in theaters at 48 frames-per-second (fps). The high-frame-rate (HFR) images made the costumes, the sets, the make-up, all of the details, look cheap, like a stage production viewed at too close a distance. What worked so well for many in The Lord of the Rings now failed to hold up. A foam sculpture may pass for rock when dimly lit and seen at the cinema-conventional 24 fps, but the higher the frame-rate, the more acutely the viewer perceives that it is indeed foam - the seams become increasingly visible. There are minor details, noticed only unconsciously, that we perceive all the time and that the mind uses as cues to interpret our surroundings. Part of the "magic of cinema" is to deprive our senses of some of those vital cues and so enable our imaginations to fill in the gaps. With our senses restored more fully to how we perceive actuality, the use of imagination recedes, and we are struck with the sensation that what we are perceiving is fake, a perception we may be unable to shake. In crude games, a low frame-rate can help mask less fluid or detailed animations, as well as the low polygon-counts of the characters and environments.
While the mind is never truly deceived into seeing these old games as reality, it can be 'duped' into accepting the 'reality' of the world presented if it is sufficiently convincing. Key to this acceptance is hiding the seams, making clear distinctions between components of the game-world difficult to perceive. A low frame-rate is therefore a friend of a low-fidelity game as concerns its artistry. However, unlike cinema, games are interactive, requiring user input. Low frame-rates generally harm this most important part of games, so their employ is not without (dire) consequences.
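To make the trade-off concrete, here is a minimal sketch (in Python, with hypothetical function names of my own invention, not any engine's API) of the budget arithmetic behind frame-rates, and of how one might cap a render loop at a game's original rate when revisiting it:

```python
import time

def frame_budget_ms(fps):
    """Milliseconds available to present one frame at the given frame-rate."""
    return 1000.0 / fps

def run_capped(render_frame, fps, frame_count):
    """Call `render_frame` `frame_count` times, sleeping away whatever
    remains of each frame's budget so the loop never exceeds `fps`."""
    budget_s = 1.0 / fps
    for _ in range(frame_count):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget_s:
            time.sleep(budget_s - elapsed)

# Doubling the frame-rate halves each frame's budget: roughly 41.7 ms
# per frame at the cinema-conventional 24 fps, 20.8 ms at HFR 48 fps.
```

Seen this way, a low frame-rate is simply a longer gap between successive images - the gap the argument above claims our imagination fills in.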

Then there is resolution. Early in the development of 3D graphics, textures were invented*. Textures are two-dimensional images that are projected onto three-dimensional surfaces (like, say, the graffiti covering a concrete wall). Textures can be used to encode three-dimensional surface details as image data. They allow a game to present a more convincing illusion of reality (or simply a more visually rich experience) without increasing the polygon budget to the realms of infeasibility. Take a brick wall. It consists of a three-dimensional matrix of bricks held together with a mortar binder. The bricks themselves are not very flat either, often presenting surface variations that lead to interesting shadows and patterns across them. However, the entire brick wall can be regarded as relatively flat, and so represented with a single rectangle. An image of a brick wall is then projected onto the rectangle, and the player is treated to the illusion of a brick wall using a minimum of polygons. However, key to pulling off this illusion is blurring the distinction between polygons and textures. If the player can always clearly perceive what is a texture and what is a polygon, then the seams are showing and the fantasy becomes unconvincing. A key weapon in the arsenal of blurring this distinction is a low resolution - or, more specifically, a low screen pixel-density. A low pixel-density causes edges to be stair-stepped; it blurs the lines between the worlds of polygon and texture. With a high pixel-density, edges become sharp, polygons become clearly defined, and textural detail becomes just that - textures. As much as this assists in clarity, in perceiving the environment, it detracts from the illusion.
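The brick-wall trick above can be sketched in a few lines. This is a toy illustration rather than any particular engine's code - nearest-neighbour sampling (the filtering mode of early 3D hardware) of a tiny 'brick' texture onto a flat quad, with the texture and all names being my own invention:

```python
def sample_nearest(texture, u, v):
    """Return the texel nearest to UV coordinates (u, v) in [0, 1)."""
    height = len(texture)
    width = len(texture[0])
    # Snap to the nearest texel, clamping at the texture's edge.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A 4x4 "brick wall" texture: B = brick, M = mortar.
BRICK = [
    ["B", "B", "B", "M"],
    ["B", "B", "B", "M"],
    ["M", "M", "M", "M"],
    ["B", "M", "B", "B"],
]

def rasterize(texture, screen_w, screen_h):
    """Fill a screen-space quad: each screen pixel maps to one UV pair."""
    return [
        [sample_nearest(texture, x / screen_w, y / screen_h)
         for x in range(screen_w)]
        for y in range(screen_h)
    ]
```

At a low screen resolution, roughly one pixel lands on each texel and the texture's own grid is invisible; rasterize the same quad at many times the texture's size and each texel smears across a block of identical pixels - the texture is suddenly legible as a texture, which is precisely the seam-exposing clarity described above.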

Now, it is not my intent to suggest that games, or rather the aesthetic of games, is improved by a lower frame-rate or a lower resolution. Rather, it is to point out that the optimal settings are those around which the game was designed, and while subsequent improvements may aid visual clarity, they will no doubt detract from the overall visual presentation. If your reaction to returning to an old game is 'This looks way worse than I remember' and not 'I can't see anything, how did I ever play this?' then you are doubtless playing it wrong.