The responsibility of game art content in delivering smooth performance
Your game’s performance isn’t an engineering problem for that team to solve in the closing stages of development. It’s an art content problem.
Since games are a visual medium, it should be no surprise that the visual content on screen has the greatest impact on performance. Smooth performance isn't achieved by heroic optimization at the end of a project; it's designed, budgeted, and built into the art and associated tech from day one.
Performance in games has always been a challenge for developers. With each new platform generation, we gain more computational power to push for higher frame rates, resolutions, and visual fidelity. Yet, the struggle to maintain consistent performance has only grown more complex.
Recent discourse might blame game engines or the developers, but the reality is that performance is a collective responsibility. The work of engineers, designers, and every other developer consumes hardware resources and therefore affects performance. Each department has a role to play in achieving smooth performance.
As an Art Director, I am aware that all content, from the size of levels to the number of characters on screen, the asset density of the worlds, and the cost of individual assets, must support the project's technical goals. The cost to render a single pixel has risen dramatically, driven by layered shaders, real-time global illumination, and physically based materials.
Every element has a cost; nothing is free. The limit is defined by the hardware.
This article will focus on how art content and direction are fundamental to achieving smooth performance.
Building to the Target, Not Beyond It
When I started in the industry, one of the first games I recall hitting 1080p at 60fps was Wipeout HD (2008), a title that pushed the PlayStation 3 to its absolute limits. It was an early adopter of dynamic resolution scaling. Today, even on far more powerful ninth-generation consoles, many games require dynamic resolution scaling to approach that same target, with few achieving a native 1440p or higher at 60fps without the tech.
The Wipeout team produced a smooth experience with a high-quality visual presentation because they built their content to hit their performance targets. They didn't push visual complexity beyond what the hardware could handle; instead, they exercised strong, confident art direction. This achievement was no accident; it was the result of diligent work to assess the hardware's limits.
A modern example of this would be Doom: The Dark Ages, which strikes a considered balance between visual fidelity and the performance required by the gameplay. It leverages modern scalable rendering features to deliver a consistent experience across a wide range of hardware, proving that high visual quality and smooth performance are still achievable; on Series X its dynamic resolution ranges from 1080p to 1440p. One common pitfall today which heavily contributes to performance issues in games is overworking the content.
Teams driven by the desire to make visually stunning games, but without clear technical targets, inevitably overshoot and produce assets that are too complex for the target hardware. This forces teams to rely on harsh global solutions, such as aggressive resolution scaling and reducing the overall rendering fidelity at the end of production, in order to hit release dates. Worse, it often leads to "hacking" content down, brutally reducing the visual quality and wasting the art team's earlier efforts.
Learning from "Line Mileage"
Traditional art forms like comic art and hand-drawn animation have long designed around production constraints. A key concept is "line mileage"—the total length of lines required to draw a character. More lines mean more time and effort for every single frame. This discipline forces artists to be efficient and intentional with their designs.
This concept translates directly to game art. Every polygon, every texture, and every shader instruction adds to our "technical mileage." The time spent creating overly complex assets that will later be cut or reduced in quality is a direct cost to the project's schedule and budget: the cost to produce the content in the first place, plus the additional cost of dropping the quality later via mesh reduction and texture downsizing.
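To make that cost concrete, here is a minimal sketch of how quickly texture choices alone add up in memory. It is illustrative only: the bits-per-pixel figures are approximate rates for common block-compressed formats, and the asset sizes are hypothetical.

```python
# Rough GPU memory cost of a texture, including a full mip chain (~33% extra).
# Bits-per-pixel values are approximate rates for common block-compressed formats.
BITS_PER_PIXEL = {"BC1": 4, "BC3": 8, "BC7": 8}

def texture_cost_mb(width, height, fmt="BC7", mips=True):
    bits = width * height * BITS_PER_PIXEL[fmt]
    if mips:
        bits *= 4 / 3  # the mip chain adds roughly one third
    return bits / 8 / (1024 * 1024)

# A single 4K BC7 texture is ~21 MB; a character with 4K base colour,
# normal, and packed mask maps is ~64 MB before anything else loads.
print(f"4K BC7 texture:  {texture_cost_mb(4096, 4096):.1f} MB")
print(f"Three 4K maps:   {3 * texture_cost_mb(4096, 4096):.1f} MB")
print(f"Same maps at 2K: {3 * texture_cost_mb(2048, 2048):.1f} MB")
```

Halving the resolution of those three maps recovers roughly 48 MB for a single character; multiplied across a cast and an environment set, that is the difference between fitting in memory and not.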
Establishing clear technical budgets provides teams with the information they need to work efficiently, preventing wasted effort. This discipline is the difference between a smooth, iterative production and a last-minute, panicked scramble to optimize.
Decide Early and Commit
A primary responsibility of an Art Director during pre-production is to collaborate with engineering and design to define and commit to the project's technical pillars. This is a negotiation.
Key decisions include:
The target frame rate and resolution, which together set the bar for final image quality and responsiveness.
The rendering pipeline and lighting model. Lighting model choice has a huge impact on performance on ninth-generation consoles.
Budgets for draw calls, memory, and shader complexity (a minimal sketch of such a budget follows this list).
Identifying and prototyping solutions for high-risk visual features, like dynamic hair, water, or dense foliage.
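As a sketch of what committing to these pillars can look like in practice, the snippet below writes a budget down as data that tools and build steps can check against. Every name and number in it is a hypothetical placeholder, not a recommendation for any particular project.

```python
# Hypothetical per-mode technical budget, agreed between art, design, and
# engineering in pre-production. The numbers are illustrative placeholders only.
from dataclasses import dataclass

@dataclass(frozen=True)
class TechBudget:
    target_fps: int
    internal_resolution: tuple[int, int]  # before upscaling
    max_draw_calls_per_frame: int
    max_texture_memory_mb: int
    max_shader_layers: int

BUDGETS = {
    "quality": TechBudget(30, (1920, 1080), 4000, 3500, 4),
    "performance": TechBudget(60, (1600, 900), 2500, 3000, 2),
}

def within_draw_call_budget(measured: int, mode: str) -> bool:
    """A check a nightly build or profiling pass could run against captured data."""
    return measured <= BUDGETS[mode].max_draw_calls_per_frame
```

The value is not the code itself but the fact that the commitment exists in one agreed, checkable place rather than in scattered conversations.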
These constraints are not limitations but solid guideposts that allow for creative iteration and refinement. Technical fidelity is defined in part by the art direction goals for the visual aesthetic and by the production quality target for the project.
Making these decisions early is crucial. Failing to do so results in rework, wasted effort, and cascading delays across all teams. This indecision is a common source of project delays, where teams over-promise on a visual target that the budget, team size, or hardware simply cannot support. The project then bleeds time on optimizing content that was built at the wrong fidelity from the start, time that would have been better spent polishing the game.
Know Your Target
It is an Art Director's responsibility to be aware of the impact of visual style and aspiration on technical fidelity and, ultimately, on performance.
If a key priority of a project is a high visual bar, utilizing ray tracing and micro-geometry detailing, then the targets will lean towards 30 FPS with lower native render resolutions that take advantage of upscaling techniques such as DLSS and FSR. Complex shaders can be used more liberally on screen and can support up to four layers to facilitate more detailed visual storytelling and contextual wear.
If the priority is response time and a solid experience for competitive gaming, the focus should be on less resource-heavy assets, fewer layers in shaders (a maximum of two, for instance), and less use of complex layered shaders and unique materials overall. The team should aim to use as many single-layer assets as possible, maintaining tighter budgets on texture use and geometry density per asset.
If complex lighting solutions are desired, then understand the cost of ray tracing and budget for it meticulously, balancing the resource use of the other content to pay for it. If the priority of a project is to ship on the maximum number of hardware platforms, then the more resource-hungry technologies need to be avoided to accommodate the lower-end hardware, which will have slower CPUs and GPUs and less available RAM.
In each of the above scenarios, the fidelity and complexity of the content need to be considered and balanced appropriately.
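A budget like the two-layer shader limit above only helps if it is enforced while content is being made, not discovered in a late profiling pass. The sketch below shows one way a pipeline might check assets at export time; the material representation and the limit itself are assumptions for illustration.

```python
# Hypothetical export-time validation: flag assets whose materials exceed the
# shader-layer budget agreed for the project (two layers in this example).
MAX_SHADER_LAYERS = 2

def validate_asset(asset_name, materials):
    """materials: mapping of material name -> number of blended layers."""
    return [
        f"{asset_name}/{name}: {layers} layers (budget is {MAX_SHADER_LAYERS})"
        for name, layers in materials.items()
        if layers > MAX_SHADER_LAYERS
    ]

# An empty list means the asset is within budget.
print(validate_asset("crate_large_a", {"M_Wood": 1, "M_Wood_Mossy": 3}))
```

Failing fast at export keeps the conversation between the artist and the budget, rather than between the whole team and a blown frame time months later.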
Benchmark and Profile
I’ve seen too many projects skip this critical step. If you don't create representative, stress-tested content to prove you can hit your technical and visual targets, how do you know it's even possible? You don't, and that uncertainty will create major problems later.
The process of defining these benchmarks is never linear, but getting the known elements handled early frees up development time for the unknowns.
Examples of elements to benchmark include:
Texel density across all asset types, based upon the intended camera POV and target resolution (a worked example follows this list).
Mesh density standards (polycounts).
The number of materials allowed per asset.
Sensible approaches to sharing across assets to bring render costs and memory footprints down.
How granular to make modular kits.
The complexity and number of layers in shaders.
Density of decal use and VFX.
Lighting best practices to be optimal yet achieve the desired look.
Defining solutions for complex game features such as destruction, deformable landscapes, interactable and dynamic assets.
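As an example of how concrete these benchmarks should be, here is a small worked texel-density calculation; the density targets and asset sizes are hypothetical and would be set per project and per camera distance.

```python
# Texel density = texture pixels per metre of world-space surface.
# Targets such as 512 px/m for hero assets are illustrative only.
def texture_size_for_target(surface_metres, target_px_per_m):
    """Smallest power-of-two texture edge that meets the target density."""
    needed = surface_metres * target_px_per_m
    size = 64
    while size < needed:
        size *= 2
    return size

# A 2 m hero prop at a 512 px/m target needs a 1024 texture,
# while an 8 m background mesh at 64 px/m only needs a 512.
print(texture_size_for_target(2.0, 512))  # -> 1024
print(texture_size_for_target(8.0, 64))   # -> 512
```

Agreeing numbers like these early means every artist sizes textures the same way and memory budgets stay predictable.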
Profiling is the crucial next step to validate your decisions and refine the pipelines and visual targets to better align with the overall project goals. Measure performance with tools like PIX, RenderDoc, or Unreal Insights, and assess the data to make informed decisions with the team.
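As a sketch of what assessing that data can look like once a capture is exported, the snippet below reads a hypothetical CSV of per-frame GPU times and reports how often the frame budget is blown. The file name, column name, and 60 FPS budget are assumptions; adapt them to whatever your profiler actually exports.

```python
# Minimal sketch: count frames that exceed the GPU budget in an exported capture.
# Assumes a CSV with a "gpu_ms" column; this is a placeholder, not a real tool's format.
import csv

FRAME_BUDGET_MS = 16.6  # 60 FPS target

def frames_over_budget(path):
    with open(path, newline="") as f:
        times = [float(row["gpu_ms"]) for row in csv.DictReader(f)]
    over = sum(1 for t in times if t > FRAME_BUDGET_MS)
    return over, len(times), max(times, default=0.0)

over, total, worst = frames_over_budget("benchmark_capture.csv")
print(f"{over}/{total} frames over {FRAME_BUDGET_MS} ms (worst: {worst:.2f} ms)")
```

Trends in numbers like these, tracked per benchmark scene over time, tell you whether the content is drifting away from the target long before it becomes a crisis.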
A golden rule: do not enter full production or scale your team until you have benchmarked one of every major content type to near-final quality. While this investigation is happening, the game can be developed in a simple "whitebox" or "blockout" state. Proving out gameplay functionality early and simply is far more effective than trying to fix it when most of the art is already built. It's always cheaper to cut or revise content in this state.
Conclusion
Achieving consistent performance is a discipline, not a mystery. It is always better to scale up from a performant foundation, increasing resolution or adding detail as production progresses, than to desperately scale down, hacking away at overproduced content. A healthy development cycle is one of continuous improvement, where the game's visual quality and performance improve together.
Build to perform from the start. Art direct solutions that balance production cost, visual fidelity, and performance. There is no single tool or setting that will hit 60 FPS for you, but proven approaches and sensible, considered choices can make the journey to achieving it much smoother. Your team will be less stressed, your producers will be happier, and your players will get the smooth, polished experience they deserve.
Set clear technical targets for your platforms early in pre-production.
Develop content specifically to meet those targets.
Establish credible visual and technical benchmarks.
Profile and test those benchmarks constantly to validate decisions.
Polish and add fidelity over time in a controlled, considered manner.