The slides from Tim Sweeney's (Epic Games) POPL talk are available; it is a deeply technical presentation (PPT, "The Next Mainstream Programming Language: a game developer's perspective") worth noting for a number of reasons. The most important is its hint of a looming challenge confronting the games industry. Will the industry need to change how it works with its software?
While the issues presented are not unique to the games industry, they will likely be felt there sooner. In so many ways the games industry has led the rest of the pack by way of technical challenges, solutions, and breakthroughs. The problem of software engineering was hinted at - albeit vaguely - in an earlier discussion, when the distinction was made between virtual world design and its add-ons (see "Lipsticking the Chicken"). The question here is a more vexing extension: if I can't practically express it, how can I build it?
What follows is the first in a series of posts planned on software and virtual worlds. There are large forces in play that shape what worlds look like today and into the future - forces rivaling the imaginations of designers...
A programming language and a programming paradigm can shape how we engineer a world. As with our natural languages, perhaps there is a cognitive dimension, but without having to reach even that far it is safe to say that engineering practices establish approaches to problem-solving that bias solutions. These practices are hard to ignore in especially high-stakes, risk-averse software development environments. Thus our first big question: can game software development, as it is now conducted, scale in the face of advances in hardware, appetite for content, and capped costs?
Our story now migrates to *objects*. If Code is the Law in our realm, then the modern conceptualization of code (see Footnote ) often aspires to be object-based. The craft of software objects is then Object Oriented Programming, even if it is only sometimes realized. By and large, object-oriented software design has been a cultural touchstone for nearly a generation of software developers and designers - objects provide a convenient and intuitive means of partitioning/decomposing problems and mapping them onto code building blocks. Challenges emerge, however, when one scales interactions from small numbers of objects to large sets of objects. Throw in parallel threads of computation and all hell breaks loose. Why the concern with large numbers of objects? Well, that is arguably where gameplay simulation is heading.
This is where Tim's slides enter our stage. They worry over a particularly difficult and central problem: how to have large numbers of objects interacting across many threads of computation. On the one hand this may seem like technical arcanum, but note that we all often presuppose this point in our discussions and comments on Terra Nova and elsewhere. It is how most of us conceptualize a simulation. We talk to the illusion of a world with many concurrent activities and speak, at least metaphorically, to the agencies that can live in such places (e.g. Non-Player-Characters and Player-Characters interacting with shared world state). In fact, today such parallelism is a fiction - most games are implemented within a single simulation thread (they just iterate through all the objects quickly but in sequence... "butcher before baker before the cat jumps over the moon..."), but this is likely to change, perhaps very soon.
A question for the future is how to implement larger simulations with more objects. In a Gamespy.com article a while ago, Tim Sweeney stated that while the last ten years of programming progress were about objects, the next ten years will be about "ecosystems of objects." One problem looking forward is how to work reliably with game simulation objects in parallel (see "concurrency"). As he points out, the approach of today using mainstream programming languages is to manually synchronize object state - a developer has to explicitly lock/unlock the bits of the object and figure out how it should share with other objects ("shared state concurrency"). This won't scale - it is too error-prone and too complicated to implement over large object sets. It is also expensive (it demands skilled developers). Thus, we stand at the edge of the abyss, looking out onto worlds filled with plains of bugged tribbles.
Beyond software engineering there have also been subtler claims favoring parallelized code. Assuming tools and practices catch up (a big if), can it lead to more fine-grained definitions of game simulation behavior? Fewer quest chains, more negotiation? If true, this could mean that content creators / script-writers will be able to more naturally express game-world behaviors - allowing them to produce more cost-effective content. This would be in contrast to how scripting and coding is currently done in games (imperative-styled programming, see also other related discussion: "Nested Worlds") using approaches that are arguably non-scalable (labor-intensive).
If the suggestion sounds pie-in-the-sky, consider that we're likely on our way. Tim identifies three types of code in a game: shading, numeric computation, and gameplay simulation.
Shading code is presented as already "data parallel" (within the GPU - the Graphics Processing Unit/graphics card). Numeric computation (the building blocks of path-finding, physics, and collision detection - an interesting statistic: roughly 80% of the CPU work in the Unreal Engine can be parallelized at this level) is well partitioned and should be straightforward. The challenge therefore lies with the last category. Gameplay simulation is where Tim identifies the bottleneck for the industry. Given the number of code pieces (objects) and the degree to which these pieces must interact, the problem is hardest there.
A solution is forwarded in the slides (in geek: "software transactional memory"). Regardless of the merits of this particular approach, the more general question lies with the inertia of culture, technology (tools), and legacy (existing code): how quickly can new techniques and skills be adopted? Too risk-averse, or ready or not? 2009 was mentioned as a date to watch. Tim's prediction: CPUs with 20+ cores, 80+ hardware threads, 1 TFLOP of computing power, and GPUs with general computing capability...
Scary stuff if you want to take full advantage of this!
The trouble with tribbles is that there are so many of them and they do multiply so. How ever to organize and herd them in worlds to come?
Tim forwards a nice couple of slides quantifying the amount of software that goes into a modern game, e.g. Gears of War:
- ~24-month development cycle
- 1 middleware game engine
- ~20 middleware libraries
- Gears of War gameplay code: ~250,000 lines of C++ and script code
- Unreal Engine 3 middleware game engine: ~250,000 lines of C++ code