Inertia

Nostalgia

This topic has come up on IRC a couple of times, and since many people here are working on games, it pretty much deserves its own forum discussion ;)

In my opinion, the most questionable term is "next gen", which is used way too often. The most notable change in games since the arcade classics is the integration of proper collision/physics engines instead of rough approximations. Sure, the standards for graphics (and the HW requirements) are set a little higher every year, but nothing really ground-breaking has happened to gameplay or A.I. on the PC in the past years. Unreal Tournament is imho a good example: I loved the initial version of that game because it came with the 'Assault' game mode, where you had deeper objectives than just 'kill enemies, don't kill teammates'. But its 2003 and 2004 sequels and Unreal Tournament 3 didn't really tempt me to play it again at all, not even with a shiny 'now with vehicles!' sticker put on them.
From a marketing standpoint their decisions are surely sound (never change a successful concept), but please don't call graphics updates 'next gen' :P
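To make the collision/physics contrast concrete, here is a minimal C++ sketch (my own illustration, not how any particular engine does it): the arcade-era 'rough approximation' is a binary bounding-box overlap test, while a physics engine actually integrates motion over time, so collisions can have plausible responses instead of a simple hit/no-hit:

    #include <iostream>

    // Arcade-era "rough approximation": axis-aligned bounding boxes.
    struct AABB { float x, y, w, h; };

    bool overlaps(const AABB& a, const AABB& b)
    {
        return a.x < b.x + b.w && b.x < a.x + a.w &&
               a.y < b.y + b.h && b.y < a.y + a.h;
    }

    // What a physics engine adds: integrating velocity and forces over
    // time (here a bare-bones explicit Euler step), so motion and
    // collision responses emerge from simulation rather than lookup.
    struct Body { float x, y, vx, vy; };

    void step(Body& b, float dt, float gravity = -9.81f)
    {
        b.vy += gravity * dt;  // accumulate acceleration
        b.x  += b.vx * dt;
        b.y  += b.vy * dt;
    }

    int main()
    {
        AABB player{0, 0, 1, 1}, wall{0.5f, 0, 1, 1};
        std::cout << "overlap: " << overlaps(player, wall) << '\n'; // 1

        Body crate{0, 10, 0, 0};
        for (int i = 0; i < 60; ++i) step(crate, 1.0f / 60.0f); // 1 second
        std::cout << "crate y after 1s of free fall: " << crate.y << '\n';
    }

A real engine layers broad-phase culling, contact resolution and constraint solving on top of this, but the basic difference is already visible: the first snippet answers "did we touch?", the second describes how things move.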

Games like Starcraft, Diablo, Counter-Strike, Battlefield and World of Warcraft have heavily outdated presentation, yet still attract millions of players every day. Ask yourself: why is that?

According to http://www.vancouver.wsu.edu/fac/peabody/game-book/Coverpage.html the primary motivation to play games is to learn. Secondary motivations are fantasy/exploration, nose-thumbing, proving oneself, social lubrication, exercise, and the need for acknowledgment. (Mind that he's not writing about video games specifically, and it's probably more complicated than just those few motivations.)

All the games listed above fulfill these basic needs, though (if you can live with exercise == moving the mouse), and what makes them stand out from the competition is that they are easy to get into, but difficult to master - providing a challenge even for experienced players. This is also true for many arcade games: the controls are very simple, but the further you get in the game, the harder it gets.

Imho this is often overlooked in games; their development feels more like creating a graphics demo first and then attempting to tweak it into a game.


Comments

objarni

I agree. :)

the Fiddler

I think gamers fall into three wide categories:

First, the casual gamers: Flash-based games, puzzles (Tetris!), etc. They prefer games that are simple to get into and enjoy. This is probably the largest market in terms of sheer numbers (and Nintendo understands this very well). It might also be the easiest market for an indie developer to break into.

Then, the nu-gamers (tm). These value presentation the most (e.g. Counter-Strike: Source is better than Counter-Strike, because it has prettier graphics!), even if the substance is not improved. It's a hard market for a new developer to break into: the risks are high and, without a large studio to back you, the costs are prohibitive (and, if anything, steadily climbing).

Last, the hardcore gamers. As in the other categories, there are many shades of gamers here: from those who play StarCraft simply because nothing better has come along in the past 10 years, to the extremists (give me gameplay!) who play text-based MUDs or the original Ultima Online.

Most large studios today cater to the second demographic. Once they have a franchise down (Unreal Tournament, GTA, Halo), they keep the core gameplay more or less the same and improve on the graphics and physics. It's only natural, too: they try to minimize the risks associated with new and unproven material, and count on nu-gamers to keep buying the n+1 version of the game.

There's a problem with this approach, however: as graphics get better, it's becoming increasingly difficult to make improvements. It's both a matter of cost (AAA titles are getting more and more expensive to produce) and a matter of diminishing returns. As graphics get better, players start looking for (and getting annoyed by) increasingly small details (before Crysis, had you ever heard anyone complain that felled trees don't fall in a completely realistic way? :p)

Personally, I find myself thinking differently when playing an old-school 8- or 16-bit era game than when playing a current graphics-laden title. In older games, my imagination likes to fill in the details. The graphics may not be realistic, but this actually makes the game easier to get into. In newer games, I expect the details to be perfect - anything that isn't stands out like a sore thumb (CoD4: "the lighting on these guys is all wrong!", or "couldn't they have used a higher-res texture here? Cheapskates.").

The uncanny valley effect? Possibly. And maybe that's why Nintendo opted out of this market (a decision which seems to have paid off).

Inertia

...as graphics get better, it's becoming increasingly difficult to make improvements.
There's another problem related to this. Mods are often used as a selling point ("it's ok that our game ships with only 5 maps, the community will add more"), but that's not entirely true anymore. For past-generation games (Half-Life, Quake 3 etc.) it held, because the amount of detail in levels and characters was manageable even for hobby designers. Doom 3 marked the point where this began to change: as of today I haven't seen a single user-made player model for Doom 3 (I've searched for some to test the ms3d loader), and barely any levels/mods at all (most existing 'mods' are more a tweaking of weapon behaviour, or changes to its shaders, e.g. disabling all lights). This seems to be the price to pay for increasingly detailed graphics.

...complain that felled trees don't fall in a completely realistic way?
Now you're making me curious: what does 'a completely realistic way' to chop down a tree with a machine gun look like?

The graphics may not be realistic, but this actually makes the game easier to get into.
Interesting observation; this also matches the motivation "fantasy/exploration".

Edit: Where does World of Warcraft fit into your categories? Casual?

Inertia

The graphics may not be realistic, but this actually makes the game easier to get into.
Actually, Tomb Raider proves this right. The first games were quite popular, but after the first movie the sales fell way below expectations. Discussing this with a friend brought up an interesting insight: he said the game didn't appeal to him because a) Lara Croft in the game doesn't look like Angelina Jolie and b) he doesn't find Ms. Jolie appealing either way.
This brought me back to your point: before the movie, the character was quite abstract. Lara Croft could have been some girl you know (maybe what she'd look like at the age of ~25), some girl you have seen somewhere, or pure imagination. It doesn't matter much which option actually applies to you; the important fact is that the character was open to interpretation, while using an actor to personify the character reduces this to a judgement of that actor.