As I’m given to understand it, Sony announced the other day that the astonishing perfection of the CD-based Playstation, which had been superseded by the even more astonishing perfection of the DVD-based Playstation 2, which had then been rendered obsolete by the Blu-Ray vision that was the Playstation 3, had in turn become an outmoded and worthless piece of crap compared to the forthcoming Playstation 4. The reaction I’ve heard (which is, admittedly, reading a bit of Penny Arcade and Kotaku, because I didn’t even bother with the PS3, let alone care deeply about a hypothetical game console) is a resounding “meh”. Why might that be? Apart, of course, from the fact that they wouldn’t even show people the console at the unveiling press conference.
I’m starting to form a radical theory about why the latest generation of game consoles (Wii U, PS4, and whatever Microsoft is going to unveil in a week or two) have gotten such lukewarm receptions. It’s probably going to be controversial…heck, I’d say there’s no more than a 50/50 chance that it’s right…but I think that there’s not actually as much incentive as the game companies think there is to improve the performance of their consoles. In short, I’m starting to believe that the age of video game hardware improvements is coming, slowly but surely, to a close.
This isn’t to say that they can’t do any better. I’m sure that processing power is improving on a very impressive curve, if only because the experience of my entire life has been one of computers getting constantly better (when I was born, the brand-new state-of-the-art personal computers had 16K of RAM). While improvement in computing does have to plateau at some point, we haven’t found that point yet and we don’t even really know where it might be. So this isn’t a “man will never reach the moon” type rant about how they’ve reached the limits of how good a gaming console can get.
It is, however, true that there is a point of diminishing returns in terms of how well these improvements in processing power will translate into practical, measurable gains in the finished product. The previous jumps in technology have been immediately obvious to the average consumer; someone playing an Atari 2600 would see the NES as a clear improvement over their machine, just as the Super Nintendo was obviously better than the NES, the Playstation was immediately better than the Super Nintendo, and the PS2 was eye-poppingly better than the PS1. But with the PS3, there wasn’t the same level of “eye candy”. It was better–of course it was better–but it wasn’t a quantum leap the way the PS2 had been over its predecessor. That was the opening that let the not-quite-as-powerful but much cheaper Wii in to lead the market; it wasn’t as good as the PS3 or the XBox 360, but it was good enough and it was about $350 cheaper.
And while the PS4 is bound to be an improvement over the PS3, developers already have enough processing power to make astonishing games that are beyond our expectations. Slight improvements in skin texturing and particle physics aren’t going to make the next Batman game better than ‘Arkham Asylum’; clever game design and innovations in playability will. Being able to put more figures onscreen at any given time, each doing different things, isn’t what makes a game great; if it were, ‘State of Emergency’ wouldn’t have been as dull as dishwater. We have reached a point where it’s not how much power your machine packs, it’s what you do with it.
The question is, what does this mean for the industry? Because it’s clear that Sony/Nintendo/Microsoft have a marketing strategy that involves avoiding market saturation for their product by putting out a “new, improved” version every six or seven years and coercing the software companies into supporting only the latest toys. But if the buying public doesn’t make the jump, what does that do to the company that just spent hundreds of millions of dollars on R&D? Can software companies afford to gamble that there’s no need to design games for the next-gen platforms? Or is the system simply “too big to fail”? Will the buying public just continue to buy next-gen consoles simply because the assumption that they must be better has become too ingrained? Something tells me there has to be a reckoning at some point for the continuing push for an ever-better system at all costs, but I can’t predict when it will come.
If you think you can, feel free to share it here! I’ll buy stock based on your recommendations, but hey, no pressure.
29 users responded in this post
It’s not just that; making these new super-HD games is *expensive.* There’s a reason you see so many complaints about how all games look the same these days.
A lot of the people I chat with are saying that the next generation of consoles might be the last one. Between mounting triple-A costs, the semi-revival of PC gaming, and the rapid growth of mobile gaming and microtransactions, games are having an increasingly hard time selling on graphics. Consoles are either going to completely overhaul the way they currently operate, or die.
The big three are already pretty well aware of these points, I think. They already extended this console generation a bit longer than previous ones (Sony ran the PS1 and PS2 for six years each, but has already run the PS3 for seven, and it might make it another year before the PS4 launch). Furthermore, Nintendo’s most recent consoles have really marketed themselves on their bells and whistles, not their tech specs. It seems to me that the big companies are already trying to find a way to differentiate their place in the market while still trying to defer the process a bit longer.
I think the next console is going to have to be along the lines of the Oculus Rift, because you’re right – consoles will never win the graphics race, and the only real advantage they have besides certain IPs is that you can play them on your couch on the TV. And Steam’s Big Picture has eliminated that too. The biggest feature the PS4 seems to have (besides zomg graphix!!) is a built-in ability to record Let’s Plays. TLDR, invest in Valve 🙂
Speaking as a more or less former gamer, people will get whatever console the newest games come out on. Three or four years into the PS3’s lifecycle there were still games being made for the PS2, but you didn’t have to guess which version was better.
As for a resurgence of PC gaming, you can buy every Sony, Nintendo, or Microsoft console ever made and barely match the cost of a decent gaming PC. Particularly one that won’t be hopelessly outdated in two years.
I think you’re wrong—or at least right for the wrong reasons. Hardware improvements aren’t at an end. Sure, we’ve already hit diminishing returns when it comes to photorealism, but in terms of authentic-feeling gameplay there’s a long way to go. A lot of games look good in screenshots, but lighting, motion, and physics in general can contribute just as much to the appeal and even beauty of a good game.
What’s more, design and innovation also owe at least a little to processing power. The ability to use more detailed physics models is crucial both to fun, concept-y, physics-toy games like Hydrophobia and Portal 2, and to realistic materials and destructible environments (which open-worlders have been clamoring for since open-world became a thing, and which would present all kinds of new design challenges). Sure, 1000 zombies doesn’t feel much different from 500—but the same advance in computing power could make for a bridge you can really burn, and that feels a hell of a lot different from one you can’t. Clever AI is a huge demand on processing power too, and with all the emphasis on graphics it’s hardly advanced since Half-Life. The same goes for the sort of simulation that goes into GTA or Skyrim.
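To make the bridge example concrete, here’s a toy sketch (my own, in Python; the plank model and the IGNITE_CHANCE and BURN_FRAMES constants are all invented for illustration, not from any real engine) of why a really burnable bridge is a compute problem: fire spread is just a per-frame simulation, and every plank update competes for the same frame budget those extra 500 zombies would eat.

```python
# Toy sketch: fire spread over a bridge's planks as a per-frame
# cellular automaton. Every feature like this competes for the same
# frame budget that more onscreen characters would consume.
import random

IGNITE_CHANCE = 0.2   # chance an intact plank catches from a burning neighbor
BURN_FRAMES = 30      # frames a plank burns before collapsing

def step_fire(planks, timers):
    """Advance the fire one frame. planks[i] is 'intact', 'burning', or 'gone'."""
    next_planks = planks[:]
    for i, state in enumerate(planks):
        if state == "burning":
            timers[i] += 1
            if timers[i] >= BURN_FRAMES:
                next_planks[i] = "gone"   # plank collapses; pathing must update
        elif state == "intact":
            neighbors = [j for j in (i - 1, i + 1) if 0 <= j < len(planks)]
            if any(planks[j] == "burning" for j in neighbors):
                if random.random() < IGNITE_CHANCE:
                    next_planks[i] = "burning"
    return next_planks

# a 20-plank bridge, set alight at one end
planks = ["burning"] + ["intact"] * 19
timers = [0] * 20
for frame in range(300):
    planks = step_fire(planks, timers)
print(planks)  # very likely mostly 'gone' by now: a bridge you really burned
```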
You might make the argument that the PC will kill the console—let’s face it, better and faster PC parts have a wider audience, more demand, and a much faster release cycle—but I agree: people will keep buying new consoles as they come out, at least for the foreseeable future. After all, they bought the iPhone every time, too.
I’m not much of a gamer. Actually I AM a gamer, but I just don’t give a damn about shooters, and that seems to be most of what gets made for the PS3 and X-Box. I’d like to support Nintendo for making games I actually want to play, but they repeated with the Wii U the same mistake they made with the original: they put out an expensive console that can’t even play Blu-rays.
No amount of improved graphics is going to make me want to buy a next-gen console that has priced itself out of my budget.
If Shadowrun Returns will run on the Ouya, I won’t need another console for a few years.
It’s worth keeping in mind that the PC people do tend to vastly, vastly overestimate how important they are to the market, and tend to have a neurotic meltdown whenever someone challenges it (case in point: RPS in response to the Destiny thing). I say this as the owner of a big expensive gaming PC: they are a terrible, awful group of people.
There’s plenty of room for improvement that future consoles can take advantage of. Adding social media sharing buttons to controllers and requiring a Kinect hooked up for all console use are just the beginning. To further enhance your video game marketing experience, your social media reactions along with your console’s usage details will allow optimized content recommendations and targeted advertising. While the rising cost of game development may be a concern, newer consoles can help remedy this. An improved, more personalized console can aid developers by easily determining what aspects of a game the user most enjoys. Then those aspects can be immediately locked as DLC, providing additional revenue for the developer. A new, enhanced Kinect can help enforce this exclusive, personalized gaming experience by utilizing a revolutionary facial recognition system to ensure locked and premium content are only accessible by the user who purchased them. Also, blast processing.
Greg’s predictions seem the most accurate and most horrific. I pretty much gave up on new gaming ages ago (still thinking about picking up a Wii, that’s how behind I am), but those things would ruin it for me if I hadn’t.
Whatevs. I still get my enjoyment out of Fire Emblem and No Mercy. My idea of gaming is probably really antiquated and arcadey, but I’m also not spending hundreds of dollars a year on being led by the nose through glorified action movies made by artless nerds.
“As for a resurgence of PC gaming, you can buy every Sony, Nintendo, or Microsoft console ever made and barely match the cost of a decent gaming PC. Particularly one that won’t be hopelessly outdated in two years.”
Sure, if you buy an entirely new computer every time you start getting slowdown. All you really need is a good graphics card and some RAM to spare, and you can use your home computer without too much trouble. It shouldn’t set you back more than $300. Hell, you can do it over and over, and make Alienware cry.
I don’t think you’re being controversial enough. The graphical barrier for entry into the video game market never existed. The Nintendo Wii was the bestselling console in the previous generation, and was, by far, the worst graphically. The whole reason this market even exists is that video game manufacturers were among the first to come up with a good reason for consumers to buy personal computers.
After a few decades, that first mover advantage is finally waning as other manufacturers have figured out an even more compelling way to get computers into the hands of more people: smartphones and tablets.
Apple has already sold over 500,000,000 iOS devices over the past six years, more than the combined sales of every DS, Wii, PSP, Xbox 360 and PS3. Games are sold for less than a cup of coffee, hardware updates every year, everyone needs a phone, and their platform is significantly friendlier to developers.
Did I mention Apple’s platform isn’t even the market leader?
We nerds complain about the lack of a proper controller, or AAA titles, but the reason the Wii was a success was that it expanded the market for gaming. Smartphones and tablets didn’t just expand that market; they’ve destroyed it. Look at retail video game sales: they’ve been in decline for years now. The Wii U and 3DS have consistently sold below Nintendo’s expectations, and the PS Vita tanked. It’s no longer enough to be better than the other consoles; you have to convince people that you’re better off spending $300-$400 on a console than on a new smartphone or tablet.
Good luck with that.
The Playstation is not better than the SNES in any possible way. Even graphically, the SNES has cleaner graphics and far better playability.
The popularity of low-resource games like “Angry Birds” and “Minecraft” should have been the wake-up call about the threshold described above.
Innovative design is a risk, and risks are not what Sony/Nintendo/Microsoft’s investors want to see. You gotta do what was successful in the past to make investors comfortable. And if it fails, well, “I did everything right, so it’s not my fault the end result wasn’t successful. Must be something outside the recipe, like software pirates.”
An argument analogous to this can be applied to the trend of “blockbuster” movies as well. May be a cultural thing, not an industry thing.
“All you really need is a good graphics card and some RAM to spare, and you can use your home computer without too much trouble. It shouldn’t set you back more than $300.”
Which is, ironically, the mature cost of a console.
Console gaming isn’t dead yet, nor is it on its last legs. The focus is changing, though, from stand-alone games to a networked entertainment system; I’m big into Xbox Live and have watched it grow from a rather clumsy multiplayer interface to an extremely sophisticated social platform with features I’d like to see adopted on the PC side.
I agree that hardware spec increases don’t have the same impact in press conferences, and I agree with the reasons John lists. However, I think that game studios are getting savvier about how to use the extra horsepower now that “more polys” is tapping out. (The devs I follow on Twitter were overjoyed to see the PS4 carry 8GB of unified RAM and were rubbing their hands at the prospect of using it for whatever they chose instead of it being portioned off into dedicated systems.)
— Steve
At this point, it’s not about making a prettier game. Sports titles already look, at first glance, like actual broadcasts. That actually makes the limitations of the AI even more embarrassing – two photorealistic dudes bouncing off each other like sprites is way more jarring than two sprites bouncing off each other like sprites. More processing power makes it easier to brute-force good AI.
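To show what “brute-force” means here, a minimal sketch (mine, nothing to do with any actual sports engine): a depth-limited minimax search over the toy game of Nim, where the only difference between a blundering opponent and a perfect one is how many plies of lookahead the CPU budget allows. Same code, more cycles, smarter-looking play.

```python
# Depth-limited minimax over Nim: take 1-3 sticks per turn; whoever
# takes the last stick loses. DEPTH stands in for the per-frame CPU
# budget: more power means deeper search, with no cleverer code.

def minimax(sticks, depth, my_turn):
    """Score a position for the searching player: +1 win, -1 loss."""
    if sticks == 0:
        # the previous mover took the last stick and lost
        return 1 if my_turn else -1
    if depth == 0:
        return 0  # out of budget: call it a draw (this is where weak AI comes from)
    scores = [minimax(sticks - take, depth - 1, not my_turn)
              for take in (1, 2, 3) if take <= sticks]
    return max(scores) if my_turn else min(scores)

def best_move(sticks, depth):
    return max((take for take in (1, 2, 3) if take <= sticks),
               key=lambda take: minimax(sticks - take, depth - 1, False))

print(best_move(15, 2))   # shallow budget: returns 1, a blunder
print(best_move(15, 15))  # deep search: returns 2, leaving 13 (1 mod 4),
                          # a known losing position for the opponent
```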
Three things:
First, everything switchnode said. People tend to focus on graphics, and sure, graphics are important. But there’s a ton of shit games do badly today that will continue to be improved by increased processing power. If you’ve ever been annoyed by bad AI, by clipping, or by your character being barred from doing something they obviously should be able to do because of programming constraints, you’ve seen why consoles need more power and storage.
And for that matter, graphics shouldn’t be overlooked. Are companies still pre-rendering cutscenes to make them look better? If the answer is yes, then there remains room for improvement, because it means on-the-fly rendering ain’t there yet.
This is not true, and I wish people would stop saying it.
All of those consoles together, purchased at their original price points when released, come to something like three grand. More in inflation-adjusted dollars; the original NES was priced at $200 in the mid-eighties, which is like $400 today.
My current gaming rig was assembled by me in 2008. It has a Core 2 Quad processor, what was, at the time, an extremely high-end video card, and a monitor and storage that, while a bit outdated now, were likewise cutting edge at the time. It ran me around 1500 dollars all-inclusive. It’s lasted for five years and probably has another year or two of life as an excellent gaming platform (I can already run Heart of the Swarm on ultra, and that’s a AAA new release), and there have been VANISHINGLY few games for it I can’t run at something approaching their highest graphics settings.
And if and when I build a new one, it will cost LESS than this one. Because the case is still good, and so is the monitor.
The days when companies would release games that actually couldn’t be run at their highest settings on consumer-grade hardware are over ten years in the past, as are the days when you had to upgrade every year. A competently assembled high-end box is still expensive but will now last you the better part of a decade.
As for the Wii: everyone brings up the Wii. The Wii is an argument that you can succeed based on good games with less oomph behind them. Yes. True.
Let’s not forget that Nintendo had an enormous stable of intellectual property it built up over years of trying (and often succeeding) to be the first name in gaming. It leveraged the fuck out of that. A lot of the initial good press and good word of mouth was in fact driven by serious, not casual, gamers, who were picking it up based on the fact that Nintendo always solidly executes its IPs (Other M notwithstanding).
It was also competing on price; I’ve seen people upthread complaining that the Wii was overpriced. It dropped at $250. It was, in adjusted terms, CHEAPER than the SNES and NES were when they debuted.
Let’s not pretend these are advantages, or a model, any company BUT Nintendo could have or follow.
As a developer, I have to say that this misses the boat in some ways. Clever game design is far easier when you don’t have to battle the constraints of the hardware at every turn. Yes, there are examples of innovative gameplay that came about because of such struggles. Far more common, however, are the innovations that were enabled by greater hardware capability and are now taken for granted as just something you find in most modern games.
So, yes, you might not notice the crazy graphics jump between the PS3 and PS4. However, you might notice a more immersive world. Other enhancements will be things you won’t even notice. A lack of load screens, for example: you might not notice when they don’t happen, but you do notice when you have to wait a minute or two with some loading graphic running.
The thing about consoles going forward is that they are no longer concerned solely with being gaming systems; they are attempting to become all-purpose entertainment centers. Or, in other words, to become PCs, which is what the Xbox always was and what the PS4 essentially is.
Also, what in God’s name are you doing reading Kotaku? Read a real news source like Rock Paper Shotgun or Polygon or something, you’ll be better off for it, I promise.
It’s interesting that you compare it to reaching the moon. If the Apollo 13 movie is any indication, people even started seeing continued moon landings as “routine” because modest improvements after the Apollo 11 landing were considerably less exciting. Then it nearly crashed and that was really exciting, so I’m sure I could torture another metaphor there somewhere.
If they made consoles able to accept game mods like PC games can, I’d care. As it is, I’ll probably buy the XBox 720 and possibly, though not probably, the first Playstation I’ve ever owned as well, but I won’t be in as much of a hurry as I am to replace my gaming laptop, which recently started overheating whenever I open up Skyrim.
An important point that I’ve gleaned from the recent coverage (perhaps it was the Giant Bombcast) was that we should not expect development costs to increase the way they did with the prior round of console rollouts. The cost of transitioning to high-definition assets was catastrophic for non-AAA developers, and that cost isn’t going away, _but_ we’re not going to see another comparable increase, as the new consoles aren’t expected to chase the 4K boondoggle (on the PS4, 4K is only for video playback, not gaming).
Furthermore, there’s hope that development costs may actually diminish, thanks to greater infrastructure for outsourced asset generation and to the expected maturation of digital marketplaces within the consoles. While I personally am not expecting Steam-class quality, I expect the marketplaces to be more robust, with a range of prices not limited to $15–$20 for digital-only releases or $60 for disc-less versions of full games which never, ever go on sale. With luck, Sony’s open arms to developers will encourage more small-house releases to hit PCs & both consoles.
Or it could all go to shit. *shrugs* I personally haven’t the money or time to justify upgrading for games I won’t play & can’t afford. We have to be aware that the audience on this blog post is likely to be a lot of people at a transitional post-college age, with diminishing time for all-night Goldeneye or Halo. We might have an echo chamber here, ignoring what’s happening with the youth marketplace.
Game designer chiming in. I don’t work on consoles, but friends do. In no particular order, here’s why a new console generation is coming:
1) It’s been 7 years, which matches the longest a console generation has ever lasted. They’re approaching maximum sell-through on the current generation.
2) Art pipelines can once again produce assets that current hardware can’t draw (or can’t draw quickly enough). This means games can look better than they do, and this does help them sell.
3) Open world games, modifiable terrain, and anything with the word ‘procedural’ in it have all reached the limits of hardware. Most catastrophic bugs in big open-world games stem from the engine having to work like hell to load and unload assets to keep within the memory space they have available. (There’s a toy sketch of this problem at the end of the list.)
4) Non-rendering pipelines (AI, sound engines, game state management) are all hungering for more power, but they’re being squeezed out in favor of graphics because that’s what sells.
5) Discs are going to die, soon. EB hurried the process along with their used games business, but physical media is ending in general. A new console with new DRM will let developers offer things like digital rentals, flash sales, bundles, gifting — basically everything Steam does.
6) Consoles are competing to be the center of the entertainment hub, now. They want to be where you watch your video, listen to music, poke your Facebook friends, and get your information, so that they can always offer you that tempting bit of gaming fun. The device that is always on is the device that’s most relevant.
7) New hardware means a new way to get you to buy old software the companies have already produced. Nintendo is best at this, but any long-standing publisher has “best of” packs or “reimaginings” in development.
8) The current generation of consoles has faded from glory. There’s no buzz about them. No current-gen system is going to “win” the holiday period. A new generation re-energizes the fanbases, which gets discussion going, which gets more people generally interested in buying game-related things.
9) Developers are sick of the current generation. There’s not really room to innovate technically any more, especially with multiplatform releases, and the restrictions are really starting to chafe.
10) The movement controls that came out as accessories last generation will be built in from the start this generation. Developers can once again work on a unified hardware base and try to figure out something to make them useful or fun. Maybe they’ll get it this time?
11) DLC freemium in-game-purchases ARPU DAU PAU metrics. Current consoles and infrastructure don’t support the type of information and transactions that the post-Zynga gaming world is moving towards. Console makers know that they have to be able to support these tiny games if they want to get their piece of the next Angry Birds (there won’t be another Angry Birds).
12) Riiiiiiidge Racer!
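To illustrate point 3 above, here’s a toy sketch of the load/unload juggling (entirely my own invention; the asset names, sizes, and MEMORY_BUDGET are made up, and no real engine is this simple): a fixed memory budget with least-recently-used eviction, where the ugly failure mode is the branch with nothing left to evict.

```python
# Toy open-world asset streaming: a fixed memory budget, assets loaded
# as the player moves, least-recently-used assets evicted to make room.
from collections import OrderedDict

MEMORY_BUDGET = 512 * 1024 * 1024  # hypothetical 512 MB for streamed assets

class AssetCache:
    def __init__(self, budget):
        self.budget = budget
        self.used = 0
        self.assets = OrderedDict()  # name -> size, least recently used first

    def request(self, name, size):
        """Make sure an asset is resident, evicting LRU assets if needed."""
        if name in self.assets:
            self.assets.move_to_end(name)  # mark as recently used
            return
        while self.used + size > self.budget and self.assets:
            old_name, old_size = self.assets.popitem(last=False)  # evict LRU
            self.used -= old_size
        if self.used + size > self.budget:
            # Nothing left to evict and still over budget: in a shipping
            # game this is where pop-in, hitches, or crashes come from.
            raise MemoryError(f"cannot stream {name}: over budget")
        self.assets[name] = size  # pretend we loaded it from disc
        self.used += size

cache = AssetCache(MEMORY_BUDGET)
cache.request("downtown_geometry", 300 * 1024 * 1024)
cache.request("downtown_textures", 200 * 1024 * 1024)
cache.request("suburbs_geometry", 250 * 1024 * 1024)  # evicts downtown_geometry
print(list(cache.assets))  # ['downtown_textures', 'suburbs_geometry']
```

More memory means a bigger budget and fewer evictions; smarter hardware-assisted streaming means cheaper loads. Either way, the engine stops “working like hell” and the catastrophic bugs get rarer.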
All I know is, I’m not buying another console until they’re marketed and sold in Fortress of Solitude plug-and-play crystal form.
So if you’re actually looking for stock picks, Nintendo is your best bet. Mostly because they keep making the market that the other two try to capitalize on. Motion control was the big thing Wii did right. Asymmetrical play experiences on the Wii U will be novel enough to get the people who took a chance on the Wii, the non-hardcores, to upgrade. I don’t know that Nintendo can keep it up forever, but I think in the short term (five or ten years) they will continue to innovate in the console space.
I don’t know that the lack of Blu-Ray hurts Nintendo. Most of the people I know aren’t interested in Blu-Ray because they don’t have the TVs for it right now. A few of them are convinced something better will come out in three years, so they’re skipping this generation entirely.
I don’t really have a dog in this fight; I haven’t played a videogame seriously since Sonic the freakin’ Hedgehog. Not to say I’m not a nerd: I’ve got a tattoo from a dice-and-paper role-playing game, for chrissakes. I’d like to play videogames, but I haven’t really heard of one with a storyline that sounds engaging. That could be ignorance speaking; I don’t know enough. I guess I’m just waiting for the tech to level out and present me with characters and stories that I can believe in.
If I’ve missed some through avoidance of the rapidly changing medium, I would gladly welcome enlightenment.