PS4 is revealed

Status
Not open for further replies.

figmentPez

Staff member
And that's essentially correct, but incomplete. You can set a flag in your compiler options, and it becomes a flag in the output .exe that says "I want to use 3.xGB (I don't remember the exact amount) and I know what I'm doing!" So the "kernel space" (not exactly) is smaller, but still, all within 4GB of space. By default, yes, split into two 2GB chunks.
Which still doesn't address my main point: Can the resources be easily scaled back in order to fit in a smaller memory footprint? Is it possible to make a PC version of a PS4 game, that looks and plays as good or better, without also making it a 64-bit program? It would kill PC game sales if virtually every multi-platform title had worse graphics, smaller maps, dumber AI or something else because programmers had to cut corners to fit in a smaller memory space.
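For reference, the flag being described is the large-address-aware bit in the executable's header (set via MSVC's /LARGEADDRESSAWARE linker switch). The rough address-space arithmetic looks like this — a sketch only, since the exact limits depend on OS version and boot configuration:

```python
# Address-space math for a 32-bit Windows process (a sketch; exact
# numbers vary with OS version and boot options).
GIB = 2 ** 30

total_32bit = 2 ** 32            # a 32-bit pointer can address 4 GiB, total
default_user_space = 2 * GIB     # default split: 2 GiB user / 2 GiB kernel

# With /LARGEADDRESSAWARE set on the .exe:
laa_32bit_os = 3 * GIB           # on 32-bit Windows booted with the /3GB switch
laa_64bit_os = 4 * GIB           # on 64-bit Windows, nearly the full 4 GiB

print(total_32bit // GIB, default_user_space // GIB)  # 4 2
```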
 
There's no such thing as a straight port. The assets are moved over, but the engine that actually runs the game and handles the assets is different between the two platforms. There is probably a ton of common code between the two engines, but the underlying architecture, particularly graphics capabilities, is so different that they have to have significant code changes to make the game run well on either platform.

It's that engine that decides when to load something, and where to hold it until it's needed, and that appears to be the crux of your question.

However, it's mitigated by the asset pipeline. Most pc games run from the hard drive, which is fantastically faster than the optical drive in any console. Even if a game is forced to run in a 32 bit address space, it can load many more things on demand from the hard drive that would take too much time on the console.
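To make that concrete, here's a toy sketch (invented names, not any real engine's API) of the kind of budgeted, on-demand asset cache an engine layer might use; a faster drive just means the loader callback is cheap enough to lean on constantly:

```python
from collections import OrderedDict

class AssetCache:
    """Toy sketch of on-demand asset streaming within a fixed memory
    budget -- an illustration, not any real engine's API."""

    def __init__(self, budget_bytes, loader):
        self.budget = budget_bytes
        self.loader = loader          # e.g. reads the asset from disk
        self.cache = OrderedDict()    # asset name -> (data, size)
        self.used = 0

    def get(self, name):
        if name in self.cache:
            self.cache.move_to_end(name)          # mark as recently used
            return self.cache[name][0]
        data = self.loader(name)
        size = len(data)
        # Evict least-recently-used assets until the new one fits.
        while self.used + size > self.budget and self.cache:
            _, (_, old_size) = self.cache.popitem(last=False)
            self.used -= old_size
        self.cache[name] = (data, size)
        self.used += size
        return data
```

Swap the lambda-style loader for real disk reads; the point is only that the eviction policy, not the storage medium, decides what stays resident.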

If anything, I would say the console is more restrictive in terms of raw data handling capacity and capability, for handling physics, ai, etc in a timely manner.

But again, this is all handled by the engine, and in a multiplatform game, the engines are different enough between the platforms that the 32 bit or 64 bit distinction hardly matters.

You could think of the engine, in fact, as a virtual machine. Its upper interface to the game is the same across all platforms, and the lower interface takes as much advantage of the underlying architecture and resources as possible, and runs the game as well as it can given the resources available.

This suggests that 1) the engine for the pc is far more complicated than the console engine because it has to adapt to different resources and 2) that the pc platform could be either better or worse than the console based solely on the engine design and adaptability and the underlying hardware. I'd fully expect to see much better graphics on a pc setup for the same game on a retina display with a high end GPU, and much worse graphics for a very low end graphics card, compared to a console version.

As far as ai and physics, they are minuscule compared to the graphics processing, so I don't expect you'd see much difference unless they made the engine different intentionally. I'd expect them to do this if they recognize that pc gamers have a different play style and set of preferences than console gamers, but otherwise I don't think it would come down to resource constraints.
 
A real 64 bit data bus really won't buy a console much, and would cost more. Better, for them, to simply make the existing 32 bit bus faster, keep the instruction set small, and allow for a few instructions that deal with the larger memory space. Since memory size is fixed, it won't really be a problem.
Given that Xbox has been on X86-derived hardware from the beginning, and now the PS4 will be as well, and that's the arch that pcs have been on since the 80s, it's actually cheaper to take the existing 64-bit memory bus that everything has had for about 10 years now, and just use that, rather than re-engineer a "better" 32-bit bus. If talking about costs, off-the-shelf (or close enough) is WAY cheaper than re-engineering your own hardware. If anything, that's the lesson of the PS3, in that even though Cell was a far superior platform from the numbers perspective, the extra costs of engineering it made it more expensive, and as well they didn't make an easy-to-code-for API on top of it. Both reasons are assuredly why they went to X86 for this generation.
 

figmentPez

Staff member
However, it's mitigated by the asset pipeline. Most pc games run from the hard drive, which is fantastically faster than the optical drive in any console. Even if a game is forced to run in a 32 bit address space, it can load many more things on demand from the hard drive that would take too much time on the console.
Two words: Mandatory installs.

But again, this is all handled by the engine, and in a multiplatform game, the engines are different enough between the platforms that the 32 bit or 64 bit distinction hardly matters.
Bullshit. When it comes down to having >4GB of data in RAM vs <4GB of data in RAM that's a significant difference. Remember back in the Xbox/PS2 days when the console ports of PC FPS games had to have the maps carved into smaller chunks because consoles didn't have anywhere near the RAM that PCs did? That's the type of difference that we're going to be looking at between consoles and PCs if the switch to 64-bit programs isn't made on the PC end, only it'll be PCs that get lower detail maps, or dumber AI, or more pop-up, or something. The only question is whether it will happen this generation. Does the PS4 (or the next Xbox) have enough RAM available to programmers that porting to less RAM in a 32-bit PC program will make a significant impact? My guess is that, yes, it will.
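The back-of-the-envelope math behind that worry, as a sketch (the OS reservation figure below is my own placeholder — Sony hasn't published one):

```python
GIB = 2 ** 30

ps4_ram = 8 * GIB            # unified GDDR5 announced for the PS4
os_reserved = 1 * GIB        # placeholder assumption; real figure not public
game_budget = ps4_ram - os_reserved

win32_cap = 4 * GIB          # hard ceiling for any 32-bit process
shortfall = max(0, game_budget - win32_cap)

# GiB of console-resident data a 32-bit PC port simply cannot hold:
print(shortfall // GIB)  # 3
```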

and runs the game as well as it can given the resources available.
Which is exactly my point. What has to be sacrificed to fit into the memory constraints?


I'd fully expect to see much better graphics on a pc setup for the same game on a retina display with a high end GPU, and much worse graphics for a very low end graphics card, compare to a console version.
Except we're talking main system RAM as well as graphics RAM. Even if the PC can still have enough video memory to cache all the necessary textures, and pixel-shaders and physics calculations, what is being sacrificed out of main system RAM in order to make things run on the PC?

otherwise I don't think it would come down to resource constraints.
It always comes down to resource constraints of one type or another (granted, sometimes it's money and development time, and not system capability, but always resource constraints). The reason that Wii ports are radically different from 360 and PS3 games is resource constraints. The reason TF2 would be near impossible to update on 360 to match the current PC version is resource constraints. The reason that most multi-platform games don't tax the CPU on modern PCs is because it's too difficult to make a game that both takes advantage of a PC's processing power, and can also be scaled back to run on what little CPU horsepower is available on the 360/PS3.
 
I just don't see why you think there could be a problem - or opportunity - here.

Consoles are obviously moving towards the pc architecture, but you are essentially asking if the architecture differences will result in reduced experience for one platform or the other.

IF the manufacturer of the game uses both platforms to their UTMOST capability, there will be differences, and those differences will rely completely on the platform.

IF the manufacturer of the game puts most of their effort into one platform and only barely supports other platforms, there will be differences, and those differences will rely completely on the manufacturer.

Since I've never seen a manufacturer choose the first path, I think that blame, if there's any to lay, can be laid at the design decisions the manufacturer makes, and not due to the capabilities of each platform.

But we are really discussing tiny parts of the game, right? Are you asking about whether they will drop features wholesale to "fit" into an inferior architecture, or are you asking about minor differences that will only be noticed by minutiae-inspecting nerds who spend hundreds of hours on a game deciding if the ai path planning is slightly different, and if so how that can be used to their advantage in an upcoming competition?
 

figmentPez

Staff member
Consoles are obviously moving towards the pc architecture, but you are essentially asking if the architecture differences will result in reduced experience for one platform or the other.
This has nothing to do with architecture. PCs are already 64-bit, and have been since before dual-core chips became the norm, so that's not the issue. The sole issue is whether this will finally push game makers to adopt 64-bit programs and make a 64-bit OS a requirement to run their game.
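As a quick aside, checking whether a running process is 32- or 64-bit takes one line (shown here against the Python interpreter itself); the same idea is how an installer or launcher decides which build applies:

```python
import struct
import sys

# Pointer size gives the bitness of the *process*, regardless of the OS.
ptr_bits = struct.calcsize("P") * 8       # 32 or 64
is_64bit_process = sys.maxsize > 2 ** 32  # equivalent check

print(ptr_bits, is_64bit_process)
```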


But we are really discussing tiny parts of the game, right?
We're not talking about the minor differences between a 360 version and a PS3 version. We're not even talking about the minor differences between a game originally made for the 360 and then ported to PC. We're talking differences on the level of a 360 game vs a Wii version of the same game. Whatever data fits into multiple gigabytes of RAM is what will have to be dumped to fit a PS4 game into the memory space of a 32-bit program. We're not talking about a few tweaks here and there, this is massive amounts of data that can be used on a PS4, but can't be used as long as PC games remain compatible with 32-bit operating systems (which not many people are running anymore).
 
We're not talking about a few tweaks here and there, this is massive amounts of data that can be used on a PS4, but can't be used as long as PC games remain compatible with 32-bit operating systems (which not many people are running anymore).
<sigh> Sadly, a lot more people are running 32-bit OSes than you'd think. A LOT more. And, while most of these people aren't gamers, a lot of them do have kids young enough that they can't buy their own computers and therefore have to use their parents' computers, which are still running Windows XP, or in some cases... WinMe or Win98SE. It's... it's almost enough to make a grown man cry, just thinking about how many times my customers tell me they can't use a data disc I've sent them because their OS won't run IE 8 or higher OR Firefox 17 or higher.

On the other hand, this doesn't necessarily mean that we should keep making games accounting for people who refuse to update their computers. That's just silly.
 

figmentPez

Staff member
<sigh> Sadly, a lot more people are running 32-bit OSes than you'd think. A LOT more. And, while most of these people aren't gamers, a lot of them do have kids young enough that they can't buy their own computers and therefore have to use their parents' computers, which are still running Windows XP, or in some cases... WinMe or Win98SE. It's... it's almost enough to make a grown man cry, just thinking about how many times my customers tell me they can't use a data disc I've sent them because their OS won't run IE 8 or higher OR Firefox 17 or higher.
I meant gamers aren't running 32-bit OSes anymore. Seriously, if you've got a PC capable of running a game that needs a 64-bit executable in order to compete with a PS4 game, then you've got a system that should be running a 64-bit version of Windows 7. If you're still running Windows XP, but you've got a $300 graphics card made in the last two years, then something is seriously wrong with your priorities.

EDIT: For the record, I run XP on my netbook. Even if I upgrade it to Windows 7, as I've been thinking about for a while, it will still be a 32-bit version of Win7 because my netbook has an older Atom processor. Doesn't matter in a gaming discussion since netbooks suck at 3D gaming.
 
I meant gamers aren't running 32-bit OSes anymore. Seriously, if you've got a PC capable of running a game that needs a 64-bit executable in order to compete with a PS4 game, then you've got a system that should be running a 64-bit version of Windows 7. If you're still running Windows XP, but you've got a $300 graphics card made in the last two years, then something is seriously wrong with your priorities.

EDIT: For the record, I run XP on my netbook. Even if I upgrade it to Windows 7, as I've been thinking about for a while, it will still be a 32-bit version of Win7 because my netbook has an older Atom processor. Doesn't matter in a gaming discussion since netbooks suck at 3D gaming.
I'm not saying it's common overall, but I do know that there are a lot of people who have $300 (or more) graphics cards, more RAM than their OS can actually access, quad core processors, and enough case cooling to run high end games on High graphics without a single worry; and still refuse to upgrade beyond Win98SE - because they're absolute nutcases who're afraid that Microsoft is going to steal their identities and everything else they could ever possibly consider valuable if they put a newer OS on their computers. We laugh at them frequently, when they're trying to get support for something that just doesn't run on that old of an OS and they're bitching about it every chance they get. Thankfully, they will eventually die off and the world will be a better place.
 

GasBandit

Staff member
<--- STILL running XP on every machine except my laptop. I keep intending to install 7 on my main gaming rig, but I've been running this install of XP since 2007 and the inertia is palpable.

As for at work, we've still got a bunch of legacy software which doesn't play nice with anything but XP, though I've gotten most of the salesforce on 7 at least, and half the program directors. But all the machines that do the actual work - production studio, broadcast studio, administration, mine... they are all stuck on XP for now.
 
Still on XP 32-bit. Hate myself a little more each day for not upgrading yet, but I need parts more than I need a new operating system right now, so it's going to be a while before I finally upgrade. Hopefully Windows 7 will have dropped in price by then, because Windows 8 is a piece of shit and I want no part of it.
 

figmentPez

Staff member
The XP thing is a bigger problem than you think: http://en.wikipedia.org/wiki/Usage_share_of_operating_systems

Win XP: still 39% of desktop OS installs, as of Jan 2013.

So ya... that sucks a lot, as virtually all of those are 32-bit.
And how many of those are gaming PCs with dual-core processors at 3.0GHz or better and DX11 graphics cards with at least 2GB of RAM?

Far more relevant is the Steam Hardware Survey (currently from January 2013)
  • 55% were Windows 7 64-bit
  • 9.6% were Win XP 32-bit
  • ~28% were running some sort of 32-bit OS
  • ~70% were running some sort of 64-bit OS
See how much XP use drops off when you look specifically at systems people are using to run games on?
 
While I disagree with your analysis of the situation, I honestly don't care enough about it to explain why it probably doesn't matter. Perhaps it does. Perhaps it doesn't.
 
IF the manufacturer of the game puts most of their effort into one platform and only barely supports other platforms, there will be differences, and those differences will rely completely on the manufacturer.
I'm looking at YOU, Quicken!
...it's going to be a while before I finally upgrade. Hopefully Windows 7 will have dropped in price by then, because Windows 8 is a piece of shit and I want no part of it.
Microsoft tends to keep the previous version of the OS available for only one year after it is discontinued, so if you want to pick up a copy of Win7 you might want to do so before October. I just managed to sneak in my copy of Win7Pro a couple weeks ago. Whew!

The phase-out has begun. XP stuck around because people didn't want to upgrade, but that 3GB memory limit* will be its undoing, and soon. Major developers have already begun to move away from XP (Google, Apple), to the point where XP now accounts for less than 20% of all desktop systems, according to some samples. And its share is still falling, and showing no sign of stopping.

--Patrick
*Yes, I know that's for server, but it's mostly the same kernel.
 
I don't think that Windows 7 will go away nearly as quickly, for the same reason that Windows XP stayed around for so long. From what I've heard, the bulk of people simply hate Windows 8. Microsoft will keep selling it as long as it's viable to make money off it, hence why XP stayed available for so long but Vista and ME haven't.
 

GasBandit

Staff member
The simple fact of the matter is there is no compelling reason to move beyond windows 7, and the only compelling reason to go to 7 from XP is the 32-bit memory limit. If not for that one bugbear, I'm pretty sure we'd still have XP dominating the market share.
 
XP also has some security holes that were fixed in 7. 7 also has better versions of DirectX. But that's STILL not a whole lot better.
 

GasBandit

Staff member
XP also has some security holes that were fixed in 7. 7 also has better versions of DirectX. But that's STILL not a whole lot better.
For a vast majority of users, this is not a compelling reason. I still don't give a shit about DX10, much less 11.
 
Something I had never thought of came to light. The price of video cards is likely going to skyrocket (at least at first) once the PS4 is released. GDDR5 is an expensive part of video cards. Sony is sticking 8 gigs in each console. They're going to be using more GDDR5 in PS4s than AMD or Nvidia currently use in video card production.
 

GasBandit

Staff member
Not interested in BioShock Infinite then? (link)
By "I don't give a shit about," I mean "I don't see any actual advantage delivered by." I don't get excited by the supposed/alleged improvements over DX9. I already have a laptop with 7 on it, so it's not like I'm boycotting later versions of directX, it's just I believe that they were more of a backdoor scheme to force OS upgrades than actually delivering any better graphics APIs.
 
By "I don't give a shit about," I mean "I don't see any actual advantage delivered by." I don't get excited by the supposed/alleged improvements over DX9. I already have a laptop with 7 on it, so it's not like I'm boycotting later versions of directX, it's just I believe that they were more of a backdoor scheme to force OS upgrades than actually delivering any better graphics APIs.
Crysis 2 with or without DX11 is a huge shift graphics-wise. Also, in the horsepower required to run said graphics.

Crysis 3 is kind of a gong show compared to 2 in the PC optimization department.
 

GasBandit

Staff member
Crysis 2 with or without DX11 is a huge shift graphics-wise. Also, in the horsepower required to run said graphics.

Crysis 3 is kind of a gong show compared to 2 in the PC optimization department.
Well, you say that, but I have yet to really see the difference. Of course, as far as horsepower, I guess I'm spoiled because my XP box is using an overclocked 8800GTX I bought for 500 bucks back in 07 that's stood me in good stead to this day...
 
Until budget cards can effectively run tessellation (no matter what they advertise, they can't), only really high end cards will see a big difference.
 
Microsoft tends to keep the previous version of the OS available for only one year after it is discontinued, so if you want to pick up a copy of Win7 you might want to do so before October.
To clarify, I am referring to the retail version, not the OEM version. The OEM/System Builder version will probably remain available longer.

--Patrick
 