
As long as that 6GB of RAM is genuinely used and not just an artificial block like the one in CoD: Ghosts.

Even if it's not genuinely used, would you really care? I've got 16GB on my gaming machine and 90% of it is just sitting there doing nothing most of the time.


The x64 application push came during the Vista era - and it landed with a whimper. (Microsoft Office 2010 x64 was the first x64 productivity suite, and it was basically a yawnfest.)

 

Was the Vista era the Exchange 2007 era? Because that's when the big x64 app push came, and it came in the datacenter, which at the time was where it had the most value. Then SQL Server, then SharePoint, now Hyper-V.

 

Microsoft still recommends x32 Office because of add-in compatibility, but it's time for that to die. It was a yawnfest because there were so many non-x64 machines in the enterprise, and to this day there are millions of enterprise desktops with no more than 1GB of RAM.

 

Gamers and enthusiasts have no excuse for trying to hold evolution back on this. There's no downside. The only excuse even worth giving an audience to is that one can't afford a modern PC.


Even if it's not genuinely used, would you really care? I've got 16GB on my gaming machine and 90% of it is just sitting there doing nothing most of the time.

 

I have 16GB and it does annoy me. It was reported that CoD: Ghosts uses at most 3GB of RAM, and that a tweak out there already lets users with less than 6GB play the game, and it plays perfectly. Demanding system requirements that you'll never hit just for bragging points reeks of a cheap shot to get publicity.


I remember when games used to come printed in the pages of a magazine, and we had to type the code in manually for hours, if not days, before we got to play them - and they only required 8K of RAM, or 16K if you were posh.

 

Better still... I remember games that you had to load from a cassette tape, purchased at a store in a baggy with a picture card. My first game was TextTrek, a Star Trek game made of text. Fun as hell, since you used your own imagination in place of graphics.


Do you really believe that? You really believe BF4 on ultra settings, at 1080p, 60+ fps, doesn't require that - no dropped frames, screen tearing, or hiccups? Rather, that it's because of sloppy code that it won't run like that in x32 on an i5 with 2GB of RAM and an iGPU?

 

Yes, I do, as to the first point I made. This game in particular, I feel, is like that. Do I also think that other games and devs will simply use bloated, resource-hogging code, compression, etc.? Absolutely.

 

Did I mention anything about BF4 specifically? No. Do I think they were as lean as they could be to get all those things? Well, you'd have to see the code, but at first blush it doesn't pass the sniff test.

 

This will bear out as devs have more time to work with the consoles and build from there.


Yes, I do, as to the first point I made. This game in particular, I feel, is like that. Do I also think that other games and devs will simply use bloated, resource-hogging code, compression, etc.? Absolutely.

 

Did I mention anything about BF4 specifically? No. Do I think they were as lean as they could be to get all those things? Well, you'd have to see the code, but at first blush it doesn't pass the sniff test.

 

This will bear out as devs have more time to work with the consoles and build from there.

 

I disagree with your suggestion that x64, and as much RAM as possible, means bloated code. More memory will always result in better performance with better graphics, IMO. And that's what I want out of a hardware investment: more, better, faster.

 

I do agree that probably no one has code as tight as possible. If nothing else, budgets and deadlines dictate that, along with human nature: almost no one gives every job duty or task 100%. It's simply not in human nature. Most people who work an eight-hour day only work five, no matter how good you think you are at what you do. I'll put a camera on you and prove it.

 

Perhaps that's what Decryptor is talking about when he says how badly NFS: Rivals on the PC is coded. Yet for me, at ultra settings it performs better than any x32 title in that genre ever did. I guess I judge by what is delivered.

 

BF4, while not my type of game, and while it clearly has bugs, delivers gameplay, graphics, and performance that I think are great. That's the engine more than anything else, but it has left me ready to do my first pre-order when it's available: Star Wars Battlefront on Frostbite. BF4 gets my i7 quite hot, so I believe they are utilizing all the cores and pushing them, and the performance supports that.

 

I know what x64, as much RAM as possible, and more cores can do in the datacenter; I have no doubt they can enable similar performance and capabilities in gaming, and that is being borne out. Have you tried the Warframe F2P beta? It's x64, and the performance and graphics are quite excellent. It's changing my mind about the whole genre.

 

We complained about software not taking advantage of multithreading, then about not using multiple cores, then about all this RAM no one uses; now that people are trying to use it, I don't understand the complaints. I really don't.


Yes, I do, as to the first point I made. This game in particular, I feel, is like that. Do I also think that other games and devs will simply use bloated, resource-hogging code, compression, etc.? Absolutely.

 

Did I mention anything about BF4 specifically? No. Do I think they were as lean as they could be to get all those things? Well, you'd have to see the code, but at first blush it doesn't pass the sniff test.

 

This will bear out as devs have more time to work with the consoles and build from there.

 

The decision comes down to a tradeoff between more memory and increased CPU time. In the case of the last-gen consoles, their memory pools were far more constraining than the relative "slowness" of their processors, so devs chose to conserve memory.
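
To make that tradeoff concrete, here's a rough C++ sketch of the classic trick - burn some RAM on a precomputed table so each call costs fewer CPU cycles. The table size and function names are made up for illustration, not taken from any actual engine:

#include <array>
#include <cmath>
#include <cstdio>

// Spend memory: precompute 65,536 sine samples once (~256 KB of RAM)
// so every later call is a cheap table lookup instead of a libm call.
constexpr int kTableSize = 65536;
constexpr double kPi = 3.14159265358979323846;
std::array<float, kTableSize> g_sine_table;

void init_sine_table() {
    for (int i = 0; i < kTableSize; ++i)
        g_sine_table[i] = static_cast<float>(std::sin(2.0 * kPi * i / kTableSize));
}

// Option A: trade RAM (and a little accuracy) for CPU time.
float fast_sin(float radians) {
    float turns = radians / (2.0f * static_cast<float>(kPi));
    int idx = static_cast<int>((turns - std::floor(turns)) * kTableSize) % kTableSize;
    return g_sine_table[idx];
}

// Option B: trade CPU time for zero extra memory - recompute every call.
float slow_sin(float radians) { return std::sin(radians); }

int main() {
    init_sine_table();
    std::printf("table: %f  libm: %f\n", fast_sin(1.0f), slow_sin(1.0f));
    return 0;
}

On last-gen consoles the table version is exactly the kind of thing devs cut to save memory; with a bigger pool you can afford to lean the other way.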

 

Now with the new consoles, devs have at the very minimum 8x the memory capacity of the previous-gen consoles; combine that with tech such as GL_ARB_sparse_texture / "Tiled Resources" to reduce the size of the texture pool, and you can shift the balance back towards the middle.
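
For anyone curious what the sparse-texture side looks like, here's a rough OpenGL sketch. It assumes a current GL 4.x context where ARB_sparse_texture is supported and the entry points are already loaded (via GLEW/glad or similar); the 16K x 16K size and the single committed tile are purely illustrative:

#include <GL/glew.h>

GLuint create_sparse_texture() {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Mark the texture sparse BEFORE allocating storage: the 16K x 16K
    // allocation then reserves only address space, not physical memory.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 16384, 16384);

    // Query the hardware tile (virtual page) size for this format.
    GLint pageX = 0, pageY = 0;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageX);
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageY);

    // Commit physical memory for just one tile in the corner; the rest of
    // the texture stays unbacked until the renderer actually needs it.
    glTexPageCommitmentARB(GL_TEXTURE_2D, /*level=*/0,
                           0, 0, 0, pageX, pageY, 1, GL_TRUE);
    return tex;
}

The point being that a huge "virtual" texture only costs real RAM for the tiles you commit, which is how you keep the texture pool small even with big assets.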

 

Because at the end of the day, unused RAM is useless. The key is not to be wasteful, but not to be stingy either - unless your platform calls for it.


GTX 460 and HD 5850 are fairly ancient now. The 6GB RAM requirement might be the biggest stumbling block for users though. A quick look at the latest Steam Survey shows more than 50% of users won't meet the requirement.

 

The problem is most people don't read the box for specs. They just grab it off the shelf for their kid and install it on their P4 1.6GHz, 512MB RAM XP machine. There are some games I've seen installed on customers' computers where I just sit there and wonder, "THAT was playable?" ... "How was that not a slide show????"

 


I disagree with your suggestion that x64, and as much RAM as possible, means bloated code. More memory will always result in better performance with better graphics, IMO. And that's what I want out of a hardware investment: more, better, faster.

 

I do agree that probably no one has code as tight as possible. If nothing else, budgets and deadlines dictate that, along with human nature: almost no one gives every job duty or task 100%. It's simply not in human nature. Most people who work an eight-hour day only work five, no matter how good you think you are at what you do. I'll put a camera on you and prove it.

 

Perhaps that's what Decryptor is talking about when he says how badly NFS: Rivals on the PC is coded. Yet for me, at ultra settings it performs better than any x32 title in that genre ever did. I guess I judge by what is delivered.

 

BF4, while not my type of game, and while it clearly has bugs, delivers gameplay, graphics, and performance that I think are great. That's the engine more than anything else, but it has left me ready to do my first pre-order when it's available: Star Wars Battlefront on Frostbite. BF4 gets my i7 quite hot, so I believe they are utilizing all the cores and pushing them, and the performance supports that.

 

I know what x64, as much RAM as possible, and more cores can do in the datacenter; I have no doubt they can enable similar performance and capabilities in gaming, and that is being borne out. Have you tried the Warframe F2P beta? It's x64, and the performance and graphics are quite excellent. It's changing my mind about the whole genre.

 

We complained about software not taking advantage of multithreading, then about not using multiple cores, then about all this RAM no one uses; now that people are trying to use it, I don't understand the complaints. I really don't.

 

It's not a complaint about making stuff that legitimately uses the resources; it's that I feel most devs are going to throw as much as they can against the resource wall and rely on bloated requirements to carry their code and game. It has already started happening with barely optimized games that just launched, and even in recent interviews devs praise the huge RAM and CPU that will allow them to do all this crazy stuff. I only fear many devs will choose the easy path, NOT optimize, and bloat the systems down.

 

We have seen it already with memory leaks and badly coded stuff on both the PS4 and the One. While you say 'We complained about software not taking advantage of multithreading, then about not using multiple cores, then about all this RAM no one uses; now that people are trying to use it, I don't understand the complaints. I really don't.', I say we asked for it, and now we have the right to be afraid they will abuse it.


About damn time games started using the hardware! I'm sick of games being held back by 8-year-old consoles. I bet AMD and NVIDIA also love this; for years you could run games extremely well with really old cards.

 

I totally agree - it's also about damn time they used a 64-bit OS for something more than very large, complex spreadsheets!!


Because at the end of the day, unused RAM is useless. The key is not to be wasteful, but not to be stingy either - unless your platform calls for it.

 

Oh, I agree totally. I only hope most of the dev teams don't take the former route.


Why does it need an i7 when an i5 performs almost the same in games?

The GPU list looks okay; however, we all know game system requirements are ###### and overrated most of the time.


Was the Vista era the Exchange 2007 era? Because that's when the big x64 app push came, and it came in the datacenter, which at the time was where it had the most value. Then SQL Server, then SharePoint, now Hyper-V.

 

Microsoft still recommends x32 Office because of add-in compatibility, but it's time for that to die. It was a yawnfest because there were so many non-x64 machines in the enterprise, and to this day there are millions of enterprise desktops with no more than 1GB of RAM.

 

Gamers and enthusiasts have no excuse for trying to hold evolution back on this. There's no downside. The only excuse even worth giving an audience to is that one can't afford a modern PC.

I was referring to the desktop - the push at the SERVER end came with Server/Exchange 2003 - the first x64 iterations of both.

 

I'm quite aware of the lack of x64 add-ins - however, if you don't need such add-ins (for me, the need for Outlook add-ins went away due to a change in Hotmail), why NOT crossgrade?

 

And your point is, in fact, one I agree with and have been trying to make since Vista's launch, which is when I started the upgrades for my x64-ready users. Among ALL the Windows hardware run by the folks I support, there is exactly NO case of x32 software - other than browsers - being used where an x64 version is available; none at all. If anything, the lack of x64 penetration has a greater impact on browsers than on productivity applications, let alone suites - there is no x64 version of Google Chrome, for example.


Activision did this with Call of Duty: Ghosts, which required 6GB of RAM to play but was able to run on 4GB after an update patch. A lot of gamers won't be happy with it being DirectX 11 cards only.


Why does it need an i7 when an i5 performs almost the same in games?

The GPU list looks okay; however, we all know game system requirements are bull**** and overrated most of the time.

 

Presumably it is going to have an RTS sub-game with thousands upon thousands of units and be CPU-bound. Or, more likely, the requirements have almost no basis in reality.

 

Honestly, that i7 thing calls into question where any of these specs come from. It is probably just a calculated publicity stunt to generate news and hype about how good the game must be, given the requirements - that's hardly unheard of in the gaming world. It will probably just come out and run fine on mid-range systems, and they'll never change the requirements to reflect reality.


There is always a big jump like this when a new console comes out. I remember when The Elder Scrolls IV: Oblivion came out and I could only run it with everything turned all the way down, but just a few years later I could play any game maxed out without issue. Now we run into that same situation again.


Activision did this with Call of Duty: Ghosts, which required 6GB of RAM to play but was able to run on 4GB after an update patch. A lot of gamers won't be happy with it being DirectX 11 cards only.

 

No, what Activision did was set a purely artificial limit that had no bearing on the amount of RAM Ghosts actually needed.

 

Not sure what on earth they were trying to achieve by doing so, maybe to evaluate what the high-end market numbers are like - either way it was circumvented by end-users and then eventually removed.
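
For what it's worth, a gate like that is usually nothing more exotic than a check of installed physical RAM at startup. Here's a hedged C++/Win32 sketch of what such a check typically looks like - the 6 GB threshold, the message, and the structure are my own assumptions for illustration, not Activision's actual code:

#include <windows.h>
#include <cstdio>

// Returns true if the machine reports at least the requested amount of
// installed physical memory.
bool meets_ram_requirement(unsigned long long required_bytes) {
    MEMORYSTATUSEX status{};
    status.dwLength = sizeof(status);
    if (!GlobalMemoryStatusEx(&status))
        return true;  // if the query fails, don't block the player
    // Note: ullTotalPhys usually reports a bit less than the nominal
    // installed amount (hardware/BIOS reservations), which is one reason
    // hard gates like this can misfire on machines that "should" qualify.
    return status.ullTotalPhys >= required_bytes;
}

int main() {
    const unsigned long long kSixGiB = 6ULL * 1024 * 1024 * 1024;
    if (!meets_ram_requirement(kSixGiB)) {
        std::printf("This game requires 6 GB of RAM.\n");
        return 1;
    }
    std::printf("RAM check passed.\n");
    return 0;
}

Which is exactly why it reads as artificial: a check like this tests how much RAM is installed, not how much the game actually needs, so once the threshold (or the check itself) is patched out, the game runs fine on less.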


Activision did this with Call of Duty: Ghosts, which required 6GB of RAM to play but was able to run on 4GB after an update patch. A lot of gamers won't be happy with it being DirectX 11 cards only.

 

Given that it is supposed to be an Xbox 360 game as well, the DX11 requirement for the PC version really doesn't make much sense. It is most likely going to kill sales if it stays in place at release. It'll give publishers an excuse to never release PC games because of "poor sales" ;-)


Why are people even surprised? Everyone knows that most game developers prioritise consoles. The new consoles will help narrow the gap between them and the PC by having similar hardware, and it's pretty obvious that because these new consoles have 8GB of RAM, system requirements are going to go up for PCs - but this is all good!


There is always a big jump like this when a new console comes out. I remember when The Elder Scrolls IV: Oblivion came out and I could only run it with everything turned all the way down, but just a few years later I could play any game maxed out without issue. Now we run into that same situation again.

It'd make sense if the new consoles were actually high-end gaming machines, but they aren't. They are just mid-range-equivalent systems with netbook cores for the most part. People seem to be looking at the RAM and assuming that it is a lot more than what PC gamers typically have. For the most part it isn't, though, if we consider system and GPU RAM together (e.g. 4GB of system RAM + 2GB of GPU RAM). I suspect the real basis for the 6GB RAM requirement is simply to make the total equivalent to what these next-gen systems have: 8GB.


It's not a complaint about making stuff that legitimately uses the resources; it's that I feel most devs are going to throw as much as they can against the resource wall and rely on bloated requirements to carry their code and game. It has already started happening with barely optimized games that just launched, and even in recent interviews devs praise the huge RAM and CPU that will allow them to do all this crazy stuff. I only fear many devs will choose the easy path, NOT optimize, and bloat the systems down.

 

We have seen it already with memory leaks and badly coded stuff on both the PS4 and the One. While you say 'We complained about software not taking advantage of multithreading, then about not using multiple cores, then about all this RAM no one uses; now that people are trying to use it, I don't understand the complaints. I really don't.', I say we asked for it, and now we have the right to be afraid they will abuse it.

 

Fair enough. Let's hope modern resources on the desktop bring out the best in devs and they take pride in their craft.


Activision did this with Call of Duty: Ghosts, which required 6GB of RAM to play but was able to run on 4GB after an update patch. A lot of gamers won't be happy with it being DirectX 11 cards only.

True - the same requirement existed with Crysis 3, and it horked off quite a few folks, despite DX11 being ANCIENT - nV GeForce 5xx or AMD HD 5xxx, for crying out loud. The first DX11 GPUs are about to head into legacy-support territory, and folks are still complaining that the technology is too new? It would be one thing to complain if newer GPUs were pricey - however, unless you are talking the higher end (GTX 6xx or above for nV, R9 280X or above for AMD), you aren't talking even $200 USD; the GTX 660 is at that price point today, and so is AMD's R9 280.

 

And CPU spec puffery isn't new, either - how many games had an i7 (first-generation i7) in the "recommended" check-box, when a first-generation i5 would work?  What we keep forgetting is that the "recommended" is the high-end of a game's requirements - the "required" is the floor; what you should be looking at is where your current hardware, or your target if you are upgrading, sits between the two.  (What I just outlined is, in fact typical of most game development for PCs - even in terms of ports - only Crytek didn't follow that design philosophy, though they actually came closer with Crysis 2/3.)

 

I still fit rather comfortably in the middle, spec-wise, of most games. With the exception of Watch Dogs, I still fit in that space between "required" and "recommended" for every game whose specs have been outed that shipped during 2013 or is planned to ship during 2014 - despite my dead hardware. That is the "really embarrassing" point I am making with threads like this. Someone earlier in the thread stated that my hardware specs are below the low end - yet apparently game developers still think my dead hardware is relevant. If it weren't for the RAM requirement, I would have zero reason to upgrade my hardware at all from a gaming POV. So whose fault is that - mine, or the game development community's, for a lack of envelope-pushing?

I AM looking at upgrading - specifically CPU, motherboard, and RAM (I have the RAM purchased already - 4GB x2 of DDR3-1333); adding those will put me over the floor requirements for even Watch Dogs. GPU? Right now there is no reason to change it - not even for Watch Dogs. If I had a display requirement above 1920x1080, or were looking at multiple displays, a GPU change would be worth looking at; with neither being an issue (no space to mount either a larger display or an additional one), why would I need to upgrade the GPU? Disk space? Not an issue yet. SSD? A want, not a need - and a three-figure want at worst (a 128/120GB drive not on sale from a big-four brand: Samsung, Intel, Crucial, or Plextor). Gaming is the single activity that pushes hardware the hardest for non-outlying users - and most of the time it fails at even that. And then we wonder why no OS beyond 7 has gotten any traction.


I totally agree - it's also about damn time they used a 64-bit OS for something more than very large, complex spreadsheets!!

 

Hehehe, and for the OS itself. When is the last time virtual memory/paging was something you had to concern yourself with? Of course, never having to page much to disk hasn't done a thing to stop random Modern UI app quits, lol.


Hehehe, and for the OS itself. When is the last time virtual memory/paging was something you had to concern yourself with? Of course, never having to page much to disk hasn't done a thing to stop random Modern UI app quits, lol.

 

It's really 2GB for all of the 64-bit pointers they're keeping around. It'd only be 4GB had they kept a 32-bit OS (and pointers) in the mix ;-)
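
Tongue in cheek or not, the pointer-size point is real enough: compile the same structure for x86 and x64 and the pointer members double. A tiny C++ sketch (the struct is hypothetical, just to make the sizes concrete):

#include <cstdio>

struct ListNode {
    int value;       // 4 bytes on both x86 and x64
    ListNode* next;  // 4 bytes on x86, 8 bytes on x64 (plus struct padding)
};

int main() {
    std::printf("pointer size: %zu bytes\n", sizeof(void*));
    std::printf("node size:    %zu bytes\n", sizeof(ListNode));
    // Typically prints 4 and 8 on a 32-bit build, 8 and 16 on a 64-bit build.
    return 0;
}

So pointer-heavy data structures do grow on x64 - just nowhere near enough to account for a 6GB requirement on its own.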

