New Gaming rig



I'd suggest a Gigabyte 750Ti, he'll enjoy it more.

 

Why not get an R5 case? It's better made.

 

I'd get a 650W PSU for that video card, or 750W to future-proof him.


I pop in to threads like these just to see how others are prepping a game rig build, and I usually get some good ideas on how to configure mine for a future purchase. This build, however, is almost identical to the one that I have planned on Newegg.

I find the coincidence of it all rather comical.


Since it is for gaming, it might make sense to select a true DX12 GPU. Not too many available yet.

 

I would skip the optical drive and get a 250 GB SSD. The 120 GB size is workable but just becomes annoying over time. Also, in most cases there is a sharp performance drop from the 250 GB models down to the 120 GB ones.


Since it is for gaming, it might make sense to select a true DX12 GPU. Not too many available yet.

 

I would skip the optical drive and get a 250 gig SSD. The 120 size is workable but just becomes annoying over time. Also in most cases there is a sharp performance drop from 250 to 120.

 

Hell, with a little patience a 512 GB SSD is within reason; I paid a little over $100 for mine not that long ago.


Since it is for gaming, it might make sense to select a true DX12 GPU. Not too many available yet.

 

I would skip the optical drive and get a 250 gig SSD. The 120 size is workable but just becomes annoying over time. Also in most cases there is a sharp performance drop from 250 to 120.

Actually, there are quite a few available; it's just that the high end is pricey.

 

In NVIDIA's case, there is Maxwell, and today it stretches from high end down to mainstream: from the Titan X to the GTX 750, and from dual fans to fanless. (No, I didn't stumble; I said fanless. The prime example is the ASUS GTX 750 DirectCU Silent, the most widely available fanless DX12 GPU.)

 

AMD is trying to counter with Southern Islands; however, it simply doesn't have the GPU range that Maxwell does today. (I'm NOT saying AMD won't get there; the question is whether it will be soon enough to do them any good.)


For NVIDIA, I think the second series Maxwell chips will have full DX12 with older chips going back to Fermi getting some sort of emulation.

 

https://en.wikipedia.org/wiki/Maxwell_%28microarchitecture%29

 

From the chart, https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units

 

The GTX 960 would be the least expensive future-proof GPU from NVIDIA.


Actually, there are quite a few available - it's just that the high end is pricey.

 

In NVidia's case, there is Maxwell - however, it stretches, and today, from high end down to mainstream - and from Titan X to GTX750 - and from dual fans to fanless.  (No; I didn't stumble - I said fanless.  The most available example is the ASUS GTX750 DirectCU Silent - the most widely-available fanless DX12 GPU available.)

 

AMD is trying to counter with Southern Islands - however, there simply isn't the GPU range that Maxwell has today.  (I'm NOT saying AMD won't get there - the question is will it be soon enough to do them any good?)

 

I think it has to be the second-generation Maxwell GM20x, and not GM10x, to be sure of full DX12.


The AMD options are headache-inducing.

 

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

 

I don't have time to dig in, so if anyone knows precisely how to nail down DX12 support on AMD, that would be great.

 

From what I can quickly see, it looks like AMD is being more misleading than NVIDIA, and full DX12 will actually be enabled only on GCN 1.2 chips: the R9 285 and R9 380.


For NVIDIA, I think the second series Maxwell chips will have full DX12 with older chips going back to Fermi getting some sort of emulation.

 

https://en.wikipedia.org/wiki/Maxwell_%28microarchitecture%29

 

From the chart, https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units

 

The GTX 960 would be the least expensive future-proof GPU from NVIDIA.

 


The ONLY difference between Big Maxwell and Original Maxwell is a die-shrink.  Kepler and Fermi DO require emulation (which is why I didn't count either one); however, other than notebooks and other portables, it's a straight-up swap/upgrade from Fermi or Kepler to Maxwell.


It's nice to see that hardly anyone in HH tries to do a build with a crappy, generic PSU.

Even the people doing a first-time build and asking for help will at least start with a decent brand of PSU, as far as I've seen recently.

I remember when, on almost every build, you had to tell the person: "Don't pinch pennies with a crappy PSU. Never!"

 


The ONLY difference between Big Maxwell and Original Maxwell is a die-shrink.  Kepler and Fermi DO require emulation (which is why I didn't count either one); however, other than notebooks and other portables, it's a straight-up swap/upgrade from Fermi or Kepler to Maxwell.

 

 

"Second generation Maxwell (GM20x)

 

Second generation Maxwell introduced several new technologies: Dynamic Super Resolution,[5] Third Generation Delta Color Compression,[6] Multi-Pixel Programming Sampling,[7] Nvidia VXGI (Real-Time-Voxel-Global Illumination),[8] VR Direct,[9][10][11] Multi-Projection Acceleration,[6] and Multi-Frame Sampled Anti-Aliasing(MFAA)[12] however support for Coverage-Sampling Anti-Aliasing(CSAA) was removed.[13] HDMI 2.0 support was also added.[14][15]

 

Second generation Maxwell also changed the ROP to memory controller ratio from 8:1 to 16:1.[16] However, some of the ROPs are generally idle in the GTX 970 because there are not enough enabled SMMs to give them work to do and therefore reduces its maximum fill rate.[17]

 

Second generation Maxwell also has up to 4 SMM units per GPC, compared to 5 SMM units per GPC.[16]

 

GM204 supports CUDA Compute Capability 5.2 compared to 5.0 on GM107/GM108 GPUs, 3.5 on GK110/GK208 GPUs and 3.0 on GK10x GPUs.[6][16][18]

 

Maxwell second generation GM20x GPUs have an upgraded NVENC which supports HEVC encoding and adds support for H.264 encoding resolutions at 1440p/60FPS & 4K/60FPS compared to NVENC on Maxwell first generation GM10x GPUs which only supported H.264 1080p/60FPS encoding.[11]

 

After consumer complaints,[19] Nvidia revealed that it is able to disable individual units each containing 256KB of L2 cache and 8 ROPs without disabling whole memory controllers.[20] This comes at the costs of dividing the memory bus into high speed and low speed segments that cannot be accessed at the same time for reads because the L2/ROP unit managing both of the GDDR5 controllers shares the read return channel and the write data bus between the GDDR5 controllers, making either simultaneously reading from both GDDR5 controllers or simultaneously writing to both GDDR5 controllers impossible.[20] This is used in the GeForce GTX 970, which therefore can be described as having 3.5 GB in its high speed segment on a 224-bit bus and 512 MB in a low speed segment on a 32-bit bus.[20] The peak speed of such a GPU can still be attained, but the peak speed figure is only reachable if one segment is executing a read operation while the other segment is executing a write."
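As a sanity check on the quoted GTX 970 memory layout, the segment bandwidths can be worked out with quick arithmetic (assuming the reference card's 7 Gbps effective GDDR5 data rate; this sketch is illustrative, not from the quoted article):

```python
# Sanity check of the GTX 970's segmented memory layout described above.
# Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
GDDR5_GBPS = 7.0  # effective data rate, reference GTX 970 (assumption)

def bandwidth_gbs(bus_width_bits: int, gbps: float = GDDR5_GBPS) -> float:
    """Peak memory bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * gbps

fast = bandwidth_gbs(224)  # 3.5 GB segment on a 224-bit bus
slow = bandwidth_gbs(32)   # 0.5 GB segment on a 32-bit bus

print(fast)         # 196.0
print(slow)         # 28.0
print(fast + slow)  # 224.0 -- the advertised peak, only reachable when one
                    # segment reads while the other writes, per the article
```

So the advertised 224 GB/s only appears under the mixed read/write condition the quote describes; a pure read tops out at the 196 GB/s fast segment.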

 

That's a lot more than a die shrink.

 

The HEVC encoder is nice for non-gaming usage...

 

It's not like the device driver guys are going to let us know in advance where the emulation dividing line will be, and either company releasing specific information risks sales drying up until DX12 comes out. But for an individual playing the future-proofing dice game with hard-earned dollars, it makes sense to err on the side of caution and select the safest bet for a GPU.


I took the liberty of making this build. It's $100 more but worth every penny.

 

  • The Core i3-4160 is a decent performer for games (as you can see here)
  • The GTX 760 significantly outperforms the 750 Ti
  • The motherboard has built-in WiFi so there's no need for a wireless adapter
  • And the SSD has more than twice the storage (which is very important considering it should be used as the primary drive)

 


 

https://pcpartpicker.com/user/Anaron/saved/#view=ny6Ff7


The AMD options are headache inducing.

 

https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

 

I don't have time so if anyone knows precisely how to nail down DX12 on AMD that would be great.

 

From what I can quickly see, it looks like AMD is being more misleading than NVIDIA and full DX12 will actually be enabled on GCN 1.2 chips - R9 285 and R9 380

The only thing more headache-inducing is the misinformation about DX12 in this thread. This is a good starting point (read the high-rated comments as well).

 

I took the liberty of making this build. It's $100 more but worth every penny.

 

  • The Core i3-4160 is a decent performer for games (as you can see here)
  • The GTX 760 significantly outperforms the 750 Ti
  • The motherboard has built-in WiFi so there's no need for a wireless adapter
  • And the SSD has more than twice the storage (which is very important considering it should be used as the primary drive)

 


 

https://pcpartpicker.com/user/Anaron/saved/#view=ny6Ff7

One big problem with that build: there are games that require four cores to run, and there will probably be more in the future.


The only thing more headache inducing is the misinformation about DX12 in this thread. This is a good starting point (read the high rated comments as well).

 

From your linked article:

 

"Nvidia


Shall we begin?

There is no "true DX12 GPU" if by true you mean that said GPU supports all of DX12. Maxwell DOES NOT have full DX12 support.

There is no "emulation". There's only partial support of varying degrees. All GPUs that have been announced to support it (AMD's GCN 1.0+, Nvidia's Fermi+) will get the performance boost that comes with the closer-to-the-metal approach (theoretically, anyway, until games are out).

There is no full DX12 support yet. Maybe AMD's Fury X card that has launched today has it, but I haven't read any reviews yet.

Again, no full DX12 GPU at the time you posted that.

No such thing as emulation.

And again with the emulation..

 

I don't get your point, although it is entertaining. You are trying to create a debate in a situation for which there will always be a lack of information as long as we don't have access to the source code of the video device driver.

 

I am using the word emulation loosely (and incorrectly, by the actual definition of the word) to quickly convey a concept in threads dealing with configuring a new computer. If we want to get into angels dancing on pinheads, then we should start a thread on DX12, which might be useful.

 

If I was buying a new video card right now in order to get the best perceived value for my money, I would attempt to predict which one has the best chance of having hardware support for whatever the device driver guys decide to do in the next couple of years. It is always irritating when NVIDIA or AMD make an arbitrary decision to drop device driver support based on some particular GPU feature.

 

As near as I can figure, what you are trying to say is "Trust NVIDIA and AMD to support whatever you buy today for DX12", and, based on past experience, I don't trust them at all.

 

The whole "everything back to Fermi will be supported" line sounds warm and fuzzy when they don't want sales to dry up during the ramp-up to DX12, but once they both have a range of new and shiny silicon, I suspect the story will start to change.

 

The outcome from following my suggestion is a better video card that has some reasonable chance to last an extra year or two....
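For what it's worth, much of the disagreement above dissolves if you talk in terms of D3D12 hardware feature levels rather than "full DX12" versus "emulation". A rough sketch of the situation as reported at the time (the architecture-to-feature-level mapping below is compiled from contemporary coverage and should be treated as an assumption, not an official table):

```python
# Illustrative summary: all of these GPUs were slated to run the DX12 API,
# but only at the hardware feature level each architecture exposes.
# Mapping is an assumption based on mid-2015 reporting.
DX12_FEATURE_LEVELS = {
    # NVIDIA
    "Fermi": "11_0",             # API support promised via later drivers
    "Kepler": "11_0",
    "Maxwell 1 (GM10x)": "11_0",
    "Maxwell 2 (GM20x)": "12_1",
    # AMD
    "GCN 1.0": "11_1",
    "GCN 1.1": "12_0",
    "GCN 1.2": "12_0",
}

def full_dx12(arch: str) -> bool:
    """True if the architecture exposes a 12_x hardware feature level."""
    return DX12_FEATURE_LEVELS[arch].startswith("12")

print([a for a in DX12_FEATURE_LEVELS if full_dx12(a)])
# ['Maxwell 2 (GM20x)', 'GCN 1.1', 'GCN 1.2']
```

On this framing, everything back to Fermi "supports DX12" in the API sense, while only GM20x and GCN 1.1+ parts expose a 12_x feature level in hardware, which is roughly what both sides of the exchange above are saying.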


I don't get your point, although it is entertaining. You are trying to create a debate in a situation for which there will always be a lack of information as long as we don't have access to the source code of the video device driver.

I'm not trying to start any debate. I was just annoyed by what you and PGHammer were saying, and, from where I'm standing, there isn't a very good reason to use DX12 as a main factor in picking a card. GPUs from the last two generations from both AMD and Nvidia should be fine; just go for bang for buck, or manufacturer preference if that floats your boat.

 

In fact, I wouldn't even have commented about it if Yusuf hadn't recommended the i3 build. Reasonably old GPUs will still play DX12 titles, because it will take time before DX12 is used exclusively; however, there are games out right now that will not work on a dual core without applying various fixes, and I expect more will follow that can't be bypassed.



I originally went with a Core i5 and a Z97 motherboard, but it was more expensive, and I assumed OP wouldn't go too far beyond his original budget. You're right, though: it's wise to get a quad-core CPU at minimum. I also agree with what you said about DX12; current cards should be fine.


Hell with a little patience a 512 gig SSD is within reason, paid a little over $100 for mine not that long ago 

how the flip did you get a deal like that??


Can never have "too much" RAM also 

+1. After many years of putting together custom PCs, laptops and whatnot, I have found myself hitting RAM limits back when 4 GB was "enough", then 8 GB... currently sitting at 16.

 

Edit: also, on AMD vs. Intel: anything Intel will be more power-efficient, so that needs to be factored in as well (price/performance per watt).

