Intel / AMD & AMD / NVIDIA

125 members have voted

Recommended Posts

Andrew

Refreshed for 2018; please make your selections above. In a change of pace, we welcome discussion in the topic surrounding your preferences, but please be mindful of the Community Guidelines.

Last year's poll for comparison is here.

Steven P.

I voted Intel (CPU) and Nvidia (GPU), but I might upgrade to AMD Threadripper later this year lol. Depends; I have a 4th-gen i5 (4440 Haswell) from 2013 atm, so it might be time for a new board and CPU heheh.

GPU is a GeForce GTX 1050 Ti (4GB), which is fine for what I do guyzzzzz.

Mando

I'm Intel/NV all the way, personally and work-wise.

Xahid

AMD/Nvidia

More cores, and without restrictions...

  • Like 1
xendrome
2 minutes ago, Xahid said:

AMD/Nvidia

More cores, and without restrictions...

What restrictions do you speak of?

Xahid
8 minutes ago, xendrome said:

What restrictions do you speak of?

K and non-K based processors.

+Zagadka

I'm iffy on the CPU side. I used AMD for almost everything up to 3.2 GHz (Phenom II) and only recently switched to Intel for my past two builds. I would probably stick with Intel in a future build. It really helps determine motherboard selection, though.

GPU is clear cut. Nvidia just does a better job of it. AMD has been notorious for driver errors, while I've never had a problem with Nvidia cards and have had better performance (in my experience). I suppose I just trust them more.

Mando
6 minutes ago, Xahid said:

K and non-K based processors.

I'd say paying an extra £30 on a £300 CPU for an unlocked multiplier is good value, if you want it. Of course, most users won't have a need or desire for an unlocked multiplier out of the box (they already get Turbo Boost above the rated CPU clock on non-K series).
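
To put rough numbers on the multiplier point, here is a minimal sketch of the arithmetic: the core clock is the base clock (BCLK, roughly 100 MHz on modern Intel platforms) times a multiplier, and a K-series part simply lets you raise that multiplier. The figures below are illustrative assumptions, not from any spec sheet.

```cpp
#include <cstdio>

int main() {
    // Core clock = base clock (BCLK) x multiplier. The values below are
    // illustrative assumptions, not taken from any actual spec sheet.
    const double bclk_mhz = 100.0;       // typical BCLK on modern Intel platforms
    const int stock_multiplier = 37;     // e.g. a part rated at 3.7 GHz
    const int unlocked_multiplier = 47;  // only settable on an unlocked (K-series) part

    std::printf("Stock:       %.1f GHz\n", bclk_mhz * stock_multiplier / 1000.0);
    std::printf("Overclocked: %.1f GHz\n", bclk_mhz * unlocked_multiplier / 1000.0);
    return 0;
}
```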

xendrome
6 minutes ago, Xahid said:

K and non-K based processors.

I guess you're right; Ryzen/Threadripper need to be overclocked to get even close to Intel's single-core speeds/benchmarks, so it wouldn't be fair for AMD to charge even more to still be in second place (first loser)

  • Like 1
Xahid
3 minutes ago, Mando said:

I'd say paying an extra £30 on a £300 CPU for an unlocked multiplier is good value, if you want it. Of course, most users won't have a need or desire.

That's not my point; I was just sharing my opinion.

Mando
Just now, Xahid said:

That's not my point; I was just sharing my opinion.

That's cool mate :) was just saying :) all good mate.

satukoro

I still haven't played anything that my R7 1700 and RX 580 can't handle maxed out (usually minus motion blur). I haven't touched the OC settings yet.

My first Compaq had an AMD Athlon X2 with an integrated Nvidia GeForce Go 6150, and it was the worst. AMD has come a long way since then.

xendrome
25 minutes ago, satukoro said:

I still haven't played anything that my R7 1700 and RX 580 can't handle maxed out (usually minus motion blur). I haven't touched the OC settings yet.

My first Compaq had an AMD Athlon X2 with an integrated Nvidia GeForce Go 6150, and it was the worst. AMD has come a long way since then.

1080p, I'm guessing. 1440p and 4K are where you'll have difficulty.

  • Like 1
satukoro
28 minutes ago, xendrome said:

1080p, I'm guessing. 1440p and 4K are where you'll have difficulty.

Yeah, both my 23" and 40" displays are 1920x1080. I don't have anything of a higher resolution.

Coming from medium (at best) settings at 720p on various laptops over the years, this is incredible for me.

StrikedOut

Intel and Nvidia for me. It's time for me to upgrade my GPU; it's my biggest bottleneck at the moment, and with a 1440p monitor I'm getting stutter in some games.

Mindovermaster

I had nothing but trouble with AMD GPUs. I moved to NVIDIA long ago...

  • Like 2
  • 6 months later...
+Starry

Can't complain about my Ryzen and 1070, except that G-Sync bothers me. Someday I might go back to AMD; I've already got a FreeSync 2 display.

PGHammer

I've almost always been an Intel CPU guy (the exception has been in notebooks, and even there it was limited); oddly enough, I've mostly been AMD (and ATI before that) in terms of graphics. Where I moved was Nvidia's Fermi and its successors; I went from Fermi to Pascal in desktop graphics. It wasn't lack of interest but lack of money that kept me out of the intervening generations; I pulled the trigger on desktop Pascal on the dip.

boydo

AMD CPU for me (I just built my Threadripper 1950X rig) but Nvidia GPU - if I can pick up a 1080 Ti on the cheap I will; otherwise I'll see if I can hang on until the next-gen GPUs come out.

  • Like 1
Steven P.
Just now, boydo said:

AMD CPU for me (I just built my Threadripper 1950X rig) but Nvidia GPU - if I can pick up a 1080 Ti on the cheap I will; otherwise I'll see if I can hang on until the next-gen GPUs come out.

Check Nvidia.com in your region; they're actually linking to retailers that are putting the 10xx series on sale, which also includes the 1080. This is ahead of the 11xx launch, so you're buying at the right time ;)

  • Like 1
LaP
On 8/2/2018 at 12:23 AM, LostCat said:

Can't complain about my Ryzen and 1070, except that G-Sync bothers me. Someday I might go back to AMD; I've already got a FreeSync 2 display.

Yeah, I prefer Nvidia GPUs and drivers, but G-Sync monitor prices are ridiculous. I simply can't find any G-Sync monitors over 23 inches for under $800 where I live, and I don't like buying monitors online from the USA. I can easily find good 27-inch IPS 2K and 4K FreeSync monitors for around $400-500. G-Sync 2K IPS monitors are pretty much all over $800, and the 4K IPS G-Sync ones are pretty much all over $1,000. That's ridiculous; on paper they aren't even better than the FreeSync 2K and 4K IPS ones sold at around $500 in Canada.

Nvidia will need to wake up with G-Sync: either reduce the price or ditch it for FreeSync. From all the reviews I've read, G-Sync isn't really better than FreeSync, and it's not worth paying around 50% more for a G-Sync monitor with the same specs as a FreeSync one.

If the price of G-Sync monitors doesn't come down in Canada, I'll have to go with AMD next time, if their GPUs are close enough. Right now the Vega 56 is pretty much equal to a 1070, if not slightly better. The Ti cards are just too expensive to even consider.

I just built my new workstation with a Ryzen 1800X and I'm 100% satisfied with it. I don't have enough money to justify having both a workstation and a gaming computer, so I game on my workstation. I could have both, but why, since my workstation requires a good CPU and GPU for my work anyway? The fact that AMD GPUs do very well at compute, and that FreeSync is significantly less expensive in Canada, are very strong selling points to me. More than enough to make me ditch Intel/Nvidia, even if I've been using both for the last 15 years or so.

Steven P.
55 minutes ago, LaP said:

Yeah, I prefer Nvidia GPUs and drivers, but G-Sync monitor prices are ridiculous. I simply can't find any G-Sync monitors over 23 inches for under $800 where I live, and I don't like buying monitors online from the USA. I can easily find good 27-inch IPS 2K and 4K FreeSync monitors for around $400-500. G-Sync 2K IPS monitors are pretty much all over $800, and the 4K IPS G-Sync ones are pretty much all over $1,000. That's ridiculous; on paper they aren't even better than the FreeSync 2K and 4K IPS ones sold at around $500 in Canada.

Nvidia will need to wake up with G-Sync: either reduce the price or ditch it for FreeSync. From all the reviews I've read, G-Sync isn't really better than FreeSync, and it's not worth paying around 50% more for a G-Sync monitor with the same specs as a FreeSync one.

If the price of G-Sync monitors doesn't come down in Canada, I'll have to go with AMD next time, if their GPUs are close enough. Right now the Vega 56 is pretty much equal to a 1070, if not slightly better. The Ti cards are just too expensive to even consider.

I just built my new workstation with a Ryzen 1800X and I'm 100% satisfied with it. I don't have enough money to justify having both a workstation and a gaming computer, so I game on my workstation. I could have both, but why, since my workstation requires a good CPU and GPU for my work anyway? The fact that AMD GPUs do very well at compute, and that FreeSync is significantly less expensive in Canada, are very strong selling points to me. More than enough to make me ditch Intel/Nvidia, even if I've been using both for the last 15 years or so.

What is so special about G-Sync / FreeSync, and why do they cost so much more than "normal" monitors? :s

LaP
19 minutes ago, Steven P. said:

What is so special about G-Sync / FreeSync, and why do they cost so much more than "normal" monitors? :s

It gives you the advantage of V-Sync without the high cost in fps. If you can stand tearing and play without V-Sync enabled, then it gives you nothing. Personally, I simply can't stand any level of tearing, so I always turn V-Sync on, but the cost in fps is high and it induces input lag too in some games. FreeSync monitors cost pretty much the same as non-FreeSync monitors. As for G-Sync monitors, I don't know why they are so expensive (often they are $200 to $300 CAD more than equally specced FreeSync monitors in Canada).

  • Like 1
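
For the curious, here is what the trade-off LaP describes looks like in code: a minimal sketch assuming GLFW with OpenGL (not anything from this thread). The swap interval is what toggles V-Sync; with it on, the buffer swap blocks until the monitor's vertical blank, which prevents tearing but caps the framerate and can add input lag. Adaptive sync (G-Sync/FreeSync) exists so the monitor can instead wait for the GPU.

```cpp
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1280, 720, "vsync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    glfwSwapInterval(1);   // V-Sync on: swap waits for the vertical blank (no tearing, capped fps)
    // glfwSwapInterval(0); // V-Sync off: present immediately (uncapped fps, possible tearing)

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT); // render the frame here
        glfwSwapBuffers(window);      // blocks until vblank when the interval is 1
        glfwPollEvents();
    }
    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```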
Steven P.

Thanks @LaP :)

+Starry
7 hours ago, LaP said:

As for G-Sync monitors, I don't know why they are so expensive (often they are $200 to $300 CAD more than equally specced FreeSync monitors in Canada).

FreeSync basically just requires an upgraded scaler from the usual makers, whereas G-Sync requires a custom hardware module from Nvidia in there.

So making a monitor with G-Sync costs more to begin with, the market for it is smaller, and they have to recover R&D and make money off it in general (otherwise, why even bother releasing it?).

Personally, I have two FreeSync monitors and a laptop that uses FreeSync, alongside the Xbox One X of course. I don't currently have a desktop GPU or TV that uses it, though I definitely plan to get both at some point.
