Intel / AMD & AMD / NVIDIA



125 members have voted


Recommended Posts

Andrew

Refreshed for 2018, please make your selections above. In a change of pace, we welcome discussion in the topic surrounding your preferences, but please be mindful of the Community Guidelines.
 
Last year's poll for comparison is here.

Steven P.

I voted Intel (CPU) and Nvidia (GPU), but I might upgrade to an AMD Threadripper later this year lol. Depends; I have a 4th-gen i5 (4440 Haswell) from 2013 atm, so it might be time for a new board and CPU heheh.

 

GPU is a GeForce GTX 1050 Ti (4GB), which is fine for what I do guyzzzzz.

Mando

I'm Intel/NV all the way, personally and work-wise.

Xahid

AMD/Nvidia

More cores and without restrictions....

  • Like 1
xendrome
2 minutes ago, Xahid said:

AMD/Nvidia

More cores and without restrictions....

What restrictions do you speak of?

Xahid
8 minutes ago, xendrome said:

What restrictions do you speak of?

K and non-K based processors.

+Zagadka

I'm iffy on the CPU side. I did AMD for almost everything until 3.2 GHz (Phenom II) and only recently switched to Intel in the past two builds. I would probably stick with Intel in a future build. It really helps determine motherboard selection, though.

 

GPU is clear cut. Nvidia just does a better job of it. AMD has been notorious for driver errors, and I've never had a problem with Nvidia cards while getting better performance (in my experience). I suppose I just trust them more.

Mando
6 minutes ago, Xahid said:

K and non-K based processors.

I'd say paying an extra £30 on a £300 CPU for an unlocked multiplier is good value. Of course it's there if you want it, but most users won't have a need or desire for an unlocked multiplier out of the box (they already get Turbo Boost above the rated CPU clock on non-K series).

xendrome
6 minutes ago, Xahid said:

K and non-K based processors.

I guess you're right; Ryzen/Threadripper need to be overclocked to be even close to Intel single-core speeds/benchmarks, so it wouldn't be fair for AMD to charge even more to still be in second place (first loser).

  • Like 1
Xahid
3 minutes ago, Mando said:

I'd say paying an extra £30 on a £300 CPU for an unlocked multiplier is good value. Of course it's there if you want it, but most users won't have a need or desire.

That's not my point; I was just sharing my opinion.

Mando
Just now, Xahid said:

That's not my point; I was just sharing my opinion.

That's cool mate :) was just saying :) all good mate.

satukoro

I still haven't played anything that my R7 1700 and RX580 can't handle maxed out (usually minus motion blur). I haven't touched the OC settings yet.

My first Compaq had an AMD Athlon X2 with an integrated Nvidia GeForce Go 6150, and it was the worst. AMD has come a long way since then.

xendrome
25 minutes ago, satukoro said:

I still haven't played anything that my R7 1700 and RX580 can't handle maxed out (usually minus motion blur). I haven't touched the OC settings yet.

My first Compaq had an AMD Athlon X2 with an integrated Nvidia GeForce Go 6150, and it was the worst. AMD has come a long way since then.

1080p I'm guessing. 1440p and 4k are where you'll have difficulty.

  • Like 1
satukoro
28 minutes ago, xendrome said:

1080p I'm guessing. 1440p and 4k are where you'll have difficulty.

Yeah, both my 23" and 40" displays are 1920x1080. I don't have anything of a higher resolution.

Coming from medium (at best) settings at 720p on various laptops over the years, this is incredible for me.

StrikedOut

Intel and Nvidia for me. It's time for me to upgrade my GPU; it's my biggest bottleneck at the moment, and with a 1440p monitor I'm getting stutter in some games.

Mindovermaster

I had nothing but trouble with AMD GPUs. I moved to NVIDIA long ago...

  • Like 2
  • 6 months later...
+Starry

Can't complain about my Ryzen and 1070, except that G-Sync bothers me. Someday I might go back to AMD; already got a FreeSync 2 display.

PGHammer

I've almost always been an Intel CPU guy (the exception has been in notebooks, and even there it was limited); oddly enough, I've mostly been AMD (and ATI before that) in terms of graphics. Where I moved was Nvidia's Fermi and its successors; I went from Fermi to Pascal in desktop graphics. It was lack of money, not lack of interest, that kept me out of the intervening generations; I pulled the trigger on desktop Pascal on the dip.

boydo

AMD CPU for me (I just built my Threadripper 1950X rig) but Nvidia GPU - if I can pick up a 1080 Ti on the cheap I will, otherwise I'll see if I can hang on until the next-gen GPUs come out.

  • Like 1
Steven P.
Just now, boydo said:

AMD CPU for me (I just built my Threadripper 1950X rig) but Nvidia GPU - if I can pick up a 1080 Ti on the cheap I will, otherwise I'll see if I can hang on until the next-gen GPUs come out.

Check Nvidia.com in your region; they're actually linking to retailers putting the 10xx series on sale, which includes the 1080. This is ahead of the 11xx launch, so you're buying at the right time ;)

  • Like 1
LaP
On 8/2/2018 at 12:23 AM, LostCat said:

Can't complain about my Ryzen and 1070, except that G-Sync bothers me. Someday I might go back to AMD; already got a FreeSync 2 display.

Yeah, I prefer Nvidia GPUs and drivers, but G-Sync monitor prices are ridiculous. I simply can't find any G-Sync monitor over 23 inches for under $800 where I live, and I don't like buying monitors online from the USA. I can easily find good 27-inch IPS 2K and 4K FreeSync monitors for around $400-500. G-Sync 2K IPS monitors are pretty much all over $800, and the 4K IPS G-Sync ones are pretty much all over $1,000. That's ridiculous; on paper they're not even better than the FreeSync 2K and 4K IPS ones sold at around $500 in Canada.

 

Nvidia will need to wake up with G-Sync: either reduce the price or ditch it for FreeSync. From all the reviews I've read, G-Sync is not really better than FreeSync, and it's not worth paying around 50% more for a G-Sync monitor with the same specs as a FreeSync one.

 

If the price of G-Sync monitors doesn't come down in Canada, I'll have to go with AMD next time, provided their GPUs are close enough. Right now the Vega 56 is pretty much equal to a 1070, if not slightly better. The Ti cards are just too expensive to even consider.

 

I just built my new workstation with a Ryzen 1800X and I'm 100% satisfied with it. I don't have enough money to justify having both a workstation and a gaming computer, so I game on my workstation. I could have both, but why, since my workstation requires a good CPU and GPU for my work anyway? The fact that AMD GPUs do very well at compute, and that FreeSync is significantly less expensive in Canada, are very strong selling points to me; more than enough to make me ditch Intel/Nvidia, even though I've been using both for the last 15 years or so.

Steven P.
55 minutes ago, LaP said:

Yeah, I prefer Nvidia GPUs and drivers, but G-Sync monitor prices are ridiculous. I simply can't find any G-Sync monitor over 23 inches for under $800 where I live, and I don't like buying monitors online from the USA. I can easily find good 27-inch IPS 2K and 4K FreeSync monitors for around $400-500. G-Sync 2K IPS monitors are pretty much all over $800, and the 4K IPS G-Sync ones are pretty much all over $1,000. That's ridiculous; on paper they're not even better than the FreeSync 2K and 4K IPS ones sold at around $500 in Canada.

 

Nvidia will need to wake up with G-Sync: either reduce the price or ditch it for FreeSync. From all the reviews I've read, G-Sync is not really better than FreeSync, and it's not worth paying around 50% more for a G-Sync monitor with the same specs as a FreeSync one.

 

If the price of G-Sync monitors doesn't come down in Canada, I'll have to go with AMD next time, provided their GPUs are close enough. Right now the Vega 56 is pretty much equal to a 1070, if not slightly better. The Ti cards are just too expensive to even consider.

 

I just built my new workstation with a Ryzen 1800X and I'm 100% satisfied with it. I don't have enough money to justify having both a workstation and a gaming computer, so I game on my workstation. I could have both, but why, since my workstation requires a good CPU and GPU for my work anyway? The fact that AMD GPUs do very well at compute, and that FreeSync is significantly less expensive in Canada, are very strong selling points to me; more than enough to make me ditch Intel/Nvidia, even though I've been using both for the last 15 years or so.

What is so special about G-Sync / FreeSync, and why do they cost so much more than "normal" monitors? :s 

LaP
19 minutes ago, Steven P. said:

What is so special about G-Sync / FreeSync, and why do they cost so much more than "normal" monitors? :s 

It gives you the advantage of v-sync without the high cost in fps. If you can stand tearing and play without v-sync enabled, then it gives you nothing. Personally, I simply can't stand any level of tearing, so I always turn v-sync on, but the cost in fps is high and it induces input lag in some games too. FreeSync monitors cost pretty much the same as non-FreeSync monitors. As for G-Sync monitors, I don't know why they're so expensive (often $200 to $300 CAD more than equally specced FreeSync monitors in Canada).
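The fps cost described above comes from classic double-buffered v-sync rounding every frame up to the next vblank, while adaptive sync (G-Sync/FreeSync) lets the panel refresh the moment a frame is ready. A quick back-of-the-envelope sketch (the 60 Hz panel and 17 ms frame time are illustrative assumptions, not figures from the thread):

```python
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # refresh interval, ~16.7 ms

def vsync_fps(frame_time):
    # Classic v-sync: a finished frame must wait for the next vblank,
    # so effective frame time rounds up to a whole number of intervals.
    intervals = math.ceil(frame_time / VBLANK)
    return 1.0 / (intervals * VBLANK)

def adaptive_fps(frame_time):
    # G-Sync/FreeSync: the panel refreshes when the frame is ready
    # (within its supported range), so fps tracks the GPU directly.
    return 1.0 / frame_time

# A GPU that needs 17 ms per frame, just missing the 60 Hz deadline:
print(round(vsync_fps(0.017)))     # 30: v-sync halves the frame rate
print(round(adaptive_fps(0.017)))  # 59: adaptive sync barely dips
```

Missing the refresh deadline by even 1 ms halves the frame rate under plain v-sync, which is why adaptive sync is worth paying for if tearing bothers you.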

  • Like 1
Steven P.

Thanks @LaP :)

+Starry
7 hours ago, LaP said:

As for G-Sync monitors, I don't know why they're so expensive (often $200 to $300 CAD more than equally specced FreeSync monitors in Canada).

FreeSync basically just requires an upgraded scaler from the usual makers, whereas G-Sync requires a custom hardware module from NV in there.

 

So making a monitor with G-Sync costs more to begin with, the market for it is smaller, and they have to recover R&D and make money off it in general (otherwise why even bother releasing it).

 

Personally I have two FreeSync monitors and a laptop that uses FreeSync, alongside the Xbox One X of course. I don't currently have a desktop GPU or TV that uses it, though I definitely plan to get both at some point.


