[POLL] What are your favorite CPUs and GPUs for 2023?


Preferred CPU and GPU for 2023? (AMD, Intel, Nvidia…)  

74 members have voted

  1. What is your favorite processor (CPU) for 2023?

    • AMD
      34
    • Arm (Apple, Samsung, Qualcomm, etc.)
      2
    • Intel
      36
    • Other (please describe)
      2
  2. What is your favorite graphics card (GPU) for 2023?

    • AMD
      17
    • Intel
      4
    • Nvidia
      52
    • Other (please describe)
      1

This poll is closed to new votes

  • Poll closed on 01/01/24 at 07:59

Recommended Posts

Hello,

This is the new yearly poll for your preferred processor (CPU) and graphics card (GPU).

The old poll is located here.

Please leave a comment sharing your current hardware, plus any upgrades you plan on making this year!

Regards,

Aryeh Goretsky


  • 2 weeks later...

Haven't had an AMD GPU/CPU since my 9700 Pro and my Athlon 64 X2.

I've been Intel ever since. The only thing I hate about Intel is the constant socket changes, and with Nvidia the pricing is just insane; I can no longer afford to upgrade as often (always 2 generations behind on GPU and 2-3 generations behind on CPU).


On 14/01/2023 at 23:34, DKAngel said:

I've been Intel ever since. The only thing I hate about Intel is the constant socket changes, and with Nvidia the pricing is just insane; I can no longer afford to upgrade as often (always 2 generations behind on GPU and 2-3 generations behind on CPU).

That's probably not a big deal in the big picture for the vast majority of people, and it has probably done you a favor, really. CPUs that aren't fairly ancient (say, a 2nd or 3rd gen i5 or better) are still pretty good for most tasks; the latest and greatest CPUs simply aren't needed for most people, assuming mostly general usage, gaming, and the like. Hell, my 3rd gen i5 (i5-3550) is still pretty good even today, and it's 2012 CPU technology, so it will soon be 11-year-old tech. Anyone with anything noticeably more recent than a 3rd gen i5 (call it roughly 6th-7th gen or newer) almost certainly isn't anywhere close to needing a CPU upgrade, since that will easily handle most tasks and basically every game out there.

I never understood people who upgrade CPUs that are already well more than fast enough for what they do. It's largely a waste of money, because the performance gain over what they already have simply isn't enough to justify the cost; they're basically spending hundreds of dollars for a minimal overall improvement. It's best to hang onto a setup as long as you can, and then when a real need for an upgrade comes along, you get a rather large boost in overall performance without having to spend an arm and a leg.

I'd say GPUs will likely need an upgrade more often than CPUs will, but even a GPU, assuming it's a halfway decent one, should easily get you years of use: probably at least 3-5 years as a conservative figure, and well beyond that if you buy a decent card that isn't already close to the edge.

Take my 1050 Ti 4GB, which I bought in July 2017 (it's basically 2016 graphics tech) for only $135: nothing special even at the time, but still respectable for its price range. Hell, if I had gotten the 1060 6GB back then, I would still be comfortable gaming on it today, and that GPU is nothing special by today's standards. The 1050 Ti 4GB is showing its age a bit on newer games, but it's still at least passable, if not pretty good, for the games I tend to replay (the Mafia series, RDR2). Back when I bought mine, at least on the NVIDIA side, the 1060 was more the entry point to the higher-end GPUs, whereas my 1050 Ti 4GB was pretty much the best of the non-high-end cards in the GeForce 10 series, as a rough ballpark.

P.S. Another thing about gaming in general: I treat roughly 1080p (1920x1080) at 30fps as a good minimum guideline for gaming performance (at least for single-player games), and by that standard you don't need anywhere near a monster GPU, since just about any halfway decent modern GPU will comfortably exceed 1080p/30fps. Sure, I prefer to be in the 1080p/60fps range (not only is it smooth, it also makes you more future-proof, since your frame rate can drop a fair amount as graphics advance before it becomes a real problem), but that isn't required for a playable game, even if it's more optimal. I get that some people are shifting to 1440p or higher, but for me it's not all that important (1080p has been the de facto standard for quite a while now, and I suspect it will stay that way for the foreseeable future for many, if not most people), especially if you have to pay a lot more money to get similar performance. I could maybe see going to 1440p, but 2160p simply doesn't seem worth it, because from what I've heard it takes a monster GPU to run well. (I'm a little out of date on current hardware; the last time I did proper research on CPUs was basically 2012 and on GPUs 2017, but if I'm in the ballpark, you get my point.)
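To put some rough numbers on why 2160p is so much more demanding, here is a quick back-of-envelope sketch in Python (the resolutions are the standard ones; the relative-load figures simply assume GPU work scales roughly with pixels pushed per second, which is a simplification that ignores settings, engines, and CPU limits):

    # Rough pixel-throughput comparison between common gaming resolutions.
    # Assumes GPU load scales roughly linearly with pixels pushed per frame.
    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "2160p": (3840, 2160),
    }

    base = 1920 * 1080  # reference: one 1080p frame

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels:,} pixels per frame, "
              f"{pixels / base:.2f}x the work of 1080p")

    # Prints roughly: 1080p = 1.00x, 1440p = 1.78x, 2160p = 4.00x.
    # Targeting 60fps instead of 30fps doubles those per-second figures again,
    # so 2160p/60 is on the order of 8x the pixel throughput of 1080p/30.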


I'm upgrading from an Alienware M17 R3 2020 to an Alienware X17 R2 w/ a 4K UHD screen this year and can't wait to get gaming on 4K!

This year is most likely your last chance to score a 17" laptop, seeing as none were offered or shown off at CES 2023. Most of what was shown was 14-inch, 16-inch, and 18-inch; no 17-inch models appeared at CES ;)


On 15/01/2023 at 14:41, ThaCrip said:

That's probably not a big deal in the big picture for the vast majority of people, and it has probably done you a favor, really. CPUs that aren't fairly ancient (say, a 2nd or 3rd gen i5 or better) are still pretty good for most tasks. [...]

Going from a 4th gen i7 to a 10th gen i7 was a massive performance boost, especially since I run emulation, Plex servers, and other things on it. I'm still a 1080p gamer because of my monitors, but my RTX 2060 should see me through for a while.


On 15/01/2023 at 08:47, DKAngel said:

Going from a 4th gen i7 to a 10th gen i7 was a massive performance boost, especially since I run emulation, Plex servers, and other things on it. I'm still a 1080p gamer because of my monitors, but my RTX 2060 should see me through for a while.

I'm not sure how true it is, but I've heard there's something like a 3-5% increase in performance per CPU generation (2nd to 3rd to 4th to 5th, and so on), assuming a comparable core count to keep the comparison fair (obviously, if you have more cores and your programs take advantage of them, the difference will be more pronounced). If that's roughly correct, then while I wouldn't be surprised if there's a noticeable increase from 4th gen to 10th gen, it's probably nothing major at the end of the day once you set the additional cores aside; it's more of a generational/GHz bump. But obviously, if you get more cores and your programs use them, I can see the performance gap becoming quite a bit larger.
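Just to put that rough 3-5% figure into numbers (taking the hearsay estimate above at face value; these percentages are not benchmark data), compounding it over the six generational steps from 4th gen to 10th gen looks like this:

    # Compound a fixed per-generation single-thread gain over several generations.
    # The 3% and 5% figures are only the rough estimate quoted above.
    def compound_gain(per_gen: float, generations: int) -> float:
        """Total speedup from a fixed per-generation improvement."""
        return (1 + per_gen) ** generations

    for per_gen in (0.03, 0.05):
        total = compound_gain(per_gen, generations=6)  # 4th gen -> 10th gen
        print(f"{per_gen:.0%} per gen over 6 gens: ~{total - 1:.0%} total")

    # Roughly +19% at 3% per generation and +34% at 5% per generation,
    # before counting extra cores, memory bandwidth, or new instructions.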

But from what you're saying, I suspect the extra cores are ultimately where the bulk of the performance boost comes from for you.

In terms of gaming, as you already know, you're not going to have problems for the foreseeable future on that 2060 at 1080p; I think even a 1060 6GB is still going to hold up well for years to come (assuming a 1080p/30fps minimum standard or so), so you've got plenty of breathing room. As for me, with my 1050 Ti 4GB, while I'm not quite running on THE edge, you can see its age setting in: the more demanding games of recent memory are probably starting to push this GPU to where one might have to lower the resolution or graphics settings a fair amount. But on graphics engines that aren't too recent (say, from the late 2010s or so), it's still good enough, even if I have to tweak things a bit, and I can ultimately still keep the graphics settings on the higher side.

have a good day and thanks for the info ;)


For my gaming PC I'm back with an AMD processor for the first time since the Athlon XP days; I went with an Nvidia GPU.

However, if/when I upgrade my home server/NAS, I would go with an Intel CPU that supports Intel Quick Sync. That would be a lot more efficient for video transcoding, which would be the main thing putting the CPU under load.
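For a sense of what Quick Sync does in practice, here is a minimal sketch that hands the H.264 encode to the iGPU via ffmpeg (assuming an ffmpeg build with QSV support; the file names are placeholders, and media servers like Plex drive the same hardware path through their own settings rather than a command like this):

    # Minimal sketch: offload H.264 encoding to Intel Quick Sync via ffmpeg.
    # Requires an ffmpeg build with the h264_qsv encoder; file names are placeholders.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "input.mkv",      # source file
        "-c:v", "h264_qsv",     # encode on the integrated GPU instead of the CPU
        "-b:v", "4M",           # target video bitrate; adjust to taste
        "-c:a", "copy",         # pass the audio through untouched
        "output.mp4",
    ]
    subprocess.run(cmd, check=True)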


On 16/01/2023 at 17:32, ThaCrip said:

I'm not sure how true it is, but I've heard there's something like a 3-5% increase in performance per CPU generation (2nd to 3rd to 4th to 5th, and so on), assuming a comparable core count to keep the comparison fair. [...]

It's not just CPU cores, it's CPU instruction sets as well. The GHz bump means nothing: my Devil's Canyon 4th gen i7 was 4 GHz, while my current 10th gen i7 is 2.9.

The multi-core comparison is something like a 60% increase in speed, and single-core is about 20% between those generations, which is massive. The memory bandwidth is doubled, etc.

I couldn't transcode the videos I have over Plex on my 4th gen these days; it would just kill the CPU.

Either way, the 4th gen is still going strong; it just needs a new graphics card, as the old GTX 760 is a bit long in the tooth.


Hello,

  

On 15/01/2023 at 06:42, SnoopZ said:

@goretsky, shouldn't the poll part for the GPU say 2023 rather than 2022?

 

Yes, it should have. I was out of town, but @Jim K fixed it.

Regards,

Aryeh Goretsky
 


  • 9 months later...

On the CPU side, it's still Intel, despite the inroads of ARM. On the GPU side it's Nvidia, adding laptop strength to the lead it held in desktops, and I actually own a laptop that proves the point: a Dell Inspiron 7786. Despite it having a retread Core i7, it doesn't balk at anything I throw at it.


My last three systems (2 self-built desktops, 1 laptop) were all AMD (CPU/GPU) affairs, except the laptop has an Nvidia dGPU. My next PC will be a self-built desktop, which will happen in 2024. In the past my new machines have always run Windows, and when I get a new PC the old one moves to Linux. This will be the first time a new PC buy/build will be Linux-first, so I may switch to Intel, as they seem to get quicker, more complete support for Linux. The GPU? Not sure yet, but probably not Nvidia. I'm not much of a gamer, so maybe Intel if they come out with their 2nd gen parts.

It's interesting that there's only one vote for ARM; that doesn't even cover any possible Mac users buying a PC this year. LOL. I think ARM is overrated for the desktop and don't really see the demand for it. Maybe that will start to change in 2024/25 with new chipsets from Qualcomm, AMD, and Nvidia.


On 25/10/2023 at 11:56, Good Bot, Bad Bot said:

My last three systems (2 self-built desktops, 1 laptop) were all AMD (CPU/GPU) affairs, except the laptop has an Nvidia dGPU. [...]

Yeah, don't go NVIDIA EVEN if you don't game. AMD GPU drivers are part of the mainline Linux kernel, so you don't have to worry about installing NVIDIA drivers that may or may not work.

All the horror stories I've read over the years...


On 25/10/2023 at 13:19, Mindovermaster said:

Yeah, don't go NVIDIA EVEN if you don't game. AMD GPU drivers are part of the mainline Linux kernel, so you don't have to worry about installing NVIDIA drivers that may or may not work.

All the horror stories I've read over the years...

I think you meant to say "don't go NVIDIA EVEN if you DO game". LOL. I am well aware of the difference between AMD and Nvidia on Linux.


On 25/10/2023 at 21:29, Good Bot, Bad Bot said:

I think you meant to say "don't go NVIDIA EVEN if you DO game". LOL. I am well aware of the difference between AMD and Nvidia on Linux.

You got what I meant; that's the only thing that matters. 😛


  • goretsky locked and unpinned this topic
This topic is now closed to further replies.