Intel / AMD & AMD / NVIDIA


125 members have voted


Recommended Posts

LaP
On 8/7/2018 at 10:03 PM, LostCat said:

Freesync basically just requires an upgraded scaler from the usual makers, whereas G-Sync requires a custom hardware module from NV in there.

 

So making a monitor with G-Sync costs more to begin with, the market for it is smaller, and they have to recover R&D and make money off it in general (otherwise, why even bother releasing it?)

 

Personally I have two Freesync monitors and a laptop that uses Freesync, alongside the Xbox One X of course. I don't currently have a desktop GPU or TV that uses it, though I definitely plan to get both at some point.

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast, and I probably would not go back to vsync.

+Randomevent
4 minutes ago, LaP said:

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast, and I probably would not go back to vsync.

Well, they've both supported Freesync for a while now, but the 5700s are a huge upgrade :) Nice. I did the same.

 

(HDR handling seems a little iffy at times right now on the 5700s, but other than that they're fantastic cards.)

LaP
1 minute ago, LostCat said:

Well, they've both supported Freesync for a while now, but the 5700s are a huge upgrade :) Nice. I did the same.

 

(HDR handling seems a little iffy at times right now on the 5700s, but other than that they're fantastic cards.)

Nah, my monitor was not supported by Nvidia's drivers; it would display a black screen with G-Sync Compatible enabled.

 

Yeah, a lot of people report driver bugs saying the card is almost unusable, but so far I've had the card for over a month and played around 15 games without any problem. Definitely a good upgrade at 2K.

Mindovermaster

We should revamp this poll... LOL

sinetheo
4 hours ago, LaP said:

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast, and I probably would not go back to vsync.

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

Mindovermaster
1 minute ago, sinetheo said:

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

'cough cough' driver quality?

sinetheo
44 minutes ago, Mindovermaster said:

'cough cough' driver quality?

Nvidia cards don't have any of the issues AMD ones have, and if they do they are miner. When an Nvidia card comes out it is rock solid on day 1; AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

adrynalyne
1 minute ago, sinetheo said:

Nvidia cards don't have any of the issues AMD ones have, and if they do they are miner. When an Nvidia card comes out it is rock solid on day 1; AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

What does mining have to do with it? Also, your viewpoint is dated AF.

Mindovermaster
1 hour ago, sinetheo said:

Nvidia cards don't have any of the issues AMD ones have, and if they do they are miner. When an Nvidia card comes out it is rock solid on day 1; AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

ATI was YEARS ago. Are you applying that to today? A lot has changed, bud...

LaP
4 hours ago, sinetheo said:

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

So far none.

 

I can't speak for everyone, and my experience with AMD is very small; this is only the third AMD GPU I've owned since the early 90s (I was on Nvidia most of the time), but I would say AMD driver problems are exaggerated for the most part. It's hard to know what is true and what isn't, I mean there are so many fanboys on the Internet. So far, after a month and a week, the only game I have trouble with is Fortnite. This said, I had problems with Fortnite in the past with my 1070 too, so I'm not sure it's the GPU. The game always gave me trouble with my Asus MB and Aura (in fact it's more the anti-cheat system giving me trouble). All the other games I've played work perfectly. So far I've played WoW, Destiny 2, Overwatch, Diablo 3, Darksiders, Warframe, Hollow Knight, Path of Exile, Borderlands 3 and some other free games I got on the Epic Store that I'm probably forgetting.

 

The 2070 was way too expensive in Canada. It was $150 more expensive (after taxes) than the 5700 XT for like 5-7% better performance on average (according to most reviewers). Can't say I'm impressed by RTX either; so far it looks like the early shader years, when devs were overusing it to make everything shiny. Ray tracing is the future, but the current implementation kind of sucks IMO. I just could not justify paying $150 more for basically the same performance. Maybe the people having problems with the 5700 XT bought the bad models; I know the MSI Evoke is known to run super hot (over 100 Celsius for the memory) because of badly placed thermal pads.

 

Anyway, so far so good. The drivers are at a very early stage of development too, so I expect things to improve. It's a new architecture and there are always bumps in the road that come with it. My last AMD card was a 5850, something like 12 years ago or so, and it was running fine back then, so I don't expect things to have changed much over at AMD. I would say that, ignoring the GPU drivers, I actually prefer Adrenalin 2020 over GeForce Experience.

  • 2 weeks later...
sinetheo
On 1/4/2020 at 7:11 PM, Mindovermaster said:

ATI was YEARS ago. Are you applying that to today? A lot has changed, bud...

Not according to Gamers Nexus. Read all the hate comments below.

So many people had to return their 5700 XTs after they would crash constantly.

+Randomevent
On 1/4/2020 at 9:14 PM, LaP said:

Anyway, so far so good. The drivers are at a very early stage of development too, so I expect things to improve. It's a new architecture and there are always bumps in the road that come with it. My last AMD card was a 5850, something like 12 years ago or so, and it was running fine back then, so I don't expect things to have changed much over at AMD. I would say that, ignoring the GPU drivers, I actually prefer Adrenalin 2020 over GeForce Experience.

I hate to say it, but I've had far more issues since 19.12.2 than anyone should ever have to deal with o.o

 

Until now I considered AMD drivers pretty solid over the years; this has been a strangely broken experience. At least it's getting fixed up nicely, but damn.

LaP
42 minutes ago, LostCat said:

I hate to say it, but I've had far more issues since 19.12.2 than anyone should ever have to deal with o.o

 

Until now I considered AMD drivers pretty solid over the years; this has been a strangely broken experience. At least it's getting fixed up nicely, but damn.

I have 19.12.3 installed and so far it's fine. Like I said, Fortnite gives me a black screen after a while, but that's the only problem I have. Not sure if it's the drivers or something else; I'm not really playing the game anymore so I don't really care anyway. I heard the drivers can cause problems (related to the boost) if the fps is not capped and vsync is not activated at very high fps. I don't think I capped my fps or activated vsync in Fortnite, so that could be the problem, but like I said I don't really care about the game, so meh. Anyway, the 2070 Super is simply too expensive in Canada and the 2060 Super's performance is a little bit underwhelming for the price. I trust AMD will solve the main issues over the next few months, especially since the upcoming 5800 XT and 5950 XT will use the same drivers and architecture, so I'll keep mine for sure.

  • 1 year later...
+jnelsoninjax

AMD Ryzen 7 2700 / Nvidia 2060

Price and performance are simply better going the AMD route than Intel.

adrynalyne
10 minutes ago, jnelsoninjax said:

AMD Ryzen 7 2700 / Nvidia 2060

Price and performance are simply better going the AMD route than Intel.

Price, yes.

 

Performance...not really these days, unless you go Zen2+ or Zen3.

+jnelsoninjax
8 minutes ago, adrynalyne said:

Price, yes.

 

Performance...not really these days, unless you go Zen2+ or Zen3.

OK, let me rephrase: that was true at the time I put the system together. I know it is not as good now, but it still works for what I need it for.



  • Similar Content

    • By hellowalkman
      The Liquid Cooled reference RX 6900 XT card has an unexpected trick up its sleeve
      by Sayan Sen

      AMD launched the Radeon RX 6900 XT, its flagship product of this GPU generation, in December last year. The company, it seems, has moved on from its blower-style designs, and the reference RX 6900 XT instead brought a cooler with triple axial fans. However, that is apparently not the only reference 6900 XT in existence, as a liquid-cooled (LC) version of the reference GPU has also been spotted a few times since.

      The reference RX 6900 XT LC allegedly packs the binned XTXH variant of the Navi 21 chip, the same chip that is also present in PowerColor's Liquid Devil Ultimate as well as in Sapphire's TOXIC Extreme Edition. However, and rather bizarrely, it seems the reference 6900 XT LC is the one that actually features the fastest memory clock among the three, even though the other two models are aftermarket versions and are supposed to have the better specs.

      The LC 6900 XT model was tested today by a Baidu Tieba user who spotted this deviation. The memory on this new card is apparently clocked at 18.48 Gbps, whereas all other 6000 series Radeon cards are locked at 17.2 Gbps. That's an increase of close to 7.5%.
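
      For a rough sense of what that clock bump means, here is a quick back-of-the-envelope sketch. Only the per-pin rates come from the report; the 256-bit memory bus width is an assumption on our part, not something stated in the leak.

```c
#include <stdio.h>

int main(void)
{
    /* Per-pin memory rates: 17.2 Gbps on the regular RX 6000 cards,
       18.48 Gbps reported for the 6900 XT LC (figures from the article). */
    const double stock_gbps = 17.2;
    const double lc_gbps    = 18.48;
    const double bus_bits   = 256.0;  /* assumed bus width, not from the leak */

    double uplift_pct = (lc_gbps - stock_gbps) / stock_gbps * 100.0;
    double stock_bw   = stock_gbps * bus_bits / 8.0;  /* GB/s */
    double lc_bw      = lc_gbps * bus_bits / 8.0;     /* GB/s */

    printf("memory clock uplift: %.1f%%\n", uplift_pct);            /* ~7.4 */
    printf("bandwidth: %.1f GB/s -> %.1f GB/s\n", stock_bw, lc_bw); /* 550.4 -> 591.4 */
    return 0;
}
```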

      Below is an image showing the specs of the card. While the language may be Chinese, the specification details are fairly easy to figure out.



      As expected, the card, being liquid-cooled and also packing a binned chip, is faster and cooler than the air-cooled reference 6900 XT. The faster memory though doesn't seem to be helping the reference 6900 XT LC much in terms of performance against the other liquid-cooled 6900 XT models.

      It is alleged that AMD originally planned to launch these chips for DIY gamers worldwide but, due to the shortage, has changed tactics. The Radeon RX 6900 XT LC, for now, is only available to Chinese OEMs and system integrators.

      Source and images: 寒山虹光 (Baidu Tieba)

    • By hellowalkman
      AMD's Zen 4 could be a behemoth with up to 128 cores in a single socket
      by Sayan Sen



      AMD's Zen 4 is the next big revision of the company's Zen CPU micro-architecture and lately, information related to the upcoming platform has been spilling out fast. According to the latest rumor today, each next-gen EPYC server processor based on Zen 4 (codenamed 'Genoa') will pack up to 128 cores, double what AMD offers in its current EPYC 7003 lineup.

      The image below shows the 64-core layout of an EPYC 7703 (Milan) processor:

      The rumor isn't completely new, as there were earlier reports of Zen 4 allegedly having more than 64 cores along with new instructions like AVX-512, BFloat16, and more. These new instructions are helpful for high-performance computing (HPC) and server workloads, so their alleged addition definitely makes sense.



      Intel added AVX-512 instructions to its CPUs with the Rocket Lake architecture, and the gains in compatible workloads are truly impressive. According to AnandTech, even an 8-core Rocket Lake-S part was able to beat a 64-core Zen 2 EPYC processor in the AVX-enabled 3D Particle Movement benchmark.
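
      To give a sense of why wide vector instructions matter for these workloads, here is a tiny illustrative AVX-512 snippet (our own example, unrelated to any benchmark mentioned here): a single instruction operates on sixteen single-precision floats at once. It needs an AVX-512-capable CPU and a compiler flag such as -mavx512f.

```c
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[16], b[16], out[16];
    for (int i = 0; i < 16; i++) {
        a[i] = (float)i;
        b[i] = 2.0f * (float)i;
    }

    __m512 va = _mm512_loadu_ps(a);     /* load 16 floats into one 512-bit register */
    __m512 vb = _mm512_loadu_ps(b);
    __m512 vc = _mm512_add_ps(va, vb);  /* 16 additions in a single instruction */
    _mm512_storeu_ps(out, vc);

    printf("out[15] = %.1f\n", out[15]); /* 15 + 30 = 45.0 */
    return 0;
}
```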

      While Zen 4 and Genoa are still a while away, it is already known that fourth-gen EPYC processors will be deployed inside an upcoming exascale supercomputer dubbed "El Capitan". El Capitan is expected to arrive in 2023 and will be used to oversee U.S. nuclear research and operations.

      Source: Vegeta (Twitter) | Image via zhangzhonghao (Chiphell forum)

    • By hellowalkman
      Here are more details on AMD's big.LITTLE CPU architecture leak
      by Sayan Sen

      Much like Intel, AMD has been working on its own hybrid processor architecture consisting of big and little cores. We first came to know about this from a leaked patent last year (via @Underfox3). Today we have new information on the development, as Twitter user @Kepler_L2 has spotted one of AMD's new patents related to big.LITTLE, published a few days back.

      The patent outlines how task processing between the two types of cores would be handled in this hybrid approach.

      According to this patent, the little cores will have a built-in time threshold, and sensors will monitor how long a core runs at its full clock speed. Once the threshold is crossed, the task will be handed over to a big core. A similar process would be carried out for memory-intensive workloads if they run at the highest frequency state for longer than the threshold time.

      That's because the idea behind the use of the little cores is to save power, and running them at full speed for long durations defeats that purpose.

      For the big cores, the implementation is exactly the opposite. In essence, if a workload running on a big core does not cross the threshold, the task is sent to the little cores, since that much processing power is clearly not necessary for the workload.
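
      Purely as an illustration, the migration rule described above might look something like the sketch below. The names, the threshold value, and the millisecond bookkeeping are all our assumptions; the patent does not spell out an implementation.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef enum { LITTLE_CORE, BIG_CORE } core_type;

typedef struct {
    core_type core;           /* where the task currently runs */
    uint64_t  ms_at_max_freq; /* time spent at the core's highest frequency state */
} task_state;

#define MAX_FREQ_THRESHOLD_MS 50  /* hypothetical threshold value */

/* Decide where a task should run next, following the rule described above:
   promote off a little core that has been pinned at max frequency for too
   long, demote off a big core that never crossed the threshold. */
static core_type place_task(const task_state *t)
{
    bool over_threshold = t->ms_at_max_freq > MAX_FREQ_THRESHOLD_MS;

    if (t->core == LITTLE_CORE && over_threshold)
        return BIG_CORE;     /* little core saturated: hand the task to a big core */
    if (t->core == BIG_CORE && !over_threshold)
        return LITTLE_CORE;  /* big core underused: send the task back to save power */
    return t->core;          /* otherwise leave the task where it is */
}

int main(void)
{
    task_state t = { LITTLE_CORE, 80 }; /* 80 ms at max clock on a little core */
    printf("next placement: %s core\n",
           place_task(&t) == BIG_CORE ? "big" : "little");
    return 0;
}
```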

      Going back to the patent from last year, it described the architectural block diagram of the big.LITTLE design approach.

      Both core types will have their own dedicated L1 cache, but they will share a pool of L2 between them.

      Source: @Kepler_L2 (Twitter) | Images via FreePatentsOnline (1), (2)

    • By hellowalkman
      AMD's next-gen RDNA 3 performance jump rumored to be absolutely insane
      by Sayan Sen

      AMD's second-generation RDNA architecture (RDNA 2) was generally praised by reviewers for the performance and power efficiency gains it was able to achieve despite being on the same 7nm node as RDNA. However, this was AMD's first time implementing hardware-accelerated ray tracing (RT), and the results for this, compared to Nvidia's RT capabilities, were far less impressive. That is all set to change according to a report by RedGamingTech (RGT).

      The report claims that AMD's RDNA 3 ray tracing performance will get a significant uplift and will be very competitive with what Nvidia offers. It also adds that RDNA 3 will be utilizing a next-gen, "smarter" Ray Tracing IP 2 that could enable it to even leapfrog Nvidia's RT performance. The architecture will also feature new Machine Learning instructions.

      AMD hasn't forgotten about the rasterization performance of RDNA 3 either, as a leaker on Twitter alleges that Navi 31, Navi 32, and Navi 33 will offer 2.8x, 2.2x, and 1.5x the performance of AMD's current best, the Radeon RX 6900 XT, respectively.

      Unknown at this point is how exactly AMD could achieve this uplift: whether the performance claims purported here mean the company will be adding more compute units (CUs) beyond the 80 on the 6900 XT, whether the improvement is purely based on per-CU architectural and clock gains, or whether it's a combination of both.

      That said, it is important to note that this is all based on speculation and unconfirmed reports for now, so it is advisable to take these rumors with a grain of salt.

      Source: vegeta (Twitter) via RGT (YouTube)

    • By Abhay V
      Nvidia to drop Game Ready Driver updates for Windows 7, 8, and 8.1 starting this October
      by Abhay Venkatesh



      Nvidia today detailed its plans for Game Ready Driver support on Windows 7, Windows 8, and Windows 8.1. The company posted a support article stating that it will cease to provide Game Ready Driver updates for its graphics cards on the mentioned versions starting October 2021. However, it does note that it will continue to serve “critical security updates” for systems running those operating systems until September 2024.

      Microsoft ended support for Windows 7 in January 2020, while Windows 8 lost its support in January 2016 – a short life span for the OS thanks to Windows 8.1 and the debacle that Windows 8 was. However, while Windows 8.1 reached the end of mainstream support back in 2018, the OS is still being serviced with security updates and will be till January 2023.

      Nvidia says that a “vast majority” of its GeForce customers have migrated to Windows 10 and that it aims to provide the “best possible security, support, and functionality” for those users, which is why it is focusing on Windows 10 alone. In the FAQ section, it adds that it will ship the last Game Ready Driver supporting the three operating systems on August 31, with the first drivers to drop support for these versions entirely expected to ship in October.

      The change might not be a major one considering that most users are running the latest offering from the Redmond giant. Those still on older versions can rest assured that their GPUs will be served with updates addressing any critical vulnerabilities, though they will lose out on upgrades with performance enhancements, new features, and bug fixes, Nvidia says.