Intel / AMD & AMD / NVIDIA



Recommended Posts

LaP
On 8/7/2018 at 10:03 PM, LostCat said:

Freesync basically just requires an upgraded scaler from the usual makers, where G-Sync requires a custom hardware module from NV in there.  

 

So making a mon with G-Sync costs more to begin with, the market for it is lower and they have to recover R&D and make money off it in general (otherwise why even bother releasing it.)

 

Personally I have two Freesync mons and a laptop that uses Freesync, alongside the Xbox One X of course.  I don't currently have a desktop GPU or TV that use it, though I definitely plan to get both at some point.

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast; I would probably not go back to vsync.
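
To make the latency point concrete, here is a toy calculation (an illustration only; the 60 Hz refresh and the frame times are assumed numbers, not measurements from this setup): with vsync a finished frame waits for the next fixed refresh tick, while a variable-refresh (Freesync-style) panel can scan the frame out as soon as it is ready.

    // Toy model (illustration only; all numbers are assumptions): when does a
    // finished frame actually reach the screen with vsync on a fixed 60 Hz
    // display versus a variable-refresh (Freesync-style) display?
    #include <cmath>
    #include <cstdio>

    int main() {
        const double refresh_ms = 1000.0 / 60.0;                     // fixed 60 Hz scanout period
        const double frame_ms[] = {9.0, 12.5, 17.2, 11.0, 20.4};     // made-up render times

        double t = 0.0;                                               // time each frame finishes rendering
        for (double ft : frame_ms) {
            t += ft;
            double vsync_flip = std::ceil(t / refresh_ms) * refresh_ms; // wait for the next refresh tick
            double vrr_flip = t;                                         // panel refreshes when the frame is ready
            std::printf("ready %6.1f ms | vsync shows it at %6.1f (+%4.1f ms wait) | VRR at %6.1f\n",
                        t, vsync_flip, vsync_flip - t, vrr_flip);
        }
        return 0;
    }

Running it shows the vsync path adding anywhere from a fraction of a millisecond up to nearly a full refresh interval of extra wait per frame, which is the input lag being described; the model ignores scanout time, low framerate compensation, and the panel's minimum refresh rate.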

+LostCat
4 minutes ago, LaP said:

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast; I would probably not go back to vsync.

Well, both have supported Freesync for a while now, but the 5700s are a huge upgrade :) Nice. I did the same.

(HDR handling seems a little iffy at times right now on the 5700s, but other than that they're fantastic cards.)

LaP
1 minute ago, LostCat said:

Well, both have supported Freesync for a while now, but the 5700s are a huge upgrade :) Nice. I did the same.

(HDR handling seems a little iffy at times right now on the 5700s, but other than that they're fantastic cards.)

Nah, my monitor was not supported by the Nvidia drivers; it would display a black screen with G-Sync Compatible enabled.

 

Yeah, a lot of people report driver bugs saying the card is almost unusable, but so far I've had the card for over a month and played around 15 games without any problem. Definitely a good upgrade at 2K.

Mindovermaster

We should revamp this poll... LOL

sinetheo
4 hours ago, LaP said:

I just upgraded my GTX 1070 to a Radeon 5700 XT and IMO it was worth it. Freesync is very cool. I would say it's definitely better than vsync: smoother and no input lag at all. You get used to it fast; I would probably not go back to vsync.

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

Mindovermaster
1 minute ago, sinetheo said:

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

'cough cough' driver quality?

sinetheo
44 minutes ago, Mindovermaster said:

'cough cough' driver quality?

Nvidia cards don't have any of the issues AMD ones have. If they do they are miner, and when an Nvidia card comes out it is rock solid on day 1. AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

adrynalyne
1 minute ago, sinetheo said:

Nvidia cards don't have any of the issues AMD ones have. If they do they are miner, and when an Nvidia card comes out it is rock solid on day 1. AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

What does mining have to do with it? Also, your viewpoint is dated AF.

Mindovermaster
1 hour ago, sinetheo said:

Nvidia cards don't have any of the issues AMD ones have. If they do they are miner, and when an Nvidia card comes out it is rock solid on day 1. AMD will take a few years. At least this was the case back when ATI Catalyst was a thing for all their drivers.

ATI was YEARS ago. You're applying that to today? A lot has changed, bud...

LaP
4 hours ago, sinetheo said:

How many BSODs and crashes have you had? I was told on YouTube that the 2070 Super is well worth the price premium due to the driver quality.

 

So far none.

 

I can't speak for everyone, and my experience with AMD is pretty limited; this is only the third AMD GPU I've owned since the early 90s (I was on Nvidia most of the time), but I would say AMD driver problems are exaggerated for the most part. It's hard to know what is true and what isn't; there are so many fanboys on the Internet. So far, after a month and a week, the only game I have trouble with is Fortnite. That said, I had problems with Fortnite in the past with my 1070 too, so I'm not sure it's the GPU. The game always gave me trouble with my Asus MB and Aura (in fact it's more the anti-cheat system giving me trouble). All the other games I've played work perfectly. So far I've played WoW, Destiny 2, Overwatch, Diablo 3, Darksiders, Warframe, Hollow Knight, Path of Exile, Borderlands 3, and some other free games I got on the Epic Store that I'm probably forgetting.

 

The 2070 was way too expensive in Canada. It was $150 more expensive (after taxes) than the 5700 XT for like 5-7% better performance on average (according to most reviewers). Can't say I'm impressed by RTX either; so far it looks like the early shader years, when devs overused it to make everything shiny. Ray tracing is the future, but the current implementation kind of sucks IMO. I just could not justify paying $150 more for basically the same performance. Maybe the people having problems with the 5700 XT bought the bad models. I know the MSI Evoke is known to run super hot (over 100 degrees Celsius on the memory) because of badly placed thermal pads.

 

Anyway, so far so good. The drivers are at a very early stage of development too, so I expect things to improve. It's a new architecture and there are always bumps in the road that come with it. My last AMD card was a 5850 something like 12 years ago or so, and it ran fine back then; I don't expect things to have changed much at AMD. I would also say that, GPU drivers aside, I actually prefer Adrenalin 2020 over GeForce Experience.

Edited by LaP
  • 2 weeks later...
sinetheo
On 1/4/2020 at 7:11 PM, Mindovermaster said:

ATI was YEARS ago. You're applying that to today? A lot has changed, bud...

Not according to Gamers Nexus. Read all the hate comments below.

 

 

So many people had to return their 5700 XTs after they would crash constantly.

Director Fury

AMD for my CPU / Nvidia for my GPU

+LostCat
On 1/4/2020 at 9:14 PM, LaP said:

Anyway, so far so good. The drivers are at a very early stage of development too, so I expect things to improve. It's a new architecture and there are always bumps in the road that come with it. My last AMD card was a 5850 something like 12 years ago or so, and it ran fine back then; I don't expect things to have changed much at AMD. I would also say that, GPU drivers aside, I actually prefer Adrenalin 2020 over GeForce Experience.

I hate to say it but I've had far more issues since 19.12.2 than anyone should ever have to deal with o.o

 

Until now I considered AMD drivers pretty solid over the years; this has been a strangely broken experience. At least it's getting fixed up nicely, but damn.

LaP
42 minutes ago, LostCat said:

I hate to say it but I've had far more issues since 19.12.2 than anyone should ever have to deal with o.o

 

Until now I considered AMD drivers pretty solid over the years; this has been a strangely broken experience. At least it's getting fixed up nicely, but damn.

I have 19.12.3 installed and so far it's fine. Like I said, Fortnite gives me a black screen after a while, but that's the only problem I have. Not sure if it's the drivers or something else; I'm not really playing the game anymore, so I don't really care anyway. I heard the drivers can cause problems (related to the boost clock) if the FPS is not capped and vsync is not activated at very high frame rates. I don't think I capped my FPS or activated vsync in Fortnite, so that could be the problem, but like I said I don't really care about the game, so meh. Anyway, the 2070 Super is simply too expensive in Canada and the 2060 Super's performance is a little underwhelming for the price. I trust AMD will solve the main issues over the next few months, especially since the upcoming 5800 XT and 5950 XT will use the same drivers and architecture, so I'll keep mine for sure.
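
For reference, "capping the FPS" just means the game loop limits how fast it submits frames. A minimal sketch is below (assumed target and structure, not any game's or driver's actual code; the 141 FPS figure is only an example a few frames under a 144 Hz panel's ceiling):

    // Minimal sketch of a frame-rate cap (illustrative only): if the frame
    // finished early, sleep for the rest of the frame budget so frames are
    // never submitted faster than the target rate.
    #include <chrono>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        // Example target: 141 FPS, a few frames under a 144 Hz panel's ceiling.
        const auto frame_budget = std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / 141.0));

        for (int frame = 0; frame < 1000; ++frame) {
            const auto start = clock::now();

            // ... simulate and render the frame here ...

            const auto elapsed = clock::now() - start;
            if (elapsed < frame_budget) {
                std::this_thread::sleep_for(frame_budget - elapsed);  // cap the FPS
            }
        }
        return 0;
    }

In-game limiters and external tools do essentially this, which also keeps the GPU from sitting at its maximum boost clocks the whole time.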


