Analyst Expects Nvidia to Acquire AMD

Doug Friedman, an analyst with American Technology Research, said that graphics chip maker Nvidia Corp. could well acquire x86 microprocessor maker Advanced Micro Devices in order to "re-architect it". The acquisition would make sense, the argument goes, because the roadmaps of AMD and Intel Corp. threaten Nvidia. The only problem for the graphics giant is that AMD's x86 license is non-transferable.

"We believe AMD [could] face mounting pressure from shareholders, to restructure the company with a focus on a change in leadership," said the analyst. Indeed, shareholders of AMD are hardly pleased with the company's performance in the recent quarters as well as issues with the launch of quad-core microprocessors and the release of DirectX 10 graphics processing units. Nevertheless, late last year AMD managed to secure $622 million from Mubadala Development Company, which means that there are those who believe in AMD.


75 Comments


Isn't AMD's offering technically superior to Intel's?

AMD's chips are true quad cores, i.e. four completely separate cores, while Intel's are basically two dual cores shoved together with gaffer tape. Also, AMD has on-chip hardware virtualization (whatever that does, but apparently it's good for certain things). So the chip is technically superior; it's just lacking quite a bit in performance, so as with anything that'll be AMD's thing to work on, and it'll only get faster and faster (probably with future cores). So maybe nVidia wants that type of technology under its belt.

Okay, this would suck for competition, but nVidia rocks when it comes to graphics. Of course, I can't say anything bad about the GameCube being built on ATI technology. That was a wonderful gaming system. I still prefer nVidia and Intel for my PCs though. ATI GPUs don't play well with Linux, which is one reason why I enjoy nVidia. As for Intel, I don't hear about any real innovation from AMD. Intel rocked the world when HyperThreading came out. Of course, dual-core CPUs have made that technology obsolete, but it was a very nice stepping stone for its time.

This would be VERY bad for the consumer. When you buy out your competition, I can guarantee you prices will skyrocket, since ATI is owned by AMD.

It would also mean even slower GPU updates from Nvidia, who are already milking their customers for every last dime.

This could never happen, could it? By buying AMD they would also get ATI, which would give them a monopoly over the graphics card business. Wouldn't a move like this be blocked by the EU/DOJ, or whoever is in charge of stopping things like that from happening?

Firstly, as has been said many times, Intel owns the graphics chip market as it is. These companies just have the enthusiast segment. Secondly, if it were an issue, NVidia could still sell off ATi if all it wanted was AMD. But with Intel, a high-profile company, apparently re-entering the enthusiast market in some way in the future, NVidia likely wouldn't face too much pressure in that regard, I'd feel.

This simply can't happen. If nVidia owned ATi, there would be NO competition in the GPU market (except for a few lamers), which would be simply disastrous (and possibly impossible) from an economic and consumer point of view.

Interesting facts no one seems to have mentioned:

-ATi makes the GPUs in the Wii AND the Xbox 360 (the two best-selling 7th gen consoles). It isn't dying.
-AMD's profits are going down, but so did Intel's in 2007, according to an article in the newspaper a few months ago.

AMD will probably bounce back. In case you guys forgot (it was such a long time ago), AMD was on top until Intel's Core 2 Duos came out.

Also,
Synthetic is right: IBM (worth 150bil) could top ANY offer nVidia (worth 13.4bil) throws at AMD.

AMD will not be merged or bought out right now (as in Q1). Barring the TLB bug in the K10, we don't know what the future revisions will bring to the table. And yes, it's those revisions that will either make or break AMD.

I'd be crying out in pure joy if this ever happens. The match I've been dreaming about: AMD + nVidia. There is also the added bonus of ATi.

Now, I hear Intel is going to come up with their own dedicated graphics card soon. That could be really good news.

Whatever the hell they were smoking when they came up with this, please pass it this way :D
This won't and can't happen; it's so friggin' unheard of just to think about it.

(RAID 0 said @ #17.1)
I feel the same way about ATi. The Catalyst Control Center sucks a big hairy dong.

Translation: Of course you can install just the Display Driver with ATI, Nvidia on the other hand.....

(D-M said @ #17.2)

Translation: Of course you can install just the Display Driver with ATI, Nvidia on the other hand..... :rolleyes:

IT'S STILL BLOAT! Ya dig?

Maybe nVidia is foreseeing something in Intel's entrance into high-end graphics...
Or maybe they want to fully implement their general-purpose GPU-CPU vision.

But I don't believe this is happening, at least not for a few years...

"Analyst Expects to Manipulate Stock Price of AMD" should be the title. There is nothing here but speculation from an industry analyst here.

I'm surprised nobody has mentioned the fact that Intel is going to be hitting the discrete GPU market soon, so there will still be competition.

Intel tried that before; remember the Intel740 (and later the Intel752)? While better than S3's infamous ViRGE, all ATI had to do (and did) was keep a lower-end chip around and basically price Intel out of the market. It's why Intel has stayed in integrated graphics ever since and not dared to do a discrete chip of their own since ATI whacked them a good one.

Also, while AMD *has* been getting hammered in the CPU market (not just because of the Phenom/Opteron flap), they also now have a *value* DX10 GPU (though it wasn't planned as such) in the HD2600PRO/XT, which is something that Big Green *doesn't* have right now. Why isn't it *just* because of the issues with Phenom/Opteron? Even if Phenom (and Opteron's earlier Barcelona-core CPUs) had gone off swimmingly, they would *still* be facing the Core2Quad juggernaut, which is more about sheer volume and price than performance (Intel's Core2Quads, especially the Q6600, are still largely overkill for mainstream use, despite their mainstream pricing). Even more so than Opteron/Athlon64, XEON and Core2 are, quite literally, the same architecture in different packaging, backed by Intel's sheer production volume, with which AMD simply cannot compete, even with IBM's help. Lastly, despite the Q6600's mainstream pricing, what is its marketshare in terms of even desktop CPU sales? Why is it that the E8500 is outselling the Q6600 (despite there being more Q6600-ready motherboards than those ready for the E8500)?

I hope it doesn't happen but it probably will. Seems like the competition always dwindles down to nothing. It doesn't seem that long ago that we had 3DFX, ATI, Video Logic, Matrox, Nvidia, Intel, Trident, Number 9, Hercules, Diamond, S3 and others as choices. All of that competition drove a lot of innovation. Now all we have are basically two choices, both outrageously priced and each new model comes with a slightly louder and more massive vacuum cleaner attached to it. :(

Then again, this is just some analyst talking. Getting paid to make guesses and spread rumors.

(Skyfrog said @ #11)
I hope it doesn't happen but it probably will. Seems like the competition always dwindles down to nothing. It doesn't seem that long ago that we had 3DFX, ATI, Video Logic, Matrox, Nvidia, Intel, Trident, Number 9, Hercules, Diamond, S3 and others as choices. All of that competition drove a lot of innovation. Now all we have are basically two choices, both outrageously priced and each new model comes with a slightly louder and more massive vacuum cleaner attached to it. :(

Then again, this is just some analyst talking. Getting paid to make guesses and spread rumors.

Interestingly enough, 3DFX, Video Logic(?), Number 9, and maybe Diamond are the only ones gone; the rest still make GPUs in some form or fashion.

True, but they aren't really in the 3D market anymore, or even the consumer market in some cases. PowerVR (formerly VideoLogic) only makes chips for embedded devices now, so no more PC video cards from them. Matrox makes cards for things like professional workstations. I believe S3 only makes onboard graphics chips for motherboards, like Intel does.

The only real choices of 3D cards we have right now are ATI or NVIDIA. I really hope Intel will get back into making standalone cards. I think they could really give our two stale competitors a run for the money if they really tried. I wouldn't mind seeing them buy S3 if they had to (S3 bought Number 9 so they'd have that as well). Would be great to see three big players in the 3D card market again.

The only reason nVidia would acquire AMD is if they want their reputation to go straight down the crapper. Why on Earth would nVidia want to add two unsuccessful brands to its business?

AMD and ATI are only unsuccessful right now. AMD dominated the CPU market, as far as performance and price are concerned, before Intel brought the Core 2 Duo to the table. ATI had the best GPU until NVIDIA released the 7000 series.

Technology performance is cyclical. The performance crown changes hands between the big players in their respective markets.

(Fanon said @ #10.2)
Technology performance is cyclical. The performance crown changes hands between the big players in their respective markets.
I'll add that I've personally been rather disappointed with Nvidia's performance of late. We're verging on 18 months since the 8800GTX and we don't have a new top-end card (if you consider the Ultra an almost-overclocked GTX). Even the new 9800GX or whatever, due next month according to rumors, only yields a 30% improvement, which I find awful for a card that's running in SLI and being compared to a more or less 18-month-old card running on its own. If the rumour is true, I'm wondering what was stopping consumers from getting the same performance 18 months ago by placing two 8800s in SLI, besides the cost difference that comes from having them together. Price-wise it's nice; technology-wise it could be a bit disconcerting (look at what Intel has achieved in CPUs in a similar timeframe).

That being said, things aren't a lot better on the ATI side; in fact they are worse. But I feel that if anyone has anything to lose it's NVidia, and if they don't start getting some new cards out they are asking for ATi to pull a fast one on them. I really don't think ATI is so far behind that in a generation or two they can't get ahead.

I think they've hit the speed wall with graphics chips as lately they've had to resort to the giant heat pipes and jet blowers attached to cool them as well as going back to dual cards (Voodoo 2 flashback). I think they need to take an example from Intel and stop worrying about megahertz and move to more efficient solutions. I'm betting dual core graphics chips will be coming soon. Speaking of Intel I hope they are the first to bring them to us. I want a fast card, but I also want a quiet and cool running card.

(Fanon said @ #10.2)
ATI had the best GPU until NVIDIA released the 7000 series.
Bull****, the 7 series GPUs were failures. The X1 series proved again that ATi could implement technology the proper way, i.e. Shader Model 3. Oh, and wasn't Nvidia found to be cheating on AA during that time as well?

Nvidia didn't have anything worthy on the table till the 8 series. You'd be hard pressed to prove me wrong.

(IceBrewedBeer said @ #10.5)

Nvidia didn't have anything worthy on the table till the 8 series. You'd be hard pressed to prove me wrong.
I'd say it was pretty cyclic myself; off the top of my head there were times when NVidia was in front, then ATI. By the end of that card lineup, though, I believe ATI did have the fastest card.

(Skyfrog said @ #10.4)
I think they've hit the speed wall with graphics chips as lately they've had to resort to the giant heat pipes and jet blowers attached to cool them as well as going back to dual cards (Voodoo 2 flashback). I think they need to take an example from Intel and stop worrying about megahertz and move to more efficient solutions. I'm betting dual core graphics chips will be coming soon. Speaking of Intel I hope they are the first to bring them to us. I want a fast card, but I also want a quiet and cool running card.

Dual-core graphics chips are a gimmick. There is nothing positive that doubling the number of cores does for a graphics card that doubling the number of in-core units will not do. The only accomplishments would be to increase the amount of heat generated by the redundant circuitry and to decrease the amount of performance the drivers can squeeze out of the hardware, due to the new layer of complexity. Although the complexity would be less than with SLI, there would be increased cost for the non-parallelized version (i.e. the version most people buy): either yields would decline from using multicore chips on single-core cards, or a separate single-core version would need to be made, which is almost as expensive as making a new chip (design resources need to be used and quality-assurance testing needs to be done). You could imagine that everyone would be sold the dual-core version, but then what is the point of being dual-core in the first place, when the number of in-core units could simply be doubled with no additional complexity and lower power consumption?

Multicore technologies make sense for CPUs because they are not infinitely parallelizable at the instruction level (or arguably on any level), but GPUs are infinitely parallelizable at the pixel level, so there is no point to multicore GPUs.
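To illustrate the pixel-level point, here is a minimal sketch of my own (the shade function, image size, and worker count are all made up for illustration, not taken from the post). Each pixel's colour is a pure function of its own coordinates, so the work splits across any number of workers, shader units, chips, or cards, with no communication between pixels; a CPU's single instruction stream cannot be carved up this freely.

from multiprocessing import Pool

WIDTH, HEIGHT = 64, 48  # hypothetical tiny framebuffer

def shade(pixel):
    # Toy "pixel shader": the colour depends only on this pixel's own
    # coordinates, never on any other pixel, which is what makes the
    # loop embarrassingly parallel.
    x, y = pixel
    r = x / (WIDTH - 1)
    g = y / (HEIGHT - 1)
    b = ((x ^ y) % 256) / 255.0
    return (r, g, b)

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # A serial map(shade, pixels) produces the identical image; only the
    # wall-clock time changes with the number of workers.
    with Pool(processes=4) as pool:
        image = pool.map(shade, pixels)
    print("shaded %d pixels independently" % len(image))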

As for the "speed wall," new graphics chips can be made that are twice as fast as old graphics chips by doubling the number of on chip units, but doing that also has the consequence of increasing the amount of heat the chips generate (not by a factor of two, but close to it). Typically, new process technologies help to alleviate that, but TMSC has been slow to adopt new process technologies, which has brought the graphics industry to a stand-still in terms of its ability to increase performance. Graphics chip designers can get around that by using transistors that are well suited to operation at low voltages (say to about half the current voltages) and lower clock speeds (say by about a factor of two), as in thoery that would enable them to double performance by quadrupling the number of on-core units while slightly lowering power consumption, but that would increase the die area by about 3 to 4 times, which would harm yields, affecting profits and bringing prices up. For these reasons (and the fact that a small amount of people are willing to pay for it), multicard technologies are being adopted, as they allow an increase in performance (theoretically proportional to the number of cards) without increasing the amount of heat that each card's cooling solution has to dissipate (it actually does increase the amount of heat each card generates, but given how little extra circuitry SLI needs to work, the amount is minuscule).

Both methods that graphics chip designers use to increase performance (more units and higher frequencies) require new process technologies to reduce the downsides (larger die areas and/or higher thermal output in a small area), and the only way to get around this is to use the multicard technologies that they are already using. Until new process technologies are available to graphics chip designers, graphics cards will be at a thermal barrier (which, given the power consumption of DirectX 10 graphics cards, should have been hit a long time ago) without resorting to water cooling, which for the majority of the graphics industry's customers is unacceptable. I, myself, find needing a fan to be unacceptable, as I have very sensitive hearing (which is why I have a passively cooled XFX GeForce 7950 GT), but I am in the minority of the graphics industry's customers.

I guess they'd have to get rid of ATI before doing that. AMD/nVidia is what most people wanted since the beginning anyway, lol. If the merger means nvidia owning ATI, they shouldn't do it. It's always good to have an alternative.

"Most people" = you.

nVidia + Intel = The way games are meant to be played

nVidia + AMD = Wasted potential

It's just simple math.

(C_Guy said @ #9.1)
"Most people" = you.

nVidia + Intel = The way games are meant to be played

nVidia + AMD = Wasted potential

It's just simple math.

Except Intel wasn't always on top in price and performance, and they won't be forever. It's a cycle, and this is Intel's cycle.

AMD has been down this route several times before, broke and outperformed, and every time they climb out and beat Intel for the next cycle.

AMD and Nvidia would have some great potential. AMD has always been, and still is, the one that takes chances and has actually been driving processor technology forward, especially lately, while Intel has been following after, copying AMD's stuff. Combining the ingenuity and power behind AMD with the brains at Nvidia would not only bring us awesome GPUs; with a brand name like Nvidia behind it, they could finally make the giant Intel shake.

Don't forget the awesome prices that would bring us, being basically no competition. Oh wait...

Look at Creative, they basically bought up all their competitors and look how innovative and great they have...oh wait again. Not having any competition is bad for consumers no matter how you look at it, so I have to disagree. Nvidia buying AMD would truly suck.

I was referring to the fact that many people on Neowin went all "why ATI and not Nvidia?" when AMD purchased ATI.
AMD kicked ass back then. Now, of course, Intel does, but I wouldn't want Intel to team up with Nvidia anyway.

(C_Guy said @ #9.1)
"Most people" = you.

nVidia + Intel = The way games are meant to be played

nVidia + AMD = Wasted potential

It's just simple math.


I would love to see AMD + Nvidia, but I'm very doubtful that it's going to happen.

You really don't follow the world, do you?

First of all, AMD does not have the money to buy even 1/4 of nVidia's stocks, afaik.

Second, Intel and other small companies also produce GPUs.

(OblivionStalker said @ #8.1)
You really don't follow the world, do you?

First of all, AMD does not have the money to buy even 1/4 of nVidia's stocks, afaik.

Second, Intel and other small companies also produce GPUs.

Lol that was funny. So true.

I really don't think that government approval would be an issue because the leader in the graphics chip market (as far as sales) is actually Intel.

But if AMD is bought out, then I hope it would be by IBM instead of nVidia. That move would really have Intel shaking. With the power of IBM behind them, AMD could make huge advances in gaining marketshare.

IBM no longer cares about the desktop/home market. The sale of their laptop division to Lenovo was their final pull-out of the consumer market.

(HawkMan said @ #7.1)
IBM no longer cares about the desktop/home market. The sale of their laptop division to Lenovo was their final pull-out of the consumer market.

They quit making home desktops and laptops. But look, IBM makes the processors for the 360, PS3, and Wii. Is that not for the "home" market? There's no reason why they wouldn't make processors for home PCs also.

(Chugworth said @ #7.2)

They quit making home desktops and laptops. But look, IBM makes the processors for the 360, PS3, and Wii. Is that not for the "home" market? There's no reason why they wouldn't make processors for home PCs also.

They designed or helped design those CPUs, but they aren't doing anything for the consumer; they are just selling those designs and CPUs to other companies such as MS and Nintendo. Toshiba actually manufactures and owns the Cell now; IBM was just part of the design process, and even that wasn't as involved.

the leader in the graphics chip market (as far as sales) is actually Intel.

Technically true, but consider the market for discrete graphics cards.

Hmm... too monopolistic for me... Here's how it'll go... nvidia acquires AMD/ATI. Year later, Intel acquires Nvidamdati. Another year later, Google acquires Intelvidamdati. Then another year passes and MS acquires Gootelvidamdati.... leaving you with Microsogootelvidamdati.

That was a nice waste of 2 minutes.....

A couple of reasons why it won't happen.

a) antitrust because of ati-amd merger.

b) x86 license is non transferable.

c) Nvidia only has experience in the GPU market. Just because it's done great there doesn't mean that success transfers to the CPU market.

d) Nvidia's hardware ****-ups and malfunctions with the video units in the 6000 and 8000 series would not be tolerated in the CPU market.

Remember, the K8 launch was met with slow supplies and delays. What's happened with the K10 launch is the same, but because AMD needed to be competitive from the start with the K10, AMD's problems have become very public.


Coming back to Nvidia: ironically, it is ATi that will be more of a threat to Nvidia in the next 18 to 24 months than Intel will be. Intel's high-end graphics will not appear until 2009, and then you will have a year or more before gamers will accept Intel high-end graphics. Remember, this is something like Intel's 3rd or 4th attempt to get into the graphics market (barring the IGP/OEM markets).

(krustylicious said @ #5)
A couple of reasons why it won't happen.

c) Nvidia only has experience in the GPU market. Just because it's done great there doesn't mean that success transfers to the CPU market.

If you can make CPU chipsets, you definitely have SOME experience relevant to the CPU market.

Considering the number of f-ups Intel has done over the years, I fail to agree that they wouldn't be tolerated in the CPU market.

Intel has had more and bigger f-ups than Nvidia ever did.

As for the x86 license, they could get a new one. As someone said earlier, ATI and Nvidia are both small players in the GPU market as a whole; it's just the high-end sub-market they're big in. But if Nvidia did buy AMD, I don't see how Intel would be allowed to not license x86 to Nvidia. Since x86 is pretty much THE standard for desktop computers, even with A64 instructions, I doubt Intel would be allowed to refuse licensing, since it'd block competition.

(krustylicious said @ #5)
A couple of reasons why it won't happen.

a) antitrust because of ati-amd merger.

b) x86 license is non transferable.

c) Nvidia only has experience in the GPU market. Just because it's done great there doesn't mean that success transfers to the CPU market.

d) Nvidia's hardware ****-ups and malfunctions with the video units in the 6000 and 8000 series would not be tolerated in the CPU market.


a) As someone above said, Intel is actually the main player.
b) That doesn't mean they couldn't get their own license if they wanted one.
c and d) By buying AMD they would be buying its staff and everything that comes with it. AMD could function as it does today if NVidia wanted, and NVidia owning them doesn't mean their graphics card designers would be making CPUs all of a sudden. Not that they likely couldn't; I'm sure many of them have the necessary training. Considering NVidia makes board chipsets along with video cards, it may be a very natural progression for them, and since they would inherit the AMD task force, probably safer than doing a startup on their own (although there are some inherent dangers, given that AMD's performance of late has been subpar).

"The only problem for the graphics giant is that AMD’s x86 license is a non-transferable one"
Explain?

I would imagine that the license AMD holds to use x86 instructions in its processors (as acquired from Intel) cannot be used by any other company that takes over AMD.

The new owners would presumably have to apply to Intel for a license to manufacture x86-compatible processors in the future, as the existing one with AMD would have expired.

Antitrust regulators will be scrambling to block such a purchase as it would mean there will be only one player in the discrete GPU market.

ATi still operates under the ATi name; it could just as well function as a wing under nVidia, just as AMD would remain a separate business. But it won't happen, yeah.

(Ledward said @ #2)
Antitrust regulators will be scrambling to block such a purchase as it would mean there will be only one player in the discrete GPU market.

I don't think the government cares about some luxury items that a small number of geeks happily blow $500 on every six months. There is a "GPU" market, and nVidia/ATI together represent about 25% of it; Intel is the dominant player.

This is more like Sirius and XM merging, where the antitrust regulators agreed that the "satellite radio" market where these two companies represent 100% of the market, is really just the "radio" market, where they represent 5% of the market.

Don't forget Intel will enter the discreet GPU market in the next few years. There will still be competition, just not immediately.

(Fanon said @ #2.3)
Don't forget Intel will enter the discreet GPU market in the next few years. There will still be competition, just not immediately.

I can't wait to get a "discreet" GPU! My last one blabbed all round town about my 3D exploits. Now my wife has left me and I am unemployable. I look forward to a graphics card that can keep its trap shut!

(Havin_it said @ #2.4)

I can't wait to get a "discreet" GPU! My last one blabbed all round town about my 3D exploits. Now my wife has left me and I am unemployable. I look forward to a graphics card that can keep its trap shut!

Comment of the day!

(Havin_it said @ #2.4)

I can't wait to get a "discreet" GPU! My last one blabbed all round town about my 3D exploits. Now my wife has left me and I am unemployable. I look forward to a graphics card that can keep its trap shut!

You rock my ass.

Wait, if Nvidia acquired AMD, wouldn't that mean they'd own ATI as well? Oh, for the love of god :suspicious: I use Nvidia cards, but there'd be no competition...

nVidia could purchase AMD and (agree to) sell off ATi. Then again, does ATi ever stand a chance against nVidia -- with or without AMD ?

(Cøbra said @ #1.1)
nVidia could purchase AMD and (agree to) sell off ATi. Then again, does ATi ever stand a chance against nVidia -- with or without AMD ?

Yes, I think so, mostly in the mainstream market, as the ATI cards can deliver a pretty good punch for a lower price than, say, the 8800s, although with a little framerate loss (10-15).

(Cøbra said @ #1.1)
nVidia could purchase AMD and (agree to) sell off ATi. Then again, does ATi ever stand a chance against nVidia -- with or without AMD ?

Yes. ATI trounced nVidia when they released their 7000 series. Ironically, NVIDIA didn't really have a competing product until their own 7000 series card.

Actually, if there was only one graphics card maker it would reduce competition but it would also mean that developers would only have one platform to worry about, so performance could be guaranteed. With nVidia also making CPUs (thanks to AMD) and chipsets (as they do already) you would end up with a more closed platform that is more predictable and more akin to Apple (though still actually open).

(theyarecomingforyou said @ #1.4)
Actually, if there was only one graphics card maker it would reduce competition but it would also mean that developers would only have one platform to worry about, so performance could be guaranteed. With nVidia also making CPUs (thanks to AMD) and chipsets (as they do already) you would end up with a more closed platform that is more predictable and more akin to Apple (though still actually open).

Actually, having only one platform isn't really much of an issue anymore, since with DX10, MS and the graphics card makers did some standardization that should pretty much do away with the issues of earlier cards, where all the different cards had different extra functionality and different platform support.

Yes, but they still have strengths in different areas (like shaders, etc). A single manufacturer would allow developers to really exploit strengths and workaround weaknesses. A buyout would not necessarily be a bad thing but I'd rather not take the chance.

(Fanon said @ #1)
ATI trounced nVidia when they released their 7000 series. Ironically, NVIDIA didn't really have a competing product until their own 7000 series card.

Sorry, what?! Do you mean the Radeon 7000? If so, did you have a magic one from Mars, because the Radeon 7000 I had was quite easily outperformed by a GeForce 2... :confused: