Intel's Next IGP Slated to Run Sims 3



Not a fan of the Sims, so I don't really care about this news, although someone else may find it useful:

Is Intel now focusing on the PC gamer? That's what a recent slide reveals, posted over on Donanimhaber.com. According to the image, Intel's next-generation integrated graphics processor will focus on the larger mainstream and casual gaming markets. Simply called the Intel HD within the slide, the next-generation IGP will be capable of playing The Sims 3, World of Warcraft, Battlefield Heroes, and even a Nancy Drew title.

While it may sound like we're dripping with sarcasm, we're really not. Previous helpings of Intel-based IGPs haven't been real winners, especially when it comes to PC gaming. If Intel has any hope of gaining some kind of market share in the gaming industry, it will need to crank out an IGP capable of 1080p video and DirectX 11 graphics.

http://www.tomshardware.com/news/Intel-IGP...aming,9333.html


This intrigues me on no levels at all

Thanks for sharing!

==========================

This is great news for a lot of people. I'd be interested to know what they consider "playable", though.


How can an IGP capable of modern gaming not be of interest? It also means it will run Aero a lot better, for example; fewer people will require dedicated cards in their laptops, which means lighter devices, less power consumption, less heat, and longer battery life...

See beyond the ridiculous title!


Sims 3 and World of Warcraft are pretty light games; my old Radeon 9200 used to run WoW very well.

With the latest engine updates and the new shadowing, though, it probably wouldn't fare so well now.

As someone else posted, this kind of performance impacts more than just games, from Aero to Media Center.


Thanks for sharing!

==========================

This is great news for a lot of people. I'd be interested to know what they consider "playable", though.

I know, so useful :D

I'd assume at least 30 fps at a reasonable quality level. Perhaps we'll just have to wait and see.
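To put a number on "playable": an fps target is just a per-frame time budget. A quick sketch of that standard conversion (nothing here comes from Intel's slide, it's just the arithmetic behind fps thresholds):

```python
# An fps target is a per-frame time budget: at 30 fps, the whole
# CPU + GPU pipeline has about 33 ms to produce each frame.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given fps."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame; 60 fps -> 16.7 ms per frame
```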

How can an IGP capable of modern gaming not be of interest? It also means it will run Aero a lot better, for example; fewer people will require dedicated cards in their laptops, which means lighter devices, less power consumption, less heat, and longer battery life...

See beyond the ridiculous title!

It would be great for those not really interested in gaming to still have that kind of power in their system,

e.g. those who game occasionally but not enough to bother getting a dedicated card :laugh:


I only care if this graphics chip is going to end up in netbooks with Atom processors. I'm still more likely to buy an ION system, though.


So they're thinking of using The Sims 3, and the horribly programmed (see Northrend), rather CPU-bound WoW as the definitive benchmarks for this new chip? How underwhelming. I'd be more impressed if they said something like "our next IGP will be able to play UT3 decently" or something equally demanding.


It wasn't really very long ago that onboard graphics were widely supported. Anyone remember onboard Nvidia TNT2 graphics? It was the last great integrated graphics chip, and it ran anything you could throw at it. I'd like to know why onboard video isn't supported like it used to be.

Now things have changed, so it doesn't really matter what Intel does with integrated graphics. If game developers don't support it, and most don't, then it's useless for gamers.

I found this out the hard way when I built my current rig 2 years ago. I decided not to get a PCI-E video card to save a little cash, so I went with onboard video (Intel's GMA 3100). I quickly found out it was only good for games close to 10 years old, like the Thief series. Hellgate: London was the newest title it would run, and poorly at that. Oblivion wouldn't run at all, and it even told me my onboard video wasn't supported. I ended up with a $26 MSI 8400GS, and it plays today's games well enough for me.

Even though today's onboard video chipsets are powerful enough to run games like Oblivion, Fallout 3, and many others, game developers simply refuse to support onboard video.

So unless Intel has come up with some sort of awesome hardware emulation to fool games into thinking they're running on a supported PCI-E card, this news probably means nothing.

I suppose it's possible Intel is paying off game developers, like EA, to support their future integrated graphics. EA could easily issue a patch for Sims 3 to add support. This is most likely the case.



Err...

TNT2... recently?...

And the game devs don't support any chipsets, internal or external. Developers use DirectX or OpenGL; the graphics chips support DX or OGL. There's no need for developers to code for specific graphics chips; in fact, they're not supposed to.

And what exact integrated chipsets do you think could easily support Oblivion and Fallout 3? Integrated chipsets are simply underpowered.
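For what it's worth, this is roughly how that works in practice: an engine queries the capabilities the API reports and picks a render path from them, rather than checking the chip's name. A minimal sketch of the idea; `Caps` and `pick_render_path` below are made-up illustrations, not a real DirectX/OpenGL API:

```python
from dataclasses import dataclass

# Illustrative only: a stand-in for the capability bits DirectX/OpenGL
# actually report (shader model, hardware T&L, extensions, ...).
@dataclass
class Caps:
    shader_model: float   # e.g. 2.0 on a GMA 3100, 3.0+ on newer parts
    hardware_tnl: bool    # hardware transform & lighting

def pick_render_path(caps: Caps) -> str:
    """Choose a render path from reported capabilities, not the GPU's name."""
    if caps.shader_model >= 3.0 and caps.hardware_tnl:
        return "full"
    if caps.shader_model >= 2.0:
        return "reduced"   # where many integrated chips land
    return "unsupported"

print(pick_render_path(Caps(shader_model=2.0, hardware_tnl=False)))  # reduced
```

This is why a game can "not support" an IGP without anyone writing chip-specific code: the chip simply fails the capability check.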



I never used the word "recently."

Stop nitpicking. I know devs don't support specific chipsets. It doesn't change what I said.

I also never said onboard graphics weren't underpowered, nor did I imply such a thing. I simply said onboard graphics aren't supported anymore, even though many onboard chipsets are fully capable of running some current titles.

What I did clearly state, however, was my lack of knowledge at the time about current onboard video with respect to current games.

Intel's GMA 3100, for instance, is fully DX9-compliant and supports OpenGL 1.4 plus extensions, with Shader Model 2.0, MPEG-2 decode, and full 1080p support (on the G33 chipset only). While it's true that the lack of Shader Model 3.0 support and hardware T&L holds it back, all of Intel's GMA chipsets above mine include those features or better.

Source: http://download.intel.com/products/graphic...phics_guide.pdf

And just a little FYI, once upon a time many games did indeed support specific chipsets. I've been playing PC games for nearly 30 years, so I know what I'm talking about here. In many games' options menus, you had to select your video chipset from a list of supported models. Of course, this was long before DirectX and OpenGL, and even before Windows.


I wish they would just add a standardized socket to the motherboard and sell GPUs like CPUs. The sizes of high-end cards are ridiculous, and GPUs would benefit from real HSFs instead of the under-performing junk they put on today's cards.


I want to run Crysis 2 or Left 4 Dead 2, not the piece of **** called "Sims 3". Does Intel think that geeks are interested in integrated graphics? :sleep: Only my mom or my grandpa will love that crap from Intel.


I wish they would just add a standardized socket to the motherboard and sell GPUs like CPUs. The sizes of high-end cards are ridiculous, and GPUs would benefit from real HSFs instead of the under-performing junk they put on today's cards.

It's called PCI Express... it's more standardized than the myriad of CPU sockets.


This is very good news. A more capable IGP = more people gaming (and better gaming for them) = a win for PC gaming.

Nvidia has already been doing this with the GeForce 9400M.

So they're thinking of using The Sims 3, and the horribly programmed (see Northrend), rather CPU-bound WoW as the definitive benchmarks for this new chip? How underwhelming. I'd be more impressed if they said something like "our next IGP will be able to play UT3 decently" or something equally demanding.

It's obvious they're appealing to the casual gaming market, and that market is not interested in BioShock or UT3.


It's called PCI Express... it's more standardized than the myriad of CPU sockets.

And why couldn't they create a standardized socket like they've done with PCI-E? Just because Intel and AMD like to change CPU sockets all the time doesn't mean GPU makers would have to do the same. Besides, think of PCI, AGP, and PCI-E as if they were sockets: AGP replaced PCI, and PCI-E replaced AGP. It doesn't matter how the hardware interfaces; sooner or later a better interface comes along and replaces the old one. The same would hold true if they moved to a GPU socket on the motherboard, only we would benefit from better cooling, more space, and possibly reduced cost, since they'd only be selling us a chip rather than a giant card with capacitors, RAM, etc.



His point is that putting it on a socket would offer no benefit over a slot.

At one time, Intel had CPU slots as well.



Uh... yeah... I outlined a few benefits of going with a GPU socket in the last part of my post.


This topic is now closed to further replies.