Intel explains faked DX11 Ultrabook demo at CES 2012

Earlier this week, we reported that during Intel's CES 2012 press event, it showed a demo of a racing game, Codemasters' F1 2011, running on DirectX 11 on an Ultrabook. Except that it wasn't a live game demo at all but merely a video of the game. Now Anandtech has received an explanation from Intel about the CES 2012 demo. Intel claims that the game demo was a late addition to its press conference and as a result the company didn't have enough time to run the demo as a live gameplay presentation.

Despite this apparent attempt at deception, Intel insists that if it had had time to get F1 2011 up and running, the demo would have worked. As proof, the company showed the site a notebook that, while not the same one used at the CES 2012 press event, ran the game on a 2.5 GHz engineering sample chip with a single GPU using Ivy Bridge graphics. The above video shows that notebook running F1 2011 live, with DirectX 11 graphics enabled.

While this may be proof that Ultrabooks can run DirectX 11 games, it's a shame that Intel tried to pull a fast one on a tech-savvy media audience at its CES 2012 press conference. The company would have been much better off had it simply not shown the demo at all and instead held private meetings showing F1 2011 running live. This should be a lesson to any other tech company tempted to try something similar on members of the media, who are more than likely to catch such a blatant attempt at deception.


27 Comments


Seriously, why cheat and make a crappy excuse? Everyone knows AMD has nothing on Intel, so why do they keep cutting corners? No wonder Nvidia is in bed with them...

Still pretty impressive for Intel. They've never had fast onboard graphics before. I remember playing WoW on Intel HD Graphics and it was so rough.

There are a few exceptions, but almost all demonstrations are like this. When you see an upper-level exec of a company on stage, there is a team of people backstage actually running the computer, moving the mouse, typing on the keyboard. Microsoft, Apple, and even Intel do it this way.

I have given speeches at tech conferences (I am not an exec, so I ran the computers myself) and you cannot concentrate on talking and typing/pointing/clicking/touching at the same time. To make it look fluid, there are people who do the work.

Intel was just not as good as the others at hiding it.

Pretty sure there wasn't any deception. I mean it was at a huge tech event that was covered by tech-educated press and the video controls were clearly displayed on the screen. Pretty sure Intel realized people would see that, they even joked about it during the presentation, and I'm pretty sure everyone there knew about it.

who cares? either way, if it's a video of the game, or the game itself...it still shows the graphics capabilities. Since there is no FRAPS counter at the top, who cares...

SirEvan said,
who cares? either way, if it's a video of the game, or the game itself...it still shows the graphics capabilities. Since there is no FRAPS counter at the top, who cares...

How is it showing its graphical capabilities? A cheap ARM processor can decode a 1080p video of Crysis on Max settings, yet if it tried to actually play the game (never mind that it would be emulated) it'd cry.

SirEvan said,
who cares? either way, if it's a video of the game, or the game itself...it still shows the graphics capabilities. Since there is no FRAPS counter at the top, who cares...

Showing a pre-rendered video and showing real-time in-game rendering are in no way comparable. Please don't comment if you don't understand!

No, it does not show the graphics capability. By your logic if a computer is capable of playing a movie like Die Hard then it should be capable of playing games just as detailed.

SirEvan said,
who cares? either way, if it's a video of the game, or the game itself...it still shows the graphics capabilities. Since there is no FRAPS counter at the top, who cares...

I agree... as long as it was recorded off the laptop and not rendered by something else. Presuming it's a screen capture of the game running on that machine, then it would look exactly the same either in real time or recorded.

Kushan said,

How is it showing its graphical capabilities? A cheap ARM processor can decode a 1080p video of Crysis on Max settings, yet if it tried to actually play the game (never mind that it would be emulated) it'd cry.


Please, let's not bring father Crysis into this. Not even uncle Battlefield. We know they are not up to it, yet.

nik louch said,

Showing a pre-rendered video and showing real-time in-game rendering are in no way comparable. Please don't comment if you don't understand!

Perhaps it is you who doesn't understand, so I'll explain it. If I have a game...let's say BF3. While playing the game, I run a program such as FRAPS to record everything on the screen. While not entirely equal, the playback rate/quality shown is indicative of the quality of the hardware in the machine. Crappy hardware...low-quality graphics and/or bad playback framerates. High-end hardware...smooth playback. Next time you shouldn't jump to conclusions.

boumboqc said,

OMG. You are the cancer of Neowin, dude.
I don't even want to know what you think about adult movies played on a computer.

um what? What do adult movies have to do with anything?

lt8480 said,

I agree... as long as it was recorded off the laptop and not rendered by something else. Presuming its a screen capture of the game running on that machine then it would look exactly the same either realtime or recorded.

Thank you. At least someone else here has a somewhat open mind. I don't know if it was shot via video camera or with something like FRAPS, but if FRAPS, then it is still a decent measure of the hardware in the machine.

Kushan said,

How is it showing its graphical capabilities? A cheap ARM processor can decode a 1080p video of Crysis on Max settings, yet if it tried to actually play the game (never mind that it would be emulated) it'd cry.

I'm tired of replying to people who can't think outside the box. My dual-core Atom HTPC can decode 40 Mbps 1080p content, but it won't run BF3, let alone run screen-recording software like FRAPS while doing so.

So here it is one last time: IF the detail in the video is really good, and the playback is smooth and not jerky, AND it was shot with something like FRAPS, THEN it shows that the hardware in the Ultrabook is good enough.

SirEvan said,

Perhaps it is you who doesn't understand, so I'll explain it. If I have a game...let's say BF3. While playing the game, I run a program such as FRAPS to record everything on the screen. While not entirely equal, the playback rate/quality shown is indicative of the quality of the hardware in the machine. Crappy hardware...low-quality graphics and/or bad playback framerates. High-end hardware...smooth playback. Next time you shouldn't jump to conclusions.

It's you who is assuming in this case. Show me where they admit they did this by capturing content rendered on their own machine!

SirEvan said,

I'm tired of replying to people who can't think outside the box. My dual-core Atom HTPC can decode 40 Mbps 1080p content, but it won't run BF3, let alone run screen-recording software like FRAPS while doing so.

So here it is one last time: IF the detail in the video is really good, and the playback is smooth and not jerky, AND it was shot with something like FRAPS, THEN it shows that the hardware in the Ultrabook is good enough.

And you're still wrong!

nik louch said,

And you're still wrong!


Prove it?

Show me FRAPS recordings of BF3 on a single-core Sempron with integrated graphics... and show me FRAPS recordings on an SLI/CrossFire i7 rig.

nik louch said,

Showing a pre-rendered video and showing real-time in-game rendering are in no way comparable. Please don't comment if you don't understand!

Completely agree.

@OP
As many people have pointed out, pre-rendered video and real-time in-game rendering are completely different and cannot be compared at all. Anyone with a basic understanding of how graphics hardware works would have no problem understanding this. Even as an 8-year-old child I was aware of this.

You're crazy for suggesting a DX11 game requiring high-end specs can run on an Atom processor!

It could also be intentional! This tech of theirs got three rounds of coverage in total.

First, the original demonstration. Second, the discovery of the fakery. And third, Intel's response to the second report.


Now it is ingrained in our memories.

FMH said,
It could also be intentional! This tech of theirs got three rounds of coverage in total.
Now it is ingrained in our memories.

IMHO no memory is better than bad memory in this case.

wolftail said,

IMHO no memory is better than bad memory in this case.

Touché! But now you know the name of this tech, and *who* is behind this 'revolutionary new technology'!

Shadrack said,
It wouldn't be the first time a company has tried something like this... won't be the last.

NVIDIA NVIDIA NVIDIA NVIDIA NVIDIA NVIDIA NVIDIA