Microsoft removes sharpening filter in latest Xbox One update

One change in the latest Xbox One update that didn't get a mention during the announcement is the removal of the sharpening filter which was used while upscaling lower resolutions to 1080p.

Now, many users on the NeoGAF forums have observed that Microsoft has indeed removed the filter, giving users a better visual experience than before. The news was also confirmed by Drew McCoy of Respawn Entertainment, the developer behind Titanfall, who said, "This should also make the scaler much better" and urged Titanfall beta testers to do a before/after comparison.

Xbox One is rumored to get better support for high resolution games through an updated SDK as revealed by a game developer, but until then upscaled games should look better with the removal of the highly criticized sharpening filter.

Readers can check out the full gallery of comparisons performed with games such as Dead Rising 3, Killer Instinct and Call of Duty: Ghosts at the source.

Source: NeoGAF via DualShockers | Image via DualShockers


40 Comments


4K gaming is already here. I made a YouTube channel using my setup which shows console-level framerates with settings I found appropriate for each game (search nvidia4kgaming on YouTube). Does it run at a locked 60fps? Of course not, but the fact that it runs at all and is playable is impressive.

I mean, it is great and all, but seriously, both the PS4 and XB1 are SERIOUSLY underpowered, not even able to deliver the decent 1080p that the PC master race is accustomed to. For instance, in PC gaming 4x anti-aliasing is achieved by basically rendering the game at 4 times the resolution and then DOWNSCALING IT. The Xbox and PS4 just can't match even a basic gaming rig... and they are supposed to be next gen?

I'll be playing at 4K on my PC, thanks. 1080p is soooooooooo low rez.
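For readers curious about the supersampling neonspark describes, here is a minimal NumPy sketch of the idea: the frame is "rendered" at twice the display's width and height (four times the pixels) and then downscaled by averaging blocks of samples. The random frame and the simple box-filter average are stand-ins for a real renderer and whatever kernel an actual GPU scaler uses.

```python
import numpy as np

def supersample_downscale(image, factor=2):
    """Downscale a supersampled (H, W, C) frame by averaging
    factor x factor pixel blocks (a simple box filter), as in
    basic SSAA/FSAA."""
    h, w = image.shape[:2]
    assert h % factor == 0 and w % factor == 0
    # Group pixels into factor x factor blocks and average each block.
    blocks = image.reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# "Render" at 2x the display's width and height (4x the pixels), then
# downsample; sized at 1/10 scale here to keep the demo light.
frame_4x = np.random.rand(216, 384, 3)  # stand-in for a 2160x3840 render
frame = supersample_downscale(frame_4x, factor=2)
print(frame.shape)  # (108, 192, 3)
```

Each output pixel blends four rendered samples, which is what smooths jagged edges, and also why 4x supersampling costs roughly four times the rendering work.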

neonspark said,
I mean, it is great and all, but seriously, both the PS4 and XB1 are SERIOUSLY underpowered, not even able to deliver the decent 1080p that the PC master race is accustomed to. For instance, in PC gaming 4x anti-aliasing is achieved by basically rendering the game at 4 times the resolution and then DOWNSCALING IT. The Xbox and PS4 just can't match even a basic gaming rig... and they are supposed to be next gen?

I'll be playing at 4K on my PC, thanks. 1080p is soooooooooo low rez.

My PC that I built 4 years ago for 4x the price of the PS4 can't do full 1080p at high settings in some games. Also, I don't have any monitors or TVs that support 4K, and until the price drops below $400 for a 4K TV I won't own one either.

Have fun with your 4k.

firey said,

My PC that I built 4 years ago for 4x the price of the PS4 can't do full 1080p at high settings in some games. Also, I don't have any monitors or TVs that support 4K, and until the price drops below $400 for a 4K TV I won't own one either.

Have fun with your 4k.

It's 2014, and that price point will happen sooner than you think.
http://www.pcworld.com/article...-the-4k-display-frenzy.html

The myth that 4K is for the rich is silly. It's the future.

Have fun living in the past.

Mark said,
This is *NOT* how anti-aliasing works!

really?
http://en.wikipedia.org/wiki/Spatial_anti-aliasing

"In digital signal processing, spatial anti-aliasing is the technique of minimizing the distortion artifacts known as aliasing when representing a high-resolution image at a lower resolution."

BATTA BUM.

"Full-scene anti-aliasing by supersampling usually means that each full frame is rendered at double (2x) or quadruple (4x) the display resolution, and then down-sampled to match the display resolution. Thus, a 2x FSAA would render 4 supersampled pixels for each single pixel of each frame. Rendering at larger resolutions will produce better results; however, more processor power is needed, which can degrade performance and frame rate. Sometimes FSAA is implemented in hardware in such a way that a graphical application is unaware the images are being supersampled and then down-sampled before being displayed."

I love Wikipedia.

"Super sampling anti-aliasing (SSAA),[2] also called full-scene anti-aliasing (FSAA),[3] is used to avoid aliasing (or "jaggies") on full-screen images.[4] SSAA was the first type of anti-aliasing available with early video cards. But due to its tremendous computational cost and the advent of multisample anti-aliasing (MSAA) support on GPUs, it is no longer widely used in real time applications. MSAA provides somewhat lower graphic quality, but also tremendous savings in computational power."

I also love Wikipedia. Of course you quoted a way that anti-aliasing works, but we're not using FSAA nowadays!!!

The method that most of us are actually using:
http://en.wikipedia.org/wiki/Multisample_anti-aliasing

neonspark said,

It's 2014, and that price point will happen sooner than you think.
http://www.pcworld.com/article...-the-4k-display-frenzy.html

The myth that 4K is for the rich is silly. It's the future.

Have fun living in the past.

IMO he is not so much living in the past as you are living in a future that may not exist for 5-10 years. I just don't have faith that 4K won't be a fad like 3D. It took 1080p years to go mainstream, and that was mainly because TV and cable finally changed their infrastructure; I just don't see them making another change anytime soon, or figuring out a codec able to deliver 4K that way. Not to mention all the movies that were JUST remastered to 1080p. Until then I don't see prices coming down enough on these sets. 4K has an uphill battle IMO, and it would have been stupid to spend the money to make a console right now that was capable of 4K GAMING. It only would have catered to spoiled high-end PC gamers.

Mark said,
"Super sampling anti-aliasing (SSAA),[2] also called full-scene anti-aliasing (FSAA),[3] is used to avoid aliasing (or "jaggies") on full-screen images.[4] SSAA was the first type of anti-aliasing available with early video cards. But due to its tremendous computational cost and the advent of multisample anti-aliasing (MSAA) support on GPUs, it is no longer widely used in real time applications. MSAA provides somewhat lower graphic quality, but also tremendous savings in computational power."

I also love Wikipedia. Of course you quoted a way that anti-aliasing works, but we're not using FSAA nowadays!!!

The method that most of us are actually using:
http://en.wikipedia.org/wiki/Multisample_anti-aliasing

I realize there are many methods, but you said that is not how AA works, so you're in fact incorrect with that clearly ignorant statement. Furthermore, as I didn't say it was the ONLY method, you're just making a moot point. The point I was making, before you failed as a troll btw, was that rendering at 4x or higher on PCs for the purposes of AA is not just typical; it's actually been something we've had for many years, and I personally prefer it, as the results are much better IMO if I have the power to spare.

Houtei said,
IMO he is not so much living in the past as you are living in a future that may not exist for 5-10 years. I just don't have faith that 4K won't be a fad like 3D. It took 1080p years to go mainstream, and that was mainly because TV and cable finally changed their infrastructure; I just don't see them making another change anytime soon, or figuring out a codec able to deliver 4K that way. Not to mention all the movies that were JUST remastered to 1080p. Until then I don't see prices coming down enough on these sets. 4K has an uphill battle IMO, and it would have been stupid to spend the money to make a console right now that was capable of 4K GAMING. It only would have catered to spoiled high-end PC gamers.

Of course 4K isn't a fad; unlike the 3D stuff, there is no headgear to wear. 1080p looks awful on anything larger than a 24-inch screen and horrendous on any TV larger than 50 inches. We're already seeing cameras, and no doubt soon smartphones, that record in 4K. YouTube already supports it, and so will Netflix.

4K is already the future; many just hopelessly resist it. The switch to HD took a long time. The switch to 4K will be much faster, as the display technology required for this kind of pixel pitch at an affordable price has long been around, which wasn't the case when the HD switch took place.

Plus, the TV landscape is very competitive, and OEMs will bombard the 4K space with ever lower prices, to the point that you won't be able to sell an "HD" TV and no consumer will really opt for one in most markets where consoles sell.

Furthermore, MSFT and Sony will have to compete with the Steam box, which btw will easily do 4K gaming over the next couple of years and make the pressure on Sony and MSFT to keep up unbearable.

MSFT/Sony don't have a choice. In fact, both are probably going to try to one-up each other, since both would love to seize on this huge market.

Houtei said,
IMO he is not so much living in the past as you are living in a future that may not exist for 5-10 years. I just don't have faith that 4K won't be a fad like 3D. It took 1080p years to go mainstream, and that was mainly because TV and cable finally changed their infrastructure; I just don't see them making another change anytime soon, or figuring out a codec able to deliver 4K that way. Not to mention all the movies that were JUST remastered to 1080p. Until then I don't see prices coming down enough on these sets. 4K has an uphill battle IMO, and it would have been stupid to spend the money to make a console right now that was capable of 4K GAMING. It only would have catered to spoiled high-end PC gamers.

4K will be a "fad" only because 8K is also right around the corner. Netflix is already going to stream in 4K, and cable/satellite companies now have a new competitor to keep up with.

theyarecomingforyou said,
Or you could have changed it to "non-native".

Native in this case means what the game engine is rendering at; the statement is correct as is.

George P said,

Native in this case means what the game engine is rendering at; the statement is correct as is.

They both missed my point; I was trolling the Xbox, not the way the statement was written.

George P said,
Native in this case means what the game engine is rendering at; the statement is correct as is.

Native resolution relates to displays, which have a single fixed resolution, whereas consoles / PCs aren't limited to any particular resolution. Therefore "while upscaling to 1080p from non-native resolutions" would be more precise. It's not really important though.

theyarecomingforyou said,
Or you could have changed it to "non-native".

Well, actually, if your TV is 720p-only then the native res is 720p.

This has nothing to do with native res, as that is variable. You should read "while upscaling lower resolutions to 1080p".

theyarecomingforyou said,
None of this would be an issue if the Xbox One was powerful enough to render at 1080p natively.
Jub Fequois said,
Then why the need for upscaling?

"Xbox One is rumored to get better support for high resolution games through an updated SDK as revealed "

Due to the SDK being either hard to work with, inefficient, or limiting in some way. We don't know the details.

-adrian- said,
Well - the Xbox is powerful enough to render 1080p natively

The issue is not that it can't render at 1080p but that it isn't powerful enough to do so for a large number of games.

WizardCM said,

"Xbox One is rumored to get better support for high resolution games through an updated SDK as revealed "

Due to the SDK being either hard to work with, inefficient, or limiting in some way. We don't know the details.

It seems incredibly convenient that Microsoft can simply improve the SDK and suddenly have a console comparable to the PS4. I mean, we're talking about games on the XB1 running at half the framerate or half the resolution - these aren't small differences. And that's predicated on the notion that Sony won't be improving its own SDK.

I always question miracle fixes like this.

Jub Fequois said,

Then why the need for upscaling?

Remember that the Xbox One had to have a lot of its software rewritten after the change from the online-only version; game developers now have to catch up with that new software, which is why companies with closer ties to MS (like Turn 10) have 1080p games out. Some of these games take years to program and couldn't just be started from scratch again.

theyarecomingforyou said,
Not impressed - it just seems to make everything more blurry.

What games are you noticing blurriness in?

theyarecomingforyou said,

The issue is not that it can't render at 1080p but that it isn't powerful enough to do so for a large number of games.

The same was true of the 360; it just takes time for the drivers and SDK to catch up. I recall the 360 wasn't even able to do 1080p at all.

theyarecomingforyou said,
Not impressed - it just seems to make everything more blurry. None of this would be an issue if the Xbox One was powerful enough to render at 1080p natively.

To my eyes, the PS4 and XB1 are a pixelated fest, so you're just splitting hairs. Why Sony and Microsoft didn't go 4K is beyond understanding, given that 4K TVs and monitors are going to see huge uptake over the lifespan of both consoles.

So technically, both consoles are already not powerful enough.

theyarecomingforyou said,

It seems incredibly convenient that Microsoft can simply improve the SDK and suddenly have a console comparable to the PS4. I mean, we're talking about games on the XB1 running at half the framerate or half the resolution - these aren't small differences. And that's predicated on the notion that Sony won't be improving its own SDK.

I always question miracle fixes like this.

I can tell you as a PC gamer that the right DRIVERS can make all the difference. It isn't magic; it's math and serious algorithm tuning. It's SCIENCE.

neonspark said,

The same was true of the 360; it just takes time for the drivers and SDK to catch up. I recall the 360 wasn't even able to do 1080p at all.

The 360 and the PS3 had some native 1080p games. They were all sports games, however.

McKay said,

The 360 and the PS3 had some native 1080p games. They were all sports games, however.

Yes, after driver updates on the 360 enabled it. The PS3 could do 1080p gaming while the Xbox could only do 720p and 1080i. Eventually MSFT added an HDMI out and drivers that could make games run at 1080p. The point being that I'm confident 1080p will be the standard for games on both consoles before long.

neonspark said,
I can tell you as a PC gamer that the right DRIVERS can make all the difference. It isn't magic; it's math and serious algorithm tuning. It's SCIENCE.

I can tell you as a PC gamer that no driver improvement ever doubles the framerate or the resolution. The only way for that to be the case would be if the SDK was fundamentally broken, which would have been easily noticeable and strikes me as unlikely given the amount of resources Microsoft has committed to the Xbox brand.

neonspark said,
To my eyes, the PS4 and XB1 are a pixelated fest, so you're just splitting hairs. Why Sony and Microsoft didn't go 4K is beyond understanding, given that 4K TVs and monitors are going to see huge uptake over the lifespan of both consoles.

Even top-end multi-GPU PCs struggle to render most games at 4K and the ownership level of 4KTVs is remarkably low. Doing so would have made the consoles prohibitively expensive and would have benefited a tiny minority of owners. That said, Microsoft and Sony both put out underpowered machines for this generation and I can't see them being able to maintain a seven year lifecycle like before.

neonspark said,
The same was true of the 360; it just takes time for the drivers and SDK to catch up. I recall the 360 wasn't even able to do 1080p at all.

The X360 was never designed to render at 1080p; it was designed for 720p, which it achieved. Only several years into the lifecycle did games like GTA IV and Halo 3 have to drop the resolution below 720p to maintain performance and visual quality, though both respective sequels rendered natively at 720p as development tools improved.

This generation has been very different, with a large percentage of launch titles struggling to hit 1080p (more so on the XB1 side).

theyarecomingforyou said,
Not impressed - it just seems to make everything more blurry.

People who don't know what they're talking about always say that. The scaler sharpened everything and made games look seriously nasty, as if they had no anti-aliasing when they did. ALL devs hated it, and many gamers did too. What you see now is a pure, unaltered image, how the games should actually look, like when you play on a PC at a non-native res.

If you like killing your eyes with an oversharpened, pixelated image then simply turn up the sharpness on your TV. You have that option. Before the scaling fix people did NOT have an option; it was always oversharpened.
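As an aside, the kind of post-scale sharpening being debated here is typically some variant of an unsharp mask. The NumPy sketch below is a generic illustration of that idea, not Microsoft's actual scaler filter (the 3x3 box blur and the strength value are made up for the example): the difference between the image and a blurred copy is added back, which exaggerates edges and, when overdone, produces the halo/ringing artifacts people complained about.

```python
import numpy as np

def unsharp_mask(image, strength=0.5):
    """Sharpen by adding back the detail lost to a blur:
    out = image + strength * (image - blur(image)).
    The overshoot this creates at edges is the source of halos."""
    h, w = image.shape
    # 3x3 box blur built from shifted copies; borders clamp via padding.
    padded = np.pad(image, 1, mode="edge")
    blur = sum(padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0
    return np.clip(image + strength * (image - blur), 0.0, 1.0)

# A vertical step edge: 0.2 on the left, 0.8 on the right.
edge = np.full((5, 5), 0.2)
edge[:, 2:] = 0.8
sharpened = unsharp_mask(edge, strength=0.5)
# The columns flanking the edge overshoot: ~0.1 on the dark side and
# ~0.9 on the bright side -- the "halo" effect around edges.
```

Applied after upscaling, that same overshoot lands on top of the scaling artifacts, which is consistent with the complaint in this thread that the filter made games look more aliased than they really were.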

theyarecomingforyou said,

The issue is not that it can't render at 1080p but that it isn't powerful enough to do so for a large number of games.


Can we just clear this up? The difference in horsepower between the PS3 and Xbox One is minuscule, and the difference is certainly not big enough to cause the Xbox One to render all games at significantly lower resolutions. We really don't know yet why games like COD: Ghosts render at 720p. We'll see if future games change or continue the trend. Let's keep it at that.

NoClipMode said,
People who don't know what they're talking about always say that. The scaler sharpened everything and made games look seriously nasty, as if they had no anti-aliasing when they did. ALL devs hated it, and many gamers did too. What you see now is a pure, unaltered image, how the games should actually look, like when you play on a PC at a non-native res.

But that's exactly the point: playing games at non-native resolutions leads to horrible blurry visuals with artifacts. The sharpening filter was an attempt to restore the image's natural sharpness, with the drawback being the addition of some visual artifacting. Some will prefer one over the other, but neither comes close to rendering natively at 1080p.

I wasn't defending the sharpness filter; I was just pointing out that the image is now a lot more blurry, exactly as you would expect on a PC running at a non-native resolution. As a PC gamer I consider that sort of image quality unacceptable.

FalseAgent said,

Can we just clear this up? The difference in horsepower between the PS3 and Xbox One is minuscule, and the difference is certainly not big enough to cause the Xbox One to render all games at significantly lower resolutions. We really don't know yet why games like COD: Ghosts render at 720p.

I assume you mean PS4, and no, we haven't established that the difference is "minuscule". When we're talking about games on the XB1 running at half the framerate or half the resolution, we're talking about a substantial difference. Some of that might be attributable to issues with the SDK, but we simply have no way of knowing how much of a factor that is. COD: Ghosts isn't the only game we're talking about here, not even close.

At the end of the day, the PS4 is outperforming the XB1 by significantly larger margins than the X360 did the PS3.

theyarecomingforyou, you are assuming that current XB1 games running at 720p are using all available power of the system. That is not necessarily true. Also, as development for the console advances (through better SDKs and adoption of new coding practices) devs will be able to make games run more efficiently.

If you recall, most games that came out in the first year for the PS3 did not look as good as their Xbox 360 counterparts; that was because it took devs a while to figure out the platform. After about a year they had it down, and then most people agree the games were either the same or the PS3 had the edge.

sphbecker said,
theyarecomingforyou, you are assuming that current XB1 games running at 720p are using all available power of the system. That is not necessarily true.

There is no incentive for developers not to utilise all available system power. At the end of the day the same is true of development for the PS4, so there really is no excuse.

sphbecker said,
Also, as development for the console advances (through better SDKs and adoption of new coding practices) devs will be able to make games run more efficiently.

It should be pointed out that during the last generation we saw the resolution of high-profile games actually drop because of how hard developers were pushing graphics, with both GTA IV and Halo 3 running sub-720p. Even then, the degree to which they lowered the resolution was nowhere near as dramatic as this generation.

I mean, 720p (which is what XB1 games like COD: Ghosts, BF4, MGSV: Ground Zeroes and Killer Instinct run at) is less than half the resolution of 1080p, and last gen we didn't see any games running at 640x360. We're talking about a MAJOR difference.

neonspark said,

Yes, after driver updates on the 360 enabled it. The PS3 could do 1080p gaming while the Xbox could only do 720p and 1080i. Eventually MSFT added an HDMI out and drivers that could make games run at 1080p. The point being that I'm confident 1080p will be the standard for games on both consoles before long.

I'm referring to native 1080p games, not the upscaled resolution. Pretty much all 360 games listed 720p, 1080i and 1080p on the back of the case, but that was the upscaled res. Both the 360 and the PS3 had half a dozen native 1080p games, mostly sports games; FIFA Street was one of them. And one of the tennis franchises was native 1080p, but I forget which one.