OS X UI lag when running Minecraft


Hey Guys,

On all of my Nvidia-based Macs I get terrible UI lag in OS X when running Minecraft in the background. ATI-based Macs don't exhibit these symptoms.

I was wondering if anybody else had seen this issue and been able to resolve it?

I'd like to leave Minecraft running while I tab around my other applications; StarCraft 2 and other games let me do this with no impact on performance.

Thanks

Chris

What are the specs of your Macs?

Which version of OS X?

If it's an iMac, Mac Pro, MacBook Pro, or Mac mini, what is the model number and year?

What is the specific graphics card in each one?

What is the system resource usage before and during your Minecraft sessions?

What is the CPU type?

How much RAM, and what type?

What type of disk are you using, how big is it, is it full, and what is its speed?

As the slowdown could be hardware-related, we'd need more specific details to assist you.
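
If it helps, you can pull most of this straight from Terminal instead of digging through About This Mac. The sections system_profiler prints vary a little between OS X releases, so treat this as a starting point:

sw_vers
system_profiler SPHardwareDataType SPDisplaysDataType SPSerialATADataType

The first command reports the OS X version; the second reports the model identifier, CPU, RAM, graphics card(s), and the drives on the SATA bus.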

MacBook Pro with Retina display; however, the Retina display is turned off in favour of a 24" display at work.

Core i7,

8GB of RAM,

512GB SSD, 250GB Free

nVidia GT 650M

Are you using Optifine?

http://www.minecraft...-and-much-more/

Not using Optifine; I hadn't even heard of it.

Minecraft itself is running fine, easily getting 60+ FPS even while using a high-res texture pack. It's the OS X UI that lags: browsing the web while Minecraft is running results in page loads of a minute or more, versus a fraction of a second without MC running.

I'm thinking that Minecraft is gobbling up 100% of the GPU time or something....
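
Is there an easy way to confirm that? The best I can think of is watching the Java process from Terminal while Minecraft runs; top will at least show whether it's pegging the CPU, though I don't know of a built-in GPU meter in this version of OS X:

top -o cpu -l 2 -n 5

That sorts by CPU usage and prints the top five processes over two samples.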

Optifine is worth a try then; among other things, it adds support for multi-core CPUs. Even though you already have Minecraft running smoothly, you can also bump up detail levels with it (better texture filtering, antialiasing, etc.) and it'll still be smoother than it currently is.

Thanks for that man, very helpful. Is multi-core enabled by default, or is there a switch I need to set?

I installed the Optifine pack but found that my FPS suffered: 32 FPS vs. 400 FPS. I'll keep tweaking to diagnose what happened.

What are your recommended performance tweaks? Again, this is still causing UI lag, but apparently only when I have MSRDC running.

Chunk loading: Multi-core

Also, I thought you said you had a stable 60 FPS... what do you mean about 32 FPS vs. 400 FPS? Either way, my target is always 60 FPS because that's what vsync maxes out at.

Also, the render distance can be increased very drastically with Optifine, so you may have had a drop if you turned that up too far. There's also an option which triples whatever render distance you have set, so if you enable that, your computer is going to be VERY strained.

BTW, it's been a while since I used it. What I can say is that there are many tweaks that can affect performance in a negative way, a hugely negative way in fact! Using the defaults, though, you should get better performance, and roughly stock performance even with all of the quality settings turned up.

Just to be clear as well, you don't necessarily have to change anything at all; installing it produces performance gains on its own. Still, like I said, enable multi-core and you should see a safe performance increase.

The lag you're getting could be chunk loading; I think even if you don't move in Minecraft, the chunks are occasionally refreshed.
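
One more thing, and treat the filename as an assumption since it's been a while: I believe Optifine keeps its own settings in an optionsof.txt file next to the vanilla options.txt, so if your tweaking leaves things in a bad state, quitting Minecraft and deleting that file should get you back to Optifine's defaults:

rm ~/Library/Application\ Support/minecraft/optionsof.txt

Vanilla settings stay in options.txt, so that only resets the Optifine-specific options.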

I installed the Optifine pack but found that my FPS suffered: 32 FPS vs. 400 FPS. I'll keep tweaking to diagnose what happened.

I don't play Minecraft, but given those framerates, that may be your problem.

See if you can cap the framerate to 60 or even 100. Your GPU is working overtime generating 400 FPS; it's a total waste of resources and generates a ton of heat and extra wear on your components. That would definitely rob the rest of your system of GPU resources to render other windows, web pages, etc.

I'd be curious, if that 400 number is accurate, what you see for framerates on the AMD/ATi-based Macs. Are they closer to 60?
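
I don't play Minecraft myself, so treat these specifics as assumptions on my part: from what I can tell, Optifine adds a framerate limiter (and a VSync toggle) if your version's Video Settings don't expose one, and newer Minecraft builds store the cap as a maxFps line in options.txt (on OS X that lives at ~/Library/Application Support/minecraft/options.txt), e.g.:

maxFps:60

Either way the goal is the same: stop rendering frames the display can never show.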

Your GPU is working overtime generating 400 FPS; it's a total waste of resources and generates a ton of heat and extra wear on your components. That would definitely rob the rest of your system of GPU resources to render other windows, web pages, etc.

False. GPUs can't work overtime. Running your GPU at 100% should never generate unhealthy heat levels or create 'extra wear' on your components. A good OS would also never allow it to rob core resources. That's an OS/Nvidia driver issue.

Semantics.

The fact that a modern monitor can display, at the very most, 240 FPS means that a GPU generating 400 FPS is doing extra work ("overtime") that the user will never see or benefit from.
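
To put numbers on it: at 400 FPS the GPU finishes a frame every 2.5 ms, while a 60 Hz panel only picks up a new frame every ~16.7 ms. That means 340 of every 400 frames, 85% of the rendering work, are thrown away unseen.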

Agreed, running your GPU at 100% shouldn't be bad, except when it happens all the time (and it sounds like the OP likes to leave Minecraft running in the background while he performs other tasks on the computer). Running anything at 100% for extended periods of time will cause premature failure.

This is also a laptop, and laptops aren't exactly renowned for their incredible thermal design. A laptop gets hot when used for extended periods of time; there's only so much space in a laptop for cooling.

And I'm not saying that this situation is the norm. I agree that a properly written OS/driver/software stack should not allow these conditions to exist, but the fact is, they do.

You remember a few years ago when bad drivers caused Nvidia GPUs to eat themselves? Or when uncapped frame rates caused overheating problems? It happens.

A game like Minecraft running at 400 FPS is doing less work than a more complex one at 30 FPS, so it's clearly not semantics. FPS is not a metric of GPU utilization.

Stop with the RDF. Running a computer component at 100%+ utilization does NOT cause premature failure. It's not a gas engine. Anecdotal evidence of Nvidia's numerous driver blunders over the years doesn't change that. Sure, it can expose faulty hardware or improper cooling quickly enough, but that's still defective hardware. Hello, burn-in tests...

Now, if you buy crap components, you simply get what you overpaid for, but don't just start making **** up to justify their deficiency. "...a card pushed to the max would not die were it not for poorly engineered and/or defective cooling."

A game like Minecraft running at 400 FPS is doing less work than a more complex one at 30 FPS, so it's clearly not semantics. FPS is not a metric of GPU utilization.

Of course it's less work, but the fact that it is generating frames more often than the rest of the hardware stack (the monitor), or even the user, can make use of means it's doing more work than necessary.

I never said FPS was a metric of utilization, but when a GPU is generating more frames per second than the monitor can display, it's 100% wasted effort that does nothing but generate heat and take resources away from other tasks that may need the GPU.

Stop with the RDF. Running a computer component at 100%+ utilization does NOT cause premature failure. It's not a gas engine. Anecdotal evidence of Nvidia's numerous driver blunders over the years doesn't change that. Sure, it can expose faulty hardware or improper cooling quickly enough, but that's still defective hardware. Hello, burn-in tests...

And you're taking my statement completely out of context. I never said that merely running it at 100% will cause premature failure. I said for extended periods of time. That's the key part: it causes heat buildup, and excessive heat is bad for ICs.

I've not had any of the issues I linked to. I linked to them to illustrate that these things happen in the real world; this isn't theory.

The point of all this is that, no matter what you think in theory, heat is going to exacerbate other issues and can cause premature failure: poorly implemented cooling, or manufacturing defects that aren't serious until heat comes into play (through thermal expansion and the breakdown of the materials used in manufacturing). Running the component at 100% utilization for extended periods of time will only cause those issues to occur more quickly.

The direct cause of a premature failure may be a breakdown of the plastics used in the IC, but the root cause of that would be heat.

On all of my Nvidia-based Macs I get terrible UI lag in OS X when running Minecraft in the background. ATI-based Macs don't exhibit these symptoms.

I was wondering if anybody else had seen this issue and been able to resolve it?

I'd like to leave Minecraft running while I tab around my other applications; StarCraft 2 and other games let me do this with no impact on performance.

I've found that going to System Preferences -> Energy Saver and disabling Automatic Graphics Switching dramatically improves things for me. You can also try installing gfxCardStatus, which replaces that functionality (switching based on what the software needs) and lets you force one card or the other. The slowdown is still there a little bit, but it's at least usable now.
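
If you want to check things from Terminal, pmset lists the current power settings, and on some dual-GPU MacBook Pros that includes a gpuswitch entry (as I recall, 0 forces the integrated GPU, 1 the discrete one, and 2 is automatic). I'm not sure every model and OS X build exposes it, so treat it as something to look for rather than a guarantee:

pmset -g | grep -i gpu

If nothing comes back, gfxCardStatus's menu bar icon is still the easiest way to see which GPU is active at any moment.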
