I use Macs for everything, but every now and then there’s a piece of software that is only available on Windows, so I run it in a virtual machine on my iMac. After installing the new Yosemite release on my mid-2011 i5 iMac, I was happy with its performance and its new “skin” (look and feel). When it came to using good old Windows 7, running in VMware Fusion (version 7.0.0), things started off just fine, as they had on Lion, Snow Leopard, and the other flavors of OS X. But as soon as I launched an application in the virtualized Windows, I noticed it was hogging all my CPU resources. I was baffled; this had never happened before after an OS upgrade. My CPU usage was way over its limit. See the image below:
vmware-vmx used 211% of my CPU resources.
After 15 minutes of research on Apple’s forums and on VMware’s forums, I found out that it was a known issue with Yosemite. Fortunately for me, others had experienced the same problem and found a workaround. Apple has also acknowledged the issue and is working on a patch.
To get your VMware Fusion virtual machines working properly again, follow the instructions below.
1. Open up a Terminal window.
2. Run the following command in Terminal. When prompted for a password, enter your administrator password.
sudo nvram boot-args=debug=0x10
3. Restart your computer.
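If you prefer to do the whole thing from the command line, the steps above boil down to the short sketch below. The `nvram boot-args` command is the workaround itself; the verification and revert commands are standard macOS `nvram` usage I’m adding for convenience.

```shell
# Set the kernel boot argument that works around the Yosemite CPU spike
sudo nvram boot-args="debug=0x10"

# Confirm the variable was actually written to firmware
nvram boot-args

# Reboot so the new boot argument takes effect
sudo reboot
```

Once Apple ships its patch, you can delete the boot argument again with `sudo nvram -d boot-args` and reboot, returning your machine to its default configuration.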
Leave a comment below if it worked for you.