So I just sold my MacBook and am going to be upgrading to a MBP. I have been reading up on the current ones and have seen that almost everyone recommends going with the 15" 2.4GHz over the 2.66GHz, as the ~$400 difference is not justified. What still sways me towards the higher-end 15" is that its 330M has 512MB vs the 256MB in the base model. I will be using it for games such as The Sims 3, Civilization, and SC II. I will also be using Photoshop. Will I notice a huge performance difference between the 512MB and the 256MB?
I am not really a gamer on my laptops; is there any other reason I would want 512MB vs. 256MB? Can OS X tap that memory for other functions? I just do programming, web dev, virtual machines, email, office, etc.
I have a late 2008 15" MBP with an NVIDIA GeForce 9600M GT w/256MB of RAM. I am thinking of trading it for a current 13" MacBook with a GeForce 320M w/256MB.
Am I right in assuming the new MacBook's GPU is inferior to my current one's? If so, how much of a hit would I take in games like Left 4 Dead 2, Portal 2, etc.?
I am looking to use Photoshop CS4 and was wondering if it is worth the upgrade to the NVIDIA GeForce 9400M + 9600M GT with 256MB, or should I just stick with the NVIDIA GeForce 9400M? I am not able to spend more than the $1,999 price point, so these are my only two options. I need the portability of a laptop, otherwise I would go with the iMac. I am sorry if this has been asked before.
My 2.4GHz i5 MBP runs a lot better when using discrete graphics. Minimizing and maximizing windows, scrolling, opening folders, etc. all run more smoothly with the 330M. Is there a way to always run the 330M when plugged in, and switch to automatic graphics switching when unplugged? This would be the perfect solution for me.
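A minimal sketch of one possible approach, assuming your OS build exposes pmset's undocumented gpuswitch key (0 = integrated only, 1 = discrete only, 2 = automatic). That key is not officially documented and may not exist on every dual-GPU MacBook Pro, so treat this as an experiment, not a supported feature:

```python
# Sketch: set a per-power-source GPU policy through pmset.
# ASSUMPTION: the undocumented "gpuswitch" key is present on this model.
import subprocess

def set_gpu_policy(power_flag, mode):
    """power_flag: '-c' (charger) or '-b' (battery); mode: 0, 1, or 2."""
    subprocess.check_call(["sudo", "pmset", power_flag, "gpuswitch", str(mode)])

set_gpu_policy("-c", 1)  # plugged in: stay on the discrete 330M
set_gpu_policy("-b", 2)  # on battery: let OS X switch automatically
```

You can check whether the setting took with `pmset -g` in Terminal.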
My macbook pro 15" wouldn't turn on a couple of days ago but yesterday i turned it on and everything seemed to work. Then i realised that the iSight and bluetooth weren't working. Apart from that it was fine. I then decided i wanted to use windows through bootcamp but when i tried to start it up it just black screened and didn't boot up. I then realised that it might be my graphics card so i went into system profiler and found this. I then tried both the PRAM reset and a SMC reset it didn't fix it. Is there anything else i can do? Or should i just take it to apple?
I have noticed that using the 9600M GT causes my Mac to run about 20-30 degrees warmer than the 9400M. For instance, on the 9400M I'd normally see about 45-60°C when running Firefox and iTunes, but when I switch to the 9600M GT and perform the same tasks it gets up to about 85°C, if not 90°C! I just thought this was a tad peculiar. Anyone with similar experiences / solutions for cooling this thing down (besides smcFanControl, which I already have installed)?
I'm thinking of getting a new MBP, but I want to be able to play MMORPGs like Warhammer Online or similar games coming out. Can anyone speak from first-hand experience with the various 15" MBPs with i5 or i7, with the 256MB GT 330M vs the 512MB one? I would love to be able to get the 13", but I'm sure that video card won't make it happen.
Right now I have a 2009 MBP with a 2.66GHz Core 2 Duo and the 256MB 9600M GT. I am looking to buy a 17" i5 with the 512MB 330M. With regard to gaming under Boot Camp, how much better is the 330M than the 9600M GT? Will I notice a big difference in my games, or will it be barely noticeable? I tried to play Black Ops under Boot Camp and it was barely playable.
The standard Boot Camp drivers are quite old, so I went to upgrade the drivers from nvidia.com; I noticed that NVIDIA doesn't list the 330M on its driver page. Am I to assume the 330M is an Apple-specific part? Will NVIDIA support the 330M in an upcoming release?
I used to use the 9600M GT only for gaming, since the 9400M handled everything else, gave better battery life, and ran cooler.
But I've noticed that when I'm playing 1080p video the 9600M GT performs better:
1080p on YouTube: no problem at all on the 9400M.
1080p in VLC: sometimes laggy on the 9400M, but the 9600M GT performs well.
1080p in Plex: no problem at all on the 9400M.
That's why I think the automatic graphics switching on the new MacBook Pros is cool. Sometimes I'm on the 9600M GT, decide to take the Mac to the living area, and forget that I was on the dedicated card while running on battery.
graphic card used in 15" MBP. Performance in digits from here shows that the 9400M is worst than older brother - 8600M GT. Apple wants extra 300$ for "better performance" of 15" MBP.Is it worth it for me? I haven't play any game since 8 years. Now im using 24" 2.8GHz & 20" 2.0GHz iMacs, but i need some easy to carry computer. New MBP will be my main machine for work on apps from adobe (Photoshop, Illustrator, Indesign and once in a year After Effects to make video from vacation).
Does anyone have any idea how StarCraft II, released in a few days, will run on a MacBook Pro with a 9600M GT? I understand this game is largely CPU-intensive, so I am hoping that for once it will run as well as you'd expect on a less-than-two-year-old laptop.
I have seen some footage of the beta running on a MBP with the 9600M GT, and very nicely at that, so that leaves two questions (they may be very difficult to answer):
1) Are we expecting the game to run a lot slower on OS X than on XP, as games usually do (including WoW)?
2) Are we expecting performance to be significantly better in the final release than in the beta?
A while back there was a discussion about a couple of apps people had written that show in the menu bar whether you're using the Intel HD or the NVIDIA 330M on an i5/i7 MacBook Pro. I now have an i7 MBP but can't find the thread anywhere (tried MRoogle too!).
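In the meantime, a rough way to check from Terminal: a short sketch that parses `system_profiler SPDisplaysDataType` and guesses the active GPU from which GPU section the built-in display is listed under. This is a heuristic, not a documented API (the menu bar apps from that thread watch the graphics driver directly):

```python
# Sketch: guess the active GPU by seeing which GPU section of
# `system_profiler SPDisplaysDataType` has a display attached to it.
import subprocess

out = subprocess.check_output(["system_profiler", "SPDisplaysDataType"])
current_gpu = None
for line in out.decode("utf-8").splitlines():
    stripped = line.strip()
    if stripped.startswith("Chipset Model:"):
        # Each GPU's fields begin with its "Chipset Model" line.
        current_gpu = stripped.split(":", 1)[1].strip()
    elif stripped.startswith("Resolution:") and current_gpu:
        # A "Resolution" line means a display hangs off this GPU.
        print("Display is driven by: %s" % current_gpu)
        break
```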
I know a couple of guys in the MBP refresh gaming thread were able to overclock using NVIDIA tools. I'm not having any luck; the sliders are grayed out and I can't make any changes with System Tools 6.06. I've also gotten errors when trying to install NVIDIA drivers other than the ones included with Boot Camp. What driver/System Tools versions are you using to OC? And what settings have you tried? (I'm on Windows 7 64-bit.)
I would first like to start off with the disclaimer that I am not good at writing guides, and I am also not a pro at overclocking, but here is my attempt at both. Also, I am not liable for damage to your computer, and as fobis has mentioned, each GPU, even of the same model, may overclock better or worse than the next. So take my overclocking numbers for what they are worth; experiment and try it out on your own.
Note: This guide assumes you're running Windows 7 64-bit, and also that you are new to overclocking.
-------------------------------GUIDE------------------------------
1. First, make sure you have a copy of Windows installed through Boot Camp.
2. Then go ahead and install the drivers that came with Boot Camp. (We won't be using the GPU drivers, but the rest are going to be useful anyway, so you might as well install them.)
3. After you have all that, go here to get a modified driver. This will give you better performance than the poor drivers Apple supplies, and it will also let you overclock the GPU.
4. After you have downloaded both the driver and the INF file, open the driver and it will extract its files to the directory you choose. It will also try to launch the installer, but that will fail with something like "no compatible hardware found". Ignore this for now.
5. Now copy the INF file to the folder the driver was extracted to. When it asks if you want to overwrite the file, say yes.
6. Open Device Manager by right-clicking My Computer and selecting Properties; this opens a new window, and on the left there should be a link that says Device Manager.
7. Under "Display adapters", right-click the only device that shows up and choose Uninstall.
8. This will likely mess up your resolution and set it to 800x600. Don't worry, this is normal. Now just restart your computer.
9. When it starts back up it will say new hardware was found. Choose to install it manually, via the option that says something along the lines of "search for drivers in a specified location".
10. It will take you to a new page, and on that page there should be an option that says "Have Disk". Choose it and select the directory you extracted the driver to earlier in this guide. It should find the file it needs and install just fine.
11. You will need to restart again once this is done, but when you start back up your resolution should be fixed. If not, just right-click the desktop, hit Screen Resolution, and change it back to the native resolution.
12. Download NVIDIA System Tools, found here.
13. Go ahead and install it; the installer should be self-explanatory.
14. Once it is installed, open the program and go to the Performance tab on the left. (It might ask you to agree to some terms of use.)
15. Just enter these numbers and hit Apply: 646 for the first slider, 864 for the second, and 1314 for the third.
Now you're done. If, like me, you're paranoid about overheating your computer, you can also optionally download and install LubbosFanControl to max out your fans and keep things as cool as possible.
Enjoy your faster GPU!
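Optional sanity check: if you want to confirm after the reboots that the modified driver actually took, without digging through Device Manager, here is a quick sketch that queries WMI through Windows' built-in PowerShell. `Win32_VideoController` and its `Name`/`DriverVersion` fields are standard WMI, but treat the exact output formatting as machine-dependent:

```python
# Sketch: print the GPU name and driver version via WMI/PowerShell.
import subprocess

query = ("Get-WmiObject Win32_VideoController | "
         "Select-Object Name, DriverVersion | Format-List")
print(subprocess.check_output(["powershell", "-Command", query]).decode())
```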
----------------------BENCHMARKS-----------------------------
Before OC:
Furmark: 912 points; FPS min=13, max=22, avg=15
Crysis: 24.89 fps
Unigine Sanctuary Demo (everything on default except resolution turned down to 1280x800): DX10: 24.9 fps (score: 1057); OpenGL: 23.2 fps (score: 982)
Unigine Tropics Demo (same settings): DX10: 18 fps (score: 452); OpenGL: 16.3 fps (score: 410)
Unigine Heaven Demo: DX10: 14.8 fps (score: 372); OpenGL: 12.6 fps (score: 317)
3DMark06: 5975
3DMark Vantage: P2294

After OC:
Furmark: 1081 points; FPS min=16, max=26, avg=18
Crysis: 33 fps
Unigine Sanctuary Demo (same settings): DX10: 31.2 fps (score: 1322); OpenGL: 28 fps (score: 1211)
Unigine Tropics Demo (same settings): DX10: 21.7 fps (score: 546); OpenGL: 19.8 fps (score: 498)
Unigine Heaven Demo: DX10: 15.7 fps (score: 395); OpenGL: 16.2 fps (score: 408) (WTF? OpenGL wins? lol)
3DMark06: 6994
3DMark Vantage: 2922
Note: Crysis was run at 1280x800 with everything on medium except physics on very high.
Another note: the highest GPU temp under load from Crysis was about 78°C after about 15 minutes of running the game. Furmark got the temp up to 80°C, also after about 15 minutes.
I have also played TF2 at max settings at 1920x1200 for over two hours to test stability, and it ran fine without any hiccups.
Also, I feel this card can be pushed further than this (I have not tried), but from what I see it cools a lot better than I expected for a laptop. I come from a world of desktop overclocking.
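For anyone who wants the overclock gains as percentages, a quick back-of-the-envelope script using only the before/after numbers reported above:

```python
# Compute the percentage improvement for each benchmark (before, after).
results = {
    "Furmark (points)": (912, 1081),
    "Crysis (fps)":     (24.89, 33.0),
    "3DMark06":         (5975, 6994),
    "3DMark Vantage":   (2294, 2922),
}
for name, (before, after) in results.items():
    print("%-18s %+5.1f%%" % (name, 100.0 * (after - before) / before))
```

That works out to roughly +19% in Furmark, +33% in Crysis, +17% in 3DMark06, and +27% in Vantage.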
The 9400M was in older MacBook Pros, and I've seen many videos where people run Crysis on high/very high and get pretty good FPS. Those laptops aren't as fast as the newer ones, and yet my MBP (specs down below), which is faster, can only run Crysis on medium at around 30 FPS, unless I wanted 5 FPS on high. The only explanation, to me at least, is that the 330M chip is not as good. Is this the case? If so, why the hell did the graphics chip get downgraded? Here is a link to one of the vids I was talking about: [URL:....] This guy is using the demo for his vid. When I use the demo, it doesn't even give me the option of using very high settings, and when I'm on high, my screen flashes in and out, I get random lines across the screen, and then my computer freezes.
Is there a significant difference between the new NVIDIA 330M graphics and the older 9600M GT found in the slightly older MacBook Pros (I am talking about both having 512MB of RAM)? I'm not really bothered about the auto-switching deal, as I will be in Windows frequently for gaming and using Snow Leopard for university work. Oh, and as a side note, will the i5/i7 make much difference when gaming (more specifically Modern Warfare 2 and Gears of War)?
I'd like to know how many people overclock their 330M, and by how much. Somebody without Windows who hardly ever plays any kind of game probably doesn't overclock, but there are probably many who do play yet still don't overclock. The only game I am playing now is Modern Warfare 2, and overclocked, the 256MB 330M offers about 20% more performance: that is the difference between 25 and 30 fps. I found that temps don't really rise. At default the GPU reaches about 70°C and the CPU around 50°C (idle is 32-40°C). Overclocked to 645/808/1451, the GPU averages 73°C and the CPU is a little lower at 48-49°C, because the fans work a little harder while the CPU has the same amount of work. That sounds pretty cool to me, and I didn't even test any higher, because a notebook GPU is usually a cool chip that overclocks well as long as it is kept cool enough, while desktop GPUs on the same silicon sometimes run at almost 90°C stock. The max temp is higher, at 82°C, which it reaches for a short period after game start until the cooling really kicks in. There is no fan-control program running. That is a roughly 30% increase in core clock with almost no increase in temps.
I don't know how to read fan speeds in Windows, but I don't notice the fans over the loud gameplay anyway. What are your clocks? I don't like benchmark tools, as they don't measure what really matters to me. I know I get higher temps with Furmark, but I am never running Furmark for fun; I play CoD. For testing I used the mission "Of Their Own Accord" in Act 2 of the MW2 campaign, because it is probably one of the hardest, with the lowest minimum framerates. At stock I get 24-25 fps running outside, with a minimum of 22; inside it is 33-60 fps. Overclocked it is minimum 28, 30-31 outside, and 38-120 indoors. I use the native (high) resolution but no AA and no "soften smoke edges", because that kills performance in some scenes and I don't see the difference. Everything else is on the maximum I can switch it to, and the textures are high. AA is only missed in some levels with high contrast where edges start flickering a lot, but in general I prefer the higher resolution over AA. It is just more detail, and it's easier to see the enemy.
I am just wondering if the 9600M GT will be better if I am using a large external monitor or multiple external monitors, or will the 9400M be basically the same? The only reason I am not just getting the 2.66 version is that I do not want to be tempted to game in college. I will also be getting a 25% employee discount from a friend, so the 2.53 will cost about $1,720 with tax and AppleCare and the 2.66 will cost $1,920 with tax and AppleCare. Also, for those who do not game or do 3D editing: how often do you use your 9600M GT?
I searched but didn't find anything too specific on here. Can anyone recommend some good overclock values for my 9600M GT in Boot Camp using NVIDIA System Tools? Basically, I'm looking for a stable profile or some recommended settings you've been using to overclock your 9600M in Boot Camp. Maybe upload a screenshot like the one below with your custom adjusted settings for memory, core, and shader clocks. I'd rather not play around with it myself if someone has already found a fast, safe setting that increases performance.
So I have a 1.8GHz dual G5 with 3GB of RAM for work. I mainly work in Adobe CS and do a fair amount of Photoshop work. At any given time I may have all of Adobe CS plus Office and a few other apps running, and a gazillion fonts. I went to the store and saw the new 24" iMac. How would a new iMac compare to my late-'04 1.8 DP G5? On that same note, how would a new MacBook Pro compare?
I just got the new Office 2011, which I use every day for university, primarily PowerPoint for presentations and note-taking. Today I noticed that my battery drained more quickly than usual and found out that the MBP uses the NVIDIA 330M whenever PowerPoint is open. I did some research, and neither Word nor Excel makes the MBP use the 330M; only PowerPoint does. Is there any way to force the Intel graphics on all the time, or at least when running on battery?
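One thing to try, as a sketch only: some builds of OS X expose an undocumented gpuswitch setting through pmset (0 = integrated, 1 = discrete, 2 = automatic). That the key exists on your model is an assumption on my part; if it isn't there, one of the menu bar GPU-switching utilities people have written is the usual route:

```python
import subprocess
# ASSUMPTION: the undocumented pmset "gpuswitch" key exists on this model.
# Force the integrated Intel GPU while on battery only ("-b").
subprocess.check_call(["sudo", "pmset", "-b", "gpuswitch", "0"])
```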