GeForce 680 GTX

blindhamster

Bronze Level Poster
Thanks for the help with rectifying my loss of overclocking, Buzz :) Will let you know how it goes tonight. +rep

Now I just need to know why my GPU isn't performing correctly...
 

Buzz

Master
I would personally remove the GPU altogether.

So:
1. Go into Add/Remove Programs and remove the GPU drivers etc. Restart.
2. Remove power from the computer and unplug the GPU power cables (no need to remove the GPU itself).
3. Start the computer, then shut down again (to allow the onboard GPU to take its settings).
4. With the computer shut down, boot into the BIOS (tap Del on startup).
5. As in the BIOS post earlier, make sure that the primary GPU is set to PCIe: click Back, select the third tab (Advanced), click System Agent Configuration, then Graphics Configuration, and make sure the primary GPU is PCIe. Press F10 to save and restart.

Shut down once again, plug the GPU power cables back in, and boot the computer. Put in the original disc you got with the GPU and install the drivers.

Restart the computer.

Install and run nvidia_system_tools_6.05, then click the icon for the GPU and see what the clock speeds are there.
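As a cross-check on whatever the vendor utilities report, the current clocks can also be read programmatically. This is just a sketch of my own (not part of Buzz's steps), assuming a system where NVIDIA's nvidia-smi command-line tool is installed and supports the CSV query flags:

```python
import subprocess

def parse_clock_csv(csv_line):
    """Parse one 'graphics clock, memory clock' CSV line into MHz integers."""
    gr, mem = (field.strip() for field in csv_line.split(","))
    return int(gr), int(mem)

def query_gpu_clocks():
    """Ask nvidia-smi for the current graphics and memory clocks (MHz)."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_clock_csv(out.splitlines()[0])

# Offline demo with a sample line in the format nvidia-smi emits,
# matching the 705 MHz reading discussed in this thread:
print(parse_clock_csv("705, 3004"))  # -> (705, 3004)
```

Running query_gpu_clocks() on the affected machine and comparing the result against the advertised base clock would give a second opinion, independent of the vendor tools.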
 

baron75mk2

Banned
You can also use MSI Afterburner to see what your clock speeds are. It has always reported correctly for me with a large number of Nvidia GPUs; the only time it hasn't is with some Radeon cards (it does work with Radeon, but sometimes reports wrong values and sometimes the overclock settings are greyed out; must be a Radeon driver enforcement or something).
 

blindhamster

Bronze Level Poster
Okay, so I'm happy the CPU is sorted now, thanks very much Buzz :) I followed your instructions, along with also getting the exact settings I needed from PC Specialist support.

Even after installing everything from the files, I never did get an OC auto-tuner in the AI Suite; a bit of googling implied that it was taken out of the Sabertooth version.
Here's a screenshot of my Turbo EVO screen:
turboevo.png


So that just leaves my GPU as the issue...
I downloaded Nvidia System Tools and had a look; it seems my 'factory setting' for the 680 is 705 MHz for the graphics clock. I did some googling, and interestingly that was apparently at one point the planned speed for all 680s.
Not sure why mine is still clocked that way (and rubensolo's too, I believe).
Here is a screenshot:
Untitled-1.png


suggestions?

Thanks for all the help so far :)
 

Buzz

Master
Have a look in Manage 3D Settings, Base Profile, and see if your Power Management Mode is set to Max Performance. If not, try that setting and apply.

Then go into Device Manager and disable the GPU, restart, go back into Device Manager and re-enable the card.

Did you try what I posted in my last post?
 

Buzz

Master
Also, what kind of temps is your GPU reading?

No idea how that double posted :)
 

blindhamster

Bronze Level Poster
Hey again Buzz :)

Yes, I followed your previous instructions before posting the last post. Didn't seem to make any difference.

I just tried what you suggested in this post: the power management was set to Adaptive; it's now on Performance.

Score is basically identical to before (a little higher but still nowhere near the other 680 scores in the benchmark thread)

http://3dmark.com/3dm11/3479461

p.s. GPU temp just after running the benchmark was 53 degrees
 

keynes

Multiverse Poster
I contacted Nvidia and they suggested I should contact Palit. The advice given was to reinstall drivers and check different programmes, but I ended up with the same:
core clock: 705 MHz
memory clock: 3004 MHz
As I mentioned before, I have the same readings as other GTX 680 users from the forum's benchmark thread, but I'm not sure why. On the Palit website the GPU is advertised with a 3004 MHz memory clock but a 1006 MHz core clock boosting to 1058 MHz. As additional information, my CPU is not overclocked.
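For what it's worth, those advertised figures can be sanity-checked with a little arithmetic. A rough sketch of my own; the 256-bit bus width and the GDDR5 double-pumped data rate are assumptions taken from the GTX 680's public specifications, not from anything in this thread:

```python
BASE_CLOCK_MHZ = 1006    # advertised core clock
BOOST_CLOCK_MHZ = 1058   # advertised boost clock
MEM_CLOCK_MHZ = 3004     # reported memory clock (GDDR5 transfers 2x per cycle)
BUS_WIDTH_BITS = 256     # GTX 680 memory bus width (assumed from public specs)

# How much headroom GPU Boost has over the base clock, as a percentage.
boost_headroom_pct = (BOOST_CLOCK_MHZ - BASE_CLOCK_MHZ) / BASE_CLOCK_MHZ * 100

# Effective transfer rate and resulting memory bandwidth in GB/s.
effective_rate_mtps = MEM_CLOCK_MHZ * 2
bandwidth_gbps = effective_rate_mtps * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

print(f"boost headroom: {boost_headroom_pct:.1f}%")    # roughly 5.2%
print(f"memory bandwidth: {bandwidth_gbps:.1f} GB/s")  # roughly 192.3 GB/s
```

The ~192 GB/s result lines up with the card's published memory bandwidth, which suggests the 3004 MHz memory reading is plausible even when the core clock is being misreported.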
 

keynes

Multiverse Poster
Quoting an earlier post: "hmmm, perhaps we got a dodgy couple? I know yours is a Jetstream as well; are both of yours 705 MHz, or just the one?"

Mine is not a Jetstream, just the standard Palit card. I sent an email to Palit to ask for feedback on this matter; very strange.
 

Buzz

Master
Ok.

Seems I've sussed this, at least with Rueben's anyway. The 680s seem to work very similarly to the BIOS overclock, in the sense that the card will only clock up as and when needed.

Please download
MSI Afterburner

Once downloaded, extract the two files, install Kombustor and run it.

Bottom left you will see a G. Click it to open GPU Shark.

Next, click the 3D test tab and run the 1080 preset.

While the test is running, press Alt+Tab to switch back to GPU Shark and keep an eye on the True Current clock speeds / VDDC section.

Your clock speeds should change to the higher speed once the GPU is under benchmark load.

You can also download Nvidia Inspector (nvidiaInspector.zip). Click the Tools icon after Driver Version to create/edit your driver profiles. I'd advise not using the overclock feature in that software.

Hope this helps, and that you do in fact see the GPU clock speeds hit what they should, when they should.
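The behaviour Buzz describes (the clock only rising when the card is under load) can be expressed as a simple check over a series of clock samples. A minimal sketch of my own, with made-up sample values for illustration:

```python
def boost_engaged(samples_mhz, base_clock_mhz):
    """Given clock readings sampled across a benchmark run, report whether
    the GPU ever boosted above its base clock, and the peak it reached."""
    peak = max(samples_mhz)
    return peak > base_clock_mhz, peak

# Hypothetical samples: idle -> ramping up under load -> back to idle.
samples = [324, 324, 705, 1006, 1098, 1110, 1098, 705, 324]
engaged, peak = boost_engaged(samples, base_clock_mhz=1006)
print(engaged, peak)  # True 1110
```

If the peak never rises above the base clock while the benchmark is running, the card is likely stuck in a low power state rather than genuinely underclocked.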
 

blindhamster

Bronze Level Poster
It looked to be running at 1098 MHz or so at one point; the temperature was hitting about 65 degrees at that same point, and the fans were still only at 30%...

I tried setting the fans to manual and upping them to a constant 85% as a test (very noisy), then ran the Heaven benchmark app: my minimum frames went from 13.6 to 22.7, and my max was 121.7 instead of 120ish. My average was actually a little lower, though.

I then ran 3DMark11 again, and the score was actually a little lower again: 9780ish this time.

My main issue, honestly, is that I paid extra for the 4GB version of the 680 and seem to be getting worse performance out of it than others with the 2GB version... I'd have expected the same or better if something was testing at higher resolutions :/
 

keynes

Multiverse Poster
Quoting blindhamster: "My main issue, honestly, is that I paid extra for the 4GB version of the 680 and seem to be getting worse performance out of it than others with the 2GB version... I'd have expected the same or better if something was testing at higher resolutions :/"

I know what you mean. From an online review (http://www.guru3d.com/article/palit-geforce-gtx-680-4gb-jetstream-review/23), the GTX 680 4GB usually performs as well as the GTX 680 2GB; I'd assume the extra RAM would kick in at higher resolutions (5760x1080) or 3D Surround.
 

blindhamster

Bronze Level Poster
comparison-1.png


The first card is mine. Note the only thing I scored higher on was the physics test, which is entirely CPU-based; all the other scores are from single 2GB GTX 680s.

Note how all the others are relatively consistent, and then mine is consistently lower on each...
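The gap in that chart can be put in numbers. A quick sketch of my own; the peer scores below are hypothetical placeholders, not the actual values from the screenshot, though the 9780 figure is the 3DMark11 result mentioned earlier in the thread:

```python
from statistics import mean

def pct_below(score, peer_scores):
    """Percentage by which `score` falls short of the peer average."""
    avg = mean(peer_scores)
    return (avg - score) / avg * 100

# Hypothetical graphics sub-scores for four consistent 2GB cards vs the outlier.
peers = [11250, 11180, 11320, 11210]
mine = 9780
print(f"{pct_below(mine, peers):.1f}% below the peer average")
```

Swapping in the real graphics sub-scores from the chart would give the actual shortfall, which is more persuasive in an RMA request than "consistently lower".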
 

keynes

Multiverse Poster
Got a reply from Palit:
Dear customer

Thank you for the mail.

It’s a known issue.
Using an old-version OC utility to read GTX 680 information will give incorrect GPU clock information (705 MHz).
You can read the correct information in the 3 ways below:
1. Nvidia Control Panel (see attached picture)
--> You can see the default clock setting.
2. GPU-Z 0.6.2 http://www.techpowerup.com/downloads/2137/TechPowerUp_GPU-Z_v0.6.2.html
--> You can see the default clock and the modified clock settings.
3. Bundled ThunderMaster (in driver CD)
--> You can see the default clock setting, but it doesn't show the modified GPU clock in the current version (the next revision will fix this issue).

Thanks.

Palit Support
Palit Microsystem Ltd.
Website: http://www.palit.biz
 

blindhamster

Bronze Level Poster
Yeah, ThunderMaster shows 1006, but everything else (including the other utils mentioned above) all showed 705.
To be honest, though, the clock speed isn't what I'm frustrated about; I think it's pretty clear that the 705 is technically correct. The thing I'm bothered about is the poor performance compared to its cheaper counterparts, something I'm hoping someone will resolve for me (be it Palit, PCS or even Nvidia).
 

blindhamster

Bronze Level Poster
I had a very similar reply

Dear customer

Thank you for the mail.

Did you test all five GTX 680 cards under the same conditions in your system?
Or did you just test your own GTX 680 card and then compare the scores with those of the other four GTX 680 cards that you got from the internet?

There are many factors that may affect performance results, such as driver version, BIOS version, clock setting, power input, system hardware configuration, etc.
So a reasonable performance comparison should test the five GTX 680 cards under the same conditions and in the same system.

Also, there are tolerances among the GTX 680 GPUs.
Some will perform a bit better and some might have lower performance.

And as the GTX 680 is a high-end graphics card, we would suggest testing the card with the Extreme test of 3DMark11.

The clock setting of Palit GTX 680 4GB Jetstream is M3004/E1006.
The clock setting of Palit GTX 680 2GB Jetstream is M3150/E1084.

Using an old-version OC utility to read GTX 680 information will give incorrect GPU clock information (705 MHz).
You can read the correct information in the 3 ways below:
1. Nvidia Control Panel (see attached picture)
--> You can see the default clock setting.
2. GPU-Z 0.6.2 http://www.techpowerup.com/downloads/2137/TechPowerUp_GPU-Z_v0.6.2.html
--> You can see the default clock and the modified clock settings.
3. Bundled ThunderMaster (in driver CD)
--> You can see the default clock setting, but it doesn't show the modified GPU clock in the current version (the next revision will fix this issue).

If you are very concerned about the performance, maybe you can ask for your vendor's help to have a further check.

Thanks.


Doesn't fill me with confidence... The other systems that scored higher graphically in my little chart actually had arguably lower-spec CPUs (proven by the physics scores) and were otherwise pretty much identical, so differences in components don't seem likely to be the cause. And if it's just that my 680 is a poor performer, I'll probably ask for a replacement; too big a gap in scores for my liking (a variance of 1 fps or so would have been fine, but all the other folks' graphics scores are consistent).
 

Buzz

Master
The other results in your graph: do you have links for their results you can post so I can have a look?

Also, did you run Kombustor and check your true clock speeds?
 

blindhamster

Bronze Level Poster
hey again buzz :)

http://3dmark.com/search?resultTypeId=232&linkedDisplayAdapters=1&searchKey=1337678515492&cpuModelId=1419&chipsetId=767
Search results for non-SLI machines with a GTX 680 and the same processor as mine (3770K).

I did indeed get clock speeds from Kombustor:
It looked to be running at 1098 MHz or so at one point; the temperature was hitting about 65 degrees at that same point, and the fans were still only at 30%...

I tried setting the fans to manual and upping them to a constant 85% as a test (very noisy), then ran the Heaven benchmark app: my minimum frames went from 13.6 to 22.7, and my max was 121.7 instead of 120ish. My average was actually a little lower, though.

I then ran 3DMark11 again, and the score was actually a little lower again: 9780ish this time.

Here are two of the ones from my spreadsheet though:
http://3dmark.com/3dm11/3406531
http://3dmark.com/3dm11/3084511
 

Buzz

Master
Humm...

All I can say at a quick guess is that your memory clock is lower than in http://3dmark.com/3dm11/3084511, and they are different vendors. The memory in your comp is 8 GB; in theirs it's 16.
As for this one, http://3dmark.com/3dm11/3406531, the CPU is overclocked at 4.6 GHz vs your 3.4 GHz, which would also cause different scores. At least it did when I overclocked my CPU.

Did you ever try overclocking your graphics card?
 