[OP] 127 views 0 replies Posted by Unknown 9 years ago Mar 29, 2:49 am forums.robertsspaceindustries.com
I wrote this as a response to another thread, but it got all kinds of informative, so I figured I'd help some of you out. The mods may want to sticky this or add it to one of their posts so it helps as many people as possible.

If you get a 290X, read the reviews of each specific card you're looking at; some of them have badly designed thermal dissipation (cough... Gigabyte... cough, and other rev 1.0 cards, even with the BIOS flash/driver fan speed update). I pulled the heatsink/fan from my Gigabyte 290 because it was running in the mid-90s Celsius, and these cards start lowering the GPU core clock at 74 degrees Celsius, slashing your card's performance. I cut out the metal on the back of my case where the fan is to allow maximum airflow, and did the same for the front case fan, except I left a small bit of metal as a makeshift grill to keep it neat. I taped over the stupid triangular holes on the top cover of the Gigabyte 290 (who puts huge holes right next to the fans, effectively cutting the airflow meant for the heatsink? :/ ), carefully scraped off the **** factory thermal paste, applied some Arctic Silver I've had lying around forever, and put it all back together. Now I can run anything, even Unigine Heaven at maxed-out settings, and my card never gets above 64 degrees Celsius. It also overclocks like a beast now, and I can undervolt it while overclocked at 1180 core / 1450 memory (crappy Elpida memory, too) on a rev 1.0 card.
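To make the throttling point concrete, here's a minimal sketch that flags temperature samples at or above the ~74 °C clock-reduction point mentioned above. The readings are simulated; on a real Linux box you'd pull the live value from a hwmon sysfs file instead (the exact path varies by driver and card, so that part is an assumption left out here).

```python
# Sketch only: detect samples at/above the throttle point the post
# describes (~74 C, where the 290/290X starts dropping core clock).
# Readings below are simulated, not from a real sensor.

THROTTLE_C = 74  # clock reduction starts here, per the post

def throttle_events(readings_c):
    """Return the temperature samples that would trigger throttling."""
    return [t for t in readings_c if t >= THROTTLE_C]

# Simulated logs: before the repaste (mid-90s) vs. after (low 60s)
before = [91, 94, 96, 95]
after = [58, 61, 63, 64]

print(throttle_events(before))  # every sample is in throttle territory
print(throttle_events(after))   # none are
```

The takeaway: with the stock paste the card never leaves throttle territory, so it spends the whole session at a reduced clock.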

In my opinion, none of the high-end cards should be running as hot as they do. It's called planned obsolescence: they run them hot on purpose so it shortens the lifespan, and you'll be forking over another $300-500 in less time than you can cook an egg on that hot plate they call a GPU. With only the Arctic Silver thermal paste (a 3-5 year old tube I had sitting around) and no case modifications, it would max out at 74-77 degrees Celsius: almost 20 degrees C less, from a tiny bit of thermal paste, on a horribly designed fan/heatsink. You can't tell me they don't know this to be true. And guess what: the card uses less power because it runs cooler. There is less leakage in the GPU at lower temperatures, so it draws less power, and therefore generates less heat! Are you soaking all this in, or is your GPU running too hot... :)
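The leakage point above is real physics: static (leakage) power in CMOS silicon grows roughly exponentially with junction temperature, which is why a cooler card draws less power at the same clocks. Here's a toy model of that effect. The coefficients (`P_LEAK_25C`, `K`) are made up purely for illustration, not measured from any actual card.

```python
import math

# Toy model: leakage power grows roughly exponentially with
# junction temperature. Both constants below are hypothetical,
# chosen only to illustrate the trend the post describes.

P_LEAK_25C = 20.0   # hypothetical leakage watts at 25 C
K = 0.012           # hypothetical per-degree growth factor

def leakage_w(temp_c):
    """Estimated leakage power (W) at a given junction temperature."""
    return P_LEAK_25C * math.exp(K * (temp_c - 25))

# Compare the post's before/after temperatures
hot = leakage_w(94)   # stock cooling, mid-90s
cool = leakage_w(64)  # after the repaste and airflow mods
print(f"~{hot:.0f} W at 94 C vs ~{cool:.0f} W at 64 C")
```

With these illustrative numbers the hotter card leaks noticeably more power, and since that wasted power comes back out as heat, the effect compounds in whichever direction you push it: the feedback loop the post is hinting at.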

Side note: if you decide to replace the thermal paste/material on your GPU, do not overtighten the screws holding the heatsink to the GPU. Doing so will slightly warp the card, enough to make for less-than-optimal thermal transfer; I know because I did this. After some thought, I stopped cranking down the screws and instead tightened them only to the point where they could still be turned with a very small amount of torque. That alone yielded a 7 degree Celsius temperature drop with the card not under load. I did not test it under full load with improperly tightened screws; that would have been..... unwise.....