Will using a higher refresh rate make a monitor wear out more quickly?

Solid.snake

I was exploring Windows today and found a setting that lets you raise the monitor's refresh rate. I currently have a 17" BenQ T705 monitor running at 60Hz, but Windows lets me raise it to 75Hz (with the "hide modes this monitor cannot display" box ticked).

Should I put it all the way up to 75Hz?
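For what it's worth, that "hide modes this monitor cannot display" box just filters the video card's mode list down to what the monitor reports it supports. A minimal sketch of that filtering, using a hypothetical mode list (on Windows the real list comes from the monitor via the `EnumDisplaySettings` API):

```python
# Sketch of the "hide modes this monitor cannot display" filtering.
# The mode list below is hypothetical, invented for illustration; on
# Windows the real list is reported by the EnumDisplaySettings API.

def supported_refresh_rates(modes, width, height):
    """Return the refresh rates offered at a given resolution, ascending."""
    return sorted(hz for (w, h, hz) in modes if (w, h) == (width, height))

# Hypothetical modes for a 17" monitor like the BenQ T705.
modes = [
    (1280, 1024, 60),
    (1280, 1024, 75),
    (1024, 768, 60),
    (1024, 768, 75),
    (1024, 768, 85),
]

rates = supported_refresh_rates(modes, 1280, 1024)
print(rates)       # [60, 75]
print(max(rates))  # 75 -- the highest rate the monitor says it can do
```

With the box ticked, Windows would only show you the 60Hz and 75Hz entries at the native resolution, so anything it offers should be safe to pick.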
 
The monitor's documentation probably states a recommended refresh rate; you're best off sticking with that, though I doubt anything higher will hurt (as long as it's actually supported, obviously!).
 
No, it won't. Use whatever resolution your video card allows and the highest refresh rate it offers. I've never had a monitor or TV "wear out".
 
Again, Brutus, you're wrong. On a CRT monitor you can overdrive the horizontal scan and burn out the board.
However, chasing the maximum is really a waste of time. The whole thing hinges on how quickly your eyes give up the last image they saw, sometimes called image latency. I'd adjust the refresh rate in 2 or 5 Hz steps until the image on the screen stops flickering for you.
It's not necessary to ramp it up as fast as it will go; it's a balancing act: FPS coming off the video card versus the refresh rate of the monitor versus the image latency of your eyes. You'll have to keep mucking about with it until you find the sweet spot.
But as for hurting an LED/LCD monitor, I think the monitor's control electronics will throw in the towel long before you do any damage, if any. The LEDs or LCD crystals will only switch so fast.
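To put some numbers on those 2-5 Hz steps: each step only shaves a fraction of a millisecond off the frame interval, which is why there's no point ramping straight to the maximum. A quick sketch:

```python
# Frame interval at common refresh rates: the gap between redraws
# that your eyes have to bridge before the next image arrives.
for hz in (60, 65, 70, 75):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")

# 60 Hz -> 16.7 ms per frame
# 65 Hz -> 15.4 ms per frame
# 70 Hz -> 14.3 ms per frame
# 75 Hz -> 13.3 ms per frame
```

Going from 60Hz to 75Hz only shortens the interval by about 3.3 ms, so once the flicker is gone for your eyes, pushing higher buys you very little.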
 
It's odd, really.
With CRT monitors I can see the flickering at 60Hz (I hated it at school when they were set to this), yet most allow 75Hz, which is a lot easier on the eyes because the flickering becomes far less noticeable.

With an LCD monitor, though, I don't notice any flickering at 60Hz, because they work differently: they're progressive displays that hold the image between refreshes.
Unless, of course, you want to go 3D, in which case finding a 120Hz model is necessary these days.
 