Quote:
Originally Posted by sclitheroe
I often wonder, in the case of plasma break-in, if it is, in fact, a myth that this is required. Do any manufacturers actually include this information in the manuals for their televisions? Has an engineer who designs plasma displays and understands the chemistry involved with the phosphors actually come out and described in detail why this would be required?
Or is it all confined to “forum wisdom” on a handful of discussion boards and amateur websites?
Just curious - I use LCD across the board, so I don't really know, but I can never find a definitive source online that outlines why any of this break-in stuff is required.
Seems to be forum wisdom and internet soothsaying to me. Several tech/consumer websites, however, have run tests trying to intentionally cause burn-in on plasmas and failed, concluding that you'd have to leave a TV on a static image for several days before permanent burn-in occurred.
It could be a few people getting bad panels, and certain TVs just being poor choices. As I said, I've had mine for a few months, and I intentionally set it to its brightest, most vivid setting from the get-go. I simply avoided leaving 4:3 content or any static image on screen for over an hour, and I've never had burn-in issues. I don't even get any latent/ghost burn-in (the temporary kind), whereas my friend's Panasonic will show temporary burn-in after half an hour of playing PS3.
Maybe video games are just bad for plasmas. Keep in mind, a video source outputting a true 1080p signal to your TV is actually much crisper and more defined than a compressed, fuzzy 720p HDTV broadcast. When I stick my nose up to the TV and look at the pixels on a 720p TV signal, they're all fuzzy and sparkly and jumping everywhere, which probably helps as a sort of faux-screensaver effect.