From improvements in the efficiency of OLED materials to software developments and new testing techniques, OLED burn-in risk has been lowered. OLED monitors are generally a more sound investment than ever—at least for the right person.
The TL;DR: the panel now tracks how long each pixel has been lit. The device can then evenly wear down the less-used pixels so degradation stays uniform across the screen. The trade-off is that you lose some maximum brightness in the name of screen uniformity. A rough sketch of the idea is below.
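Here's a minimal sketch of that compensation idea, assuming per-pixel on-time counters and a simple linear luminance-decay model (both invented for illustration; real compensation firmware works per subpixel and the actual algorithms are proprietary):

```python
# Toy wear-leveling compensation: drive every pixel so it matches the
# most-worn pixel, which keeps the picture uniform at the cost of peak
# brightness. Decay model and numbers are made up for illustration.
import numpy as np

def compensation_gains(on_time_hours: np.ndarray, decay_per_hour: float = 1e-5) -> np.ndarray:
    """Per-pixel drive gains that equalize brightness across the panel."""
    remaining = 1.0 - decay_per_hour * on_time_hours  # fraction of original luminance left
    target = remaining.min()                          # the dimmest pixel sets the ceiling
    return target / remaining                         # gain <= 1.0 everywhere

# Example: a 4-pixel strip where one pixel (under a static logo, say) ran twice as long.
hours = np.array([10_000.0, 20_000.0, 10_000.0, 10_000.0])
print(compensation_gains(hours))  # worn pixel keeps gain 1.0, the rest get dimmed to match
```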
The other point is a shift in the TFT layer that people think is burn-in, but it's actually image retention. It's resolved by these TV maintenance cycles; just hope the compensation cycle actually runs, since some panels simply fail to run them.
Check out this RTINGS video for a good overview of lots of different TV brands and how they perform.
PS: Burn-in is actually a misnomer from the CRT era. There’s nothing burning in; the pixels are burning (wearing) out.
This is a good read.
On a side note: anyone remember the story of the guy who went on vacation, and the buddy watching his place left gay porn paused on the plasma screen as a joke?
I remember that!
This is gold
Edit: or gOLEd
Pranks involving permanent damage aren't gold. Unless OP did something to deserve it, fuck that roommate.
Burn-in will always be a problem; you can't get rid of it. Sure, there are ways to minimize it, and monitors can try to hide it, but eventually you will have a taskbar, window borders, and desktop icons burned into the screen.
True.
I still have my 1080p LCD monitor from the 2010s, and it works fine.
According to the article, OLED has a 5% chance of burn-in after 2 years. The article also mentions RTINGS testing OLED monitors over the equivalent of 10 years of usage.
It’s in the “nature” of OLED that it eventually wears down. My understanding is that technically, it’s not burning in, but burning out, and what’s perceived as burn-in is irregular wear of the different color channels or different brightness of the individual pixels (especially with HDR content).
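To make that concrete, here's a toy model (my own illustration, not anything a real panel does) of how unequal per-channel aging shows up as a color shift rather than a dark patch. The decay rates are invented; the only real-world assumption is that blue OLED emitters tend to age fastest:

```python
# Toy model of uneven subpixel wear: if blue decays faster than red and
# green, a heavily used region drifts toward yellow, which is what people
# perceive as "burn-in". All numbers are hypothetical.
import numpy as np

decay_per_1000h = np.array([0.005, 0.004, 0.012])  # R, G, B decay per 1000 hours (invented)

def aged_color(rgb: np.ndarray, hours: float) -> np.ndarray:
    """Scale each channel by its remaining luminance after `hours` of use."""
    remaining = 1.0 - decay_per_1000h * (hours / 1000.0)
    return rgb * remaining

white = np.array([1.0, 1.0, 1.0])
print(aged_color(white, 0))       # [1.   1.   1.  ]  -> neutral white
print(aged_color(white, 20_000))  # [0.9  0.92 0.76]  -> visibly yellowish tint
```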
It's the nature of all self-emissive displays, even microLEDs as they become more common.
Sure, but it's more pronounced on OLED displays. We'll see how microLED ages once we get some mainstream panels, but since most other display technologies are evenly backlit over the whole display area instead of having the pixels emit their own light, they wear evenly and the individual subpixels/color channels don't wear at all.
That's true, but at the same time LED-backlit TVs have a huge problem with blooming, which is essentially a lottery because most manufacturers don't consider it an actual defect and won't replace the panel.
I'm not going to change my habits for a monitor. Hiding the taskbar is annoying, as Windows has a habit of randomly not showing it again.
Also, there will be static elements on it for 16+ hours a day on weekends and 8 to 13 hours during the week. Some buttons are bright, some orange.
Brightness can't be lowered much, as I don't have many options to mitigate the sun unless I fully cover the window (bright reflections from neighboring houses at different times of day, plus direct sun, mirrors on the walls, etc.).
What if I do a 48h gaming session? Can I throw it in the trash afterwards?
I won't use OLED monitors with a desktop PC. With my usage, I'd have burn-in within 2 years, if not sooner. And even setting that aside, I'd always have the thought in the back of my mind that the more I use it, the sooner it burns in, and that any time I leave the PC, even for just a minute, I should turn it off. I like good contrast and blacks, so my next monitor will probably be a good VA with local dimming.
My 2009 LCD panel still works perfectly and has been repurposed as a dining room TV. While it may not excel in reproducing black levels, it continues to function just as it did when I first purchased it. I am not going to bother with OLED if it means having to replace the screen every 2-3 years.