
> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.

Can you explain why a TV's power draw fluctuates so much? What does peak load look like for a TV? Does watching NFL draw more power than playing Factorio?



Most likely brightness. Turn the brightness to the maximum value and power will go up a lot.


Power consumption varies significantly based on what's being displayed, on top of brightness settings.

I have a 42" 4k LG OLED. With a pure black background and just a taskbar visible (5% of screen), the TV draws ~40W because OLED pixels use no power when displaying black.

Opening Chrome to Google's homepage in light mode pulls ~150W since each pixel's RGB components need power to produce white across most of the screen.

Video content causes continuous power fluctuation as each frame is rendered. Dark frames use less power (more pixels off/dim), bright frames use more (more pixels on/bright).
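A rough way to picture this: on an OLED, power scales roughly with average picture level (APL) on top of a fixed baseline for the electronics. A minimal sketch, with illustrative numbers loosely based on the ~40W and ~150W figures above (this is a simplification, not a panel datasheet model):

```python
# Hypothetical linear model of OLED panel power vs. content brightness.
# Assumes power grows linearly with average picture level (APL) above a
# fixed baseline for the TV's electronics -- an illustration, not a spec.

BASELINE_W = 40.0      # near-black draw: electronics + mostly-off panel
FULL_WHITE_W = 150.0   # draw at 100% APL (full white screen)

def estimated_power(apl: float) -> float:
    """apl: average picture level, 0.0 (all black) to 1.0 (full white)."""
    return BASELINE_W + (FULL_WHITE_W - BASELINE_W) * apl

print(estimated_power(0.05))  # dark desktop with a small taskbar
print(estimated_power(1.0))   # bright white web page
```

Real panels also apply limiters (ABL) that cap power on large bright areas, so the true curve flattens near the top.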

Many OLED panels also use Pulse Width Modulation (PWM) for brightness control - pixels switch rapidly between fully on and off states. Lower brightness means pixels spend more time in their off state during each cycle.
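The PWM idea reduces to a duty cycle: perceived brightness tracks the fraction of each modulation cycle the pixel spends on. A small sketch with an assumed modulation frequency (the 240 Hz figure is illustrative, not from any particular panel spec):

```python
# Sketch of PWM brightness control: the pixel toggles fully on/off each
# cycle, and brightness tracks the duty cycle (fraction of time on).
# The modulation frequency below is an assumption for illustration.

PWM_HZ = 240                   # assumed modulation frequency
CYCLE_MS = 1000 / PWM_HZ       # length of one on/off cycle in milliseconds

def on_time_ms(brightness: float) -> float:
    """Milliseconds the pixel is on per PWM cycle, brightness in [0, 1]."""
    return CYCLE_MS * brightness

print(on_time_ms(0.25))  # at 25% brightness, the pixel is on 25% of each cycle
```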

The QN800A's local dimming helps reduce power in dark scenes by dimming zones of the LED backlight array, though power consumption still varies significantly with content brightness. It's similar to OLED but the backlight zones are not specific to each pixel.
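Zone dimming can be sketched the same way: each backlight zone is driven to (roughly) the brightest pixel it has to show, so a mostly dark frame with one highlight keeps most zones dim. Zone count and per-zone wattage here are hypothetical, not QN800A specifics:

```python
# Sketch of full-array local dimming: the backlight is split into zones,
# and each zone's LEDs are driven toward the brightest pixel in that zone.
# Zone layout and per-zone wattage are hypothetical illustrations.

def backlight_power(zone_peaks, max_zone_w=0.5):
    """zone_peaks: brightest pixel level (0-1) in each backlight zone."""
    return sum(max_zone_w * peak for peak in zone_peaks)

# Dark scene with a single bright highlight: 99 dim zones, 1 fully lit.
print(backlight_power([0.05] * 99 + [1.0]))
# Full white frame: every zone at maximum.
print(backlight_power([1.0] * 100))
```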

Dark mode UIs and lower brightness settings will reduce power draw on both QLED and OLED displays.

Traditional LCDs without local dimming work quite differently - their constant backlight means only brightness settings affect power, not the content being displayed.

This explains those power fluctuations in the QN800A measurements. Peak power (429W) likely occurs during bright, high-contrast scenes - think NFL games played in full daylight, or HDR content with bright highlights. For gaming, power draw is largely determined by what's on screen, so a game like Factorio, with its darker UI and industrial scenes, would typically draw less power than games with bright, sunny environments.


Thanks for taking the time to write this.

I was under the incorrect impression that power consumption would be related to the rendering of the image (à la CPU/GPU work). Having it related to brightness makes much more sense.




