
That seems like a pretty impressive achievement!

Question for anyone here:

What do you use to play 4K/HDR? I have an Apple TV 4K, which can do 4K Dolby Vision playback and looks ok, but the Apple TV tends to have some jittering when streaming certain shows (very noticeable in panning shots of animation). The sound quality is also noticeably worse on my set with it (it doesn’t seem to be able to do direct pass through to my AV unit, always PCM/decoded on the Apple TV).

On the other hand I have a Shield TV that does direct pass-through of audio and sounds much better, and it also seems to do video playback without that occasional jittering. It does not seem to support Dolby Vision though. Additionally, the UI is very, very, very laggy after recent updates.

Does anyone use anything that doesn’t have either of these issues?



I use the native apps on my Samsung Q70 TV. They are really good, and I rarely find a need to use my 4K Apple TV anymore. Even Samsung's Apple TV app and the AirPlay 2 support are excellent.

As it is, Apple has screwed up HDR on the Apple TV 4K so badly that it's very difficult to make it work smoothly without also screwing up HDR on other devices.


Agreed. I have a Sony Android TV, and with Amazon Prime 4K HDR the content looks really good. The problem is that after some time it starts to buffer/jitter, and I think it's due to Android and the underpowered processors Sony used in these TVs.

My Apple TV 4K is obviously much snappier, but I find the HDR picture really dark and of lower quality than the native Android TV apps.


The difficulty for me is that my TV doesn't have enough HDMI inputs (LG B7). I use 7 through my Yamaha receiver, but have struggled to get ARC (audio from the TV through the sound system) to co-exist with also having everything else go through the receiver.


I have a Yamaha receiver also. Occasionally I need to power-cycle the receiver or the TV to get ARC to work again, but other than that, I've had no problems with it. I run everything through the Yamaha.


The jittering is caused by the Apple TV defaulting to 60p frame rate for all content. But if you turn on “match frame rate” the jittering goes away.
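A rough way to see why (a toy Python sketch of my own, not anything a real device actually runs): when the output is locked to 60 Hz, each 24 fps frame has to sit on screen for an uneven number of refreshes, and 25 fps content is even worse because it doesn't divide 60 at all.

    import math

    def cadence(content_fps, display_hz, frames=10):
        """Refresh counts per content frame when frame times snap to display refreshes."""
        ticks = [math.floor(i * display_hz / content_fps) for i in range(frames + 1)]
        return [b - a for a, b in zip(ticks, ticks[1:])]

    print(cadence(24, 60))  # [2, 3, 2, 3, ...]    -> 3:2 pulldown judder
    print(cadence(25, 60))  # [2, 2, 3, 2, 3, ...] -> irregular cadence, worse judder
    print(cadence(24, 24))  # [1, 1, 1, ...]       -> "match frame rate": smooth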


I can’t understand why it wouldn’t default to match the content’s frame rate. How can anyone watch content with a frame rate that doesn’t match (or is a multiple of) the source material? Maybe modern TVs can hide those issues with interpolation, but I wonder how this isn’t a bigger and more widely discussed topic. It’s my understanding that the Apple TV even handles this better than other devices, which sometimes simply don’t seem to bother to offer this option at all.


Before I turned that on, the judder was very noticeable on panning shots when watching BBC documentaries in 25 fps. Definitely worth enabling if your device supports it.


At least Apple TV supports this feature, even if it's off by default.

Google simply "decided" for Chromecast users that 60 Hz is fine and that nobody needs 24 fps or 50 fps content.


I used to use Linux, but the DRM makes that really hard. I'm restricted to HD from Netflix, so some media I access 'differently' to get it in the format of my choice.


I use an Apple TV 4K into a Denon AVR-X4500, which then outputs into my 2018 LG 65” TV. Match frame rate is on. Everything supports 4K Dolby Vision and Dolby Atmos (I have ceiling height speakers as well as normal surround). Looks and sounds great.

I also have a second setup which is similar to the above in a dedicated cinema room with projector as well, but with even more speakers.

If you have a receiver, it should be where your video playback devices attach. Only the receiver attaches to the TV.


HDR10 is fine right now. I don’t know of any TV that has a 12-bit capable panel or can produce over 1,000 nits of brightness. HDR10 supports up to 4,000 nits and Dolby Vision up to 10,000. TVs need to get a lot better before Dolby Vision makes a difference.
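For context, both formats encode luminance with the same SMPTE ST 2084 PQ curve, which covers 0–10,000 nits; the practical difference described here is panel capability plus 10-bit vs 12-bit quantization. A back-of-envelope sketch (my own illustration, full-range code values for simplicity):

    # SMPTE ST 2084 (PQ) inverse EOTF, shared by HDR10 and Dolby Vision.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits):
        """Map absolute luminance in nits to a normalized PQ signal value (0..1)."""
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    for nits in (100, 800, 1000, 4000, 10000):
        v = pq_encode(nits)
        # Full-range quantization for simplicity (real video signals use narrow range).
        print(f"{nits:>5} nits -> PQ {v:.3f}  10-bit {round(v * 1023)}  12-bit {round(v * 4095)}")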


The per-scene dynamic metadata of Dolby Vision is what makes the most difference in my opinion, not the theoretical max nits / bits. I can see a pretty dramatic difference on my LG C9 OLED when watching HDR10 vs DV.


I wasn’t aware! I have an LG B7 and its maximum brightness is like 800 nits, and even that seems too bright a lot of the time, so I turn on “power save” mode which dims it by about half. No wonder I never noticed any difference.


Yes, unless you have a new TV that supports eARC, it won’t pass Dolby Atmos from the TV to your receiver. There is a solution to this which is currently in preorder. https://www.hdfury.com/product/4k-arcana-18gbps/


So if I understand correctly, using this device you can e.g. hook up a HDMI 2.1 device directly to a HDMI 2.1 TV, then pass the audio through an eARC port to the device, which then outputs a normal HDMI signal with full audio support (including Dolby Atmos) you can plug into a HDMI 2.0 receiver that has no eARC or HDMI 2.1 support?

If that’s how it works that would be awesome as I bought an expensive 11.2 receiver not long ago, but when the new consoles are out I want to have VRR and ALLM, which means HDMI 2.1, which means it has to go straight into the TV, which means I would lose Dolby Atmos (assuming PS5 will support it of course, but that seems pretty likely)


Of course it's an HDFury product <3


I have a Vizio Quantum X and a PS4 Pro; both work great. YouTube videos show the occasional dropped frame in the Vizio app, and generally zero dropped frames on the PS4.

I don't have the best speakers or room acoustics, but I can't complain about the sound.


I have a PS4 which won’t do the trick, but was planning on a PS5 when it comes out so maybe that’ll be the answer.


I was using a FireTV Cube and recently switched to the latest Shield. Both of them played content fine until I started playing with receiver settings and discovered that I didn't have HDR enabled on that HDMI port. After enabling it I started to get a lot of jitter and blackouts (though while it worked, the picture was better than before).

The solution was to buy "premium" HDMI cables that are rated for 4K HDR.


I'm using a 2018 model-year Samsung 4K smart TV. I'm pretty sure it's running Tizen OS and it's surprisingly good. I'm doing sound out via ARC to a Sonos Beam in a 5.0 setup with the IKEA/Sonos speakers as rears. So far my only complaint is that the Tizen Plex app doesn't play nice with DTS audio even though the TV has a native DTS decoder.


"I have an Apple TV 4K, which can do 4K Dolby Vision playback and looks ok, but the Apple TV tends to have some jittering when streaming certain shows (very noticeable in panning shots of animation)."

Why don't more people complain about this? I avoid streaming on the Apple TV because it does some sort of bizarre framerate thunking that is just brutal for panning. Do so few people use the product that it just goes unnoticed?

My LG 4K TV has fantastic Netflix, Prime, and Disney+ clients. HDR, 4K, etc.


I believe most people just don't notice the judder, and many of the rest don't care. E.g. it took a lot of convincing to get Google to add "use 50p output" to Chromecast settings years ago (and that didn't fix 24p of course, just European 25p/50p which is more severely affected).

Also, most high-end TVs are able to recover the original 24p (or other) frame rate and thus remove the judder by using specific settings: https://www.rtings.com/tv/reviews/lg/c9-oled/settings#judder

I tested with a high-speed camera that with correct settings my old Samsung UE75H6475 was able to recover the original frame rate perfectly from quite a few framerate mangling combinations. Haven't done similar testing with my current C9, though.
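For anyone wondering what those judder-removal settings actually do: it's essentially cadence detection / inverse telecine. A toy sketch of the idea (nothing like a real TV's pixel-level implementation):

    def detect_cadence(frames):
        """Collapse consecutive duplicate frames and report how often each repeated."""
        originals, repeats = [], []
        for f in frames:
            if originals and f == originals[-1]:
                repeats[-1] += 1
            else:
                originals.append(f)
                repeats.append(1)
        return originals, repeats

    # 24 fps content delivered as 60 Hz via 3:2 pulldown: A A A B B C C C D D ...
    signal = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
    originals, repeats = detect_cadence(signal)
    print(originals)  # ['A', 'B', 'C', 'D'] -> the original 24 fps frames
    print(repeats)    # [3, 2, 3, 2]         -> the cadence the TV can undo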


"I believe most people just don't notice the judder, and many of the rest don't care"

This is probably the case. DAZN took over NFL streaming in Canada and for the first two years seemed to use their existing European soccer processing chain (they might still --- I gave it a try two years straight and then gave up). So the 60/30 NFL stream was re-encoded to 25/50, and then on playback on my set would be displayed at 30/60. It was brutal, and even if displayed at 25 or 50 FPS it was still brutal because they were seriously corrupting the NFL stream.

I tried it across a number of devices -- AppleTV, Chromecast, different TVs, pads, laptops -- and it was just unbelievably intolerable to me. Every panning pass was a horrendous juddering mess. Yet somehow no one seemed to have a problem with this! In discussions it seemed to be a non-issue.
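To illustrate how badly a naive 60 -> 50 -> 60 fps round trip mangles motion (my own sketch, not DAZN's actual pipeline): frames get dropped on the way down and repeated unevenly on the way back up, which is exactly what juddery panning looks like.

    import math

    def resample(n_frames, src_fps, dst_fps):
        """Nearest-previous-frame resample: source frame index shown at each output tick."""
        n_out = round(n_frames * dst_fps / src_fps)
        return [min(n_frames - 1, math.floor(i * src_fps / dst_fps)) for i in range(n_out)]

    src = list(range(60))                           # one second of 60 fps source frames
    down = [src[i] for i in resample(60, 60, 50)]   # re-encoded at 50 fps: 10 frames dropped
    back = [down[i] for i in resample(50, 50, 60)]  # shown again at 60 Hz: 10 frames repeated
    print(back[:12])  # [0, 0, 1, 2, 3, 4, 6, 6, 7, 8, 9, 10] -> uneven repeats and skips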


I also have a 4K LG TV and “frame rate thunking” is the perfect term to describe it.

I wish it would just give the direct output to the TV and receiver (video and audio). No idea why they need to process it all on the device.


Have you tried turning “match frame rate” on?


I had tried it a while back but stopped because anytime you go into or out of video streaming it would black out the whole screen for a second or two before starting to buffer the content. It’s particularly annoying if trying to search for a particular episode of something.

That’s probably the ticket for it. I find the lack of audio pass-through to be the more annoying piece, though. I have a machine that costs significantly more than the Apple TV and knows about all of the attached speakers to do that decoding.


> I had tried it a while back but stopped because anytime you go into or out of video streaming it would black out the whole screen for a second or two before starting to buffer the content. It’s particularly annoying if trying to search for a particular episode of something.

That’s the correct behavior though and not specific to the Apple TV. Any device that actually does switch and match the content’s frame rate will cause the output display to resync/adjust with a short black screen. The alternative of not adjusting frame rate is much worse, and it’s such an underrated problem in video playback in my opinion. The Apple TV handles this better than most devices.



