
Most people don't connect a computer to their TV, assuming they even have a computer at all.

And assuming they do, they all have HDMI anyway.

As for everything else most people would connect to their TV, they all have HDMI and precisely none have DisplayPort.

Given that, supporting DisplayPort is an unnecessary expenditure on bill of materials and labor for TV manufacturers.



I'm not talking about computers specifically, but about DP as an HDMI replacement everywhere. It shouldn't be more expensive: I'm not suggesting adding an extra port, but replacing one of the 4+ HDMI ports with a DP port. In fact, I think it would be cheaper, because DP has no royalties. That's why GPUs pushed heavily for DP, I believe. Nowadays GPUs usually come with 3 DP ports and 1 HDMI.

Just throw one in, and if the ecosystem grows (like it did on PCs) you keep replacing HDMI ports with DP ports.


Except that most laptops today have only DP, and basically all USB-C => HDMI adapters are USB-C => DP alt mode => HDMI adapters.

And outside of gaming consoles, it is becoming increasingly rare to connect anything to the TV outside of the luxury segment (even built-in sound is often better than any external sound up to the price region where the lower luxury segment starts, so buying a slightly better TV without an external sound system is often better than a cheaper TV plus external sound).

And many of the "killer features" of HDMI (like network over HDMI) are semi-dead.

And DP is royalty free, HDMI isn't. So gaming consoles probably would love going DP only.

So as far as I can tell, the only reason physical HDMI interfaces are still everywhere is the network effect of them always having been everywhere.

I.e., if there is a huge disruption, HDMI might go from everywhere to dying off.

And USB-C is perfectly suited for such disruption (long term).

I mean, everything from data centers to the high-speed interconnects of consumer PCs is currently moving to various forms of "PCIe internally", USB4 and NVMe being just two examples. So it might just be a matter of time until this reaches the console/TV space, and then USB-C with USB4+ or Thunderbolt would be the way to go.


Generally agree, except for the sound part. TV speakers are awful. No matter the TV, you should always have an external system.


I thought so too until I bought a TV ~1.5 years ago and regretted having bought a sound system, as it added hardly any benefit (and it wasn't a cheap or low-quality sound system either).

Though I do not have the space for any speakers behind the couch, so it's only 3.1 either way.


It’s niche, I’ll admit, but a while back the lack of DisplayPort support in home theater equipment bit me: I had an original Oculus Rift hooked up to the same machine, occupying the only HDMI output on the computer’s GPU, which meant hooking it up to the TV (through a receiver) required a DisplayPort → HDMI adapter.

Thing is, DP → HDMI adapters all suck when you’re using them to send anything but a basic 1080p picture. They nearly all fail or struggle with uncompressed 4K. I tried several different cables and adapters and, despite marketing claims, they all had trouble. The best was one that was externally powered via USB, but even it exhibited quirks.

I no longer have the Rift hooked up to that machine which freed its HDMI port up, but I too wish TVs and receivers had even just one DisplayPort.


At least a few active DP → HDMI 2.0 adapters are okay at 4kp60. For example, these[1][2] do 4:4:4/RGB 4kp60 fine, though they're limited by HDMI 2.0 to 8 bits per color channel at 4kp60 (higher bit depths work fine at lower resolutions and refresh rates, or with chroma subsampling; see the bandwidth sketch after this comment).

In my testing (several years ago), at least half a dozen other similarly-priced DP → HDMI 2.0 adapters purchased from Amazon were limited to chroma subsampled output (4:2:2 or 4:2:0) at 4kp60, which is obviously unacceptable for desktop use, so I do see your point.

I've used the linked adapters successfully now for several years with both a 2013 Mac Pro and a pair of DP-only AMD GPUs (WX 3200 and WX 4100), all connected to a 2019 LG TV, and, while testing, confirmed all claimed signal outputs using an HDFury Vertex.

[1] https://www.amazon.com/dp/B00S0BWR2K

[2] https://www.amazon.com/dp/B00S0C7QO8
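
A quick back-of-the-envelope check in Python shows why 8 bits per channel is the most that fits at 4kp60 RGB over HDMI 2.0. This is a rough sketch assuming the standard CTA-861 4kp60 timing (594 MHz pixel clock) and HDMI 2.0's 600 MHz TMDS ceiling; those figures are general spec values, not something stated in the comment above.

    # Rough sketch: does 4kp60 RGB fit within HDMI 2.0's TMDS budget?
    # Assumes standard CTA-861 timing (4400 x 2250 total at 60 Hz -> 594 MHz
    # pixel clock) and HDMI 2.0's 600 MHz maximum TMDS character rate.

    PIXEL_CLOCK_HZ = 4400 * 2250 * 60      # 594 MHz, blanking included
    HDMI20_MAX_TMDS_HZ = 600e6             # 18 Gbps / 3 lanes / 10 bits per character

    def tmds_character_rate(pixel_clock_hz, bits_per_channel):
        # For RGB/4:4:4 the TMDS character rate scales with bit depth
        # relative to the 8-bit baseline.
        return pixel_clock_hz * bits_per_channel / 8

    for bpc in (8, 10, 12):
        rate = tmds_character_rate(PIXEL_CLOCK_HZ, bpc)
        verdict = "fits" if rate <= HDMI20_MAX_TMDS_HZ else "exceeds HDMI 2.0"
        print(f"4kp60 RGB at {bpc} bpc -> {rate / 1e6:.1f} MHz TMDS ({verdict})")

At 8 bpc the margin is only 6 MHz (594 vs 600 MHz), so any higher bit depth at 4kp60 has to fall back to chroma-subsampled formats, exactly as described above.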


It's complicated, but the tl;dr is: use a DP++ Type 2 adapter that advertises 4K compatibility.

DP++ adapters tell the GPU to output an HDMI signal instead (DP is an entirely different protocol) and then just level-shift the signal. Type 1 adapters are limited to a 165 MHz TMDS clock, which means no more than 1080p60. Type 2 adapters contain a tiny 256-byte ROM that tells the GPU their maximum supported bandwidth (see the sketch below).

Other adapters are active: they convert the signal (and thus can add latency) and often need external power, so they can get quite hot.
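
As a rough illustration of those limits, here is a sketch comparing common CTA-861 pixel clocks against the passive DP++ ceilings. It assumes the 165 MHz Type 1 limit mentioned above and the nominal 300 MHz Type 2 ceiling from the dual-mode spec; these numbers are my assumptions, not claims from this thread.

    # Sketch: which standard modes fit under the passive DP++ adapter limits?
    # Assumed ceilings: Type 1 ~165 MHz, Type 2 ~300 MHz TMDS clock.

    MODES_MHZ = {            # CTA-861 pixel clocks, blanking included
        "1080p60": 148.5,
        "4kp30":   297.0,
        "4kp60":   594.0,
    }
    ADAPTER_LIMITS_MHZ = {"DP++ Type 1": 165.0, "DP++ Type 2": 300.0}

    for adapter, limit in ADAPTER_LIMITS_MHZ.items():
        fits = [mode for mode, clk in MODES_MHZ.items() if clk <= limit]
        print(f"{adapter} (<= {limit:.0f} MHz): {', '.join(fits)}")

By these numbers, a passive Type 2 adapter's 4K support tops out around 4kp30 at full RGB; 4kp60 (594 MHz) is beyond either ceiling and generally needs an active converter such as the ones linked upthread.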


I haven't noticed any perceptible latency from any active DP → HDMI adapter I've used, and I'd honestly be surprised to see any inexpensive active DP → HDMI adapter introducing latency, if only because doing so would require a frame buffer, which drives up costs.

What I have seen is DP → HDMI adapters that only support chroma subsampled pixel formats at 4kp60, and TVs that introduce many frame times' worth of latency due to various post-processing effects.

My hunch is as follows: given that consumer video devices (DVD/Blu-ray, set top box) almost exclusively output chroma-subsampled 4:2:2/4:2:0 YCbCr formats, TV post-processing pipelines may only support these formats, causing RGB (and possibly 4:4:4 YCbCr) signals to bypass post-processing, similar to the "PC" or "Game" modes present on some TVs that do the same.

In other words, if adapter → 4:2:2/4:2:0 and 4:2:2/4:2:0 → TV post-processing → latency, then adapter → latency, even if the adapter itself introduces no significant latency.

If I'm correct, the solution is an adapter which supports RGB output at the desired resolutions and frame rates and/or a TV with a PC/Game mode that bypasses post-processing for all source types, both of which are highly desirable for "monitor" use in any case.
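
For a sense of scale, here is a small sketch of what "many frame times' worth of latency" means at 60 Hz; the frame counts are illustrative assumptions, not measurements from this thread.

    # Rough scale check: latency of N frames of TV post-processing at 60 Hz.
    # The frame counts below are illustrative, not measured values.

    REFRESH_HZ = 60
    frame_ms = 1000 / REFRESH_HZ           # ~16.7 ms per frame

    for frames in (1, 3, 5):
        print(f"{frames} frame(s) of buffering -> {frames * frame_ms:.1f} ms")

A passive level shifter or a simple converter without a frame buffer should operate well below a single frame time, which is consistent with the point above that noticeable latency usually comes from the TV's processing rather than the adapter itself.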


It’s a recently popular thing to use a TV as a monitor.


Which all still have HDMI ports. It's pretty rare for a laptop to have DP instead of HDMI. Even with USB-C using DP alt mode, it's easier and cheaper to convert to HDMI. You really have to hunt for a hub with DP.


>> It’s a recently popular thing to use a TV as a monitor.

I picked up a 55" curved 4K about 5 years ago for use as a monitor. Now I keep my desk in the living room, so I use it as a TV too (I have to move the chair). Curved TVs are kind of dumb, but huge curved "monitors" are awesome. You can't find curved TVs anymore, and wide curved monitors don't have the height to double as a TV.


>> It’s a recently popular thing to use a TV as a monitor.

Only if you define "recently popular" as "more than yesterday, but still a rounding error."


If "yesterday" means around 5+ years ago, sure.

For people who mainly play games but aren't obsessed with having the most expensive, fastest hardware, using a TV as a screen has been popular for a very long time. It was already popular when I was in 12th grade, and that was well over 10 years ago.


YouTube is littered with videos about it, and has been for over 2 years.


If I were to do this I think I’d try to find a BFGD (big format gaming display), which is a type of screen that’s sized like a TV but behaves more like a monitor and usually has a DisplayPort.



