I've designed and built many such TVs for the commercial/industrial vertical. I am currently working on developing such a TV for the consumer market and launching it under the name DUMBO.TV
Let me know your thoughts. Reference pix of 70" industrial display using Samsung LCD panel and in-house LCD controller with physical OSD menu buttons along with IR/RS232 control capability: https://imgur.com/a/k6zrH3s
Yes, going to take care of that. Thanks for the heads up. I really glossed over that one... didn't think too much of it, since the domain was available. It's hard to do anything the easy way nowadays!
this is a good example of a preemptive design choice. it's almost always better to optimize for success today, not for success in the future. imo a name like "slab", or even literally "dumb tv" would be much more effective.
I think it needs just a bit of fool-proofing in a couple dimensions though:
(1) Ideally, the brand would also be able to speak to the segment of people who aren't going to get the wordplay.
(2) Also, there's the age-old pronunciation question. Is it just "Primitive"? Is it "Primi - Tee - Vee"?
It's quite close to being a great brand, though. I wonder if it just needs a skillfully-placed lowercase letter, or a couple dots, or something — maybe some sort of subtle variation could solve both problems.
Hi lagrange77, I like this name too. Thank you for the suggestion. It's a bummer that primi.tv is already taken, but I'm fairly optimistic that I can find another easy-to-remember domain. Also, I'll do some digging beforehand to make sure that I'm not overstepping any existing trademarks or copyrights. If everything works out, I'd like to give you one of these displays if and when I get the prototype unit off the ground. Would that be okay?
Hi pupdogg, thank you very much for the generous offer!
Actually, I don't own a TV, and besides, it would be the coolest story to tell when people ask me where I got this device. So I would gladly take you up on the offer IF you actually end up using the name, everything works out as you hoped, and you're in serial production.
I really like the design btw: clean, uniform bezels and logo-less. That's what I always look for in my computer displays.
According to the sources in the Wikipedia article on "idiot":
The word "idiot" comes from the Greek noun ἰδιώτης idiōtēs 'a private person, individual' (as opposed to the state), 'a private citizen' (as opposed to someone with a political office), 'a common man', 'a person lacking professional skill, layman', later 'unskilled', 'ignorant', derived from the adjective ἴδιος idios 'personal' (not public, not shared). In Latin, idiota was borrowed in the meaning 'uneducated', 'ignorant', 'common', and in Late Latin came to mean 'crude, illiterate, ignorant'. In French, it kept the meaning of 'illiterate', 'ignorant', and added the meaning 'stupid' in the 13th century. In English, it added the meaning 'mentally deficient' in the 14th century.
But who/what are you calling "idiot", and why? I would never think of calling a plain HDTV set that - it is still a technological marvel (and it would have all kinds of "smarts" built in anyway, just no apps or wifi).
maybe something like dumb.box ? I think the .box suffix might be very appropriate, something like brainless.box or whatever, I know it's unconventional though, perhaps .tv is better
I love self-deprecating humour, but it negatively affects me at work. I imagine dumb.tv or idiot.tv may have similar problems. But maybe you also go viral because of it, who knows.
GOTO no longer considered unsafe! (Actually, it never really was all that unsafe, anyway, the function guys just established themselves as the new programming hipsters...)
Not impossible to rank for. For example, see "stripe", or even my business, Keygen, which now ranks #1 for the term "keygen" after 7 years (funny fact: HN poked fun at me when I claimed I'd rank #1 for the term in my 2016 Show HN).
STANDARD is probably not the best choice for short-term SEO, though. :)
Kind of like Framework laptop, though. So while I agree search wouldn't be great to start, this isn't a TV for the masses if we're being honest. It will spread by word of mouth and on niche tech sites.
Maybe consider picking another brand, sooner rather than later?
I like the name "dumbo", maybe coming from "dumb TV" and Jumbotron -- a nice, big TV/display without all the "smart TV" problems, and having fun naming something that's a bit of a cult niche -- but I also want your business to succeed.
(Disney has been putting their Dumbo on TV since at least the 1980s. I don't know whether there are other Dumbos, but I could imagine lawyer billable hours destroying your margin regardless of the theoretical defensibility. Also, separate from US govt. offices and courts, there's ICANN, and I guess you could suddenly lose your domain name that way.)
My main issue with smart TVs isn't the network connectivity, it's their horrible UI. The menus are universally confusing, slow, and poorly thought out. And because these TVs have to have some complicated OS to run all this stuff, they're slow to start up. My AppleTV goes from standby to on faster than my circa 2008[1] (only kinda smart) TV and receiver are ready to display an image. The AppleTV should not be the fastest device in the chain.
I assume your commercial displays have a time-to-first-image that more closely resembles a computer monitor. That's what I want in a TV. If Apple had a TV-version of CarPlay that TV manufacturers could license, I'd be pretty happy with that.
Given that embedded TvOS isn't coming any time soon, my ideal TV would have:
* 70" - 80" screen
* the smallest possible bezel
* 4k HDR
* capable of professional color calibration
* one HDMI 2.1 input with ARC
* TV tuner with auto-scan
* any crappy speaker will do (I'm not going to use it)
* IR/RS-232 control with distinct codes for on and off (i.e. not toggled)
* excellent CEC support
* time-to-first-image under 3 seconds, immediate audio
Nice to haves:
* simulated snow for dead channels/no input (much nicer than blue IMO)
* power, HDMI, serial connections should be down-facing
* physical buttons on the back of the TV for power, channel, volume, menu
* a rear or bottom-mounted red power LED that reflects off the wall/table
I would be willing to pay $3000 - $4000 for this TV, with the expectation that it would last at least 15 years.
[1] In December 2008, I bought a Samsung LN52A750 52" 1080p TV for $1937. That's roughly $2772 inflation-adjusted. It has a huge bezel; a modern TV with the same physical dimensions would be closer to 60" diagonal.
This is a very good and comprehensive list. Thank you! Our industrial displays already have most of your "must-haves," except for three:
1. Professional color calibration - this is doable, but it adds significant cost. By color calibration, I don't mean the typical marketing gimmick used for consumer displays, but something equivalent to what you would get with Sony's BVM-HX310 reference monitor. So, my question would be: what level of professional color calibration is acceptable?
2. HDMI 2.1 - HDMI-2.1a to be specific. Our controllers are all compliant up to HDMI-2.0 as of now. This is something in the works, but due to chip shortages, it has been challenging to come by industrial-grade chips/SoCs that handle HDMI 2.1a. However, this is on our radar.
3. TV Tuner - it is unclear if you mean an actual channel tuner or an auto-scanner for the available input. If referring to a channel tuner for COAX-based inputs, this is something that has been phased out, not by us but by our chip suppliers.
Regarding the price point, that sounds very doable. My personal expertise is in overall system integration and sheet metal design. The intent is to create even the consumer-grade display with a fabricated aluminum shell that feels rock-solid and heavy-duty aesthetically.
FYI, all the "nice-to-haves" are also already present, with the exception of down-facing connections. Can you clarify that one? Also, did you check out the pictures from my link above?
Color calibration - I'm not entirely sure what would be involved here, TBH. "Professional" in this context would be what a pro home AV installer[1] would look for, not what a Hollywood color grader would need.
TV Tuner - Yes. An ATSC tuner with a COAX antenna input that can auto-scan for receivable channels. If this isn't feasible anymore, then I guess the HDMI input doesn't need ARC.
Connectors - looking at your photos, your input connections are on the side. If you put them on the bottom (down-facing), I think they'd be easier to reach if the TV is wall-mounted (maybe not as easy if it's on a table-top stand, though). As long as they don't stick straight out the back, any orientation would be acceptable.
What's your experience with outdoor displays? 1000+ nits, IP-67 rated?
In that case, I would say that color calibration is very easy to accomplish and having the connectors facing down is something I was already planning to do. Our industrial displays are typically mounted 15-20ft in the air, so customers have an easier time accessing them from the side while on their lift.
As for outdoor TVs, that's a whole other Pandora's box, but yes, we also make those displays, mainly for digital billboards, advertising, and rental purposes. They typically range from 5000 nits to 12000 nits in brightness, and their pixel pitch ranges from 3mm to 9mm. They're all IP67 rated. Displays with a brightness lower than 5000 nits are typically indoor LED displays with much finer pitch, ranging from 0.9mm to 6mm. A 0.9mm pitch would be very similar to Samsung's QN90A series TVs with QLEDs (a marketing gimmick for an actual LED pixel-based display). These displays may cost a pretty penny, but we hope that once semiconductor plants are up and running in the US, costs will start to come down. Here's a project we did with a circular 3mm display back in 2017. This unit is 7ft tall with a 4.5ft diameter: http://bit.ly/43keDD5
So my use case for outdoor displays is in the recreational marine market. Screen sizes are typically between 7" and 18" with some very high-end "glass bridge" displays coming in around 24". Retail for a 12" chart plotter is $4000. Typical display brightness is 1200 nits, which is OK, but still hard to see in direct sunlight with sunglasses. Conversely, they also don't get dim enough at night.
The glass bridge solutions all use a "black box" chart plotter (essentially a rugged PC) and dumb displays. But the smallest displays are 16" and way more expensive than the integrated solution.
I'd love a 12" to 15" HD display with capacitive touch, optically bonded LCD, wide viewing angle, viewable in direct sunlight (with sunglasses), and dimmable to nearly off (20-50 nits maybe?) so that it's not blinding at night (not city night, but offshore night). It should have one cable: a USB connection for power, video, and HID output.
In the vein of your pole display, a 14-18" tall, 4" wide screen would be very cool for the sailing market. Most folks mounting mast displays are still stacking a bunch of individual 3" or 4" monochrome LCDs (at $1000/ea).
Thank you! I think I know what you're talking about. I live in a gated subdivision where the gate entrance control panel has a similar 15" screen but with terrible brightness and UI. Your display also sounds a lot like an HMI panel but for outdoors. Do you happen to have a product link for this display with an embedded PC? If the market is big enough, I'd be more than willing to explore some sort of joint venture with you.
Thank you! What you see in there is a WebGL creative put together using ThreeJS running at 60FPS on a RaspberryPi with a custom/frameless Chromium build. This was back in 2017, and I can only imagine what someone can accomplish nowadays!
Yes, a few of our permanent install/broadcasting projects use TouchDesigner and Ventuz. For rental applications, Resolume, Modul8, and VDMX are preferred depending on the use case.
FYI, this is our current OSD structure: https://imgur.com/xNnsjwT. If you look at "User Color", it lets you set an independent gain/offset for each color channel. You can also select one of the predefined "Color Balance" options: Warm, Normal, Cold, or sRGB. The sRGB option maps the panel color to the sRGB gamut. Additionally, you can set the gamma curve anywhere from 1.8 up to 2.2. I hope this information helps.
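To make that concrete, here's roughly how gain/offset and a gamma setting combine into a per-channel lookup table (a generic illustration in Python, not our actual firmware math; the real controller's transfer function may differ):

    # Generic sketch: per-channel gain/offset plus gamma baked into an 8-bit LUT
    def build_lut(gain: float, offset: int, gamma: float) -> list[int]:
        lut = []
        for code in range(256):
            v = code / 255.0                                   # normalize to 0..1
            v = min(max(gain * v + offset / 255.0, 0.0), 1.0)  # gain/offset, clamped
            v = v ** gamma                                     # gamma curve (1.8-2.2)
            lut.append(round(v * 255))
        return lut

    red_lut = build_lut(gain=1.05, offset=-4, gamma=2.2)  # e.g. a slightly boosted red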
> So, my question would be what level of professional color calibration is acceptable?
Naively, I'd say… Delta-E <5. Asus ProArt monitors originally shipped with that, and could be calibrated by the user to Delta-E <2; after a while, Asus simply started shipping them with the latter.
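(For reference, Delta-E figures like these are usually the CIE76 metric: the Euclidean distance between measured and target colors in L*a*b* space. A minimal sketch, with made-up patch values:

    import math

    def delta_e76(lab1, lab2):
        # CIE76 color difference: Euclidean distance in L*a*b* space
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

    measured = (53.1, 0.8, -1.2)        # hypothetical measured gray patch
    target   = (53.0, 0.0, 0.0)         # its intended value
    print(delta_e76(measured, target))  # ~1.45, i.e. within a Delta-E <2 spec

Reviewers often use the newer CIEDE2000 formula instead, but CIE76 shows the idea.)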
Personally, I explicitly do not want a tuner in a TV. I would much rather feed that through my existing system and have the TV be as dumb as possible -- I literally just want a display.
Down-facing connections are a must.
I also don't want anything except a power button -- no channel, volume, menu, etc. I don't want crappy TV speakers and I'd really rather not have an OSD. I would love a display where I could fire up some software on my computer and have it issue commands over HDMI CEC to change settings. There are USB CEC adapters out there that would make this possible if computer HDMI ports won't emit CEC commands for whatever reason.
Ideally, strip the display down to the simplest possible thing and focus on the core features: big display, small bezel, high resolution, HDR, color accuracy (configurable through the aforementioned CEC mechanism). I don't even need multiple inputs: just give me one working HDMI input.
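To sketch the CEC mechanism I have in mind (untested, and assuming libcec's cec-client plus a USB-CEC adapter or a CEC-capable HDMI port):

    # Drive the TV over CEC from a PC via libcec's cec-client CLI.
    # "on 0" / "standby 0" are standard cec-client console commands;
    # vendor-specific picture settings would need raw "tx" frames.
    import subprocess

    def cec_command(cmd: str) -> None:
        subprocess.run(
            ["cec-client", "-s", "-d", "1"],  # -s: single command, -d 1: terse log
            input=cmd.encode(),
            check=True,
        )

    cec_command("on 0")       # power on (the TV is logical address 0)
    cec_command("standby 0")  # discrete power off, not a toggle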
Eh, a menu for basic colour calibration would be nice; you don't need CEC for that, though. I do agree with throwing out the speakers: many people have at least a sound bar, and those who don't are unlikely to want this product (that market wants all-in-ones). I also agree about the tuner: there would be no DVR, signal filtering, etc. on the TV, but those may be desirable, so they would require an external tuner anyway.
Agree on exactly 1 HDMI input honestly. It's much easier to have the rest of my system mux HDMI (because you either have a receiver, or just a simple switcher, or exactly 1 device anyway).
Then I'm afraid it's not possible. The reason the input switch takes so long is the EDID negotiation.
A video mixer acts as the source and sink for the output and the inputs respectively, whereas an HDMI switch will just physically disconnect and connect the output port to a different input port, meaning the EDID negotiation has to be re-done every time.
It still seems like the right hardware, upstream or downstream depending on which direction you're viewing things from, could keep the connection alive with the EDID pre-negotiated. Like, assume the hardware doesn't change between switches. If you're the TV's firmware and have the CPU power to, why renegotiate when switching inputs? Solve for the default case and have it standing by and running hot. It's not like the panel is going to change in the interim.
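For illustration, the Linux kernel already exposes the last-read EDID per connector, which shows how cheap keeping the blob around is (the sysfs layout is a Linux/DRM-specific assumption; this is just a sketch):

    # Read the kernel's cached EDID blobs instead of re-reading over DDC.
    from pathlib import Path

    def read_cached_edids() -> dict[str, bytes]:
        edids = {}
        for conn in Path("/sys/class/drm").glob("card*-*"):
            edid_file = conn / "edid"
            blob = edid_file.read_bytes() if edid_file.exists() else b""
            if blob:
                # a base EDID block is 128 bytes whose bytes sum to 0 mod 256
                assert sum(blob[:128]) % 256 == 0, f"bad checksum: {conn.name}"
                edids[conn.name] = blob
        return edids

    for name, blob in read_cached_edids().items():
        print(name, len(blob), "bytes")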
My TV is 12 years old, so I don't think it's unreasonable? The only downside to the old TV is I have to strip HDCP to hook it up to my laptop through my receiver (a direct connection works because it negotiates the older version).
So I have had to suffer the gift of my neighbor, an A/V-phile, setting up a home-theater-style system. Personally I'm not super happy with what he did. I have a full-size rack meant to be filled with electronics I don't want, and hundreds of feet of wire to relocate said rack to a closet so there is no entertainment unit under the TV.
All of this is overkill in my opinion. However… I think this is a core market for you. Between the enthusiasts on Reddit and the home theater market, there are a lot of people who would appreciate a highly tunable screen where they can have their AV closet run the show without Samsung, Google, or Roku in the way.
One of the most painful parts of my install was getting an HDMI balun that would operate in 4:4:4 color space, provide 7.2 ARC, provide CEC, and provide IR… and work reliably. This is a tall ask because it's supposed to use Cat6 between the ends. The expense of these units was high ($300-500) and the reliability wasn't great. A lot of the reason for this pain is the idea that you can upgrade later without running a new set of wires. Ultimately I said screw it and ran an optical HDMI cable (and am very happy).
You would really engage with this market if you integrated this functionality into your TV, so you only needed to run a cable and attach the receiver-side balun. I don't know where the pain came from, but this feels like something that the native TV hardware could expose over a cable. Some people have Toslink run through their walls too, which would be a good option to support.
You're right, extenders can be expensive, but they can save you a lot of trouble if you choose the right one. My go-to choice is Atlona, especially this one: https://atlona.com/product/at-ome-ex-wp-kit-lt/. The best part is that the destination end is powered by PoE, so you get active conversion all the time. We've used these very reliably (I would say they are industrial-grade) for the past 10 years.
We do have a few LCD controller boards that support IPTV, but the downside is that they use H.264/MPEG-4 AVC compression for transport.
If you can really make a 100% unconnected one, not even for firmware upgrades (just let me use that damn USB port), or a smart one with 100% open source firmware - drivers included - and possibly sell it in the EU too, I'm in. I'll soon (2-3 months) have to buy one, and I'll probably be shelling out 3-4 times the cost of a smart TV for a signage display to connect to my Kodi-based system, just to not have spyware running in my next living room. So unless I'm a complete idiot, I guess there's a market for a lower-cost one, even though as a consumer product it wouldn't be rated to last as long as signage displays are.
A final thought: Arduino and the Raspberry Pi became the reference in the maker community not because they were better and/or cheaper, but because they were reasonably good and cheap, and most importantly, they were very well documented and offered standard connectors that let makers build devices knowing they would stay compatible with new models. If you make a 100% open source dumb TV with a well-documented, open rear connector allowing tinkerers to extend the device's functionality beyond adding a TV box (which in most cases just moves the privacy issue one HDMI cable away), your product would be extremely well received by the tinkerer community, and probably far beyond it.
There are many uses where having a low-cost computer on board might come in handy, but one shouldn't have to give away privacy and security for it. Adding an external RPi or similar board today requires a non-trivial amount of money (board + case + cables + PSU), so if that could be done at a lower price while maintaining openness, why not?
Is there any chance whatsoever that you can use open source software/firmware for it?
I am not in the market for a TV and I am only a hobbyist with a tendency to spend more than I should on anything that lets me open the hood and tinker with it. If there is any consumer electronics company that wants to be the anti-Apple, I'd support it in any possible way.
Seriously, I even wonder if there could be some type of "Patreon-based" R&D for consumer electronics. Get a group of software and industrial engineers to design and build all sorts of different projects, and patrons would get access to early prototypes and would be able to "buy" the finished products at cost.
I'm not a licensing expert, but my general vision is that if we can get assistance from community firmware designers, I'd be more than willing to champion this initiative. Would love to create an evaluation board that others could build around to come up with their own interpretation of the perfect display.
Both varied control and power options are really valuable. Power is easy: if it's DC in with a separate brick, please use a common connector :). If it has an integrated power block with an AC input, an alternate DC input would be very nice.
For control… it's really a question of balance. "all the things" sure would be nice, but with a mid 4-digit price I think putting a small fully-open Linux embedded board inside would be awesome. Some existing SBC or SoM is preferable over rolling your own, the latter would just splinter off a separate community for no good reason. Also make sure the embedded system has full control over all functions (there should be enough GPIO & I²C…) and has its own power control. Having it able to actually drive the TV display would be nice (and easy these days) but isn't even the point :).
If you don't stick that in, how about a slot/connector for a Raspberry Pi compute module — or some other reasonably sourceable similar module?
Barring that, RS-232 for control is a bit dated; I'd really expect both a USB device port (control in) and a USB power output, ideally USB-C PD capable, to run some small system off. If you have the pins & functions, please spend the extra $5 and wire up leftover "random" interfaces, e.g. CAN or RS-485. Ethernet without adding a full SBC is kinda "meh"; putting together some embedded OS with networking is significant effort for very little return.
Look, every TV UI is slow. But how can games have responsive UI on 8-bit hardware? People are doing things wrong. You have a real chance here of having a UI which is instant.
Because 8-bit hardware was realtime? It's quite difficult to implement everything we expect nowadays without multiprocessing... realtime Linux exists, but it's not a panacea.
Honestly, to compete with mass-market TVs, it's going to be difficult to avoid having more, not less, "bloat". Built-in Android would be a great selling point (saves you $200 on an NVIDIA Shield, as long as it doesn't track you or lock down features). Built-in AI upscaling is a good selling point.
I would also think that if you have an OLED or an edge/direct-LED-backlit LCD, you also need to run much more complicated pixel/zone brightness algorithms...
There's probably more memory allocated per frame on a modern 4K TV than an 8-bit computer had in total.
Every real-time game ever defies these arguments. It sounds reasonable until you load a game and start using it. Effectively no lag, counted in the milliseconds.
No, the reason these UIs are slow is that they are often a web browser running a JavaScript bloatware UI with its shadow DOM or whatever, and there are heaps of bullshit going on before anything happens on screen at all.
And this is fine! For many use-cases, comparatively beefy PCs, web pages etc.
But for what is effectively an embedded product which will be updated seldom if ever, THIS is the time to take a step back and rethink what you are doing.
For a non-smart TV you definitely don't need Linux (but you could!), and you don't need to care if it's realtime or not. There should be only one or two processes running anyway, so there shouldn't be any competition between threads. It should all be coded like a classic, tight game loop.
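Something like this toy fixed-timestep loop, just to show the shape (Python for brevity; real firmware would do the same thing in C):

    import time

    FRAME = 1 / 60  # 60 Hz budget, ~16.7 ms per frame

    def poll_input():   # stub: read IR/button events
        return None

    def update(event):  # stub: move the menu cursor, etc.
        pass

    def render():       # stub: draw the menu
        pass

    while True:
        start = time.monotonic()
        update(poll_input())
        render()
        # sleep away whatever is left of the frame budget
        time.sleep(max(0.0, FRAME - (time.monotonic() - start)))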
AI upscaling and Android... dear lord. Just give me something which isn't fucking slow and frustrating to use. I thought I was in the thread which discussed non-smart TVs, did I miss something?
These things are complicated enough to justify an operating system, it might as well be an operating system we're all familiar with. Linux scales down to a basic firmware as well as up to a desktop operating system. FreeRTOS is also an option though.
I'd love a list of products that are open-hardware, or open-source software, or even just open enough to easily use with open source solutions.
But would you pay 2x for an industrial panel? These panels might last longer when driven brighter outdoors and in well-lit areas, but they won't look anywhere near as good as a consumer OLED for movies and gaming, because of both the panel and the processor. Seems like a different (and wrong for this use case) set of priorities. My LG CX OLED is not on the network, and the UI is fine; laggy UIs are a pet peeve of mine, but I haven't found LG's to be too laggy - certainly not annoying enough to throw the amazing panel out.
The only annoying thing about the LG OLED TV, at least the C9 version I have, is that it won't allow you to AirPlay unless it's connected to the Wi-Fi. So you need to add a custom rule that blocks the TV from the public internet while allowing the local LAN.
Your LG C2 is a dumb TV - if you don't connect it to the internet. You can even buy a $10 IR remote control on Amazon and use that to switch inputs. Or, in my experience, all of my external devices support HDMI CEC and auto-switch when I use them.
This is essentially what I've done. The TV has never been connected to the Internet, and my Apple TV drives everything. I would still have paid 2x to support a good dumb TV project and would love to do so in the future.
I made the mistake of connecting my Vizio to the internet after owning it for years.
It was great for two days, and then it downloaded an update that absolutely wrecked the interface. What was smooth and snappy and good enough for me now moves at a snail's pace, and the TV is practically unusable even after a factory reset.
Opening the menu takes 2.5 seconds from button push to response on a good day, for no reason other than Vizio must have decided it was time for me to buy a new TV.
I used to like their brand. Now I'll never buy another one again.
The problem with smart TVs is that you can’t easily disable the offending software if you want to. The software is tightly coupled to the hardware. In some cases it will aggressively search for opportunities to spy on you.
Worst case, Apple jumps the shark… you can just unplug it. You at least get to keep the display.
The Apple TV’s hardware is wildly more powerful than that bundled in any smart TV. The current ATV 4K is running on the last gen flagship Apple SoC with a big passive heatsink attached while smart TVs use hardware comparable to that of a low-to-midrange Android phone from 2012-2014. Even the first gen ATV 4K from 2017 is several times more powerful than current smart TVs.
That difference in power is felt quite a lot in the user experience.
I am not sure if "powerful" matters in this context though. (That is, I expect the chipsets built into TVs to be plenty powerful for their intended purpose.)
Have you tried the average "smart" TV you find at an Airbnb? I have and let me tell you, it does matter.
We were staying at one just a few days ago that had a cheap Samsung TV. The UI latency was so horribly laggy that simply clicking an arrow on the remote to try to navigate to the next menu would take up to 10 seconds to finally register on screen. It was also variable, meaning some button presses only took 1-2 seconds to respond, but some took 10 seconds, and if you pressed more than once you'd end up with a whole bunch of your delayed button presses registering at once and taking you to a menu option you didn't want.
Sad to say, but state of the art in these Android menu systems is horrible latency, most likely because the UI devs are building in new javascript features that run horribly slow on older ARM processors and they just don't give any F's about the actual user experience or testing...
I love your reference to AirBnB. That is precisely when I get to experience what I assume the rest of the world is used to. Firing up a random TV at an AirBnB is simply painful. You're 100% right that CPU power matters. The delay on every menu is painful. The UX is just atrocious compared to my AppleTV. I cringe that people use this for their normal viewing.
In many cases the SoCs used in TVs are so underpowered that they can’t render menu screens without frame drops, or if they can they lose that ability after a software update or two because there’s so little margin.
Just how "underpowered" are we talking here? I'm having a really hard time imagining a chip which cannot render a menu at non-fractional fps. An 8088 can do this…
Like I mentioned in an earlier comment, their power is roughly on par with a 2012-2014 low-to-midrange smartphone, which sits somewhere between 5-15% as powerful as a modern midrange-and-up smartphone.
That would be fine if they were rendering to a 720p screen or had much more simplistic menus, like those found on most A/V receivers, but they're usually running recent-ish Android or something similar, which has fancy graphics and animations all over the place designed for newer devices, and that makes the hardware choke at the 4K resolution that the majority of TVs now ship with. Exacerbating this are the terrible lowest-bidder smart TV apps.
TV manufacturers will never ship an OS more suitable for the hardware though, because they’re concerned that it will make the TV look less modern than competing TVs. They also won’t ship better hardware because that’d cut $5 per unit off of their margins. As such, it’s best to just write off integrated “smarts” and plug in a streaming box that’s not so anemic.
The Netflix app ran fine at first but had outgrown my 2018 smart TV's IQ by 2022. It freezes for a good moment, then crashes the TV OS. Hulu as well. A factory reset was a waste of time and fixed nothing. But TCL made a few bucks more going with the cheaper CPU and accelerated obsolescence.
One more reason to get a dumb display...
But on the other hand, TV SoCs need to process the video signal at 4K 120Hz 4:4:4 without dropping a single frame, although most if not all of those tasks are most likely done by an ASIC embedded in the SoC. Would a modern but very cut-down GPU be able to do this, and at a low enough power draw?
There are hardware video decoders on the chip that render your video stream. The menu system is just a terribly outdated Android SoC... the latency is entirely from the speed at which it can render the user interface and has nothing to do with how fast your 4K 120/240Hz panel can draw a frame.
The gripe with smart TVs comes from the fact that they show ads and slow down over time.
With Apple TV, you can nip that in the bud. Yes, Apple TV has its own quirks, but it's nowhere near as hostile as the built-in "functionality" these TVs try to provide.
Not to mention that streaming services have every incentive to keep improving their Apple TV apps, compared to their built-in TV apps.
A TV with Samsung MySmart™ HomeOS Android UltraCrap Edition is not the same as an AppleTV, if you care about things like, I don't know... consistent framerate? No random crashes? Bearable UI latency?
Similarly how Apple CarPlay is not the same as car manufacturers' sorry-ass homegrown infotainment trash software.
From a security or absolutist point of view, yeah. From a customer point of view — different companies have different reputations and market positions. We might all disagree about the exact level of faith we have in them, but Apple and Vizio or whoever seem to have different reputations, for whatever that is worth.
Depends. A "smart TV" is a category of TVs being sold. If you go into Best Buy asking for a smart TV, you're not going to get a TV + streaming box/stick, just a "smart TV" (although there is some obvious overlap here, since TV makers have partnered with Fire TV and Roku).
But when it comes to general conversation (and this context), yes it's the same. Unless you are actually using a niche setup of local streaming and the Apple TV box is just a nice interface you keep offline - don't know how well that works. For everyone else, they are just avoiding overlapping TV services, as relying on Apple or Google or Amazon is falling into the same traps.
The Samsung smart TV I have is fine, but the UI is noticeably slow as shit even though it's never been connected to anything (just going to settings, for example), which is part and parcel of being "smart," I guess.
99% of the problem with smart TVs is the absolute dog shit UI and the relatively bad HDMI CEC setups they have (at least with mine: if HDMI CEC is on, anytime the TV accidentally or intentionally gets into any of the smart parts, it tells the receiver to go to TV mode, and getting it back to one of the HDMI inputs on the receiver involves either turning everything off or switching it back repeatedly).
Miles ahead? Last year I bought a new generation LG (WebOS 22), and its menus are so damn slow. Everything is an app. Open settings to change the picture or something? Yes, please wait five seconds to load the first menu. And then it’s still stuttering and reacting slowly.
The "Home Screen" (a Netflix-like UI with every app, suggestions, and so on in one place) is ridiculously bad and takes even longer to start.
And the worst thing is: all of these internal apps are reloaded EVERY TIME I use them. No preloading, no caching. Everything lags and needs several painful seconds to display.
(And then LG did not even bother to pay some cents for the DTS license so that I am unable to watch movies from USB sticks with DTS audio track, and I would have to convert them to Dolby AC3 first on my computer …)
The LG display is nice, though it's far from perfect, which is a problem with ALL television sets, and I don't know why: _monitors_, on the other hand, are always pixel-perfect, but televisions never seem to be configurable pixel-sharp, even with perfect video material. It's OK for the money I poured into it. Just a big screen.
Another bad part of LG is the lack of apps and the control LG holds over the store. There isn't even a web browser available, just the crappy built-in thing from LG, which does not even have all the TLS certificates. I wish I had bought a TV with Android.
Furthermore, if you're connecting devices like PC GPUs that don't support CEC, most LG TVs have an RS-232 interface that supports all the basic "dumb TV" commands, including most of what you'd want to do with CEC or IR remote (power on/off, input select, volume, brightness, and other basic audio and video settings).
RS-232 control also has a command to disable OSD, which has the pleasant side effect of disabling annoying smart TV bits like pop-up notifications even when the TV is connected to the Internet (and also superfluous [to me] non-smart TV pop-ups that ordinarily appear when switching inputs and adjusting visually apparent settings like brightness).
Disabling the OSD also disables the bundled Magic Remote, though IR remotes still work (unless locked out with another command) and OSD can be re-enabled via RS-232, or by simply power-cycling the TV.
As a bonus, input switching via IR remote (or RS-232) is noticeably faster than switching via Magic Remote, even if you set up hotkeys, as full-featured LG IR remotes have hard buttons for each input that don't require press-and-hold to activate (this includes sub-$10 service remote knock-offs on Amazon, which work perfectly fine IME, though you may want to steer away from these in a casual setting as some of the service buttons can cause undesirable behavior).
For my own use, I wrote a trivial ASP.NET Web API wrapper around the LG RS-232 protocol.
While only tested on macOS controlling the TV I own (55SK9000), the documentation it's written against isn't model-specific and I'm not using any platform-specific .NET APIs, so it should work across many TV models and on any platform that supports RS-232 and .NET (.NET 6.0+ as currently configured, though it was mostly developed on .NET Core 3.1, so changing TargetFramework in the csproj file should suffice to get it running on older versions).
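The protocol core is tiny; here's the same idea sketched in Python with pyserial (the port name and 9600 8N1 settings are assumptions for a typical model, per LG's published RS-232 docs):

    import serial  # pip install pyserial

    def lg_command(port: str, cmd: str, set_id: int, data: int) -> str:
        # LG frame format: "<cmd1><cmd2> <set id> <data>\r", values in hex
        with serial.Serial(port, 9600, timeout=2) as tv:
            tv.write(f"{cmd} {set_id:02x} {data:02x}\r".encode("ascii"))
            return tv.read_until(b"x").decode("ascii")  # replies end in 'x'

    print(lg_command("/dev/ttyUSB0", "ka", 0, 1))  # "ka" = power, 1 = discrete ON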
I have an LGCX and if I hadn't connected it to the internet I would have missed out on some important software updates that significantly improved the display performance.
I guess you can toggle the internet on and off when updates are published, but it's not the most convenient solution.
Edit: But you can use a USB! Woohoo, thank you repliers!
LG (at least for my OLED model) supports firmware updates over USB, and posts firmware updates to their website. It's a very smooth flow -- my years-old TV has never had an internet connection and is up-to-date.
Thank you, and babypuncher, for this advice! I avoid using the TV OS but dislike how a parade of web-connected ads pops up if I hit the dashboard button by accident. Glad I can airgap it again.
Most, if not all, routers have a one click blacklist device option. Pretty easy to just unblock it once a month and check for updates?
Alternatively, for something more automated, you could just use parental control 'bedtime'. I use it currently to keep the kid from watching TV at certain hours. Could probably do the inverse and block it except for an hour on a particular day, I'd imagine.
> I have an LGCX and if I hadn't connected it to the internet I would have missed out on some important software updates that significantly improved the display performance.
Fwiw toggling the internet is an easy fix compared to the "dumb tv" way of updating firmware - putting a bin on a flash drive.
Most likely. I worked at a political company (eww) that used this data up to two years after it was generated. Historical data is more useful than near-real-time data for political advertising, since campaign targeting usually needs to be performed, or at least planned, a few weeks in advance. Near-real-time data is great for message tweaking, but knowing whether there's a receptive demographic is a historical question.
I think you'd have to write a DNS server where you choose what to return NXDOMAIN for. So updates.samsung.com, sure, let it connect, but spying.samsung.com, block. (Obviously, do not allow connections to any IP addresses you haven't yet approved, which you approve by manually retrieving the DNS entries.) This can be defeated with DoH, or by different business units inside the company cooperating to use the same domain for different purposes, or by doing the TLS negotiation with good.samsung.com but setting the Host header to evil.samsung.com, etc. The first is too scary to ship (you have to keep the DoH's IP address and certificate safe forever; I wouldn't sign off on that), and the second made me chuckle as I was typing it.
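The core of such a server is small; here's a rough sketch with the dnslib package (the domain names are the hypothetical examples above, and a real deployment would forward allowed queries to an upstream resolver instead of answering with nothing):

    import socket
    from dnslib import DNSRecord, RCODE

    BLOCKED = {"spying.samsung.com."}  # hypothetical, per the example above

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 5353))

    while True:
        data, addr = sock.recvfrom(512)
        request = DNSRecord.parse(data)
        reply = request.reply()
        if str(request.q.qname) in BLOCKED:
            reply.header.rcode = RCODE.NXDOMAIN  # pretend the name doesn't exist
        # else: forward upstream here; this sketch returns an empty NOERROR
        sock.sendto(reply.pack(), addr)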
I'll add that "back in my day" a screen could display the video signal on its inputs without ever needing a software update. But I suppose automatic time zone changes are a reasonable reason that code needs to be pushed post-manufacture. Then again, who needs a clock on their TV?
I have an unfortunately smart TV, which I've never connected to the network. In general:
* it is effectively dumb to me, so I don't care about feature updates
* it isn't connected to the network, so I don't care about security updates
It hadn't occurred to me that there could be TVs out there that are so “smart” that they can’t even take an input without a network connection. Such a device would be returned as defective by me, but of course I can see somebody deciding that packing it all up into the car is too much of a pain.
What data could a smart tv collect on you if you're treating it like a dumb tv? Assuming it lives its life disconnected from the internet with an Apple TV/Roku/Chromecast connected to it. Would it have any data on you other than, maybe, when the tv has been turned on?
> Once every second, software in the Vizio TVs would read pixel data from a segment of the screen. This was sent home and compared against a database of film, television and advertising content to determine what was being watched.
Having at least an advertised number for input latency in the minimum-processing (i.e. "game") mode would be nice. Few panels advertise this in general, but many reviews of home TVs will do at least one measurement of it.
A decent consumer LCD panel will be ~12ms@60Hz, with OLED often being faster. If it's over 20ms or so, I struggle on some old NES games.
Wait, this Disney lawsuit thing is news to me. What's up with that? You're right, in the consumer market there's a much higher demand for 32-40" displays. Our UHD controller can easily handle that and comes with 5x total input ports [ 3x HDMI (V1.4a), 1x HDMI (V2.0), 1x DisplayPort (V1.2) ] and can run 3840x2160 or 4096x2160 resolutions @ 120Hz.
I hope to god the EDID on your displays doesn't include 4096x2160 for 3840x2160-native panels the way LG TVs do; it's such a freaking nightmare to deal with any sort of GPU scaling or DSR resolution options on my LG CX due to that.
I suppose they might think to argue that someone selling televisions with the name Dumbo would be trying to tie their product in with the Disney trademarked product. I don't think that would be considered too much of a stretch, so you probably wouldn't do too well continuing with that name.
Aah, gotcha! Boy, I really overlooked that one. My primary goal was to come up with an easy-to-remember name/domain that expressed "DUMB TV," and dumbo.tv was available. Oh well, I'll have to think that part through a little more.
Agreed. These usually come up when people use the LG OLED TVs with PCs and get annoyingly reminded that they are not actual monitors. In particular, it can be very difficult to turn these "optimizations" off.
Please include decent built in speakers and / or volume-controlled analog audio out. As seen in other comments[0], ARC is finicky and unreliable. I would love to return to the days when I could just switch a TV on and start using it, nothing external required.
(Volume control is essential for analog audio out because if I’m going to use external speakers I don’t want to have to have a separate volume control just for that…)
The problem is that any TV is smart-free if you don't give it the WiFi password (as the article points out), so I'm not sure why I'd pay more for something that would be harder to sell second-hand later on.
Personally my TV is offline with an AppleTV for streaming.
Modern Samsung TVs don’t even let you change any settings if you don’t provide Internet access and accept the TOS. Only thing possible is changing channels and volume.
(Yes, that restriction includes changing the HDMI source for your Apple TV. Simply not possible without.)
> I use a game console for streaming. Works great.
I strongly suspect most game consoles are spying on you just as much as a smart TV would, and all of them currently seem to be pushing ads in your face pretty aggressively too. Still probably better than Roku, which records and sends home multiple screenshots of whatever you're watching every second, but if my PS5 isn't doing some form of ACR (automatic content recognition) already, I suspect it's only a matter of time.
The firmware in TVs can and often does get updated, and smart TVs are increasingly capable of updates and upgrades, but you're right: now that consoles are basically locked-down gaming PCs, there's really no comparison.
I did some minor PS3 hacking a decade or so back and dumped the network activity. Every boot, it would send some XMPP traffic containing a log of your recent activities back to Sony: what you watched over DLNA, what DVDs/Blu-rays you played, which games, for how long, and when.
I imagine things have only got more detailed since then in newer consoles.
* Dolby Vision and HDR10+
* VRR, preferably with G-Sync and Freesync Premium certification
These are my requirements for a new display, anything that meets them in the pipeline? My biggest problem with existing "dumb TVs" is that they lack features like these, yet cost more than "smart TVs" that do.
Yes, we support OLEDs and QLEDs as is. However, "Dolby Vision and HDR10+" and "VRR, preferably with G-Sync and Freesync" are new buzzwords for me. Are they just marketing buzzwords, or do they actually translate to physical or software specs for the controller board/firmware?
VRR is short for variable refresh rate, a feature added with HDMI 2.1. It's useful for video games, as it means they do not have to deliver frames perfectly in sync with the display's fixed refresh rate in order to present a smooth experience. I mention G-Sync/Freesync certification because it is all too common for VRR-capable displays to do only the bare minimum to meet the HDMI 2.1 spec otherwise.
Dolby Vision and HDR10+ are newer premium HDR formats.
I'll talk this over with our firmware designer. I believe most of the things you're asking for come standard as part of the HDMI2.1 spec and the latest spec seems to be HDMI2.1a. As of right now, we are only compliant up to HDMI2.0 due to the nature of our commercial market. They prefer reliability over cutting edge. However, I don't see updating to the new spec as a major issue. High-end FPGAs have really come down in price and have made it much easier to accomplish such tasks due to their high throughput capabilities.
Don't just do the bare-minimum VRR support to be HDMI 2.1 compliant, though. The spec only requires a very narrow VRR window. Lots of cheaper devices make this window 48 to 60 Hz, even when the display itself supports fixed refresh rates up to 120 Hz. VRR should work anywhere from 30 Hz all the way up to the maximum refresh rate supported by the TV. That is why I mentioned the G-Sync/Freesync Premium certifications.
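In frame-time terms the difference is stark (quick illustrative arithmetic):

    def window_ms(lo_hz: float, hi_hz: float) -> tuple[float, float]:
        # (fastest, slowest) frame time a VRR window can absorb
        return (1000 / hi_hz, 1000 / lo_hz)

    print(window_ms(48, 60))   # (~16.7, ~20.8) ms: the bare-minimum window
    print(window_ms(30, 120))  # (~8.3, ~33.3) ms: full-range VRR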
How will you be dealing with movie content? 24fps content is quite hard to display without proper processing, especially on large screens, where the telecine judder will be noticeable.
Also HDR support and DRM will be quite problematic.
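(To spell out the judder: 60/24 = 2.5, so on a fixed 60 Hz panel each film frame must be held for 3 then 2 refreshes, alternating, which makes frame display times uneven:

    # 3:2 pulldown cadence for 24 fps film on a 60 Hz panel
    cadence = [3, 2] * 12                        # 24 film frames -> 60 refreshes
    assert sum(cadence) == 60                    # exactly one second of refreshes
    print([n * 1000 / 60 for n in cadence[:4]])  # ~[50.0, 33.3, 50.0, 33.3] ms

That alternating 50/33 ms hold is what reads as judder on a big screen.)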
I'm not too sure about this. I use AppleTV with our industrial display playing back HDR content without any issues. Here's a 4K video playing via AppleTV on a 16ft x 9ft indoor display at our facility: https://imgur.com/a/3O58O8T
The majority of our customers use our displays for digital signage, so I would assume we would've heard about it if they couldn't play back their 4K or 8K content. Some of them play this content on video walls as large as a 14x8 configuration without any issues. I will double-check with our firmware designer to be sure. From what I know, I believe HDCP is supported by all versions of HDMI. Older TVs, specifically the ones without HDMI input (i.e. component, VGA, RCA, or coax input), wouldn't be able to support this. The majority of our current controllers are compliant with the HDMI 2.0 spec. Someone else here asked about HDMI 2.1a, and I'm going to get an answer on that from our firmware designer this weekend.
Signage displays do not need to display protected content, they do not need to support standards like DolbyVision and they do not need to support three-two pull-down to display cinematic content without judder.
HDCP isn't supported on all versions of HDMI; it's not part of the spec. It is a standard that sits on top of whatever display interface you are using.
You're right, firmware designer just confirmed that Digital Signage applications don't use HDCP (he actually said "prefer not to use") but he does believe that our controllers using MediaTek SoCs support HDCP and a separate firmware build could be made for this use case...or maybe even a toggle DIP switch could serve this purpose on the controller PCB.
The controllers may support it, but you will be the one who has to buy a license and the keys from DCP.
Overall I don’t think you realize just how difficult your venture might be.
Signage displays will make really bad televisions.
They are optimized for long term operations over color accuracy and gamut, they usually operate at fixed refresh rates, and their latency is usually pretty terrible.
I don’t think you realize just how small the market for what you are looking to build in the first place is and how difficult it would be to make anything that is remotely useable as a display for movies and video games.
Those who want a dumb display can already buy one of those 50-65” enterprise displays for meeting rooms however they make pisspoor TVs and having to pay 3-5 times the price of even a high end TV makes them pretty pointless for even the user base that might want that.
The large format gaming monitors are an option but recently they are adding “SmartTV” features to them because that’s what most consumers seem to want.
Both LG and Samsung are making 30-40" gaming monitors with apps like Netflix these days, because they seem to be quite popular with students and flatsharers.
I understand what you're saying. We work directly with Samsung. While HDCP may be a challenging idea due to licensing issues, obtaining LCD panels with faster response times and higher refresh rates isn’t a problem for us. In fact, these consumer-grade panels are much more readily available than actual industrial panels. If we encase these 32-40" panels in an industrial metal casing, we can create a product that is more desirable than similar models housed in plastic with SmartOS, and more affordable than an Apple XDR.
> we can create a product that is more desirable than similar models housed in plastic with SmartOS
I would not pay a premium for a metal case - and I am not in the cost-conscious-above-all consumer market. I am really skeptical there is a large contingent that would. In any event - maybe no SmartOS, but this minimal firmware would have to be beyond reproach, and support Dolby Vision, proper tone mapping, G-Sync/Freesync/VRR, ISF calibration, a substantial Rec. 2020 gamut, and probably excellent upscaling. You're never gonna attract the cost-conscious market - and the segment beyond that has very particular standards. I don't see how you can survive excluding either the gamer or the cinema-enthusiast market - especially when the current high-end TVs already cater to both. Just because the LG C-series TVs are clearly a consumer product does not mean they are junk. And the software isn't nearly as bad as people gripe about (at least in the higher-end market where you'd be competing) - not enough to forgo actual image-processing features.
Elsewhere, consumer display color calibration was dismissed as a "marketing gimmick" - it isn't. These TVs have fantastic accuracy out of the box, but that's because they're carefully designed and factory-set that way. I think you are too dismissive of the existing market.
The panel isn’t the important part, how you drive it is.
TV manufacturers spend a lot of time and effort on figuring out how to drive these panels effectively and they build their own silicon for this.
Even though most TVs use the same MediaTek SoC for the smart features as well as the I/O, they have their own image processors and drivers on separate silicon.
I suspect that if you are serious about this you’ll be far better off partnering with an OEM who already manufactures TVs and just toning down the SmartOS features.
Getting a display to operate at a 10ms-or-better response time, support VRR, have the image processing required to display all types of media content, support HDR with proper tone mapping, get the color science right, and much more is going to be a monumentally complex task.
Hey, this thread is the first time I’ve ever heard anyone seemingly wanting HDCP, why is that? Isn’t everyone’s goal to strip it away from the video feed so it stops getting in the way? Is there some benefit to it I’ve missed?
Because if you don’t support it you won’t be able to play any protected content on it which means no AppleTV, FireTV or any other TV box at least above 720p SDR.
Thanks, so it is just a roadblock. Please excuse me for pressing on: most things either play or don't play depending on that HDCP (yeesh, HD CP :s), but if I strip it away using, say, the second output of a splitter, it will play. Is it the case that, for example, an Apple TV box will only output 720p if it doesn't get some kind of positive handshake?
Allow me to add that every time I’ve removed it I’ve had full licensing and permissions, just that HDCP was in the way.
Yes. If the source device does not get a handshake, and a continuous one (there is a re-check at least every 7ms), it won't display content, or will display it in a degraded form.
The same goes for not supporting HDCP of a specific version: e.g. a display that supports only HDCP 1.3 but not 2.0 will be able to display 1080p Blu-rays but not 4K ones.
Yes, in theory you can strip it off; however, that is very expensive and difficult, especially with more recent standards (I'm not aware of any way to strip HDCP 2.1 and higher) and bandwidths.
Also to strip it you still need to get a license and keys so it would be quite difficult to do so on a commercial basis especially in a country with strong rule of law.
It's strange that a company as large as Walmart has a shopping category dedicated to "HDCP Strippers" at https://www.walmart.com/c/kp/hdcp-stripper. Upon further research, I stumbled upon numerous YouTube videos, such as https://www.youtube.com/watch?v=W5-PJpSfDJ8, where the user was able to strip away HDCP 2.2 and still achieve 4K@60Hz output. The user also mentions that it would work with a 120Hz panel. This raises the question of how Walmart is able to get away with this, despite the rule of law.
HDCP is separate from HDMI. You can be spec compliant with HDMI at any version and not support HDCP. That said I would be surprised that a display controller in common use would not support HDCP.
4K or 8K content sure, but what about HDR? Especially at sizes less than 40” that’s more important than res for viewing content.
Have you considered making the firmware open source? Or at least hackable, so folks could run open source on it. The BoM and hardware design would be nice too, but that's a bigger ask :)
It would be so cool if it supported something like an insertable Raspberry Pi the way those NEC displays did, but with standard consumer TV display characteristics (4K, >60Hz, etc.)
I'm glad to see you are actually doing this. I still remember your post from a few years back about building industrial monitors, where you talked about this idea out loud, and me thinking: "Fuck yeah, I'll buy one." That idea has always stuck with me.
All my TVs have computers hooked to them, mainly low-power x86 systems running Linux or 9front. Linux gets me a browser, which makes it a useful internet toaster to watch Hulu, browse, and so on. Then I can open terminals and whatever while I'm on the couch or in bed. It's 1000x more useful than a smart TV or even a TV stick.
Is the RS232 protocol ASCII-based?
Sizes I use at home range between 32 and 55 inches, and I both stand them on tables and wall-mount them.
I would love to keep tabs on this. I'm sure it would interest redditors at r/privacy too.
I've often thought about going for something like a Sharp/NEC display, because I often use my TV as a PC monitor and want to be sure it's not phoning home, but the idea of being able to slide in a compute module is also generally interesting. One of the things that always stops me is quality and responsiveness. It would still be in my living room, so I still want a nice picture and features like variable refresh rate for devices like a PS5.
Sounds great! What price point are you aiming for? Will this be much more expensive than inexpensive Samsungs/Vizios that are subsidized by data collection?
I haven't done much market research in that aspect yet, but we currently sell our industrial 70" display at around $9-12k/ea depending on the configuration (i.e. type of metal shell: aluminum/stainless/carbon steel; ethernet support; protective polycarbonate screen in front to prevent damage from industrial machinery). However, I expect the consumer-grade version to be much more cost-effective, considering the sizes in demand are much smaller (32-40"). Since I have more experience in sheet metal design/fabrication for the enclosures, I still plan on building these TVs inside a metal shell (specifically aluminum) instead of a plastic housing like your standard Samsungs/Vizios. I'm open to market consensus as to the most favorable price point for the size.
FWIW, I'd be looking for a 65" or larger, but to be honest those prices probably wouldn't work for me. But regardless, I'll look forward to seeing your Show HN when the time comes!
We ship globally to our customers right now but I believe having a distribution channel for global retail would be much wiser. I'm definitely open to it once we get to that point.
How about having a built-in open source capability that can be turned off/on (i.e. it has a basic firmware/OS, but can boot into essentially a Linux distro)? To me that would be great, since it addresses the main problem, which is the maintainability of the 'smart features'. Bonus points if the smart board is modular and can be swapped for a new version without much trouble.
I am so excited for such a TV, here are some name ideas for you: plainum-TV, gramseye, you-control-it(still comes with a remote), BarelySmart, SmartEnuff, MindYouTech, AllPane, SimplePaneVision, Plainorama, Plainoramic Vision.
HDMI is such an abomination thrust upon the world. Why do we accept a TV going blank for seconds for any reason in 2023? Lawrd help us all. Can you optimize HDMI resync in any way? That would be a major selling point. Maybe blend the output between what was shown and what is about to be shown: apply a gentle blur filter to the last seen frame for a few moments and fade to black after 3-4 seconds, floating a gentle explainer message "HDMI sync lost..." after 3 seconds.
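Roughly what I mean, as a toy sketch (the timings mirror the ones above; the "frame" is reduced to a single brightness value so the example stays self-contained, and none of this is anyone's actual firmware):

    # Toy model of the fallback: given seconds since HDMI sync was lost,
    # return how bright the frozen (blurred) last frame should be and
    # whether to float the "HDMI sync lost..." message.
    FADE_START_S = 3.0     # hold the blurred frame this long
    FADE_DURATION_S = 1.0  # then fade to black over this long

    def fallback_frame(elapsed_s: float) -> tuple[float, bool]:
        if elapsed_s < FADE_START_S:
            return 1.0, False  # blurred last frame at full brightness
        k = min((elapsed_s - FADE_START_S) / FADE_DURATION_S, 1.0)
        return 1.0 - k, True   # fading out, message visible

    for t in (0.0, 2.9, 3.5, 5.0):
        print(t, fallback_frame(t))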
This is the type of monitor I'd buy if the spec and price were somewhat in range with the rest of the market. I love the idea of a single-purpose device in a square metal case that makes no compromises for aesthetics or to fit in a showroom.
It's always frustrating to see manufacturers intentionally make their product worse just to look better in a store. I'm sure I'm not the only one who cares a lot more about image quality, latency, durability, etc. than about a fancy rounded plastic case with a laggy Netflix integration.
- PrivateTV (only you know what you're watching, or what's said in front of the TV)
- yourTV (you own the TV. It doesn't own you. You are not the product--it is.)
- dumbTV (smart choice)
Great idea if the panels are affordable (< $2K). A friend of mine had a Panasonic plasma display as a TV and it served him well for a long time. The only pitfall with a display is that there's typically no speaker, which is fine with me as I'd buy a soundbar anyway. I presume that since it's a commercial display, the service-level display settings won't be hidden?
What would you be doing that I couldn't get by just buying a display from one of the big names' "digital signage" lineups? Like, for instance, any of these:
Large format or digital signage displays are not necessarily industrial displays. In fact, they would not last long in a real rugged manufacturing environment where they could be exposed to extreme heat, toxic chemicals, dust, weld arcs, and physical trauma caused by heavy machinery.
Our company caters to this market by designing and fabricating our own thick metal casings for our displays. We also bond our sourced LCD panels with an additional layer of 1/8”-1/4” thick anti-glare polycarbonate sheet to help them resist physical trauma. We have also designed our own controller that operates these displays with a super-minimal, barebones RTOS. Our controllers can be controlled via RS232, but unlike other displays, we do not have an onboard PHY (i.e. no Ethernet connectivity). This is an industrial requirement that originated with Stuxnet.
The majority of large format displays on the market today are more like smart TVs than actual dumb TVs, and though they are great products, they would not last in a real rugged industrial environment. I can create large format displays without the smartness for consumer use at approximately 50-70% of their price.
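On the RS232 question asked above: for anyone wondering what ASCII control over a serial port typically looks like, here is a minimal pyserial sketch. The command strings ("PWR ON", etc.) are invented for illustration and are not an actual command set:

    # Send one ASCII command over RS232 and read the one-line reply.
    # Port name, baud rate, terminator, and commands are all assumptions.
    import serial  # pip install pyserial

    def send_command(port: str, command: str) -> str:
        with serial.Serial(port, baudrate=9600, timeout=1) as conn:
            conn.write((command + "\r").encode("ascii"))
            return conn.readline().decode("ascii", errors="replace").strip()

    # e.g. print(send_command("/dev/ttyUSB0", "PWR ON"))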
Mostly saving money. Digital signage is what you want if you want a "dumb" display, but they don't have a TV tuner (which may or may not be an issue) and are also rated for 24/7 use, and thus much more expensive than consumer-level TVs at the same size. They usually have minimal image processing and very low input lag as well.
If you don't mind dropping the money then go for it, but I think most people will want to go for a consumer model for much less.
Also I am not sure if those support HDR or other features that may be desirable in a consumer display.
Digital signage is <2x the price of consumer TVs. I'm skeptical that an independent can do a comparable product for less, when normal consumer TVs are subsidized by their data collection and advertising businesses, and operate at the scale they do.
HDR is a good point; I haven't had a chance to use any of those newer TV features, so I'm not sure what I'm missing there. But that's the sort of thing I'd wait a few years on, to see if it sticks around or if it's just a fad to sell more TVs like 3D was.
HDR is very well adopted. Basically all 4K content now includes it. 3D was indeed a fad, but HDR is standard on 4K movies as well as video game consoles (excluding the Nintendo Switch).
Do you have some sort of mailing list I can subscribe to? I'm very interested in something like this, but I expect that I'll forget your post by tomorrow and never hear of it again unless someone sends me an email about it once it's available.
Are you considering an OS or even a somewhat powerful CPU? I'd prefer a TV with an Ethernet port + SMB client, but I'm not sure if it fits your dumb TV concept :)
Do yours have speakers? Because the problem with "industrial" displays I looked at was they were just that: displays. Not everything you might expect from a TV, like speakers.
I didn't see anyone else mentioning these when I started writing. Apologies if I'm repeating someone else!
Gamers and retro-gamers are distinct but related groups.
Both groups want ultra-low, consistent latency as a priority, with minimal to no post-processing by default.
For modern console and PC gamers, if your panel will support high(er) refresh rates (60 Hz, 75 Hz, 120 Hz as the ideal minimum, 144 Hz as recommended, 240 Hz as perfect) at standard resolutions (1080p, 1440p, and 2160p), please include them.
If having AMD's standard variable refresh rate (FreeSync) is an option, that'd be great.
Both DisplayPort and HDMI inputs ideally, but I understand if it's HDMI-only.
For retro gamers, if you have to make educated guesses as to input settings or processing (deinterlacing for example) please give us the option to override the defaults. And I guess respect what you're told the input is. Don't assume 240p is 480i, or vice versa. You might think that's a given, but it isn't.
If you're going to include legacy inputs (like component video, composite, VGA), please ensure they have low, consistent latency.
Ideally, include support for the (now) sometimes-unusual inputs from classic consoles, or the ability to add it later via firmware updates, as a stretch goal or whatever.
Or at least test the majors from PAL and NTSC regions: NES, SNES, N64, GC, Wii. Master System, Genesis, Saturn, Dreamcast. PlayStation 1, 2, and 3. (Personal request for Atari Jaguar!)
But scalers like the OSSC and RetroTINK line exist for people who need to hook up weird stuff, so there's no massive need to bake it into the TV. So long as the HDMI inputs are low and consistent latency, anything else can be worked around outside.
If including legacy inputs is something you're thinking of doing, please have a quick word with Bob over on RetroRGB and anyone else he'd recommend speaking to.
[EDIT: Also give the option to display as much info as possible. Resolution, refresh rate, connection standard, audio codec, errors, HDCP standard and status etc.]
You may find yourself the primary supplier to the whole retro niche!
And if you become the primary TV makers for gamers, you will never run out of market or free advertising.
If you'll excuse an odd, personal, somewhat scope-creep-ish request... if the colour data from the screen edges were accessible, Hue/Ambilight-style tech would be implementable without having to point a camera back at the screen or intercept the inbound HDMI stream (as the Hue Play HDMI Sync Box does). That'd be awesome. No idea how feasible that'd be in terms of connectivity and dev time, though.
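To illustrate what I mean (this assumes the controller exposes each frame as an HxWx3 array somewhere; the zone count and 5% border depth are arbitrary picks, not anything from the actual design):

    # Average the outer border of a frame into a few zones per side --
    # the kind of data an Ambilight-style lighting controller would consume.
    import numpy as np

    def edge_zones(frame, zones=8, depth=0.05):
        h, w, _ = frame.shape
        d = max(1, int(min(h, w) * depth))  # border thickness in pixels
        def avg(strip):
            # Split the strip along its long axis, average each piece to RGB.
            return [p.reshape(-1, 3).mean(axis=0)
                    for p in np.array_split(strip, zones)]
        return {
            "top":    avg(frame[:d].transpose(1, 0, 2)),
            "bottom": avg(frame[-d:].transpose(1, 0, 2)),
            "left":   avg(frame[:, :d]),
            "right":  avg(frame[:, -d:]),
        }

    # e.g. edge_zones(np.zeros((1080, 1920, 3), dtype=np.uint8))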
Another thing I forgot to say out loud is this: I want it to be more DIY-friendly and future-proof. I want it to be repairable and easily customized. Like, offer a simple dumb-TV out-of-the-box experience that gives hackers and the DIY crowd a platform to build on. Think MNT Reform or Framework laptops, but for TVs. Ensure the designs don't change much over time, so parts are the same or similar between models/sizes.
It would be nice if there was some sort of bay to hide a computer inside that lets the user pop a board in and turn their dumb TV into a "smarter TV". Something like an ITX bay that can be left empty from the factory. I say ITX because it's a standard that gives the owner the freedom to put the computer of their choice into it, plus an upgrade path. Adapters/carriers for RPi or other RISC-V/Arm SoMs/SBCs could be made to fit the ITX bay. PSU would be tricky, but MAYBE the TV PSU could offer an extra 60+ watts of "user power" at 12V/24V DC to an internal Molex connector, with a little extra room to mount a DC-DC ATX PSU board? Maybe someone puts an FPGA board in there, whatever. Optional freedom bay FTW.
I know it sounds like a computer case at this point, but I'd love a big-ass all-in-one to neatly hang on the wall and run my Linux/Chrome internet toasters with just a power cable and maybe Ethernet.
Top, side, and bottom mounting holes for optional speaker bars or custom speaker setups, cameras, accessories, decor, etc. Just locate a bunch of mounting holes, make the dimensions known, and let the users or aftermarket fill in the blanks. If you offer audio, maybe let the user remove or replace the speakers? Like maybe a bunch of M3.5/M4 holes on 100 or 150 mm spacing around the rear edge, with closer spacing at the corners. A simple flat piece of sheet metal can be drilled and screwed on to mount stuff. Or hell, mount the TV flush or frame it directly in reclaimed wood using those holes. Anything, really. I can't tell you how many times I wanted to mount shit to the sides of my TV but couldn't. I even made a speaker bracket out of a piece of angle iron attached to the VESA mount with 1/4-20 screws to hang the speakers up higher off the sides, because Sony's downward-facing speakers are useless. But that is a big piece of metal, and side holes would make my life easier. This is a nice way to enable modding.
Possible to accommodate a board that feeds an internal eDP, LVDS, DSI, etc. connection to the LCD controller? Not important, but interesting.
DC power input? For a big 70" I can see low-voltage DC being impractical, but if you plan to make smaller displays it might be interesting to have a 12 or 24V option. I work in industrial automation, so I'm all about using 24V where I can. Also maybe a wide-input DC supply for off-gridders running 12-60V solar battery banks. Maybe the power supply board and inlet could be designed to allow AC-DC or DC-DC supplies to be installed/swapped in the factory or field. And for DC in, I'd want a pluggable screw terminal block, aka "Phoenix blocks", as I don't like barrel jacks.
With all this potential I could allow myself to spend a lot of money on a hacker-friendly TV. This is why people pay $1000+ for the MNT Reform and Framework laptops.
I'm all for non-smart TVs and have gone that route by simply not hooking mine up to the internet.
That said, I think there is a middle-ground here.
If it's within your capacity, I'd consider allowing a USB connection. It would be enough if I were able to attach a USB thumb drive or an external hard drive (powered through the USB connection) and the TV provided a basic visual navigation system.
The nearest comparison is something like the current Roku USB app, or a more basic version of the Infuse app (https://firecore.com/), where the library structure is a reflection of the file system. I'll skip the details, but a dedicated user can set the cover image for a given video by having an image file with the same name as a companion to the MP4, etc.
I know that would be a huge jump, but I'd pay for it.
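The pairing rule I have in mind is simple enough to sketch (the extensions and layout are just my guesses at how such an app behaves, not how Roku or Infuse actually do it):

    # Walk a USB drive and pair each video with a same-named cover image.
    from pathlib import Path

    VIDEO_EXTS = {".mp4", ".mkv", ".avi"}
    IMAGE_EXTS = {".jpg", ".jpeg", ".png"}

    def scan_library(root):
        """Map video path -> cover image path (or None), mirroring folders."""
        library = {}
        for video in sorted(Path(root).rglob("*")):
            if video.suffix.lower() not in VIDEO_EXTS:
                continue
            covers = (video.with_suffix(e) for e in IMAGE_EXTS)
            library[video] = next((c for c in covers if c.exists()), None)
        return library

    # e.g. scan_library("/media/usb0")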
And which file formats and filesystems should that USB connection support? Should it support next year's H269 format? Who will provide firmware updates to add new video formats?
That's my main objection to many of these all-in-one systems: the built-in technology support will be obsoleted before the rest of the system fails due to old age. I much prefer a dumb TV with a separate media box/dongle that I can replace with a new model every year to paying for a TV with soon-to-be-stale features.
That’s straight back into smart territory, and precisely not what I want. Just give me something that shows what it receives via HDMI. Give me two inputs I can switch between, maybe. That’s it.