Hacker News
Luster Lost: Pondering the way that physical objects degrade over time (tedium.co)
40 points by benbreen on March 13, 2024 | 14 comments


Their article about the Fisker review by MKBHD (https://tedium.co/2024/03/04/mkbhd-fisker-negative-review-fi...) is also great. It expands on the idea that:

    the malleable nature of software in the modern era means that it’s now possible to release products before they’re actually ready.


There's an argument in this article that digital media can't get any "better" because the 1s and 0s can only be replicated; there is no opportunity to remaster it with improved digital tools. And then a nod to the fact that N64 emulators can significantly upscale the media (which reminds me of a similar discussion about how the pixel art of old games looks significantly better on real CRTs, because the artists designed the sprites to interact with CRT tech).

It isn't really the case that digital reproduction can't be improved while analog representation can (with improved tech), however. It points at a more interesting discussion (to me at least) about how any preservation effort is a kind of curation. It is a political act, albeit a subtle one.

Even the brand new media that I consume today is generally curated. My "smart" TV does a wide range of modifications to the digital image it displays before it reaches my eyes. Getting it to give me an "honest" version of any video is remarkably difficult, and that's before it even hits the physical layer of anti-glare coating or intermediary layers like ambient-light dimming.

And who's to say what the canonical version ought to be anyway? Was the digital input created assuming a particular interpretation at output? Perhaps it was targeting some assumptions about the average display? Did you really do it right if you reproduce a book as if it were freshly printed, even though it was first printed 140 years ago? Or did you "do it right" if you emulate the effect of the book freshly printed 140 years ago and then very, very carefully stored for 140 years?

Back when I was a professional musician, I would always listen to the recorded "digital canonical" version on mid-range monitors, high-end headphones, and super cheap car-stereo speakers, to get a sense of how it would sound in each medium. But I can never know exactly how it hits the record that really matters, which is the particular influence on the brains of the listeners.

Ultimately, you do what seems right, which in turn has more to do with evolution than correctness, even if your goal is transparent correctness.


It's interesting that there are now CRT shaders available for emulators. I just found a discussion about a C64 CRT shader, and people do share their preferences: https://www.lemon64.com/forum/viewtopic.php?t=79062


I agree that it's super cool and super interesting, but it's also never going to be a true match. It's not as big a deal with CRTs imho, but if you've ever seen a phosphor vector-graphics display (such as those used in older oscilloscopes or in the original Asteroids arcade cabinet), then you'll know what I mean. New displays can look absolutely amazing, and can emulate aspects of older displays, but I'm skeptical of current CRT emulation, and I've never seen a modern display that could emulate what happens with a phosphor vector display.


This demo tells me that we're relatively close already, and that HiDPI should eventually hit a spot where you could use pixels to represent the electron-gun screen gaps and the bleed effect at the edges of subpixels.
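
Something like this toy numpy/Pillow sketch (mine, not the linked shader; the input filename is a placeholder) is what I mean by using the extra pixels directly:

    # Toy sketch: with enough output pixels per source pixel, you can draw the
    # dark gaps between scanlines and a little edge bleed directly, instead of
    # approximating them in a shader.
    import numpy as np
    from PIL import Image, ImageFilter

    SCALE = 8   # output pixels per source pixel; the "HiDPI" headroom
    GAP = 2     # rows per scanline band left dark: the gaps mentioned above

    def fake_crt(frame: Image.Image) -> Image.Image:
        big = frame.convert("RGB").resize(
            (frame.width * SCALE, frame.height * SCALE), Image.NEAREST)
        px = np.asarray(big, dtype=np.float32)
        # Darken the last GAP rows of every SCALE-row band: the scanline gaps.
        mask = (np.arange(px.shape[0]) % SCALE) >= (SCALE - GAP)
        px[mask] *= 0.25
        out = Image.fromarray(px.astype(np.uint8))
        # A slight blur stands in for phosphor bleed at the pixel edges.
        return out.filter(ImageFilter.GaussianBlur(radius=SCALE / 8))

    # "n64_frame.png" is a placeholder for any low-res emulator frame.
    fake_crt(Image.open("n64_frame.png")).save("n64_frame_crt.png")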


HDR will help too, I think. It isn't just about the resolution, but the dark darks, and the bright brights.


Ah, true. Black for the gaps.


Remember when .PNG included gamma correction to try to make a more accurate image -- appearing as the creator viewed it...

And then WE ALL REJECTED THIS INSANITY because it made .PNG images a pain in the butt to work with, since the gamma-corrected image wouldn't match the RGB values of the surrounding document (see: CSS color codes). Then Ye ol' .GIF enjoyed being the pixel perfectionist's choice of image format for the web for quite a while longer. I once was forced to write a script that chopped up a 24-bit image into a bunch of 16px by 16px .GIFs (one palette entry per pixel, 256 total).
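
For anyone who never hit this, here's roughly the arithmetic a gamma-aware viewer did. A toy sketch, assuming an idealised viewer and a 2.2 display exponent (real browsers varied):

    # Why a gAMA chunk breaks colour matching: the viewer decodes with the
    # file's declared gamma and re-encodes for the display, so the values on
    # screen are no longer the values in the file (or in your CSS).
    def viewer_value(sample: int, file_gamma: float, display_exponent: float = 2.2) -> int:
        linear = (sample / 255.0) ** (1.0 / file_gamma)            # decode per gAMA
        return round((linear ** (1.0 / display_exponent)) * 255)   # re-encode for display

    # A PNG filled with #336699, authored by an old Mac tool that wrote gAMA = 1/1.8:
    print([viewer_value(c, 1 / 1.8) for c in (0x33, 0x66, 0x99)])
    # roughly [68, 120, 168] -- visibly lighter than the CSS #336699 around it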

Digital doesn't usually need to be restored as long as it is replicated often enough (before bit-rot sets in). However, I've got a large number of tools for restoring spinning disks (migration to new hardware isn't easy for the average end-user).
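
By "replicated often enough" I mean something like keeping checksums next to the copies and re-verifying on a schedule, so a flipped bit is caught while a good replica still exists. A minimal sketch (paths are placeholders):

    # Minimal bit-rot check: record a SHA-256 per file once, then re-verify later.
    # Any mismatch means this replica is damaged and should be re-copied from a
    # replica that still verifies.
    import hashlib, json, pathlib

    def sha256_of(path: pathlib.Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def record(root: str, manifest: str = "manifest.json") -> None:
        sums = {str(p): sha256_of(p) for p in pathlib.Path(root).rglob("*") if p.is_file()}
        pathlib.Path(manifest).write_text(json.dumps(sums, indent=2))

    def verify(manifest: str = "manifest.json") -> list[str]:
        sums = json.loads(pathlib.Path(manifest).read_text())
        return [p for p, digest in sums.items() if sha256_of(pathlib.Path(p)) != digest]

    record("/archive/photos")   # once, while the copy is known good
    print(verify())             # later: any listed path has rotted and needs re-copying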

Let's say a modern game came out that could demand more polygons/particles from a GPU than today's hardware can deliver... but in the future those capabilities might exist. Digital media could be improved by adding more/better compute resources (if originally designed to scale, that is). Then there will be curmudgeons (like me) who think things were better before the edge users' hardware became a giant supercompute cluster / distributed storage...
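
By "designed to scale" I mean something as simple as shipping budgets beyond today's hardware and clamping at runtime. A toy sketch (all the names and numbers are made up):

    # The game stores an "ideal" budget that exceeds current GPUs; the engine
    # clamps it to what the device reports, so future hardware simply unlocks
    # more of the original intent without a remaster.
    IDEAL = {"particles": 10_000_000, "shadow_resolution": 16_384}

    def effective_settings(device_particle_cap: int, device_texture_cap: int) -> dict:
        return {
            "particles": min(IDEAL["particles"], device_particle_cap),
            "shadow_resolution": min(IDEAL["shadow_resolution"], device_texture_cap),
        }

    print(effective_settings(2_000_000, 4_096))      # today's GPU: a scaled-down version
    print(effective_settings(50_000_000, 32_768))    # a future GPU: the full budget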


I did have to help a designer once figure out why colors weren't matching and it came down to Photoshop ignoring the gamma correction settings he had asserted and trying to 'help' by correcting from Mac to PC anyway. Thanks Adobe. If someone is typing in hex values, leave them the fuck alone.


If the models from the original CGI are stored and not just the generated media, then one should in fact be able to digitally remaster a CGI movie.

Edit the models to relax the old triangle budgets, rework the textures, render on a modern engine with cleaned-up effects, higher resolution and color accuracy, repeat until done.

It could even be a model for buying defunct CGI studios for their data and rights. In case some cult classic they worked on wants to be rereleased some day at 16K.


I'm in that weird mid-to-lo-fi club... many times I enjoyed the music more over FM-quality links, and many times I felt more while watching degraded VHS content on YouTube (for human reasons, like the fact that people were fully invested in their art even if the medium was low resolution, and sometimes because analog limits added to the video in a strange serendipity, like camera lens glare making live music shine even more magically).


Digital data, if considered in some Platonic sense, doesn't degrade. However, the physical substrate required to store said data does absolutely degrade over time: hence the need to copy data to new media periodically. 'Tain't no way to get around the Second Law of Thermodynamics.


Not only the physical substrate, but the platform it runs on. Even software that's 30 years old (not much for a library/archivist) can bring substantial challenges. Much more so if it was a closed system like a game console, or a DRM-protected binary that may or may not work in an emulator (if said emulator exists).

I do wish that published software had a maximum license life, much like patents, which would allow archiving.


I'd construct a sad caveat to this, but Randall Munroe did it better than I can already: https://xkcd.com/1683/



