
Seriously. These "enumeronyms" drive me batty.


I love this word, enumeronym, though. It seems you may have coined it, as the only Google search result is your comment. The second result may well be this comment. I hope it catches on!


Google suggests that the correct term is "numeronym" [1].

> According to Tex Texin, the first numeronym of this kind was "S12n", the electronic mail account name given to Digital Equipment Corporation (DEC) employee Jan Scherpenhuizen by a system administrator because his surname was too long to be an account name. By 1985, colleagues who found Jan's name unpronounceable often referred to him verbally as "S12n" (ess-twelve-en). The use of such numeronyms became part of DEC corporate culture.

[1] https://en.wikipedia.org/wiki/Numeronym
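
For the curious, the formation rule is simple enough to sketch in a few lines of Python. This is just an illustrative helper (not from the article): first letter, count of interior letters, last letter.

  def numeronym(word):
      # first letter + number of interior letters + last letter,
      # e.g. "internationalization" -> "i18n", "Scherpenhuizen" -> "S12n"
      if len(word) <= 3:
          return word
      return word[0] + str(len(word) - 2) + word[-1]

  print(numeronym("Scherpenhuizen"))        # S12n
  print(numeronym("internationalization"))  # i18n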


All things considered, 84-bit encryption is pretty impressive, given that DES was still considered "okay" even into the late 1990s.

https://en.wikipedia.org/wiki/Data_Encryption_Standard

https://en.wikipedia.org/wiki/Deep_Crack
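
Rough numbers, for anyone who wants the comparison: DES uses a 56-bit key, so an 84-bit key gives 2^28 (about 268 million) times the keyspace under a plain brute-force search. A quick back-of-the-envelope in Python:

  # Compare raw keyspace sizes: 84-bit vs DES's 56-bit key
  des_keys = 2 ** 56
  keys_84 = 2 ** 84
  print(f"DES keyspace:    {des_keys:.2e}")             # ~7.2e16
  print(f"84-bit keyspace: {keys_84:.2e}")              # ~1.9e25
  print(f"Ratio: 2^{84 - 56} = {keys_84 // des_keys:,}")  # 268,435,456x larger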


  If they ever...
Boiling frogs, my friend.

Boiling frogs.


  a typical cinema release is a 
  pretty big file - frame-by-frame 
  encoded as jpg2000 in 4k. Often 
  shipped on hard drives, or 
  downloaded via dedicated fiber 
  lines.
This is not a showstopper. All you need is one unlocked copy to drop in the open, and then everyone can work on it and get it packed up in a more reasonable format.

  This all works due to proprietary 
  hardware - the modifications to the 
  4k cinema projectors that enable 
  decryption almost at the lens/imaging 
  chip can cost as much as the projector 
  itself, doubling the price of the 
  projector (last I heard from around 
  15k to 30k USD).
So, yeah, that's great, but the only level of piracy it could deter is the pixel-perfect variety. A motivated individual with a comparable budget could probably optically extract a high-quality duplicate with an expensive digital video camera (and perhaps even a beam splitter), and pull signal from the speaker transducers with professional audio recording equipment.

There'd be no real need to tamper with proprietary hardware or intrude on the chassis of the projector. With reasonable equipment, it's likely that a person could produce a re-recording of a movie that is very nearly indistinguishable from the factory source, except perhaps for negligible artifacts that might irritate only the holiest of true believers.


> a typical cinema release is a pretty big file - frame-by-frame encoded as jpg2000 in 4k

In a way it's shocking that huge cinema screens ONLY project 4k. When we had film instead of digital projectors, we actually had a much higher resolution on screen. It's a shame we replaced it with an inferior technology.


I'm not convinced. If you happen to see a fresh print, with no scratches, dirt, skipped sprockets, or speed issues, then - yes - it is theoretically marginally better than a digital print.

Similarly, a high-priced record player might have slightly better sound quality than a cheap CD player.

But for most people, digital Cinema and CDs represent a vastly superior experience.


Wait. Film is not just better for resolution; it has much better dynamic range and color depth than digital (digital is coming close, but still isn't there yet).

A newer technology isn't necessarily a better one. CRTs still hold some advantages over LCDs, paper still holds advantages over e-ink, and so on. Of course digital photography wins by a large margin on cost and convenience, but that's not the whole picture.


I work in the film industry; a 35mm full-aperture negative scan holds only about 2048x1556 pixels' worth, and even at that resolution the film-grain blobs are several pixels across (especially in the blue channel).


True and not true. I own a few telecine and film scanning machines. Most older film material, especially 16mm, isn't worth scanning above full 2k. The 1556 in the vertical is especially important on 16mm because lots of people want to reframe into 16:9, and 1556 gives more than 1080.

Anyway, most old stock (print, negative, IP) has grain such that scanning above ca. 2.5k won't yield anything better. However, Kodak and Fuji stock from the mid to late eighties is definitely worth capturing at higher detail. The highest-detailed stock from the nineties we've measured has a spatial resolution of around 8.5k, and that's about as high as you will (rarely) get out of 35mm.

tl;dr: depends on the stock

edit: Here's previous mini discussion regarding telecines, if anyone's interested https://news.ycombinator.com/item?id=11377678


> I work in the film industry, a 35mm full-aperture negative scan holds only about 2048x1556 pixels-worth

There are enough good 4k releases of classic movies to see this is not the case. You can compare the 4k and HD releases (the latter being just under the resolution you provided).

I regularly scan Vision 3 film (movie stock) and Velvia 50 and Velvia 100 (E6). I scan it professionally at 10 Mpix, but when I have a good picture, I pay big bucks to have it drum scanned at 100 Mpix. Yes, it's not really 100 Mpix; there's about 15-25 Mpix worth of detail there, but that's much, much more detail than the 10 Mpix scan.


Apologies, it does sound [1] like an absolutely perfectly shot 35mm frame can theoretically hold up to 20 million pixels' worth, but I've never seen this in practice in the vfx industry; it's almost always less than the ~3 million pixels that 2048x1556 provides.

http://pic.templetons.com/brad/photo/pixels.html


Some of those classic movies were shot on 65mm and 70mm. Others may have simply been upscanned, which will yield a sharper looking image nonetheless.


The Hateful Eight had a little media blitz when it first came out because you could watch it in a number of theaters that still do 70mm, and I think Tarantino was the reason.


Even if the film theoretically has a higher "pixel" density, the projector smudges it out by not holding the frame perfectly still. Digital is sharp because the pixel raster does not move. Even the best cinema projectors wobble the individual pictures around.


Not just the resolution, but the contrast and gamut too (although the gamut in cinema projection is still much better than the more usual BT.709).


Film has a lot of noise. The resolution is not as high as you are imagining.

I've seen both, digital is better.


> Film has a lot of noise.

Film has grain, which is non-chroma, signal-dependent noise, and it is not unpleasant the way the chromatic, signal-independent sensor noise or shot noise you get in digital photography is (not in movies; they have good lighting there).

Grain is actually useful: it's aesthetically pleasing in small amounts and can increase apparent sharpness. It's so useful that digital productions fake a small amount of grain back in during post-production.

> The resolution is not as high as you are imagining.

The resolution of 4k projection is not as high as you are imagining either. It's just 4096x2160 pixels (often less, depending on aspect ratio), or 8.8 megapixels.

In the best circumstances you can get 25 Mpix of useful data from a 35mm frame shot on low-grain stocks like 5203. In the worst circumstances you get 10 Mpix. This is 35mm; The Hateful Eight was shot on 65mm stock, which has a camera frame surface area 3.44 times that of 35mm film.
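
Back-of-the-envelope for the figures above, if anyone wants to check the arithmetic (the aperture dimensions are standard approximate figures I'm assuming, not something from this thread):

  # Pixel counts quoted in this thread
  dci_4k = 4096 * 2160    # full 4K DCI container
  scan_2k = 2048 * 1556   # the full-aperture 2K scan mentioned upthread
  print(f"4K DCI:  {dci_4k / 1e6:.1f} Mpix")   # ~8.8
  print(f"2K scan: {scan_2k / 1e6:.1f} Mpix")  # ~3.2

  # Approximate standard camera apertures (assumed figures, in mm):
  # 35mm Academy ~21.95 x 16.00, 65mm 5-perf ~52.48 x 23.01
  area_35mm = 21.95 * 16.00
  area_65mm = 52.48 * 23.01
  print(f"65mm / 35mm area ratio: {area_65mm / area_35mm:.2f}")  # ~3.44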

Either way, the resolution of 4k digital cinema is fine; what I miss more is the old contrast and gamut.


Maybe you don't find grain to be unpleasant, but I certainly do. I tolerate it when the source had it and there is no choice, but I certainly don't like it.

And no, it's not aesthetically pleasing in any amount. Directors that add it back artificially should be criticized. It looks like there is a swarm of gnats flying across the screen, and unless there are a ton of bugs in the scene it doesn't belong.

> This is 35mm; The Hateful Eight was shot on 65mm stock, which has a camera frame surface area 3.44 times that of 35mm film.

It won't stay that way by the time you watch it: it will have been copied multiple times, and it's on 35mm again. The effective resolution as seen in the actual theater does not beat 4k, and you have additional noise, which lowers the resolution even more.

The main thing, though, is that noise is worse than lack of resolution. Depending on how far you are from the screen, more resolution is meaningless - but noise (like film grain) is always a problem.

> I miss more the old contrast and gamut.

I suspect you are missing what never was. Digital is better on both.


> It won't stay that way by the time you watch it: it will have been copied multiple times, and it's on 35mm again.

No, it's not on 35mm again. I watched the 70mm projection of The Hateful Eight. At no intermediate point was the film on 35mm.

The intermediate film stocks and the print stock are very special low-sensitivity stocks (Kodak 2383 has ISO 1.6, Kodak 5302 has ISO 6) that have no discernible grain whatsoever. You can't see the Kodak 2383 grain with a grain focuser. All the grain comes from the original negative. You do lose some spatial resolution with multiple copies, but you don't add grain. I'll note that modern print film like Kodak 2383 can resolve 550 lines per mm, and contact printing has a minimal loss of quality due to imperfect optics. Granted, a showprint will always be better than a release print, but the copying process is much better than you give it credit for.

> noise (like film grain) is always a problem

Not really; dithering is essential for all video processing. Your very clean image is actually full of noise. Without dithering, videos (and movies) would look bad.

> I suspect you are missing what never was. Digital is better on both.

Absolute nonsense. Kodak 2383 has a D-max of 4, giving a contrast of 10000:1. Kodak Vision Premier Color has a D-max of 5.5, giving a contrast of about 316000:1. Yes, not a typo, roughly one-third of a million to one. There is no other technology that can achieve these deep blacks. Digital projection doesn't come anywhere close; the best projectors today (the ones that you won't find in any run-of-the-mill theatre) can barely do 2000:1.
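
For anyone checking the arithmetic: optical density is the log10 of the attenuation, so the contrast ratio follows directly from D-max. A minimal sketch, nothing more:

  # Density D = log10(1 / transmittance), so contrast ratio = 10 ** D_max
  for stock, d_max in [("Kodak 2383", 4.0), ("Vision Premier Color", 5.5)]:
      contrast = 10 ** d_max
      print(f"{stock}: D-max {d_max} -> {contrast:,.0f}:1")
  # Kodak 2383: 10,000:1; Vision Premier Color: ~316,228:1 (the "one-third of a million")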

DCI-P3, the color space used in digital cinema, was modeled after the print stock gamut, but because of technological limitations, it's not identical. Cyans and yellows have a higher chroma on film. Green on film has a higher chroma at the expense of luminosity. Other colors are pretty much the same.

The only way for digital projection to have a higher gamut is to switch to laser projection AND to retire DCI-P3. You need a green laser to produce more saturated greens.

Digital cameras are capable of recording a larger gamut than film, but unfortunately there is no technology to display these colors outside laser projection, and you need to switch the whole movie industry to a new color space.


Grain is not noise. Digital is cleaner because your pictures and movies go through tons of filters before you actually get them on screen.

If you go down to the microscopic level, film goes way beyond the resolution you can ever hope to achieve with the best digital cameras out there. It's just not something that is expressed in pixels.


Grain is in fact noise. Noise: Things that are not in the actual signal.

> If you go down to the microscopic level film goes way beyond the resolution you can ever hope to achieve with the best digital cameras out there.

That simply isn't true. At the microscopic level film pixels look like "clouds" of colored ink. The pixels are not square, but they are there, and they are not the size of atoms, but much larger - you can see them in a typical microscope (I know, because I tried it).

Digital is cleaner not because of filters, but because the actual capturing technology produces a cleaner signal.

A couple of years ago there was still an argument, but these days the argument is over. Digital clearly won.


> At the microscopic level film pixels look like "clouds" of colored ink.

No, only color reversal film (slides) looks like clouds. Negative film, the one used in movies, looks like crystals (because they are crystals).

> actual capturing technology produces a cleaner signal

This is true, although for a film like Kodak 5203 or Kodak 5213 you'd have a hard time proving it. These stocks are so smooth with good exposure and proper development (as is done when making movies) that you need a grain focuser to see the grain. Many blockbusters and big productions that you probably saw in a digital cinema were shot on this film, and you never knew.

> Digital clearly won

Digital is much cheaper, it's very good, and it allows much easier intermediate processing (editing, vfx, color timing, etc.). Of course it "won", although film is still widely used and archival copies are still done on film.

The fact that digital won doesn't mean that projection has better resolution (it's roughly the same for 35mm, but worse for larger formats) or better contrast (it's far, far worse for digital) or better colors (film has more saturated cyan and yellow, although this doesn't matter much). It sure has less flicker though.

There are still plenty of reasons why people shoot film, use a digital intermediate workflow, and print film today.


> A couple of years ago there was still an argument, but these days the argument is over. Digital clearly won.

You are using survivorship bias to prove your point. My point is that digital is inferior in SOME aspects to film, and it still stands. The fact that digital won has nothing to do with the facts to be considered when doing a comparison.


Which has been irrelevant ever since the late 90s, since 99.9% of films since then have been digitally scanned from film (usually in 2K, sometimes in 4K), edited, and then printed back to film from the digital intermediate.


Some first- and second-generation DCinema projectors are only 2K, and these can still be found in operation. No, you probably won't notice... ;)


  Modern DRM schemes like AACS 
  employ methods to change keys 
  for new works. So when crackers 
  find a key that allows them to 
  circumvent the DRM scheme they 
  can't publish it or it would 
  soon get useless and they would 
  have to do the work to extract 
  a key again. 
Okay, just think about this for a minute.

Who would EVER care about a cipher key, when YOU STILL NEED TO EXPOSE THE RAW PIXEL RASTER, AND AUDIO CHANNELS TO THE END USER, IN ORDER FOR ANYTHING TO BE VALUABLE AT ALL?

Users will always eventually get the whole thing in the clear, in straight-up plaintext, one way or another. And all anyone needs is a buffer big enough to capture it, and it's trivial to assemble one.

You need to be able to watch a movie with the naked eye, and hear the sound with your ears. That's how movies work.

It's trivial to capture the raw data, and people have been living with NTSC quality picture and sound for decades.

This is not about perfectly matching the SHA256 hashes of the original MPEG artifacts. People just want a copy, and it's easy to skim one, one way (cracking) or another (brute-force direct copies of the image frames and pulse-code samples at the signal source).


Yes, of course, but it's a lossy copy since you're de-compressing and then re-compressing the content. (At 4k is that as important? I dunno, I'm not a movie fanatic.)


I would not say a brute-force copy is necessarily lossy by default.

In theory, if the picture is revealed on a display on which the full pixel raster can be distinguished for every frame (in real time or slower), an optical capture can be performed that preserves the original fidelity of the image, and one need not recompress lossily for peers who have the capacity to receive the full duplicate.

All images are eventually revealed optically somehow. Such images may be captured directly, then corrected and restored to something close to the original quality, if you have good video equipment.

Only one good copy needs to make it into the open, and then the cat's out of the bag.



Here it comes. The next wave of propaganda, in favour of precogs. The concept of precrime will save us from ourselves.


It's scary that it's only a small step away from thoughtcrime. George Orwell was eerily prescient.


Tracking people's movements near a murder has what to do with thoughtcrime, exactly?


It's a small leap to combine panoptic surveillance with predictive policing (the latter has been used in large cities already), along with automated classifiers labeling "suspect" behaviour.

Note, for example, that if you train on a historic "known good" dataset containing more convictions of poor people of colour, it is rather likely that you would automate racial profiling... quite possibly mixing cause and effect in the training data. Consider further the implications of a company running prisons (paid per inmate) going into the business of selling city- or country-wide surveillance systems to "assist police".


I think some people know exactly what that situation involves and they are inviting it with open eyes.

When you want to stay clean, you have others do the dirty work.

The ones who volunteer for the dirty work are usually regarded as just as disposable as those getting "cleaned."

Those calling for volunteers view this as a transitional period, from which they will emerge, unscathed. Unscathed, why? Madness is not to be ruled out. Nor hubris.


Not useless. Just less useful.

It pushes the capacity for your adversary to operate into (slightly) slimmer confines, which is a marginal improvement.

Otherwise, your adversary can behave in an unrestrained manner. No?


One might also argue that giving a violent prisoner left-handed scissors will slow them down and is therefore a marginal improvement.


I can't argue with that: if I had to choose between the two, I'd rather be stabbed by the non-dominant hand. Obviously not being stabbed at all is a much better choice (when available).


I guess. For the kinds of clients who would want this device, this would just be like making it so that you're only a little bit pregnant. You either prevent the eavesdropping or you don't. The only safe tactic is not to bring the device anywhere you don't want to be tracked or recorded. Or to technologically prevent it from operating somehow in a way that doesn't rely on verification -- such as taking the battery out, if you can, or putting it in a verified Faraday cage that you keep in a soundproof box if you can't. And if you're the kind of person with a life so interesting they need to worry about phone implants, maybe take battery removability into consideration when you buy the device.


Jesus, I wasn't even going to read this article, but holy shit, am I glad I did.

  Using more precise sensors, the same MIT 
  researchers went on to develop systems that 
  can distinguish between different people 
  standing behind walls, and remotely
  monitor breathing and heart rates with 99 
  percent accuracy.

  A system called “WiKey” presented at a 
  conference last year could tell what keys 
  a user was pressing on a keyboard by 
  monitoring minute finger movements. 

  And a group of researchers led by a Berkeley 
  Ph.D. student presented technology at a 2014 
  conference that could “hear” what people were 
  saying by analyzing the distortions and 
  reflections in wi-fi signals created by their 
  moving mouths. 
Man, I was about to blow this off as yet another MAC address sob story. I totally was not anticipating Doppler effects and radar-style reflection/signal-strength analysis.

