slightly off topic, but on the topic of AI coding agents making up APIs and features that don't exist: I've had good success with Q by telling it to “check the sources to make sure the APIs actually exist”. sometimes it will even request to read/decompile (Java) sources and run grep and find commands to find out what methods the API actually contains
> This grain, formed from tiny particles during the film’s development, is more than just a visual effect. It plays a key role in storytelling by enhancing the film’s depth and contributing to its realism.
I never understood the “grain = realism” thing. my real eyes don’t have grain. I do appreciate the role of grain as an artistic tool though, so this is still cool tech
The article points out the masking effect of grain, which hides the fake-looking compression artifacts, and also the familiarity/nostalgia aspect. But I will offer an additional explanation.
Look around you: nearly all surfaces have some kind of fine texture and are not visually uniform. When this is recorded as video, the fine texture is diminished due to things like camera optics, limited resolution, and compression smoothing. Film grain supplies some of the high frequency visual stimulus that was lost.
Our eyes and brains like that high frequency stimulation and aren't choosy about whether the exact noise pattern from the original scene is reproduced. That's why the x265 video encoder (which doesn't have grain synthesis since it produces H.265 video) has a psy-rd parameter that basically says, "try to keep the compressed video as 'energetic' as the original, even if the energy isn't in the exact same spot", and even a psy-rdoq parameter that says, "prefer higher 'energy' in general". These parameters can be adjusted to make a compressed video look better without needing to store more data.
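For the curious, those knobs look roughly like this (untested sketch, not from the article; assumes an ffmpeg build with libx265 on PATH, and the filenames and parameter values are placeholders, not tuned recommendations):

    import subprocess

    # Re-encode a clip with libx265, nudging the psychovisual options described
    # above: psy-rd trades bits for preserving the source's "energy", and
    # psy-rdoq biases quantization toward keeping more of it.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "libx265", "-crf", "20",
        "-x265-params", "psy-rd=2.0:psy-rdoq=1.0",
        "-c:a", "copy",
        "output.mkv",
    ], check=True)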
They definitely do at night when it's dark out. There's a kind of "sparkling" or "static" that comes in faint light.
Fortunately, our eyes have way better sensitivity than cameras. But the "realism" just comes from how it was captured using the technology of the day. It's no different from phonograph hiss or the way a CRT signal blurs. The idea is to be "real" to the technology that the filmmaker used, and the way they knew their movie would be seen.
It's the same way Van Gogh's brush strokes were real to his paintings. You wouldn't want his oil paintings sanded down to become flat. It's the reality of the original medium. And so even when we have a digital print of the film, we want to retain as much of the reality of the original as we can.
Your Van Gogh analogy makes sense for old movies. It doesn't quite explain why we're still adding grain to new movies, except for those few which are purposefully evoking older movies.
Even modern cameras have grain. If you need to integrate your scene with motion graphics, background replacement, or vfx, you'll need to remove grain on part of the image, edit it, add the original grain back where possible and synthesize new grain elsewhere.
Often it can also make sense to modify the grain for aesthetics. Denoising usually produces a less detailed result, but what you can do is denoise only the color channels, not the brightness channel. Brightness noise looks normal to us, while color noise tends to look very artificial. But by keeping the brightness noise, you avoid losing detail to the denoiser.
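In OpenCV terms the idea is something like this (my own rough sketch, not production grade; the filename and filter strength are made up for illustration):

    import cv2

    frame = cv2.imread("noisy_frame.png")                  # BGR, 8-bit
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    y, cr, cb = cv2.split(ycrcb)

    # Denoise only the two color channels; leave the brightness (Y) channel
    # untouched so the detail carried alongside luma noise survives.
    cr = cv2.fastNlMeansDenoising(cr, h=10)
    cb = cv2.fastNlMeansDenoising(cb, h=10)

    out = cv2.cvtColor(cv2.merge([y, cr, cb]), cv2.COLOR_YCrCb2BGR)
    cv2.imwrite("chroma_denoised_frame.png", out)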
We don't usually use obvious grain. When we do, it's generally precisely to evoke something about the past -- flashbacks, a period look, a sense of grittiness like 70s movies, etc.
On the other hand, a small amount of constant grain or noise is often intentionally introduced, because otherwise images feel too static and end up looking almost fake. Similarly, dithering is intentionally added to audio, for example when mastering CDs or tracks. It helps prevent artifacts in both video and audio.
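The audio version, very roughly (toy sketch with made-up numbers, just to show where the dither goes in the chain):

    import numpy as np

    rng = np.random.default_rng(0)
    sr = 44100
    t = np.arange(sr) / sr
    signal = 0.25 * np.sin(2 * np.pi * 440 * t)        # float signal in [-1, 1]

    # Add about one LSB of triangular (TPDF) dither before quantizing to 16-bit,
    # trading a touch of noise for freedom from quantization-distortion artifacts.
    lsb = 1.0 / 32768.0
    tpdf = (rng.random(signal.size) - rng.random(signal.size)) * lsb
    dithered = np.clip(signal + tpdf, -1.0, 1.0 - lsb)
    pcm16 = np.round(dithered * 32768.0).astype(np.int16)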
...what makes the frame rate thing especially weird is how many TVs ship with motion smoothing enabled by default, which looks so much worse than actual high frame rate content would.
I don't like it at all. But someone must, or the TV manufacturers wouldn't do it.
Part of me thinks that it's only the people who know what "frame rate" means who prefer low frame rates, and a majority of the general public actually prefers high frame rates but lacks the terminology or knowledge to express this desire.
People are always trying to rationalize and justify aesthetic preferences. The depth and nuance of your understanding of a thing will change how you perceive variations of that thing, whether it's guitar tonewoods, styles of music, types of paint, flavors of beer, or the grain in film. If you know a lot about a subject, you can tell a lot about the history of a given piece, and that's going to change how you feel about it.
A child watching a Buster Keaton skit and gasping and giggling and enjoying it is going to have a different subjective aesthetic experience of the media than a film critic who knows exactly what type of film and camera were used, what all the different abstractions imply about the scene, the fabric of Keaton's costume, and so on, and so forth.
Subjective aesthetic preferences are in the realm of cognition - once we have a formal theory of intelligence mapped to the human brain, all of these subjective phenomena collapse into individualized data processing and initial conditions.
There's something about film grain contrasted against clean cel animation which might make it easier for people to suspend disbelief. They are conditioned to think that absence of grain is associated with unreal animation, particular types of media, and CGI. Home video and news and so forth had grain and low quality, so grain gets correlated with "real". In my view, there's nothing deeper than that - we're the product of our times. In 40 years, media will have changed, and it may be that film grain is associated with surrealism, or edited out completely, as it's fundamentally noise.
The way I see it, grain makes the film look more detailed than it really is; it can also hide compression artefacts and blurriness.
I don't know the psychovisuals behind that. Maybe it adds some high frequencies that compression often washes out, or maybe it acts like some kind of dithering.
As for your eyes, I am pretty sure they do have grain (that's how quantum physics works); you just don't perceive it because your brain filters it out. But again, I don't know how it interacts with film grain.
A video signal without noise or grain is annoying to watch, as it makes everything in the “out of focus” zone look smoothly blurry. Your eyes want to focus, yet it is an illusion of depth without actual depth. Noise texture emphasizes that this is just a 2D plane after all, so your eyes can rest and the viewer doesn’t feel like they need glasses. This is just my theory of it based on observation. No research behind it.
This reminds me of modern windows having fake panes. They’re just strips that are applied to give the impression that there are multiple smaller panes, because people are used to that and it feels “correct”.
I have to imagine past glassmakers would have been absolutely enthralled by the ability we now have to make uniform, large sheets of glass, but here we are emulating the compromises they had to make because we are used to how it looks.
> They’re just strips that are applied to give the impression that there are multiple smaller panes, because people are used to that and it feels “correct”.
It is more than just 'feeling correct': windows and the various sub-elements that make them up can change the architectural proportions and how the building is perceived as a whole.
It is similar with columns: they're not just 'tall-and-narrow', but rather have certain proportions and shapes depending on the style and the aesthetic/feeling one wishes to convey.
I strongly doubt that multiple smaller panes would have ever become a common style if we could have always made large glass panes. This is a perfect example of people becoming very used to a style forced by a technological limitation that is emulated even after the limitation doesn't exist.
It used to be a bigger deal (when digital cameras started being used), since people felt like digital video didn't look real/as good - movies shot on film were generally better looking (as crews were used to shooting with it and digital video wasn't as sophisticated as it is today) and HAD grain.
It might be that there is a large part of the population that still has that association.
Cinephiles are also more likely to watch older (i.e. grainy) movies that ARE well shot and beautiful (which is why they are classics and watched by cinephiles), and they don't see the bad film movies, only the cream of the crop, while being exposed to the whole gamut of quality when watching today's digitally shot movies. That would reinforce "grain = good" without it necessarily being the case - and their opinion might be heard more than the general population's.
At any rate, it can be a neat tool to lower sharpness!
23.976 fps has been put on a pedestal as the "correct" look. Just look at the reaction to The Hobbit. However, it does provide some objective advantages. 60 fps requires more lighting, since each frame gets a shorter exposure. Adding more lights means more electrical setup and more heat for actors in heavy makeup and costume. In post production, it's also more frames to edit.
23.976fps is only "correct" when telecining to a 59.94Hz format. An even 24fps is the "correct" rate in terms of actual filmmaking. I don't know of many films specifically shot at 23.976fps.
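The arithmetic, for anyone who hasn't stared at telecine before (just the standard NTSC numbers, nothing exotic):

    # NTSC-era rates are the round numbers divided by 1.001, and 2:3 pulldown
    # turns every 4 film frames into 10 video fields (5 interlaced frames).
    film = 24000 / 1001          # ~23.976 fps
    fields = 60000 / 1001        # ~59.94 fields per second

    print(round(film, 3), round(fields, 3))   # 23.976 59.94
    print(fields / film)                      # 2.5 fields per film frame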
It's funny, because I've always had a subtle visual snow that looks like a film grain on top of everything I see, and for a long time I thought everybody had it and that that's why artificial grain is added, to make pictures appear more realistic.
I've since learned that not everybody sees the world like I do, but I still do love to see grain and noise in pictures.
It's only RGB noise that I often find dreadfully ugly when looked at up close, which is a shame, since that is exactly what most color cameras produce.
film grain adds realism in the same way that high frame rate films look wrong, vinyl sounds "warmer", or tube guitar amps sound "better": it is what we are used to.
grain and 24fps and widescreen trigger certain contextual emotions around the movie-watching experience. remove them and your brain contextualizes the video very differently.
this is likely the result of ~100 years of film-based filmmaking and projection. hell, we still call it filmmaking.
Yes, it is only the result of familiarity. We could gradually increase the frame rate of new movies by 1 fps per year and no one would even notice; after 24 years, every new movie would be 48fps.
no i will be watching 24fps films for the remainder of my life, which may be 40 more years. if all new films went up 1fps per year i would still go out to the movies in 20 years and be like “wtf is this crap?”
it would take a generation or more to eradicate this cultural context. casablanca is never going to be in 48fps.
Where did I refuse to watch anything? No amount of this attitude is going to make Casablanca or The Godfather or Pulp Fiction or Infinity War or Star Wars or Fight Club or 2001 or Top Gun be in any framerate other than 24fps.
If they start making films in higher frame rates today, that won’t change that fact. Do you think people are going to stop watching films from the 50s, 60s, 70s, 80s, 90s, and 2000s?
It would take a concerted effort, beginning now and lasting at least an entire generation (30-50 years), to remove or reduce the cultural impact of the 24-frames-per-second widescreen tradition. No such effort is presently underway, because higher framerates don’t help movies’ visual storytelling, and they may hurt it.
The Hobbit was Peter Jackson’s attempt. It went nowhere.
"no i will be watching 24fps films for the remainder of my life, which may be 40 more years. if all new films went up 1fps per year i would still go out to the movies in 20 years and be like “wtf is this crap?”"
The Hobbit looked great at 48fps. Higher FPS failed only because reactionary people like yourself irrationally rejected it because "it looks weird!".
You need to understand that 24fps is a compromise chosen to save film. It is the bare minimum frame rate to have smooth motion under most but not all circumstances. It really hurts action films because anything moving too fast has excessive motion blur.
Permanently rejecting faster frame rates for movies is like rejecting printed text in favor of handwritten manuscripts. You are rejecting a technological advance for extremely arbitrary reasons. If 23 or 25 or 30fps had become the standard you would be insisting that it was just as special.
That’s correct - because then we’d have 100 years of movies embedded in our collective cultural shared experiences that are 25 or 30fps.
The point is the cultural significance, not the specific framerate. The chosen framerate is now significant because of the body of work that has been done in that format, and the inextricable experience of that framerate with that body of work.
It is anything but arbitrary! It’s a very real thing. The tyranny of the installed base is a well-documented phenomenon. In this case the installed base is the fond memories of a few billion people. Pretending that they are all irrational for not wanting higher framerates is plainly objectively incorrect, as a point of fact.
Grain = realism because real captured grain isn't totally random noise. It's authentic noisy data. It's part of the captured scene. It adds subtle, tiny, but real detail to the scene. Unless I'm corrected here and real grain is also totally random noise.
Film grain can create stochastic resonance with the underlying ground truth. In practice, this can improve the perceived image quality over having none.
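A toy numeric version of that claim (my own sketch, not from any paper): detail smaller than one quantization step vanishes if you quantize directly, but survives on average if noise is added first.

    import numpy as np

    rng = np.random.default_rng(1)
    step = 1.0
    detail = 0.3 * np.sin(np.linspace(0, 2 * np.pi, 100))   # sub-step "ground truth"

    hard = np.round(detail / step) * step                    # quantize with no noise
    noisy = np.round((detail + rng.normal(0, 0.5, (2000, 100))) / step) * step

    print(np.abs(hard).max())                                # 0.0 -- the detail is gone
    print(np.corrcoef(detail, noisy.mean(axis=0))[0, 1])     # ~0.99 -- recovered on average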
Honestly, to me it reads like a solution looking for a problem. I’ve never considered non-deterministic imperfections that happen during the recording of a movie to be essential to storytelling.
not quite ads in LLMs, but I had an interesting experience with google maps the other day. the directions voice said "in 100 feet, turn left at the <Big Fast Food Chain>". Normally it would say "at the traffic light" or similar. And this wasn't some easy-to-miss hidden street, it was just a normal intersection. I can only hope they aren't changing the routes yet to make you drive by the highest bidder
I've had this done at a sufficient variety of different places that I don't think it's advertising.
I'm also not particularly convinced any advertisers would pay for "Hey, we're going to direct people to just drive by your establishment, in a context where they have other goals very front-and-center on their mind. We're not going to tell them about the menu or any specials or let you give any custom messages, just tell them to drive by." Advertisers would want more than just an ambient mentioning of their existence for money.
There are at least two major classes of people: those who take and give directions by road names, and those who take and give directions by landmarks. In cities, landmarks are also generally going to be buildings that have businesses in them. Before the GPS era, when I had to give directions to things like my high school grad party to people who may never have been to the location it was being held in, I would always give directions in both styles, because whichever style is dominant for you, it doesn't hurt to have the other style available to double-check the directions, especially in an era when they were non-interactive.
(Every one of us Ye Olde Fogeys has memories of trying to navigate by directions given by someone too familiar with how to get to the target location - directions that left out entire turns, got street names wrong, told you to "turn right" at a 5-way intersection that had two rights, or told you to turn onto a road whose sign was completely obscured by trees, and all sorts of other such fun. With GPS-based directions I still occasionally make wrong turns, but it's just not the same when the directions immediately update with a new route.)
Landmark-based directions rather than street names does seem like a plausible explanation. I still have some childhood friends whose street addresses I don’t know, but I know how to get to their houses.
I still prefer street names since those tend to be well signed (in my area anyway) and tend not to change, whereas the business on the corner might be different a few years from now.
I am still waiting for navigation software to divert your route to make sure you see that establishment. From your experience, it seems like we're close to that reality now.
"Continue driving on Thisandthat Avenue, and admire the happy, handsome people you see on your right, shopping at Vuvuzelas'R'Us, your place for anything airhorn!"
oof, I’m not sure if I’m proud or ashamed of having an idea in the “torment nexus”. I believe I heard of the idea in some of the discussion surrounding a patent from an automaker to use microphones in the car for a data source for targeted ads. Combine that with self driving cars and you could have a car that takes a sliiight detour to look at “points of interest”
Most users want the best directions possible from their maps app, and that includes easily recognizable landmarks, such as fast food restaurants.
"Turn left at McDonalds" is what a normal person would say if you asked for directions in a town you don't know. Or they could say "Turn left at McFritzberger street", but what use would that be for you?
Although I've had Google Maps say "Turn right after the pharmacy", and there's three drug stores in the intersection...
I had an old Tesla M40 12 GB lying around and figured I’d try it out with some 8-13B LLMs, but was disappointed to find that it’s around the same speed as my Mac mini M2. I suppose the Mac mini’s chip is 10 years newer, but it’s crazy that mobile today matches data center from 10 years ago
as I understand it, Discover gets to charge higher fees on debit cards (than e.g. Visa and MC) due to some regulatory carve-out, so Capital One wants to take advantage of this and of Discover’s payment network
Are you talking about the Durbin amendment? That's available to every bank with assets under $10B, not just Discover. If Capital One wanted higher swipe fees, there's hundreds of small banks that they can partner with to issue the cards. That's what many neobanks do.
However, Discover is also (largely) a three-party network, which is inherently exempt from the Durbin amendment; I believe at least Pulse is available to issuing banks other than Discover: https://www.congress.gov/crs-product/R41913
Yes. This is exactly the same special advantage that American Express also has. However, Amex lacks enough know-how and presence in retail banking to really be able to issue enough debit cards to leverage this the way Capital One could.
The effect of this is that merchants end up hating taking Amex since it costs more, and I foresee Discover (which is even more niche) ending up the same way - there will simply be no reason to bother accepting Discover at all. How many people only have a Discover debit card in their wallet and no other?
I only have a Discover debit card. Discover pays 1% back on its debit cards, which is unusual. I have idly wondered why they did that. The fact that they're not subject to the restrictions on debit fees makes it clear why.
I shop for name brand stuff at Winco, a grocery chain in the Northwest US that doesn't take credit cards and makes you bag your own groceries. I'll be curious to see if they stop taking Discover debit cards if the fees increase.
It does if they decommission most of them in the name of "efficiency". You may recall the same thing happening to Twitter at the behest of its new owner, who happens to be the new owner here as well.
this thread is helping me understand why I always thought 3D movies looked _less_ 3D than 2D movies.
That, plus seeing Avatar 1 in 3D and then seeing Avatar 2 in 3D over 10 years later without really noticing any improvement in the 3D, made me declare 3D movies officially dead (though I haven’t done side-by-side comparisons)