
I don't see why that would disqualify you from what I said. Plenty of tech-savvy people are hung up on 24fps or vinyl, too. It can happen to anyone.


>Plenty of tech-savvy people are hung up on 24fps or vinyl, too. It can happen to anyone.

Maybe take the opposite reading of this observation? E.g. that those tech-savvy people know what they are talking about, and are not merely nostalgic or whatever?

First, vinyl vs mp3/WAV is not the same issue as 24fps vs higher frame rates, and conflating distinct technical issues (each with its own characteristics and tradeoffs) doesn't really illuminate the subject matter.

Some things are not merely an issue of "technological capability" but tied to human physiology (the eye, etc).

In the same way, a screen with 50,000 lumens is not naively "brighter = better" but blinding, and 150dB out of a headphone is not "louder = better" but physically damaging to the ear.

Yeah, 150dB dynamic range is great, but nobody can hear it, and nobody can tolerate the upper range of volumes it takes to reproduce the full range on a speaker.

With color, on the other hand, it's not like that. The more, the better (24bit, 32bit, 64bit, etc).

With fps, again, there are issues related to the eye, how the "afterimage" works, when visual information becomes too distracting or overbearing, etc.

Note also that we, for example, have been technically lowering the resolution of photographs (smoothing the skin) to make portraits appear more pleasant since forever. It's one of the main staples in fashion and magazine portrait photography.

And that's not because "some old-fashioned people like smoother skin vs detailed skin". It's inherently better looking unless one likes wrinkles, pores and detailed nose hairs.
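To make the smoothing concrete, here's a crude Pillow sketch (the file name, blur radius and blend amount are made-up placeholders; real retouching is far more selective, masked to skin areas):

    # Crude sketch of "smoothing": blend a blurred copy back into the
    # original so fine skin texture is softened while edges mostly survive.
    from PIL import Image, ImageFilter

    portrait = Image.open("portrait.jpg")          # placeholder file name
    blurred = portrait.filter(ImageFilter.GaussianBlur(radius=3))
    smoothed = Image.blend(portrait, blurred, alpha=0.5)  # 0 = original, 1 = fully blurred
    smoothed.save("portrait_smoothed.jpg")

The point is the principle, not the tool: detail is thrown away on purpose because the result looks better.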


I'm not exactly clueless myself. I've yet to see an argument for lower-fidelity technology that is remotely convincing.

Reality is basically infinite fps. We all get by just fine, and the general consensus is that the effect is quite pleasing as long as the subject matter itself is.

Smoothing skin in photographs is entirely different. That's a selective effect applied before it reaches the display. Using higher resolution doesn't somehow force photographers not to airbrush.

As another commenter said, sometimes people deliberately decrease fidelity for effect, like filming in black and white after color was available. Doing it deliberately in a specific context for one work is totally different from a blanket declaration that 8k is fundamentally worse than 4k.


>I'm not exactly clueless myself. I've yet to see an argument for lower-fidelity technology that is remotely convincing. Reality is basically infinite fps.

Extreme close ups, action sequences, camera panning and traveling are not part of reality. Nor do we change camera angles several times per minute in a discontinuous fashion. Cinema is not a full real world simulation, it is a technical way to tell stories.


Sure, reality has extreme close-ups, action sequences, and panning. I'm not sure what traveling refers to, since the obvious interpretation is absurd. I don't get the argument about changing camera angles. If our eyes really do need 1/24th of a second to adjust to a cut, then a high-fps film can replicate that with multiple blank frames.
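A toy sketch of that blank-frames idea (the frame lists and gap length are invented for illustration, not a real editing technique):

    # Toy sketch: at high fps, insert a few black frames at each cut so the
    # eye gets a brief "reset" approximating the old ~1/24s transition.
    BLACK = "black"  # stand-in for a black frame

    def join_with_cut_gaps(shots, gap=3):  # 3 frames at 120fps ~= 1/40 s
        out = []
        for i, shot in enumerate(shots):
            out.extend(shot)
            if i < len(shots) - 1:         # no gap after the final shot
                out.extend([BLACK] * gap)
        return out

    print(join_with_cut_gaps([["a1", "a2"], ["b1", "b2"]]))
    # ['a1', 'a2', 'black', 'black', 'black', 'b1', 'b2']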

I get that cinema isn't reality, but the more capable the medium is, the more choices the filmmaker has for telling their story. I can buy that 24fps might be the best choice sometimes. I don't buy that it's the best choice for everything. It would be a crazy coincidence if a framerate chosen a century ago, due to technical limitations when dealing with cellulose, just happened to be the perfect framerate for cinema.


HFR and 8k require rewriting the rules of set design, makeup, lighting and post-production for non-documentary work, as they expose the artificiality of fictional narrative. Rather than feeling like you're in the movie, you feel like you're on the set, seeing all the fake backdrops, cables, and acne. HD (and 4k) had similar issues, but not to the extent that HFR and 8k do. Although both HFR and high resolutions work extremely well for sports and nature documentaries, where suspension of disbelief is unnecessary, at the moment both are worse for narrative filmmaking using current techniques.

The Hobbit (shot at 48fps, 5k, 3D) had mixed results with HFR. The scenes shot on location with less CGI looked awful (like low-budget, early-'80s BBC midday dramas), while the green-screen, CG-heavy scenes felt like being in a video game (mostly in a good way). Billy Lynn's Long Halftime Walk (shot at 120fps, 4k, 3D) looked absolutely horrible, like early HDCam home movies. It was impossible to get swept up in the movie, which made an OK script and good acting feel much worse than they actually were.

I do believe someone will crack the code on HFR, but it will first require the right source material (The Hobbit and Billy Lynn's Long Halftime Walk were not it). I'd suggest utopian sci-fi or something set in a sterile environment. But even beyond that, they have to figure out the lighting, makeup and set design (and the extra burden HFR and 8k put on post-production, especially for CGI/VFX).

One element that actually helps make a film feel cinematic is a slight softness to the image. The best-looking digital cinema uses on-camera filters and/or post-processing to achieve the look that comes naturally from film shot at 24fps.

Younger audiences who've grown up on HD and HFR video games are less bothered by the differences, but audiences usually don't really know what they want until they see it (one reason that early audience feedback is poison to the process).

Background note: 20 years' experience working in production and post-production.


I don't like the "feel cinematic" argument. It's rather circular: what feels cinematic is defined by what we're used to, so of course anything new won't feel the same. That's not an argument against the change, unless you want everything to stay static forever.

As for the rest, I don't doubt that it's hard, requires new techniques, isn't always the best choice, etc. But I don't buy this idea that it's always worse. Which doesn't seem to be the argument you're making, but it is the one I was responding to.

I'm getting a lot of good arguments about why certain videos should be shot using less than the maximum possible. But that's quite different from saying 8k and HFR are just plain worse.


> With fps, again, there are issues related to the eye, how the "afterimage" works, when visual information becomes too distracting or overbearing, etc.

There is no physiological phenomenon that makes low frame rates more natural or appealing. The real world doesn't have a frame rate. The preference for low frame rate is learned. No one ever selected 24 fps because it was better. They selected it because it was technically feasible at a reasonable cost and crossed the line into acceptable fps.

> Note also that we, for example, have been technically lowering the resolution of photographs (smoothing the skin) to make portraits appear more pleasant since forever. It's one of the main staples in fashion and magazine portrait photography.

This is wildly different. Firstly because most things on screen are not human skin and secondly because we've been smoothing skin in real life for centuries with makeup. The desire to see skin as smooth and flawless doesn't mean that people generally want the world to be blurry.


With higher FPS I feel like I see more of the flaws in the acting, like I'm watching theater. I think low FPS improves the experience by letting our brains fill in some of the gaps.


>Firstly because most things on screen are not human skin

Most, no, just the most important (actors' faces).


Depends on the movie. Pretty sure pores weren't problematic for Avatar, or March of the Penguins, or many other movies. For the majority of movies that do focus primarily on humans, there are typically still scenes (e.g. outdoor panoramas) that clearly benefit from high resolution. And a competent film crew should be able to soften the look if it is appropriate. It's slightly absurd to suggest that 4K is the max resolution we should allow because some films might be diminished by the use of 8k.


It takes more than just a competent film crew to adjust for 8k and HDR. New lighting, makeup and set design techniques have to make up for the higher fidelity. Suddenly the line where makeup is applied is highly visible, as happened during the switchover to HD (which led makeup artists to switch to airbrushing in the early days). Set design needs to look even more real than before (if you've ever been on a set, you probably noticed how fake everything looks, yet on screen the flaws disappear). This will take time and experimentation.

And money, money that the industry really doesn't seem willing to spend yet, as it has already spent tons moving over to a 4k pipeline. It'll take a blockbuster success from someone like Cameron releasing an 8k HDR film for the industry to even seriously entertain the idea (and they'll be more gun-shy since the 3D push wasn't as successful as they hoped). And unless annual attendance shrinks drastically, they have little reason to take on the extra expense (and theater owners won't want to bear the cost of the upgrade, since they're still working on the upgrade to 4k).

Douglas Trumbull, a pioneer of cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while maybe the close-up shots of the actors are 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.

All this is to say, it is much more complicated than you make it out to be.


> It takes more than just a competent film crew to adjust for 8k and HDR....

Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.

> Douglas Trumbull, a pioneer of cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while maybe the close-up shots of the actors are 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.

Sounds interesting, and promising, though I agree that it seems unlikely to be widespread anytime soon, even if it works beautifully.

> All this is to say, it is much more complicated than you make it out to be.

That's an odd comment to end on. At no point did I ever say it was simple. I said it's absurd to treat 4k like it's the pinnacle and anything beyond that is somehow actually a loss.


> All this is to say, it is much more complicated than you make it out to be.

Sorry, I think this was meant for another comment, not yours, as I don't see the sentence I thought I was responding to in yours.

>Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.

This takes more than a "competent film crew". A competent film crew should have no problem working within well-established techniques and workflows, but wouldn't necessarily be prepared to venture outside them. If I were directing a production in 8k, HDR, VR, 3D or any other edge case, I'd want more than a competent film crew. I'd want creative thinkers and problem solvers. I'd want crew members with experience on a wide range of projects, everything from digital video to IMAX (you might be surprised at how often even the crews of big-budget productions have limited experience outside the status quo).

In the early days of the RED camera, the best footage came from cinematographers who had worked on lower-budget HD productions, not film cinematographers. The HD crews had already been working in similar workflows, but the competent film crews were flummoxed by this one piece of equipment: even though they could see the results on set, they would still send back footage that was way underexposed and often unusable (and this was often from very well-respected and experienced cinematographers).

I suspect few people in the industry believe that 4k is the pinnacle. But most do believe that the technology to move to 8k is not even close to ready or worth the added cost, and that workflows for 4k are only now becoming standard (the majority of projects are still finished in 2k, although that will change with distributors like Netflix now requiring 4k delivery). And audiences won't care about anything beyond 4k enough to pay extra. Are you ready to pay extra for an 8k screening? The theaters have to recoup the cost of new projectors while they're still paying off the brand-new 4k installs. Oh, and there aren't many cinema lenses that can cover an 8k image (especially since many DPs prefer the quality of older lenses).

There are old-timers who lament the loss of film and resist moving from 24fps. But they'll be replaced by a younger generation that is more open to experimentation and to pushing the medium beyond its limits. The industry is driven first and foremost by profits, so once the pencil pushers see profit in 8k and HDR, the whole industry will move in that direction.


In fairness, I did not say that the transition to 8k was easy or that any "competent film crew" could make a beautiful 8k film or leverage 8k to its max. I said that a competent film crew could soften the look if appropriate. The lazy fix for "too sharp" is to just scale down to 4k or throw a slight blur at the frames. A slightly less lazy fix would be to use lenses that yield a softer focus (which I assume some of the insanely expensive film lenses can deliver).

But I wasn't saying it's trivial to leverage 8k well, just that if 8k is too much to deal with in some cases, effectively reducing the resolution seems a tractable problem.
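For the "lazy fix" flavor, a minimal per-frame sketch with Pillow (file names and dimensions are placeholder assumptions, not a real production pipeline):

    # Minimal sketch of the lazy fixes: downscale an 8k frame to 4k, or
    # apply a slight blur while keeping the 8k resolution.
    from PIL import Image, ImageFilter

    frame = Image.open("frame_8k.png")                         # 7680x4320 source
    down = frame.resize((3840, 2160), Image.LANCZOS)           # scale to 4k
    soft = frame.filter(ImageFilter.GaussianBlur(radius=1.5))  # slight blur
    down.save("frame_4k.png")
    soft.save("frame_8k_soft.png")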


What makes it difficult to mix frame rates and resolution? I would naively assume that you could just record each scene with whatever framerate and resolution you wanted, combine the whole thing into a single movie with the max resolution and framerate you used out of the bunch, and be done. For example, if you put 24fps material into a 48fps video, it still displays at 24fps.

Edit: since it's an ongoing theme in this discussion, I should point out that my "just record each scene" description is from a technical perspective only. Making it result in a nice-looking work of art is, of course, another matter entirely.


Current playback technologies only play back one frame rate. Yes, you can mix frame rates in an edit, but they will be converted to the master frame rate of the timeline (so 24fps footage would be converted to play back at 48fps, not its native frame rate). Sometimes this looks fine, other times it can cause image problems.

And it's only recently that mixing frame rates in the same timeline has worked well. Five to ten years ago, we would have to convert the footage, typically using hardware specifically built for conversion (Teranex or Alchemist). Then came desktop software that could do a decent job of converting. Now, if I'm cutting in Premiere or FCPX, I can just drop the footage in and the software will take care of it, usually without issue (Avid still has problems with non-native frame rates, and it's recommended to convert before importing the media).
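For integer ratios, the conform is just frame duplication. A toy sketch (made-up frame lists, not any NLE's actual behavior; the hard cases are the non-integer ratios):

    # Toy sketch: conforming 24fps footage to a 48fps master timeline by
    # duplicating frames. Only exact when the master rate is an integer
    # multiple of the source; 24 -> 29.97 needs pulldown or interpolation,
    # which is where the image problems come from.
    def conform(frames, src_fps, master_fps):
        assert master_fps % src_fps == 0
        repeat = master_fps // src_fps
        return [f for f in frames for _ in range(repeat)]

    clip = ["A", "B", "C"]           # 3 frames = 1/8 s at 24fps
    print(conform(clip, 24, 48))     # ['A', 'A', 'B', 'B', 'C', 'C'], still 1/8 s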

And it's been this way since playback frame rates were standardized (and automated). Projectors had no way of changing playback speed on the fly depending on which frames were projected. Television was locked into one broadcast frame-rate spec (29.97 in N. America, 25 in Europe), and TVs were locked into one of those specs. Tape and disc playback was typically locked into one in the early days; DVD eventually allowed for multiple playback options, as does Blu-ray, but the hardware typically converted them for playback on 29.97 screens (pre-HD). We're still limited to what the screen can play back, to some extent: the broadcast specs of 23.976, 24, 25, 29.97, 30, 50, 59.94 and 60.
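(Side note on the odd-looking numbers in that list: the fractional rates are the integer rates scaled by 1000/1001, a legacy of NTSC color timing. A quick check:)

    # The fractional broadcast rates are the integer rates * 1000/1001.
    for n in (24, 30, 60):
        print(n, "->", round(n * 1000 / 1001, 3))   # 23.976, 29.97, 59.94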

With computers and monitors we have the capability to play back multiple frame rates, if the software allows it, and that is where the current issue is. I can play back QuickTime files with different frame rates on the same screen, at the same time, without issue. But there is no software (that I know of) to create or play back a single video consisting of multiple frame rates. Game engines might be able to change playback on the fly, but I have zero knowledge of that tech.

And it would be advantageous to have tech that allowed switching frame rates on the fly during playback. I'm currently consulting on a documentary that uses source footage in at least 3 frame rates (24, 25 and 29.97). And the editor is cutting in Avid, so we have to convert before import, which slows down the creative process and adds complications to the finishing process.



