
As a non-film person, can someone explain what it means to create the same cinematic ‘look’ as Hollywood feature films? What is Blackmagic doing when recording video to make the video feel more professional?


Marketing aside, cinematic in this context means more or less "manual control".

One thing that makes a video look amateurish is the phone trying its best to prioritise a 'clear image', which means changing parameters mid-recording.

Now, this isn't bad, it's ideal for someone who doesn't want to lose the moment without worrying about choosing the right setting (imagine a parent recording their child's recital or soccer game). But the trade-off is that it looks choppy.

But if you're in a controlled environment, you can set a fixed exposure (balance between ISO, shutter speed and aperture), framerate, bit-depth, focus distance, colour temperature and microphone gain depending on your intent.
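For the curious, the trade-off between those exposure settings can be sketched with the standard exposure-value relation (EV at ISO 100 is log2(N²/t), with ISO shifting the result by whole stops). This is a toy illustration of the math, not any camera's actual API:

```python
import math

def exposure_value(aperture: float, shutter_s: float, iso: float = 100) -> float:
    """EV referenced to ISO 100: log2(N^2 / t), shifted by the ISO choice.

    Settings with the same EV admit the same total exposure, which is why
    you can trade aperture against shutter speed against ISO.
    """
    ev100 = math.log2(aperture ** 2 / shutter_s)
    return ev100 - math.log2(iso / 100)

# Opening the aperture one stop while halving the exposure time keeps
# the exposure (almost exactly) constant -- f-numbers are rounded:
wide = exposure_value(aperture=2.8, shutter_s=1/48, iso=400)
narrow = exposure_value(aperture=4.0, shutter_s=1/24, iso=400)
print(round(wide, 1), round(narrow, 1))  # both ~6.6
```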

As an example, imagine you want a high-contrast image with a dark silhouette of someone against a bright background like a sunset. The default phone camera app will try to guess whether you want to expose for the subject or the background, and will switch between the two unpredictably. With manual control, you can choose which one you want.


> Something that makes a video look amateurish [...] changing parameters mid-recording

A prime example of this is leaving autofocus on when you're moving about. There are many YouTubers who haven't yet learnt this lesson, and it can make the video unwatchable.


Yes. It’s very rare to see the focus change during a movie or TV show. The main exception is when the focus switches between two people talking, when their positions are known in advance and dialled in so there isn’t any visible hunting.


You can actually see focus changes quite often in movies and TV shows - but they're usually done intentionally to accentuate something, e.g. a focus pull from a foreground object to an actor in the background.

But it's a slow and smooth motion without any focus breathing, intended to highlight an object or an actor, not just autofocus hunting to find something.


This is probably an elementary question, but are those focal shifts still done manually, with a camera guy turning the ring by hand? Or do they set the two points in advance and hit a button to start a motorized transition?


Behind the scenes, look for when they are "marking," which is leaving a little piece of tape or similar on the ground where the actors are standing. The focus puller will make indications on their focus ring to match these; as long as the actor "hits their mark" the focus will be dead on. The majority of the time the operation is fully manual (though possibly remote from the camera).


That's how I remembered it too. And that's also then one of the major differences between this app and an actual professional camera - because on a smartphone you only have autofocus. Which works most of the time, but I've had some recordings of concerts with weird lighting, smoke and other stuff which were out of focus for quite a few seconds. One of the most stupid things is when you try to take a picture of a bird or airplane in flight and your smartphone can't focus on it because it's too small. Why can't it just default to focus to infinity if it can't find anything to focus on?


>because on a smartphone you only have autofocus

Focus can be controlled manually on the iPhone (in 3rd party apps).


Focus pullers at work in case anyone is interested:

https://yewtu.be/watch?v=5cfTuy6lNaM

https://yewtu.be/watch?v=ZlEp_s8yHYA


Kind of both? You've usually got a small motor connected to the lens that turns the focus ring.

A dedicated person, called a focus puller, has a remote with a wheel on the side. By turning this wheel the focus puller can remotely control the focus ring of the lens.

The remote usually allows the focus puller to set the maximum range of motion with A/B points. The system doesn't automatically execute the focus pull, but with the hard stops at the A/B points the focus puller can make sure they don't overshoot the target.


At that point it sounds easier to have the focus switch executed automatically with all relevant parameters preset, e.g. duration or curve. Sort of like CSS transition or MIDI automation.
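That automated A→B transition is easy to sketch. Here is a toy version (all names hypothetical, not from any real lens-control system) using a smoothstep easing curve, much like a CSS transition:

```python
def smoothstep(t: float) -> float:
    """Ease-in-out curve (the classic smoothstep), t in [0, 1]."""
    return t * t * (3 - 2 * t)

def focus_pull(a_m: float, b_m: float, duration_s: float, fps: int = 24):
    """Yield a focus distance (in metres) for each frame of an A->B pull."""
    frames = max(1, round(duration_s * fps))
    for i in range(frames + 1):
        t = i / frames
        yield a_m + (b_m - a_m) * smoothstep(t)

# A 1-second pull from 1.2 m to 4 m at 24 fps:
distances = list(focus_pull(1.2, 4.0, 1.0))
print(distances[0], distances[-1])  # starts exactly at A, ends exactly at B
```

The hard stops from the comment above fall out naturally: the curve can never overshoot A or B because smoothstep stays within [0, 1].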


Look at this video: https://www.youtube.com/watch?v=6AsGgR9oZak

Back then the focus puller would use a remote connected by a cable, which is why you see a focus puller running behind the steadicam op in that video.

That's why, even with A/B points set and marks set on the focus control, you still need a human.


There's a dedicated person, usually called a focus puller.


Having worked as a loader/2nd AC and getting thrown into the focus puller chair on some b-roll - focus is changing constantly. On a movie set, it’s pretty much an entire person’s job.


Does the focus change during a shot?


Unless the distance between the camera and the subject doesn't change at all, the focus will be actively changing - yes.

Depending on various conditions (lighting, lens choice, etc) there might be a very large distance range that is in focus - or it might just be a few inches. Even if the focus puller isn’t doing any big focus swings, they are likely making small adjustments.


It's not rare at all. It is just deliberate in movies/shows, not something algorithm on camera constantly fiddles with.


With “Cinematic” video mode on the iPhone you can edit focus in post with the iPhone Photos app. It does a good job for this two person talking scenario.


I’ve seen enough videos with otherwise high production values to make me suspect there is a valid trade-off to keeping autofocus on.


It depends if you have a cameraman or not. If you don't, and you're walking away from the camera, it's probably best to leave it on and hope it tracks you.


If Apple wanted to put an engineering team on solving this problem, they could record all the raw sensor data for the video, with the regular 'auto' settings, then, after the clip is recorded, decide what shutter speed, iso, etc to use, and then reprocess that raw data to simulate what that moment in time would have looked like with a different shutter speed.

I'm sure modern neural nets would do a decent job of simulating what a frame taken with one ISO/shutter/focus would look like with a slightly different ISO/shutter/focus.


First, I doubt users ask for this. Those who want it are going to use a manual videography app like the OP's. The other 99.9% want a camera that just works.

Second, modern neural nets are good, but not perfect. I can reliably tell if something was shot with real bokeh or simulated via software. For serious productions like a commercial shoot, nobody wants to change the shutter speed, aperture, etc. mid-shoot: the DP already knows what look they want before they start filming.


How could you change the shutter speed in post?


I think a neural network could do it. You just train it on a bunch of videos with different shutter speeds, and then you ask it to convert a given video from one speed to another.

I'm sure it would quickly learn to add/remove motion blur on moving things as appropriate.


But in addition to determining motion blur, shutter speed also massively affects which areas of the images are above/below the brightness range the sensor is capable of picking up.


Tangent that kind of seems relevant here¹:

Read the Foreword written by Gerald Sussman (SICP author) of the book The Little Schemer.

The most beautiful Foreword I have ever read so far.

¹(The Foreword talks about photography a little bit).


The TL;DR version of this is the progression from amateur to expert: 1/ controls are set wrong in the first place; 2/ the computer changes controls during the shot, but it's distracting and obvious; 3/ controls are set right in the first place and everything looks good and consistent; 4/ an expert modifies the controls mid-shot (and the shot requires this) and it looks awesome, because everything is changing in a way that lets the shooter's expertise shine through.


More than anything it's about color correction and color grading.

This video explains it nicely I think: https://www.youtube.com/watch?v=pAh83khT1no

If you are starting out with good data (e.g. a 32-bit EXR workflow), you would be amazed how powerfully and easily you can control what you want and what the possibilities are, with tools like Magic Bullet (which offers presets to get you the cinema look with just a mouse click). But if you work long enough in this area you can discover your own workflow and pull it off without these tools, e.g. playing with hue & saturation, white balance adjustments, the curves (introducing an S-curve, for example), color wheels, etc.


> More than anything it's about color correction and color grading.

to my (literal) perception, using a framerate of 24 frames per second is an even more significant requirement to get the "cinematic Hollywood look".


Isn't the 180 degree shutter angle more crucial than the distinction between 24 and 30?


Both. The 180 degree rule just makes sure motion blur looks as intended and is a mostly artistic choice that can vary depending on the scene. E.g. for action sequences or particularly smooth motion in a dreamy scene, you can break this rule. Or, if there's moving water, you might want to choose a particular shutter in relation to the preset frame rate.

The overall frame rate gives you the distinction between a typical movie vs a TV-style documentary. The overall frame rate stays fixed across a movie and should normally not be changed.


My artistic choice is shooting 60FPS at 360 degrees (shutter: 1/60th of a second). It gives motion blur more comparable to 30FPS (which is closer to 24FPS), with the responsiveness and fluidity of 60FPS.
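The shutter angle / shutter speed relation these comments rely on is simply t = (angle / 360) / fps. A quick sanity check of the two examples mentioned above:

```python
def shutter_time(fps: float, shutter_angle_deg: float) -> float:
    """Shutter open time in seconds: (angle / 360) / frame rate."""
    return (shutter_angle_deg / 360.0) / fps

# The classic 180-degree rule at 24 fps -> 1/48 s:
print(1 / shutter_time(24, 180))   # 48.0
# 60 fps at a 360-degree shutter -> 1/60 s:
print(1 / shutter_time(60, 360))   # 60.0
```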


Why is the 24 fps format still being used? I personally can't stand it. It's like watching a slide show.

I can't wait for Hollywood to move to 120fps or better.


For movies and TV, I absolutely prefer it -- as do most people in tests, which is why it continues to be dominant.

For whatever psychological reason, 24 fps "suggests" reality but without "being" reality, kind of like being in a dream, and our brains pay attention to story and action.

While 60+ fps "approaches" reality and it simply starts to feel both uncomfortably real and uncomfortably fake. Uncomfortably real because it feels too much like real-life and we don't have enough of a mental distinction between fantasy and reality, and uncomfortably fake because it looks like a bunch of actors acting and moving in ways that aren't the ways people act and move in real life. It's uncanny valley.

Nobody really knows why our brains respond this way psychologically. They just do.

So for fictional movie/TV content, higher fps is not better. 24/30 is chosen for a very good reason.

(On the other hand, news and sports do great with higher fps, because there's nothing fake trying to be passed off as real.)


Every now and then somebody makes a high frame rate movie and everybody complains it looks bad, so they don't do it again.


Reminds me of when I was playing games at 320x240 and then moved up to 1024x768. Suddenly everything started looking "basic", whereas at the lower resolution the brain could somewhat "fill in the blanks", so to speak, so it felt better.

I guess it is similar for higher frame rates - it just shows the shortcomings.

I think if the film industry committed to higher frame rates we would have seen massive improvements over the years.


IMO a 48 fps movie at 1/48 shutter speed looks just as dreamy as a 24fps movie at 1/48 shutter speed, but is much less stuttery.


I think 48 fps gains more detail but makes things look cheaper. The Hobbit looked like a BBC TV show compared to LOTR. That said, I don't pay enough attention to film these days so maybe people have gotten better at it and I'm watching 48 fps all the time.


The Hobbit had a 1/96 shutter, which is what made it feel like a TV show. The actual fps barely had an effect on its look.


I have heard it claimed that because historically high-budget hollywood films were shot on film at 24fps, while low-budget TV content was shot on tape at 30fps interlaced to 60fps, people came to think the lower framerate is "cinematic" and that higher framerates "don't look right"

Personally I'm not enough of a film buff to notice the difference. Apparently film enthusiasts do notice, and care a great deal, though.


It's a cheaper safer option to get something that looks "right". It's not so trivial to have 120 fps video look like a smoother 24 fps. Even capturing at 1/120 shutter speed it does look different. There's an experiment I want to do that involves taking 120 1/120 video and stacking windows of 3 frames to emulate shooting at 120 fps with 1/40 shutter speed.
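That stacking experiment is easy to prototype. A minimal sketch with NumPy, assuming the frames are already in linear light (averaging gamma-encoded frames wouldn't model a longer shutter correctly):

```python
import numpy as np

def stack_motion_blur(frames: np.ndarray, window: int = 3) -> np.ndarray:
    """Average a sliding window of frames to lengthen the effective shutter.

    For 120 fps footage shot at a 1/120 s shutter, window=3 approximates
    the blur of a 1/40 s shutter while keeping the 120 fps cadence.
    """
    n = len(frames) - window + 1
    # frames: (num_frames, height, width[, channels]) in linear light
    return np.stack([frames[i:i + window].mean(axis=0) for i in range(n)])

# Toy example: 10 single-pixel "frames" with a ramp of brightness.
clip = np.arange(10, dtype=float).reshape(10, 1, 1)
blurred = stack_motion_blur(clip)
print(blurred.shape)        # (8, 1, 1)
print(float(blurred[0]))    # 1.0  (mean of frames 0, 1, 2)
```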


I'm sure the people masking shit out frame by frame can't wait to do it in 120 FPS either!


They don't exactly have any job security. They will eventually be employed doing something else.


In this case it's software that treats the iPhone as one of their own cameras. Looking at the screenshots, the UI/UX is extremely similar to current Blackmagic cinema cameras. So you can have two camera operators, or the iPhone on a tripod or whatever, and each operator will know which settings their camera has and can match the two cameras: a quick visual check that both are at the same shutter speed or shutter angle, resolution, white balance and tint, with the same style of histogram so they can match exposure on both.

It's actually fairly neat and cool that they put time and money into this app to further their ecosystem. I guess there's a large overlap between people who film with iPhones and people who'd also buy a legitimately good budget cinema camera like the Pocket 4K/6K.

I don't know HN's opinion of Blackmagic, but they do some pretty cool stuff. With the purchase of a camera they include DaVinci Resolve, which is a fully featured rival to Adobe Premiere Pro. For reference, Premiere Pro is $21 a month, and the cheapest Blackmagic cinema camera is the Pocket 4K at $1200, so after about 5 years you have a free camera (that's still actively updated) if you consider Resolve equivalent to Premiere Pro. They've also constantly pushed the industry to be more affordable. They were pretty much the first to let you record raw formats to a consumer USB-C SSD: when the camera released, you could get 1TB Samsung T5s for around $100, while rival RED cameras made you buy a proprietary SSD that still costs $1500 for 480GB. And it wasn't unheard of for a cinema camera maker to charge thousands of dollars to unlock CinemaDNG raw or ProRes, yet Blackmagic cameras came with multiple raw recording options for free.
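The break-even arithmetic behind that five-year figure, using the prices quoted above:

```python
premiere_monthly = 21        # USD/month for Premiere Pro, as quoted above
pocket_4k_price = 1200       # USD for the Pocket 4K (Resolve included)

months_to_break_even = pocket_4k_price / premiere_monthly
print(round(months_to_break_even, 1))       # ~57.1 months
print(round(months_to_break_even / 12, 1))  # ~4.8 years
```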


I’ve been using the free version of Resolve for about a year now. It’s absolutely outstanding and well worth the steep learning curve (because of the massive functionality). Don’t buy a Premiere Pro subscription until you’ve tried it out. Apart from its technical excellence, there are zero dark patterns associated with free sign-up and use.


> With the purchase of a camera they include Davinci Resolve

That's also included with the speed editor, which is ~$400 and provides an awesome input device for Resolve.


1. They are giving you all the tools needed to work in a professional way in a professional setting. This includes many things like being able to set all the camera settings manually, good metering to avoid clipping the sensor, audio metering to avoid clipping the recorder, timecode synchronization with other cameras & audio recorders, LUT preview, etc.

2. The "cinematic look" comes from a combination of things:

- good lighting (using professional lights in most situations)

- 180 degree shutter angle at 24fps, or slow motion where appropriate

- careful and artistic color grading

- taking time to set up the scene in advance & good framing

- good lenses

- good camera sensors (mainly, high dynamic range)

- holding the camera still or moving it smoothly through the scene (except when deliberately not, as in for instance The Office)

- music

- and, more important than you'd think: very high quality audio (good mics, appropriately mic'd, low noise, dubbed in post if needed, SFX added)

3. In short, what creates the "cinematic look" is many factors (and, usually, people) coming together as a system. This app lets your phone be part of that system.

4. What makes this app unique: (1) it integrates directly with Davinci Resolve in a way that's probably more convenient than Filmic Pro for that workflow and (2) it's free.

People have been making films and TV shows on iPhones for years, so this is more of an incremental event in the industry.


No attempt at a real answer, but some hints from watching YouTube videos on the topic:

Lighting:

- Edge or back lighting, if dramatic

- Wraparound (cradle) lighting, for pleasantness

- Low key look for interiors (no white walls)

- Artificial light needs to be motivated as much as possible

Set design

- Add a banker's lamp for any money-related film, a normal table lamp for anything else

Lens

- Anamorphics to avoid the perspective distortion typical of spherical lenses, and also for the "rich depth of field" effect

- Surprisingly, the technically best lenses don't give the most "pleasing" (at least in "Hollywood" terms) image. They are even called "clinical" or too sharp. A lot of DPs like lenses with "character", although some artifacts are regarded as universally ugly (like longitudinal chromatic aberration, which pukes green and cyan fringes around the image)

Camera

- High dynamic range camera, no clipping of highlights or blacks (add light, if necessary)

- Must be able to retain true image details, any digital sharpening in the source footage immediately puts things off

Color grading:

- Good tone mapping: should look "good" in black and white, mostly solved with lighting

- Pleasing color palette: color harmonies, gradients in a good perceptual color space like Oklab. Mostly solved by set design, character and dress design

- Even saturation: the previous point should cover "nice colors", but saturation is one of the most overlooked aspects. An image can be highly or sparingly saturated, but too much variation within a single frame quickly makes for a garbage image. Also, one has to fight most software color manipulation tools, which tend to brighten up highly saturated parts when, in reality, they should go darker

That's a whole package of things, for a camera control specifically, typical operator or AC wants:

- Manual focus pull

- Way to judge "exposure", measured in IRE

- Some way to approximate highlight to shadow exposure ratio; 2:1 for "happy" look, 4:1 for dark, 5:1 or more for Batman

- Highlight clipping warning (especially important on talent's skin)

- Shutter angle control (typically 180 or 90 degrees), instead of the shutter time used in photography
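Those highlight-to-shadow ratios map directly to stops of light (log base 2), which is how a light meter would report them. A quick illustration:

```python
import math

def ratio_to_stops(key: float, fill: float) -> float:
    """Convert a highlight-to-shadow (key:fill) lighting ratio into stops."""
    return math.log2(key / fill)

print(ratio_to_stops(2, 1))            # 1.0 stop  -> "happy" look
print(ratio_to_stops(4, 1))            # 2.0 stops -> dark
print(round(ratio_to_stops(5, 1), 2))  # 2.32 stops -> Batman territory
```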


How do you reduce the severe oversharpening with iPhones?

Is there an app that can take footage without oversharpening?


I really don't know if that's even possible.

That's the main reason why cinema cameras are picked.

I suspect it could be possible now, to an extent. We have quite good image restoration tools, some based on neural networks. Maybe one could be trained for the iPhone specifically.


I'm a non-film person as well, but I've been playing with this a bit. One key ingredient of the cinema look is the shutter speed. The iPhone standard camera app is constantly adjusting the shutter speed and the ISO depending on how much light the camera is getting.

Movie cameras work differently: the frame rate is fixed at 24 fps and the shutter speed is locked to match (typically 1/48 s), except for some scenes with specific requirements (for example slow motion). The light is controlled using the ISO, the aperture, the lighting, and ND filters.

A nice trick people are using with smartphones to get the cinema look is to use an app like Blackmagic Camera, lock the frame rate at 24 fps with a fixed shutter, and mount a variable ND filter on the smartphone to control how much light reaches the sensor, since we can't control it with the aperture.
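To size the variable ND for that trick, the filter has to absorb the difference (in stops) between the shutter the meter would pick and the locked cinematic one. A rough back-of-the-envelope sketch, with made-up metering numbers:

```python
import math

def nd_stops_needed(metered_shutter_s: float, locked_shutter_s: float) -> float:
    """Stops of ND needed when locking a slower shutter than the meter wants.

    E.g. bright daylight might meter at 1/1000 s, but the 24 fps / 180-degree
    look calls for 1/48 s, so the ND must absorb the difference.
    """
    return math.log2(locked_shutter_s / metered_shutter_s)

stops = nd_stops_needed(metered_shutter_s=1/1000, locked_shutter_s=1/48)
print(round(stops, 1))  # ~4.4 stops, i.e. between ND16 (4 stops) and ND32 (5)
```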


Imagine the difference between, say, a sitcom and a movie with the sound off. The movie will have range and intentionality to its scenes: light, dark, vibrant, dull, with perceptible and intentional changes from one to another to match the story. The sitcom is just clear and bright. The camera phone on auto is going to aim for sitcom all the time, whereas this app allows you to be intentional, in order to look cool and tell a story.


Tweak the color scale to be all blue/orange



