The app is obviously targeting a different audience, but having bought it and recorded some test footage, I can say it has considerably fewer features than Blackmagic Cam for videography/cinematography pros: no zebras, focus peaking, stabilisation settings, anamorphic de-squeeze, etc. Even commonly expected frame rates like 23.98/29.97fps and settable aspect ratios like 2.39:1 aren't available, as far as I can see.
Would hope to see them address these missing features in future updates, but at the moment there's nothing here to make me move away from Blackmagic for "serious" iPhone videography.
I for one would love to see us drop the fractional frame rates (29.97, etc). They're an archaic technical relic that causes trouble when working with timecode. At Sphere we debated standardizing on 30/60/120fps but ultimately decided it was a battle we didn't want to fight in an already complicated building.
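For anyone who hasn't fought with it, here's roughly why 29.97 is such a pain: SMPTE drop-frame timecode has to skip frame labels ;00 and ;01 at the start of every minute, except every tenth minute, just to keep the displayed clock near wall time. A minimal sketch of the standard counting rule (the function name is my own):

```python
def frames_to_df_timecode(frame_count, fps=30):
    """Convert a 29.97fps frame count to SMPTE drop-frame timecode.

    Drop-frame skips frame labels ;00 and ;01 at the start of every
    minute, except every tenth minute (no actual frames are dropped,
    only numbers), so the clock stays within ~2 frames of wall time.
    """
    drop = 2
    frames_per_min = fps * 60 - drop          # 1798 labeled frames per minute
    frames_per_10min = fps * 600 - drop * 9   # 17982 labeled frames per 10 min
    d, m = divmod(frame_count, frames_per_10min)
    if m > drop:
        frame_count += drop * 9 * d + drop * ((m - drop) // frames_per_min)
    else:
        frame_count += drop * 9 * d
    ff = frame_count % fps
    ss = (frame_count // fps) % 60
    mm = (frame_count // (fps * 60)) % 60
    hh = frame_count // (fps * 3600)
    return f"{hh:02}:{mm:02}:{ss:02};{ff:02}"
```

Frame 1800 (one real minute in) reads 00:01:00;02 because ;00 and ;01 were skipped; without dropping labels, the display would drift by roughly 3.6 seconds per hour.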
FWIW, I truly hope 24fps never goes away entirely. Something about it is the key to making movie stars look like legends and regular people look like stars, imo.
Yeah, agree on 24fps, and I don't think the poster above you meant to remove 24, just those annoying NTSC-derived rates that are close to integers but aren't, and are stupid as hell in an all-digital 2024.
I too have a similar subjective sense, from being in and around film and video production over the last couple of decades, about that "cinematic something" look we associate with film. However, I'm not sure we're being accurate in thinking it's entirely the frame rate. Certainly that's part of it, but I think it's quite possible we'd view 30fps as every bit as good - if all other things were equal.
I think very few people (including myself) have ever seen a true side-by-side test where everything other than 24fps vs 30fps is perfectly identical. That's because correctly engineering such a head-to-head test is surprisingly difficult. In addition to having identical (or nearly identical) content shot in cinematic style, there are several other variables, each of which has to be technically correct. These include having the same signal chain from camera shutter speed, capture, compression, edit and grading through to distribution format, playback device and display.
One thing that's especially tricky is whether the 24fps content ever goes through a 3:2 pulldown conversion (or similar). A significant amount of high-quality big-screen-film-sourced content originally made in 24fps goes through this sort of pulldown when viewed at home - even when the source is 24fps (whether Blu-ray, Netflix, Amazon or Apple). This pulldown process definitely imparts a look many associate with being "cinematic". Yet what we see in an actual theater is native 24fps so that's what we need to match for an accurate comparison.
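For reference, the cadence itself is simple: each group of 4 film frames is stretched over 10 interlaced fields (2, 3, 2, 3), which is how 24fps fills 60 fields (30 frames) per second. A toy illustration of the pattern, not anyone's actual pipeline:

```python
def pulldown_32(frames):
    """Expand 24p frames into 60i fields using the classic 2:3 cadence:
    A-A, B-B-B, C-C, D-D-D. When fields are paired back into frames,
    some frames mix two source images, which is part of the pulldown look."""
    cadence = [2, 3, 2, 3]  # fields emitted per source frame
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields
```

So 24 source frames become exactly 60 fields, i.e. one second of 30fps interlaced video.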
Having recently upgraded my dedicated high-end home theater, I was surprised that every device in the chain - playback source (streaming box or Blu-ray), AVR and 4K HDR projector - while native 24f capable, defaulted to having native 24f turned off in settings (thus silently applying a real-time 3:2 pulldown to the native 24f source). I only discovered this during detailed calibration using test signals. This means many people's impressions of 24fps may actually have been formed watching 24fps content automatically converted to 30fps with 3:2 pulldown by their source, AVR or display.
I suspect associating my subjective sense of "cinematic" with the label "24fps" may not only be erroneous but unfair to 30fps. Technically, 30fps has advantages in reducing motion judder on fast-moving objects and camera pans. A good example is Oliver Stone's pre-digital football movie "Any Given Sunday", shot entirely on 24fps film. They did the best they could with 24fps, but some of the fast, ball-tracking camera pans are extremely distracting - something 30fps would definitely have helped with, had it been an option back then. Nowadays, for the first time, the industry has some freedom to choose frame rates, and I wonder whether, done properly, 30fps might be the better option - one in which we film-look purists would lose nothing of what we love, but gain by reducing some unavoidable artifacts of 24fps's limitations.
Yeah, what qingcharles says. I personally can't say I've seen what's special about 24fps artistically, but it doesn't bother me from a technical perspective (as much).
As someone who lives more on the artistic side than the technical, but appreciates both, that’s honestly reassuring to hear.
And for what it’s worth: I think 24fps is partially why people of frankly similar talent and beauty look untouchable on film, but just like some dude on social media. My personal back-filled theory is that 24fps creates more gaps for your imagination to fill in with whatever burns inside your personal subconscious — those “missing” frames let you “see” just a little bit more of yourself in Russell Crowe or whomever than is possible in gapless, real-time reality. Sort of like how old, lower-resolution photos feel comforting and organic, because they’re cloudy like dreams, unlike the stark reality that modern lenses can achieve.
It would also somewhat explain why high FPS works better for things like sports (where most of the awe is that you’re watching real people do these amazing things) and video games (where the awe comes from actually embodying the figure on screen and existing in their full framerate surroundings).
I don't see a way to grade the footage from within the BMD app. Their app seems more designed to take advantage of ProRes/log captures intended to be used in Resolve Studio. This app lets you do the grading on your device, so that's a pretty obvious difference. If you're someone using Resolve, you'll probably be enticed by the BMD app as it fits your existing workflow. If you're someone looking to stay on device, or who just doesn't have a computer, this gets you a similar ability right there.
Applying a LUT is not the same thing as color grading. It's simply applying a LUT. The app that was specifically linked to is not Resolve. It is an app tapping into the new features introduced with the newest model device. If you use the linked app to acquire footage with your phone, you would still need to make that data available to the iPad version of Resolve. Again, this app does not require that at all.
The Kino introductory blog post makes quite clear that all they do is apply one of a set of LUTs that ship with their app. Personally I'd be interested in one that tried to apply colouring more "intelligently", e.g. detecting faces etc and applying appropriate settings.
No absolutely not. In Resolve specifically, you have nodes that you apply to the video where each node allows for specific settings to be applied as part of the grade. In a true grading session, you dial in the settings for black levels, white levels, contrast, saturation as primaries. Then there's secondaries which start finessing. You can then draw windows/mattes to isolate a specific area or specific color range (think color image where everything is B&W except the red rose/red car/red dress style) to apply the grading. There's also tracking of those windows. There's so much more going into color grading than "apply LUT here". Just look at the control surfaces for Resolve and the number of knobs/buttons/rollers. Would something that just applied LUTs need all of that?
> You know that when you do color grading with apps like Resolve, it is stored in memory as a LUT, right?
Source? That's a very gross oversimplification of what a color session is like. LUTs don't do tracking. LUTs don't do keys. LUTs don't do mattes.
You are doing colorists a disservice if you think grading is just LUTs.
You seem to misunderstand the role of LUTs in color grading. LUTs (Look-Up Tables — a simple multi-dimensional array data structure) describe how one color is transformed into another, and they are used for efficient color transformations. In Resolve, the tools you described help build a LUT in memory, which is then applied to each pixel in each frame, often using SIMD instructions for efficiency. This avoids the need to procedurally apply each setting to each pixel individually, which is way slow.
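To make that concrete, here's a toy sketch of the bake-then-lookup idea: a couple of primary-style adjustments (a lift and a gamma, with invented example values) get evaluated once into a 1D per-channel table, after which per-pixel work is just an index. Real grading LUTs are 3D and interpolated, but the principle is the same:

```python
def build_lut(size=256, gamma=0.8, lift=0.02):
    """Bake a stack of adjustments (here: lift, then gamma) into a
    1D per-channel LUT, so no per-pixel math is needed afterwards."""
    lut = []
    for i in range(size):
        x = i / (size - 1)                    # normalize code value to 0..1
        y = lift + (1 - lift) * (x ** gamma)  # the baked "grade"
        lut.append(min(255, round(y * 255)))
    return lut

def apply_lut(pixels, lut):
    """Apply the baked LUT to (r, g, b) pixels: pure table lookups."""
    return [(lut[r], lut[g], lut[b]) for r, g, b in pixels]
```

However many knobs were turned while building it, applying it is one lookup per channel per pixel, which is exactly why it vectorizes so well.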
Drawing windows, mattes, tracking, and other masking tools determine where and how the LUT is applied within the rendering pipeline.
> Source? That's a very gross oversimplification of what a color session is like. LUTs don't do tracking. LUTs don't do keys. LUTs don't do mattes.
I work in AAA games and have written code for tonemapping and color grading. We often use a gbuffer (graphics buffer that could be seen as a 2D screen-shaped image that is never shown) to mask different objects in the scene and apply different LUTs accordingly. So, it is not only LUTs that are applied in a similar screen-space way, but also masking is similar.
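A stripped-down, single-channel sketch of that gbuffer-mask idea (the toy LUTs and all names are mine): the mask carries an object id per pixel, and each id selects its own LUT.

```python
def grade_with_masks(pixels, mask, luts):
    """Gbuffer-style grading: mask[i] is an object id for pixel i;
    luts maps each id to that object's own 256-entry LUT."""
    return [luts[m][p] for p, m in zip(pixels, mask)]

# Toy LUTs: leave the background (id 0) alone, invert the hero object (id 1).
identity = list(range(256))
invert = [255 - v for v in range(256)]
```

E.g. `grade_with_masks([10, 200, 30], [0, 1, 0], {0: identity, 1: invert})` grades only the middle pixel, because only it belongs to object 1.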
> There's so much more going into color grading than "apply LUT here". Just look at the control surfaces for Resolve and the number of knobs/buttons/rollers. Would something that just applied LUTs need all of that?
On a low level, it essentially is about applying LUTs. How you create these LUTs and how you mask their application are crucial aspects of the process. But ultimately, a LUT is applied to pixels. You are talking about the artistry techniques involved in making a LUT and masking where it is applied, which is not debated.
LUTs are not just files you can import and export to grade the whole image or frame. They are a fundamental compact data structure that makes SIMD operations easy, which is why they are used in grading. If you set up a color grading pipeline with nodes in Resolve, it is very likely compiled into one or more LUTs, which are then applied to the frames.
My contention isn't how a LUT works in the background technically, but the fact that people consider applying a LUT all of what a colorist does. If you want to simplify it to that level, then it's really not even a conversation worth having as that's not the answer to the question.
Not understanding might not be far from the truth, but it's probably just posturing. I doubt most of the YouTubers understand it themselves, which is the real reason they can't share it. They saw it in some other YT video and then presented it as their own idea.
I've tried to explain how to use waveforms/vectorscopes and why they are important. Those things are rarely used any more, and people just don't realize how much more difficult they make it on themselves by not using them. Just because you can push that knob to an 11 doesn't mean you should. Pushing that knob while looking at the scopes will tell you when to stop. This was life or death when making content for broadcast.
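As a crude numeric stand-in for what a waveform monitor shows you - assuming 8-bit Rec.709 video levels, where legal luma sits between code values 16 and 235:

```python
def broadcast_safe_report(luma_values, lo=16, hi=235):
    """Waveform-style legality check: what fraction of luma samples
    fall outside the 8-bit broadcast-legal range (Rec.709 video levels)."""
    n = len(luma_values)
    below = sum(1 for y in luma_values if y < lo)
    above = sum(1 for y in luma_values if y > hi)
    return {"below_black": below / n, "above_white": above / n}
```

Push a knob until numbers like these start moving and you've found exactly where to stop - which is the whole point of grading with the scopes up.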
Also, I've seen all sorts of weird things that blindly applying a LUT wouldn't solve. There was a specific Red camera in town that had a very strange issue where the green color channel was not recording correctly. One shoot we had footage on was of an ice cream type place that had lots of whipped cream on the desserts. However, as you pushed the levels up, the green lagged behind so as the red and blue channels were maxing out the color on the screen went magenta. Applying a LUT would have looked terrible, but the colorist was able to go in to adjust the levels of the green channel separately so that the cream went back to white. It saved the shoot because a camera was not working properly.
There's also interesting tricks to do like when shooting through windows of a high rise will give a green tint to things. So lowering just the green channel will bring things back which could be a specific LUT, but the thing with LUTs is they tend to get used for the wrong reasons. Shooting underwater without a filter can also be dialed back, but it's specific to camera/depth type of situations.
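Both of these fixes (the starved green sensor, the green window tint) come down to adjusting one channel's gain independently, which is what a downloaded "look" LUT won't know to do. A toy sketch:

```python
def adjust_channel(pixels, channel, gain):
    """Primary-style single-channel fix: scale one channel's gain.
    Use gain > 1 to lift a lagging green channel so highlights go back
    to white, or gain < 1 to pull green tint out of window-shot footage."""
    out = []
    for px in pixels:
        px = list(px)
        px[channel] = min(255, round(px[channel] * gain))
        out.append(tuple(px))
    return out
```

The same control, pushed in opposite directions, covers both the broken-camera rescue and the high-rise window trick.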
Another common ask was to "open up the eyes". Well lit faces are notoriously hard as the sunken eye sockets just naturally shadow. So you add a window to the eyes and push the exposure up to achieve the effect that no LUT will ever achieve. If the camera moves for a tracking shot, the windows can track along with it. No LUT will ever accomplish that either. The eye is naturally attracted to the brightest area of the screen, so there's ways of grading something so that it suddenly receives some attention that a LUT would not get there, again using windows. Some of the internationalization of releases have small forensic tells in them to indicate which locale it might have been leaked from. One specific example was a stack of towels in the background of various colors. Through color correction, they changed the colors of the towels to be different for each region.
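The "open up the eyes" move is inherently local: a window (a region, tracked frame-to-frame in practice) inside which exposure is pushed while everything else stays untouched. A toy grayscale sketch with a static rectangular window (all names are mine):

```python
def push_window(image, window, stops):
    """Power-window-style local exposure push: multiply values inside
    the (x0, y0, x1, y1) rect by 2**stops, leave the rest alone."""
    x0, y0, x1, y1 = window
    gain = 2 ** stops
    return [
        [min(255, round(v * gain)) if x0 <= x < x1 and y0 <= y < y1 else v
         for x, v in enumerate(row)]
        for y, row in enumerate(image)
    ]
```

A static LUT can't do this, because a LUT has no notion of where a pixel sits in the frame.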
So so so many things that well graded footage can have done to it that "you wouldn't understand" but would never come close to achieving with a LUT. Wouldn't understand is very harsh though, and is a total cop out from someone that sounds very pompous. I would say "wouldn't consider" as something that could be done let alone needed to be done.
Tracking eyes and reducing shadows within a mask is ... beyond what I would call grading. It all feels like some serious gatekeeping. Thanks for your reply heh, but there sure is a lot of FUD out there in the mastering and grading world.
Beyond color grading? Gatekeeping? (whatever in the world you mean by that).
Might I ask how much experience you have with the world of professional color sessions? I have been in film/video post pretty much since graduating high school, so I have been a part of prepping content/materials for a color session for decades, but have also spent several years working at a post house that only did color correction. I'm guessing that it is you that has the incorrect understanding of what goes on in a color grading session. If you want to go around thinking that applying a LUT is all that happens in a grading session, then you might as well think that anyone that uses Wix or Squarespace is a web designer, or anyone that assembles Ikea furniture is a craftsman. Your definition would be very skewed.
Lol back to "it's not just luts lamer" heh ... In the computer world in my experience people who just like to say people are wrong never go anywhere, but it sure does feel a requirement of the entertainment industry.
Just to be clear, I don't believe that it's just look up tables ... But I'm starting to believe that that's all you know to gripe about.
Wow, that's just so far off base, it's flabbergasting.
I've gone into so much detail on what goes on during a color grading session, yet you keep coming back to "but it's all LUTs" in the end. You're totally ignoring all of the work and effort that went into generating those LUTs. You're coming across like you could just download a LUT called TheMatrix.lut and any footage it's applied to will look just like The Matrix. That's not going to happen.
Somebody else came at me saying they build AAA video games and that's their background with grading with LUTs. That's not even the same realm as grading camera-originated footage. Being unable to recognize and admit that is just not worth arguing over, and that's where I left that one. Clearly, you are absolutely right in your own mind and unwilling to acknowledge anything other than what you hold true.
I know it's not just lookup tables. I'm waiting for you to talk about something other than that lol
I think the real reason everyone, including all of your cohorts, behaves like this is that when you press people, they realize they don't have a good way to explain it. It's a very fickle thing that varies per project, and if you dig into any one example it seems really silly, and that brings out artists' insecurities about how everything they do is actually pretty silly. Whereas, you know, an experienced computer professional knows that everything is BS and doesn't get so hung up on feeling called out by dumb things.
So yeah, I guess I'm used to the computer world where information sharing and how you share that information is so descriptive about where you're coming from. I guess I'm trying to figure out where you're coming from and it seems to be mostly that you have to deal with a bunch of idiots.
I do believe that there is a whole world of people out there that just think it's a lookup table. I'm sorry that that's all you deal with and that you can't explain it more than just to complain about those kind of people.
What more could you possibly need me to explain? I've provided much more detail than a lot of the YT tutorials that just say "download my LUT". I've described recognizing that, as the signal was pushed to the max, the color changed from white to magenta, the actual cause being a bad sensor in a camera not delivering enough data in the green channel, which required using the grading software to isolate and push the green channel harder so that it went back to the desired white. I've briefly talked about the basics of primary and secondary grading. I've talked about using the scopes to see how the adjustments you are making actually affect the signal. I have provided much detail on how a grading session is not just applying a LUT. I have never in a serious conversation said that's proprietary knowledge and you're not worthy. I have openly shared. You have not actually asked any direct question, other than taking offense and giving offense.
I really don't know how you think me and my cohorts are behaving. Is it that we (extrapolated from conversations I've heard) get offended because someone calls themself a colorist because they can apply a LUT? You think this is pompous that we do not agree that the ability to apply a LUT is the same thing as running a full color session? At this point, I'm really not sure what high horse you are on or that you think I am on.
I really reread this all earnestly, and I guess all I see is tactics and no strategy discussion. I understand that to ... Change the color of an image you have to adjust that color, or that if there is a cast or shadow of an unwanted color that ... One thing a video tool does is adjust color.
But why or what next? Who knows. You gave it to a colorist.
I've tried repeatedly to point out that you keep shitting on LUTs, as if that isn't just the secret word anyone can bitch about to get into the secret pro chat room. I get it. But if you don't know of any theory materials, or why any of this is done, then I don't know why you continue to comment. Photoshop is a very old tool, but I don't know why doing basic image adjustments should impress me - unless that's your point here, that it all might as well just be applying a simple curve and calling it a day. But you also seem upset with people who reduce the work to that, so it is indeed very confusing. If you don't know how to teach a craft you claim to know, do you really know it?
Might I ask why this is all you thought a grading session involved? Clearly, this is a touchy subject for me as I spent a few years as an assistant to a very talented colorist. The plethora of YouTube videos saying color correct using my amazing LUTs available when you join my Patreon blah blah nonsense is really sad.
There are some truly amazing colorists, and then there are people who claim they are colorists when they just applied a LUT. I would be embarrassed to call myself a colorist that way. With my experience, I still do not call myself a colorist. I also don't go around calling myself a DP because I own a camera and make pretty pictures, yet people with no real training go around calling themselves that because it's cool.
I used to use Zotero and Mendeley, but recently switched to Paperpile and haven't looked back. It's a Chrome plugin, and saves all your papers to Google Drive. It has a really nice cite-while-you-write extension for Google Docs that really makes it worth it.