An Apple Event Shot on iPhone

As you have probably heard or seen already, Apple held its first evening keynote on October 30th to announce the M3 Macs.

I’m already asking: M3 Ultra, when?

But the real surprise, or One More Thing in Apple parlance, was the last card at the end of the event stream.

It almost overshadowed the new M3 chips and Macs, and it generated plenty of discourse about what Apple really meant by “shot on iPhone.”

Apple, the master marketer of technology, released the following video as if it had been expecting that discourse. The video reveals the production and post-production workflows of its events for the first time, because it highlights the company’s biggest product, even if it steals the spotlight from the new Macs. The accompanying article adds more detail with behind-the-scenes photos.

The biggest takeaway for many people, most notably post-production professionals using Final Cut Pro X, was that Apple didn’t use its own NLE software to make its biggest videos. Haha…

The post-production workflow appears to be (and remains, I might add) Premiere Pro + After Effects (not shown in the video), with color and finishing in DaVinci Resolve.

However, I wasn’t the least bit surprised, because I worked at RadicalMedia, the production company behind Apple’s virtual keynotes and presentations since their inception (brought on due to the pandemic). That may not have been public knowledge until Apple acknowledged and credited the production company in the video. No, unfortunately I didn’t get a chance to work on it, but I’ve talked to coworkers about their experience (without knowing that I would later be working on an Apple TV+ project).

I know that the production chose Premiere Pro for editing, rather than FCP X or Avid, because it offered faster GFX and VFX turnarounds through its After Effects integration.

A real surprise to me was the new Blackmagic Camera app, because the first Shot on iPhone 15 commercial, the Olivia Rodrigo music video, used the default iPhone camera app.

Either the Blackmagic Camera app wasn’t available at the time of filming, or Apple didn’t want to show another company’s app in a commercial aimed at average consumers.

All in all, this has been an interesting turn of events for video professionals, as Apple embraces and incorporates more professional features, gear, and apps to highlight its growing platforms, rather than pushing its own solutions.

Because at the end of the day, Not Invented Here is a dangerous path for a big tech company like Apple.

p.s.

Stu Maschwitz wrote a more in-depth article about what does and doesn’t matter about Apple shooting the event on the iPhone 15 Pro Max.

“Apple wants to sell iPhones, and to accomplish this, they spared no expense to put the very highest image quality on the screen.”

Yes.

DaVinci Resolve 18.6 is OUT!

Less than a month after the 18.5 release, Resolve 18.6 is out (PDF with the list of new features).

Many quality-of-life improvements are welcome, but I’m mostly (and pleasantly) surprised by how fast Blackmagic Design incorporated Apple ProRes Log support into this release.

It’s not a new software release without AI these days. These seven AI (excuse me, Neural Engine) features will save time and enhance your colors and looks.

It’s great to see Fusion getting attention with new features and improvements in this release, because I’ve always felt that Fusion was an afterthought, a checkbox, in the DaVinci Resolve toolset. I’m trying to do more in Fusion that I previously would have done in After Effects, for example.

It’s a lot to chew on for a .1 release, but that’s the nature of this cutting-edge technology and business. Soldier on!

AI Color Matching (using AtomX Cast from ATOMOS)

With all the hype around AI for post production, AI for color correction is evolving slower than expected; as we all know, Shot Match to This Clip in DaVinci Resolve, for example, rarely works, if ever. For all the AI capabilities added to recent versions of DaVinci Resolve, this function has hardly been improved upon, or even mentioned. (I try it from time to time, though, just for kicks.)
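
To appreciate what automatic shot matching is up against, here’s a minimal sketch of the classic statistics-transfer approach (Reinhard-style mean/std matching in Lab space). To be clear, this is not Resolve’s Shot Match algorithm, just my illustration of the baseline idea, and the file names are hypothetical:

```python
# Minimal Reinhard-style color transfer between two frames.
# NOT Resolve's Shot Match algorithm - just the classic mean/std
# statistics transfer that automatic matching builds on.
import cv2
import numpy as np

def match_shot(source_path: str, reference_path: str) -> np.ndarray:
    src = cv2.imread(source_path).astype(np.float32)
    ref = cv2.imread(reference_path).astype(np.float32)

    # Work in Lab so luminance and chroma are roughly decoupled.
    src_lab = cv2.cvtColor(src / 255.0, cv2.COLOR_BGR2Lab)
    ref_lab = cv2.cvtColor(ref / 255.0, cv2.COLOR_BGR2Lab)

    matched = np.empty_like(src_lab)
    for c in range(3):
        s_mean, s_std = src_lab[..., c].mean(), src_lab[..., c].std() + 1e-6
        r_mean, r_std = ref_lab[..., c].mean(), ref_lab[..., c].std() + 1e-6
        # Shift and scale each channel's statistics toward the reference.
        matched[..., c] = (src_lab[..., c] - s_mean) * (r_std / s_std) + r_mean

    out = cv2.cvtColor(matched, cv2.COLOR_Lab2BGR)
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)

# Hypothetical usage:
# cv2.imwrite("shot_a_matched.png", match_shot("shot_a.png", "shot_b_ref.png"))
```

Even a simple global-statistics match like this ignores skin tones, mixed lighting, and which part of the frame actually matters, which is part of why automatic shot matching is such a hard problem.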

So I was pleasantly surprised when AtomX Cast from ATOMOS unveiled its AI color matching capability. Although I’m just taking their word for it based on the demo videos below, this kind of feature is desperately needed in DaVinci Resolve.

I wonder if working with raw camera signals helps AI understand color better than video data wrapped in various media codecs does.

Have you found that this app works as the videos claim? Please share your thoughts if you’ve used it.

Color Pipeline: Virtual Roundtable

This is a great article on the latest color technologies and workflows, featuring not just colorists but also software developers, vendors, and device manufacturers. I’ve pulled some quotes from the article that I found fascinating.


On Marvel Studios’ finishing setup:

“We use Blackmagic DaVinci Resolve for our central hub of finishing tools as well as Boris FX’s Sapphire and Neat Video. We have also developed our own proprietary tool called Jarvis that can take control of Resolve to perform many editorial tasks, CDL management and media management and can pull on the vast amount of metadata that exists within Marvel to streamline our workflow.”


On using LUTs for projects:

“Yes, we create show LUTs for every project. To be specific, we design our looks within the ACES framework as LMTs, which represent the visual characteristics that the filmmakers want — contrast, tonality, gamut mapping, etc. — while CDLs are used and tracked throughout the project for per-shot color choices.”
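
As context for the LMT-versus-CDL split, the ASC CDL itself is just a tiny per-shot transform: slope, offset, and power per channel, plus a single saturation value. Here’s a minimal sketch of that math (my illustration of the published ASC CDL definition, not anyone’s pipeline code, and the numbers are made up):

```python
# Minimal ASC CDL application: out = (in * slope + offset) ** power per
# channel (clamped to non-negative here so fractional powers stay defined),
# followed by a luma-weighted saturation blend. Illustration only.
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation):
    rgb = np.asarray(rgb, dtype=np.float64)
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    # Rec. 709 luma weights, as used by the ASC CDL saturation operator.
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    return luma[..., None] + saturation * (out - luma[..., None])

# Hypothetical per-shot values tracked alongside the show LMT:
graded = apply_cdl([[0.18, 0.18, 0.18]],
                   slope=[1.05, 1.0, 0.97],
                   offset=[0.01, 0.0, -0.01],
                   power=[1.0, 1.0, 1.02],
                   saturation=0.9)
```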


On using AI:

“Resolve has some interesting tools that have a machine learning component to them, but Marvel Finishing is not using AI in the way people are talking about it today… Our focus is on quality, and I have not yet seen any AI tools that have helped improve the quality of work. Speed maybe, but not quality.”

“Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once.) This software was originally developed for autonomous driving, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for what it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.”

“Some of the AI tools that colorists can use with DaVinci Neural Engine include our new Relight FX tool, which lets you add virtual lighting to a scene to creatively adjust environmental lighting, fill dark shadows or change the mood. There is also the new SuperScale upscaling algorithm that creates new pixels when increasing the resolution of an image. And there are a number of other AI features around facial recognition, object detection, smart reframing, speed warp retiming, auto color and color matching.”

“Right now, we are not offering any AI-enhanced tools for virtual production inside Live FX. Reason for that being that in many cases, you want 100% reproducible scenarios, which is often not possible with AI — especially on the creative side of things… We really see AI as being a technology that does not replace storytelling. It advances capabilities.”
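
As an aside, the YOLO-as-garbage-matte idea quoted above is easy to try. Here’s a minimal sketch using the ultralytics segmentation models to pull a person matte from a single frame; the model name and file paths are my assumptions, and you’d still have to turn the mask into whatever matte format your color tool ingests:

```python
# Minimal sketch of the "fast segmenter as garbage matte" idea, using the
# ultralytics YOLO segmentation models. Weights, paths, and the
# person-only filter are assumptions for illustration.
import cv2
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n-seg.pt")  # small segmentation model

def person_matte(frame_path: str) -> np.ndarray:
    result = model(frame_path)[0]
    h, w = result.orig_shape
    matte = np.zeros((h, w), dtype=np.uint8)
    if result.masks is None:
        return matte
    for mask, cls in zip(result.masks.data, result.boxes.cls):
        if int(cls) == 0:  # COCO class 0 == person
            m = mask.cpu().numpy().astype(np.uint8) * 255
            # Masks come back at model resolution; scale to the frame.
            m = cv2.resize(m, (w, h), interpolation=cv2.INTER_LINEAR)
            matte = np.maximum(matte, m)
    return matte

# cv2.imwrite("frame_0001_matte.png", person_matte("frame_0001.png"))
```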


On remote monitoring:

“We use ClearView Glow, which allows us to send an HDR feed to filmmakers all over the world. We recommend people watch using the latest iPad Pro, which is a very good match for our reference displays. However, I will always prefer to have people in the room. So much of our job is reading the room; it’s much harder when that room is a small box on a screen.”

“At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.”


On cloud workflow:

“I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.”


On going beyond HDR:

“The XDR technology that Apple is driving with their Pro Display XDR technology is very interesting. Seeing an image’s true color and working in a format that is similar to how human eyes see an object is very exciting. But HDR and SDR workflows are still going to be around and needed for a long time.”


On color in virtual production:

“Color is becoming a much broader aspect in virtual production. It is not only about the color of the digital content provided anymore. It is also about the color capabilities of the LED wall, the LED processor, the monitoring on-set and, of course, accurate color representation of the stage lighting.

Where it becomes challenging is when you have to connect all the different hardware and software parties in a virtual production workflow into one coherent color pipeline.”