Talking On-Set Production and Post Trends with The Rebel Fleet

March 22, 2023

High-resolution, high dynamic range workflows have been at the center of discussion among production and post professionals for nearly a decade, but with 4K/UltraHD and HDR technology and techniques continuing to evolve rapidly, workflow pioneers like The Rebel Fleet CEO Mike Urban have become crucial in helping productions adapt. Urban recently spoke with us about the company’s video assist, DIT, and dailies services, which help local productions ensure a consistent picture as content moves from on set into dailies and post.

What is The Rebel Fleet’s core line of business?
The Rebel Fleet started in 2015 in Auckland, New Zealand, providing video assist, DIT, and dailies solutions and services. Since 2020, we’ve also offered these services in Australia, and over the last year, we’ve started building out our post production offering in New Zealand. Our clients span episodic, film, TV, and commercial production, and we’ve worked on several high-profile projects across the Asia Pacific region for major streaming platforms and theatrical distribution.

How’d you get into your profession? 
I began working on the technical side of image capture in the early 2000s, editing videos for a company in the marine industry before moving on to a lighting gig on the medical drama “Shortland Street.” After that, I was introduced to the technical side of studio TV production, eventually moved on to live sports and scripted dramas, and landed a DIT job, which is when I developed a passion for this side of the technology. A few years later, I started The Rebel Fleet with three other professionals, and business quickly took off. Today, we’ve grown to 13 full-time employees across offices in New Zealand and Australia.

Describe your approach to working with clients.
Clients approach us for workflows that suit their unique production needs. Often, they’ve chosen a camera but want help to get the best quality picture out of it, back up the data, and process and deliver the video. We help clients develop a pipeline that aligns with their production environment, budget, crew size, studio requirements, and the like.

What are your go-to workflow technologies?
We use AJA KUMO 1616 and KUMO 3232 routers, the KUMO Control Panel, and AJA Mini-Converters in our on-set video assist and DIT workflows because the build quality and warranties are so great, and reliability is paramount in the fast-paced environments we work in. We need equipment we can trust, and AJA gear gives us peace of mind. Our video assist software, QTAKE, also integrates seamlessly with the routers through the network interface, so we can set up predefined router settings quickly and easily. If a client wants all the monitors to display playback or live footage, whether in two- or three-camera mode, they can preconfigure that, load it up with the Stream Deck, push a button, and the changes are reflected across the routers, which is a massive advantage over making manual changes many times per day. Bringing all this control into the network ecosystem of the rig frees up our operators to focus on client needs. It also helps that KUMO routers run off 12-volt batteries.
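To make those preset recalls concrete, here is a minimal Python sketch of firing a routing "salvo" at a KUMO router over its HTTP parameter interface. The IP address, source and destination numbers, and the salvo itself are hypothetical, and the eParamID_XPT_Destination{n}_Status parameter naming is taken from AJA's published KUMO control examples, so verify it against your firmware's REST documentation before relying on it.

# Sketch: recalling a set of crosspoint routes on a KUMO router via its HTTP
# parameter interface, roughly what a Stream Deck button press triggers for us.
# Parameter IDs and addresses are assumptions; check the KUMO REST reference.
import requests

KUMO_IP = "192.168.1.50"                   # hypothetical router address
PLAYBACK_SALVO = {1: 3, 2: 3, 3: 3, 4: 3}  # destination -> source (e.g. QTAKE playback on source 3)

def set_crosspoint(dest, src):
    """Route `src` to `dest` by writing the destination's crosspoint parameter."""
    resp = requests.get(
        f"http://{KUMO_IP}/config",
        params={
            "action": "set",
            "paramid": f"eParamID_XPT_Destination{dest}_Status",
            "value": str(src),
        },
        timeout=2,
    )
    resp.raise_for_status()

def recall_salvo(salvo):
    """Apply a predefined routing preset in one pass."""
    for dest, src in salvo.items():
        set_crosspoint(dest, src)

if __name__ == "__main__":
    recall_salvo(PLAYBACK_SALVO)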

Even though SDR is still the on-set standard, we are increasingly deploying HDR gear for clients. We use the AJA FS-HDR and AJA HDR Image Analyzer, which, together with Colorfront, help us ensure that the client’s desired look stays consistent as it moves from camera through dailies.

Tell us more about your experience working with HDR. 
Cost remains a key hurdle to broader adoption of full HDR workflows, which is where tools like AJA FS-HDR have been great. We can easily convert between SDR and HDR and retain a high-quality image. When we have one or two monitors on set, we know they’re displaying HDR correctly if we’re getting a log signal from the camera. Then we’re able to route different paths to HDR or SDR depending on the client’s needs. We’re also using it with Pomfort LiveGrade Pro as a dynamic LUT box, so we get one rack-mounted device that does everything we need. Monitoring HDR on set also means we need to be able to look at the HDR scopes, which is why we often build AJA HDR Image Analyzer into our rigs. It allows us to technically monitor the HDR on set with scopes that match what we’re seeing in Colorfront. This is huge for ensuring consistency as we go into dailies. 
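For a sense of the look data that flows through a dynamic LUT setup like this, below is a short Python sketch applying an ASC CDL (slope, offset, power, plus saturation), one of the standard grade formats that look-management tools exchange with LUT boxes. The sample values are invented, and in practice this math runs in the LUT hardware or on the GPU rather than per pixel in Python.

# Sketch: applying an ASC CDL to RGB values, the kind of look data a DIT's
# look-management software can pass down the chain. Values are illustrative only.
import numpy as np

def apply_cdl(rgb, slope, offset, power, sat):
    """rgb: float array shaped (..., 3), nominally 0-1. Returns the graded values."""
    rgb, slope, offset, power = (np.asarray(x, dtype=np.float64) for x in (rgb, slope, offset, power))
    # Per-channel slope/offset/power, clamped at zero per the ASC CDL definition
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    # Saturation uses Rec.709 luma weights, as specified by the ASC CDL
    luma = (out * np.array([0.2126, 0.7152, 0.0722])).sum(axis=-1, keepdims=True)
    return luma + sat * (out - luma)

# Example: a gentle warm look applied to mid-gray
pixel = np.array([0.18, 0.18, 0.18])
graded = apply_cdl(pixel, slope=[1.05, 1.0, 0.95], offset=[0.01, 0.0, -0.01],
                   power=[1.0, 1.0, 1.0], sat=1.1)
print(graded)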

What kinds of conversations are you having with clients and colleagues about HDR?
HDR undeniably provides a better viewing experience for audiences, but as an industry, there’s still work to be done. In my conversations with DPs and colorists, the general consensus is that they want to see how it plays out. Most of the time, 300-400 nits seems to be the happy place in terms of brightness. As one DP explained it, if you have a scene in strong sunlight, or an interior scene with bright practical lights or windows that bring in sunlight, you want the audience’s attention on the character, not the light, so you have to strike the right balance between where the skin tone sits and what the human eye will gravitate toward. Even when you have to compress the image to bring the brightness down, the picture quality in HDR is fundamentally better than SDR. For this reason, it’s become common in finishing, but the on-set potential of HDR is still being uncovered.
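As a rough guide to where those brightness targets sit, here is a small Python sketch of the SMPTE ST 2084 (PQ) inverse EOTF, which maps absolute luminance in nits to a normalized signal level. The constants come from the ST 2084 specification; the takeaway is simply that a 300-400 nit diffuse white still leaves several stops of highlight headroom below the format's 10,000-nit ceiling.

# Sketch: SMPTE ST 2084 (PQ) inverse EOTF, nits -> normalized signal level.
# Standard constants from ST 2084; not specific to any one vendor's gear.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Return the normalized (0-1) PQ signal level for an absolute luminance in nits."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

for nits in (100, 300, 400, 1000):
    print(f"{nits:>5} nits -> PQ {pq_encode(nits):.3f}")
# 100 nits lands near 0.51, 300-400 nits around 0.62-0.65, and 1000 nits near 0.75,
# so a 300-400 nit diffuse white leaves plenty of room for speculars above it.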

What kind of demand are you seeing from clients for 4K capabilities?  
4K requests on set are more often the exception than the norm; focus pullers advocate for 4K most, which makes sense. They want to know their image is sharp and that the quality will hold up when they zoom in, which is easier to tell with a 4K signal than an HD one. It isn’t a lack of interest that is holding back adoption; rather, challenges remain, especially because there are so many ways a 4K picture can move through a pipeline and across devices, and there’s a lack of standardization.

Tell us more about a recent project. 
Last year, we worked on an epic fantasy adventure series for a popular streaming platform, shot in New Zealand. We originally met with the producers while they were location scouting here and started talking about a potential workflow. We explained how we’re part of the process from camera capture through dailies delivery and upload to the cloud, and because of that, we get a good overview of how everything is working, which interested them. They required on-set HDR finishing, so we needed to create a workflow that would allow them to switch between SDR and HDR in the editorial pipeline. It was also important to be able to show the DPs and other stakeholders what the HDR looked like during bright or outdoor scenes with high dynamic range. After our initial conversations, they decided to move forward, and we began work on the project.

What are some of the key considerations when working on a project like this? 
Metadata was essential because every shot is effectively a VFX shot. They shot on ARRI Alexa LF cameras with DNA and other smart lenses. Starting from the camera, we designed the workflow so that all the metadata from the camera and the lens could be read throughout the on-set pipeline. The FS-HDR was helpful in this respect because it could read all that metadata, and we were able to bring it into the DIT cart, combine it with the notes and information the DIT was creating for whatever needed to be passed down the pipeline, and then marry it with the information coming from QTAKE. With QTAKE bringing in the data, we could then give the team on set the ability to add metadata directly to a live database for viewing on iPads and other devices.
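As a rough illustration of that kind of merge, here is a short Python sketch that folds camera, lens, DIT, and QTAKE metadata into one per-clip record. The field names, values, and clip name are all invented for the example; the real pipeline reads them from the camera stream, the DIT's notes, and QTAKE rather than hard-coding them.

# Sketch: combining per-clip metadata from several on-set sources into a single
# record, the shape of thing a live database feed might hold. All values are
# hypothetical placeholders, not actual production data.
from dataclasses import dataclass, field

@dataclass
class ClipRecord:
    clip_name: str
    camera: dict = field(default_factory=dict)     # e.g. ISO, white balance, sensor FPS
    lens: dict = field(default_factory=dict)       # e.g. focal length, T-stop, focus distance
    dit_notes: dict = field(default_factory=dict)  # e.g. look name, exposure offset
    qtake: dict = field(default_factory=dict)      # e.g. scene/take, circled status

def merge_clip(clip_name, camera, lens, dit_notes, qtake):
    """Combine the per-source metadata into one record keyed by clip name."""
    return ClipRecord(clip_name, camera=camera, lens=lens, dit_notes=dit_notes, qtake=qtake)

record = merge_clip(
    "A001C007_230114_R1AB",                         # hypothetical clip name
    camera={"iso": 800, "wb_kelvin": 5600},
    lens={"focal_mm": 47, "t_stop": 2.8},
    dit_notes={"look": "day_ext_v2", "exposure_offset": "+1/3"},
    qtake={"scene": "12A", "take": 3, "circled": True},
)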

All that information ultimately served dailies through Colorfront before making its way into editorial. We essentially created a color-consistent pipeline for editorial, since we knew that what we were seeing on set and in dailies would match what the editorial team was working with. This meant editorial could start working faster. It provided a level of automation that allowed them to work quickly and share their work with production stakeholders across the world. The team worked in the same HDR color space (P3-D65) in post production that they’d monitored on set.

I/O was another important consideration, and for that, we used the AJA Io 4K Plus; we own around 20 of the boxes. They’re amazing output devices that give us confidence that what we’re seeing on set matches what we’re seeing in dailies. Being able to switch between HDR and SDR analysis with the AJA HDR Image Analyzer also gave the main DIT confidence that they were giving the DoP the best information, that the picture wasn’t blown out, and that they had enough range in the log signal in a night scene, those kinds of things.

What are the biggest challenges The Rebel Fleet has overcome as a business?
When the pandemic hit, we lost 99 percent of our work overnight. We refocused our efforts on strengthening the business, because we knew at some point things would get better and we’d emerge stronger. We spent a lot of time working on and talking about the efficiency of film sets and learning to work with remote teams. During a five-week lockdown, one of our team members took our equipment home to create a workflow and demonstrated how film sets could work remotely. That came in handy even when we returned to on-set work, because the number of people allowed on set was still limited at first.

Supporting growth is another, more general challenge we’ve faced as a business. Our client and project load grew alongside the demand for remote workflows. To keep pace, we had to hire enough crew to support the project load and then figure out how best to manage and distribute the workload. We also faced supply chain difficulties that made acquiring the equipment we needed less straightforward. Thankfully, the supply chain seems to be back in order now.

Where is your team planning to focus its efforts in the coming years?  
Metadata, and finding new ways of capturing it and creating value out of it across the production chain, makes our production and post pipelines more robust, smart, and efficient. For instance, on a shoot spread out over multiple days, having insight into scene data, including file size and duration, can be helpful. Right now, much of that data is available but not readily accessible; you have to chase the camera department or request it from another team. With our custom software ‘Sidecar,’ we bring all that metadata into one place to create a summary of activity that productions can share with their team or studio. Even small things, like bringing in all the metadata from the production for processing and delivery to editorial, can reduce crucial time spent retyping a script supervisor’s notes or bringing VFX notes into Avid.
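To illustrate the kind of roll-up described above, here is a small Python sketch that aggregates per-clip duration and file size into per-shoot-day totals. It is only a stand-in for the idea, not The Rebel Fleet's actual Sidecar software, and the clip list is invented.

# Sketch: summarizing clip metadata by shoot day (clip count, total minutes, total data).
# The input records are placeholders; a real tool would ingest them from camera reports
# or the dailies system rather than a hard-coded list.
from collections import defaultdict

clips = [
    {"shoot_day": "Day 01", "duration_s": 142.0, "size_gb": 38.5},
    {"shoot_day": "Day 01", "duration_s": 96.5,  "size_gb": 26.1},
    {"shoot_day": "Day 02", "duration_s": 210.0, "size_gb": 57.4},
]

summary = defaultdict(lambda: {"clips": 0, "minutes": 0.0, "gb": 0.0})
for clip in clips:
    day = summary[clip["shoot_day"]]
    day["clips"] += 1
    day["minutes"] += clip["duration_s"] / 60
    day["gb"] += clip["size_gb"]

for day_name, totals in sorted(summary.items()):
    print(f'{day_name}: {totals["clips"]} clips, {totals["minutes"]:.1f} min, {totals["gb"]:.1f} GB')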

About AJA Video Systems                                                                                                               
Since 1993, AJA Video Systems has been a leading manufacturer of video interface technologies, converters, digital video recording solutions and professional cameras, bringing high quality, cost effective products to the professional broadcast, video and post production markets. AJA products are designed and manufactured at our facilities in Grass Valley, California, and sold through an extensive sales channel of resellers and systems integrators around the world. For further information, please see our website at www.aja.com.

All trademarks and copyrights are property of their respective owners.