Australian audiences flock to good Australian television. The drama market is buoyant; however, audiences still expect a level of quality competitive with imported product, and broadcasters continue to place pressure on production budgets.
At the other end of the TV spectrum, big budget factual (or reality) programs like The X Factor, The Voice and Masterchef are focused on engaging audiences both inside and outside the traditional family TV viewing environment.
In this highly fluid environment, where technology and technique are rapidly evolving, this article seeks to explore workflow trends associated with both drama and factual production.
Blurred Lines
There are many types of TV drama: five-day-a-week episodics, 13-part primetime series, six-part miniseries and made-for-TV movies.
Interestingly, these form the stepping stones to build the skills, technology and technique needed to meet the higher expectations of features for cinema. And with miniseries like Top of the Lake appearing at the high-profile Berlin and Sundance film festivals, things really start to blur.
The theatrical success experienced by a number of HBO made-for-TV movies some years ago continues to generate much conversation amongst local producers. Whilst many things need to be considered, technically the resolution of HDTV is not far removed from the 2K resolution commonly used for feature films.
Robert Connolly’s Underground: The Julian Assange Story was recently remastered for screenings at the Toronto and London International Film Festivals, along with a run of local screenings at select cinemas. Shot on the ARRI ALEXA with television as its primary focus, its key ingredients included a high-profile cast, planning which saw it shot and framed for widescreen, and a commitment to high production values that could ‘scale up’ to sit well on the big screen. The creation of a DCP (Digital Cinema Package) involved re-grading the HD video master for the digital cinema colour space and large-screen contrast ratios, along with re-mixing the 5.1 soundtrack on a cinema mixing stage.
Connolly noted that when the commercial breaks were removed, the discipline of building mini climaxes throughout the show created a compelling narrative when seen continuously.
4K TV
Technology is no longer a barrier preventing quality television content from being presented in cinemas. As we saw at NAB this year, 4K television is a reality. This is also supported by the H.265 (aka HEVC) video compression standard that aims to take over from its predecessor, H.264. It requires around half the bit rate for the same visual quality and can handle image resolutions up to 8K. For broadband delivery, this means areas with lower connectivity will benefit from improved quality while well-connected areas can access super high-resolution 4K content.
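As a back-of-envelope sketch of what that bit rate halving means in practice (the figures below are illustrative assumptions, not measured encoder results):

```python
# Illustrative figures only: real encoder bit rates vary with content,
# encoder settings and quality targets.
H264_HD_MBPS = 8.0          # assumed H.264 1080p delivery bit rate
PIXEL_RATIO_4K_TO_HD = 4    # 3840x2160 has four times the pixels of 1920x1080
HEVC_EFFICIENCY = 0.5       # H.265 aims for half the bit rate of H.264

# The same HD picture delivered in H.265
hevc_hd_mbps = H264_HD_MBPS * HEVC_EFFICIENCY

# 4K in H.265, naively scaling bit rate with pixel count
hevc_4k_mbps = H264_HD_MBPS * PIXEL_RATIO_4K_TO_HD * HEVC_EFFICIENCY

print(f"H.265 HD: {hevc_hd_mbps:.1f} Mb/s")
print(f"H.265 4K: {hevc_4k_mbps:.1f} Mb/s")
```

On these assumed numbers, a 4K HEVC stream costs about twice the bandwidth of today’s HD H.264 stream, which is what makes 4K delivery plausible on well-connected links.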
Video versus RAW
Experienced colourists and compositors prefer to work with RAW camera files: more colour information from the higher bit depth, sharper images from less image compression and more detail from the higher resolutions. However, due to the significant reduction in workflow complexity (and therefore cost), in-camera compressed video recording has now become the ‘norm’.
In a practical sense a key difference is how far a shot can be ‘pushed’ during colour grading. Image artefacts can occur when correcting shots that are under exposed or when trying to match adjacent shots in a scene that are markedly different due to changing light. Additionally, compositors have more detail and cleaner edges to play with when dealing with challenging VFX shots.
The most commonly used compression codecs are ProRes 422 HQ and DNxHD 185; however Top of the Lake, post produced by DDP Studios, was shot using ProRes 4444. Whilst the 4:4:4 files add around 50 per cent to storage requirements, the key benefit is more image detail for both colour grading and any compositing work. Sony has now released the SR 4:4:4 codec so that files from its new generation of cameras, e.g. the F65, can be imported directly into edit and grading systems that have adopted the codec.
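The 50 per cent figure follows directly from the chroma subsampling arithmetic. A quick sketch, counting the samples stored for a 4-pixel-wide, two-row block of image (before any codec compression):

```python
def samples_per_block(subsampling):
    """Samples stored for a 4-pixel-wide, two-row block under J:a:b
    chroma subsampling (luma plus both chroma channels)."""
    j, a, b = subsampling
    luma = j * 2              # J luma samples per row, two rows
    chroma = (a + b) * 2      # Cb and Cr each get a + b samples
    return luma + chroma

r422 = samples_per_block((4, 2, 2))   # 16 samples
r444 = samples_per_block((4, 4, 4))   # 24 samples
print(f"4:4:4 carries {r444 / r422:.0%} of the data of 4:2:2")  # 150%
```

Hence 4:4:4 material is half as large again as 4:2:2 at the same bit depth and resolution.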
Colour Management
Ideally, colour needs to be managed through the entire workflow to ensure the best outcome from the final grade. And indeed, colour grading technology companies seek to provide solutions that work from ‘set to final master’. For this to occur, these systems must easily ‘plug in’ at any stage of the workflow.
Currently the common form of onset colour management is a simple LUT that provides a reference point for the DOP during the shoot. However, Blackmagic’s new version of DaVinci Resolve comes with a ‘live’ video function – direct from the camera – specifically for onset grading. Adjustments are stored and can be carried forward to other phases of post-production.
Data Management
Once the images are recorded onto the flash card in the camera, there are many options. For television, onset data operations tend to be kept simple, driven by a need to turnover shots quickly and minimise the number of onset crew.
One new onset technology that has been embraced is the ability to view footage on tablet devices within minutes of calling ‘cut’. But there’s more! All of the shots for the entire production can be stored for instant recall at any time during the shoot … Continuity love it.
Further to this, a recent article by David Heuring in Videography magazine lauded the amazingly efficient shooting process used by Game of Thrones Season Three. He explained how, with five episodes shooting concurrently, five cinematographers flying between various far-flung locales maintained consistency across all locations using iPads loaded with frame grabs from previous and current episodes.
Near Set data management – based with the editorial team – can take a number of forms. One option is to set up a fully optioned ‘data cart’ that can do everything. Alternatively a tailored solution, specific to the needs of the production, can be put in place. A key consideration is the potential of change throughout the shoot e.g. addition of different or specialist cameras or a considerable increase in the volume of dailies. Sometimes the tailored solution may not be easily adapted to changes during the shoot.
These systems are usually hired out on a weekly basis, so they can be a way of managing costs in the event of a blowout in the volume of dailies. However, given the system is backing up the original source flash cards before they are erased and returned to set, it is important that the operator is experienced and accountable.
Rachel Knowles, GM of Blue Post Sydney, comments: “the market is looking at the options and choosing what works best with their project and budget. However while producers appear to be seeking mobility and flexibility, they still want a team and the security of an established facility – in other words a base of operation. Their dailies are precious and insurance and completion bond companies have an increased awareness of the potential risks”.
Quality Control (QC)
The traditional ‘Neg Report’ generated when transferring film dailies to video provided a host of information about the nature of the images and, most importantly, any issues that might require a re-shoot, e.g. soft focus, excessive under- or over-exposure, flicker etc. For insurance and bond companies, detecting faults during the shoot (instead of during post production) can save considerable sums of money. Some aspects of assessment can be achieved using automated systems, e.g. data integrity, video signal characteristics etc. However, subjective assessments must be performed with professional monitoring technology by an experienced operator who can identify, assess and advise on any issues. With the increase in the volume of dailies from digital acquisition, the market continues to search for a formula where this service can be delivered at a reasonable and predictable cost.
Streaming Dailies
There is an increase in the use of cloud servers to ‘share’ dailies or work-in-progress edits on mobile devices. This is especially important when dealing with overseas investors or a production team that is spread across a range of locations. A few things to keep in mind:
• Reliable and quality playback requires access to reasonable broadband (or at least a download option).
• The cost to hold all of the content on the server throughout the review and approval process may become expensive.
• Uploading to the server needs a very good broadband connection – most readily available broadband services are ‘asymmetric’, meaning the upload speed is much less than the download speed.
• If the volume of dailies dramatically increases, so do the storage space and the upload time.
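To put some hypothetical numbers on the upload problem (the dailies volume and link speeds below are assumptions for illustration only):

```python
def upload_hours(dailies_gb, uplink_mbps, efficiency=0.8):
    """Rough upload time in hours; `efficiency` allows for protocol
    overhead on the link. Decimal units throughout."""
    megabits = dailies_gb * 8 * 1000
    return megabits / (uplink_mbps * efficiency) / 3600

# Hypothetical day: 200 GB of dailies on an ADSL-class 1 Mb/s uplink
# versus a 100 Mb/s symmetric fibre service.
print(f"1 Mb/s uplink:   {upload_hours(200, 1):.0f} hours")
print(f"100 Mb/s uplink: {upload_hours(200, 100):.1f} hours")
```

The asymmetric consumer link takes weeks to move a single day’s rushes; the symmetric link does it overnight, which is why upload speed, not download speed, decides whether cloud dailies are workable.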
Editorial
The nature of editorial depends on the type of project. For television series, once a ‘rhythm’ is set, post production is more about improving efficiency without impacting on the creative. For one-off miniseries and TV movies, the nature of editorial feels more like a feature film.
Whilst editorial has always driven the post production process, it seems now to remain active long after picture lock. With so many things happening concurrently, the editorial systems (even if in the form of a Media Composer on a laptop connected to portable storage) remain in play until close to mastering. This is in part to keep the creative process alive as long as possible but also to respond to the demand for compressed production schedules.
From a workflow perspective editorial is now very much the hub: a ‘notional central storage point’ that links all departments – design, VFX, sound, colour and mastering. This is less to do with being a single physical storage point and more to do with being a framework that allows easy sharing of data by all the various departments, with everything updating and staying in sync. Highly skilled Assistant Editors are now seen as integral to the finishing process.
Tighter schedules now see sound start laying up tracks well before picture lock. In an ideal world, when a picture cut changes, this automatically travels through to the sound layup. Whilst still not perfect it continues to improve. The current approach to sound workflow also demonstrates that remote collaboration can work. In part because of the smaller file sizes, but also because the disciplines can easily be separated out, Foley artists and SFX and dialogue editors can work on relatively low cost technology in home studios, contributing to the final sound project (timeline) that is mixed in a certified mixing room. As internet bandwidth speeds increase, we will no doubt see it applied in some aspects of the picture pipeline.
Basic visual effects, traditionally completed in an online suite, are now being completed on relatively low cost workstations in a VFX studio (instead of a suite) and ‘published’ back into the editor’s master cut.
For television, depending on storage and compression considerations, it is now realistic to cut a project at what is considered broadcast resolution. This negates the need to do a ‘conform’, so in simple terms, when the offline editor has finished the cut, the colour grade can be completed and the master created. Blackmagic’s focus on project timeline interchange between its DaVinci Resolve and edit systems like Avid Media Composer and Final Cut Pro could further enhance this approach to workflow.
However for content destined for broadcasters, international distributors and digital platforms like iTunes, compliance with delivery standards remains a significant issue.
And while some distributors are now accepting file-based masters such as Apple ProRes 422 (HQ) 10-bit 4:2:2, there is much work to be done on standards capable of dealing with the intricate needs of international markets and special audience needs, e.g. closed captions, audio description etc.
Loudness Measurement
The move to loudness measurement from ‘peak’ measurement is significant. It removes the focus on a momentary peak that often prevents television sound mixers from creatively using the dynamic range available in digital television.
However, from a workflow perspective, it has added a complication. Whereas peak measurement is largely made ‘on the fly’, loudness is measured as an average across the whole program. Whilst real-time ‘monitors’ are coming to market that provide a guide during mixing, the final mix will remain subject to a software measurement tool that analyses the soundtrack and provides a measurement against a given standard. The optimum level has been determined from years of research focussed on identifying an ideal loudness level for human listeners. If the measurement is under, it’s not hitting the ‘sweet spot’; if it’s over, it may be rejected by distributors. Whilst engineers refine their skills in this area, it can be presumed that further time will be required to achieve the ideal level.
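The ‘average across the program’ idea can be sketched in a few lines. This is a deliberately simplified mean-square measurement in the spirit of ITU-R BS.1770; the real standard adds K-weighting filters, channel weighting and level gating, all omitted here:

```python
import math

def loudness_sketch(samples):
    """Very simplified program loudness: mean-square power in dB.

    Real BS.1770 loudness (LUFS/LKFS) K-weights the signal, sums
    weighted channels and gates out quiet passages before averaging;
    this sketch does none of that.
    """
    mean_square = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(mean_square)

# One second of a steady test tone whose mean-square level is -23 dB,
# mirroring the common -23 LUFS broadcast delivery target.
amp = 10 ** (-23 / 20) * math.sqrt(2)
tone = [amp * math.sin(2 * math.pi * 997 * n / 48000) for n in range(48000)]
print(f"{loudness_sketch(tone):.1f} dB")
```

The point of the averaging is that a brief loud moment barely moves the reading, whereas under peak metering it would have forced the whole mix down.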
No doubt a log file of the loudness readings will become a mandatory requirement of program master delivery to prove compliance with the required standards.
Focus: Mr and Mrs Murder
For Mr and Mrs Murder, Inspiration Studios’ Problem Solver, Cail Young, developed a workflow built around DNxHD 185x 10-bit 4:2:2 Log C video streams (that’s a mouthful) captured in the ARRI ALEXA camera.
The entire workflow utilised only one set of original image files for offline, grade and mastering – backed up, of course. As each episode was locked off, an AAF file with consolidated media was moved from the Avid Media Composer into the DDP Studios grading and mastering environment, where colourist Dee McClelland was able to start grading as soon as the file transfer was complete.
While Log C images provide greater latitude for the colourist during grading, in their native form they lack contrast and are not pleasant to look at during editing. To see the images closer to the final look, they are normally rendered with a default grade, commonly termed ‘baking in a LUT’. To achieve the same result while maintaining the full range of the Log C images for the final grade, a series of LUT boxes was installed on the Avid monitors during editing. Whilst in the first instance this was a complication, the latest version of Avid Media Composer allows LUTs to be applied to source media, making the process much easier and less expensive.
Further to this, Inspiration Studios’ iPad-based onset review system captured the ‘baked’ camera output live for the majority of shots. However, for high-speed or untethered shots that were processed offsite, the LUT was ‘baked in’ before the footage was fed back to set for ongoing reference.
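Mathematically, ‘baking in a LUT’ simply means mapping each code value through a lookup table, interpolating between entries. A minimal one-dimensional sketch (the curve values are purely illustrative – a real Log C viewing LUT is supplied by the camera maker and is usually three-dimensional):

```python
def apply_lut_1d(value, lut):
    """Map a normalised 0-1 code value through a 1D LUT, interpolating
    linearly between table entries."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A made-up five-point 'log to display' contrast curve, NOT a real
# Log C viewing LUT.
viewing_lut = [0.0, 0.08, 0.35, 0.75, 1.0]

# 'Baking in' means applying the curve to every pixel before display.
graded = [apply_lut_1d(v / 255, viewing_lut) for v in range(256)]
```

Applying the curve on the monitor feed (the LUT box) leaves the flat Log C files untouched for the final grade; rendering it into the files makes the look permanent.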
Whilst it is important to have current generation workstation technology, storage is the key consideration. DNxHD 185 uses around five times more storage than the traditional offline resolution DNxHD 36, and any ongoing blowout in the shoot ratio can be a concern. However, lower cost high density RAID storage and the division of episodic shows into blocks make it easier to manage.
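The storage arithmetic itself is simple (DNxHD codec names reflect their bit rates in megabits per second; the 45-minute episode and 10:1 shoot ratio below are assumptions for illustration):

```python
def gb_per_hour(codec_mbps):
    """Storage for one hour of video at a given codec bit rate,
    in decimal gigabytes (video essence only, no audio or overhead)."""
    return codec_mbps / 8 * 3600 / 1000

offline = gb_per_hour(36)     # DNxHD 36 offline proxy
online = gb_per_hour(185)     # DNxHD 185 'cut at finishing quality'

# Hypothetical 45-minute episode shot at a 10:1 ratio
rushes_hours = 45 / 60 * 10
print(f"DNxHD 36:  {offline * rushes_hours:.0f} GB of rushes per episode")
print(f"DNxHD 185: {online * rushes_hours:.0f} GB of rushes per episode")
```

Multiplied across a 13-episode series, that difference is what makes the RAID budget, and any shoot-ratio blowout, worth watching.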
As we look ahead, a potential benefit of always dealing with the native Log C files is that any adjustments to colour throughout the editing process – for example using the Baselight for Avid plugin – can be more directly carried forward as a reference in the final grade.
Reality Television
The big challenge for reality television is keeping the audience engaged. It’s no longer about creating a master and sending it to a broadcaster. There are now many delivery platforms in different countries and different time zones that must be considered.
For Shine’s Director of Post Production & Technical Services, Scott Rowan, “timelines are getting shorter and the complexity of engaging the audience is increasing”. In its simplest form, this can be about uploading show segments to a website to allow viewers to catch up on something they missed. However if an amazing performance or controversial event builds social media buzz, the content must be available to build on this energy. If the audience can’t find it the moment they hear about it, it can weaken an existing audience relationship or miss out on the opportunity to build an even larger audience.
Productions are now bigger by nature, not only in their scope but in the time on air, which means a huge increase in the volume of material being shot. Scott continues; “as workflow specialists we have an intimate understanding of the production goals and it represents not only a collaboration to provide efficiencies (keeping costs down), but also to maximise production values from the available budget”.
Like most workflows, the shooting format very much dictates its effectiveness. On major productions like The Voice and Masterchef, Shine uses the XDCAM 50 codec to acquire, post produce and deliver. Acquiring with XDCAM enables the underlying efficiency of the entire workflow, highlighting the importance of controlling the workflow from the very start. A further benefit comes from the hybrid XDCAM VTR. Normally, if an error is identified in a master file during the QC process, it means re-encoding the entire file. With the XDCAM VTR, a simple video insert can be made into the master, with an updated file immediately available.
Having all the content in one, easily accessed storage architecture is critical. The producers all have Avid workstations allowing a highly collaborative relationship with the editors; whilst the producers are pulling grabs from interviews to define and support the story arc, the highly skilled editors focus on the quality of finished content and meeting the demanding deadlines.
On the subject of cloud services, Scott observes that “whilst Avid is moving closer to a true cloud editing solution that works, it will be some time before we can consider this, simply because of the volume of data that we generate and the speed at which we have to turn it around”.
The Second Screen
Television directors have long recorded the outputs of individual cameras – referred to as ISO recordings – while they ‘switch a live feed’ to the broadcaster. Over the last decade these recordings have moved from multiple videotape recorders to highly sophisticated media management systems. The director can now simultaneously view all camera angles for any given event and quickly select which replay will go to air. This is also the technology behind the score review systems used by a number of sporting codes.
A recent innovation from technology company EVS allows footage to be selected and uploaded onto a cloud server where it can be accessed by mobile apps. This enabled mobile app ‘Jump In’ to develop a highly sophisticated application for major NRL events like the State of Origin.
As the game progresses a second director reviews the available camera angles on any given key event and decides which angles will be uploaded to the cloud server. These elements are then picked up by Jump In allowing the viewer to customise their viewing experience on the tablet while the main coverage continues on the television.
Being able to see all these angles and make your own analysis appears to be a key ‘how good is this moment’ driver of social media, e.g. Twitter and Facebook. Dylan Cameron, EVS Australia Operations Manager, sees the potential for these apps to become a reference point for social media interactions. “Typically a contentious try can generate considerable social comment. The metadata wrapped around this event can be captured and statistical information provided. The longer term outlook for these applications is to allow not just a replay of the event but also the sense of emotion that surrounded the event from the ‘social media crowd’.”
Advertisers and broadcasters see great significance in the ‘metadata’ generated by viewers using mobile applications like Fango, Zeebox and Jump In. And program makers are also seeing the potential to mould an idea in response to audience habits and preferences gleaned from second screen devices.
It therefore seems clear that there is another element in the content production workflow that will increasingly need to be considered: interactivity for an actively engaged audience.
This article first appeared in IF Magazine issue #155