So just how efficient can Final Cut Pro X be in feature film post-production? Find out from the story of the post-production of Gabriel, a new film from Portuguese director Nuno Bernardo.
A few years ago, I wrote an article for FCP.CO about my experience using FCPX on our feature-length TV documentary about stand-up comedy, “The Standups”. The text focused on our adventures and troubles moving from the Classic FCP workflow we had used for a decade to the new version of Apple’s video editing software. Since that experience, FCPX has been our NLE of choice, and we’ve been using it on feature films, high-end TV series, TV magazines and digital projects.
Esports, or professional competitive video gaming, has surged in popularity in the past few years, inspired by images of Korean arenas packed with thousands of fans cheering for teenagers playing video games in front of computer screens, creating the same atmosphere you would find at a traditional sporting event.
As esports continue to grow, more personalities and companies are jumping in to get their piece of the pie, as market researchers expect the industry to grow to $1.1 billion by 2019. That’s probably why the likes of Shaquille O’Neal, Mark Cuban, Samsung, HTC, Monster Energy and many top European football teams now have a vested interest in eSports.
Traditional media companies, from TV networks (like AMC, Turner or Sky) to talent agencies (like WME), are entering the eSports arena to gain access to a large and very engaged audience that no longer watches TV or consumes other forms of traditional media. However, this doesn’t mean that Millennials will suddenly start watching traditional linear TV. Instead, the brands of traditional media companies that carry eSports events become relevant to these audiences, whatever the medium.
beActive has been working with Portuguese public broadcaster RTP, producing live web streams of eSports events, a TV magazine format and, more recently, a fictional series set in the world of eSports. Although the magazine is broadcast on linear TV, the core audience watches it online (it premieres on the RTP Player ahead of the TV broadcast). This way, the magazine reaches two different audiences: the core eSports fans watch it online, but they recognize it as an RTP show, making the RTP brand relevant to them, while the TV broadcast increases the credibility of eSports and reaches a wider audience.
From niche audiences to a broader reach
One of the significant challenges for eSports is widening their audience. Although the numbers are already huge, with sold-out arenas and millions watching the live streams of the main competitions, eSports need to find new audiences beyond the core players and video game fans. The issue is that the competition itself happens on a computer, and for the average viewer it is difficult to figure out what’s happening on screen (eSports are not as easy to follow as a football or basketball match).
But eSports are a new entertainment format that can’t be ignored, so event organizers, producers and broadcasters are still trying to find the right narrative to make live coverage more entertaining. Producers need to create celebrities and well-known teams and establish engaged fan bases. beActive has been doing this work, exploring new narrative and storytelling forms that serve the core fan base but also make the full experience more entertaining for a broader audience. We are bringing our expertise in the scripted and documentary world to this new era of televised video games, experimenting with innovative ways to bring video games to the TV screen.
eSports means live event television aimed at a new audience that no longer watches TV. Whether media companies choose to stream live eSports events on their OTT platforms or on linear TV, eSports are an opportunity to reach a “lost” audience. The increasing production budgets of these games and the experimentation with new narratives around eSports events will make eSports on TV (or on OTT and digital platforms) a new form of entertainment that can reach segmented but wider audiences.
For example, the coverage of this year’s “The International”, one of the biggest eSports events in the world, with a prize pool of 18.5 million dollars, was comparable to top prime-time network TV entertainment shows like “The Voice” or sports events like the “Super Bowl”. The live broadcast featured dozens of cameras, top-notch lighting and stage design, augmented reality motion graphics and VR to extend the experience. After watching this event’s broadcast, it is safe to say that eSports video production is now pushing the boundaries of live TV coverage.
Originally published at MIPTRENDS
From beginning to end, a transmedia story should be a social phenomenon, one which draws people together and unifies them through shared experiences. At present, the industry is obsessed with creating toys and applications which are too exclusive. They do not address the primary goal of storytelling—bringing individuals together by revealing some truth about the world around us.
Like the ‘Choose Your Own Adventure’ stories of the 1980s, these gimmick-driven products isolate rather than connect your viewers. Well-designed alternate reality games are popular because they immerse players in the same social experience (much the way big talent shows like American Idol or The Voice do, but on a smaller scale). The players are unified toward a common goal against a common evil.
If transmedia is to be even more successful in the future, we need to concentrate on designing experiences that are socially inclusive and have the power to bring people together through common interests and goals. This will require that we take more care in designing the path along which our readers and viewers access our stories. Transmedia producers have a tendency to create interactive experiences that are overly complex, which ultimately deters audience engagement across every available piece of content. We need to define the ‘path’ between audience access points, much like the map function in a video game, so that audience members know where they are relative to the story as a whole and where they’re going, regardless of which piece of content they’ve accessed.
Procedural dramas owe their popularity to the fact that their audience knows exactly what to expect from their format. These types of shows open with a bang: a crime is committed, and the audience is among its witnesses. As the investigators search for the killer, the audience is kept just ahead of their deductions by a steady stream of key information. We might see a confrontation the police weren’t privy to or a clue they overlooked, and in the next three to five minutes we’re encouraged to solve the mystery ourselves. If our point of view were aligned with the investigators’, the show would be tedious, and viewers would switch off.
This sort of experience design is relatively easy to achieve on television because you are restricted to a linear format. Transmedia narratives, on the other hand, are disseminated across multiple platforms. Without a proper ‘map,’ piecing together so many disparate pieces of content can become a bewildering experience. Like a two-thousand-piece jigsaw puzzle, audiences need the overall image to make sense of each individual piece. It’s crucial that we, as transmedia producers, dedicate more time to creating a fluid path so that whatever experience the audience becomes involved with they know exactly where they’re going.
To this end, our priority should always be rewarding our audience as opposed to ourselves. No matter how cool a platform or an experience may seem to you as a producer, your project will be more successful if you design content from the audience’s point of view. Always ask yourself what your audience gains from each element of your property. If they spend ten seconds or minutes or hours reading and watching and playing with your content, how are they rewarded? Keep your audience’s most basic needs at the fore of your production plan; make them laugh and cry. Thrill them. Frighten them. No matter what you do, keep your audience emotionally challenged. Put simply, better storytelling is the ultimate reward.
Audiences and consumers are now fully multi-platform. They consume and demand interactive and deep experiences, strong characters, brands and a plot that can be expanded across different media. But creating an engaging experience and building a loyal fan base is not an easy task. A well-crafted story can help you create a strong and active community.
In the last couple of years, I’ve been invited to judge a few transmedia and cross-media pitch competitions and awards ceremonies for upcoming or already-developed projects. Something that is becoming more and more common in the hundreds of proposals I’ve been reading is a tendency to use a checklist-based multi-platform approach to justify the project being transmedia. The majority of the presentations list the usual Facebook fan page, the Twitter account, the app, the fake website for the bad corporation featured in the story, and many other common content pieces that we have already seen too many times before.
Just because everyone else seems to be doing it doesn’t mean that your story needs it! The transmedia approach needs to be organic to the story you want to tell. Recently, I used the Mona Lisa metaphor to explain this problem when talking to one producer. It’s a fact that a large percentage of Louvre visitors go to the museum to see the famous Leonardo da Vinci painting and only visit the corridors from the main entrance to the Mona Lisa room and back. But the Louvre has many other corridors and exhibition rooms that are not as popular: they are seen by only a small portion of the museum’s visitors.
My advice to that producer, at the time, was: focus on your Mona Lisa, the core story, characters and elements of your project, and the “corridors” that lead to that core element. Don’t try to set up the full Louvre with its dozens of rooms and corridors, especially if your resources are limited. Down the line, if you succeed with your initial approach, you will be able to add another room or another corridor.
Since 2012, we’ve been producing and releasing our most recent series, Beat Girl (below). We started small, developing a Pinterest profile, because the story was aimed at the core audience of this new social network and was very image-oriented. From there, we organically expanded to a novel, so we could extend the images with words, and partnered with Wattpad to reach a wider audience (again, they already had a young female audience on their website).
Using audience feedback and testing, we then defined a more comprehensive multi-platform strategy that included a TV show, a web series, a feature film, a game and a more complete social media presence, always with the story and the lead character at the center.
A good pitch may help you find the core of your story
One way to test your concept and story is to prepare a pitch document. It’s a way to bring it to life, even if only for a handful of people. When you pitch, you can see how people react to the story, characters and plot. This process can really help you find the core and the strengths of your story.
Here are a few things you should cover in your pitch:
1. Pitch the core story or format (if non-scripted): What is this about? What happens in the story, what is the theme, and why does the story need to be told? Also focus on what is new and different in your project if the story is similar to an existing TV series or movie.
2. Pitch the characters: in my opinion, the characters are key in transmedia projects, as they are the ones that host the experience for the audience on the different platforms and connect all the elements. Who’s guiding us through this experience, and how will this person (or group of people) engage an audience? How will the audience relate to the lead character?
3. Why do you think people will connect with the story and the experience? Is it funny? Is it entertaining? Is the audience helping a cause? Are they helping and rooting for the lead character to achieve his or her goal? Is there a prize involved? What will the audience win if they enter this transmedia experience?
4. What is the transmedia plan? List the platforms to be used, put them on a timeline (where the experience starts and ends) and explain how the audience and story cross to other platforms. What comes first? Where is the story heading?
5. For each platform, rather than listing the features of the website, game, Facebook page, etc., it’s important to explain why the project needs that platform to engage the audience. Is it to create engagement, build a community, generate revenue or market the experience?
A good trick to help producers focus on the core and the strengths of a project, and on how it will be presented to an audience, is to design a banner or create a fake Google Ad to promote the project. This forces producers to think about how to seduce the target audience with a one-sentence pitch. If you can’t grab people’s attention and curiosity with one sentence and/or one image, it will be difficult, later, to conquer the always-busy online audiences.
My story with Final Cut Pro X is very similar to that of many other directors and editors who bought it when it came out and couldn’t understand how it worked. Why, Apple, WHY?! So, after a few (not that many) hours, FCPX went to the app cemetery, the place where installed (but unused) software rests. After learning non-linear editing with Premiere (the first 1.0 version, back in the nineties) and using the ImMix VideoCube and many other editors over the last two decades, I thought that I was too old to think different.
Like many others, I pretended that FCP7 (or “legacy”, as it’s called now) was enough for my needs. During that period, at beActive, my production company, we edited two feature films and one feature-length documentary using FCP7, while slowly trying other options for the inevitable upgrade: Media Composer or Premiere CC. But neither of them appealed to us, so we kept the legacy FCP as our NLE of choice, avoiding opening X and giving it another try.
The Eureka moment came later, in 2012, when one of the editors I worked with suggested editing a music video I had directed a few days earlier in FCPX. Very reluctantly, I accepted, mainly because this was a short-form piece, all editing and mastering could be done on the same box, and the editor seemed very comfortable with the software. Sitting next to him for a few days to give notes, I could see how fast he worked and how easy it was to move things around and try new things. The project went flawlessly, and I was intrigued by editing in X.
Later, that Christmas, I shot a video with my kids and decided to try FCPX. It was a personal project with no deadline, so nothing was at stake. I imported the 7D footage, started dragging footage to the timeline, moved things around and got frustrated. Again! This time, though, I gave it a little more time and watched a few Ripple Training tutorials to try to understand how this beast works. A couple of hours later, I couldn’t believe how fast and easy it was to assemble and trim footage. I crossed the wall and finally saw why Apple changed the NLE video-editing paradigm.
Things got clearer to me when I was sitting in the edit room supervising another feature film that was being cut in FCP7. Everything I had been doing on my personal video with one or two clicks was taking far too long in the legacy FCP. Suddenly, X started to make sense, and 7 started to feel old and outdated. After this experience, I went to investigate the evolution of X, downloaded a few tutorials and started to plan the switch. Later in 2013 we started using X for short form, and in 2014 we moved the long-form projects to X too, finally abandoning FCP7. RIP!
Harland Williams and The Stand Ups Director Nuno Bernardo
The first long-form project we used to test our new workflow, built around FCPX, Resolve and ProTools, was a documentary series of two one-hour episodes about stand-up comedy for Irish broadcaster TV3, which I directed. The Stand Ups explores the trials and tribulations of modern comedy while following five stand-up comedians in their attempt to make it big at the famous Edinburgh Comedy Festival. The documentary mixes talking heads with established comedians, fly-on-the-wall segments with the new comedians, previously broadcast TV footage and a few stand-up comedy shows shot multicam.
Robbie, Colm, Chris, Alison & Niamh
During the 5-month production we used several cameras, including a Sony FS700 (recording ProRes on an Odyssey 7Q), a Blackmagic Cinema Camera, a Blackmagic Pocket Cinema Camera, a Canon C300 and a 7D. Our frame size was 1920×1080 and we shot everything at 23.98, except the C300, which was 24 (more on that later). ProRes 422 HQ was our codec of choice, and the 7D footage was converted to ProRes using 5DtoRGB. Our DIT technician (also the assistant editor) used FCPX to log all the footage. Everything was imported into a single library using the new library model introduced in 10.1.
The Stand Ups documentary timeline in FCPX
We also followed a proxy workflow so everyone could work separately in different places using only the proxies. A hard drive with a copy of the library and the proxy footage was given to the editor, the second assistant director and myself. Whenever new footage arrived, the library and the proxy video files directory were updated.
Stringouts with metadata
The first assistant editor used keywords and the metadata features of FCPX, which allowed the editor to easily identify stories, themes and stringouts. Basically, we were doing manually what a tool like Lumberjack System now does automatically (we are testing Lumberjack and plan to use it in our upcoming documentary productions). We used multicam clips for editing the comedy shows and compound clips to organize the stringouts.
Everything was going smoothly until Apple released 10.1.2 and modified the library model. Suddenly there was no way to have a separate folder for proxies, so our proxy workflow was gone. I know: never upgrade a system during a project! By September 2014 we had synced the different cuts, versions and all the footage into one library before moving forward. During this process, we had problems importing around 30 video files (out of hundreds of files and 5 TB of footage): FCPX simply refused to relink a few of them.
We also had problems with resolution, as our offline proxy editing timelines were set to 960×540. We needed to go back and manually change the resolution of the multicam files. By the time we were able to consolidate and conform all the edits in a master library, we had lost a couple of days, but we were able to get back on track. After the 10.1.2 release, we stopped sharing libraries and importing timelines this way, and started swapping XML files between editor, director and assistants.
The whole documentary in DaVinci Resolve
We reached picture lock by November 2014, and then grading started. We were now moving the project to DaVinci Resolve. When we imported the XML from FCPX, Resolve did not detect the right files. In some segments of the edit, Resolve chose other clips or simply put black slug in place of the original video. We tried the process a few times, but the outcome was always the same.
After losing three days, we found our problem. We had imported the C300 footage from the cards using the Canon XF plug-in. FCPX converted it to ProRes 422 HQ, but the converted files were not being recognized by Resolve. On top of that, we realized that the C300 had recorded its footage at 24p while everything else was 23.98p. That was not a problem for FCPX, which handled it and kept the multicams in perfect sync at different frame rates, but Resolve was not able to match the timecode in the XML file to the correct files, which is why wrong video segments and clips appeared on the timeline.
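To see why a 24p versus 23.98p mismatch matters, here is a quick back-of-the-envelope calculation (a hypothetical Python sketch, not part of our actual workflow) of how quickly the two rates drift apart:

```python
# Drift between true 24 fps and NTSC-style 23.98 fps (24000/1001),
# computed exactly with rational arithmetic.
from fractions import Fraction

fps_c300 = Fraction(24, 1)        # Canon C300: true 24p
fps_rest = Fraction(24000, 1001)  # everything else: 23.98p

seconds = 3600  # one hour of real time

frames_c300 = fps_c300 * seconds  # 86,400 frames captured
frames_rest = fps_rest * seconds  # ~86,313.7 frames captured

drift_frames = float(frames_c300 - frames_rest)
drift_seconds = drift_frames / float(fps_rest)

print(f"Drift after one hour: {drift_frames:.1f} frames (~{drift_seconds:.1f} s)")
# roughly 86 frames, or about 3.6 seconds, per hour
```

With drift of that magnitude, any conform that trusts timecode alone will quickly point at the wrong frames, even though FCPX's automatic rate conform hides the mismatch during the edit.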
We were able to solve this problem by exporting a ProRes 422 HQ master file of the final edit and importing it into Resolve. We then manually substituted the unlinked or wrongly linked video files with sliced pieces of the ProRes master. This laborious process took us an additional two days. Luckily for us, the process of exporting the project to ProTools using X2Pro Audio Convert was painless. Everything went smoothly, as expected (this was a more proven workflow that we had been using for more than a year on our short-form projects).
Exporting the audio to ProTools using X2Pro
The exported project audio in ProTools
The final versioning and deliveries were done in Premiere CC. The main reason for this choice was that its track layout makes it easier to import the final graded master from Resolve, import the mixed audio from ProTools, add titles and check levels. From Premiere we were able to export the different final masters, with different audio versions and both texted and textless versions. That said, one of the deliveries required us to go back to FCPX. We were asked to deliver an SD 16:9 25i interlaced version of the show, converted from the 1920×1080, 23.98p master. After running different tests, we concluded that FCPX and Compressor provided the best quality for the task (the conversion out of Premiere was full of artifacts).
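For a 23.98p-to-25fps PAL delivery, one common approach is a PAL-style speed-up of roughly 4.3%, with a matching audio pitch correction. The sketch below illustrates the retiming math only; it is not necessarily what Compressor did internally, since converters can also resample frames while keeping the runtime:

```python
# Retiming factor for conforming a 23.98p master to PAL 25 fps by speed-up.
src_fps = 24000 / 1001  # 23.976... fps progressive master
dst_fps = 25.0          # PAL target rate

speedup = dst_fps / src_fps  # ~1.0427, i.e. ~4.3% faster playback

runtime_src_min = 60.0  # a hypothetical one-hour master
runtime_dst_min = runtime_src_min / speedup

print(f"Speed-up factor: {speedup:.4f}")
print(f"A {runtime_src_min:.0f}-minute master runs ~{runtime_dst_min:.1f} minutes at 25 fps")
```

The alternative, true frame-rate conversion (blending or optical-flow resampling to preserve the runtime), avoids the speed change but can introduce motion artifacts when done poorly.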
After this 6-month journey, I’m now confident that FCPX is the right tool for editing any type of long form, but especially documentaries and factual shows. The metadata facilities included make it the right tool for the job, and the ecosystem created around it makes projects more manageable. Tools like Producer’s Best Friend allowed us to create a music cue sheet in less than 5 minutes. Finding the right shots and editing is far faster than with other NLEs.
This first journey was not easy and was full of small bumps and problems, but I’m convinced that we made the right decision. And with the lessons learnt, we now have the best workflow and the right tools for the job.
Article originally posted on FCP.CO on April 7, 2015.