Workflow for: SKY CAPTAIN AND THE WORLD OF TOMORROW
by Sabrina Plisco, A.C.E.
Final Cut Pro v.3
May 2002-July 2004
Keywords: Final Cut Pro, Macintosh G4, Kona HD card, Photo JPEG, Maya, After Effects, DVD Studio Pro, Automatic Duck, NUCODA, Digital Intermediate Suite
1) Preproduction: The World of Tomorrow (WOT) studio artists, located in Van Nuys, CA, began by creating storyboards like a traditional animated movie. These were cut together in a 24 fps project along with the audio from a table read-through with the cast. The director chose to build the studio primarily with Macintosh computers, so editorial was set up with three G4s and Final Cut Pro 3. This was state of the art at the time in summer of 2002!
The approach of this movie was to create every element of every single shot in the computer except for the actors and the items they touched. So a strong previsualized cut needed to be created to prepare for shooting with the actors.
As the number of artists grew, the WOT studio began to create animatics. These were simple three-dimensional Maya animations with look-alikes of our actors, which we called “puppets”. We used this cut to block scenes and to create a shot list of camera angles that would be used to plan the actual production shoot. Every single shot was given a CG number which would be tracked throughout the entire process. There were over 4000 animatics created, but only 2100 shots ended up in the movie.
In some instances, shooting tests were done with actor doubles to test camera lenses. So instead of animatics, we would use these test angles to represent certain shots within a scene. By the time the show went into production, there was a complete edited version of the movie with temp effects and music, made up of a combination of storyboards, animatics, and test shots. This was used as a guide for production prep and was also most useful to share with the actors before they shot a scene. The actors were only shot against blue screen, so watching the animatic guide of the movie was the only way they could understand what was happening around them.
2) Production: (London) The animatics were designed from a computer-based 3D set with a “grid pattern” that looked like a topographical map.
Remember the game Battleship? I was told to think of the marking system of the animatic grid like the Battleship game. The actual blue screen stages were then marked with tracking dots which denoted the same marking system. So if an animatic puppet walked from B2 to J7, then the actor on stage would be instructed to walk to those same corresponding dots. Because this was so crucial to the way this movie was being made, a switcher was used to comp the live actors over an animatic shot to confirm their accuracy.
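To make the grid idea concrete, here is a minimal Python sketch; the labels, function name, and one-metre grid spacing are purely illustrative, not the production's actual layout or tools.

```python
# Hypothetical sketch of the Battleship-style stage grid: map a label like "B2"
# to a physical mark position, assuming (for illustration) a 1-metre grid spacing.
def grid_to_position(label, spacing_m=1.0):
    """Convert a grid label (letter = row, number = column) to (x, y) in metres."""
    row = ord(label[0].upper()) - ord("A")   # "B" -> row 1
    col = int(label[1:]) - 1                 # "2" -> column 1
    return (col * spacing_m, row * spacing_m)

# An actor directed to walk from B2 to J7 moves between these two tracking dots:
print(grid_to_position("B2"))   # (1.0, 1.0)
print(grid_to_position("J7"))   # (6.0, 9.0)
```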
To accomplish this, editorial rendered out each animatic shot into an M-JPEG A file and put them onto a G4 workstation for use on the principal photography blue screen stage. The G4 was equipped with an Aurora Igniter card, which allowed analog video to be sent to a switcher via QuickTime (QT). On the live action blue screen stage, a feed was taken from the HD camera and converted via an AJA down-converter into the switcher, which comped the live action over the animatics.
Because this movie was one of the first of its kind, being shot on HD and completely digital until the film out, the director chose to shoot in true 24p so there would be no problem in the translation back to film. So on set, the HD camera recorded at true 24p without sound, so there would be no hindrance of cables. But there was a second HD backup deck rolling, and sound was routed to this deck as well as to a DAT backup. Each was stamped with time-of-day timecode from the smart slate.
3) Post Production: (Van Nuys, CA) The HD backup tapes with sound were shipped from London to editorial in Van Nuys. We had our own HD deck, so we prepped our own dailies. We were our own post house!
The full length of the HD tapes was captured via a Kona HD card using Standard Definition (SD) Photo JPEG. Each take was cut up and rendered as a self-contained QT movie. One of our stumbling blocks was that we found the HD deck and FCP to be buggy when trying to bring in timecode as we captured, so we had to manually assign timecode to each QT take. Using the smart slate on the take, a visual timecode burn-in was applied, and the clips were then rendered out again and sorted for me to use on my FCP SD workstation. This media was used to create the master offline cut.
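To show the kind of bookkeeping this manual timecode assignment involved, here is a minimal Python sketch of converting a smart-slate timecode to an absolute frame count at true 24 fps and back; the helper names and example value are illustrative, not from the tools we actually used.

```python
# Minimal sketch: time-of-day timecode <-> absolute frames at true 24 fps (non-drop),
# the arithmetic behind manually assigning a start timecode to each QT take.
FPS = 24

def tc_to_frames(tc, fps=FPS):
    """'HH:MM:SS:FF' -> absolute frame count since midnight."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames, fps=FPS):
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# e.g. a slate reading 14:23:07:12 on the first frame of a take:
start_frame = tc_to_frames("14:23:07:12")
print(frames_to_tc(start_frame))  # 14:23:07:12
```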
For logging purposes, batch lists of the takes were modified and imported into FileMaker Pro.
We had to start locking sections of the movie very quickly after principal photography was completed so that the VFX assembly line could begin. Once a sequence was locked, the SD Photo JPEG material was used as a guide for upresing the takes to HD. Using the timecode from the SD clips, with only 12-frame handles, a batch capture was performed on the HD station. The material was captured at full resolution of 1920x1080, 8-bit, using the Kona HD card and the Blackmagic codec. The HD clips were then ready to deliver to the compositing department along with a lineup sheet.
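In frame terms, the handles simply pad each offline in/out point by 12 frames before the HD batch capture. A small illustrative sketch follows; the frame numbers are made up, and a real capture list would first convert the SD timecode to frames as in the earlier sketch.

```python
# Sketch of the 12-frame handle padding used to build the HD batch-capture list
# (illustrative frame numbers at true 24 fps; not output from the actual tools).
HANDLE = 12  # extra frames captured on each side of the range used in the cut

def with_handles(in_frame, out_frame, handle=HANDLE):
    """Pad the offline in/out frame numbers so the HD capture carries handles."""
    return in_frame - handle, out_frame + handle

# A shot using 106 frames of a take is captured as 130 frames, starting 12 early:
cap_in, cap_out = with_handles(1_242_192, 1_242_297)
print(cap_in, cap_out, cap_out - cap_in + 1)  # 1242180 1242309 130
```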
The VFX department took the blue screen HD elements, keyed/comped them, and created image files, mainly using After Effects. But as licenses were obtained for Shake, many artists transitioned to that application.
As the first composited material started to be created, an HD review station became necessary to study each shot. So the timeline from the master SD offline cut was used to conform an 8-bit HD version. As shots were completed or updated, they would be added to both the SD and HD versions of the cut. To do this, VFX would deliver both Photo JPEG and 8-bit HD versions of every comp to editorial.
Essentially, this allowed the director and me to continually work on the creative edit in SD while the HD conformed edit was used for critical technical assessment.
4) Previews and screenings: Our early screenings and previews were done in SD from DVDs burned in DVD Studio Pro. But as more and more of the show was completed in HD, we would output to HD tape stock. If there were shots missing from the HD timeline, we would “bump up” the SD material to 1920x1080 to complete the HD timeline for output.
5) Sound/Music delivery: Since we were a completely digital show which originated in true 24p (not 23.98!), creating the elements for sound and music delivery proved to be a challenge. We had to export our FCP timelines into Photo JPEG sequences at 24 fps, splitting audio as requested. We then had to take each sequence into After Effects and create a pulled-down 23.976 QT movie. This is the rate that audio must play at to be in sync with film! The 23.976 movies were then rendered out to 29.97 in the M-JPEG A codec. All this in order to get to videotape speed! So we would take this movie from the G4 workstation via the Aurora Igniter card and output to Beta SP tape for a tape delivery for sound and music.
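The speed relationships in that chain all come from the NTSC 1000/1001 factor; here is a small worked Python sketch of the numbers (the 90-minute runtime is just an illustrative figure, not the show's length):

```python
# Worked sketch of the rate chain above: true 24p is slowed by the NTSC 1000/1001
# factor to 23.976, and 2:3 pulldown then spreads those frames over 29.97 video.
true_rate   = 24.0
ntsc_factor = 1000 / 1001
pulled_down = true_rate * ntsc_factor     # ~23.976 fps
video_rate  = pulled_down * 30 / 24       # ~29.97 fps after 2:3 pulldown

# The 0.1% slowdown means a 90-minute cut runs about 5.4 seconds longer at 23.976:
runtime_24   = 90 * 60                    # seconds at true 24 fps
runtime_2398 = runtime_24 / ntsc_factor   # seconds once slowed for audio sync
print(round(pulled_down, 3), round(video_rate, 2), round(runtime_2398 - runtime_24, 1))
# -> 23.976 29.97 5.4
```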
The only other thing we discovered was that the OMFs were flaky in FCP 3. So we made copies of our FCP timelines from FCP 3, brought them into FCP 4, and exported the OMFs from there.
6) Conforming for film out: As shots were finaled, the vendors created 16-bit SGI files of each composite. Opticals like wipes, dissolves, and multi-layering needed to be rendered into their own single 16-bit SGI files. So Automatic Duck was used to translate the optical counts from Final Cut Pro into After Effects, where a 16-bit SGI file could be created.
Once an entire sequence was visually finalized, it was approved to have the 16-bit SGI files of each shot created by the compositing team. These files would then be conformed to the offline master cut by the in-house conform team. This process was one of the most painless in our workflow. From Final Cut, a CMX 3600 EDL was created along with a JPEG image sequence and given to the conform team. They imported these into a NUCODA system, a PC-based station that is able to play back and work with 16-bit SGI files. They recreated the offline cut in their timeline and were able to export entire sequences to be given to EFILM for the color timing in the Digital Intermediate Suite. To double-check the process and the edit, they would create a JPEG image sequence that was rendered out via After Effects into a Photo JPEG QT movie and laid over the offline cut. It was then possible to check the location of cuts and frame ranges that were to be exported to film.
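For context on what the conform team was reading, a CMX 3600 event is a single line carrying an event number, reel, track, transition, and four timecodes (source in/out, record in/out). Here is a minimal Python sketch of pulling the frame ranges out of one such line; the event itself is a made-up example, not taken from the actual EDL.

```python
# Sketch: recover source/record frame ranges from one CMX 3600 EDL event line
# (hypothetical example event; 24 fps non-drop timecode).
def tc_to_frames(tc, fps=24):
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def parse_event(line):
    fields = line.split()
    src_in, src_out, rec_in, rec_out = fields[-4:]   # the four trailing timecodes
    return {
        "event": fields[0],
        "reel": fields[1],
        "source": (tc_to_frames(src_in), tc_to_frames(src_out)),
        "record": (tc_to_frames(rec_in), tc_to_frames(rec_out)),
    }

event = "001  CG1234   V   C        01:00:10:00 01:00:14:12 01:02:00:00 01:02:04:12"
print(parse_event(event))
```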
Once the color timing was complete, the digital files were converted to 35mm film!
Commentary:
I would have to say I enjoyed working on FCP and would do it again. I know that there have been advances in networking capabilities since we set up our shop in 2002. This was one of our biggest obstacles: we had a hard time linking up multiple systems to work off the same media and keeping them stable. In the end we managed, but I look forward to a much more reliable networking setup.
I also think some software advances have happened since I began, which would only give more power to FCP. But my biggest complaint was the trim mode. It was a bit clunky and limiting and took many more steps than necessary. Hopefully this is improving with the software updates.
I also loved the ease of working in multiple levels of video, and the FX tools were a tremendous advantage on this show. To me this is where the system shines.
Equipment:
Editor/assistant workstations
(3) 1 GHz Dual Mac G4s with 2 GB of RAM and an internal RAID with 200 GB of extra storage, along with the 70 GB startup drive.
Network and storage
We set up a Linux-based file server with 2 TB of storage for the media, which the three machines shared. FCP project files were kept on the main system, as they can’t be saved on a non-Mac-formatted drive.
Auxiliary station for DVD burning
Standard Dual 2 GHz Mac G5 with 2 GB RAM. No extra storage.
Used Compressor for MPEG-2 compression and A.Pack for Dolby AC3 audio compression for DVD Studio Pro.
VFX HD editing stations
2 GHz Dual G5s with GB RAM, with a fiber card attached to an Apple Xserve RAID with 3 TB of storage. Included Kona HD cards and a DeckLink Pro HD card.