I've run into this issue a few times recently, so I thought I would write a post about it. When I am creating Watchout shows the media can be delivered in various formats: audio can be .wav or .mp3, images can be JPEGs, PNGs or Photoshop files, and video can be MPEG-1 or 2, Windows Media files or QuickTime movies.
Now of course you want the quality of the show to be perfect, so when you are shooting and editing you work with the highest possible quality: RAW images, uncompressed QuickTime movies or files encoded with lossless codecs.
These are production formats, and as such they are suited to editing and distribution without any loss of quality. However, they are not ideal for final playback of the material. I have had a few occasions where clients have delivered production-quality content as the final media: uncompressed Full HD QuickTime movies or print-resolution Photoshop files.
These files are huge (uncompressed HD can be up to 200MB per second; that's 12 gigabytes per minute!) and although they look fantastic they require extremely fast hardware to play them in real time. You would need a RAID array on every playback machine.
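As a sanity check, the arithmetic behind those numbers is easy to reproduce. This little sketch assumes 8-bit RGB at 30fps (my assumptions; the exact figures depend on the codec's bit depth and chroma sampling):

```python
# Rough data rate for uncompressed Full HD video.
# Assumed: 8-bit RGB (3 bytes per pixel), 30 fps.
width, height = 1920, 1080
bytes_per_pixel = 3
fps = 30

bytes_per_frame = width * height * bytes_per_pixel       # 6,220,800 bytes
bytes_per_second = bytes_per_frame * fps                 # ~187 MB/s
gigabytes_per_minute = bytes_per_second * 60 / 1e9       # ~11.2 GB/min

print(f"{bytes_per_second / 1e6:.0f} MB/s, {gigabytes_per_minute:.1f} GB per minute")
```

10-bit formats push this even higher, which is why the 200MB per second figure is plausible as a worst case.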
Once the editing is completed, the final delivery format for video or graphics can be a compressed format. As long as the settings are correct, the picture quality should still be excellent.
For HD video I normally recommend either MPEG-2 at a bit rate of 24Mbps or QuickTime using the H.264 codec at around 16Mbps. The MPEG-2 files will be slightly larger (although nowhere near the size of uncompressed) but they require less intensive processing for decompression and so will play back on lower-specification hardware. H.264 is a great codec for distribution as it provides extremely efficient compression whilst maintaining very high image quality. (H.264 is the codec used for Blu-ray discs.) However, it does require more processing power, and so it should only be used when the display computers are sufficiently powerful. Some graphics cards have H.264 acceleration in hardware, so this might affect your choice.
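To put those bit rates in perspective, here is a quick back-of-the-envelope comparison (video stream only; audio and container overhead are ignored):

```python
def megabytes_per_minute(bit_rate_mbps):
    """Convert a video bit rate in Mbps to file size in MB per minute."""
    return bit_rate_mbps * 1e6 * 60 / 8 / 1e6

mpeg2 = megabytes_per_minute(24)   # MPEG-2 at 24Mbps
h264 = megabytes_per_minute(16)    # H.264 at 16Mbps

print(f"MPEG-2: {mpeg2:.0f} MB/min, H.264: {h264:.0f} MB/min")
```

So roughly 180MB and 120MB per minute respectively, against around 12GB per minute for uncompressed: a huge saving either way.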
It is a good idea to do a test render of a small part (choose something complex) and experiment with codec settings. Check that it looks good on your computer, but also test it on the playback system.
Graphics files should be RGB, not CMYK, and should not be too much larger in resolution than the display. If the display computer has to scale a huge image down to the resolution of the projector the performance will suffer.
I normally use JPEGs at the exact resolution of the screen for background images, and PNGs with alpha for logos and titles that will be overlaid on other content.
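A pre-flight check along these lines can catch oversized graphics before show time. This is my own sketch, not a Watchout feature, and the 1.2 tolerance is an arbitrary choice:

```python
def needs_resize(image_size, display_size, tolerance=1.2):
    """Return True if an image is enough larger than the display
    that it should be scaled down before being added to the show."""
    img_w, img_h = image_size
    disp_w, disp_h = display_size
    return img_w > disp_w * tolerance or img_h > disp_h * tolerance

print(needs_resize((4000, 3000), (1920, 1080)))   # print-resolution file: True
print(needs_resize((1920, 1080), (1920, 1080)))   # exact screen size: False
```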
This also holds true for systems other than Watchout. There are many different types of media players and they all have their preferred playback formats. If somebody asks you for uncompressed video, ask them why. Most media players will accept compressed formats; you just need to make sure you use the right settings.
At a recent show the client spotted a typo in a text animation which was part of an HD video. The production company works with After Effects and Final Cut Pro on a Mac, and they had to make the change and deliver a new version within four hours. They rendered a new version and then drove 100km to deliver a hard drive with an uncompressed version of the video. I have Sony Vegas Pro on my laptop, so I rendered their uncompressed version to MPEG-2 and loaded it into the show. The quality was identical but the file size was 20% of the original. We had high-speed internet available at the venue, so they could have saved themselves the trip :-)
So remember, size isn't everything. Huge files are OK for post production, but they can bring a playback computer to its knees. They are also a pain to move around, as the file sizes make FTP or other transfer systems too slow.
Find out which formats will work on the desired playback system and create the media with the best balance between quality and size.
Thanks for reading and I appreciate any questions or comments. Neil
Wednesday, 27 April 2011
Monday, 25 April 2011
Which software is used to edit Movies?
I recently replied to a question from Dheeraj Gautam on the Sony Vegas group on LinkedIn, and I also posted it here. This was his response:
Thanks for such an informative reply Neil, much appreciated. Now I have another question. I have seen people using Final Cut Pro for video editing, but what about big-budget Hollywood or Bollywood films? Films like Terminator, Avatar and 127 Hours: are they also edited on Final Cut Pro, or are they using different software like Autodesk Smoke?
You're welcome Dheeraj. To answer your question I need to explain a bit of history.
Originally all films were edited by physically cutting the film and then viewing the edited sequence on a Moviola or a Steenbeck. I'm sure there are still some editors working like this somewhere.
In the early '90s non-linear editing arrived, giving the editor enormous freedom to experiment with the material and create multiple versions of sequences until they arrived at the final cut (pun intended ;-)
So you have editors using Avid, Lightworks, Final Cut Pro and other systems to edit feature films. The manufacturers were keen to promote the fact that their software was being used in Hollywood, because if it's good enough for them it must be good enough for you :-) They used to have lists of films on their websites (here is a list of films edited on Lightworks and Final Cut Pro) but films would often appear on multiple lists. How could a film be edited on both Avid and Lightworks?
The answer is that film editing is a collaborative process. The editor cutting the film may be using Lightworks but the editor putting the effects sequences together could be using Avid so the film would end up credited to both systems.
The other issue is that using non-linear systems meant eventually going back to the film negative. So all these systems needed extensive metadata management to be able to track keycodes and timecodes and assemble the final edit from the original material. See Apple's Cinema Tools.
Now we have electronic cameras like the RED and Silicon Imaging, editing systems capable of working with the footage at full resolution and digital projection, so it is possible to shoot, edit and project the movie without having to leave the digital environment.
Sony Vegas Pro can work at resolutions of up to 4K (4096 pixels) and has been used to edit films like Paranormal Activity and Deuce of Spades. It works well with footage from DSLR cameras like the Canon 5D Mk II and 7D, and the new Sony PMW-F3 and NEX-FS100U. If you are on a budget and can complete all your work in Vegas, that could be one way to go.
Autodesk Smoke is a finishing system more than an editing system. It is often used for the final assembly of digitally shot films before transferring them to film (the "Digital Intermediate" process). It can work at up to 8K (8192 pixels) resolution and has powerful grading and compositing capabilities. If you want to assemble a 90-minute feature at uncompressed 8K you'll need quite a bit of storage :-)
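Just how much storage? Here is a rough estimate under my own assumptions (8192x4320 frames, 10-bit RGB, 24fps); the real figure depends on the actual frame size and bit depth:

```python
# Storage for a 90-minute uncompressed 8K feature.
width, height = 8192, 4320        # assumed 8K frame size
bytes_per_pixel = 3 * 10 / 8      # three 10-bit channels
fps = 24
minutes = 90

total_bytes = width * height * bytes_per_pixel * fps * minutes * 60
print(f"{total_bytes / 1e12:.1f} TB")   # around 17 TB
```

Around 17TB for the picture alone, before any intermediate renders or backups.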
In Hollywood today the software is the choice of the editor. Walter Murch famously cut The English Patient on Avid, then cut Cold Mountain on Final Cut Pro and now has apparently gone back to Avid. Thelma Schoonmaker cuts all of Martin Scorsese's movies on Lightworks. The late Sally Menke cut all of Quentin Tarantino's films on Lightworks. The Coen Brothers cut all their movies on Final Cut Pro, under the alias of Roderick Jaynes.
I suppose at the end of the day the real answer is you can use whatever you like as long as you have a workflow that works for your project. If you have the ideas and the passion to create your movie it doesn't matter which software you use.
Thursday, 21 April 2011
3D or not 3D?
It seems the whole world has gone 3D crazy over the past year, with more and more 3D TVs for sale and now a whole crop of 3D camcorders coming out.
The problem with 3D, of course, is how you edit and deliver your 3D footage.
Sony Vegas Pro introduced the ability to edit stereoscopic 3D in version 10 of the software, and the latest update extends its capability to enable burning 3D Blu-ray discs direct from the timeline.
Sony are also about to release the HDR-TD10, a full HD 3D camcorder which basically has two HD camcorders in one, at a price of $1,499. The camera records in MVC format (H.264 with two video streams) which can be edited in Vegas 10d, or you can just let the camera do the edit for you.
Seriously though, do I need to be shooting and editing 3D? Well, it's not really my decision. If my clients ask for 3D I need to be able to provide it.
It is of course very easy to shoot 3D but still produce 2D masters, by just ignoring one of the video streams. So by shooting 3D now I am future-proofing my content. If things carry on the way they are, soon everything will be expected in 3D. So I need to be able to shoot 3D, edit 3D, deliver 3D and watch 3D.
It's that last one that has been one of the main hurdles. 3D TVs are expensive, and the whole glasses thing is not something that viewers take to. The majority of 3D TVs to date use "active" 3D, where the two streams of video are displayed sequentially and active glasses block the left and right eyes alternately to create the 3D effect. The problem is those glasses are heavy, expensive and they need batteries, adding to the hassle of watching 3D.
In 3D cinemas the glasses are lighter and need no batteries, so how do they work? These use "passive" 3D. The left- and right-eye images are displayed using circularly polarised filters, one clockwise and one anti-clockwise. The glasses have circularly polarised lenses, so each eye only sees the correct image. Until recently it was impossible to use this method for TVs, but now there is a system with a polarising filter in front of the screen, with alternate lines of the TV image polarised either clockwise or anti-clockwise.
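The line-interleave idea is simple enough to sketch in a few lines of toy code (my own illustration; rows of a list stand in for the alternately polarised lines of the panel):

```python
def interleave(left, right):
    """Weave two half-height eye images into one display frame:
    even lines carry the left eye, odd lines the right eye."""
    frame = []
    for left_row, right_row in zip(left, right):
        frame.append(left_row)    # clockwise-polarised line
        frame.append(right_row)   # anti-clockwise-polarised line
    return frame

left = [["L0"], ["L1"]]
right = [["R0"], ["R1"]]
print(interleave(left, right))   # [['L0'], ['R0'], ['L1'], ['R1']]
```

The trade-off is that each eye only sees half the vertical resolution of the panel.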
Sony have used this for some new professional monitors, but it looks like LG have beaten them to it for the home with their Cinema 3D range. These TVs come with seven pairs of passive 3D glasses, and new ones can be ordered for $10 each, compared to active glasses at up to $150 per pair!
Passive 3D also has a wider viewing angle, and viewers report fewer headaches when watching passive 3D than when watching active 3D.
Of course the holy grail of 3D is "glasses-free" 3D sets. There are some sets that use a micro-lenticular or parallax-barrier layer in front of the screen to give a 3D effect without glasses, but it is very dependent on the viewing angle and not very clear. Useful as an eye-catching display in a department store, maybe, but not suitable for watching movies at home.
So for now my money is on the passive 3D systems. I'm sure other manufacturers will be bringing out their own passive 3D sets soon and the active TV models will follow HD-Ready sets into oblivion.
Thanks for reading, Neil
PS: Sony's blurb for the HDR-TD10 promises: "Why spend hours editing your movies when you can let your camcorder do it for you? Highlight Playback identifies and compiles key scenes into a short, entertaining movie complete with music and transitions." I'm hoping to get my hands on one of these cameras soon to shoot some test footage, so I'll be sure to give that a try :-)
Wednesday, 20 April 2011
YouTube announces change to encoding format
YouTube Blog: Mmm mmm good - YouTube videos now served in WebM
All new videos uploaded to YouTube will be encoded to WebM, an open, royalty-free media file format for the web, based on On2's VP8 codec for video and Vorbis for audio. It uses the Matroska file structure.
Supported web browsers are Firefox 4 and later, Opera 10.6 and later, Chrome 6 and later, and IE 9 (with plug-ins). So no Safari?
So what does this mean to the average YouTube user? Well... nothing. YouTube will still continue to work in the same way it always has.
In the future it means that browsers will have to support WebM and it removes any licensing issues that YouTube might have had with other codecs. I think it is also going to make it easier for YouTube to support live streaming of events.
For professional users who want to encode their material for YouTube while maintaining the highest possible quality, it means we need to find an encoding solution that can output WebM. At the moment I can encode to VP8 for video and to Ogg Vorbis for audio, but I don't have a tool to combine them into a WebM file. There are some tools mentioned on the WebM site here, but I will need to experiment.
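One route worth trying (an assumption on my part, not something I have verified for this workflow yet) is an ffmpeg build compiled with libvpx and libvorbis, which can encode and mux to WebM in one step. The file names below are hypothetical; this snippet just assembles the command line:

```python
# Build an ffmpeg command to encode VP8 + Vorbis into a WebM file.
# Assumes an ffmpeg build with libvpx and libvorbis support.
source = "master.avi"     # hypothetical input file
output = "upload.webm"

command = [
    "ffmpeg", "-i", source,
    "-c:v", "libvpx", "-b:v", "4M",       # VP8 video at 4Mbps
    "-c:a", "libvorbis", "-b:a", "192k",  # Vorbis audio
    output,
]
print(" ".join(command))
```

From there it could be run with subprocess, or just pasted into a terminal.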
As usual any comments gratefully received.
Thanks for reading, Neil
Tuesday, 19 April 2011
What is the difference between Sony Vegas Pro 32-bit and Sony Vegas Pro 64-bit?
Dheeraj Gautam asked this question on the Sony Vegas user group on LinkedIn. I thought the answer might be useful to others, so I am posting it here too.
32 more bits? Sorry couldn't resist :-)
Vegas Pro 64-bit is a fully 64-bit application. What this means is that on a 64-bit operating system like Windows 7 64-bit, Vegas can use far more RAM.
On a 32-bit system the maximum amount of RAM available is 4GB. With HD footage and larger and larger files to deal with, this is often not enough, and so the computer has to keep swapping data from memory to the hard disk, causing slow performance and sometimes crashes.
On a 64-bit system you can address up to 192GB, which provides far more room for data to be kept in RAM. (Theoretically up to 16TB, but Windows limits this to 192GB :-)
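The numbers fall straight out of the address width. A quick check (the 16TB figure corresponds to 44-bit addressing, which as far as I know is what 64-bit Windows actually used at the time):

```python
# Address space limits as powers of two.
gb = 2**30
tb = 2**40

print(2**32 // gb)   # 4  -> a 32-bit process can address 4GB
print(2**44 // tb)   # 16 -> 44-bit addressing gives 16TB
```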
The downside is that your plug-ins also need to be written for 64-bit in order to use them in 64-bit Vegas. There are new 64-bit plug-ins coming out every day, and the OFX architecture that Sony have adopted means we will get access to many more plug-ins.
I personally run Vegas on a 64-bit system (with 8GB of RAM), but I have both the 32- and 64-bit versions installed. I can work on a project in the 64-bit version with all the performance benefits that brings, and then if I need to use a 32-bit plug-in I can open the same project in the 32-bit version of Vegas. Best of both worlds, and one of the reasons I love Vegas.
Any comments or questions gratefully received. Neil
Monday, 18 April 2011
FCPX wins Videomaker's 'Best Editing Software' award and it hasn't even been released yet!
It's even more amazing how many of the features shown at the sneak peek (64-bit, native formats, drag audio fades, etc.) have been in Sony Vegas Pro since version 8, which was released two years ago (at about the same time as FCP7).
I have FCP7, but I find the interface clunky and old-fashioned, and it is very inflexible in letting me work the way I want to.
I am interested to see what improvements Apple have made but I think I'll wait until I have had a chance to use it before I hand out any awards. :-)