Canon h264, Kdenlive and Avisynth workflow? (question for Yellow)

I'm pulling my hair out over Canon h264 MOV files. I like Kdenlive for its use of proxy clips as well as some nice filters and color correction tools, but I still can't get it to play even proxy clips in something like real time (possibly because I'm running it on Ubuntu through Virtualbox on a Windows 7 machine).

So I'm going to try another route: assemble a working copy and do basic color correction in Avisynth, then do effects and color grading in Kdenlive.

I'm making the wild assumption that you are the same Yellow as in the Doom9 forum. So here's my question: QTInput gives a beautiful image for the most part, but then there are some clips that it just blocks out in complete green. I have no idea what they are, and changing different settings on QTInput didn't seem to help. I saw you (or the Doom9 Yellow_) warning about QTInput here: http://forum.doom9.org/archive/index.php/t-162183.html and elsewhere, but FFMpegSource2 seems to bring up weird artefacts in the image.

Could you tell me what settings you use to get your Canon files to behave with AviSynth? (I'm using it with VirtualDub, in case that's important)

This whole Quicktime business ("but it works fine on iMovie on my old Mac" they say) is driving me crazy and making me wonder if I could/should have bought a different camera.

Many thanks in advance. I really could use some guidance!

hi

For proxies I use either of these:

720P if I'm just editing.

-f avi -acodec copy -vcodec mjpeg -s 1280x720 -b 6000k

Or smaller at 640x360 if I'm adding any effects, but I still don't expect realtime playback on my machines. Also, kdenlive has no GPU acceleration unless VDPAU is enabled in the build, and even then I'm not sure how useful it is.

-f avi -acodec copy -vcodec mjpeg -s 640x360 -b 5000k

Or, recently, ProRes Proxy, but this only seems to work with kdenlive built from git using the build script, which I find extremely useful.

Again 720P

-f mov -acodec copy -vcodec prores -profile 0 -s 1280x720

And smaller at 640x360

-f mov -acodec copy -vcodec prores -profile 0 -s 640x360

The 640x360 ProRes Proxy seems very good.

If you're testing out various proxy profiles, be sure to delete any previous proxies that have been built for the clip(s).
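If you'd rather pre-build the proxies in a batch yourself instead of letting kdenlive do it, here's a rough sketch in Python. The helper names and the `.proxy.avi` naming scheme are my own invention; the ffmpeg options just mirror the 720P MJPEG profile above:

```python
import subprocess
from pathlib import Path

def make_proxy_cmd(src, dst, size="1280x720", bitrate="6000k"):
    """Build an ffmpeg argv matching the MJPEG proxy profile above."""
    return [
        "ffmpeg", "-i", str(src),
        "-f", "avi",
        "-acodec", "copy",
        "-vcodec", "mjpeg",
        "-s", size,
        "-b", bitrate,
        str(dst),
    ]

def batch_proxies(folder):
    """Transcode every MOV in the folder to an AVI proxy next to it."""
    for src in Path(folder).glob("*.MOV"):
        dst = src.with_suffix(".proxy.avi")
        subprocess.run(make_proxy_cmd(src, dst), check=True)
```

Left overnight on a folder of clips, this gives you proxies without tying up the editor while it builds them.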

Although I have a number of machines, the one I usually end up using is a 64-bit Sempron with 4GB RAM and an entry-level NVidia 7300GT GPU, and it's fine for editing, with proxies of course. :-)

But in recent years I haven't had time to do much grading; when I do any, it's basic, and I usually end up using AviSynth with YCC plugins like SmoothAdjust (32-bit float, working on 16-bit data). Generally, though, I enjoy using Canon Picture Styles to get the type of look I like in camera. So I'm not sure how close to realtime playback kdenlive gets with effects overlaid; most effects are currently 8-bit precision as well, I think.

Yes, I am one and the same, peppered over Doom9 and elsewhere. :-)

I use AviSynth with AVSPmod under Wine (I like the sliders), and I'm eager to start using AVXEdit, the Linux port of AviSynth that's in progress at the moment. I use FFmpegSource2, version 2.15 or recently 2.17. Version 2.16 and some svn builds after it are buggy and squeeze the luma levels. Also, I only use AviSynth for YCbCr adjustments, but if I were to use RGB plugins it would be ConvertToRGB(matrix="PC.601"). I've not seen any artifacts appearing with FFmpegSource2, but then I don't use MT builds; you may well get horrid output and playback with those, unless that bug has been fixed by now.

I stopped using VirtualDub because of its defaulting to RGB processing: unless you remember to set it to 'Fast Recompress' rather than 'Full Processing', it will once again squeeze the luma converting YCC to RGB to YCC. :-) So I much prefer AVSPmod, off the thread at Doom9. There we just have to remember to set the badly named YUV->RGB preview to PC.601 to suit our source, but that only affects the preview, not the video.

The problem with QTInput is that it scales the luma, and tateu, the author, has not been able to get QTInput to output full-range YV12/YUY2 when the full range flag is set 'on' in the source. I've explained the full range flag on my blog here:

http://blendervse.wordpress.com/2012/04/02/waiving-the-fullrange-flag/

So to even get an edit out of kdenlive or other ffmpeg-based NLEs, the 'fullrange' flag really needs switching off. Forcing full luma on an ffmpeg-assigned yuvj420p pixel format source (i.e. Canon and Nikon DSLR h264 with the flag set 'on') will be detrimental and will still encode out with luma at 16 - 235. So getting a native Canon/Nikon file through kdenlive, using for example kdenlive's lossless h264 render profile, for input into AviSynth requires remuxing the camera files with a VUI-options-aware tool, i.e. a special build of MP4Box, to switch the flag off.

Even Premiere Pro CS5 squeezes the luma for Canon/Nikon DSLRs because of the flag, for all its talk of being set up to handle DSLR video sources in a wide 32-bit float RGB color space. It assumes full-range-flagged video carries RGB levels, so to maintain those levels in a YCbCr source it scales them pro rata into 16 - 235. Totally stupid.

Look at a Canon C300 source and the levels are full range there (well, the top end is, as it's LOG, i.e. lifted shadows), but then so is Canon h264 with the Cinestyle Picture Style, still flagged full range. C300 files go into CS5 unsqueezed and can be pulled into the 16 - 235 range with 32-bit precision adjustments, whereas Canon h264 gets squeezed and reduced in quality over the same levels range. :-) The telltale 0% to 100% IRE waveform gives that away.
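To put numbers on that squeeze: the pro-rata mapping from full-range 0 - 255 luma into studio-range 16 - 235 is just a linear scale. A sketch (the function name is mine):

```python
def squeeze_luma(y):
    """Scale a full-range luma value (0-255) into studio range (16-235)."""
    return round(16 + y * 219 / 255)
```

Note the rounding: 256 input levels get packed into 220 output levels, so neighbouring values collapse together and a later re-expand can't recover them, which is the quality loss being complained about above.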

Hope some of that helps.

Wow. Thank you for such a clear and information-rich response. I've spent the past few months trying out practically every option within my price range (which excludes CS5.5), and it was only recently that I realized that most of the helpful information that I was encountering in forums and even blog comments was by one and the same person! Honestly, as big as the internet is, I just find that crazy. :) Thank you for your generosity in sharing your experiences and discoveries.

I read your posts in Doom9 and BlenderVSE about luma squeezing and the full range flag, and I find them fascinating. This seems to be a recurring problem that everyone seems to be experiencing (if I'm not mistaken, I first read about it as a "gamma shift" in MPEGStreamClip and 5DtoRGB), so it flummoxes me why the only workaround is with a patched version of MP4Box. Just plain weird, isn't it? Anyway, for the moment I'd settle for being able just to piece my clips together, pre-process squeeze and all.

You've given me a lot of things to think about and try here. I'll download AVSPmod now and also try out your Kdenlive proxy recipes (up until now I've only been using the default -vcodec), and I'll also see if I can find out what's up with FFMpegSource2 (although I'm pretty sure I have 2.17).

Thank you again for all your suggestions! (though I'm sure I'll be back soon with more frustration and confusion) :)

P.S. One thing I've discovered that might be interesting for you to take a look at: the DAW Reaper uses FFmpeg to handle video so that soundtrack composers can sync their audio with it. But it turns out that Reaper is also capable of cutting video and playing the cut back in real time. It's the only program I've tried that has allowed me to do that. Reaper then uses FFmpeg to render the video. Unfortunately you can't control the FFmpeg command line, and most of their presets don't work for me, but it is possible to render a usable FFV1 or HuffYUV AVI. And on top of that, you get an audio workstation similar to Cubase for the music, sound design and audio tweaking.
http://www.reaper.fm/download.php
Works in Wine apparently ( http://wiki.cockos.com/wiki/index.php/How_to_run_Reaper_in_Wine_on_Linux ), and the evaluation copy is completely uncrippled. The funny thing is, most of the Reaper community seems completely against the video editing capacity in Reaper being further developed. Go figure.

Thanks, hope some of it is useful. Your comment about Reaper's video capability is interesting; I'll take a look.

Why does DAW software need video editing capability? Is it possible for a farmer to do a politician's job? Funny. A sound engineer wouldn't think of doing a video editor's job; he has his own job. Rather than loading Reaper with Wine, try Ardour, with xjadeo, for doing voiceover. Not inside Kdenlive, as it doesn't have EDL out or in, nor does Ardour support EDL in. If you are using Canon HDSLRs for moviemaking purposes, I don't think Kdenlive is the best editor, as it doesn't have EDL in or out.

I certainly didn't say that a DAW needs video editing capability. I said that the irony of the situation is that this audio software seems to be one of the few that allows me to preview video cuts in real time without having to transcode two hundred clips.

The real-time preview is, in fact, the big problem. I don't mind writing the EDLs or job scripts manually. MEncoder and FFmpeg render beautifully, and I'm perfectly comfortable with basic scripting for Avisynth, Avidemux, FFmpeg and MEncoder, but a real-time preview is especially important when doing match cuts or trimming a few seconds off a shot. Audio sync is also impossible without real-time previews, and since I record with an external Zoom recorder, I need to be able to make sure audio and video are working together.

By the way, I believe you're incorrect about Kdenlive not having an EDL in and out. From what I understand, one of the strengths of the program is that it relies on the MLT framework, so its project files are all in an XML format. If I could somehow get a better preview out of Kdenlive, I would probably try to manually create the rough cut (Avisynth style) by annotating in and out points in a CSV file, then writing a Perl script to insert these into the XML format. I actually haven't tried this yet, but it might be worth looking into...

Hmm, thinking about the XML project file just made me realize that I could probably still use Kdenlive if I left FFMpeg or MEncoder overnight to batch transcode my MOV files into low-res AVI proxies. Then edit, and when it's time to render, just open up the Kdenlive project file and find-replace the file paths from proxy to original. That way I don't have to rely on Kdenlive to transcode the proxies (or have to wait in the moment while it does so) and hopefully I can have real-time video, audio sync and color correction filters.
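For what it's worth, that find-and-replace over the project XML can be scripted. A sketch in Python, under the assumption (which matches MLT project files as far as I can tell) that clip paths live in `<property name="resource">` elements; the `swap_proxy_paths` helper and the example paths are hypothetical:

```python
import re

def swap_proxy_paths(project_xml, mapping):
    """Swap each proxy path for its original inside the project's
    <property name="resource"> elements, leaving everything else alone."""
    def repl(m):
        path = m.group(2)
        # Only substitute paths we have a mapping for; keep the rest as-is.
        return m.group(1) + mapping.get(path, path) + m.group(3)
    return re.sub(r'(<property name="resource">)([^<]+)(</property>)',
                  repl, project_xml)
```

Run over a saved .kdenlive file with a proxy-to-original dict, this would retarget the edit at the full-resolution MOVs just before rendering.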

I don't think I understood your situation. Let me explain my workflow.

I convert all my footage to the MJPEG codec at qscale 15 with ffmbc (as ffmpeg is slower, IMO), then edit in real time with Cinelerra.

I render out PAL 24fps, in 20-minute reels, for doing voiceover in the studio. If other audio was recorded on set (like yours on the Zoom recorder), I do a basic manual sync in Cinelerra and leave the rest to the sound engineer.

After that I do the final edit with all the dissolves, fades, and so on.
I export a CMX3600 EDL from Cinelerra and give the original MOV files to the lab guys for color grading, or I do the grading myself; this is the way I've worked on 3 feature films. And all the originals get converted to raw with 5DtoRGB, whichever is feasible for the producer - if he gives enough time ;)

I don't know much about the XML format. But EDL is pretty old and supported by the major grading packages: Smoke, Lustre, Scratch, SpeedGrade, etc.
EDL supports speed ramps, fades and dissolves; it's not just an in/out point creator.
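To make the "not just in/out points" part concrete, here is roughly what a single CMX3600 cut event looks like, as a sketch generator in Python. The column spacing here is approximate (real CMX3600 files are strictly column-aligned), so treat this as illustrative only:

```python
def cmx_cut_event(num, reel, src_in, src_out, rec_in, rec_out, track="V"):
    """Format one CMX3600-style 'cut' event (transition code C).
    Timecodes are HH:MM:SS:FF strings; field spacing is approximate."""
    return (f"{num:03d}  {reel:<8} {track:<4} C        "
            f"{src_in} {src_out} {rec_in} {rec_out}")

# A dissolve would use transition code D plus a duration in frames,
# which is why an EDL can carry fades and dissolves, not just cuts.
```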

I hope I made that clear. And by the way, English isn't my native language and I'm not good at it.

Hey Mohanohi, ahhh okay. You're absolutely right, Kdenlive doesn't export CMX 3600 formatted EDLs. And you're also correct that it's a really important feature if you're editing in a professional environment with a lot of different people who are going to work with the footage (that's not my case, by the way). :)

The confusion came because a lot of programs have their own "EDL" formats, which are mostly (as you said) in/out points for your clips. And I was saying that you can extract in/out points from the .kdenlive project file, and you can also use Melt to create an XML playlist that Kdenlive can import. But none of that has anything to do with CMX 3600. :)

Thank you for sharing your workflow. It's really interesting. I didn't know Cinelerra exported CMX 3600 EDLs. It's good to know, in case I need it one day! I'll also try out ffmbc and your proxy clip formula.

And your English is fantastic, so don't worry!