Core Animation Tutorial: QTMovieLayer and Image Animation

by Matt Long

In my first post I wrote about using NSOperation to grab an image of the current frame of a QuickTime movie while it was playing and save it out to the filesystem. Considering the excitement that is surrounding Core Animation these days, I thought it might be interesting to take that project and change it to instead grab the current frame and animate it across the screen using a Core Animation layer (CALayer). I had to employ a few tricks to get it to work, but the end result, while completely useless, is quite excellent.


Getting Ready For The Image

The focus of this tutorial is Core Animation, specifically how to use a QTMovieLayer and how to animate an image across the screen using a CALayer. I, therefore, will not spend any time addressing NSOperation, as I think we’ve covered it adequately on this site. I would suggest, however, that you familiarize yourself with Marcus’ post on NSOperation, as he discusses calling back into the main thread once the NSOperation has completed. I employ Marcus’ technique in this project so that I can be notified when the image I want to animate is ready.

You can download the demo project for this post from here: Movie Image Flick Demo Project

Download the project and run it. It will prompt you to select a QuickTime movie to play. Once the movie starts playing, tap your spacebar and watch the images pop off of the movie and animate to random places on the screen. Tap the spacebar quickly to see the effect work in rapid succession.

As in my NSOperation post, I want the video playback to be completely flicker free. I achieve this by obtaining a reference to a copy of the QTMovie object, from which I obtain the image at the given time. If you take a look at the main method in my NSOperation-derived class, you’ll see how I am notifying the main thread that it may now do something with the image I’ve obtained.


- (void)main;
{
    if( movie )
    {
        NSImage* image;
        image = [movie frameImageAtTime:time 
                      withAttributes:imageAttrs error:nil];
        [image retain];
        // performSelectorOnMainThread: retains its object argument
        // until the selector has run, so releasing here is safe.
        [[AppDelegate shared] performSelectorOnMainThread:@selector(imageReady:)
                                               withObject:image
                                            waitUntilDone:NO];
        [image release];
    }
}

Notice that near the end of the function I am calling back into my main thread and letting the app delegate know that the image is ready. I am also handing the image to the imageReady: function. This is very convenient for what we need to do next. But first, let’s discuss drawing the layers with no backing view or window.

Drawing Layers In Thin Air

This project employs a trick that makes it appear as if the layers are being drawn without a backing view. This is, of course, not really possible, so instead we do a little trick: I achieved the effect by creating a borderless window with a transparent background. As usual, I want to give credit where credit is due. Lucas Newman answered a question I posted on Scott Stevenson’s blog about how he did this in his demo project at the Silicon Valley CocoaHeads meeting on Core Animation. You simply create your own NSWindow-derived class and override the initWithContentRect: method. Here is the resulting initialization code I came up with from his suggestion.


- (id) initWithContentRect: (NSRect) contentRect
                 styleMask: (unsigned int) aStyle
                   backing: (NSBackingStoreType) bufferingType
                     defer: (BOOL) flag
{
    self = [super initWithContentRect: contentRect 
                            styleMask: NSBorderlessWindowMask 
                              backing: bufferingType 
                                defer: flag];
    if (!self) return nil;
    [self setBackgroundColor: [NSColor clearColor]];
    [self setOpaque:NO];
    return self;
}

The only issue with using a transparent backing window/view is that you cannot click through it even though it looks like you ought to be able to. There is probably a way to pass the click on to the next front most window, but I haven’t yet researched this. I’ll let you know when I find the answer, or if you know the answer, post it in the comments.
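If passing clicks through is the behavior you want, NSWindow does provide a switch for it: -setIgnoresMouseEvents: tells the window server to route mouse events to whatever sits behind the window. This is a sketch of what I believe should work, not something the demo project actually does; you could call it right after setOpaque: in the initializer above.

```objc
// Sketch: let mouse events pass through the transparent window to
// whatever window is behind it. Once this is set, the window itself
// can no longer be clicked, so it suits purely decorative overlays.
[self setIgnoresMouseEvents:YES];
```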

Rapid Fire Image Animation

I created a method to initialize my NSOperation derived objects and hand them off to the NSOperationQueue. I hooked it up to the spacebar and can then tap the spacebar as quickly as possible and watch the images just pop off the playing movie. Here is my flickIt: function.


- (IBAction)flickIt:(id)sender;
{
    ImageGrabber *grabber = [[ImageGrabber alloc] init];
    [grabber setTime:[movie currentTime]];
    // Hand off the background movie to the NSOperation derived
    // object
    [grabber setMovie:movieBak];
    // Add the operation to the queue. It will be run immediately.
    [queue addOperation:grabber];
    // The queue retains the operation, so release our reference.
    [grabber release];
}

This function creates the NSOperation-derived object and passes it the current time in the playing movie as well as a reference to the background movie from which I pull the image data. Remember that this object is copied from the original QTMovie object and is used to keep from needing to pull the image data from the QTMovie object that is actually playing in the QTMovieLayer. Pulling the image from the QTMovie object that is being used for playback would cause the playback to stutter.
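If copying the playing QTMovie gives you trouble, one alternative sketch is to open a completely separate QTMovie from the same file and use that as the background movie. The movieURL variable here is hypothetical; it stands in for whatever URL the user picked in the open panel.

```objc
// Hypothetical sketch: open a second, independent QTMovie from the
// same file so frame grabs never touch the movie being played back.
// 'movieURL' is assumed to be the URL chosen in the open panel.
NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
                          movieURL, QTMovieURLAttribute, nil];
NSError *error = nil;
QTMovie *movieBak = [[QTMovie movieWithAttributes:attrs
                                            error:&error] retain];
```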

As stated earlier, when my image is ready, the NSOperation-derived object, ImageGrabber, calls back into the main thread, invoking the imageReady: function. Here is its implementation.


- (void)imageReady:(NSImage*)image;
{
    [[NSAnimationContext currentContext] setDuration:1.5f];

    // Save off the rectangle of the ready layer before
    // it gets animated. We will use this to reset the
    // ready layer.
    CGRect prevRect = [readyLayer frame];

    // Convert the NSImage to a CGImageRef
    CGImageRef imageRef = [self nsImageToCGImageRef:image];
    readyLayer.contents = (id)imageRef;
    readyLayer.backgroundColor = CGColorCreateGenericRGB(0.0f,0.0f,0.0f,1.0f);
    readyLayer.borderColor = CGColorCreateGenericRGB(0.45f,0.45f,0.45f,1.0f);
    readyLayer.borderWidth = 4.0f;
    // Grab a random rectangle on the screen where our image
    // layer will animate to.
    CGRect randRect = [self getRandomRect:prevRect];
    [readyLayer setFrame:randRect];
    [readyLayer setOpacity:1.0f];

    // Reset the ready layer so that it is ready for the next image
    readyLayer = [[[CALayer alloc] init] retain];
    readyLayer.frame = prevRect;
    readyLayer.opacity = 0.01f;
    [[[window contentView] layer] addSublayer:readyLayer];
}

Notice that we are converting the NSImage that gets passed to this function into a CGImageRef, which is what our Core Animation layer requires to set its contents. Here is the code to convert an NSImage to a CGImageRef.


- (CGImageRef)nsImageToCGImageRef:(NSImage*)image;
{
    NSData * imageData = [image TIFFRepresentation];
    CGImageRef imageRef = NULL;
    if( imageData )
    {
        CGImageSourceRef imageSource = 
            CGImageSourceCreateWithData(
                           (CFDataRef)imageData,  NULL);
        imageRef = CGImageSourceCreateImageAtIndex(
                          imageSource, 0, NULL);
        CFRelease(imageSource);
    }
    return imageRef;
}

Preloading A Layer to Animate

Instead of creating the layer, setting its contents, and then animating it all at once, I instead overlay an empty layer with near-zero opacity on the QTMovieLayer when the movie first loads. Then, when the user taps the spacebar, I populate that invisible layer’s contents with the image data and then animate it. This keeps me from having to create a more complicated animation to get the desired effect.

Once the image layer has its new frame set, I re-initialize the ready layer. You see the ready layer get reset with this code from the imageReady: function.

// Reset the ready layer so that it is ready for the next image
readyLayer = [[[CALayer alloc] init] retain];
readyLayer.frame = prevRect;
readyLayer.opacity = 0.01f;
[[[window contentView] layer] addSublayer:readyLayer];

Note: I set the opacity to near zero because when it is set to exactly zero the opacity change won’t animate properly. I’m looking into why this is and will update this post with what I find.
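If starting from a true zero opacity matters to you, one workaround sketch is to drive the fade with an explicit CABasicAnimation instead of relying on the implicit animation, so the starting value no longer depends on the layer’s previous state:

```objc
// Sketch: animate the opacity explicitly so it can start from zero.
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = [NSNumber numberWithFloat:0.0f];
fade.toValue = [NSNumber numberWithFloat:1.0f];
fade.duration = 1.5f;
[readyLayer addAnimation:fade forKey:@"fadeIn"];
// Set the model value as well so the layer stays visible afterward.
readyLayer.opacity = 1.0f;
```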


Random Points

When I animate the layers, I simply grab a random point on the screen and then scale the image down to 35% of the original size. When I call setFrame: the magic ensues and my screen gets really cluttered with lots of images. As I said before, not very useful, but excellent. Here is the random rectangle calculation function, getRandomRect:.


- (CGRect)getRandomRect:(CGRect)startRect;
{
    CGPoint point = CGPointMake(random() % 
                          (NSUInteger)NSWidth([window frame]), 
                          random() % 
                           (NSUInteger)NSHeight([window frame]));
    return CGRectMake(point.x, point.y, 
                 startRect.size.width * 0.35f, startRect.size.height * 0.35f);
}

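One small detail: random() produces the same sequence on every launch unless it is seeded, so the images would scatter to the same places each run. If you want different placement every time, seed it once at startup; srandomdev() seeds from the system’s entropy source. A sketch, assuming an awakeFromNib in the app delegate:

```objc
#include <stdlib.h>

// Sketch: seed random() once so layer placement differs between runs.
- (void)awakeFromNib
{
    srandomdev();
}
```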

The nuances of Core Animation are quite interesting and provide a fun challenge when trying to solve problems. In other words, the devil is in the details. There is probably a better way to achieve the animation code, but I felt that the solution I came up with here works pretty well. There are details about Core Animation that I am still learning, but this type of effect demonstrates exactly the kinds of interesting things that are possible.


Luke says:

Hello there. When I download your demo and try to compile it, I get these three errors.

“_QTMovieFrameImageTypeNSImage”, referenced from:
_QTMovieFrameImageTypeNSImage$non_lazy_ptr in ImageGrabber.o
“.objc_class_name_QTMovieLayer”, referenced from:
literal-pointer@__OBJC@__cls_refs@QTMovieLayer in AppDelegate.o
“_QTMovieFrameImageType”, referenced from:
_QTMovieFrameImageType$non_lazy_ptr in ImageGrabber.o

Can you give me any pointers as to where these errors are coming from? Thanks.

Matt Long says:

These look like linker errors. Are you building in Debug or Release? It looks like you’re missing a framework. Maybe QTKit?

ewing says:

Funny, I was just going over the
QTMovie frameImageAtTime:withAttributes:error:
API last night and stumbled upon this without looking for it.

Instead of converting from NSImage to CGImageRef, this withAttributes: version of the method returns a void* instead of an NSImage. If you supply the correct attributes, you can tell the method to return you a CGImageRef directly.

[attributes setObject:QTMovieFrameImageTypeCGImageRef forKey:QTMovieFrameImageType];

At least that’s the theory. I was trying to use QTMovieFrameImageTypeCVOpenGLTextureRef for my own purposes, but I keep getting errors when I call this (Error Domain=NSOSStatusErrorDomain Code=-50 “Operation could not be completed. (OSStatus error -50.)” (error in user parameter list)).

Using QTMovieFrameImageTypeCGImageRef didn’t return an error for me though so I bet it will work for this case.


newacct says:

readyLayer = [[[CALayer alloc] init] retain];

VERY BAD. alloc already does a retain. You are double-retaining.

What’s worse — you never release it. You must release anything that you alloc’d or retained.