OpenGL ES screenshots on iOS

By Daniele

Sometimes you may need to take screenshots programmatically in your OpenGL ES application on iOS: you could use it to capture the "best moments in the game" to show them later, send them via email, or share them on some social network; or maybe you simply hate having to press two buttons in sync to get a screenshot. In any case, here is some source code to do it, ready to be used with Cocos2D 1.0.

The implementation is based on Apple's Technical Q&A QA1704, with an important fix to correctly handle alpha-blended images, plus a little trick to integrate it with Cocos2D.

This is the code to get the screenshot as a UIImage:

+ (UIImage*)snapshot:(UIView*)eaglview
{
	GLint backingWidth, backingHeight;

	// Bind the color renderbuffer used to render the OpenGL ES view
	// If your application only creates a single color renderbuffer which is already bound at this point,
	// this call is redundant, but it is needed if you're dealing with multiple renderbuffers.
	// Note, replace "_colorRenderbuffer" with the actual name of the renderbuffer object defined in your class.
	// In Cocos2D the render buffer is already bound (and it's a private property...).
//	glBindRenderbufferOES(GL_RENDERBUFFER_OES, _colorRenderbuffer);

	// Get the size of the backing CAEAGLLayer
	glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
	glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

	NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
	NSInteger dataLength = width * height * 4;
	GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));

	// Read pixel data from the framebuffer
	glPixelStorei(GL_PACK_ALIGNMENT, 4);
	glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

	// Create a CGImage with the pixel data
	// If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel
	// otherwise, use kCGImageAlphaPremultipliedLast
	CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
	CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
	CGImageRef iref = CGImageCreate(
		width, height, 8, 32,
		width * 4,
		colorspace,
		// Fix from Apple implementation: skip the alpha channel, which may
		// contain garbage left in the framebuffer by alpha-blended drawing
		// (was: kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast).
		kCGBitmapByteOrder32Big | kCGImageAlphaNoneSkipLast,
		ref, NULL, true, kCGRenderingIntentDefault);

	// OpenGL ES measures data in PIXELS
	// Create a graphics context with the target size measured in POINTS
	NSInteger widthInPoints, heightInPoints;
	if (NULL != UIGraphicsBeginImageContextWithOptions) {
		// On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
		// Set the scale parameter to your OpenGL ES view's contentScaleFactor
		// so that you get a high-resolution snapshot when its value is greater than 1.0
		CGFloat scale = eaglview.contentScaleFactor;
		widthInPoints = width / scale;
		heightInPoints = height / scale;
		UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
	}
	else {
		// On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
		widthInPoints = width;
		heightInPoints = height;
		UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
	}

	CGContextRef cgcontext = UIGraphicsGetCurrentContext();

	// UIKit coordinate system is upside down to GL/Quartz coordinate system
	// Flip the CGImage by rendering it to the flipped bitmap context
	// The size of the destination area is measured in POINTS
	CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
	CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

	// Retrieve the UIImage from the current context
	UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

	// Clean up
	UIGraphicsEndImageContext();
	free(data);
	CFRelease(ref);
	CFRelease(colorspace);
	CGImageRelease(iref);

	return image;
}

To get the screenshot from Cocos2D, the above method must be called after -[EAGLContext presentRenderbuffer:] and before glClear(). We can do it using the scheduler:

- (void) takeScreenshot
{
	[[CCScheduler sharedScheduler]
		scheduleSelector: @selector(takeScreenshotAux_:)
		forTarget: self
		interval: 0
		paused: NO];
}

- (void) takeScreenshotAux_: (id) sender
{
	[[CCScheduler sharedScheduler] unscheduleSelector: @selector(takeScreenshotAux_:) forTarget: self];
	// snapshot: is a class method, so send the message to the class
	UIImage *img = [[self class] snapshot: [CCDirector sharedDirector].openGLView];
	UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
}



I think that's exactly what I was looking for. Unfortunately I can't compile your source: Xcode keeps saying 'GL_RENDERBUFFER_OES' is an undeclared identifier. I've imported both <OpenGLES/EAGLDrawable.h> and <QuartzCore/QuartzCore.h>.

Maybe you can help me, thanks!


Daniele:

Hi Chris,

As of the iOS 5.1 SDK, the 'GL_RENDERBUFFER_OES' symbol is defined in the <OpenGLES/ES1/glext.h> header file of the OpenGLES framework.
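In other words, a source file using this snippet typically needs headers along these lines (a sketch; the exact set can vary with your SDK version and with Cocos2D's own umbrella headers):

```objc
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>    // GL_RENDERBUFFER_OES, glGetRenderbufferParameterivOES, ...
#import <QuartzCore/QuartzCore.h> // CAEAGLLayer
#import <UIKit/UIKit.h>           // UIImage and the UIGraphics... functions
```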

Using the original QA1704 code I can take a screenshot in which the points are positioned correctly, but the colors are entirely white. Just curious if you ran into this problem or have an idea of how to fix it. I am drawing GL_POINTS.

If I use a static array the colors work, but as soon as I use a dynamically allocated array for the colors, everything is white.

Daniele:

Hi Jeff,

I've never run into this problem. It could happen if you capture the screenshot before presenting the render buffer or after calling glClear(); that could explain the white color.
If you are using a version of cocos2d-iphone later than v1.0, maybe something changed and the "trick" explained in the article is no longer valid.
Let me know if you solve your problem!

Thanks for the response, I solved my problem. The cause in my case was that, for some reason, some of the alpha values in my color array were not 255, so I looped through my structure before drawing and forced them all to 255.

Daniele:

Weird... anyway, I'm glad you solved it!

I have used the same code to take a snapshot in one of my applications, but the problem is that on a device with iOS 6 it is not working (I see a blank image every time). In the iOS 6 simulator it works properly. Can you help me solve this issue?