So I was signing up for Anna’s & Vladimir’s Founders & Funders and realized, while filling out my Tumblr info, that I had not posted. In. A. Very. Long. Time.
It’s not like I’ve been sitting still. I do consider myself very lazy; I’m almost an artist at it. But I have been coding simple stuff for people who need simple-stuff code. Specifically, I coded up a sample app, GLKitSampler, that shows how to use GLKit in a very basic way. I was inspired to do this after my Big Nerd Ranch Advanced iOS class.
Having just finished it last week, this weekend I decided to set up an example that imports a transformed Blender file, using an ISS model I got from NASA’s 3D model site. Thanks to Jeff Lamarche for the Blender -> Obj-C header file export script. So the code was modified. Then, on Tuesday, I discovered that I don’t like the easy way of transforming position and rotation that I used, so I’m going to modify the code to use GLKMatrixStacks.
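For the curious, the GLKMatrixStack approach looks roughly like this. This is a sketch, not the final GLKitSampler code; the variable names and the translation/rotation values are made up for illustration:

```objc
#import <GLKit/GLKit.h>

// Sketch: hierarchical transforms with GLKMatrixStack.
// _rotation, _arrayAngle, and the matrix ivars are hypothetical.
- (void)update
{
    GLKMatrixStackRef stack = GLKMatrixStackCreate(kCFAllocatorDefault);
    GLKMatrixStackLoadMatrix4(stack, GLKMatrix4MakeLookAt(0, 0, 25,    // eye
                                                          0, 0, 0,     // center
                                                          0, 1, 0));   // up

    // Whole-station transform.
    GLKMatrixStackPush(stack);
    GLKMatrixStackRotate(stack, _rotation, 0.0f, 1.0f, 0.0f);
    _stationModelviewMatrix = GLKMatrixStackGetMatrix4(stack);

    // A child part (say, a solar array) inherits the station's transform.
    GLKMatrixStackPush(stack);
    GLKMatrixStackTranslate(stack, 5.0f, 0.0f, 0.0f);
    GLKMatrixStackRotate(stack, _arrayAngle, 1.0f, 0.0f, 0.0f);
    _arrayModelviewMatrix = GLKMatrixStackGetMatrix4(stack);
    GLKMatrixStackPop(stack);   // back to the station's transform

    GLKMatrixStackPop(stack);   // back to just the camera
    CFRelease(stack);
}
```

The win over hand-multiplying matrices is that push/pop keeps parent-child transforms straight automatically, just like the old OpenGL fixed-function matrix stack.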
Part of what has kept me away from writing is AmericaSpace, the site I started in 2008 with some friends. It’s grown. A lot. Recently, I moved from our old MacMini server to a new one. Now the site is much snappier. And we’re running Lion Server, so now everyone has their own Mail, Wiki, Calendar, and Contacts support. A biggie for working with 15 writers and photographers has been the group calendar. It’s my hope to eventually open this up to paying customers and to sponsor events for groups. Someday.
Here are some pictures of the samples; there are 6 of them.
In the summer of 2010, Apple opened up UIGetScreenImage() as a way of taking screenshots in iOS apps. There was great joy in the land.
Then the following September, Apple decided that it was better for app developers to use either UIImagePickerController or methods from AVFoundation to capture images and present a camera view. Happiness was replaced with great sadness in the land.
To help developers, Apple’s iOS team came out with four Technical Q&As that tried to show developers how to get around the prohibition on UIGetScreenImage() while still accomplishing the same thing. To put it simply, what had been a one-line job became a many-line task.
Worse, none of the Apple-supplied Technical Q&As offered an elegant way for those of us interested in augmented reality applications to take a screenshot. So, in plain English, if you wanted a screenshot of your augmented reality app, including its UIKit layer content (sort of like a gun camera), you were out of luck. This bothered me greatly.
So I set out to build an app, aptly named Screenshot, to demo the implementation of the four Technical Q&As given by Apple and then combine them to come as close as possible to the elegance of UIGetScreenImage(), at least in its results.
Now for the caveat: I know the folks at Apple could do this in a tequila all-nighter induced coma. And I don’t pretend that this is a unique solution. The many very talented iOS programmers out there likely have their own solutions. I am posting this because I haven’t found those solutions in the open, where the newbies hang out. So this is for you folks, the newbies, the non-AVFoundation gurus.
First, a preview of the help Apple has offered those trying to work around UIGetScreenImage(), which is no longer allowed in iOS apps:
- In Technical Q&A 1702, “How to capture video frames from the camera as images using AV Foundation”, Apple shows exactly that: grabbing individual camera frames and converting them to images with AV Foundation.
- In Technical Q&A 1703, “Screen Capture in UIKit Applications”, Apple shows how to take a screenshot in a UIKit application.
- In Technical Q&A 1704, “OpenGL ES View Snapshot”, Apple demonstrates how to take a snapshot of an OpenGL ES view and save the result in a UIImage.
- In Technical Q&A 1714, “Capturing an image using AV Foundation”, Apple pretty clearly shows how to programmatically take a screenshot of an app that contains both UIKit and camera elements.
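To give a flavor of what these Q&As involve, the Q&A 1703 technique boils down to rendering the window’s layer into an image context. A minimal sketch, in the pre-iOS 7 style that matches the era of these Q&As:

```objc
#import <QuartzCore/QuartzCore.h>
#import <UIKit/UIKit.h>

// Q&A 1703-style screenshot of the UIKit layer only.
// Note: this does NOT capture the live camera preview or OpenGL content.
- (UIImage *)uikitScreenshotOfWindow:(UIWindow *)window
{
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO,
                                           [UIScreen mainScreen].scale);
    [window.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```

The camera and OpenGL layers come up blank in this rendering, which is exactly why the other three Q&As exist.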
This demo application demonstrates how to use these solutions in combination to accomplish screen capture in an augmented reality environment, including an OpenGL layer, and does so using pieces not mentioned in the Technical Q&As but nonetheless part of AVFoundation: AVCaptureSession and its AVCaptureConnection.
Simply put, with this demo app I show how to capture a still image that includes the camera, UIKit, and OpenGL layers of a view. If it has a view, or you can get an image from it, you can capture it.
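The combination itself is just layered drawing: once you have a camera still (Q&A 1714), an OpenGL snapshot (Q&A 1704), and a UIKit rendering (Q&A 1703) as UIImages, draw them back-to-front into one context. A sketch, with the three input images assumed to have been produced by those techniques:

```objc
#import <UIKit/UIKit.h>

// Composite the three layers, back to front, into a single screenshot.
// cameraImage, glImage, and uikitImage are assumed to come from the
// Q&A 1714 / 1704 / 1703 techniques respectively.
- (UIImage *)compositeScreenshotWithCamera:(UIImage *)cameraImage
                                    openGL:(UIImage *)glImage
                                     uiKit:(UIImage *)uikitImage
{
    CGSize size = uikitImage.size;
    UIGraphicsBeginImageContextWithOptions(size, YES, uikitImage.scale);
    CGRect full = CGRectMake(0, 0, size.width, size.height);
    [cameraImage drawInRect:full];   // bottom: camera still
    [glImage     drawInRect:full];   // middle: OpenGL snapshot
    [uikitImage  drawInRect:full];   // top: UIKit chrome
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenshot;
}
```

One gotcha: the UIKit rendering needs a transparent background wherever the camera and OpenGL views sit, or it will paint over the layers beneath it.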
Bottom line: this is as close to the old UIGetScreenImage() as it’s possible to get today, as far as I know.
A favor: to those of you who come up with an even more elegant solution, please share.
From Isaiah’s weblog,
Apple’s Three Laws of Developers
A developer may not injure Apple or, through inaction, allow Apple to come to harm.
A developer must obey any orders given to it by Apple, except where such orders would conflict with the First Law.
A developer must protect its own existence as long as such protection does not conflict with the First or Second Law.
— I. Developer