Cocoa Space

The Government Case Against Apple Looks Flimsy At Best

parislemon:

Report by Declan McCullagh and Greg Sandoval:

The U.S. Justice Department’s legal pursuit of Apple for alleged e-book price fixing stretches the boundaries of antitrust law and is likely to end in defeat.

To be clear, that’s just the case against Apple. McCullagh and Sandoval note that the case against the publishers themselves seems much stronger. Given the evidence laid out right now, they’ll simply have a hard time proving that Apple was colluding in this from the beginning. 

The fact remains that what this really is about is Amazon versus the publishing industry. We’ll see how this plays out, but Apple appears to be more of an (albeit willing) pawn. 

I’m one of those who doubts Apple’s lawyers were asleep when the deal with the publishers was being worked out.

Source: parislemon

If Google's Really Proud Of Google+, It Should Share Some Real User Figures

parislemon:

Great, long rant by Danny Sullivan about how shifty Google is when it comes to talking about actual Google+ usage numbers. 

Isn’t it strange how when there’s something to legitimately brag about, the bragging is done in a very specific manner? Actual numbers are given. But when there isn’t, numbers become relative. Or exceedingly confusing. Or worse. 

Actually, that’s not strange at all. 

The new Google+ update looks good for the most part. A big improvement. But it doesn’t fix the underlying problem — that most people still aren’t using Google+. I now have over a million followers on Google+ compared to 100,000 on Facebook and 75,000 on Twitter. The click-through numbers when I share across all networks still skew heavily in favor of the latter two. Again, with a tenth as many followers.

Okay, maybe it’s not about sharing content. But then what is it about?

As Sullivan lays out, there are a lot of Google employees using Google+ (obviously). And there are plenty of folks who sure seem to love Google to no end. But there is not the kind of social activity you see on Twitter and Facebook. 

Maybe that’s okay, maybe it’s not. I personally don’t like all this stuff mucking up my Google searches, but maybe the information gets better in the long run.

The point is that if Google+ is so great, shouldn’t Google be straightforward about the service and how it’s actually being used? Instead, it looks like they have something to hide. 

Source: parislemon

Ummm…been busy.

So I was signing up on Anna’s & Vladimir’s Founders & Funders and realized, while filling out my Tumblr info, that I had not posted. In. A. Very. Long. Time.

It’s not like I’ve been sitting still. I do consider myself very lazy; I’m almost an artist at it. But I have been coding simple stuff for people who need simple-stuff code. Specifically, I coded up a sample app, GLKitSampler, that shows how to use GLKit in a very basic way. I was inspired to do this after my Big Nerd Ranch Advanced iOS class.

Having just finished it last week, this weekend I decided to set up an example that imports a transformed Blender file, using an ISS model I got from NASA’s 3D model site. Thanks to Jeff LaMarche for the Blender -> Obj-C header file export script. So the code was modified. Then, on Tuesday, I decided that I don’t like the easy way of transforming position and rotation that I had used, so I’m going to modify the code to use GLKMatrixStack.
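For anyone who hasn’t played with it, here’s a rough sketch of the GLKMatrixStack approach (this is not the actual GLKitSampler code; the method name and the specific transforms are made up for illustration). The win over hand-composed matrices is that push/pop lets each part of a model inherit its parent’s transform:

```objc
#import <GLKit/GLKit.h>

// Hypothetical drawing method on a GLKViewController subclass that
// has a GLKBaseEffect property named `effect`.
- (void)drawStationPiece
{
    GLKMatrixStackRef stack = GLKMatrixStackCreate(kCFAllocatorDefault);
    GLKMatrixStackLoadMatrix4(stack, self.effect.transform.modelviewMatrix);

    GLKMatrixStackPush(stack);                              // save the parent transform
    GLKMatrixStackTranslate(stack, 0.0f, 0.0f, -5.0f);      // position this piece
    GLKMatrixStackRotate(stack,
                         GLKMathDegreesToRadians(30.0f),
                         0.0f, 1.0f, 0.0f);                 // spin it about Y
    self.effect.transform.modelviewMatrix = GLKMatrixStackGetMatrix4(stack);
    [self.effect prepareToDraw];
    // ... glDrawArrays()/glDrawElements() for this piece ...
    GLKMatrixStackPop(stack);                               // restore the parent transform

    CFRelease(stack);                                       // it's a CF object
}
```

Nested pieces just push again before applying their own translate/rotate, which is exactly the position-and-rotation bookkeeping that gets ugly without a stack.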

Part of what has kept me away from writing is AmericaSpace, the site I started in 2008 with some friends. It’s grown. A lot. Recently, I moved from our old Mac mini server to a new one, and now the site is much snappier. We’re running Lion Server, so everyone has their own Mail, Wiki, Calendar, and Contacts support. A biggie for working with 15 writers and photographers has been the group calendar. It’s my hope to eventually open this up to paying customers and to sponsor events for groups. Someday.

Here are some pictures of the samples; there are 6 of them.

GLKit Sample Code

These are the screen images from the Screenshots app I made to merge the solutions outlined by Apple in Technical Q&As 1702, 1703, 1704, and 1714.

How To Legally Replace UIGetScreenImage()

In the summer of 2010, Apple opened up UIGetScreenImage() as a way of taking screenshots in iOS apps. There was great joy in the land.

Then the following September, Apple decided that it was better for app developers to use either UIImagePickerController or methods from AVFoundation to capture images and present a camera view. Happiness was replaced with great sadness in the land. 

To help developers, Apple’s iOS team came out with four Technical Q&As that tried to show how to get around the prohibition on UIGetScreenImage() while still accomplishing the same thing. To put it simply, what had been a one-line job became a many-line task.

Worse, none of the Apple-supplied Technical Q&As offered an elegant solution for those of us interested in augmented reality applications. So, in plain English, if you wanted a screenshot of your augmented reality app, including its UIKit layer content (sort of like, well, a gun camera), you were out of luck. This bothered me greatly.

So I set out to build an app, aptly named Screenshot, to demo the implementations of the four Technical Q&As and then, in conclusion, combine them to come as close as possible to the elegance of UIGetScreenImage(), at least in its results.

Now for the caveat: I know the folks at Apple could do this in a tequila-fueled all-nighter. And I don’t pretend that this is a unique solution; the many very talented iOS programmers out there likely have their own. I am posting this because I haven’t found those solutions out in the open, where the newbies hang out. So this is for you folks, the newbies, the non-AVFoundation gurus.

First, a preview of the help Apple has offered those trying to find a way around UIGetScreenImage(), which is no longer allowed in iOS apps:

  • In Technical Q&A 1702, “How to capture video frames from the camera as images using AV Foundation”, Apple shows how to grab individual video frames from the camera as images.
  • In Technical Q&A 1703, “Screen Capture in UIKit Applications”, Apple shows how to take a screenshot in a UIKit application.
  • In Technical Q&A 1704, “OpenGL ES View Snapshot”, Apple demonstrates how to take a snapshot of an OpenGL ES view and save the result in a UIImage. 
  • In Technical Q&A 1714, “Capturing an image using AV Foundation”, Apple pretty clearly shows how to programmatically take a screenshot of an app that contains both UIKit and camera elements. 

The demo application I wrote shows how to use these solutions in combination to accomplish screen capture in an augmented reality environment, including an OpenGL layer, and it does so using pieces not mentioned in the Technical Q&As but nonetheless part of AVFoundation: AVCaptureSession and its AVCaptureConnection.

Simply put, with this demo app I show how to capture a still image that includes the camera, UIKit, and OpenGL layers of a view. If it has a view, or you can get an image from it, you can capture it.
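The heart of the combination can be sketched like this (the method and parameter names here are my own, not Apple’s, and this is a condensed sketch rather than the demo app’s actual code). Assume cameraImage came from an AVCaptureStillImageOutput (QA1714) and glImage from a glReadPixels-style snapshot (QA1704); the UIKit layer is rendered last, per QA1703:

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Composite the three layers back-to-front into one UIImage.
- (UIImage *)compositeScreenshotWithCamera:(UIImage *)cameraImage
                                    openGL:(UIImage *)glImage
{
    CGSize size = self.view.bounds.size;
    // Scale of 0.0 means "use the screen's scale" (Retina-aware).
    UIGraphicsBeginImageContextWithOptions(size, YES, 0.0);

    // Back to front: the camera feed first, then the OpenGL content.
    [cameraImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    [glImage drawInRect:CGRectMake(0, 0, size.width, size.height)];

    // Finally the UIKit layer (QA1703): render the view hierarchy
    // into the same bitmap context.
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];

    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenshot;
}
```

The ordering is the whole trick: each draw call paints on top of the previous one, which is what UIGetScreenImage() used to do for free.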

Bottom line: this is as close to the old UIGetScreenImage() as I know how to get today.

A favor: to those of you who come up with an even more elegant solution, please share.

Over the last couple of months, I created what evolved into an amalgam of view and layer animation demo apps. At one point, I bundled them up into a tab bar based app. Big mistake; the tab bar icons are tiny and leave no space for telling the user what they are about to see. So, over the weekend, I brushed up on table view controllers and navigation controllers. Oh yeah, the cobwebs were pretty thick there, since I hadn’t played with them in over a year. Bad programmer! Anyway, the new app is looking much better. I’ll add images to each row to make it better still. In the meantime, here are some screenshots from my iPod touch 4.


From Isaiah’s weblog,

Apple’s Three Laws of Developers

A developer may not injure Apple or, through inaction, allow Apple to come to harm.
A developer must obey any orders given to it by Apple, except where such orders would conflict with the First Law.
A developer must protect its own existence as long as such protection does not conflict with the First or Second Law.
— I. Developer

Amen!

Source: yourhead

So I decided to make a sample app that demos the screenshot methods outlined in QA1703 and QA1704, as well as my version of QA1714. The resulting screenshot is displayed in a small view in the upper left-hand corner of the screen. 

CocoaCoder.org has a meeting tomorrow night at which I will present my workaround. Once I do that, I’ll post the project for any iOS developer who wants to make use of it or improve on it.


The result of my substitute for UIGetScreenImage(): getting the OpenGL view (okay, a CAEAGLLayer from Core Animation wrapped in a subclass of UIView) and the UIKit content (really just other UIViews and UIImageViews in my case) into a single image.
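Getting that CAEAGLLayer content into a UIImage follows the QA1704 idea; here is a sketch of it (the method name is mine, and error handling is omitted). It must run with the EAGL context current, before -presentRenderbuffer: replaces the framebuffer’s contents:

```objc
#import <UIKit/UIKit.h>
#import <OpenGLES/ES2/gl.h>

// Read the current framebuffer's pixels and wrap them in a UIImage.
- (UIImage *)snapshotGLViewOfSize:(CGSize)size
{
    GLint width = (GLint)size.width, height = (GLint)size.height;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength);

    // OpenGL's origin is bottom-left, so the pixels come back upside down.
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef =
        CGImageCreate(width, height, 8, 32, width * 4, colorSpace,
                      kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                      provider, NULL, NO, kCGRenderingIntentDefault);

    // Drawing the CGImage into a UIKit bitmap context flips it
    // right-side up, since UIKit's coordinate system is top-left.
    UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
    CGContextRef cg = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(cg, kCGBlendModeCopy);
    CGContextDrawImage(cg, CGRectMake(0, 0, width, height), imageRef);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    free(data);
    return image;
}
```

The UIImage this returns is what gets composited under the UIKit layer and over the camera still.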
