interesting.
over the course of the iPhone's history there have been various ways to access the camera programmatically. in this area it's interesting to track how the public API Apple presents has advanced as they get better at it.
in the early days they presented a fairly simple version of the camera app that you could customize, but getting at the innards of the camera signal was tricky, if possible at all.
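here's a minimal sketch of that older route (the class and method names around it are mine, and it's in modern Swift rather than the Objective-C of the era): UIImagePickerController hands you a ready-made camera UI, optionally with your own overlay on top, and all you ever get back is the finished photo.

```swift
import UIKit

// Sketch only: PhotoViewController and takePhoto() are illustrative names.
// UIImagePickerController gives you the stock camera UI and the final image,
// but no access to the live frames in between.
final class PhotoViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func takePhoto() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        // You can lay your own controls over the camera view,
        // but the capture pipeline itself stays hidden.
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The only thing you ever see is the finished image.
        let photo = info[.originalImage] as? UIImage
        _ = photo
        picker.dismiss(animated: true)
    }
}
```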
with iOS 4, AVFoundation was brought to iOS from the Mac, where it appears to have formed the basis for Apple's forays into media editing (GarageBand, Logic, iMovie, Final Cut Pro, etc.). like a lot of software engineering, figuring out how to use it requires understanding the underlying rationales that led to particular design decisions. I find the biggest pity in documentation is not explaining how you got there; as a result, the reasons for particular aspects of a design can be completely opaque until you understand the motive. I realize that last sentence is nearly (if not completely) tautological…
the original UIImagePickerController concept looks at the camera as a camera. AVFoundation just looks at it as a sensor with a visual output. I find that thinking of it that way makes it much easier to grasp the rest of AVFoundation.
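to make that concrete, here's a rough sketch of the AVFoundation view of the world (the CameraReader name is mine and error handling is stripped to the bone): you pick a capture device, wire its output into a session, and every frame the sensor produces is handed to you as a raw sample buffer. nothing about it assumes you're building "a camera app".

```swift
import AVFoundation

// Sketch only: CameraReader is an illustrative name.
// The camera is treated as a device (input) feeding a session,
// whose video data output delivers frames one by one.
final class CameraReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)  // the sensor...
        let output = AVCaptureVideoDataOutput()                // ...and its visual output
        output.setSampleBufferDelegate(self, queue: queue)

        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Every frame from the sensor lands here as a raw sample buffer.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Inspect or process the pixels however you like.
    }
}
```

on a real device this only delivers frames once the user has granted camera access (and, on current iOS versions, the app declares NSCameraUsageDescription in its Info.plist).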
more soon.