API to Capture Live Photos in iOS 9

Integrating Live Photos in my app

There is really no simple out-of-the-box API from Apple for integrating Live Photos in iOS apps.

That being said, there's an interesting article that has been going around lately which explains what a Live Photo is under the hood and how you can use it.

How to provide a hint for a Live Photo

You don't need to dissect a Live Photo into still frames and construct an animated UIImage, or dig out the Live Photo's movie file... It's much simpler.

  1. Display your user's Live Photo content in a PHLivePhotoView.
  2. Call startPlaybackWithStyle: and pass .Hint for the playback style to get the "hint" that the HIG is talking about (see the sketch after this list).
  3. There's no step three.
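
To make the two steps concrete, here is a minimal Swift sketch (illustrative names, assuming you already have a PHLivePhoto object, for example from the helper shown further below):

import UIKit
import Photos
import PhotosUI

// Put the Live Photo into a PHLivePhotoView and play the short "hint"
// movement that the HIG describes.
func showHint(livePhoto: PHLivePhoto, inView container: UIView) {
    let livePhotoView = PHLivePhotoView(frame: container.bounds)
    container.addSubview(livePhotoView)
    livePhotoView.livePhoto = livePhoto          // step 1
    livePhotoView.startPlaybackWithStyle(.Hint)  // step 2
}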

Saving Already Created Live Photos

You can create a PHLivePhoto from the separate resources of a Live Photo (the still image and the video) by using PHLivePhoto.requestLivePhotoWithResourceFileURLs; you will then be able to save it to the library.

import UIKit
import Photos

func makeLivePhotoFromItems(imageURL: NSURL, videoURL: NSURL, previewImage: UIImage, completion: (livePhoto: PHLivePhoto) -> Void) {
    // Assemble an in-memory PHLivePhoto from its still-image and movie files.
    PHLivePhoto.requestLivePhotoWithResourceFileURLs([imageURL, videoURL], placeholderImage: previewImage, targetSize: CGSizeZero, contentMode: PHImageContentMode.AspectFit) { (livePhoto, infoDict) -> Void in
        // The handler may be called more than once; only report a non-nil result.
        if let lp = livePhoto {
            completion(livePhoto: lp)
        }
    }
}

makeLivePhotoFromItems(imgURL, videoURL: movURL, previewImage: prevImg) { (livePhoto) -> Void in
// "livePhoto" is your PHLivePhoto object, save it/use it here
}

You will need the JPEG file URL, the MOV file URL, and a "preview" image (which is usually just the JPEG or a lighter version of it).

Full example working in a Playground here.
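
The function above only builds an in-memory PHLivePhoto. If you also want to save the pair of files into the user's photo library, a sketch using PHAssetCreationRequest could look like the following; it assumes the same image/video URLs as above and that photo library authorization has already been granted (note that the .PairedVideo resource type requires iOS 9.1):

import Photos

func saveLivePhotoToLibrary(imageURL: NSURL, videoURL: NSURL, completion: (Bool, NSError?) -> Void) {
    PHPhotoLibrary.sharedPhotoLibrary().performChanges({
        // Pair the still image and the movie into a single new library asset.
        let request = PHAssetCreationRequest.creationRequestForAsset()
        request.addResourceWithType(.Photo, fileURL: imageURL, options: nil)
        request.addResourceWithType(.PairedVideo, fileURL: videoURL, options: nil)
    }, completionHandler: { success, error in
        completion(success, error)
    })
}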

Why does the video have a smaller size than the photo in an iOS Live Photo?

The video has a smaller resolution than the photo. For example, I took a Live Photo on my phone and the still image measured 3024 by 4032 pixels, while the video was just 1440 by 1080 pixels.

Apple Live Photo file format

Here's the link. Otherwise, here's the text:

Live Photos

Live Photos is a new feature of iOS 9 that allows users to capture and
relive their favorite moments with richer context than traditional
photos. When the user presses the shutter button, the Camera app
captures much more content along with the regular photo, including
audio and additional frames before and after the photo. When browsing
through these photos, users can interact with them and play back all
the captured content, making the photos come to life.

iOS 9.1 introduces APIs that allow apps to incorporate playback of
Live Photos, as well as export the data for sharing. There is new
support in the Photos framework to fetch a PHLivePhoto object from the
PHImageManager object, which is used to represent all the data that
comprises a Live Photo. You can use a PHLivePhotoView object (defined
in the PhotosUI framework) to display the contents of a Live Photo.
The PHLivePhotoView view takes care of displaying the image, handling
all user interaction, and applying the visual treatments to play back
the content.
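
As a sketch of that flow (illustrative names, assuming you already have a PHAsset and a PHLivePhotoView on screen), fetching and displaying a Live Photo might look like:

import Photos
import PhotosUI

func displayLivePhoto(asset: PHAsset, inView livePhotoView: PHLivePhotoView) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .HighQualityFormat
    // Ask the image manager for the PHLivePhoto that represents this asset.
    PHImageManager.defaultManager().requestLivePhotoForAsset(asset,
        targetSize: livePhotoView.bounds.size,
        contentMode: .AspectFit,
        options: options) { livePhoto, info in
            // The handler may run more than once (degraded result, then final).
            livePhotoView.livePhoto = livePhoto
    }
}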

You can also use PHAssetResource to access the data of a PHLivePhoto
object for sharing purposes. You can request a PHLivePhoto object for
an asset in the user’s photo library by using PHImageManager or
UIImagePickerController. If you have a sharing extension, you can also
get PHLivePhoto objects by using NSItemProvider. On the receiving side
of a share, you can recreate a PHLivePhoto object from the set of
files originally exported by the sender.
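
As a sketch of that export step (illustrative file paths, requires iOS 9.1), pulling the resource files out of a Live Photo asset could look like:

import Photos

// Write every resource of a Live Photo asset (still image + paired video)
// to the temporary directory; the resulting files must be kept together.
func exportLivePhotoResources(asset: PHAsset) {
    for resource in PHAssetResource.assetResourcesForAsset(asset) {
        let path = NSTemporaryDirectory() + resource.originalFilename
        let fileURL = NSURL(fileURLWithPath: path)
        PHAssetResourceManager.defaultManager().writeDataForAssetResource(resource,
            toFile: fileURL, options: nil) { error in
                print("Exported \(resource.originalFilename): \(error)")
        }
    }
}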

Guidelines for Displaying Live Photos

It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in
an environment that doesn’t support PHLivePhotoView, it’s recommended
that you present it as a regular photo.

Don’t display the extra frames and audio of a Live Photo separately.
It's important that the content of the Live Photo be presented in a
consistent way that uses the same visual treatment and interaction
model in all apps.

It’s recommended that you identify a photo as a Live Photo by placing
the badge provided by the PHLivePhotoView class method
livePhotoBadgeImageWithOptions: with PHLivePhotoBadgeOptionsOverContent in
the top-left corner of the photo.
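
For example, a short sketch of overlaying that badge (photoView here is an assumed, illustrative view that already displays the photo):

import UIKit
import PhotosUI

// Get the standard Live Photo badge and pin it near the photo's top-left corner.
let badgeView = UIImageView(image: PHLivePhotoView.livePhotoBadgeImageWithOptions(.OverContent))
badgeView.frame.origin = CGPoint(x: 8, y: 8)  // small inset from the top-left
photoView.addSubview(badgeView)               // photoView: your photo-displaying view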

Note that there is no support for providing the visual effect that
users experience as they swipe through photos in the Photos app.

Guidelines for Sharing Live Photos

The data of a Live Photo is
exported as a set of files in a PHAssetResource object. The set of
files must be preserved as a unit when you upload them to a server.
When you rebuild a PHLivePhoto with these files on the receiver side,
the files are validated; loading fails if the files don’t come from
the same asset.

If your app lets users apply effects or adjustments to a photo before
sharing it, be sure to apply the same adjustments to all frames of the
Live Photo. Alternatively, if you don’t support adjusting the entire
contents of a Live Photo, share it as a regular photo and show an
appropriate indication to the user.

If your app has UI for picking photos to share, you should let users
play back the entire contents so they know exactly what they are
sharing. When selecting photos to share in your app, users should also
be able to turn a Live Photo off, so they can post it as a traditional
photo.


