This repository has been archived by the owner on Jun 19, 2023. It is now read-only.

How To support Apple's Live Photo? #578

Open
satifanie opened this issue Feb 3, 2016 · 4 comments

Comments

@satifanie

I tried, but the Live Photo turned into a normal photo after I uploaded it and downloaded it again.
Can Live Photos be supported?

@javiergonzper
Contributor

Hello @satifanie
That would be good functionality. We have put it on the backlog to work on in the future.
Thank you for your feedback!

@javiergonzper
Contributor

Reading the documentation: https://developer.apple.com/library/prerelease/ios/releasenotes/General/WhatsNewIniOS/Articles/iOS9_1.html#//apple_ref/doc/uid/TP40016572-DontLinkElementID_2

It looks like if we export the photos in the Live Photo format, they are not JPG or PNG, so the only devices that could view the pictures would be other iPhone 6s or 6s Plus devices... I think supporting Live Photo is a bad idea 😞

@rwilliamsit

I think it's possible. A Live Photo is composed of two simple files: a JPG and a MOV file.
There are iOS apps out there that seem to already do this (e.g. PhotoSync, which actually claims to support uploading to ownCloud).

Please note: I am no expert here and I haven't used PhotoSync - I have just been doing some research on this topic because I am hoping to cancel my Apple iCloud subscription after sorting out these types of challenges.

Anyway, this is what I found about Live Photos: the two files are tied together by an asset identifier, a UUID stored as a string in the metadata.

  1. The JPEG must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple MakerNote asset identifier key).

  2. The QuickTime MOV is encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
     - A top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
     - A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still-image time matches up to the presentation timestamp for this metadata item. The payload seems to be just a single 0xFF byte (aka -1) and can be ignored. If using an AVAssetReader you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.

The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
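The pairing rule above can be sketched in a few lines. This is a hedged, language-agnostic illustration in Python, not the iOS API: it assumes the JPEG's Apple MakerNote and the MOV's QuickTime metadata have already been extracted (e.g. with a tool like exiftool) into plain dictionaries, and the helper name `is_live_photo_pair` is hypothetical.

```python
# Hypothetical sketch of the Live Photo pairing rule described above.
# Assumes metadata has already been extracted into plain dictionaries;
# the key names follow the comment above.

APPLE_MAKER_NOTE_ASSET_ID_KEY = 17  # Apple MakerNote asset identifier key
CONTENT_ID_KEY = "com.apple.quicktime.content.identifier"

def is_live_photo_pair(jpeg_maker_note: dict, mov_metadata: dict) -> bool:
    """Return True if the JPEG and MOV carry the same asset identifier."""
    photo_id = jpeg_maker_note.get(APPLE_MAKER_NOTE_ASSET_ID_KEY)
    movie_id = mov_metadata.get(CONTENT_ID_KEY)
    return photo_id is not None and photo_id == movie_id

# Example: a matching pair shares one UUID string
uuid = "8A1B2C3D-4E5F-6071-8293-A4B5C6D7E8F9"
print(is_live_photo_pair({17: uuid}, {CONTENT_ID_KEY: uuid}))     # True
print(is_live_photo_pair({17: uuid}, {CONTENT_ID_KEY: "other"}))  # False
```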

@lobeck

lobeck commented May 16, 2017

By now this has become feasible again, as Apple has released an SDK for embedding Live Photos into websites - https://developer.apple.com/live-photos/

The most important thing is that both files are uploaded: the image and the adjacent movie file. If they are not modified, uploading them as-is is sufficient. If there is a conversion, the metadata needs to be adjusted.
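As a sketch of "upload both files", here is a small Python illustration of grouping exported files that share a basename (e.g. IMG_0001.JPG alongside IMG_0001.MOV). The naming convention is an assumption for illustration only; as noted earlier in the thread, the authoritative link between the two files is the embedded asset identifier, not the filename.

```python
from pathlib import Path

def live_photo_companions(files):
    """Group candidate Live Photo files that share a stem.

    Assumption for illustration: an exported Live Photo appears on disk as
    a still image and a movie with the same basename. The real link is the
    asset identifier embedded in the metadata.
    """
    by_stem = {}
    for name in files:
        by_stem.setdefault(Path(name).stem, []).append(name)
    # A candidate pair is any stem that has more than one file behind it
    return {stem: sorted(group)
            for stem, group in by_stem.items() if len(group) > 1}

print(live_photo_companions(["IMG_0001.JPG", "IMG_0001.MOV", "IMG_0002.JPG"]))
# {'IMG_0001': ['IMG_0001.JPG', 'IMG_0001.MOV']}
```

An uploader would then send every file in each group together, unmodified, so the metadata linkage survives the round trip.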

@michaelstingl michaelstingl removed the iOS9 label Mar 4, 2019