Hi, I've been working on a new project for a few weeks that lets people take pictures with the dual cameras and blur the image using depth data. I think I've finished the core code, and I get better results than Apple's. The main problem is that I can't save the blurred image as a Portrait photo the way the system camera does.
I used this code to merge the AVDepthData into the JPEG, and it looks fine:
CGImageDestinationAddAuxiliaryDataInfo(outputDestination, (__bridge CFStringRef)auxDataType, (__bridge CFDictionaryRef)auxData);
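For context, here is a minimal sketch of how that call fits into a full write, assuming `sourceImage` (a CGImageRef) and `depthData` (an AVDepthData *) already exist; the variable names are my own:

```objc
// Hedged sketch: embed AVDepthData as an auxiliary image in a JPEG.
// `sourceImage` and `depthData` are assumed inputs, not from the original post.
NSString *auxDataType = nil;
NSDictionary *auxData = [depthData dictionaryRepresentationForAuxiliaryDataType:&auxDataType];

NSMutableData *imagedata = [NSMutableData data];
CGImageDestinationRef outputDestination =
    CGImageDestinationCreateWithData((__bridge CFMutableDataRef)imagedata,
                                     kUTTypeJPEG, 1, NULL);
CGImageDestinationAddImage(outputDestination, sourceImage, NULL);
CGImageDestinationAddAuxiliaryDataInfo(outputDestination,
                                       (__bridge CFStringRef)auxDataType,
                                       (__bridge CFDictionaryRef)auxData);
CGImageDestinationFinalize(outputDestination);
CFRelease(outputDestination);
```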
I use PHPhotoLibrary to save the merged image, but I can't find any way to save it the way Apple does:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetCreationRequest *assetRequest = [PHAssetCreationRequest creationRequestForAsset];
    [assetRequest addResourceWithType:PHAssetResourceTypePhoto data:imagedata options:nil];
    ...
} completionHandler:nil];
The saved image doesn't show up under the PHAssetCollectionSubtypeSmartAlbumDepthEffect smart album, and I can't force images into an album of PHAssetCollectionTypeSmartAlbum, since smart albums are populated by the system.
When I browse the saved image in the Photos app, it does contain the depth data, and I can turn the depth blur on in editing mode. But it doesn't show the "Portrait" tag at the top-left corner of the image.
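One way to check how Photos actually classified the saved asset is to inspect its media subtypes; the "Portrait" badge corresponds to PHAssetMediaSubtypePhotoDepthEffect. A minimal sketch, assuming `localIdentifier` was captured from the creation request's placeholder (my own variable name):

```objc
// Hedged sketch: check whether Photos classified the saved asset as a
// depth-effect ("Portrait") photo. `localIdentifier` is assumed to come
// from assetRequest.placeholderForCreatedAsset.localIdentifier.
PHFetchResult<PHAsset *> *result =
    [PHAsset fetchAssetsWithLocalIdentifiers:@[localIdentifier] options:nil];
PHAsset *asset = result.firstObject;
BOOL isPortrait = (asset.mediaSubtypes & PHAssetMediaSubtypePhotoDepthEffect) != 0;
```

In my case this flag presumably comes back NO, which matches the missing badge and the missing smart-album membership.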
Could anyone help me out and show me some code?
Thank you.