After conversion, the red parts of the video differ in color from the original video.

To compare the results, perform the following steps on one device running iOS 12.0 or later and on one device running a version earlier than iOS 12.0.


1. Run the sample project named ReaderAsset.

2. Wait until processing finishes and the console prints the text 'export movie to PhotosAlbum'.

3. Open the Photos app and find the newly exported video.

4. Compare the colors of the new video with the original video. On devices running iOS 12.0 or later, the red in the new video differs from the red in the original; on devices below iOS 12.0, the red is the same.


I have provided a demo project that reproduces the problem, together with the source video needed to reproduce it and the generated video, so the two can be compared and viewed:


http://pab2071yd.bkt.clouddn.com/ReaderAndWriterAsset.zip

http://pab2071yd.bkt.clouddn.com/20190821153411623.MP4

http://pab2071yd.bkt.clouddn.com/20190821153411623-after.mp4

Replies

Please describe precisely what you get, what you expect, and post the part of the code where you see the problem, so that there is no need to run your code.

On iPhone devices running iOS 12.0 or later, a video converted with the code below differs in color from the original video: the originally red parts become yellow.


With the same code on iPhone devices running versions earlier than iOS 12.0 (excluding iOS 12.0), the converted video shows no color difference from the original, and the red parts remain the same red as in the original video.


- (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset {
    AVMutableVideoComposition *videoComposition;
    AVMutableVideoCompositionInstruction *instruction;
    AVMutableVideoCompositionLayerInstruction *layerInstruction = nil;
    AVAssetTrack *assetVideoTrack = nil;

    // Grab the first video track, if the asset has one.
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    }

    // Target output size, and the transform that maps the track into it.
    // Note: here the transform is only used for the render-size fallback below.
    CGSize videoSize = CGSizeMake(1280, 720);
    CGAffineTransform transform = [self transformWithVideoAssetTrack:assetVideoTrack
                                                         rotateAngle:0
                                                        displayFrame:CGRectMake(0, 0, videoSize.width, videoSize.height)];

    // A single instruction spanning the whole asset.
    instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

    layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:assetVideoTrack];

    // Fall back to the transformed natural size if no explicit size is set.
    CGSize renderSize = videoSize;
    if (CGSizeEqualToSize(renderSize, CGSizeZero)) {
        renderSize = CGSizeApplyAffineTransform(assetVideoTrack.naturalSize, transform);
        renderSize = CGSizeMake(fabs(renderSize.width), fabs(renderSize.height));
    }

    videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = renderSize;

    // 30 fps: each frame lasts 1000/30000 of a second.
    float frameRate = 30;
    videoComposition.frameDuration = CMTimeMake(1000, 1000 * frameRate);

    NSLog(@"frameDuration {value: %lld, timescale: %d}",
          videoComposition.frameDuration.value, videoComposition.frameDuration.timescale);

    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    return videoComposition;
}