Depth Map is inverted?

The depth map (actually a disparity map) I'm extracting from a picture taken with an iPhone Plus is inverted. That is: objects close to the lens are dark/black (low values), and objects in the distance are white (high values).


That seems to be the opposite of what Apple showed at WWDC. Is that normal? I'm able to get it "right" (close: high values, distance: low values) by simply inverting the whole map, but is it supposed to work like that?


I've tested with two different pictures and both of them had the same result. I haven't got an iPhone Plus at hand to test further; I'd appreciate any help.
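A minimal sketch of one way to do that flip, assuming the auxiliary map has been loaded into a CIImage called depthImage (the name is illustrative): Core Image's CIDepthToDisparity filter converts a depth-style map (close = small values) into a disparity-style map (close = large values), so near objects render bright.

// Convert depth (close = small) to disparity (close = large) with Core Image.
CIFilter *toDisparity = [CIFilter filterWithName:@"CIDepthToDisparity"];
[toDisparity setValue:depthImage forKey:kCIInputImageKey];
CIImage *disparityImage = toDisparity.outputImage;  // near objects are now bright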

Replies

How are you reading the depth data? The iPhone 7 Plus saves disparity data by default (closer items are brighter). Are you converting it to the depth data type when loading?
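As a rough sketch of that check (not from this thread; it assumes you already have an AVDepthData instance called depthData, created from the file's auxiliary data):

// The file may hold either disparity or depth; normalize to 32-bit disparity
// so that closer objects end up with larger values.
if (depthData.depthDataType != kCVPixelFormatType_DisparityFloat32) {
    depthData = [depthData depthDataByConvertingToDepthDataType:kCVPixelFormatType_DisparityFloat32];
}
// depthData.depthDataMap is now a CVPixelBuffer of float disparity values.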

On which build of the OS was the photo taken?


Watch the seed notes carefully. This photo may have been taken on a version of the seed that contained a bug which has since been addressed.

I'm using the photo of a dog that you made available (thanks a lot).


I've tried loading the map many different ways, but this is the one I preferred, using Core Image:


[asset requestContentEditingInputWithOptions:nil completionHandler:^(PHContentEditingInput * _Nullable contentEditingInput, NSDictionary * _Nonnull info) {
    CIImage *depthImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL
                                                  options:@{kCIImageAuxiliaryDepth : @YES}];
}];


If I simply turn that depthImage into a UIImage and display it, this is what I get:

https://www.dropbox.com/s/9ur2js0wp0ikklp/map-dog.png?dl=0


Closer items are dark.


If I get the photo metadata and print it, it says it has a Disparity map.


Any idea what's going on?
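One thing worth trying (just a sketch, not verified against this exact photo): ask Core Image for the disparity auxiliary image explicitly with kCIImageAuxiliaryDisparity instead of kCIImageAuxiliaryDepth, and check whether close objects then come out bright.

// Request the disparity auxiliary image rather than the depth one.
CIImage *disparityImage =
    [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL
                            options:@{kCIImageAuxiliaryDisparity : @YES}];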

I don't know, because I didn't take the picture, and I haven't got an iPhone Plus with me. Is there any way you could make available a picture with the depth effect taken on iOS 11? Thanks.

So, did anyone find out whether the depth map is supposed to be inverted (black for closer objects, white in the distance) by default?

Take CoreImage out of the equation. Look at the actual contents of a disparity map taken from a file and wrapped with an AVDepthData. Close objects will have higher values than far objects.
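A minimal sketch of that check, assuming imageURL is the photo's file URL (for example contentEditingInput.fullSizeImageURL) and skipping error handling:

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>

// Read the disparity auxiliary data straight from the file with ImageIO.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
NSDictionary *auxInfo = CFBridgingRelease(
    CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDisparity));
CFRelease(source);

// Wrap it in an AVDepthData and force 32-bit disparity for easy inspection.
NSError *error = nil;
AVDepthData *depthData = [AVDepthData depthDataFromDictionaryRepresentation:auxInfo error:&error];
depthData = [depthData depthDataByConvertingToDepthDataType:kCVPixelFormatType_DisparityFloat32];

// Sample a pixel: close objects should have larger disparity values than far ones.
CVPixelBufferRef map = depthData.depthDataMap;
CVPixelBufferLockBaseAddress(map, kCVPixelBufferLock_ReadOnly);
const float *firstRow = (const float *)CVPixelBufferGetBaseAddress(map);
NSLog(@"disparity at (0,0): %f", firstRow[0]);
CVPixelBufferUnlockBaseAddress(map, kCVPixelBufferLock_ReadOnly);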

I took a look at the sample dog photo and verified that it does have a disparity map in its depth data. This is the default for anything taken with the built-in Portrait mode camera. Once you read the image's auxiliary data, you can get the kCGImageAuxiliaryDataTypeDisparity dictionary and create an AVDepthData from it.


Hope this helps!

James