Hi,
I've done everything required to implement this intent, but whether I run in the simulator or on a device, handlerForIntent is never called when Maps launches and I tap my profile icon to try to get the 'Vehicles' option to show up. I know handlerForIntent works, because it is called for my custom Siri Shortcuts intent. I've added INListCarsIntent to the IntentsSupported list in my extension's Info.plist, adopted the relevant protocol in my INExtension subclass, and implemented the functions that return the vehicle list. I've also requested Siri authorization, which returns Authorized.
I'm really at a loss to know why Maps isn't calling this intent. I'm on iOS 16 and Xcode 14.
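For reference, this is the shape of the IntentsSupported declaration I have in the Intents extension's Info.plist (a minimal fragment using the standard Intents extension keys; surrounding keys omitted):

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>IntentsSupported</key>
        <array>
            <string>INListCarsIntent</string>
        </array>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.intents-service</string>
</dict>
```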
Thanks
Hi,
Should the position returned by the face tracking ARFaceAnchor transform property match a 3D point derived by taking the depth map and using the intrinsics matrix to translate it into a 3D point? I thought they should correlate, but when I look at my head position in the 3D cloud produced from the depth map plus the intrinsics matrix, its center position does not seem to correlate with the position returned by the face anchor.
I'm using currentFrame.capturedDepthData.cameraCalibrationData.intrinsicMatrix and currentFrame.capturedDepthData.cameraCalibrationData.intrinsicMatrixReferenceDimensions to take the depth map value (front-facing camera, real meters) to a 3D point using the formula:

    // use ARKit camera intrinsics
    float principleX = thisPhoto.intrinsicMatrix.m20;
    float principleY = thisPhoto.intrinsicMatrix.m21;
    float focalX = thisPhoto.intrinsicMatrix.m00;
    float focalY = thisPhoto.intrinsicMatrix.m11;
    // add point
    float U = (float)u / (float)thisPhoto.depthWidth;
    float V = (float)v / (float)thisPhoto.depthHeight;
    float y = ((float)U * thisPhoto.intrinsicSize.x - principleX) * z / focalX;
    float x = ((float)V * thisPhoto.intrinsicSize.y - principleY) * z / focalY;

Thanks
Ray
Hi All. Not sure what I'm doing wrong here, but AVDepthData, which is supposed to be in meters for non-disparity data (front-facing camera), is returning huge numbers. I have the code:

-(bool)getCurrentFrameDepthBufferIntoBuffer:(ARSession*)session buffer:(BytePtr)buffer width:(int)width height:(int)height bytesPerPixel:(int)bytesPerPixel
{
    // do we have a current frame?
    if (session.currentFrame != nil)
    {
        // do we have captured depth data?
        if (session.currentFrame.capturedDepthData != nil)
        {
            // get depth data parameters
            int ciImageWidth = (int)CVPixelBufferGetWidth(session.currentFrame.capturedDepthData.depthDataMap);
            int ciImageHeight = (int)CVPixelBufferGetHeight(session.currentFrame.capturedDepthData.depthDataMap);

            // how many bytes per pixel?
            int bytesPerPixel;
            if (session.currentFrame.capturedDepthData.depthDataType == kCVPixelFormatType_DisparityFloat16 ||
                session.currentFrame.capturedDepthData.depthDataType == kCVPixelFormatType_DepthFloat16)
                bytesPerPixel = 2;
            else
                bytesPerPixel = 4;

            // copy to passed buffer
            CVPixelBufferLockBaseAddress(session.currentFrame.capturedDepthData.depthDataMap, kCVPixelBufferLock_ReadOnly);
            memcpy(buffer, session.currentFrame.capturedDepthData.depthDataMap, ciImageWidth*ciImageHeight*bytesPerPixel);
            float *floatBuffer = (float*)buffer;
            float maxDepth = 0.0f;
            float minDepth = 0.0f;
            for (int i = 0; i < ciImageWidth*ciImageHeight; i++)
            {
                if (floatBuffer[i] > maxDepth)
                    maxDepth = floatBuffer[i];
                if (floatBuffer[i] < minDepth)
                    minDepth = floatBuffer[i];
            }
            NSLog(@"In iOS, max depth is %f min depth is %f", maxDepth, minDepth);
            CVPixelBufferUnlockBaseAddress(session.currentFrame.capturedDepthData.depthDataMap, kCVPixelBufferLock_ReadOnly);
        }
    }
    return true;
}

But it's returning min and max values like:

2019-06-27 12:32:32.167868+0900 AvatarBuilder[13577:2650159] In iOS, max depth is 3531476501829561451725831270301696000.000000 min depth is -109677129931746407817494761329131520.000000

which looks nothing like meters.