I have been doing a bunch of experiments with the estimatedDepthData buffer, trying to unproject points based on the data. The data looks a bit noisy at times, and distances can be way off. I'm working in a custom rendering environment (openFrameworks), so things are a bit different. My unProject function, shown below, works: screenPoint.z = 0 maps to the near clip plane and 1 to the far clip plane. I need a better way to map a depthData pixel into that range; right now I just scale the depth pixel until it looks right visually.

Basic unproject function using GLM:

```cpp
ofRectangle viewport(0, 0, ofGetScreenWidth(), ofGetScreenHeight());

// screen point -> normalized device coordinates
glm::vec3 CameraXYZ;
CameraXYZ.x = 2.0f * (screenPoint.x - viewport.x) / viewport.width - 1.0f;
CameraXYZ.y = 1.0f - 2.0f * (screenPoint.y - viewport.y) / viewport.height;
CameraXYZ.z = screenPoint.z;

glm::mat4 projMat = common::convert<glm::mat4, ofMatrix4x4>(projectionMatrix);
glm::mat4 viewMat = common::convert<glm::mat4, ofMatrix4x4>(viewMatrix);
glm::mat4 MVPmatrix = projMat * viewMat;

// NDC -> world space, with the perspective divide by w
auto world = glm::inverse(MVPmatrix) * glm::vec4(CameraXYZ, 1.0);
return glm::vec3(world) / world.w;
```

I was going to try the built-in Apple unprojectPoint:ontoPlaneWithTransform:orientation:viewportSize: but I'm having a hard time understanding what the ontoPlaneWithTransform argument should be. Any thoughts?
Posted by vanderlin.