Place an image on a plane at the location of an anchor

I am trying to add a 3D plane with a textured image onto an anchor that is set by recognizing an image. I have followed this documentation so far: https://developer.apple.com/documentation/arkit/recognizing_images_in_an_ar_experience and have tried a number of modifications, combining it with the PlaneDetection sample project, to detect the image and add a thin plane over it with a texture. The idea is that if I look at a specific poster on my wall, the image recognition callback replaces the poster with another image on the plane I just placed. Does anyone have any experience with this? Even a few keywords that I could research would be helpful. Thank you!

More specifically, I have this piece of code, using the sample code above as a reference (a rough sketch of the overlay idea follows the listing below):
/*
See LICENSE folder for this sample’s licensing information.

Abstract:
Main view controller for the AR experience.
*/
import ARKit
import SceneKit
import UIKit
class ViewController: UIViewController, ARSCNViewDelegate, ARSessionDelegate {
@IBOutlet var sceneView: ARSCNView!
@IBOutlet weak var blurView: UIVisualEffectView!
let planeIdentifiers = [UUID]()
var anchors = [ARAnchor]()
var nodes = [SCNNode]()
// Bookkeeping for detected planes.
var planeNodesCount = 0
let planeHeight: CGFloat = 0.01
// Set once the user has tapped to choose a plane.
var isPlaneSelected = false
// Template node that gets cloned when placing content on the selected plane.
var lampNode: SCNNode?
// The view controller that displays the status and "restart experience" UI.
lazy var statusViewController: StatusViewController = {
return childViewControllers.lazy.flatMap({ $0 as? StatusViewController }).first!
}()
// A serial queue for thread-safe manipulation of the scene graph.
let updateQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! +
".serialSceneKitQueue")
// Convenience accessor for the session owned by the ARSCNView.
var session: ARSession {
return sceneView.session
}
// MARK: - View Controller Life Cycle
override func viewDidLoad() {
super.viewDidLoad()
// Set up the scene and session delegates.
sceneView.delegate = self
sceneView.session.delegate = self
// Hook up the status view controller callback.
statusViewController.restartExperienceHandler = { [unowned self] in
self.restartExperience()
}
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
// Prevent the screen from being dimmed to avoid interrupting the AR experience.
UIApplication.shared.isIdleTimerDisabled = true
// Start the AR experience.
resetTracking()
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
session.pause()
}
// MARK: - Session management
// Prevents restarting the session while a restart is in progress.
var isRestartAvailable = true
// Creates a new AR configuration and runs it on the session.
func resetTracking() {
guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
fatalError("Missing expected asset catalog resources.")
}
let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = referenceImages
// Plane detection is required for the existingPlaneUsingExtent hit tests used below.
configuration.planeDetection = [.horizontal, .vertical]
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
statusViewController.scheduleMessage("Look around to detect images", inSeconds: 7.5, messageType: .contentPlacement)
}
// MARK: - ARSCNViewDelegate (image detection results)
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
guard let imageAnchor = anchor as? ARImageAnchor else { return }
let referenceImage = imageAnchor.referenceImage
updateQueue.async {
// Create a plane to visualize the initial position of the detected image.
let plane = SCNPlane(width: referenceImage.physicalSize.width,
height: referenceImage.physicalSize.height)
let planeNode = SCNNode(geometry: plane)
planeNode.opacity = 0.25
/*
`SCNPlane` is vertically oriented in its local coordinate space, but
`ARImageAnchor` assumes the image is horizontal in its local space, so
rotate the plane to match.
*/
planeNode.eulerAngles.x = -.pi / 2
/*
Image anchors are not tracked after initial detection. Apple's sample fades
the plane out with `imageHighlightAction`; that call is omitted here so the
plane stays visible over the detected image.
*/
node.addChildNode(planeNode)
}
DispatchQueue.main.async {
let imageName = referenceImage.name ?? ""
self.statusViewController.cancelAllScheduledMessages()
self.statusViewController.showMessage("Detected image “\(imageName)”")
}
}
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
guard let touch = touches.first else { return }
let location = touch.location(in: sceneView)
if !isPlaneSelected {
selectExistingPlane(location: location)
} else {
addNodeAtLocation(location: location)
}
}
// Selects the plane the user tapped and discards all other plane anchors.
func selectExistingPlane(location: CGPoint) {
// Hit test against detected planes (requires plane detection to be enabled).
let hitResults = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
if let result = hitResults.first, let planeAnchor = result.anchor as? ARPlaneAnchor {
// Remove every anchor except the one that was tapped.
for anchor in anchors where anchor.identifier != planeAnchor.identifier {
sceneView.node(for: anchor)?.removeFromParentNode()
sceneView.session.remove(anchor: anchor)
}
// Keep only the selected anchor.
anchors = [planeAnchor]
isPlaneSelected = true
print("plane selected!")
setPlaneTexture(node: sceneView.node(for: planeAnchor)!)
}
}
func setPlaneTexture(node: SCNNode) {
if let geometryNode = node.childNodes.first {
geometryNode.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "art.scnassets/wood.png")
geometryNode.geometry?.firstMaterial?.locksAmbientWithDiffuse = true
geometryNode.geometry?.firstMaterial?.diffuse.wrapS = .repeat
geometryNode.geometry?.firstMaterial?.diffuse.wrapT = .repeat
geometryNode.geometry?.firstMaterial?.diffuse.mipFilter = .linear
}
}
// Places a clone of lampNode at the tapped location on the selected plane.
func addNodeAtLocation(location: CGPoint) {
guard anchors.count > 0 else {
print("anchors are not created yet")
return
}
let hitResults = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
if let result = hitResults.first {
let newLocation = SCNVector3Make(result.worldTransform.columns.3.x,
result.worldTransform.columns.3.y,
result.worldTransform.columns.3.z)
let newLampNode = lampNode?.clone()
if let newLampNode = newLampNode {
newLampNode.position = newLocation
sceneView.scene.rootNode.addChildNode(newLampNode)
}
}
}
// Fade-out animation from Apple's sample; not currently run anywhere in this snippet.
var imageHighlightAction: SCNAction {
return .sequence([
.wait(duration: 0.25),
.fadeOpacity(to: 0.85, duration: 0.25),
.fadeOpacity(to: 0.15, duration: 0.25),
.fadeOpacity(to: 0.85, duration: 0.25),
.fadeOut(duration: 0.5),
.removeFromParentNode()
])
}
}
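
Here is a rough, minimal sketch of the overlay idea described at the top, not a drop-in implementation: it builds a plane sized to the detected reference image and textures it with a replacement picture chosen by referenceImage.name. The helper name overlayNode(for:) and the asset names "poster", "replacementPoster", and "defaultOverlay" are just placeholders I made up.

import ARKit
import SceneKit
import UIKit

// Builds a node that covers a detected reference image with a replacement picture.
// Call this from renderer(_:didAdd:for:) when the anchor is an ARImageAnchor.
func overlayNode(for imageAnchor: ARImageAnchor) -> SCNNode {
    let referenceImage = imageAnchor.referenceImage

    // Pick a replacement based on which reference image was recognized.
    // These asset names are placeholders; substitute the ones in your catalog.
    let replacementName = (referenceImage.name == "poster") ? "replacementPoster" : "defaultOverlay"

    // Size the plane to the poster's real-world dimensions so it covers it exactly.
    let plane = SCNPlane(width: referenceImage.physicalSize.width,
                         height: referenceImage.physicalSize.height)
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: replacementName)
    material.lightingModel = .constant // flat shading so it reads like a printed poster
    plane.materials = [material]

    let planeNode = SCNNode(geometry: plane)
    // SCNPlane is vertical in its local space, while the image anchor's image lies
    // in its local X-Z plane, so rotate the plane to lie flat on the anchor.
    planeNode.eulerAngles.x = -.pi / 2
    return planeNode
}

// Usage inside renderer(_:didAdd:for:):
//     guard let imageAnchor = anchor as? ARImageAnchor else { return }
//     node.addChildNode(overlayNode(for: imageAnchor))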
I am trying to detect an image saved in the AR Resources group, then stabilize and texture the plane so it covers the real-life poster with a virtual image chosen by the name of the detected reference image. Thanks!
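
On the stabilizing side, my understanding (please correct me if wrong) is that with a plain ARWorldTrackingConfiguration the image anchor is placed once at detection time and not tracked afterwards, so the overlay can drift if the poster or camera moves. If targeting iOS 12 is an option, asking the configuration to keep tracking the image should keep the anchor, and any child plane, pinned to the poster. A minimal sketch, assuming the same "AR Resources" group as above; the helper name makeImageDetectionConfiguration() is just a placeholder. The APIs that seem relevant are ARImageAnchor, detectionImages, maximumNumberOfTrackedImages, and ARImageTrackingConfiguration.

import ARKit

func makeImageDetectionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) {
        configuration.detectionImages = referenceImages
        if #available(iOS 12.0, *) {
            // Keep tracking one detected image so its anchor follows the poster.
            configuration.maximumNumberOfTrackedImages = 1
        }
    }
    return configuration
}

// Usage:
//     session.run(makeImageDetectionConfiguration(), options: [.resetTracking, .removeExistingAnchors])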