I have added a material to the geometry of an SCNNode, and now I want to add another material to it with blend mode "multiply". I tried a lot but could not find a way to do this. I blend the texture as follows:

material.lightingModel = .physicallyBased
let image = UIImage(named: "1.PNG")
material.multiply.contents = image
material.multiply.contentsTransform = SCNMatrix4MakeScale(10, 10, 0)
material.multiply.wrapT = .repeat
material.multiply.wrapS = .repeat
material.multiply.intensity = 1.0

When the lighting model is set to .physicallyBased, the multiply doesn't work. Any clue what is wrong? Thanks!
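A possible workaround, sketched below under the assumption that the `multiply` slot does not compose as expected with the physically based lighting model: apply the second texture through a fragment shader modifier instead. The argument name `multiplyTexture` is an arbitrary choice and must match the key passed to `setValue(_:forKey:)`; the UV scale of 10 mirrors the `contentsTransform` from the question.

```swift
import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased

// Bind the second texture as a custom shader argument. Falls back to white
// (a no-op multiply) if the image asset is missing.
let multiplyImage = UIImage(named: "1.PNG")
let multiplyProperty = SCNMaterialProperty(contents: multiplyImage ?? UIColor.white)
material.setValue(multiplyProperty, forKey: "multiplyTexture")

// Multiply the shaded color by the custom texture after PBR shading runs.
material.shaderModifiers = [
    .fragment: """
    #pragma arguments
    texture2d<float> multiplyTexture;
    #pragma body
    constexpr sampler repeatSampler(filter::linear, address::repeat);
    // Scale UVs by 10 to emulate the contentsTransform from the question.
    float2 uv = _surface.diffuseTexcoord * 10.0;
    _output.color.rgb *= multiplyTexture.sample(repeatSampler, uv).rgb;
    """
]
```

This runs after the physically based shading stage, so the multiply is applied to the final lit color rather than relying on the fixed-function `multiply` slot.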
Hello guys, how can I build a Core ML model that ranks a list of images by how similar they are to an input image? Clarifai.ai has this feature, but I wanted to do it in Core ML. Thanks!
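One approach that avoids training a model at all, sketched here as an assumption rather than a confirmed answer to the question: Vision's `VNGenerateImageFeaturePrintRequest` produces an embedding per image, and ranking candidates by feature-print distance gives a "most similar" ordering. The function names `featurePrint(for:)` and `rank(candidates:against:)` are illustrative.

```swift
import Vision
import UIKit

// Compute a Vision feature print (an image embedding) for one image.
func featurePrint(for image: UIImage) throws -> VNFeaturePrintObservation? {
    guard let cgImage = image.cgImage else { return nil }
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

// Rank candidate images by distance to the query's feature print;
// a smaller distance means more similar.
func rank(candidates: [UIImage], against query: UIImage) throws -> [(index: Int, distance: Float)] {
    guard let queryPrint = try featurePrint(for: query) else { return [] }
    var scored: [(Int, Float)] = []
    for (i, candidate) in candidates.enumerated() {
        guard let candidatePrint = try featurePrint(for: candidate) else { continue }
        var distance: Float = 0
        try queryPrint.computeDistance(&distance, to: candidatePrint)
        scored.append((i, distance))
    }
    return scored.sorted { $0.1 < $1.1 }
}
```

If a custom Core ML model is still preferred, the same structure applies: run each image through an embedding model and sort by distance.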
How can I set SceneKit to use a custom renderer? For example, I would like to use the Metal ray tracing code below as a custom SceneKit rendering mode: https://developer.apple.com/documentation/metalperformanceshaders/metal_for_accelerating_ray_tracing
Any input on where I can start is highly appreciated! : )
Best
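As far as I know SceneKit has no "custom rendering mode" switch, but a starting point (sketched below, with the class name `HybridRenderer` being an arbitrary choice) is `SCNRenderer`: it renders a SceneKit scene into a command buffer you own, so you can interleave your own Metal work, such as an MPS ray-tracing pass, in the same frame.

```swift
import SceneKit
import Metal

final class HybridRenderer {
    let renderer: SCNRenderer
    let commandQueue: MTLCommandQueue

    init?(scene: SCNScene?) {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue() else { return nil }
        commandQueue = queue
        renderer = SCNRenderer(device: device, options: nil)
        renderer.scene = scene
    }

    func draw(to texture: MTLTexture, time: CFTimeInterval) {
        let pass = MTLRenderPassDescriptor()
        pass.colorAttachments[0].texture = texture
        pass.colorAttachments[0].loadAction = .clear
        pass.colorAttachments[0].storeAction = .store

        guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }
        // ... encode your own ray-tracing kernels into commandBuffer here ...
        renderer.render(atTime: time,
                        viewport: CGRect(x: 0, y: 0,
                                         width: texture.width, height: texture.height),
                        commandBuffer: commandBuffer,
                        passDescriptor: pass)
        commandBuffer.commit()
    }
}
```

The other direction also works: render the ray-traced result to a texture first and feed it back into SceneKit as material contents.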
I am trying to load an SCN or OBJ asset that has geometry and textures into the following Apple example code: https://developer.apple.com/documentation/metalperformanceshaders/metal_for_accelerating_ray_tracing
I didn't find anywhere that I could link SCN files into the scene. How can I do that?
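The sample builds its acceleration structure from raw vertex buffers, so one possible bridge (a sketch, assuming the geometry's vertex source uses three float components, which is the common case) is to pull positions out of each `SCNGeometry` and feed them to the sample's geometry-loading path:

```swift
import SceneKit
import simd

// Extract vertex positions from an SCNGeometry's vertex source.
// Assumes float components with 3 components per vector.
func vertexPositions(of geometry: SCNGeometry) -> [SIMD3<Float>] {
    guard let source = geometry.sources(for: .vertex).first else { return [] }
    var positions: [SIMD3<Float>] = []
    source.data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        for i in 0..<source.vectorCount {
            // dataOffset/dataStride describe how vectors are packed in the buffer.
            let offset = source.dataOffset + i * source.dataStride
            let x = raw.load(fromByteOffset: offset, as: Float.self)
            let y = raw.load(fromByteOffset: offset + 4, as: Float.self)
            let z = raw.load(fromByteOffset: offset + 8, as: Float.self)
            positions.append(SIMD3<Float>(x, y, z))
        }
    }
    return positions
}
```

You would load the file with `SCNScene(named:)`, walk `rootNode.childNodes` for nodes with geometry, and hand the extracted positions (plus indices from each `SCNGeometryElement`) to the sample's triangle list.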
Goal: convert FBX to USD (with morph targets).

Tried:
1 - Reality Converter: FBX to USD worked, but not morph targets. From what I researched, Reality Converter uses USDPython 0.62, which doesn't support morph targets. It would be great if Apple updated Reality Converter to support 0.64 with morph targets.
2 - USDPython 0.64: the usdzconvert command for FBX to USD worked, but not morph targets. It looks like the script that supports FBX to USD is usdStageWithFbx.py (inside the examples folder). I have seen no documentation on how to run this script.

Can someone from Apple help out with step-by-step instructions? Cheers to all!
Hey guys,

Goal: export morph targets from Maya -> DAE -> SceneKit, with morph target controllers.

In Maya everything works fine; morphs with sliders work as intended. When exported to FBX/DAE and opened in Xcode, the morph targets are there, but each one displaces the entire avatar instead of scaling the specific node (nose, arm, etc.).

https://gyazo.com/6f36a90ce5292b85a6f7a21b9a8918f2

How can I export from Maya to DAE, keeping morphs for each node? Or is there any other Maya -> SceneKit path? Thanks!
Wwdc20-10041 mentions that SceneKit scenes can now be shown directly in SwiftUI.
Before, the way to do that was using UIViewRepresentable, so we could display a scene (the ship, for example).
What would be the new way to show the ship in a SwiftUI view?
Using a SceneKit UIViewRepresentable (code block 1 below), we could add a tap gesture recognizer for hit testing, like this.
With the new SceneView for SceneKit/SwiftUI (code block 2), how do we add a tap gesture recognizer?
I found this API, but it's not clear how to use it:
https://developer.apple.com/documentation/scenekit/sceneview/3607839-ontapgesture
import SwiftUI
import SceneKit
import UIKit

// Code block 1: the pre-SceneView approach via UIViewRepresentable.
// Named LegacySceneView so it does not collide with SwiftUI's SceneView.
struct LegacySceneView: UIViewRepresentable {

    func makeCoordinator() -> Coordinator {
        Coordinator()
    }

    func makeUIView(context: Context) -> SCNView {
        let view = SCNView(frame: .zero)
        view.scene = SCNScene(named: "ship")
        view.allowsCameraControl = true

        // A struct cannot be the target of a selector, so the tap
        // gesture is routed through the coordinator object instead.
        let tapGesture = UITapGestureRecognizer(target: context.coordinator,
                                                action: #selector(Coordinator.handleTap(_:)))
        view.addGestureRecognizer(tapGesture)
        return view
    }

    func updateUIView(_ view: SCNView, context: Context) {
    }

    class Coordinator: NSObject {
        @objc func handleTap(_ gestureRecognize: UIGestureRecognizer) {
            // Hit-test against the SCNView the gesture is attached to
            // (rather than creating a new, empty SCNView).
            guard let view = gestureRecognize.view as? SCNView else { return }
            let p = gestureRecognize.location(in: view)
            let hitResults = view.hitTest(p, options: [:])

            // check that we clicked on at least one object
            if let result = hitResults.first {
                // get the material for the selected geometry element
                let material = result.node.geometry!.materials[result.geometryIndex]

                // highlight it; on completion, un-highlight
                SCNTransaction.begin()
                SCNTransaction.animationDuration = 0.5
                SCNTransaction.completionBlock = {
                    SCNTransaction.begin()
                    SCNTransaction.animationDuration = 0.5
                    material.emission.contents = UIColor.black
                    SCNTransaction.commit()
                }
                material.emission.contents = UIColor.green
                SCNTransaction.commit()
            }
        }
    }
}
import SwiftUI
import SceneKit

struct ContentView: View {
    var scene = SCNScene(named: "ship.scn")

    var cameraNode: SCNNode? {
        scene?.rootNode.childNode(withName: "camera", recursively: false)
    }

    var body: some View {
        SceneView(
            scene: scene,
            pointOfView: cameraNode,
            options: []
        )
        .allowsHitTesting(true)
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
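One possible bridge between the two worlds, sketched below as an assumption rather than a documented pattern: `SceneView` does not expose its underlying `SCNView`, but it does accept an `SCNSceneRendererDelegate`, and the renderer handed to the delegate callback supports `hitTest(_:options:)`. Capturing it lets a gesture attached in SwiftUI do the hit testing. The names `RendererDelegate` and `TappableSceneView` are arbitrary.

```swift
import SwiftUI
import SceneKit
import UIKit

// Captures the SCNSceneRenderer that SceneView uses internally.
final class RendererDelegate: NSObject, SCNSceneRendererDelegate {
    var renderer: SCNSceneRenderer?
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        self.renderer = renderer
    }
}

struct TappableSceneView: View {
    var scene = SCNScene(named: "ship.scn")
    private let delegate = RendererDelegate()

    var body: some View {
        SceneView(scene: scene, options: [.allowsCameraControl], delegate: delegate)
            // A zero-distance DragGesture reports the tap location,
            // which plain onTapGesture did not at the time of this post.
            .gesture(
                DragGesture(minimumDistance: 0).onEnded { value in
                    guard let renderer = delegate.renderer else { return }
                    let hits = renderer.hitTest(value.location, options: [:])
                    if let first = hits.first {
                        first.node.geometry?.firstMaterial?
                            .emission.contents = UIColor.green
                    }
                }
            )
    }
}
```

If this indirection feels fragile, the UIViewRepresentable wrapper from code block 1 remains a fully supported fallback.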
Is it also possible to build paired image-to-image translation (like pix2pix) with Create ML?
Paired = e.g. B&W to color
Unpaired = e.g. style transfer
How can I ask questions about SceneKit? It looks like there are no SceneKit labs.
Should I apply for a Metal or RealityKit lab and ask SceneKit questions there?
I am trying to build the app example from wwdc20-10037, but I only get an "Untitled" screen, followed by this error:
2020-06-25 12:26:28.639346+0200 ShapeEdit[2093:54298] [Type Declaration Issues] Type "com.example.ShapeEdit.shapes" was expected to be declared and exported in the Info.plist of ShapeEdit.app, but it was not found.
How can I properly build it?
Edit Shape Code - https://developer.apple.com/forums/content/attachment/759f7b02-e8a7-495b-a4b5-ea41cde9b948
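The error says the custom type must be declared and exported in the app's Info.plist. A sketch of the missing `UTExportedTypeDeclarations` entry is below; the identifier comes from the error message, while the description and the `shapes` filename extension are assumptions and should match whatever the sample's document type actually uses.

```xml
<key>UTExportedTypeDeclarations</key>
<array>
    <dict>
        <key>UTTypeIdentifier</key>
        <string>com.example.ShapeEdit.shapes</string>
        <key>UTTypeDescription</key>
        <string>ShapeEdit document</string>
        <key>UTTypeConformsTo</key>
        <array>
            <string>public.data</string>
        </array>
        <key>UTTypeTagSpecification</key>
        <dict>
            <key>public.filename-extension</key>
            <array>
                <string>shapes</string>
            </array>
        </dict>
    </dict>
</array>
```

The matching `CFBundleDocumentTypes` entry should reference the same identifier so the app is registered as an editor for the type.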
I am trying to build the "Ray Tracing with Metal" sample on an iMac Pro 2017, but got this error:
2020-06-25 14:29:35.233058+0200 MTLRaytracingSample-macOS[3602:145531] Failed to set (contentViewController) user defined inspected property on (NSWindow): Ray tracing isn't supported on this device
Xcode 12
Big Sur 11.0 Beta (20A4299v)
iMac Pro 2017
3,2 GHz 8-Core Intel Xeon W
32 GB 2666 MHz DDR4
Radeon Pro Vega 56 8 GB
What is the required hardware configuration?
At wwdc20 session 10031, it’s mentioned that:
"To learn more about how best to show data in your app, you can download the code for Shape Edit from developer.apple.com."
Where is the source code for the app?
thanks
Amazing session! : )
Besides style transfer, is there a way to also train image-to-image GAN models (e.g. pix2pix)?
Is the Playground for Shape Edit code available?