I am following the video "Accelerate machine learning with Metal" and the sample code "Customizing a TensorFlow operation".
My Environment:
macOS 13.1
Python 3.8.15
tensorflow-deps 2.9.0
tensorflow-macos 2.9.0
tensorflow-metal 0.5.1
After running make in hash_encoder, hash_encoder_kernel.so appears. Then I tried to run python tiny_nerf_hash.py, but it didn't work.
In contrast, python tiny_nerf_mlp.py runs normally, but it doesn't take advantage of the acceleration.
I tried to define a matrix_double3x3 property, but the compiler doesn't allow me to.
I can define matrix_float3x3 or matrix_half3x3, but not matrix_double3x3.
However, matrix_types.h contains typedef simd_double3x3 matrix_double3x3.
My goal is to send a matrix_double3x3 to Metal to compute the rectified image.
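For what it's worth, the Metal Shading Language has no double-precision type, so a matrix_double3x3 can't be consumed by a shader directly. A minimal sketch of one workaround, assuming float precision is acceptable for the rectification (the helper names here are my own):

import simd

// Convert to float on the CPU before handing the matrix to Metal.
// (Loses precision beyond roughly 7 significant digits.)
func toFloat3(_ v: simd_double3) -> simd_float3 {
    simd_float3(Float(v.x), Float(v.y), Float(v.z))
}

func toFloat3x3(_ m: simd_double3x3) -> simd_float3x3 {
    simd_float3x3(columns: (toFloat3(m.columns.0),
                            toFloat3(m.columns.1),
                            toFloat3(m.columns.2)))
}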
Environment
Xcode 14.0 (14A309)
Destination:
macOS (Minimum Deployments: 12.3)
macOS 12.6 (21G115)
From the iOS & iPadOS 16 Beta 3 Release Notes:
It says to use a device with an A12 or A13 chip, but the iPad Pro 12.9-inch (4th generation) can't detect persons.
I tested with ARKit replay data in which a person can be captured:
Detected (tested on iPhone 13 Pro Max with iOS 15.5)
Not detected (tested on iPad Pro 12.9-inch Gen 4 with iOS 16 beta 3)
Tested with BodyDetection Project
Log
2022-07-10 17:27:27.153899+0700 BodyDetection[3026:1192001] Metal GPU Frame Capture Enabled
2022-07-10 17:27:27.154031+0700 BodyDetection[3026:1192001] Metal API Validation Enabled
2022-07-10 17:27:27.229646+0700 BodyDetection[3026:1192001] [Foundation.IO] Could not locate file 'default-binaryarchive.metallib' in bundle.
2022-07-10 17:27:27.497224+0700 BodyDetection[3026:1192001] [ECS.Core] Class for component AccessibilityComponent already registered
2022-07-10 17:27:27.592351+0700 BodyDetection[3026:1192001] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2022-07-10 17:27:27.870551+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.908466+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.910532+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.911679+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.912939+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.914706+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.915377+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.927888+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.928700+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.929341+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.929987+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.930610+0700 BodyDetection[3026:1192001] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2022-07-10 17:27:27.931604+0700 BodyDetection[3026:1192001] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2022-07-10 17:27:27.931645+0700 BodyDetection[3026:1192001] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2022-07-10 17:27:28.099950+0700 BodyDetection[3026:1192001] [] [17:27:28.099] captureSourceRemote_SetProperty signalled err=-16452 (kFigCaptureSourceError_SourceNotLocked) (Source must be locked for configuration to set properties) at FigCaptureSourceRemote.m:549
2022-07-10 17:27:28.100896+0700 BodyDetection[3026:1192001] [] [17:27:28.101] captureSourceRemote_SetProperty signalled err=-16452 (kFigCaptureSourceError_SourceNotLocked) (Source must be locked for configuration to set properties) at FigCaptureSourceRemote.m:549
2022-07-10 17:27:28.526246+0700 BodyDetection[3026:1192297] ⛔️⛔️⛔️ ERROR [MOVReaderInterface]: Error Domain=com.apple.AppleCV3DMOVKit.readererror Code=9 "CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}" UserInfo={NSLocalizedDescription=CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}} ⛔️⛔️⛔️
2022-07-10 17:27:29.129011+0700 BodyDetection[3026:1192297] ⛔️⛔️⛔️ ERROR [MOVReaderInterface]: Error Domain=com.apple.AppleCV3DMOVKit.readererror Code=9 "CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}" UserInfo={NSLocalizedDescription=CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}} ⛔️⛔️⛔️
Warning (secondary thread): in AppendProperty at line 858 of sdf/path.cpp -- Can only append a property 'preliminary:anchoring:type' to a prim path (/)
Warning (secondary thread): in AppendProperty at line 858 of sdf/path.cpp -- Can only append a property 'triggers' to a prim path (/)
2022-07-10 17:27:30.176576+0700 BodyDetection[3026:1192297] [profiling] ResetStream() : No ops in stream.
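For reference, a minimal check (my own snippet, not from the sample project) to confirm the device reports body-tracking support before running the session:

import ARKit

// Body tracking requires an A12 chip or later; bail out early if the
// current device doesn't claim support.
func runBodyTrackingIfSupported(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else {
        print("ARBodyTrackingConfiguration is not supported on this device")
        return
    }
    session.run(ARBodyTrackingConfiguration())
}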
I spent many hours figuring out why my app archive is now an "other item", not an iOS app.
You will see "Distribute Content" instead of "Distribute App".
Version and Identifier are now gone.
I tested it on Xcode 13.3.
The last archive you will see in the image is the one whose Creation Date is 3 Apr 2022 17:55.
I tried to build with the previous version (Xcode 13.2.1).
The error "Package.resolved file is corrupted or malformed" appeared.
I deleted the Package.resolved file.
Then the error was gone.
I tried to archive again and it worked.
Its Creation Date is 3 Apr 2022 18:30.
The problem seems to occur when an older Xcode project is opened with the current version of Xcode (13.3).
Goal Example
The user taps the 1st circle button, then the 2nd circle button.
Then a line is drawn from the 1st to the 2nd.
NOTE: the buttons are movable.
What I've tried
There are two Hearts in BoardView; each Heart has a MovableCard.
When a Heart is tapped, it notifies observers and passes self.
struct Heart: View {
    var body: some View {
        MovableCard("heart.fill")
            .onTapGesture {
                // Notify observers that this card was tapped, passing self
                NotificationCenter.default.post(name: heartOnTap,
                                                object: nil,
                                                userInfo: ["self": self])
            }
    }
}
There is a viewModel in BoardView that observes heartOnTap:
struct BoardView: View {
    @StateObject var viewModel = ViewModel()
    var body: some View {
        ZStack {
            Heart()
            Heart()
            Path { path in
                // POS is a placeholder for the position I can't get from the view
                path.move(to: viewModel.firstCard?.POS ?? CGPoint(x: -1, y: -1))
                path.addLine(to: viewModel.secondCard?.POS ?? CGPoint(x: -1, y: -1))
            }
        }
    }
}
The ViewModel listens for heartOnTap to receive the Heart view.
I send the Heart view, not the tap position, because the view is movable.
But a Heart that conforms to View doesn't give me any way to directly read its position.
class ViewModel: NSObject, ObservableObject { // NSObject is needed for #selector
    @Published var firstCard: Heart?
    @Published var secondCard: Heart?

    override init() {
        super.init()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(didTap(_:)),
                                               name: heartOnTap,
                                               object: nil)
    }

    @objc func didTap(_ notification: NSNotification) {
        guard let heart = notification.userInfo?["self"] as? Heart else { return }
        if firstCard == nil {
            firstCard = heart // <- but I need the Heart's position here to draw the line
        } else if secondCard == nil { // else-if, so one tap doesn't fill both slots
            secondCard = heart // <- same problem here
        }
    }
}
Discussion
Is there a possible solution for drawing a line from one view to another view, when the views are movable? (A sketch of one approach is at the end of this post.)
Alternative Goal
DigicalSim (iOS)
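A minimal sketch of one common approach to the question above (all names here are my own, not from the project): publish each view's center through a PreferenceKey instead of passing the view itself, then draw the line in the container. Because the centers come from GeometryReader, the line follows the hearts as they move.

import SwiftUI

// Reports each heart's center, keyed by id, up the view tree.
struct HeartCenterKey: PreferenceKey {
    static var defaultValue: [Int: CGPoint] = [:]
    static func reduce(value: inout [Int: CGPoint], nextValue: () -> [Int: CGPoint]) {
        value.merge(nextValue()) { $1 }
    }
}

// A draggable heart that publishes its own center via the preference.
struct TrackedHeart: View {
    let id: Int
    @State private var location: CGPoint

    init(id: Int, start: CGPoint) {
        self.id = id
        _location = State(initialValue: start)
    }

    var body: some View {
        Image(systemName: "heart.fill")
            .background(GeometryReader { proxy in
                // Report this view's center in the board's coordinate space
                let frame = proxy.frame(in: .named("board"))
                Color.clear.preference(key: HeartCenterKey.self,
                                       value: [id: CGPoint(x: frame.midX, y: frame.midY)])
            })
            .gesture(DragGesture(coordinateSpace: .named("board"))
                .onChanged { location = $0.location })
            .position(location)
    }
}

struct DemoBoardView: View {
    @State private var centers: [Int: CGPoint] = [:]

    var body: some View {
        ZStack {
            // The line redraws whenever either heart moves.
            Path { path in
                guard let a = centers[0], let b = centers[1] else { return }
                path.move(to: a)
                path.addLine(to: b)
            }
            .stroke(Color.red, lineWidth: 2)
            TrackedHeart(id: 0, start: CGPoint(x: 100, y: 200))
            TrackedHeart(id: 1, start: CGPoint(x: 260, y: 400))
        }
        .coordinateSpace(name: "board")
        .onPreferenceChange(HeartCenterKey.self) { centers = $0 }
    }
}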
CloudKit + Core Data
I created two entities
Credit Card
Relationship: transactions
Transaction
Relationship: creditcard
Configuration: "Default" is already set to "Used with CloudKit".
Problem
The relationships don't sync. For example, on the 1st device:
let creditCard = CreditCard(context: viewContext)
creditCard.name = "First Card"

let firstTransaction = CCTransaction(context: viewContext)
firstTransaction.name = "First Transaction"
firstTransaction.price = 100

creditCard.addToTransactions(firstTransaction)
firstTransaction.creditcard = creditCard
try! viewContext.save()

// Check transactions
print(creditCard.transactions?.count ?? 0) // 1
On the 2nd device, after syncing:
@FetchRequest(
    sortDescriptors: [NSSortDescriptor(keyPath: \CreditCard.name, ascending: true)],
    animation: .default)
var creditcards: FetchedResults<CreditCard>

// Check transactions
ForEach(creditcards) { creditcard in
    Card("")
        .onTapGesture {
            print(creditcard.transactions?.count ?? 0) // 0 on every device except the one that recorded the relationship
        }
}
It looks like the relationship never syncs
Data Model
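In case it helps with reproducing this, here is a minimal sketch of the store setup I would double-check (my own snippet; "Model" is a placeholder name). As far as I know, CloudKit mirroring requires every relationship in the model to have an inverse, and the store needs persistent history tracking enabled:

import CoreData

// A minimal NSPersistentCloudKitContainer setup sketch.
let container = NSPersistentCloudKitContainer(name: "Model")
if let description = container.persistentStoreDescriptions.first {
    // Required for CloudKit mirroring
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
    // Lets the UI refresh when changes from the other device arrive
    description.setOption(true as NSNumber,
                          forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)
}
container.loadPersistentStores { _, error in
    if let error = error { fatalError("Store failed to load: \(error)") }
}
container.viewContext.automaticallyMergesChangesFromParent = true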
After updating to iOS/iPadOS 15 Beta 5, debugging at run time freezes for about 30 seconds to 1 minute.
Then the console shows:
2021-08-17 19:05:46.063366+0700 AppName[1451:1049560] Writing analzed variants.
Is anyone else experiencing the same problem?
Description
Create a new project.
In ContentView.swift, add import RealityKit.
The error below appears.
Error
❌ Failed to build module 'RealityKit' for importation due to the errors above; the textual interface may be broken by project issues or a compiler bug
Overview
After creating a Reality Composer project, we can add its 3D content to an ARView by calling this code:
if let boxScene = try? Experience.loadBox() {
arView.scene.anchors.append(boxScene)
}
By calling the code above, the scene is automatically spawned along the ray through the center of the view.
Expected
To make it feel natural to the user, the app should spawn the content at the user's tap position. To do so, I started by getting the tap position.
Get the tap position
let arView = ARView()
context.coordinator.arView = arView

// Register tap gesture
let tapGesture = UITapGestureRecognizer(target: context.coordinator,
                                        action: #selector(context.coordinator.tapped(sender:)))
arView.addGestureRecognizer(tapGesture)

//...

@objc func tapped(sender: UITapGestureRecognizer) {
    // Get the tap position in the view
    let tapPosition = sender.location(in: arView)
    // Call `spawnBoxAt(_:)` to spawn a box along the ray of the user's tap position
    spawnBoxAt(tapPosition)
}
spawnBoxAt(tapPosition)
Experience.loadArrowDownAsync { [weak self] result in
    guard let arrowDownScene = try? result.get() else { return }

    // Perform a raycast from the tap position
    if let ray = self?.arView.ray(through: tapPosition),
       let results = self?.arView.scene.raycast(origin: ray.origin,
                                                direction: ray.direction,
                                                query: .nearest) {
        // Keep only hits against scene understanding
        let hitSceneUnderstands = results.filter { $0.entity is HasSceneUnderstanding }
        guard let hitFirst = hitSceneUnderstands.first else { return }
        let point = hitFirst.position

        // Set the position (I tried setting it both before and after appending)
        arrowDownScene.setPosition(point, relativeTo: nil)
        self?.arView.scene.anchors.append(arrowDownScene)
        arrowDownScene.setPosition(point, relativeTo: nil)
    }
}
Unfortunately, after this code runs, arrowDownScene.position is equal to point (taken from hitFirst.position), but the object is not spawned at the tap position; it still appears along the ray through the center of the view.
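For reference, a sketch of one workaround I would try (my own guess, not a confirmed fix): the loaded scene is itself an anchor with an anchoring component, which can override a manually set position, so reparent its contents to a world-space anchor placed at the hit point instead:

// Inside the raycast block above, after computing `point`:
let worldAnchor = AnchorEntity(world: point)
// Copy the children first, because addChild(_:) reparents entities
// and mutates the collection being iterated.
for entity in Array(arrowDownScene.children) {
    worldAnchor.addChild(entity)
}
self?.arView.scene.addAnchor(worldAnchor)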