Hi there,
I've just started digging into HomeKit development, so it's quite possible I'm missing something (or this is expected?), but calling addAndSetupAccessories always fails instantly (no UI appears) when running in the simulator. The error is:
addAndSetupAccessories got error Optional(Error Domain=HMErrorDomain Code=79 "Failed to add the accessory." UserInfo={NSLocalizedDescription=Failed to add the accessory.})
I'm running the HomeKit Accessory Simulator with a home created and one standard lightbulb accessory added.
If I run on a device it works as expected (using my real HomeKit accessories).
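For reference, this is roughly how I'm calling it (a simplified sketch; home stands in for my primary HMHome from the HMHomeManager):
// Simplified sketch of the failing call; `home` is assumed to be
// homeManager.primaryHome (i.e. the home set up in the HomeKit Accessory Simulator).
home.addAndSetupAccessories { error in
    if let error = error {
        print("addAndSetupAccessories got error \(String(describing: error))")
    }
}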
Xcode: Version 12.0 beta 4 (12A8179i)
HomeKit Accessory Simulator: Version 4.0 (135.3)
Just wondering: whenever I call readValueWithCompletionHandler on an HMCharacteristic, I get log messages like:
[-[HMCharacteristic readValueWithCompletionHandler:], /Library/Caches/com.apple.xbs/Sources/HomeKit/HomeKit-714.0.4.1.1/Sources/HomeKit/HMCharacteristic.m:513 (DBEA6EA4-BD3B-4673-BB61-934AF9480EB0)] Thread left active (1): <NSThread: 0x282534680>{number = 1, name = (null)}
I've tried various things, like changing when I call it and making sure I call it on the main thread, but always with the same result. Is this expected? Is it bad?
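For reference, the call is essentially this (a minimal sketch; characteristic is an HMCharacteristic from one of my accessories' services):
// Minimal sketch of the call that triggers the "Thread left active" log message.
// `characteristic` is assumed to be an HMCharacteristic obtained from an HMService.
// (readValue(completionHandler:) is the Swift spelling of readValueWithCompletionHandler.)
characteristic.readValue { error in
    if let error = error {
        print("read failed: \(error)")
    } else {
        print("value: \(String(describing: characteristic.value))")
    }
}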
I'm using Xcode 12.0.1 (12A7300), running on iPadOS 14.0
Hi there,
I'm running into issues when creating occluders from arbitrary MDLMeshes (i.e. 3D models I create in-app and then turn into MDLMeshes, rather than MDLMesh.newBox() etc.). Some meshes work fine, others hit an assert deep in the PHASE C++ code. I've tested on the simulator and on an iPhone 12 (iOS 16 and now iOS 17 beta 3).
I've tried to figure out whether there's some kind of pattern to which meshes work and which don't, but haven't been able to find one. Note that using the built-in MDLMesh primitives works fine.
The assert is:
Assertion failed: (voxelIndex < level.mVoxels.Count()), function AddBuilderVoxelToSubtree, file GeoVoxelTree.cpp, line 188.
The code is:
func createOccluder(from meshes: [MDLMesh], at transform: Transform, preset: PHASEMaterialPreset) throws -> PHASEOccluder {
    guard let engine else {
        throw "No engine"
    }

    print("audio meshes: \(meshes)")

    let material = PHASEMaterial(engine: engine, preset: preset)

    var shapes: [PHASEShape] = []
    for mesh in meshes {
        let meshShape = PHASEShape(engine: engine, mesh: mesh)
        for element in meshShape.elements {
            element.material = material
        }
        shapes.append(meshShape)
    }

    let occluder = PHASEOccluder(engine: engine, shapes: shapes)
    occluder.worldTransform = transform.matrix
    try engine.rootObject.addChild(occluder)

    return occluder
}
The assert fires on this line:
let occluder = PHASEOccluder(engine: engine, shapes: shapes)
Any ideas on what could be going on here?
Cheers,
Mike
Screenshots of the call stack etc.:
Hi there,
I'm not sure if I'm missing something, but I've tried passing a variety of CGImages into SCSensitivityAnalyzer, including ones that should be flagged as sensitive, and it always returns false. It doesn't throw an exception, and I have the Sensitive Content Warning enabled in Settings (confirmed by checking the analysisPolicy at run time).
I've tried both the async and callback versions of analyzeImage.
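For reference, the async path looks roughly like this (a simplified sketch; cgImage is one of my test images):
import CoreGraphics
import SensitiveContentAnalysis

// Simplified sketch of the async path; `cgImage` is one of the test images.
func check(_ cgImage: CGImage) async {
    let analyzer = SCSensitivityAnalyzer()
    print("policy: \(analyzer.analysisPolicy)")        // confirms the feature is enabled
    do {
        let analysis = try await analyzer.analyzeImage(cgImage)
        print("isSensitive: \(analysis.isSensitive)")  // always comes back false for me
    } catch {
        print("analyzeImage threw: \(error)")
    }
}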
This is with Xcode 15 beta 5.
I'm primarily testing on iOS/iPadOS simulators - is that a known issue?
Cheers,
Mike
Hi there,
One of the features I'm most looking forward to, as both a developer and a user, is the journaling app. I'm assuming the app won't make the initial iOS 17 launch next week, but I was wondering whether there's any indication of when the APIs might be available to start working with? The developer program license agreement already mentions them:
“Journaling Suggestions API” means the Documented API that enables a display of journaling suggestions.
(from https://developer.apple.com/support/terms/apple-developer-program-license-agreement/)
Cheers,
Mike
Hi there,
One thing I'm working on for the Vision Pro requires knowing the user's heading relative to north (magnetic or true). Basically, it works with objects located via geo coordinates, and needs the angle between the direction the user is facing and those objects.
On iOS and watchOS I can use CMMotionManager's CMDeviceMotion heading property. Looking at the docs, the heading property is listed as available on visionOS, as is the xMagneticNorthZVertical reference frame... however, it seems there is no magnetometer?
It's a bit hard to tell in the simulator, since isDeviceMotionAvailable, isAccelerometerAvailable and isGyroAvailable all return false, while isMagnetometerAvailable is a compile error (unavailable in visionOS).
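For context, this is roughly the iOS/watchOS path I was hoping to reuse (a sketch; the heading comes back in degrees relative to the chosen reference frame):
import CoreMotion

// Rough sketch of the iOS/watchOS approach I was hoping to reuse on visionOS.
let motionManager = CMMotionManager()

func startHeadingUpdates() {
    guard motionManager.isDeviceMotionAvailable else {
        print("device motion unavailable")  // always the case in the visionOS simulator
        return
    }
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { motion, _ in
        guard let motion = motion else { return }
        print("heading: \(motion.heading)")  // degrees from magnetic north
    }
}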
Is there - or will there be - a way I can get the user's heading relative to north?
I'm running into a confusing difference in the way SpatialTapGesture locations are handled when the targeted entities are children of AnchorEntities.
Context: I want to create entities at the points where the user taps on interactable entities.
The following code snippet works fine with non-anchored entities, but produces incorrect coordinates when the tapped entity is a child of an AnchorEntity.
func convertLocation(_ value: EntityTargetValue<SpatialTapGesture.Value>) -> SIMD3<Float> {
    return value.convert(value.location3D, from: .local, to: .scene)
}

func handleTap(_ value: EntityTargetValue<SpatialTapGesture.Value>, material: SimpleMaterial) {
    let location3D = convertLocation(value)
    let tap = createSphereEntity(0.01, material: material, interactable: false)
    tap.position = location3D
    value.entity.addChild(tap, preservingWorldTransform: true)
}
And, for reference, this is the gesture modifier attached to my RealityView:
.gesture(SpatialTapGesture().targetedToAnyEntity().onEnded { value in
    let material = SimpleMaterial(color: .systemPink, roughness: 0.1, isMetallic: true)
    handleTap(value, material: material)
})
I've tried numerous combinations: .local, .global and .named for the CoordinateSpace; .scene, the anchor name, the entity name etc. for the SceneRealityCoordinateSpace; preserving the world transform and not; adding the new entity to the tapped entity, directly to the anchor, and so on.
Anyone have any ideas?
Also, I noticed that the docs for NamedCoordinateSpace mention
static var immersiveSpace: NamedCoordinateSpace
but no such static property exists for me?
Cheers,
Mike
I updated to Sequoia beta 1 and seem to have lost the macOS WidgetKit Simulator. Is this expected?
My other Mac, running Sonoma, still has the simulator located at:
/System/Library/CoreServices/WidgetKit Simulator.app
Trying to launch my widgets from the Xcode 16 beta on my Mac running Sequoia results in this error:
Couldn't find LSBundleProxy for provided bundle identifier: com.apple.widgetkit.simulator
Domain: com.apple.dt.deviceservices.error
Code: 3
User Info: {
DVTErrorCreationDateKey = "2024-06-18 07:51:54 +0000";
IDERunOperationFailingWorker = IDELaunchServicesLauncher;
}
--
Event Metadata: com.apple.dt.IDERunOperationWorkerFinished : {
"device_identifier" = "00008112-0006250411FB401E";
"device_model" = "Mac14,2";
"device_osBuild" = "15.0 (24A5264n)";
"device_platform" = "com.apple.platform.macosx";
"device_thinningType" = "Mac14,2";
"dvt_coredevice_version" = "397.3.5.1";
"dvt_coresimulator_version" = 980;
"dvt_mobiledevice_version" = "1757.0.0.101.1";
"launchSession_schemeCommand" = Run;
"launchSession_state" = 1;
"launchSession_targetArch" = arm64;
"operation_duration_ms" = 1;
"operation_errorCode" = 3;
"operation_errorDomain" = "com.apple.dt.deviceservices.error";
"operation_errorWorker" = IDELaunchServicesLauncher;
"operation_name" = IDERunOperationWorkerGroup;
"param_debugger_attachToExtensions" = 1;
"param_debugger_attachToXPC" = 1;
"param_debugger_type" = 1;
"param_destination_isProxy" = 0;
"param_destination_platform" = "com.apple.platform.macosx";
"param_diag_113575882_enable" = 0;
"param_diag_MainThreadChecker_stopOnIssue" = 0;
"param_diag_MallocStackLogging_enableDuringAttach" = 0;
"param_diag_MallocStackLogging_enableForXPC" = 1;
"param_diag_allowLocationSimulation" = 1;
"param_diag_checker_tpc_enable" = 1;
"param_diag_gpu_frameCapture_enable" = 0;
"param_diag_gpu_shaderValidation_enable" = 0;
"param_diag_gpu_validation_enable" = 0;
"param_diag_memoryGraphOnResourceException" = 0;
"param_diag_queueDebugging_enable" = 1;
"param_diag_runtimeProfile_generate" = 0;
"param_diag_sanitizer_asan_enable" = 0;
"param_diag_sanitizer_tsan_enable" = 0;
"param_diag_sanitizer_tsan_stopOnIssue" = 0;
"param_diag_sanitizer_ubsan_stopOnIssue" = 0;
"param_diag_showNonLocalizedStrings" = 0;
"param_diag_viewDebugging_enabled" = 1;
"param_diag_viewDebugging_insertDylibOnLaunch" = 1;
"param_install_style" = 2;
"param_launcher_UID" = 2;
"param_launcher_allowDeviceSensorReplayData" = 0;
"param_launcher_kind" = 0;
"param_launcher_style" = 0;
"param_launcher_substyle" = 2;
"param_runnable_appExtensionHostRunMode" = 1;
"param_runnable_productType" = "com.apple.product-type.app-extension";
"param_structuredConsoleMode" = 1;
"param_testing_launchedForTesting" = 0;
"param_testing_suppressSimulatorApp" = 0;
"param_testing_usingCLI" = 0;
"sdk_canonicalName" = "macosx15.0";
"sdk_osVersion" = "15.0";
"sdk_variant" = macos;
}
--
System Information
macOS Version 15.0 (Build 24A5264n)
Xcode 16.0 (23037.4) (Build 16A5171c)
Timestamp: 2024-06-18T19:51:54+12:00
I'm working on an app that shares a SwiftData database between the main app and its widgets. Prior to the Sequoia/Xcode 16 betas this was working fine by setting the same app group for the app and widget targets.
However, now whenever I try to run my main app from Xcode I get a permission prompt saying " would like to access data from other apps.". This happens every time I run it.
Whenever the widget is started (by trying to place it on the desktop, via the WidgetKit Simulator, etc.) it exits immediately (I assume because it can't show the permission prompt?).
If I disable the app group for the widget, it runs... however, of course, it can't access the main app's database.
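For reference, the sharing setup is roughly this (a sketch; Item and the group identifier are placeholders for my real model type and app group):
import SwiftData

// Placeholder model type standing in for the real schema.
@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

// Rough sketch of the shared container setup used by both the app and widget targets.
// "group.com.example.myapp" is a placeholder for the real app group identifier.
func makeSharedContainer() throws -> ModelContainer {
    let configuration = ModelConfiguration(
        groupContainer: .identifier("group.com.example.myapp")
    )
    return try ModelContainer(for: Item.self, configurations: configuration)
}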
I'm on Sequoia beta 2 (24A5279h) and Xcode 16 beta 2 (16A5171r).
Note: while the WidgetKit Simulator is now present in Sequoia beta 2, I haven't actually been able to use it successfully.