I have two DockKit anomalies to report. Hoping a DTS person has seen these and/or can comment.
First, my setup: I am controlling the accessory by making repeated calls to set the angular velocity. The first thing I do is call
dockManager.setSystemTrackingEnabled(false)
because I'm doing my own tracking.
I would note that I tried calling track() on my own, with a bunch of observation rectangles (or even just one), but it didn't work well, even though I was calling at the correct rate. Instead, I measure the angular deviation between where the camera is pointing and where I want it pointed, and set the angular velocity proportional to the error.
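For reference, my control loop boils down to something like the sketch below. The gain value and the axis-to-component mapping are my own choices, and I'm assuming setAngularVelocity(_:) takes a simd_double3 the way I'm using it here.

import DockKit
import simd

// Proportional control: angular velocity is proportional to the angular error
// between where the camera points and where I want it to point.
// The gain and the (pitch, yaw, roll) component mapping are my own conventions.
func steer(_ accessory: DockAccessory,
           pitchErrorRadians: Double,
           yawErrorRadians: Double) async throws {
    let gain = 2.0  // hand-tuned proportional gain (rad/s per rad of error)
    let velocity = simd_double3(x: gain * pitchErrorRadians,
                                y: gain * yawErrorRadians,
                                z: 0)
    try await accessory.setAngularVelocity(velocity)
}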
First issue: in normal operation, the green tracking light on the hardware (the Instaflow Pro 360 motorized dock) is on. Squeezing the trigger toggles the green light on/off; only when the light is on will the dock accept my calls to set the angular velocity. Fine.
But sometimes squeezing the trigger won't reactivate the green light. In this case, the ONLY thing that seems to work is switching to the Instaflow Pro 360 app, and activating the camera. Immediately the green light turns on, and I'm good (and can return to my own app, with the green light still on).
So what hidden API call does Instaflow have, that I don't, that can make this happen? Sure, it's their own app, but I imagine they don't have access to calls I don't have, so how does their app manage to get the green light back on?
It doesn't always happen. Would love to know how to snap out of this.
Second issue: while I usually use rectangles from running the vision system to guide my camera position, sometimes I let the user control the angular "yaw" velocity (rotation around the vertical axis) directly, by issuing commands over the network.
Sometimes, after the user sets a non-zero velocity and then sets a zero velocity a short time later, the camera doesn't immediately respond and stop. (It's not a network issue: I can verify that the call to set the angular velocity to zero is sent, and the camera keeps rotating for a good fraction of a second.) Most times the camera stops immediately, but sometimes it doesn't.
Oddly, I never see this issue when letting the user set the angular velocity in the "pitch up/down" axis. Just the yaw axis.
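For what it's worth, the "stop" I send is literally the same velocity call with a zero vector, roughly like this (again, which component maps to yaw is my own convention):

// Stop: the same setAngularVelocity(_:) call, with zero on every axis.
func stop(_ accessory: DockAccessory) async throws {
    try await accessory.setAngularVelocity(simd_double3.zero)
}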
Anybody else seen this? I feel like it wasn't a problem till I got to iOS 18, but I won't swear to it.
Any advice/assistance/discussion greatly appreciated.
So I have a small homebuilt device that has a simple Arduino-like chip with wifi capabilities (to be precise, the Xiao Seeed ESP32C, for anyone who cares), and I need my iOS app to talk to this device.
Using the CoreBluetooth framework, we've had no problems --- except that in "noisy" environments sometimes we have disconnects. So we want to try wifi.
We assume that there is no public wifi network available. We'd love to do peer-to-peer networking using Network, but that's only if both devices are from Apple. They're not.
Now, the Xiao device can act as an access point, and presumably I could put my iPhone on that network and use regular TCP calls to talk to it. The problem is that my app wants to talk to this home-built device but ALSO make HTTP calls to my server on Amazon.
So: how do I let my iOS app talk over wifi to this simple chip, while not losing the ability to also have my app reach a general server (and receive push notifications, etc.)?
To be more concrete, imagine that my app needs to be able to discover the access point provided by my device and use low-level TCP socket calls to talk to this local wifi device, all without losing the ability to also make general HTTP calls and be just as reachable for push notifications as it was before connecting to this purely local (and very short-range, i.e. no more than 30 meters distant) device.
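To sketch what I mean by "low-level TCP socket calls": I was imagining the Network framework pointed at the device's soft-AP address, something like the code below. The IP address, port, and command string are placeholders for my device, and whether the rest of my app's traffic then falls back to cellular is exactly the part I'm unsure about.

import Network

// Placeholder example: ESP32 soft APs typically hand out 192.168.4.x addresses;
// the host/port below stand in for my device's TCP listener.
let params = NWParameters.tcp
params.requiredInterfaceType = .wifi   // keep this connection on Wi-Fi

let connection = NWConnection(host: "192.168.4.1", port: 8080, using: params)

connection.stateUpdateHandler = { state in
    if case .ready = state {
        // Send a command to the device once the TCP connection is up.
        connection.send(content: Data("STATUS\n".utf8),
                        completion: .contentProcessed { error in
            if let error { print("send failed: \(error)") }
        })
    }
}
connection.start(queue: .main)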
Does this make sense? Have I explained it well enough?
This bites me a lot. I'm looking at the documentation for, say, UNUserNotificationCenter.
And NOWHERE but NOWHERE do I see anything that says, "hey, on platform *** you should import YYY to use this class."
Am I just not looking in the right place in Apple documentation to find this?
Surely, somewhere at the top level of documentation, it must tell you what the proper package to import is, per platform?
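To be clear, I know the answer for this particular class (on iOS it's the UserNotifications framework, as below); my gripe is that the class page itself doesn't spell it out per platform.

// UNUserNotificationCenter comes from the UserNotifications framework;
// the class documentation page never says so explicitly.
import UserNotifications

let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .badge, .sound]) { granted, error in
    // handle the result
}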
Let's say I have an iOS app on the App Store. Anyone can download and use it, but I would like to restrict access to certain features to a select set of people I can personally vouch for. So, for example, to get access, the app sends email to me, you have to convince me I know you, and if you do, I send you back some kind of token string which you can enter into the app.
However, I'd like for that token to not be shareable, and to be locked to that device.
Is there any kind of persistent ID associated with a device that I can use to tie the token I grant to that persistent ID?
Or can someone suggest a way that, once I trust a user, I can give them a token which cannot be shared with anyone else?
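For context, the closest thing I've found on my own is identifierForVendor, which I know isn't truly persistent (it resets if all of the vendor's apps are removed from the device). The sketch below is just the direction I've been poking at; the function name and hashing scheme are my own invention.

import UIKit
import CryptoKit

// Sketch only: bind the token I hand out to the device's vendor identifier.
// Caveat: identifierForVendor changes if the user removes all of my apps,
// so this is not a truly persistent device ID.
func deviceBoundReceipt(for issuedToken: String) -> String? {
    guard let vendorID = UIDevice.current.identifierForVendor?.uuidString else {
        return nil
    }
    let digest = SHA256.hash(data: Data((issuedToken + vendorID).utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}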
Also, does anyone know if restricting access to app features in this way is any kind of issue with regards to the app review process? The app itself is free, and there are no in-app purchases. I simply don't want certain features of the app (which end up sending push notifications) to get abused.
I'm brand new to Metal. I've googled, but can't get the right answer to come up. (Thanks, unhelpful ChatGPT-generated answers polluting everything, but I digress...)
Ultimately, I'm trying to figure out how to use Metal to render 3D DICOM data on iOS specifically. If you're not familiar with DICOM, let's just say I've got a whole stack of CT image slices. Or to get really simple, I've got a cube of voxel values with differing values at each voxel coordinate.
Where do I even start in Metal to render something like this?
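To make "a cube of voxel values" concrete: the one piece I think I understand is that Metal has 3D textures, so I'm assuming the data would get uploaded something like the sketch below (the sizes and pixel format are placeholders) and then ray-marched in a fragment or compute shader.

import Metal

// Assumed shape of the volume: 512 x 512 pixels per slice, 200 slices,
// 16-bit values per voxel (e.g. CT data rescaled into 0...65535).
let width = 512, height = 512, depth = 200
let bytesPerVoxel = 2

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

let descriptor = MTLTextureDescriptor()
descriptor.textureType = .type3D
descriptor.pixelFormat = .r16Unorm
descriptor.width = width
descriptor.height = height
descriptor.depth = depth
descriptor.usage = .shaderRead

let volume = device.makeTexture(descriptor: descriptor)!

// Upload one slice at a time; `sliceBytes` would be the raw pixels of slice z.
func upload(sliceBytes: UnsafeRawPointer, z: Int) {
    volume.replace(region: MTLRegionMake3D(0, 0, z, width, height, 1),
                   mipmapLevel: 0,
                   slice: 0,
                   withBytes: sliceBytes,
                   bytesPerRow: width * bytesPerVoxel,
                   bytesPerImage: width * height * bytesPerVoxel)
}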
(I was trying to get the VTK toolkit compiled for iOS, which uses OpenGL, but that appears to be a dead end. And besides, Metal is supposed to be so much better.)
Thanks for any tips/leads/suggestions/general pointers.
In Xcode 7.2 (and possibly earlier versions of Xcode 7.3, though I can't swear to it) it seems like "local" project results (e.g. files in my project) were favored by being put at the top of the completion list. For example, if I had the file "DirectoryViewController.swift", then typing "dire" into the Open Quickly search box would put my local file at the top, and system stuff like "dirent.h" etc. at the bottom.

Now in Xcode 7.3 I find that Open Quickly is swamping me with results that come from stuff used by my project (e.g. names of public variables in Swift classes from Foundation or UIKit) rather than stuff that I defined in my project (my own file names/public methods). Everything is being found, but local results no longer appear at the top.

Is this intentional? Any chance the order could revert to what it was? A preference?

Since there's far more global stuff (that one didn't write) than local stuff, it's very disconcerting that Open Quickly has become much less useful to me (I have to type many more characters to get what I want near the top of the list) than it was in previous versions.

Any suggestions welcome (as are any fixes planned!)