Posting this on behalf of my colleague, who has a project in mind that requires a huge amount of RAM. Is it true that modern Mac Pros can only take up to 192 GB of RAM, roughly an eighth of what the five-year-old Intel-based Mac Pros supported?
What is best to pass to the options parameter of:
MTLDevice.makeBuffer(length:options:)
MTLDevice.makeBuffer(bytes:length:options:)
MTLDevice.makeBuffer(bytesNoCopy:length:options:deallocator:)
Basically, I'm looking for a "plain English" explanation of the MTLResourceOptions documentation page.
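For what it's worth, here is a sketch of the common choices as I read the docs; the comments are my own summary, not authoritative, so verify against the MTLResourceOptions page:

```swift
import Metal

// A sketch of typical MTLResourceOptions choices (my reading of the
// docs; check the reference for the fine print).
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device")
}

// .storageModeShared: one allocation visible to both CPU and GPU.
// A reasonable default when the CPU fills the buffer and the GPU reads it.
let shared = device.makeBuffer(length: 4096, options: .storageModeShared)

// .storageModePrivate: GPU-only memory, fastest for data the CPU never
// touches. Note that makeBuffer(bytes:length:options:) cannot use it,
// since it copies from CPU memory; blit from a shared buffer instead.
let gpuOnly = device.makeBuffer(length: 4096, options: .storageModePrivate)

// Adding .cpuCacheModeWriteCombined suits buffers the CPU only writes
// sequentially and never reads back.
let upload = device.makeBuffer(
    length: 4096,
    options: [.storageModeShared, .cpuCacheModeWriteCombined])

// makeBuffer(bytesNoCopy:length:options:deallocator:) wraps memory you
// already own; the pointer must be page-aligned, and the deallocator is
// your cleanup hook.
print(shared != nil, gpuOnly != nil, upload != nil)
```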
The man page for getifaddrs states:
The ifa_data field references address family specific data. For AF_LINK addresses it
contains a pointer to the struct if_data (as defined in include file <net/if.h>) which
contains various interface attributes and statistics. For all other address families, it
contains a pointer to the struct ifa_data (as defined in include file <net/if.h>) which
contains per-address interface statistics.
I assume that an "AF_LINK address" is one that has AF_LINK in the p.ifa_addr.sa_family field.
However, I don't see "struct ifa_data" declared anywhere. Is this a documentation bug, and if so, how should I read this documentation?
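I believe "struct ifa_data" is indeed a typo in the man page; only struct if_data is declared, and in practice the ifa_data of an AF_LINK entry is read as such. A Darwin-only sketch (field names ifi_ipackets / ifi_opackets from <net/if_var.h>):

```swift
import Foundation

// Walk the getifaddrs list and, for AF_LINK entries, read ifa_data as
// struct if_data (the "struct ifa_data" mentioned in the man page
// appears to be a typo; no such struct is declared in the headers).
var addrs: UnsafeMutablePointer<ifaddrs>?
guard getifaddrs(&addrs) == 0 else { fatalError("getifaddrs failed") }
defer { freeifaddrs(addrs) }

var linkCount = 0
var cursor = addrs
while let ifa = cursor?.pointee {
    if let sa = ifa.ifa_addr, sa.pointee.sa_family == UInt8(AF_LINK),
       let raw = ifa.ifa_data {
        let stats = raw.assumingMemoryBound(to: if_data.self).pointee
        print(String(cString: ifa.ifa_name),
              "in:", stats.ifi_ipackets, "out:", stats.ifi_opackets)
        linkCount += 1
    }
    cursor = ifa.ifa_next
}
print("AF_LINK interfaces:", linkCount)
```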
How do I download a folder from opensource.apple.com without going inside recursively and downloading individual files?
e.g. one from here: "https://opensource.apple.com/source/Libm/"
PS: I have no idea what the proper tag for this post is, and as the forum insists on a non-empty tag field, I'm using "Foundation" arbitrarily.
I'm trying to reproduce a case where there are more dispatch queues than there are threads serving them. Is that a possible scenario?
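It is: queues and threads are decoupled, and GCD multiplexes arbitrarily many queues over a limited worker-thread pool. A sketch (plain Dispatch/Foundation, no other assumptions) that creates far more queues than threads and counts the threads actually used:

```swift
import Dispatch
import Foundation

// Queues are cheap: GCD multiplexes many queues over a small thread
// pool. Run work on 100 serial queues and count how many distinct
// threads actually served them (typically far fewer than 100).
let queueCount = 100
let group = DispatchGroup()
let lock = NSLock()
var threads = Set<ObjectIdentifier>()

for i in 0..<queueCount {
    let q = DispatchQueue(label: "sketch.queue.\(i)")
    q.async(group: group) {
        lock.lock()
        threads.insert(ObjectIdentifier(Thread.current))
        lock.unlock()
    }
}
group.wait()
print("queues: \(queueCount), distinct threads used: \(threads.count)")
```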
Am I calling this right?
host_priv_t hostPriv = 0;
int err = host_get_host_priv_port(mach_host_self(), &hostPriv);
processor_array_t processorList = NULL;
mach_msg_type_number_t processorCount = 0;
err = host_processors(hostPriv, &processorList, &processorCount);
The host_get_host_priv_port call above returns 4, "(os/kern) invalid argument".
Tried with App Sandbox enabled and disabled.
Hi,
Given a pthread_id (†), is there a way to find the associated NSThread (when one exists)? Perhaps using an undocumented / unsupported method – I don't mind.
Thank you!
(†) the pthread_id is neither that of the current thread nor of the main thread.
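I'm not aware of a supported lookup. One workaround sketch, assuming you control the threads' entry points, is to have each thread register itself; ThreadRegistry below is my own hypothetical helper, not an Apple API:

```swift
import Foundation
import Dispatch

// Hypothetical helper: each thread registers itself on startup, so a
// pthread_t can later be mapped back to its Thread object. Only
// threads that register are discoverable.
final class ThreadRegistry {
    static let shared = ThreadRegistry()
    private var map: [pthread_t: Thread] = [:]
    private let lock = NSLock()

    func registerCurrentThread() {
        lock.lock(); defer { lock.unlock() }
        map[pthread_self()] = Thread.current
    }

    func thread(for id: pthread_t) -> Thread? {
        lock.lock(); defer { lock.unlock() }
        return map[id]
    }
}

// Usage: the thread body registers itself; later, look it up by id.
let ready = DispatchSemaphore(value: 0)
var workerID: pthread_t?
let worker = Thread {
    workerID = pthread_self()
    ThreadRegistry.shared.registerCurrentThread()
    ready.signal()
    Thread.sleep(forTimeInterval: 0.5) // keep the thread alive briefly
}
worker.start()
ready.wait()
print(ThreadRegistry.shared.thread(for: workerID!) === worker)
```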
When calling CBCentralManager's connectPeripheral:options: with some Bluetooth devices, I get the "Bluetooth Pairing Request" alert on iOS and a similar "Connection Request from:" alert on macOS. Is there a way to determine up front whether the alert is going to be presented? Alternatively, is there a way to prohibit presenting this alert (in which case the connect request could fail, which is totally fine)? I tried specifying these options:
var manager: CBCentralManager
...
manager.connect(
    peripheral,
    options: [
        CBConnectPeripheralOptionNotifyOnConnectionKey: false,
        CBConnectPeripheralOptionNotifyOnDisconnectionKey: false,
        CBConnectPeripheralOptionNotifyOnNotificationKey: false
    ]
)
but those didn't help (and per the documentation they shouldn't, as they relate to an app running in the background, which is not applicable in my case – my app is in the foreground when it calls connect, and the unwanted alert is displayed immediately).
Using CoreBluetooth I am getting these values from CBCentralManagerDelegate's didDiscover peripheral delegate method:
kCBAdvDataTxPowerLevel: 12 (could be other number like 7, 0 or a small negative number)
This one is taken from advertisementData parameter. This key might be absent.
rssi: -68 (or -60, -100, etc)
this is taken from the "rssi" parameter (always present).
I am looking for a formula to calculate approximate distance based on these two numbers. Is that possible?
I know that ideally I need to know rssi0 (the RSSI at 1 meter), but I don't see how to get that via the CoreBluetooth API or by other means (short of actually measuring the RSSI at a one-meter distance, which is not practical for me). How could I approximate the rssi0 value from kCBAdvDataTxPowerLevel?
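To frame the question: the usual rough approach is the log-distance path-loss model, treating the advertised TX power as a stand-in for rssi0. That stand-in is a big assumption, since kCBAdvDataTxPowerLevel is nominally transmit power, while iBeacon-style payloads advertise the measured power at 1 m. A sketch:

```swift
import Foundation

/// Estimate distance (in meters) from RSSI using the log-distance
/// path-loss model. `rssiAtOneMeter` is the expected RSSI at 1 m
/// (here approximated from the advertised TX power), and
/// `pathLossExponent` is ~2 in free space, higher indoors.
func estimatedDistance(rssi: Double,
                       rssiAtOneMeter: Double,
                       pathLossExponent: Double = 2.0) -> Double {
    return pow(10.0, (rssiAtOneMeter - rssi) / (10.0 * pathLossExponent))
}

// At the reference RSSI, the model yields exactly 1 meter.
print(estimatedDistance(rssi: -59, rssiAtOneMeter: -59)) // 1.0
// A signal 20 dB weaker maps to 10x the distance (with exponent 2).
print(estimatedDistance(rssi: -79, rssiAtOneMeter: -59)) // 10.0
```

In practice the result is a very coarse estimate; obstacles and antenna orientation easily swing RSSI by more than the model accounts for.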
Hi, is it possible to have SwiftUI's NavigationView show both panels side by side in landscape mode on iPhone Plus/Max-sized devices, similar to how Messages.app does it? Thank you.
Why is "fork" prohibited in sandboxed apps?
Where's CBAdvDataManufacturerData format documented?
It looks like the company identifier is stored in little-endian order in the first two bytes (matching the companies listed in company_identifiers.yaml on the Bluetooth SIG site), and the rest is arbitrary?
I'd like to see the official documentation about this.
Thank you.
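I haven't found official Apple documentation for it either; the framing described above (a 2-byte little-endian company identifier followed by a vendor-defined payload) comes from the Bluetooth specification's Manufacturer Specific Data AD type. A parsing sketch under that assumption:

```swift
import Foundation

/// Split raw manufacturer data into the (assumed) little-endian
/// company identifier and the vendor-specific payload.
/// Returns nil when the data is too short to hold an identifier.
func parseManufacturerData(_ data: Data) -> (companyID: UInt16, payload: Data)? {
    guard data.count >= 2 else { return nil }
    let companyID = UInt16(data[data.startIndex])
        | (UInt16(data[data.startIndex + 1]) << 8) // little-endian
    return (companyID, data.dropFirst(2))
}

// 0x004C is Apple's assigned company identifier.
let sample = Data([0x4C, 0x00, 0x02, 0x15])
if let parsed = parseManufacturerData(sample) {
    print(String(format: "0x%04X", parsed.companyID)) // 0x004C
    print(Array(parsed.payload))                      // [2, 21]
}
```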
How do I make TextEditor auto-scroll? I want to implement a log view based on it: when the scroll position is at the bottom, adding new lines should auto-scroll the view so the newly added lines are visible; when the scroll position is not at the bottom, adding new lines should not auto-scroll it.
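A sketch of one workaround (untested, and it sidesteps TextEditor, which doesn't expose its scroll position): render the log as rows in a ScrollView and only scroll on append while an auto-scroll flag is set. Detecting "user is at the bottom" would still need GeometryReader-based offset tracking, which I've left out.

```swift
import SwiftUI

// A read-only log view that scrolls to the last line on append,
// but only while `autoScroll` is true. Toggling `autoScroll` off
// (e.g. when the user scrolls away from the bottom) suppresses it.
struct LogView: View {
    let lines: [String]
    @State private var autoScroll = true

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(alignment: .leading) {
                    ForEach(lines.indices, id: \.self) { i in
                        Text(lines[i]).id(i)
                    }
                }
            }
            .onChange(of: lines.count) { _ in
                if autoScroll {
                    proxy.scrollTo(lines.count - 1, anchor: .bottom)
                }
            }
        }
    }
}
```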
I am struggling to see why the following low-level audio recording function, which is based on TN2091, Device input using the HAL Output Audio Unit (a great article, by the way, although a bit dated; it would be wonderful if it were updated to use Swift and non-deprecated APIs at some point), fails to work under macOS:
func createMicUnit() -> AUAudioUnit {
    let compDesc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_HALOutput, // I am on macOS, so this is good
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0, componentFlagsMask: 0)
    return try! AUAudioUnit(componentDescription: compDesc, options: [])
}

func startMic() {
    // mic permission is already granted at this point, but let's check
    let status = AVCaptureDevice.authorizationStatus(for: AVMediaType.audio)
    precondition(status == .authorized) // yes, all good

    let unit = createMicUnit()
    unit.isInputEnabled = true
    unit.isOutputEnabled = false
    precondition(!unit.canPerformInput) // can't record yet; the next line shows why
    print(deviceName(unit.deviceID)) // "MacBook Pro Speakers" - this is why

    let micDeviceID = defaultInputDeviceID
    print(deviceName(micDeviceID)) // "MacBook Pro Microphone" - this is better
    try! unit.setDeviceID(micDeviceID) // let's switch the device to the mic
    precondition(unit.canPerformInput) // now we can record

    print("\(String(describing: unit.channelMap))") // channel map is "nil" by default
    unit.channelMap = [0] // not sure if this helps or not

    let sampleRate = deviceActualFrameRate(micDeviceID)
    print(sampleRate) // 48000.0
    let format = AVAudioFormat(
        commonFormat: .pcmFormatFloat32, sampleRate: sampleRate,
        channels: 1, interleaved: false)!
    try! unit.outputBusses[1].setFormat(format)

    unit.inputHandler = { flags, timeStamp, frameCount, bus in
        fatalError("never gets here") // now the weird part - this is never called!
    }
    try! unit.allocateRenderResources()
    try! unit.startHardware() // let's go!
    print("mic should be working now... why isn't it?")
    // from now on the (UI) app continues its normal run loop
}
All sanity checks pass with flying colors, but the unit's inputHandler is never called. Any idea why?
Thank you!
I tried to ask this as a comment in another thread:
https://developer.apple.com/forums/thread/650386?answerId=628394022#reply-to-this-question
but I can't leave a comment there for some reason (is the thread locked?). I'm asking exactly the same question, now for iOS 15. Has anything changed in this area?
When selecting a stroke path for an object on a PKCanvas, the option "Snap to Shape" appears.
I understand this function is still in beta and has not been made available natively to other PencilKit apps. Is there a way, using the Stroke API, to call this function directly after the user holds the pencil for half a second when the stroke is done drawing, just like it behaves in native apps?