I saw that NSNetServiceBrowser is marked as deprecated in Apple's documentation, but I cannot find a replacement for this class.
I need to write an mDNS browser program. Should I use NSNetServiceBrowser or another class?
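For context, the kind of discovery I am planning is roughly this (a sketch using NSNetServiceBrowser, which is NetServiceBrowser in Swift; the "_http._tcp." service type is only an example):

import Foundation

// Sketch of Bonjour/mDNS browsing with NetServiceBrowser (NSNetServiceBrowser).
// Needs a running run loop; the service type below is only an example.
final class ServiceFinder: NSObject, NetServiceBrowserDelegate {
    private let browser = NetServiceBrowser()

    func start() {
        browser.delegate = self
        browser.searchForServices(ofType: "_http._tcp.", inDomain: "local.")
    }

    func netServiceBrowser(_ browser: NetServiceBrowser,
                           didFind service: NetService,
                           moreComing: Bool) {
        print("found service: \(service.name)")
    }
}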
I am testing with VLC as an RTSP audio client on macOS.
Every 5 minutes, I hear noise.
The noise continues for 3 seconds and happens exactly every 5 minutes.
During the noise period, kernel_task uses about 25% more CPU for those 3 seconds, and Console -> wifi.log shows a message starting with
SCAN request received from pid ??? (locationd) with priority=2, qos=-1 (default), frontmost=no
In Wireshark I see RTP/UDP packets arriving every 20 ms, but during the noise period no packets arrive for 140 ms. That gap causes the silence and the noise.
If I disable Wi-Fi and use an Ethernet cable, the noise is gone.
If I disable Settings -> Security & Privacy -> Location Services, the noise is gone.
Is there any way to keep receiving RTP/UDP packets during locationd's scan?
My environment:
macOS Big Sur ver 11.4
iMac (Retina 5K, 27-inch, 2017)
VLC 3.0.16 (Intel 64-bit)
I wrote a simple NSMutableData test project.
I profiled it with the Allocations instrument, and it shows that the total bytes for alloc1() are 55 MB.
But alloc1() is only called once, and the allocated size should be 1 MB. I cannot find the reason for the 55 MB allocation attributed to alloc1().
To reproduce, create a fresh macOS App project in Xcode 13 and replace the ViewController implementation with this code.
#import "ViewController.h"
@implementation ViewController {
NSTimer *mTimer;
NSMutableData *mData1;
NSMutableData *mData2;
}
- (void)viewDidLoad {
[super viewDidLoad];
mData1 = nil;
mData2 = nil;
mTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self
selector:@selector(timer_cb) userInfo:nil repeats:YES];
}
- (void) timer_cb {
if (mData1 == nil) {
[self alloc1];
}
if (mData2 == nil) {
[self alloc2];
}
[self copy1];
}
- (void) alloc1 {
NSLog(@"alloc1");
mData1 = [NSMutableData dataWithCapacity:1024*1024];
}
- (void) alloc2 {
NSLog(@"alloc2");
mData2 = [NSMutableData dataWithCapacity:1024*1024];
[mData2 resetBytesInRange:NSMakeRange(0, 1024*1024)];
}
- (void) copy1 {
[mData1 replaceBytesInRange:NSMakeRange(0, 1024*1024) withBytes:mData2.bytes];
}
@end
I have an Apple TV 4K connected to router A; its IP is 192.168.1.10.
The Apple TV sends a Bluetooth Low Energy (BLE) advertisement containing that IP.
I captured it with a BLE sniffer.
I try "Screen Mirroring" from a MacBook on router B, whose IP is 192.168.2.10.
The MacBook sends "GET /info...RTSP/1.0" to the Apple TV on port 7000.
The Apple TV replies with 1368 bytes of "RTSP/1.0 200 OK..." that include the device name, type, and features.
But the MacBook does not show my Apple TV in the display list.
I would like to know why my Apple TV is not recognized as a mirroring display even though the RTSP traffic shows no errors.
mDNS from the Apple TV is blocked by the router.
Ping from the MacBook to the Apple TV succeeds.
If the Apple TV and the MacBook are connected to the same router, screen mirroring succeeds.
Router A and B : Netgear Nighthawk
Router netmask : 255.255.255.0 (both)
MacBook : macOS Monterey 12.4
Apple TV : tvOS 15.6(19M65)
I am testing with the ScreenCaptureKit example and succeeded in getting the desktop image as an IOSurface.
I know that an IOSurface holds GPU memory and is not as easy to access as DRAM.
Do you know a way to H.264-compress the IOSurface?
Or do I have to convert the IOSurface to a CVPixelBuffer?
If you have any sample code for handling IOSurface, it would be very helpful.
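The conversion I have in mind is roughly the following (just a sketch; it assumes the surface's pixel format is one the encoder accepts):

import CoreVideo
import IOSurface

// Sketch: wrap an IOSurface in a CVPixelBuffer so it can be handed to a
// VTCompressionSession. Assumes the surface format is acceptable to the encoder.
func makePixelBuffer(from surface: IOSurfaceRef) -> CVPixelBuffer? {
    var unmanagedBuffer: Unmanaged<CVPixelBuffer>?
    let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault,
                                                  surface,
                                                  nil,
                                                  &unmanagedBuffer)
    guard status == kCVReturnSuccess, let buffer = unmanagedBuffer else {
        return nil
    }
    return buffer.takeRetainedValue()
}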
I am trying to run the ScreenCaptureKit sample code.
That sample requires macOS 13.0 for audio capture.
When I run it, the app shows "No screen recording permission".
I granted the Screen Recording permission in System Settings -> Privacy & Security.
But the same error happens, and I cannot find a way to grant the permission.
I tried restarting the app, restarting Xcode, rebooting macOS, and
rm -rf ~/Library/Developer/Xcode/DerivedData/CaptureSample-...
This sample app worked on Monterey after I commented out "streamConfig.capturesAudio" and the related code; this permission issue did not happen on Monterey.
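A small check I am thinking of adding to the sample (just a sketch, not part of Apple's code; it uses CGPreflightScreenCaptureAccess / CGRequestScreenCaptureAccess from CoreGraphics to observe the permission state):

import CoreGraphics

// Sketch: query and, if needed, request the Screen Recording permission.
// This only reports the TCC state; it is not part of the ScreenCaptureKit sample.
if CGPreflightScreenCaptureAccess() {
    print("Screen Recording permission is granted")
} else {
    // Triggers the system prompt / points the user to System Settings.
    let granted = CGRequestScreenCaptureAccess()
    print("Screen Recording permission granted now: \(granted)")
}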
Env:
macOS Ventura 13.0
Xcode 14.1 (14B47b)
Sample code URL: https://developer.apple.com/documentation/screencapturekit/capturing_screen_content_in_macos?language=objc
I am trying to get the AVCaptureDevice instance of a virtual audio plugin such as BlackHole.
I need to use AVCaptureDevice.DiscoverySession, because the old method (AVCaptureDevice.devicesWithMediaType) is deprecated.
First, I cannot find a device-type enum for virtual audio plugins. I tried .externalUnknown and .builtInMicrophone; both results are empty.
I would like to know how to list virtual microphones and get their AVCaptureDevice instances.
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.externalUnknown],
    mediaType: .audio,
    position: .unspecified
)
let devs = deviceDiscoverySession.devices
print("devices=\(devs)") // empty list
I am using the VideoToolbox hardware H.264 encoder on an M1 MacBook for screen mirroring. I need to run the encoder with minimal delay.
I set these keys when creating and configuring the compression session:
kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder:true
kVTCompressionPropertyKey_ProfileLevel:kVTProfileLevel_H264_Baseline_AutoLevel
kVTCompressionPropertyKey_RealTime:true
kVTCompressionPropertyKey_AllowFrameReordering:false
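A minimal sketch of how I apply them (the width, height, and output handling below are placeholders, not my exact code):

import CoreMedia
import VideoToolbox

// Sketch: create a hardware H.264 session and apply the keys listed above.
// 1920x1080 is a placeholder size.
var session: VTCompressionSession?
let encoderSpec = [
    kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as String: true
] as CFDictionary

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920,
    height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: encoderSpec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,   // frames are submitted with the output-handler variant instead
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Baseline_AutoLevel)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_AllowFrameReordering,
                         value: kCFBooleanFalse)
}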
I set the presentation timestamp to the current time.
In the compression callback, I get encoded frames in the wrong order.
[13.930533]encode pts=...13.930511
[13.997633]encode pts=...13.997617
[14.013678]compress callback with pts=...13.997617 dts=...13.930511
[14.023443]compress callback with pts=...13.930511 dts=...13.997617
Here [] is the log time, pts is the presentation timestamp, and dts is the decode timestamp.
AllowFrameReordering is not working as I expected.
If I need to set another property, please let me know.
I also do not want 2 video frames to be buffered. If you know a setting that avoids frame buffering, please let me know.
I need to develop an emergency communication app. This app is not used frequently, but it needs to start immediately when the user needs to make emergency communication.
If this app is not used for a month, it gets uninstalled by the "Offload Unused Apps" feature. Is there a way to put a flag into the app to prevent this uninstall?
If the application's Info.plist had a setting like "prevent offloading unused apps", that would be great.
I am writing a macOS menu bar app with a SwiftUI TextField, used with the Japanese input mode. Under some conditions the TextField loses focus and accepts no key input and no mouse clicks; the user cannot do anything.
Setup
macOS Ventura 13.3.1 (a)
Install the Japanese Romaji input source via System Preferences -> Keyboard
Set the input mode to "Romaji"
Build the test source code:
In Xcode 14.3, create a new macOS app project "FocusTest" with SwiftUI and Swift.
Replace FocusTestApp.swift with the attached code.
Build in Xcode.
Steps
1. Set the input mode to "Romaji".
2. Run FocusTestApp.
3. Click the T-square icon in the menu bar.
4. A small window with a globe appears.
5. Click the desktop background area.
6. Click the T-square icon in the menu bar again.
7. Click "PIN".
8. The PIN view (t.square image with a TextField) appears.
9. That TextField has lost focus; click inside the TextField.
10. Neither key input nor clicks are accepted.
With the US keyboard input mode, key input becomes possible at step 10, but the blue focus ring is missing.
Code of FocusTestApp.swift:
import SwiftUI

@main
struct focusTestApp: App {
    var body: some Scene {
        MenuBarExtra("Test", systemImage: "t.square") {
            MainView()
        }.menuBarExtraStyle(.window)
    }
}

struct MainView: View {
    @State private var showingPIN: Bool = false
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundColor(.accentColor)
            Button("PIN") {
                print("clicked")
                showingPIN = true
            }
        }
        .padding()
        .sheet(isPresented: $showingPIN) {
            PinView()
        }
    }
}

struct PinView: View {
    @Environment(\.presentationMode) var presentationMode
    @State private var pin: String = ""
    @FocusState private var pinIsFocused: Bool
    var body: some View {
        VStack {
            Image(systemName: "t.square")
                .resizable()
                .aspectRatio(contentMode: .fit)
                .frame(width: 64.0, height: 64.0)
                .foregroundColor(.accentColor)
            Text("Enter PIN code")
            HStack {
                TextField("", text: $pin)
                    .font(Font.system(size: 28, design: .default))
                    .frame(width: 4*28.0, height: 28.0)
                    .focusable()
                    .focused($pinIsFocused)
            }
            .onAppear() {
                pinIsFocused = true
            }
        }
        .padding()
    }
}
I would like to know whether NullAudio.c is an official SDK sample or not,
and the reason the enums and UIDs are defined in NullAudio.c rather than in the SDK header files.
I am trying to use kObjectID_Mute_Output_Master, but it is defined with a different value in each third-party plugin:
kObjectID_Mute_Output_Master = 10 // NullAudio.c
kObjectID_Mute_Output_Master = 9 // https://github.com/ExistentialAudio/BlackHole
kObjectID_Mute_Output_Master = 6 // https://github.com/q-p/SoundPusher
I can build BlackHole and SoundPusher, and both plugins work.
In my opinion this enum should be defined in an SDK header and keep the same value in each SDK version.
I would like to know why third parties define different values.
If you know the history of NullAudio.c, please let me know.
I received 2 crash reports from our customers. Both crash at the same point, but none of my code appears in the crash stack trace. How can I fix this kind of crash?
Thread 1 Crashed:: Dispatch queue: com.apple.root.background-qos
0 libsystem_kernel.dylib 0x7ff81b84922a __pthread_kill + 10
1 libsystem_pthread.dylib 0x7ff81b880f7b pthread_kill + 263
2 libsystem_c.dylib 0x7ff81b7caca5 abort + 123
3 libc++abi.dylib 0x7ff81b83b082 abort_message + 241
4 libc++abi.dylib 0x7ff81b82c23d demangling_terminate_handler() + 266
5 libobjc.A.dylib 0x7ff81b529023 _objc_terminate() + 96
6 libc++abi.dylib 0x7ff81b83a4a5 std::__terminate(void (*)()) + 8
7 libc++abi.dylib 0x7ff81b83a456 std::terminate() + 54
8 libdispatch.dylib 0x7ff81b701a58 _dispatch_client_callout + 28
9 libdispatch.dylib 0x7ff81b704500 _dispatch_continuation_pop + 463
10 libdispatch.dylib 0x7ff81b715dff _dispatch_source_invoke + 2184
11 libdispatch.dylib 0x7ff81b7116a2 _dispatch_root_queue_drain + 343
12 libdispatch.dylib 0x7ff81b711e4d _dispatch_worker_thread2 + 160
13 libsystem_pthread.dylib 0x7ff81b87dc9d _pthread_wqthread + 256
14 libsystem_pthread.dylib 0x7ff81b87cc67 start_wqthread + 15
This crash point is exactly the same as in this post. I do not throw C++ exceptions.
https://developer.apple.com/forums/thread/725197
I made a "Camera Extension" target in my Xcode macOS Swift app.
I got Swift code with CMIOExtensionDeviceSource.
I added NSLog() calls and String.write() to a file under FileManager.default.temporaryDirectory.
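Roughly what I added (simplified; the file name is just an example):

import Foundation

// Simplified sketch of the debug output added inside the extension.
// "camera-extension-debug.log" is just an example file name.
func debugLog(_ message: String) {
    NSLog("CameraExtension: %@", message)
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("camera-extension-debug.log")
    try? (message + "\n").write(to: url, atomically: true, encoding: .utf8)
}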
My camera extension installation succeeded, and it is running with FaceTime.
But I cannot see the NSLog output or the debug temp file in Xcode or Console.
How can I see debug output from my Camera Extension?
I generated CMIO CameraExtension code from the Xcode target, and it is running with FaceTime. I guess this kind of extension has a lot of security limitations.
I would like to run a command like "netstat" in the extension. Is it possible to call Process.run()? I keep getting an error like "The file zsh doesn't exist". The same Process.run() code works in a macOS app.
I would also like to use DistributedNotificationCenter to send text from the app to the CameraExtension. Is that possible? I do not receive any message in the CameraExtension.
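What I tried is roughly this (a sketch; the notification name is just an example):

import Foundation

// Sketch of the DistributedNotificationCenter attempt.
// "com.my.app.debugMessage" is just an example notification name.
let noteName = Notification.Name("com.my.app.debugMessage")

// In the macOS app: post a notification with a text payload.
DistributedNotificationCenter.default().postNotificationName(
    noteName,
    object: "hello from app",
    userInfo: nil,
    deliverImmediately: true)

// In the CameraExtension: observe the same name (the handler never fires for me).
let token = DistributedNotificationCenter.default().addObserver(
    forName: noteName,
    object: nil,
    queue: .main) { note in
        NSLog("received: %@", String(describing: note.object))
}
// Keep the returned token so the observer can be removed later.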
If there is any other IPC method between a macOS app and a CameraExtension, please let me know.
I need to write a macOS app, a CameraExtension (CMIO), and an uninstaller app.
The bundle IDs are like this:
App : com.my.app
CameraExtension : com.my.app.cameraex
Uninstaller app : com.my.app.uninstaller
My app can activate the CameraExtension with OSSystemExtensionRequest.activationRequest.
But the uninstaller cannot deactivate the CameraExtension.
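The uninstaller's call is roughly this (a sketch; delegate handling is omitted):

import SystemExtensions

// Sketch: the deactivation request submitted from the uninstaller.
// Delegate methods and error handling are omitted here.
let request = OSSystemExtensionRequest.deactivationRequest(
    forExtensionWithIdentifier: "com.my.app.cameraex",
    queue: .main)
// request.delegate = self   // the delegate receives the result / the error below
OSSystemExtensionManager.shared.submitRequest(request)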
I get this error: Error Domain=OSSystemExtensionErrorDomain Code=4 "Extension not found in App bundle: perhaps App is not validly structured"
I set up an App Group and added the System Extension capability and provisioning for the uninstaller.
I guess "com.my.app.uninstaller" cannot deactivate "com.my.app.cameraex".
What kind of Bundle ID should I use for my uninstaller?
Is writing a separate app and uninstaller the correct approach for a CameraExtension?
My manager asked me to provide an easy way to remove all modules.