I tested the ScreenCaptureKit sample and was able to capture the desktop image as an IOSurface.
I understand that an IOSurface is backed by GPU memory and is not as easy to access as regular DRAM.
Is there a way to H.264-compress the IOSurface directly?
Or do I have to convert the IOSurface to a CVPixelBuffer first?
If you have any sample code for handling IOSurface, it would be very useful to me.
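Here is the direction I was planning to try, in case it helps frame the question. It is only a rough, untested sketch (the helper names are just placeholders): it assumes the IOSurface is in a pixel format the hardware encoder accepts (e.g. BGRA or NV12), wraps it in a CVPixelBuffer without copying, and feeds it to a VTCompressionSession.
#import <CoreVideo/CoreVideo.h>
#import <VideoToolbox/VideoToolbox.h>
#import <IOSurface/IOSurface.h>

// VideoToolbox calls this with each encoded H.264 frame (as a CMSampleBuffer).
static void encodedFrameCallback(void *refCon, void *frameRefCon,
                                 OSStatus status, VTEncodeInfoFlags flags,
                                 CMSampleBufferRef sampleBuffer) {
    if (status != noErr || sampleBuffer == NULL) {
        return;
    }
    // The compressed frame is in sampleBuffer; write it out or packetize it here.
}

static VTCompressionSessionRef createH264Session(size_t width, size_t height) {
    VTCompressionSessionRef session = NULL;
    VTCompressionSessionCreate(kCFAllocatorDefault,
                               (int32_t)width, (int32_t)height,
                               kCMVideoCodecType_H264,
                               NULL, NULL, NULL,
                               encodedFrameCallback, NULL,
                               &session);
    return session;
}

// Wrap the IOSurface in a CVPixelBuffer (no copy) and hand it to the encoder.
static void encodeSurface(VTCompressionSessionRef session,
                          IOSurfaceRef surface, CMTime pts) {
    CVPixelBufferRef pixelBuffer = NULL;
    if (CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface,
                                         NULL, &pixelBuffer) != kCVReturnSuccess) {
        return;
    }
    VTCompressionSessionEncodeFrame(session, pixelBuffer, pts,
                                    kCMTimeInvalid, NULL, NULL, NULL);
    CVPixelBufferRelease(pixelBuffer);
}
Is this the right approach, or is there a more direct way to hand the IOSurface to the encoder?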
I have an Apple TV 4K connected to router A, with IP 192.168.1.10.
The Apple TV sends a Bluetooth Low Energy (BLE) advertisement containing that IP; I captured it with a BLE sniffer.
When I try Screen Mirroring from a MacBook on router B (IP 192.168.2.10), the MacBook sends "GET /info ... RTSP/1.0" to the Apple TV on port 7000.
The Apple TV replies with 1368 bytes of "RTSP/1.0 200 OK ..." that include the device name, type, and features.
But the MacBook does not show my Apple TV in the Screen Mirroring display list.
I would like to know why my Apple TV is not recognized as a mirroring display even though none of the RTSP traffic shows an error.
mDNS from the Apple TV is blocked by the router. (I plan to double-check this from the MacBook with the small Bonjour browse sketch after the environment info below.)
A ping from the MacBook to the Apple TV succeeds.
If the Apple TV and the MacBook are connected to the same router, screen mirroring works.
Router A and B: Netgear Nighthawk
Router netmask: 255.255.255.0 (both)
MacBook: macOS Monterey 12.4
Apple TV: tvOS 15.6 (19M65)
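For reference, this is the Bonjour browse check I intend to run on the MacBook. It assumes (I cannot confirm this from documentation) that the Screen Mirroring list is populated from the _airplay._tcp Bonjour service, so if nothing is found here while mDNS is blocked, that would explain the missing entry.
#import <Foundation/Foundation.h>

// Browse for _airplay._tcp services from the MacBook on router B.
// If mDNS is blocked between the two subnets, no service should be reported.
@interface AirPlayBrowseCheck : NSObject <NSNetServiceBrowserDelegate>
@end

@implementation AirPlayBrowseCheck
- (void)netServiceBrowser:(NSNetServiceBrowser *)browser
           didFindService:(NSNetService *)service
               moreComing:(BOOL)moreComing {
    NSLog(@"found AirPlay service: %@", service.name);
}
@end

int main(void) {
    @autoreleasepool {
        AirPlayBrowseCheck *check = [[AirPlayBrowseCheck alloc] init];
        NSNetServiceBrowser *browser = [[NSNetServiceBrowser alloc] init];
        browser.delegate = check;
        [browser searchForServicesOfType:@"_airplay._tcp." inDomain:@"local."];
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}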
I wrote a simple NSMutableData test project.
I profiled it with the Allocations instrument, which shows the total bytes for alloc1() as 55 MB.
But alloc1() is only called once and should allocate 1 MB, and I cannot find the reason for the 55 MB attributed to alloc1().
To reproduce, replace the ViewController of a fresh macOS App project in Xcode 13 with this code:
#import "ViewController.h"

@implementation ViewController {
    NSTimer *mTimer;
    NSMutableData *mData1;
    NSMutableData *mData2;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    mData1 = nil;
    mData2 = nil;
    // Fire timer_cb once per second.
    mTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                              target:self
                                            selector:@selector(timer_cb)
                                            userInfo:nil
                                             repeats:YES];
}

- (void)timer_cb {
    if (mData1 == nil) {
        [self alloc1];
    }
    if (mData2 == nil) {
        [self alloc2];
    }
    [self copy1];
}

// Allocate mData1 with a 1 MB capacity hint.
- (void)alloc1 {
    NSLog(@"alloc1");
    mData1 = [NSMutableData dataWithCapacity:1024 * 1024];
}

// Allocate mData2 with a 1 MB capacity hint, then zero the first 1 MB.
- (void)alloc2 {
    NSLog(@"alloc2");
    mData2 = [NSMutableData dataWithCapacity:1024 * 1024];
    [mData2 resetBytesInRange:NSMakeRange(0, 1024 * 1024)];
}

// Copy 1 MB from mData2 into mData1 every second.
- (void)copy1 {
    [mData1 replaceBytesInRange:NSMakeRange(0, 1024 * 1024) withBytes:mData2.bytes];
}

@end
I am testing with VLC as an RTSP audio client on macOS.
Every 5 minutes I hear noise.
The noise lasts for about 3 seconds and happens exactly every 5 minutes.
During the noise period, kernel_task uses about 25% extra CPU for those 3 seconds, and Console -> wifi.log shows a message starting with
SCAN request received from pid ??? (locationd) with priority=2, qos=-1 (default), frontmost=no
I checked with Wireshark: RTP/UDP packets arrive every 20 ms, but during the noise period no packets arrive for about 140 ms. That gap causes the silent period and the noise.
If I disable Wi-Fi and use an Ethernet cable, the noise is gone.
If I disable Settings -> Security & Privacy -> Location Services, the noise is also gone.
Is there any way to keep receiving RTP/UDP packets during locationd's Wi-Fi scan?
My environment:
macOS Big Sur ver 11.4
iMac (Retina 5K, 27-inch, 2017)
VLC 3.0.16 (Intel 64-bit)
I see that NSNetServiceBrowser is marked as deprecated in Apple's documentation, but I cannot find a replacement for this class.
I need to write an mDNS discovery program. Should I still use NSNetServiceBrowser, or is there another class I should use?
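For what it's worth, the closest replacement I have found so far is the browser API in the Network framework (NWBrowser in Swift, nw_browser in C / Objective-C). The sketch below is roughly what I would write with it, but I am not sure whether this is the intended replacement; the function name and the service type passed in are just examples of mine.
#import <Foundation/Foundation.h>
#import <Network/Network.h>

// Browse for a Bonjour (mDNS) service type with the Network framework.
static nw_browser_t startBrowsing(const char *serviceType) {
    nw_browse_descriptor_t descriptor =
        nw_browse_descriptor_create_bonjour_service(serviceType, NULL);
    nw_browser_t browser = nw_browser_create(descriptor, NULL);
    nw_browser_set_queue(browser, dispatch_get_main_queue());
    nw_browser_set_browse_results_changed_handler(browser,
        ^(nw_browse_result_t old_result, nw_browse_result_t new_result,
          bool batch_complete) {
            if (new_result != NULL) {
                nw_endpoint_t endpoint = nw_browse_result_copy_endpoint(new_result);
                NSLog(@"found service: %s",
                      nw_endpoint_get_bonjour_service_name(endpoint));
            }
        });
    nw_browser_start(browser);
    return browser; // caller must keep a strong reference while browsing
}
For example, I would call startBrowsing("_http._tcp") on the main thread and keep the returned browser alive for as long as I want results. Is this the recommended direction, or should I keep using NSNetServiceBrowser?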