
C10k and number of sockets
hello, the wiki article for the C10k problem states this:

>>> The term [C10k problem] was coined in 1999 by Dan Kegel, citing the Simtel FTP host, cdrom.com, serving 10,000 clients at once over 1 gigabit per second Ethernet in that year. The term has since been used for the general issue of large number of clients, with similar numeronyms for larger number of connections, most recently "C10M" in the 2010s. By the early 2010s millions of connections on a single commodity 1U server became possible: over 2 million connections (WhatsApp, 24 cores, using Erlang on FreeBSD), 10–12 million connections (MigratoryData, 12 cores, using Java on Linux) <<<

i wonder how it is possible for a single physical server to support more than 64K open connections. is there no limit on the number of open sockets? as this is an apple forum, let's presume this question is about OSX (though relevant comments for other platforms are welcome).
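for context on the "is there a limit" part, here is a minimal sketch (macOS, Swift) of inspecting and raising the per-process open-file-descriptor limit that every socket counts against; the function name is mine, and the kernel may still cap the effective maximum (sysctl kern.maxfiles / kern.maxfilesperproc):

```swift
import Darwin

// sketch: query the current descriptor limits and try to raise the soft limit.
// printAndRaiseDescriptorLimit is my own name, not a system API.
func printAndRaiseDescriptorLimit() {
    var limit = rlimit()
    guard getrlimit(RLIMIT_NOFILE, &limit) == 0 else { return }
    print("soft limit: \(limit.rlim_cur), hard limit: \(limit.rlim_max)")

    // raise the soft limit as far as the hard limit / OPEN_MAX allows
    limit.rlim_cur = min(limit.rlim_max, rlim_t(OPEN_MAX))
    if setrlimit(RLIMIT_NOFILE, &limit) != 0 {
        print("setrlimit failed: \(String(cString: strerror(errno)))")
    }
}
```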
10 replies · 0 boosts · 1.1k views · Jan ’20
TestFlight build without version change
When distributing my app to external testers i can do so without waiting for TestFlight review as long as i keep the app version the same and change only the build number; this works great. once i release a build to the appstore, the next testflight build i submit insists on a higher app version ("file must contain a higher version than that of the previously approved version"), but if i bump the version i have to wait for testflight approval again. is there a way around this?

1) 0.0.1 build 10 // first testflight build of 0.0.1 - needs testflight approval, ok
2) 0.0.1 build 11 // version not changed - no testflight approval needed, good
3) 0.0.1 build 12 // version not changed - no testflight approval needed, good
4) 0.0.1 build 13 // submitted to appstore - appstore approval, ok, no testflight approval needed, good
5) 0.0.1 build 14 // for testflight purposes only! -- error "must contain a higher version"! oops... (kind of weird that i can't submit another build of 0.0.1 just for testflight purposes)
6) 0.0.2 build 15 // for testflight purposes only! - needs testflight approval...
3 replies · 0 boosts · 4.1k views · Apr ’20
UTI deprecated?
here: https://developer.apple.com/documentation/mobilecoreservices/uttype i can see that everything related to uniform type identifiers (UTI) is deprecated. what shall i use instead? the task at hand is to determine whether a particular content-type (aka mime type) is an image, a video, etc, by comparing mime strings against some table of my own.
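in case it helps frame the question: my understanding is that the UniformTypeIdentifiers framework (iOS 14 / macOS 11 and later) is the intended replacement, so the check i'm after would look roughly like this sketch (the function name is mine):

```swift
import UniformTypeIdentifiers

// sketch: classify a mime type by converting it to a UTType and
// checking conformance to the system-declared base types
func kind(ofMimeType mime: String) -> String {
    guard let type = UTType(mimeType: mime) else { return "unknown" }
    if type.conforms(to: .image) { return "image" }
    if type.conforms(to: .movie) || type.conforms(to: .video) { return "video" }
    if type.conforms(to: .audio) { return "audio" }
    return "other"
}

// kind(ofMimeType: "image/png")  -> "image"
// kind(ofMimeType: "video/mp4")  -> "video"
```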
1 reply · 0 boosts · 1.2k views · Jun ’20
diffable data source can't identify items?
i have a list like this:

id: 1, image: image1, title: title1, badge: 0
id: 2, image: image2, title: title2, badge: 0
id: 3, image: image3, title: title3, badge: 0
...

is my understanding correct that in order to get a smooth "expected" animation when i want to change both the badge of an item and its order, i have to manually split this "big" update into two smaller updates (first change then move, or vice versa)? this is somewhat surprising; i would expect a diffable implementation to have a notion of "identity" (in the example above it's "id") and calculate the differences based on that identity plus an item equivalence check, rather than just on the hash/equality check for the whole item.
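for illustration, the workaround i have in mind looks like this sketch (Row, model and applyChange are my own names, and i haven't verified the animation is as smooth as i'd like): use only the stable id as the diffable item identifier, keep the mutable fields in a separate model, and combine the move with a reload in one snapshot:

```swift
import UIKit

// sketch: the diffable machinery only ever sees the stable Int id;
// the mutable fields live in a model dictionary keyed by that id
struct Row {
    let id: Int
    var title: String
    var badge: Int
}

var model: [Int: Row] = [:]   // id -> current contents

func applyChange(moving id: Int, after other: Int, newBadge: Int,
                 in dataSource: UITableViewDiffableDataSource<String, Int>) {
    model[id]?.badge = newBadge        // content change, invisible to the diff
    var snapshot = dataSource.snapshot()
    snapshot.moveItem(id, afterItem: other)
    snapshot.reloadItems([id])         // ask for the moved item's cell to be re-rendered
    dataSource.apply(snapshot, animatingDifferences: true)
}
```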
11 replies · 0 boosts · 7.2k views · Jul ’20
NSDiffableDataSourceSnapshot bug?
i found something really odd in NSDiffableDataSourceSnapshot behaviour. the tl;dr version is that this:

```swift
snapshot.moveItem("one", afterItem: "two")
dataSource.apply(snapshot)
```

gives the correct result, while this:

```swift
var snapshotCopy = snapshot
snapshotCopy.moveItem("one", afterItem: "two")
dataSource.apply(snapshotCopy)
```

gives an incorrect result, despite being semantically equivalent. snapshot is a true value type, right? it is as odd as if i wrote:

```swift
var b = a
b += 1
a = b
```

and got a different result in "a" compared to "a += 1".

minimal source fragment attached (doesn't need nib files/storyboards):

```swift
import UIKit

class ViewController: UIViewController {
    var tv: UITableView!
    var dataSource: UITableViewDiffableDataSource<String, String>!

    override func viewDidLoad() {
        super.viewDidLoad()

        tv = UITableView(frame: view.bounds)
        tv.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(tv)
        tv.register(UITableViewCell.self, forCellReuseIdentifier: "cell")

        var originalSnapshot = NSDiffableDataSourceSnapshot<String, String>()
        originalSnapshot.appendSections(["***"])
        originalSnapshot.appendItems(["one", "two", "three"])

        dataSource = UITableViewDiffableDataSource<String, String>(tableView: tv) { (tv, indexPath, item) -> UITableViewCell? in
            let cell = tv.dequeueReusableCell(withIdentifier: "cell", for: indexPath)
            cell.textLabel?.text = item
            return cell
        }
        dataSource.apply(originalSnapshot)

        DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
            var snapshot = self.dataSource.snapshot()
            snapshot.insertItems(["four"], afterItem: "one")

            if true {
                var snapshotCopy = snapshot
                snapshotCopy.moveItem("one", afterItem: "two")
                self.dataSource.apply(snapshotCopy) /* gives incorrect result: two, one, three */
            } else {
                snapshot.moveItem("one", afterItem: "two")
                self.dataSource.apply(snapshot) /* gives correct result: four, two, one, three */
            }
        }
    }
}

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        window = UIWindow(frame: UIScreen.main.bounds)
        window!.rootViewController = ViewController()
        window!.makeKeyAndVisible()
        return true
    }
}
```
1 reply · 0 boosts · 583 views · Jul ’20
diffable data source vs hashability
i did some experiments with the "reference" equivalents of the snapshot and data source types (NSDiffableDataSourceSnapshotReference & UITableViewDiffableDataSourceReference vs NSDiffableDataSourceSnapshot & UITableViewDiffableDataSource) - partly for a better understanding of the diffable machinery, partly for Obj-C compatibility, partly because using generics is not convenient for the model objects in question due to their profound viral effect on the surrounding code base. it actually works very well. but i noticed that my item value types are not hashable... if i mark them hashable i can see hashValue / hash(into:) being called and everything works as expected, but if i don't mark them hashable the hash methods are not called, and everything still works. how does that work? note that hashability is required when using the swift wrapper types (NSDiffableDataSourceSnapshot & UITableViewDiffableDataSource).
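to make that concrete, the conformance i was toggling looks roughly like this sketch (Item and its fields are made up for illustration); with the generic swift types this conformance is mandatory, while with the …Reference types the same struct works without it - my guess is that the bridged objective-c boxes then fall back to identity-based -hash / -isEqual:, but i'd love a confirmation:

```swift
import UIKit

// with NSDiffableDataSourceSnapshot<Section, Item> the item identifier must be Hashable;
// these are the methods i can see being called when the conformance is present
struct Item: Hashable {
    let id: Int
    var title: String

    func hash(into hasher: inout Hasher) {
        hasher.combine(id)              // identity-based hashing
    }
    static func == (lhs: Item, rhs: Item) -> Bool {
        lhs.id == rhs.id                // identity-based equality
    }
}
```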
0 replies · 0 boosts · 709 views · Jul ’20
DispatchQueue.current
hello, i don't think this is provided by the system already, so i'd like to implement a smart version of the DispatchQueue.async function - one that will not reschedule the block if i am already on the queue in question, and will call the block directly instead in that case.

```swift
extension DispatchQueue {
    func asyncSmart(execute: @escaping () -> Void) {
        if DispatchQueue.current === self { // ?????
            execute()
        } else {
            async(execute: execute)
        }
    }
}
```

the immediate problem is that there is no way to get the current queue (in order to compare it with the queue in question and take the right branch). has anyone been through this and solved the puzzle?
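one workaround i'm considering is queue-specific data: tag each queue with a unique token and compare the current queue's token with the target's. sketch only - registerForSmartAsync, asyncSmart and the key below are my own names, not system API:

```swift
import Foundation

extension DispatchQueue {
    private static let tokenKey = DispatchSpecificKey<UUID>()

    /// call once per queue that asyncSmart should be able to recognize
    func registerForSmartAsync() {
        setSpecific(key: DispatchQueue.tokenKey, value: UUID())
    }

    func asyncSmart(execute work: @escaping () -> Void) {
        // the static getSpecific reads the value attached to the *current* queue
        // (or its target hierarchy); the instance call reads the value attached to self
        let current = DispatchQueue.getSpecific(key: DispatchQueue.tokenKey)
        let target = getSpecific(key: DispatchQueue.tokenKey)
        if let current = current, let target = target, current == target {
            work()                    // already on this queue: run inline
        } else {
            async(execute: work)      // otherwise schedule as usual
        }
    }
}

// usage:
// let someQueue = DispatchQueue(label: "someQueue")
// someQueue.registerForSmartAsync()
// someQueue.asyncSmart { /* runs inline when already on someQueue */ }
```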
10 replies · 1 boost · 8.7k views · Jul ’20
network warmup hijacks main thread
hello, i found something peculiar: the very first http/https network load in the app, even when done on a background queue, affects the main thread significantly. have a look at the screen recording of the app: http shorturl.at/nxAE4 and at how non-linear the scrolling performance is: http shorturl.at/uADY2. there i scroll a list of simple items, and when it hits item 30 it loads a url on a background queue. the result (in this case an error) is delivered on another background queue and ignored. the second load of this or a different URL doesn't cause such a delay. what exactly is the system doing on the main thread here, and can it not do it elsewhere? what is the best way to figure out what extra code is executed on the main thread? i tried instruments with various tools in it but can't make heads or tails of it to figure out what exactly is going on here. the sample code used for url loading:

```swift
let someQueue = DispatchQueue(label: "someQueue")
let otherQueue = DispatchQueue(label: "otherQueue")

var someOpQueue: OperationQueue = {
    let q = OperationQueue()
    q.underlyingQueue = someQueue
    q.name = "someOpQueue"
    return q
}()

var session = URLSession(configuration: .default, delegate: nil, delegateQueue: someOpQueue)

func loadUrl(_ url: URL) {
    otherQueue.async {
        let task = session.dataTask(with: url) { d, r, e in /* ignore */ }
        task.resume()
    }
}
```
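as a stopgap, since only the very first load seems to pay this price, i'm considering warming the stack up at launch with a throwaway request fired off the main thread; a sketch reusing `otherQueue` and `session` from above (the URL is just a placeholder of mine):

```swift
// fire one cheap request early so the one-time setup cost is paid
// before the user starts scrolling; the response is ignored
func warmUpNetworking() {
    otherQueue.async {
        guard let url = URL(string: "https://example.com/") else { return }
        session.dataTask(with: url) { _, _, _ in /* ignore, warm-up only */ }.resume()
    }
}
```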
5 replies · 0 boosts · 708 views · Aug ’20
virtual camera on macOS?
is there a way to create a virtual camera for macOS? a camera that would show up in all apps including Apple's PhotoBooth, Safari, etc? for the sake of this question consider a virtual camera that shows a generated image with numbers 1 to 10 in a loop.

is a CoreMedia plugin the answer? i've seen a question here related to codesigning: https://developer.apple.com/forums/thread/131826 with no conclusion.

is a DEXT driver extension the answer? note there is no custom hardware here, the camera is virtual and doesn't need hardware; is a DEXT allowed at all if there's no hardware device?

is a KEXT kernel extension the answer? i know it is being phased out, but if that's the only way now - so be it.

is QT / Sequence Grabber the answer, or is that completely dead now?

is method swizzling / some other kind of "patching" the answer?

any working solution, even a deprecated or dodgy one, will do, but of course a modern, apple-sanctioned one is preferable.
5 replies · 1 boost · 6.7k views · Aug ’20
a virtual macOS microphone
hello, how do i create a virtual microphone on macOS that can be selected as the default input device in System Settings or in apps like FaceTime / QuickTime Player / Skype, etc? is an Audio HAL plugin the way to go? i've seen this macOS 10.15 note: "Legacy Core Audio HAL audio hardware plug-ins are no longer supported. Use Audio Server plug-ins for audio drivers." though i am not sure how to read it, as i can think of these interpretations:

1) "legacy" Core Audio HAL audio hardware plug-ins are no longer supported (but you can still use non-legacy ones).

2) legacy Core Audio HAL audio "hardware" plug-ins are no longer supported (but you can still use non-hardware ones).

3) Core Audio HAL plug-ins are no longer supported at all (if you used that functionality to implement audio hardware drivers you can use Audio Server plug-ins instead, otherwise you are screwed).

the "Audio Server plugin" documentation is minimalistic: https://developer.apple.com/library/archive/qa/qa1811/_index.html which leads to 2013 sample code: https://developer.apple.com/library/archive/samplecode/AudioDriverExamples/Introduction/Intro.html containing a "nullAudio" plugin and a kernel-extension-backed plugin - neither of which i was able to resurrect (i'm on macOS Catalina now). any hints?
5 replies · 0 boosts · 5.5k views · Aug ’20
AVCaptureDevice.videoZoomFactor vanishing point wanted
before i dive into implementing this myself... is there a way of specifying the vanishing point for a zoom operation? AVCaptureDevice.videoZoomFactor: "The device achieves a zoom effect by cropping around the center of the image captured by the sensor." looks like this is one of those one-trick-pony features that only works in the specific case of a vanishing point at the center, and if i need a different vanishing point (e.g. to zoom into a corner) i have to implement that myself. FB8703018
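for the record, the DIY route i'm considering for the preview only is to leave videoZoomFactor alone and scale the preview layer about a chosen anchor point; sketch below (my own idea, not an AVCaptureDevice feature - it changes only what is displayed, captured buffers would still need their own cropping):

```swift
import AVFoundation
import UIKit

// sketch: zoom the preview toward an arbitrary point by moving the layer's
// anchor point there (keeping the layer in place) and scaling about that anchor.
// assumes the layer starts with an identity transform.
func zoomPreview(_ previewLayer: AVCaptureVideoPreviewLayer,
                 toward normalizedPoint: CGPoint,   // (0,0) = top-left … (1,1) = bottom-right
                 scale: CGFloat) {
    let frame = previewLayer.frame
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    previewLayer.anchorPoint = normalizedPoint
    // compensate the position so moving the anchor doesn't shift the layer
    previewLayer.position = CGPoint(x: frame.minX + frame.width * normalizedPoint.x,
                                    y: frame.minY + frame.height * normalizedPoint.y)
    previewLayer.setAffineTransform(CGAffineTransform(scaleX: scale, y: scale))
    CATransaction.commit()
}
```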
1 reply · 0 boosts · 504 views · Sep ’20