Posts

Post marked as solved
5 Replies
Claude, I'm writing my own Publisher, Subscriber, and Subscription types. Most everyone uses unlimited demand, so my questions don't apply to them; I really want to understand how demand is actually used, and with no demo code from Apple exercising it, there's no way to know. I recently wrote a small article on Combine, and there are a slew of them on the web, each with a different take on the topic. So many are wrong, and it would be great to get this clarified.
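For context, a minimal sketch (the type and names are my own, not from any article) of the kind of Subscriber I mean - one that requests a finite demand of one value at a time instead of .unlimited:

import Combine

// Requests one value up front, then one more per value received.
final class OneAtATimeSubscriber: Subscriber {
    typealias Input = Int
    typealias Failure = Never

    func receive(subscription: Subscription) {
        // Initial demand: a single value, not .unlimited.
        subscription.request(.max(1))
    }

    func receive(_ input: Int) -> Subscribers.Demand {
        print("received", input)
        // Returning .max(1) adds one to the outstanding demand.
        return .max(1)
    }

    func receive(completion: Subscribers.Completion<Never>) {
        print("completed", completion)
    }
}

// Usage: the sequence publisher honors the demand and delivers values one at a time.
(1...5).publisher.subscribe(OneAtATimeSubscriber())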
Post not yet marked as solved
9 Replies
Thank you so much for this! One question - does xcodebuild examine each library to determine which architecture it was built for (and, in the x86/arm case, what the target device is)? So there's no need for some manifest telling it this info, I guess. Again - thanks so much!
Post marked as solved
29 Replies
BTW, those warnings (and I'm dealing with them too in an open source lib) are now hard errors in Xcode 11.4-beta. FYI.
Post marked as solved
1 Reply
It turns out that NWPath has an array of interfaces (availableInterfaces), and they are Hashable, so one can keep a collection of interfaces and their state.

That said, I tested with my phone, turning Wi-Fi and cellular on and off, singly and together, to see when the update block gets called. Based on that, I concluded:
1) The status in the most recent update is the overall status (satisfied or not).
2) The most recent "satisfied" update reports the expensive state of the overall networking path: if the system believes it can use Wi-Fi, isExpensive is false; if the only option is cellular, the path is expensive.

In other words, if you turn cellular on, then Wi-Fi on, then Wi-Fi off, the last update references cellular and isExpensive is set.
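For reference, a minimal sketch of the monitoring code I'm describing, assuming the behavior above (the queue label is arbitrary):

import Foundation
import Network

let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    // The most recent update reflects the overall path.
    print("status:", path.status)            // .satisfied or .unsatisfied
    print("expensive:", path.isExpensive)    // true when only cellular remains
    for interface in path.availableInterfaces {
        print("interface:", interface.name, interface.type)
    }
}
monitor.start(queue: DispatchQueue(label: "path.monitor"))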
Post marked as solved
5 Replies
I understand optional protocol methods. The compiler knows the method is optional; that's why "delegate.stream" cannot be called directly. I correctly guessed that I could add a "?" and use "delegate.stream?(fetcher, handle: .openCompleted)", which compiles just fine. Then it occurred to me - why not capture the function as its own variable, assuming it exists, with "let stream = delegate.stream" in the guard statement? That works fine. But when I go to use this function, I get a warning that the "handle:" argument label should not be there. It is this I don't understand.
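To illustrate what I'm seeing (sketched with Foundation's StreamDelegate, whose stream(_:handle:) requirement is optional; my real delegate is analogous):

import Foundation

func notify(_ delegate: StreamDelegate, about aStream: Stream) {
    // Optional chaining keeps the argument label and compiles fine:
    delegate.stream?(aStream, handle: .openCompleted)

    // Capturing the method yields an optional function value of type
    // ((Stream, Stream.Event) -> Void)? - argument labels are not part
    // of a function value's type.
    guard let stream = delegate.stream else { return }

    // So the captured function has to be called without "handle:":
    stream(aStream, .openCompleted)
    // stream(aStream, handle: .openCompleted)   // rejected: extraneous argument label
}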
Post not yet marked as solved
10 Replies
Experiencing the same thing on the Simulator. What is interesting is that if you alternate - select the first row, go back, select the second row, go back - then you can select the first row again. The issue is that you cannot select the same row again. Is there a bug report on this?
Post not yet marked as solved
2 Replies
It just seems I could do better with my own Text() view in an HStack, but I haven't tried; a rough sketch of that idea is below. The Stepper could be shown on an iPad with a huge amount of space, or on a small iPhone, so there's no easy way to apply a frame to it. It sure seems odd to me that the text size shrinks with minimumScaleFactor regardless of the number of characters.
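The untried idea, sketched (names and values are placeholders): supply my own label next to a label-less Stepper, so the label only shrinks when it genuinely doesn't fit:

import SwiftUI

struct LabeledStepper: View {
    @State private var value = 5

    var body: some View {
        HStack {
            Text("Value: \(value)")
                .lineLimit(1)
                .minimumScaleFactor(0.5)   // shrink only when the text doesn't fit
            Spacer()
            Stepper("", value: $value, in: 0...10)
                .labelsHidden()
        }
    }
}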
Post marked as solved
2 Replies
Entered a bug on Swift.org, it was accepted, and as a result two other bugs were found :-) https://bugs.swift.org/browse/SR-11934
Post not yet marked as solved
2 Replies
OK - it never fails now if there is no filtering?!?! But if I add a filter (either of the two), then it fails at the next loop:

let videoComposition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    var outputImage = request.sourceImage
    // Always hand the (possibly filtered) frame back to the request.
    defer { request.finish(with: outputImage, context: nil) }
    do {
        // Rotate the hue by one degree per frame.
        var angle = self.hueAngle
        let radians = degrees2radians(angle)
        angle = (angle + 1) % 360
        self.hueAngle = angle
        self.hueFilter.angle = radians
        self.hueFilter.inputImage = outputImage
        outputImage = self.hueFilter.outputImage!
    }
    do {
        // Sweep the blur radius back and forth between 0 and 5.
        var radius = self.radius
        let inc = self.inc
        radius += inc
        if radius > 5 {
            radius = 5
            self.inc *= -1
        }
        if radius < 0 {
            radius = 0
            self.inc *= -1
        }
        self.radius = radius
        self.blurFilter.radius = radius
        self.blurFilter.inputImage = outputImage
        outputImage = self.blurFilter.outputImage!
    }
})

Obviously validating the composition makes something better in the composition object.
Post not yet marked as solved
2 Replies
Now I'm really pulling my hair out. I added this test to be 100% sure the video composition was properly constructed:

let ret = videoComposition.isValid(for: asset, timeRange: timeRange, validationDelegate: self)
print("VALIDATE", ret)

All possible delegate calls are implemented with print statements - it returns true and I got no delegate calls. But now the looper is not failing! Sixteen loops so far and still going strong (will leave it on!). Grrrrr
Post not yet marked as solved
2 Replies
Yes, it needs to be transparent. I don't have any of those fancy programs like After Effects, nor do I want to learn them for just this one task. The Pond5 video has a black background embedded in it: the first 30 seconds are the footage, and the second 30 seconds are the appropriate mask for the equivalent point in the core video.

I read a really interesting article on how to apply a mask in real time - I realize I could preprocess it too. Since links here can be problematic, try searching on the terms quentinfasquel ios-transparent-video-with-coreimage

In the end, I think I can do this in code: I'll run the video once, from start to finish, caching every frame (30*25 of them). Once I get past the 30-second mark, I'll combine each original frame with its mask frame and output that as an AVAsset (via a writer), saving the final 30 seconds with both images placed such that I can pull each back out (per the link above); a sketch of the per-frame masking step is below. I'll get more experience with AVFoundation and not waste time learning tools I'll never use again.
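For the per-frame masking step, a minimal sketch, assuming I have the color frame and the corresponding mask frame as CIImages (the function name is mine):

import CoreImage
import CoreImage.CIFilterBuiltins

// Produces a frame that is transparent wherever the mask is black,
// and shows the original footage wherever the mask is white.
func transparentFrame(color: CIImage, mask: CIImage) -> CIImage? {
    let blend = CIFilter.blendWithMask()
    blend.inputImage = color
    blend.backgroundImage = CIImage(color: .clear).cropped(to: color.extent)
    blend.maskImage = mask
    return blend.outputImage
}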
Post marked as solved
1 Reply
Two updates:
- I realized that this reduces alpha doubly: the original pixels have a fractional alpha, which is then multiplied by the alpha in the mask.
- I was able to do what I wanted by writing my own CIImageFilter in MTL, converting the pixels from RGBA with black mixed in to transparent RGBA that, when drawn over a black background, appears exactly the same (the per-pixel math is sketched below).
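A sketch of the per-pixel conversion I mean (the real version lives in the Metal kernel; this is just the math, and using the brightest channel as the recovered alpha is one possible convention):

// Given an opaque pixel that already has black blended in, recover a
// transparent pixel that composites over black to the identical color.
func unmixBlack(_ rgb: SIMD3<Float>) -> SIMD4<Float> {
    let alpha = max(rgb.x, max(rgb.y, rgb.z))                   // recovered coverage
    guard alpha > 0 else { return SIMD4<Float>(0, 0, 0, 0) }    // pure black becomes fully transparent
    let color = rgb / alpha                                     // un-premultiply
    // Check: color * alpha composited over black equals the original rgb.
    return SIMD4<Float>(color, alpha)
}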
Post not yet marked as solved
5 Replies
It won't run for me - on launch it fails to load some dylib
Post not yet marked as solved
1 Reply
I don't know the specific answer here, but I have an idea based on some similar code I did in the past. You need to get the pixel buffers from the source as they are being generated, and in most cases you will just immediately free them. However, when the user stops the video, you will then have the last pixel buffer displayed. Convert that pixel buffer into a CIImage, apply the same filters you are using in the video, and render the result to a CG (or UI) image. Display that image on top of the video. Now, when the filters are modified, they act on the displayed image. When you restart the video, you may need to keep that image frontmost for a short period of time until the modified filters "kick in" to the video; then you can animate the overlaid image away, and the user will see the video using the modified filter settings.
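A rough sketch of that idea, assuming you've held on to the last pixel buffer and that a hue adjustment is one of the filters in play (the function and parameter names are mine):

import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Builds a filtered still image from the last frame, suitable for
// overlaying on the paused player while the user adjusts the filter.
func overlayImage(from lastPixelBuffer: CVPixelBuffer,
                  hueAngle: Float,
                  context: CIContext = CIContext()) -> UIImage? {
    let frame = CIImage(cvPixelBuffer: lastPixelBuffer)

    // Mirror the live video composition's filter settings.
    let hue = CIFilter.hueAdjust()
    hue.inputImage = frame
    hue.angle = hueAngle

    guard let output = hue.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}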
Post not yet marked as solved
2 Replies
Digging around I found an Apple page on NSAllowsArbitraryLoadsInWebContent, which says in part:

Set this key's value to YES to exempt your app's web views from App Transport Security restrictions without affecting your URLSession connections. Domains you specify in the NSExceptionDomains dictionary aren't affected by this key's value. A web view is an instance of any of the following classes:
- WKWebView
- UIWebView (iOS only)
- WebView (macOS only)

Note that SFSafariViewController isn't mentioned - surely this is because it's already exempt.