Is there any way to have a UIBlurEffectView ignore views underneath it?
I mean more than just masking it. If you mask out a view below it, the blur will still process that view and you'll still see a "halo" around the edge. Ideally I'd like to have a view 'below' the blur, mask it out, and also not have that view be processed in the blur at all.
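To be concrete, the masking I'm referring to is along these lines; this is a rough sketch (addMaskedBlur and the frames are made up, not real API), and the masked-out area is still sampled by the blur, which is exactly the halo problem:

```swift
import UIKit

/// Sketch: clip a blur view so it does not *show* over `excludedFrame`.
/// Note this only clips the blur's output -- the content under that hole is
/// still sampled by the blur, which is what produces the "halo" at the edge.
func addMaskedBlur(to containerView: UIView, excluding excludedFrame: CGRect) -> UIVisualEffectView {
    let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
    blurView.frame = containerView.bounds
    containerView.addSubview(blurView)

    // Opaque mask everywhere except a hole over excludedFrame
    // (even-odd fill rule punches the hole).
    let maskView = UIView(frame: blurView.bounds)
    let path = UIBezierPath(rect: maskView.bounds)
    path.append(UIBezierPath(rect: excludedFrame))
    let shape = CAShapeLayer()
    shape.path = path.cgPath
    shape.fillRule = .evenOdd
    shape.fillColor = UIColor.black.cgColor
    maskView.layer.addSublayer(shape)

    blurView.mask = maskView
    return blurView
}
```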
Repo is here
Visual Description is here:
Basically the red view should never be inside the blue view. The red view has a fixed aspect ratio. The blue view initially has an aspect ratio different from the red view and is animated to have the same aspect ratio.
The red view is constrained inside the blue view as follows:
// Fixed 9:16 (width:height) aspect ratio on the red view (brokenView)
let ac = brokenView.widthAnchor.constraint(equalTo: brokenView.heightAnchor, multiplier: 9/16)
// Keep the red view centered in the blue view (animView)
let xc = brokenView.centerXAnchor.constraint(equalTo: animView.centerXAnchor)
let yc = brokenView.centerYAnchor.constraint(equalTo: animView.centerYAnchor)
// Prefer equal widths, but only weakly
let widthC = brokenView.widthAnchor.constraint(equalTo: animView.widthAnchor)
widthC.priority = .defaultLow
// Required: the red view must be at least as wide and as tall as the blue view
let gewc = brokenView.widthAnchor.constraint(greaterThanOrEqualTo: animView.widthAnchor)
let geHC = brokenView.heightAnchor.constraint(greaterThanOrEqualTo: animView.heightAnchor)
geHC.priority = .required
NSLayoutConstraint.activate([ac, xc, yc, widthC, gewc, geHC]) // activation happens as in the repo (assumed)
So the red view should start at equal width, but always prioritize being taller and wider than the blue view while staying centered and keeping a fixed aspect ratio. As you can see in the visual description, it is not prioritizing being taller.
Any suggestions on how to fix, work around, or otherwise get past this would be appreciated. Really trying to avoid manually doing a spring animation with keyframes and an assload of math.
I have filed a bug on Feedback Assistant, but figured someone here might have experience / know-how. Thanks.
First off - I have read and fully understand this post: Apple doesn't want us abusing users' hardware, so as to maximize the quality of experience across apps for their customers. I am 100% on board with this philosophy; I understand the design decisions and agree with them.
To the problem:
I have an app that takes photo library assets, processes them for network use (exportSession.shouldOptimizeForNetworkUse = true), and then uploads them. Some users have been having trouble; I did some digging, and they're trying to upload 4K 60fps videos. I think that is ridiculous, but it's not my place.
The issue is that the export time for a 4K 60fps video that is ~40s long can be as long as 2 minutes. So if they select a video to upload and then background the app, that upload will ALWAYS fail because the processing fails (I have background uploads working just fine). My understanding is that by default I have about 30s to run things in the background, and I can use UIApplication.beginBackgroundTask (a.k.a. "please give me background time") to request up to roughly 30 more seconds. That is obviously not enough.
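For context, the pattern I'm describing looks roughly like this (a sketch; the export setup is simplified and the preset/identifiers are made up, not my exact code):

```swift
import AVFoundation
import UIKit

// Sketch: wrap the export in a background task so it can survive brief
// backgrounding. The extra time granted is on the order of seconds, so a
// multi-minute 4K60 export still gets killed before it finishes.
func exportForUpload(asset: AVAsset, to outputURL: URL, completion: @escaping (Bool) -> Void) {
    var bgTask: UIBackgroundTaskIdentifier = .invalid
    bgTask = UIApplication.shared.beginBackgroundTask(withName: "video-export") {
        // Expiration handler: we're out of time, the export will fail.
        UIApplication.shared.endBackgroundTask(bgTask)
        bgTask = .invalid
    }

    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPreset1280x720) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.shouldOptimizeForNetworkUse = true

    session.exportAsynchronously {
        completion(session.status == .completed)
        UIApplication.shared.endBackgroundTask(bgTask)
        bgTask = .invalid
    }
}
```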
So my second option is BGProcessingTask, but that's not guaranteed to ever run. Which I understand and agree with, but when the user selects a video while the app is in the foreground, the expectation is that it immediately begins processing. So I can't use a BGProcessingTask?
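For completeness, this is the kind of scheduling I mean (a sketch; the task identifier is made up and would also need to appear in Info.plist under BGTaskSchedulerPermittedIdentifiers):

```swift
import BackgroundTasks

// Sketch: queue the export as a processing task. The system decides when
// (and whether) to run it, which is exactly the problem for a user-initiated,
// "should start right now" export.
func scheduleDeferredExport() {
    let request = BGProcessingTaskRequest(identifier: "com.example.app.video-export") // hypothetical identifier
    request.requiresNetworkConnectivity = false   // the export itself is local
    request.requiresExternalPower = false
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule export task: \(error)")
    }
}

// Registration (e.g. in application(_:didFinishLaunchingWithOptions:)):
// BGTaskScheduler.shared.register(forTaskWithIdentifier: "com.example.app.video-export",
//                                 using: nil) { task in
//     // run the export, then call task.setTaskCompleted(success:)
// }
```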
Just wondering what the expected resolution here is: I run the task, beg for time, and if it doesn't complete I queue up a BGTask that may or may not ever run? That seems ****** for a user - they start the process, see it begin, but if the video is too big they just have to deal with it possibly not happening until later? They open the app and the progress bar has magically regressed. That would infuriate me.
Is there another option I'm not seeing? Some way to say "this is a large background task that will ideally take 30-60s, but it may take up to ~5-7m. It is user-facing though so it must start right away"? (I have never seen 5-7m, but 1-2m is common)
Another option is to just upload the full 4K60FPS behemoth and do the processing on my end, but that seems worse for the user? We're gobbling upload bandwidth for something we're going to downsample anyway. Seems worse to waste data than battery (since that's the tradeoff at the end of the day).
Anyway, just wondering what the right way to do this is. Trivially reproducible: record a 1-minute 4K 60fps video, create an export session, export, background the app, enjoy the failure.
I have a navigation controller with two VCs. One VC is pushed onto the NavController, the other is presented on top of the NavController. The presented VC has a relatively complex animation involving a CAEmitter -> Animate birth rate down -> Fade out -> Remove. The pushed VC has an 'inputAccessoryView' and can become first responder.
The expected behavior is open presented VC -> Emitter Emits pretty pictures -> emitter stops gracefully.
The animation works perfectly. UNLESS I open pushed VC -> Leave -> go to presented VC. In this case when I open the presented VC the emitter emits pretty pictures -> they never stop. (Please do not ask me how long it took to figure this much out 🤬😔)
The animation code in question is:
// Ramp the birth rate down to 0 over 1 second
let animation = CAKeyframeAnimation(keyPath: #keyPath(CAEmitterLayer.birthRate))
animation.duration = 1
animation.timingFunction = CAMediaTimingFunction(name: .easeIn)
animation.values = [1, 0, 0]
animation.keyTimes = [0, 0.5, 1]
animation.fillMode = .forwards
animation.isRemovedOnCompletion = false

emitter.beginTime = CACurrentMediaTime()

let now = Date()
CATransaction.begin()
CATransaction.setCompletionBlock { [weak self] in
    // Once the birth-rate animation finishes, fade the emitter out;
    // removal happens in animationDidStop(_:finished:) via the delegate.
    print("fade beginning -- delta: \(Date().timeIntervalSince(now))")
    let transition = CATransition()
    transition.delegate = self
    transition.type = .fade
    transition.duration = 1
    transition.timingFunction = CAMediaTimingFunction(name: .easeOut)
    transition.setValue(emitter, forKey: kKey)
    transition.isRemovedOnCompletion = false
    emitter.add(transition, forKey: nil)
    emitter.opacity = 0
}
emitter.add(animation, forKey: nil)
CATransaction.commit()
The delegate method is:
extension PresentedVC: CAAnimationDelegate {
    func animationDidStop(_ anim: CAAnimation, finished flag: Bool) {
        if let emitter = anim.value(forKey: kKey) as? CALayer {
            emitter.removeAllAnimations()
            emitter.removeFromSuperlayer()
        } else {
        }
    }
}
Here is the pushed VC:
class PushedVC: UIViewController {
    override var canBecomeFirstResponder: Bool {
        return true
    }

    override var canResignFirstResponder: Bool {
        return true
    }

    override var inputAccessoryView: UIView? {
        return UIView()
    }
}
So to reiterate: if I push PushedVC onto the nav controller, pop it, then present PresentedVC, the emitters emit, but the call to emitter.add(animation, forKey: nil) is essentially ignored. The emitter just keeps emitting.
Here are some sample happy print statements from the completion block:
fade beginning -- delta: 1.016232967376709
fade beginning -- delta: 1.0033869743347168
fade beginning -- delta: 1.0054619312286377
fade beginning -- delta: 1.0080779790878296
fade beginning -- delta: 1.0088880062103271
fade beginning -- delta: 0.9923020601272583
fade beginning -- delta: 0.99943196773529
Here are my findings:
- The issue presents only when the pushed VC has an inputAccessoryView AND canBecomeFirstResponder is true.
- It does not matter if the inputAccessoryView is UIKit or custom, has size, is visible, or anything.
- When I dismiss PresentedVC the animation is completed and the print statements show. Here are some unhappy print examples:
fade beginning -- delta: 5.003802061080933
fade beginning -- delta: 5.219511032104492
fade beginning -- delta: 5.73025906085968
fade beginning -- delta: 4.330522060394287
fade beginning -- delta: 4.786169052124023
- CATransaction.flush() does not fix anything.
- Removing the entire CATransaction block and just calling emitter.add(animation, forKey: nil) similarly does nothing - the birth rate decrease animation does not happen.
I am having trouble creating a simple demo project where the issue is reproducible (it is 100% reproducible in my code, the entirety of which I'm not going to link here), so I think getting a "solution" is unrealistic. What I would love is suggestions on where else to look, or any ways to debug CAAnimation. I think if I can solve the last bullet - emitter.add(animation, forKey: nil) called without a CATransaction - I can break this whole thing open. Why would a CAAnimation added directly to a layer that is visible and doing stuff refuse to run?
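For what it's worth, the kind of instrumentation I've been using to check whether the animation is even attached looks like this (a sketch; the key name and delay are arbitrary):

```swift
// Sketch: add the animation under an explicit key so it can be inspected,
// then check whether Core Animation actually kept it and what the
// presentation layer thinks the birth rate is.
emitter.add(animation, forKey: "birthRateRampDown")

DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
    print("animation keys:", emitter.animationKeys() ?? [])
    print("model birthRate:", emitter.birthRate)
    print("presentation birthRate:", emitter.presentation()?.birthRate ?? -1)
    print("speed:", emitter.speed, "timeOffset:", emitter.timeOffset)
}
```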
I am trying to implement (in Swift/iOS) Resumable Uploads in a similar fashion to this wwdc23 video
Fortunately there is an open source solution to do most of the work! The issue is that background URLSessions seem to have a mind of their own?
Basically the gist of the resumable uploads is that an upload will progress to the best of its ability, and then when it fails it's supposed to run a HEAD request to see how much of the file has been written, and then resume uploading from the last good offset.
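In code, the flow I'm trying to implement looks roughly like this (a sketch modeled on the draft resumable-upload protocol; the Upload-Offset header, PATCH method, and uploadURL are assumptions rather than my exact implementation, and it ignores the fact that background URLSessions can't use completion handlers):

```swift
import Foundation

// Sketch: after an upload task fails, ask the server how many bytes it has
// (HEAD + Upload-Offset header), then resume from that offset.
func resumeUpload(of fileURL: URL, to uploadURL: URL, using session: URLSession) {
    var headRequest = URLRequest(url: uploadURL)
    headRequest.httpMethod = "HEAD"

    session.dataTask(with: headRequest) { _, response, error in
        guard error == nil,
              let http = response as? HTTPURLResponse,
              let offsetString = http.value(forHTTPHeaderField: "Upload-Offset"),
              let offset = UInt64(offsetString) else {
            return // couldn't determine the offset; bail (or restart from 0)
        }

        // Read the remaining bytes and upload just that slice.
        guard let handle = try? FileHandle(forReadingFrom: fileURL),
              (try? handle.seek(toOffset: offset)) != nil,
              let remaining = try? handle.readToEnd() else { return }

        var patchRequest = URLRequest(url: uploadURL)
        patchRequest.httpMethod = "PATCH"
        patchRequest.setValue(offsetString, forHTTPHeaderField: "Upload-Offset")
        session.uploadTask(with: patchRequest, from: remaining).resume()
    }.resume()
}
```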
I cannot for the life of me get this to work in a "real life" scenario. Something as trivial as leaving Wi-Fi for cellular causes everything to break. And this isn't that crazy of a scenario! It's totally reasonable to expect a user to do some stuff before they leave the house, then take their phone and drive away. The Wi-Fi connection will drop and the user will switch to cellular.
As it is right now, the way URLSession handles things is that the initial task fails, but instead of failing loudly and hard, it fails quietly and softly, and as soon as there is network again the request is completely restarted from the beginning. This breaks everything. Either the original request timed out and attempting to re-send the entire file fails because the offsets don't match anymore (the previous attempt had some non-zero offset, the new one has offset 0, and non-zero != 0), or it fails because there are now two requests that each think they have the appropriate offset and everything gets out of sync.
My understanding is that iOS 17 has implemented resumable uploads in a nice way, but how do I manage this for all the users who don't immediately upgrade? Is there a way to say, "Hey URLSession, I know you want to immediately restart this request that failed because the network disappeared, and normally that's what people want, but instead just cancel that failed request, then run this little handler to see where you should really start from, and start from there"?
My guess is that this is impossible because it needs to run in the background, and iOS doesn't like letting apps run when they're not supposed to; that's why this all had to be baked into iOS 17.
Just looking for anyone who has any idea how to handle this.
Trying to get a "sticky cell" that will orthogonally scroll with the section, but only partially, leaving the trailing end exposed until scrolled backwards.
Demo Code:
https://developer.apple.com/documentation/uikit/views_and_controls/collection_views/implementing_modern_collection_views
(Download the project)
Open: Modern Collection Views > Compositional Layout > Advanced layouts View Controllers > OrthogonalScrollBehaviorViewController.swift
Replace
func createLayout() -> UICollectionViewLayout {
...
}
With
func createLayout() -> UICollectionViewLayout {
    let config = UICollectionViewCompositionalLayoutConfiguration()
    config.interSectionSpacing = 20

    let layout = UICollectionViewCompositionalLayout(sectionProvider: {
        (sectionIndex: Int, layoutEnvironment: NSCollectionLayoutEnvironment) -> NSCollectionLayoutSection? in
        guard let sectionKind = SectionKind(rawValue: sectionIndex) else { fatalError("unknown section kind") }

        let leadingItem = NSCollectionLayoutItem(layoutSize: NSCollectionLayoutSize(
            widthDimension: .fractionalWidth(0.5), heightDimension: .fractionalHeight(0.5)))
        let orthogonallyScrolls = sectionKind.orthogonalScrollingBehavior() != .none

        let containerGroup = NSCollectionLayoutGroup.horizontal(
            layoutSize: NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0),
                                               heightDimension: .fractionalHeight(0.4)),
            subitems: [leadingItem])

        let section = NSCollectionLayoutSection(group: containerGroup)
        section.orthogonalScrollingBehavior = sectionKind.orthogonalScrollingBehavior()

        // "Sticky" first cell: once the section has scrolled past the first
        // cell's width (minus a 50pt buffer), translate the cell along with
        // the content offset so its trailing edge stays exposed.
        section.visibleItemsInvalidationHandler = { (items, offset, env) in
            let buffer: CGFloat = 50
            for item in items {
                if item.indexPath.item == 0 {
                    item.zIndex = 25
                    let w = item.frame.width
                    if offset.x >= (w - buffer) {
                        item.transform = CGAffineTransform(translationX: offset.x - (w - buffer), y: 0)
                    } else {
                        item.transform = .identity
                    }
                } else {
                    item.zIndex = -item.indexPath.item
                }
            }
        }

        let sectionHeader = NSCollectionLayoutBoundarySupplementaryItem(
            layoutSize: NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0),
                                               heightDimension: .estimated(44)),
            elementKind: OrthogonalScrollBehaviorViewController.headerElementKind,
            alignment: .top)
        section.boundarySupplementaryItems = [sectionHeader]
        return section
    }, configuration: config)
    return layout
}
As you can see, it works perfectly RIGHT UP until the cell is "offscreen" according to its bounds (even though it is still clearly visible on screen), at which point it disappears.
I have also tried a custom UICollectionViewCompositionalLayout subclass that makes sure the sticky element always has attributes in layoutAttributesForElements(in:), with exactly the same results: as soon as the 'bounds' are offscreen the cell is removed, even if the frame is very clearly still on screen.
How are developers supposed to support interactive reordering of custom cells in UICollectionViews that use UICollectionViewDiffableDataSource?
// Driven by a long-press gesture, typically:
collectionView.beginInteractiveMovementForItem(at: iPath)      // on .began
collectionView.updateInteractiveMovementTargetPosition(target) // on .changed
collectionView.endInteractiveMovement()                        // on .ended
This works maybe 70% of the time, but the other 30% of the time, after calling endInteractiveMovement(), the data source's snapshot does not properly reflect the updated state.
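For reference, my understanding is that the diffable route is supposed to go through the data source's reorderingHandlers; a minimal sketch of that wiring (the String/Int identifier types and function name are placeholders, not my actual code) looks like this:

```swift
import UIKit

// Sketch: let the diffable data source participate in interactive movement.
// Without canReorderItem returning true, endInteractiveMovement() has nothing
// to commit; didReorder is where the transaction's final snapshot shows up.
func configureReordering(on dataSource: UICollectionViewDiffableDataSource<String, Int>) {
    dataSource.reorderingHandlers.canReorderItem = { _ in true }

    dataSource.reorderingHandlers.didReorder = { transaction in
        // transaction.finalSnapshot is the post-move state; persist it here
        // (e.g. write the new ordering back to the model/store).
        let newOrder = transaction.finalSnapshot.itemIdentifiers
        print("new order:", newOrder)
    }
}
```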
Obviously there is a way to make this work if we use UICollectionViewListCells and the .reorder accessory, but that's not an acceptable solution.
Please advise.
I am getting crazy "illegal instruction: 4" errors all over the place.
- Carthage can't update because it is getting this error
- I can't archive because my old Carthage dependencies are getting this error
All of this started happening after switching to Swift 5.2.4.
How can I revert to Swift 5.2.3?
Alternative solutions are also welcome.
Hello all,
Here is a Stack Overflow version of my question if you want to skip to there: https://stackoverflow.com/questions/60049577/corelocationmanager-never-pauses
I'll ask here as well - my app has location services always on, and it's eating up a lot of battery overnight. I'm trying to figure out why, because CLLocationManager is set to pause, but it's not pausing. The relevant code is at the SO link; I'll post it here if people want. Long and short is that I put my phone down for an hour, didn't touch it, watched the logs, and location services NEVER paused. I think this has been happening since iOS 13? I'm more than able to pause location services myself, but my understanding was that iOS would automatically pause them. It definitely was pausing at one point, but it's not anymore. I'm wondering if I have something configured wrong? Any and all suggestions or advice are welcome, thanks.
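For context, the configuration in question is along these lines (a sketch, not the exact code from the SO post; names and the activity type are illustrative):

```swift
import CoreLocation

final class LocationController: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.allowsBackgroundLocationUpdates = true   // requires the location background mode capability
        // The behavior in question: this is supposed to let iOS pause
        // updates automatically when the device isn't moving.
        manager.pausesLocationUpdatesAutomatically = true
        manager.activityType = .fitness
        manager.startUpdatingLocation()
    }

    // If pausing were happening, this delegate callback would fire.
    func locationManagerDidPauseLocationUpdates(_ manager: CLLocationManager) {
        print("location updates paused")
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // ... handle updates ...
    }
}
```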