Posts

Post not yet marked as solved
11 Replies
Hi OOPer,

"Have you defined the method by yourself? Or is it provided by some third-party framework?"

It is indeed defined by a third-party library. I am using the Firebase SDK to fetch data from the database.
Hi OOPer (my hero in this forum :) )

"Does the method getDocuments(completion:) call completion on the main thread?"

Well, let me explain it like this: I trigger the loadReplies function once I have fetched the initial comment objects from the database. The comment objects are stored in the dataSource, which is passed as a parameter. Once loadReplies is called, my view controller doesn't actually care about the replies until they are fully fetched and returned via the delegate pattern. From my understanding, the fetches above are NOT running on the main thread, right? :)
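On that point, a completion's thread can be checked at runtime with Thread.isMainThread, and a background completion can be hopped back to the main queue before touching UI. A minimal, self-contained sketch, with a hypothetical fetchSomething standing in for the Firestore call:

```swift
import Foundation

// Hypothetical stand-in for an SDK call whose completion runs on a
// background queue; we hop to the main queue before delivering the result.
func fetchSomething(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async {
        let result = "data"            // pretend the network work happened here
        DispatchQueue.main.async {     // deliver on the main queue
            completion(result)
        }
    }
}
```

Inside the completion, Thread.isMainThread would now report true, so UI updates are safe there.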
Hi @OOPer, thanks for the reply. My fetchUserById method looks like this:

public func fetchUserById(completionHandler: @escaping (_ user: User) -> (), uid: String?) { // throws
    var userObject = User()
    let _userId = UserUtil.validateUserId(userId: uid)
    USER_COLLECTION?.whereField("uid", isEqualTo: _userId).getDocuments(completion: { (querySnapshot, error) in
        if error != nil {
            print(error?.localizedDescription as Any)
        } else {
            for document in querySnapshot!.documents {
                userObject = User(snapShot: document)
                completionHandler(userObject)
            }
        }
    })
}
Hi @Claude31, well, you can tell me if it is a good idea :) I am open to any good solution. To give you more background: for a given comment object, I load all the replies. Those replies are stored in the commentReplyArray array. For each comment reply, I have to execute the async tasks to fetch the information, put it into a result array, and return it. Yes, the replies should be displayed in a table view for a given comment: I want to add additional cells below the cell with the original comment. Maybe I can share the whole function as well:

func loadReplies(commentItem: CommentItem, isSearchBarFocused: Bool, filteredComments: [CommentItem], dataSource: [CommentItem]) {
    var dataSource = dataSource
    // Get index of the comment
    let index: Int
    if isSearchBarFocused {
        index = filteredComments.firstIndex(of: commentItem)!
    } else {
        index = dataSource.firstIndex(of: commentItem)!
    }

    var ips: [IndexPath] = []
    var results: [Int: CommentItem] = [:]

    self.dataAccessService.fetchRepliesByCommentId(completionHandler: { (commentReplyArray) in
        let group = DispatchGroup()
        for i in 0..<commentReplyArray.count {
            let commentReplyObject = commentReplyArray[i]
            let commentItem = CommentItem()

            group.enter()
            self.dataAccessService.fetchUserById(completionHandler: { (userObject) in
                commentItem.userObject = userObject
                results[i] = commentItem
                defer { group.leave() }
            }, uid: commentReplyObject.userId)

            group.enter()
            self.dataAccessService.fetchDownloadURLOfProfileImage(organizerId: commentReplyObject.userId) { (contentURL) in
                commentItem.userObject.contentURL = contentURL
                defer { group.leave() }
            }

            group.notify(queue: .main) {
                commentItem.commentObject = commentReplyObject
                let array = commentReplyArray.compactMap { _ in results[i] }
                print(array.count)
            }
        }
    }, commentId: commentItem.commentObject.id)
}

This time, I tried to move the let group = DispatchGroup() into the loop.
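For reference, here is a minimal, self-contained sketch of the enter/leave/notify pattern that collects all async results and delivers them once. The fetchUser function is a hypothetical stand-in for the Firestore calls; results are keyed by index so the final array keeps the original order regardless of completion order:

```swift
import Foundation

// Hypothetical async fetch standing in for dataAccessService.fetchUserById.
func fetchUser(id: Int, completion: @escaping (String) -> Void) {
    DispatchQueue.global().async { completion("user\(id)") }
}

// Fan out one task per id, then deliver all results at once, in order.
func loadAll(ids: [Int], completion: @escaping ([String]) -> Void) {
    let group = DispatchGroup()
    var results = [Int: String]()
    let lock = NSLock()                 // completions may run concurrently
    for (index, id) in ids.enumerated() {
        group.enter()
        fetchUser(id: id) { user in
            lock.lock()
            results[index] = user       // keyed by index, not arrival order
            lock.unlock()
            group.leave()
        }
    }
    group.notify(queue: .main) {        // fires once every leave() happened
        completion(ids.indices.compactMap { results[$0] })
    }
}
```

The key design point is that the single group lives outside the loop and notify is registered once, after all enter() calls, so the completion sees the complete result set.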
Hi @OOPer, I don't need a "pause" specifically. I just don't know how to construct the result from the responses of the requests I send.
Post not yet marked as solved
5 Replies
Hi @OOPer, yuhuu :) I found the cause and a solution. All videos longer than 30 seconds were failing. That made me look into the method where I trim the video:

func assetByTrimming(timeOffStart: Double) throws -> AVAsset {
    let duration = CMTime(seconds: timeOffStart, preferredTimescale: 1)
    let timeRange = CMTimeRange(start: CMTime.zero, duration: duration)

    let composition = AVMutableComposition()
    let videoTrack = self.tracks(withMediaType: AVMediaType.video).first

    let size = videoTrack!.naturalSize
    let txf = videoTrack!.preferredTransform

    var recordType = ""
    if (size.width == txf.tx && size.height == txf.ty) {
        recordType = "UIInterfaceOrientationLandscapeRight"
    } else if (txf.tx == 0 && txf.ty == 0) {
        recordType = "UIInterfaceOrientationLandscapeLeft"
    } else if (txf.tx == 0 && txf.ty == size.width) {
        recordType = "UIInterfaceOrientationPortraitUpsideDown"
    } else {
        recordType = "UIInterfaceOrientationPortrait"
    }

    do {
        for track in tracks {
            // let compositionTrack = composition.addMutableTrack(withMediaType: track.mediaType, preferredTrackID: track.trackID)
            // try compositionTrack?.insertTimeRange(timeRange, of: track, at: CMTime.zero)

            if let videoCompositionTrack = composition.addMutableTrack(withMediaType: track.mediaType, preferredTrackID: kCMPersistentTrackID_Invalid) {
                try videoCompositionTrack.insertTimeRange(timeRange, of: videoTrack!, at: CMTime.zero)

                if recordType == "UIInterfaceOrientationPortrait" {
                    let t1: CGAffineTransform = CGAffineTransform(translationX: videoTrack!.naturalSize.height, y: -(videoTrack!.naturalSize.width - videoTrack!.naturalSize.height) / 2)
                    let t2: CGAffineTransform = t1.rotated(by: CGFloat(Double.pi / 2))
                    let finalTransform: CGAffineTransform = t2
                    videoCompositionTrack.preferredTransform = finalTransform
                } else if recordType == "UIInterfaceOrientationLandscapeRight" {
                    let t1: CGAffineTransform = CGAffineTransform(translationX: videoTrack!.naturalSize.height, y: -(videoTrack!.naturalSize.width - videoTrack!.naturalSize.height) / 2)
                    let t2: CGAffineTransform = t1.rotated(by: -CGFloat(Double.pi))
                    let finalTransform: CGAffineTransform = t2
                    videoCompositionTrack.preferredTransform = finalTransform
                } else if recordType == "UIInterfaceOrientationPortraitUpsideDown" {
                    let t1: CGAffineTransform = CGAffineTransform(translationX: videoTrack!.naturalSize.height, y: -(videoTrack!.naturalSize.width - videoTrack!.naturalSize.height) / 2)
                    let t2: CGAffineTransform = t1.rotated(by: -CGFloat(Double.pi / 2))
                    let finalTransform: CGAffineTransform = t2
                    videoCompositionTrack.preferredTransform = finalTransform
                }
            }
        }
    } catch let error {
        throw TrimError("error during composition", underlyingError: error)
    }
    return composition
}

It looks like the output of this method changes the media type, which is then not supported. Instead, to trim the video, I can apply a simpler solution:

let startTime = CMTime(seconds: Double(0), preferredTimescale: 1000)
let endTime = CMTime(seconds: Double(30), preferredTimescale: 1000)
let timeRange = CMTimeRange(start: startTime, end: endTime)

exportSession.outputURL = destination
exportSession.outputFileType = .mov
exportSession.shouldOptimizeForNetworkUse = true
exportSession.timeRange = timeRange // trim video here

Do you know what exactly might be the cause of the error in my assetByTrimming method? Thank you again for your assistance!
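One detail worth noting about the original trim: CMTime stores time as an integer value over a timescale, so preferredTimescale: 1 can only represent whole seconds, which is why 600 or 1000 is the usual choice. A tiny toy model of that representation (not the real CMTime API) makes the truncation visible:

```swift
import Foundation

// Toy model of CMTime's value/timescale representation, to show why a
// timescale of 1 loses fractional seconds. This is an illustration only,
// not the CoreMedia implementation.
struct ToyTime {
    let value: Int64
    let timescale: Int32

    init(seconds: Double, preferredTimescale: Int32) {
        self.value = Int64(seconds * Double(preferredTimescale))
        self.timescale = preferredTimescale
    }

    var seconds: Double { Double(value) / Double(timescale) }
}
```

With a timescale of 1, a 2.5-second offset collapses to 2 whole seconds, while a timescale of 1000 preserves it exactly.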
Hello @OOPer, thanks for the reply. I can always count on you in this forum :) I put in the code snippet. This is what it logs:

AVAssetExportSessionStatus Optional(Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The operation is not supported for this media., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x28376e250 {Error Domain=NSOSStatusErrorDomain Code=-16976 "(null)"}})
Post marked as solved
2 Replies
Hello Claude31, thanks a lot! You were right. The key to the problem was supplying the codes for all languages that should be supported. I have basically modified my function like this now:

public func notifyAboutSubscription(userObject: User, receiverArray: [String]) {
    var receiverArray = removeChallengeCreatorTokenFromArray(receiverArray: receiverArray)
    notificationTypeService.clearReceiverListForNotificationType(completionHandler: { (clearedReceiverArray) in
        receiverArray = clearedReceiverArray
        let source = self.determineUserType(userObject: userObject)
        OneSignal.postNotification([
            "contents": [
                "en": source + FOLLOW_MESSAGE,
                "de": source + " folgt dir jetzt",
                "tr": source + " seni takip ediyor"
            ],
            "include_player_ids": receiverArray
        ])
    }, receiverList: receiverArray, notificationType: NotificationType.follow)
}

Depending on the language set on the target device, the correct message is shown. That saved my day. Have a nice weekend!
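The idea behind the localized "contents" dictionary can be sketched in plain Swift: pick the entry for the device's language code and fall back to English when no translation exists. The helper name here is mine for illustration, not a OneSignal API:

```swift
import Foundation

// Pick a message for a language code, defaulting to "en", mirroring how a
// localized notification "contents" dictionary is meant to behave.
func localizedMessage(contents: [String: String], languageCode: String) -> String? {
    contents[languageCode] ?? contents["en"]
}
```

A device set to German would get the "de" entry, while a device set to an unlisted language such as French would fall back to the English text, which is why shipping all supported codes (plus "en") matters.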
Post not yet marked as solved
8 Replies
I have it like this:

let DARK_BLUE_COLOR = UIColor(red: 10.0/255.0, green: 120.0/255.0, blue: 147.0/255.0, alpha: 1)

defined as a constant to be used everywhere.
Hi Claude31, thanks for the reply. I actually meant the background color of the superview (self.view). BTW: changing the background color of my button works the way you can see above.
Hi Claude31, please have a look at the updated code:

func setupView() {
    // background color of view
    self.view.backgroundColor = DARK_BLUE_COLOR
    self.view.translatesAutoresizingMaskIntoConstraints = false
    createdButton.translatesAutoresizingMaskIntoConstraints = false
    self.view.addSubview(createdButton)
    createdButton.leadingAnchor.constraint(equalTo: self.view.leadingAnchor).isActive = true
    createdButton.widthAnchor.constraint(equalTo: self.view.widthAnchor, constant: self.view.frame.width / 3).isActive = true
    createdButton.topAnchor.constraint(equalTo: self.view.topAnchor, constant: 60).isActive = true
    createdButton.heightAnchor.constraint(equalTo: self.view.heightAnchor, constant: 54).isActive = true
    createdButton.setTitle(CREATED_BUTTON_TITLE, for: .normal)
    createdButton.setTitleColor(DARK_BLUE_COLOR, for: .normal)
    createdButton.backgroundColor = UIColor.white
    createdButton.addTarget(self, action: #selector(fetchMyChallenges), for: .touchDown)
    // createdButton.title
    createdButton.layer.borderWidth = 1.5
    createdButton.layer.borderColor = UIColor.white.cgColor
}
"View is probably black because you did not set its background color."

I am setting the background color of the view, but it is still black.

"If you create constraints (NSLayoutConstraint) yourself, you don't need it."

If I remove self.view.translatesAutoresizingMaskIntoConstraints = false, my button ends up in the wrong position and the background color of the view turns white.

"Why don't you set the constraints in IB? It is much simpler. You can set it proportional, with a multiplier, for line 14."

My view will be much more complex, with subviews etc. I thought that doing some layout in IB and the rest programmatically could be confusing. I want to stick to one UI creation approach, and that would be the programmatic way.
Post not yet marked as solved
7 Replies
Hi Claude, I thought I knew how to deal with frames, but it turned out I didn't. Now I might have found a working solution that gives Auto Layout-like behavior using frames.
I guess I know what to do: I need to define two constants like:

// SCREEN SIZE FOR IPHONE 8
let IPHONE8_SCREEN_WIDTH: CGFloat = 375.0
let IPHONE8_SCREEN_HEIGHT: CGFloat = 667.0

and then change my code above to:

let accountTypeLabel = UILabel()
accountTypeLabel.frame = CGRect(x: width * (53 / IPHONE8_SCREEN_WIDTH),
                                y: height * (172 / IPHONE8_SCREEN_HEIGHT),
                                width: width * (269 / IPHONE8_SCREEN_WIDTH),
                                height: height * (37 / IPHONE8_SCREEN_HEIGHT))
accountTypeLabel.text = ACCOUNT_TYPE_LABEL
accountTypeLabel.textColor = UIColor.white
accountTypeLabel.backgroundColor = DARK_BLUE_COLOR
accountTypeLabel.textAlignment = .center
let accountTypeLabelHeight = accountTypeLabel.frame.size.height
accountTypeLabel.font = UIFont(name: "Kefa", size: accountTypeLabelHeight * (22 / accountTypeLabelHeight))

This would then work on any device, right?
You are absolutely correct.

"In the case of the iPhone 8 screen, the width is 375. The width for the iPhone 11 is 414."

I actually don't map the frame correctly to the screen size. It would have been correct if I had done it like this:

53 / 375 = 0.1413

and then used that in the frame:

... CGRect(x: width * 0.1413) ...

because then it is set relative to the screen even on another device. So what would be the easiest way to make it adjust to the screen size? Working with something like width * 0.1413 is not really easy to read and understand.
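One way to keep the proportional math readable is to do it once in a helper that scales a frame expressed in iPhone 8 design coordinates (375 x 667 points) to the actual screen size. The names here are illustrative, not from the thread:

```swift
import Foundation

// Design-time reference size (iPhone 8, in points).
let designSize = CGSize(width: 375.0, height: 667.0)

// Scale a frame given in design coordinates to the current screen size,
// so call sites can keep the readable iPhone 8 numbers.
func scaledRect(_ design: CGRect, to screen: CGSize) -> CGRect {
    return CGRect(x: design.origin.x / designSize.width * screen.width,
                  y: design.origin.y / designSize.height * screen.height,
                  width: design.size.width / designSize.width * screen.width,
                  height: design.size.height / designSize.height * screen.height)
}
```

A call site then reads naturally, e.g. accountTypeLabel.frame = scaledRect(CGRect(x: 53, y: 172, width: 269, height: 37), to: view.bounds.size), instead of a row of magic ratios like 0.1413.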