Users reporting cloud syncing not working
Hello, I have an app on the store and as of today, users are reporting that cards they collect are no longer syncing with iCloud. I haven't updated my app in about three weeks and the code hasn't changed. The app syncs correctly in the development build but not the release. When I log in to the CloudKit Dashboard, the Development and Production schemas are correct. What can I do?
Orlando Lion's Eye - MTG
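A diagnostic sketch, assuming the app syncs through NSPersistentCloudKitContainer (the post doesn't say which CloudKit API it uses): logging the container's sync events and the iCloud account status on an affected device can show whether the release build is actually reaching the Production environment and a signed-in account. The function name is made up, and eventChangedNotification requires iOS 14.

import CloudKit
import CoreData

// Sketch only: assumes an NSPersistentCloudKitContainer-based stack.
func observeCloudKitSync(for container: NSPersistentCloudKitContainer) {
    // Log every setup/import/export event, including the error when one fails (iOS 14+).
    NotificationCenter.default.addObserver(
        forName: NSPersistentCloudKitContainer.eventChangedNotification,
        object: container,
        queue: .main
    ) { note in
        guard let event = note.userInfo?[NSPersistentCloudKitContainer.eventNotificationUserInfoKey]
                as? NSPersistentCloudKitContainer.Event else { return }
        print("CloudKit \(event.type) succeeded: \(event.succeeded) error: \(String(describing: event.error))")
    }

    // Also confirm the iCloud account is available on the devices that stopped syncing.
    CKContainer.default().accountStatus { status, error in
        print("iCloud account status: \(status), error: \(String(describing: error))")
    }
}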
Replies: 0 · Boosts: 0 · Views: 630 · Activity: Mar ’21
Help with mysterious crash happening in App Review
Hello, I'm new to coding and trying to figure out a crash I'm getting back from App Review. I have been unsuccessful in replicating the crash and in symbolicating the logs, but the reviewers hinted at what happened when my app crashed: they "tapped on the scan button" in my app, which opens a camera view that starts a VNRecognizeTextRequest. My app is 99% SwiftUI and I only started learning to code in March 2020, so the UIKit elements in the camera view are really tough for me. If the reviewer is using an older iPad and I have specified the 4K preset, would that crash their device instead of falling back to the native resolution?

session.sessionPreset = AVCaptureSession.Preset.hd4K3840x2160

Ultimately I'm trying to learn whether the way I've set up the request or session warrants a crash at App Review but not on any device or simulator I have access to, and how I would guard against an older device that runs iOS 14 but whose camera hardware is somehow causing this crash I can't replicate. Thank you for helping! Full code below:

        request = VNRecognizeTextRequest(completionHandler: recognizeTextHandler)

        // setup session
        let session = AVCaptureSession()
        session.beginConfiguration()

        let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .unspecified)
        guard videoDevice != nil, let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice!), session.canAddInput(videoDeviceInput) else {
            print("No camera detected")
            return
        }
        session.addInput(videoDeviceInput)
        session.commitConfiguration()
//        session.sessionPreset = AVCaptureSession.Preset.hd4K3840x2160
        self.captureSession = session

        // setup video output
        let videoDataOutput = AVCaptureVideoDataOutput()
        videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey : NSNumber(value: kCVPixelFormatType_32BGRA)] as [String : Any]
        videoDataOutput.alwaysDiscardsLateVideoFrames = true
        let queue = DispatchQueue(label: "com.MyApp.VideoQueue")
        videoDataOutput.setSampleBufferDelegate(self, queue: queue)
        guard captureSession!.canAddOutput(videoDataOutput) else {
            fatalError()
        }
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
            videoDataOutput.connection(with: AVMediaType.video)?.preferredVideoStabilizationMode = .standard
        } else {
            print("Could not add VDO output")
            return
        }

        do {
            try videoDevice!.lockForConfiguration()
            videoDevice!.videoZoomFactor = 1
            videoDevice!.autoFocusRangeRestriction = .near
            videoDevice!.unlockForConfiguration()
        } catch {
            print("Could not set zoom level due to error: \(error)")
            return
        }

        videoConnection = videoDataOutput.connection(with: .video)
    }

    override class var layerClass: AnyClass {
        AVCaptureVideoPreviewLayer.self
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        return layer as! AVCaptureVideoPreviewLayer
    }

    override func didMoveToSuperview() {
        super.didMoveToSuperview()

        if nil != self.superview {
            self.videoPreviewLayer.session = self.captureSession
            self.videoPreviewLayer.videoGravity = .resizeAspectFill
            self.captureSession?.startRunning()
        } else {
            self.captureSession?.stopRunning()
        }
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        if connection.videoOrientation != .portrait {
            connection.videoOrientation = .portrait
            return
        }

        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let requestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up, options: [:])
            request.recognitionLevel = .accurate
            request.usesCPUOnly = false
            request.usesLanguageCorrection = false
            do {
                try requestHandler.perform([request])
            } catch {
                print(error)
            }
        }
    }
}
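Not a confirmed cause of the App Review crash, but if the 4K preset line is ever re-enabled, one defensive pattern is to ask the session whether it supports the preset before setting it, so older hardware falls back instead of being forced to 4K. A minimal sketch using the same session object from the code above:

        session.beginConfiguration()
        // canSetSessionPreset(_:) reports whether this device/session supports the preset.
        if session.canSetSessionPreset(.hd4K3840x2160) {
            session.sessionPreset = .hd4K3840x2160
        } else {
            session.sessionPreset = .high   // fall back to the best preset the device supports
        }
        session.commitConfiguration()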
Replies: 1 · Boosts: 1 · Views: 1.1k · Activity: Feb ’21
Accessing Core Data in a Shortcut Intent Handler?
Hello, I've been working on adding shortcuts to my app and have had some success with executing API calls and returning strings. But when I try to use the user input to create a Core Data entity, or to check the persistent store, it only returns nil or fails. In the Intents handler class I have declared a @FetchRequest and then try to parse the results in my logic. I've also tried @Environment(\.managedObjectContext) var managedObjectContext, but entities I try to create don't work. The same code works in the app, though. Any ideas how to do this? P.S. My app is written in SwiftUI!
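A sketch of one common setup, not necessarily the poster's: @FetchRequest and @Environment(\.managedObjectContext) only resolve inside SwiftUI views, and an Intents extension runs in its own process, so the handler has to load a Core Data stack itself, typically pointed at an App Group container shared with the app. The group identifier, model name, and class name below are placeholders.

import CoreData

final class IntentCoreDataStack {
    static let shared = IntentCoreDataStack()

    let container: NSPersistentContainer

    private init() {
        container = NSPersistentContainer(name: "Model")   // placeholder model name
        // Point the store at an App Group container so the app and the extension
        // read and write the same database file.
        if let storeURL = FileManager.default
            .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.myapp")?
            .appendingPathComponent("Model.sqlite") {
            container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]
        }
        container.loadPersistentStores { _, error in
            if let error = error {
                print("Failed to load store: \(error)")
            }
        }
    }
}

// Inside the intent handler, fetch with a plain NSFetchRequest instead of @FetchRequest:
// let context = IntentCoreDataStack.shared.container.viewContext
// let request: NSFetchRequest<Entry> = Entry.fetchRequest()
// let results = try? context.fetch(request)

For this to work, the app target also has to load its store from the same App Group URL; otherwise the extension sees an empty database.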
Replies: 1 · Boosts: 1 · Views: 913 · Activity: Dec ’20
Data flow and managed objects in SwiftUI
Hello, I'm still struggling with passing data around in SwiftUI with Core Data. In my test app there's currently a button on top to add a group of six grocery entries. I'm trying to show the list of "entries" in the view, and when one is tapped a detail .sheet modal is presented with the tapped/chosen item passed in. Then, when the entry detail is tapped, the item is deleted from storage. I'm having two distinct issues:

1. The detail view only ever shows the first item in the entries. In this case, "Butter Detail" is always presented.
2. When the detail text is tapped, the app crashes with the error "An NSManagedObjectContext cannot delete objects in other contexts".

I thought @StateObject could help me here, but I'm still unclear on its implementation. Currently the .sheet modal is within the ForEach, because otherwise it can't find the "entry in". Should I be storing the "entry in" in an @State var and passing that into the modal? I don't know how to store an entity in that manner.

struct ContentView: View {
    @Environment(\.managedObjectContext) var managedObjectContext
    @FetchRequest(entity: Entry.entity(),
                  sortDescriptors: [NSSortDescriptor(keyPath: \Entry.name, ascending: true)],
                  predicate: nil)
    var entries: FetchedResults<Entry>
    @State var isPresented = false

    var body: some View {
        VStack {
            Button(action: {
                Entry.create(in: managedObjectContext)
            }) {
                Text("Button")
            }
            Text("\(entries.count)")
            List {
                ForEach(entries, id: \.self) { entry in
                    Text(entry.name ?? "Unknown")
                        .onTapGesture {
                            isPresented = true
                        }
                        .sheet(isPresented: $isPresented) {
                            Detail(entry: entry)
                        }
                }
            }
        }
    }
}

struct Detail: View {
    @Environment(\.managedObjectContext) var managedObjectContext
    @StateObject var entry: Entry

    var body: some View {
        Text("\(entry.name ?? "Error") Detail")
            .onTapGesture {
                do {
                    managedObjectContext.delete(entry)
                    try managedObjectContext.save()
                } catch {
                    print(error)
                }
            }
    }
}
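A sketch of one common fix, reusing the Entry entity and Detail view from the code above: keep the tapped entry in an @State property and drive a single .sheet(item:) outside the ForEach, so the rows don't all share one isPresented flag and the sheet receives the row that was actually tapped. Passing the managed object context into the sheet explicitly keeps Detail deleting on the same context the entry belongs to. This assumes Entry conforms to Identifiable (Xcode's generated NSManagedObject subclasses usually declare this; otherwise an empty extension Entry: Identifiable {} can be added).

import SwiftUI
import CoreData

struct ContentView: View {
    @Environment(\.managedObjectContext) var managedObjectContext
    @FetchRequest(entity: Entry.entity(),
                  sortDescriptors: [NSSortDescriptor(keyPath: \Entry.name, ascending: true)])
    var entries: FetchedResults<Entry>
    @State private var selectedEntry: Entry?   // the tapped row, nil when no sheet is shown

    var body: some View {
        VStack {
            Button("Button") {
                Entry.create(in: managedObjectContext)
            }
            Text("\(entries.count)")
            List {
                ForEach(entries, id: \.self) { entry in
                    Text(entry.name ?? "Unknown")
                        .onTapGesture { selectedEntry = entry }
                }
            }
        }
        // One sheet for the whole list; it is handed the entry that was tapped.
        .sheet(item: $selectedEntry) { entry in
            Detail(entry: entry)
                .environment(\.managedObjectContext, managedObjectContext)
        }
    }
}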
Replies: 4 · Boosts: 0 · Views: 1.8k · Activity: Jul ’20
Assign an array or dictionary as a Core Data attribute?
Hello, I'm very new to Core Data and have been trying to implement it in my SwiftUI app. I have been trying to learn and find a solution for assigning an array or dictionary as an attribute of an entity, but have had no success. Is there an example somewhere of the best way to accomplish this? For example, my object can have an attribute like color, but some objects can have multiple colors. Thanks!
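A sketch, assuming a hypothetical model: an "Item" entity with a "name" attribute, a "ColorTag" entity with a "name" attribute, and a to-many relationship Item.colors. Modeling the repeated values as a related entity is the idiomatic Core Data approach; a Transformable attribute holding a [String] (configured with NSSecureUnarchiveFromDataTransformer in the model editor) is a lighter-weight alternative when the values never need to be queried on their own.

import CoreData

// Placeholder entity and attribute names; the accessors are the ones Xcode generates
// for the hypothetical model described above.
func addItem(named name: String, colorNames: [String], in context: NSManagedObjectContext) throws {
    let item = Item(context: context)
    item.name = name
    for colorName in colorNames {
        let color = ColorTag(context: context)
        color.name = colorName
        item.addToColors(color)   // generated to-many accessor for the relationship
    }
    try context.save()
}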
Replies: 1 · Boosts: 0 · Views: 3k · Activity: Jun ’20
How do I implement @StateObject in AsyncImage?
Hello, I had a lab session earlier today to try to handle an issue with images getting dumped in SwiftUI and always seeing the placeholder image instead of the loaded image. The engineer recommended @StateObject as a replacement for @ObservedObject in the AsyncImage, but I'm getting an error. Is there a recommended way to use this new feature of SwiftUI for async images? With @StateObject in place of @ObservedObject, 'loader' below throws the error "Cannot assign to property: 'loader' is a get-only property".

struct AsyncImage<Placeholder: View>: View {
    @ObservedObject private var loader: ImageLoader
    private let placeholder: Placeholder?

    init(url: URL, placeholder: Placeholder? = nil) {
        loader = ImageLoader(url: url)
        self.placeholder = placeholder
    }

    var body: some View {
        image
            .onAppear(perform: loader.load)
            .onDisappear(perform: loader.cancel)
    }

    private var image: some View {
        Group {
            if loader.image != nil {
                Image(uiImage: loader.image!)
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } else {
                Image("Placeholder")
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            }
        }
    }
}

Thanks!
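A sketch of the usual fix, reusing the ImageLoader class from the post (not defined here): a @StateObject can't be assigned through its wrapped property in init, which is what produces the "get-only property" error. Instead, initialize the wrapper's storage through the underscored property. StateObject(wrappedValue:) takes an autoclosure, so the loader is only created once for the view's lifetime.

import SwiftUI

struct AsyncImage<Placeholder: View>: View {
    @StateObject private var loader: ImageLoader
    private let placeholder: Placeholder?

    init(url: URL, placeholder: Placeholder? = nil) {
        // Initialize the property wrapper's storage directly instead of assigning to loader.
        _loader = StateObject(wrappedValue: ImageLoader(url: url))
        self.placeholder = placeholder
    }

    var body: some View {
        image
            .onAppear(perform: loader.load)
            .onDisappear(perform: loader.cancel)
    }

    private var image: some View {
        Group {
            if let uiImage = loader.image {
                Image(uiImage: uiImage)
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            } else {
                Image("Placeholder")
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            }
        }
    }
}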
Replies: 6 · Boosts: 0 · Views: 3.4k · Activity: Jun ’20