Post · Replies · Boosts · Views · Activity
MFMessageComposeViewController, SwiftUI, with attachment
My app needs to send iMessages with an attached data file. The view comes up modally and the user needs to press the send button. When the send button is tapped, an error occurs and neither the attachment nor the message is delivered. I have tried three implementations of UIViewControllerRepresentable for MFMessageComposeViewController that worked on iOS 15 and 16, but not on 17. If I make the simplest app that shows the problem, it reliably fails on all iOS versions tested. If I remove the attachment, the message gets sent. In contrast, the equivalent UIViewControllerRepresentable for email works perfectly every time. After struggling for weeks to get it to work, I am beginning to believe this is a timing error in iOS. I have even tried, unsuccessfully, to add dispatch queue delays. Has anybody else written a SwiftUI-based app that can reliably attach a file to an iMessage send?

UIViewControllerRepresentable:

```swift
import SwiftUI
import MessageUI
import UniformTypeIdentifiers

struct AttachmentData: Codable {
    var data: Data
    var mimeType: UTType
    var fileName: String
}

struct MessageView: UIViewControllerRepresentable {
    @Environment(\.presentationMode) var presentation
    @Binding var recipients: [String]
    @Binding var body: String
    @Binding var attachments: [AttachmentData]
    @Binding var result: Result<MessageComposeResult, Error>?

    func makeUIViewController(context: Context) -> MFMessageComposeViewController {
        let vc = MFMessageComposeViewController()
        print("canSendAttachments = \(MFMessageComposeViewController.canSendAttachments())")
        vc.recipients = recipients
        vc.body = body
        vc.messageComposeDelegate = context.coordinator
        for attachment in attachments {
            vc.addAttachmentData(attachment.data,
                                 typeIdentifier: attachment.mimeType.identifier,
                                 filename: attachment.fileName)
        }
        return vc
    }

    func updateUIViewController(_ uiViewController: MFMessageComposeViewController, context: Context) { }

    func makeCoordinator() -> Coordinator {
        Coordinator(presentation: presentation, result: $result)
    }

    class Coordinator: NSObject, MFMessageComposeViewControllerDelegate {
        @Binding var presentation: PresentationMode
        @Binding var result: Result<MessageComposeResult, Error>?

        init(presentation: Binding<PresentationMode>, result: Binding<Result<MessageComposeResult, Error>?>) {
            _presentation = presentation
            _result = result
        }

        func messageComposeViewController(_ controller: MFMessageComposeViewController,
                                          didFinishWith result: MessageComposeResult) {
            defer { $presentation.wrappedValue.dismiss() }
            switch result {
            case .cancelled:
                print("Message cancelled")
                self.result = .success(result)
            case .sent:
                print("Message sent")
                self.result = .success(result)
            case .failed:
                print("Failed to send")
                self.result = .success(result)
            @unknown default:
                fatalError()
            }
        }
    }
}
```

SwiftUI interface:

```swift
import SwiftUI
import MessageUI

struct ContentView: View {
    @State private var isShowingMessages = false
    @State private var recipients = ["4085551212"]
    @State private var message = "Hello from California"
    @State private var attachment = [AttachmentData(
        data: Data("it is not zip format, however iMessage won't allow the recipient to open it if extension is not a well-known extension, like .zip".utf8),
        mimeType: .zip,
        fileName: "test1.zip")]
    @State var result: Result<MessageComposeResult, Error>? = nil

    var body: some View {
        VStack {
            Button {
                isShowingMessages.toggle()
            } label: {
                Text("Show Messages")
            }
            .sheet(isPresented: $isShowingMessages) {
                MessageView(recipients: $recipients, body: $message, attachments: $attachment, result: $result)
            }
            .onChange(of: isShowingMessages) { newValue in
                if !isShowingMessages {
                    switch result {
                    case .success(let type):
                        switch type {
                        case .cancelled:
                            print("canceled")
                        case .sent:
                            print("sent")
                        default:
                            break
                        }
                    default:
                        break
                    }
                }
            }
        }
    }
}

#Preview {
    ContentView()
}
```
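One variation worth noting: attaching by file URL instead of raw data. This is only a sketch, not a confirmed fix; writing to the temporary directory and the particular error handling here are my assumptions, though `addAttachmentURL(_:withAlternateFilename:)` is a real MFMessageComposeViewController API.

```swift
import MessageUI

// Sketch: write the attachment Data to a temporary file and attach it
// by URL rather than calling addAttachmentData. The temporary-file
// approach is an assumption, not a documented workaround.
func attachByURL(_ attachment: AttachmentData, to vc: MFMessageComposeViewController) {
    let tmpURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(attachment.fileName)
    do {
        try attachment.data.write(to: tmpURL)
        // addAttachmentURL returns false if the attachment was rejected,
        // which is worth logging when chasing an intermittent failure.
        let ok = vc.addAttachmentURL(tmpURL, withAlternateFilename: attachment.fileName)
        print("addAttachmentURL succeeded: \(ok)")
    } catch {
        print("Could not write temporary attachment: \(error)")
    }
}
```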
1 reply · 0 boosts · 329 views · Sep ’24
How do I get iMessage to recognize my custom file format?
I have an app that shares files between users by attaching the file to either an email or an iMessage. The original implementation attached a JSON file, but I was able to build a smaller binary file, allowing me to fit more content within the attachment size limit. Mail attachments work just fine.

I've modified the Info.plist to define the app-specific file extension, but no matter where I define it, iMessage refuses to put up a preview from which the user can select my app to open the file. I tried putting it in a document type with the public.text and public.data roles. I tried adding it to the Exported Type Identifiers with a public-mime-type. I made sure the identifiers all have the reverse-DNS prefix com...

My temporary iMessage workaround is to name the file with .zip, a well-known public extension. The file doesn't contain valid zip data; declaring it as a zip simply lets the user pick my app to open it, because I added zip as an exported type identifier. But I don't want the user to think my app can also open zip files. I would be happy if the preview looked similar to what is done for zip files: a file icon, but with my app's name inside the icon.

Any help with this configuration nightmare is appreciated.

Mike
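For reference, the general shape of declaration a custom extension needs is an exported type that conforms to public.data plus a document type that lists it. This is only a sketch of that shape; the identifier `com.example.myapp.archive`, the extension `myext`, and the type name are placeholders, not the actual values from the project.

```xml
<!-- Sketch only: identifiers, extension, and names are placeholders. -->
<key>UTExportedTypeDeclarations</key>
<array>
  <dict>
    <key>UTTypeIdentifier</key>
    <string>com.example.myapp.archive</string>
    <key>UTTypeDescription</key>
    <string>MyApp Archive</string>
    <key>UTTypeConformsTo</key>
    <array>
      <string>public.data</string>
    </array>
    <key>UTTypeTagSpecification</key>
    <dict>
      <key>public.filename-extension</key>
      <array>
        <string>myext</string>
      </array>
    </dict>
  </dict>
</array>
<key>CFBundleDocumentTypes</key>
<array>
  <dict>
    <key>CFBundleTypeName</key>
    <string>MyApp Archive</string>
    <key>LSItemContentTypes</key>
    <array>
      <string>com.example.myapp.archive</string>
    </array>
    <key>LSHandlerRank</key>
    <string>Owner</string>
  </dict>
</array>
```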
0 replies · 0 boosts · 163 views · Aug ’24
Need AirPlay capability for Image and Live Photos
The Photos app on iOS is capable of playing Live Photos and showing images on another screen. My SwiftUI app is able to AirPlay videos, and I can make the button appear or not using AVPlayer's isExternalPlaybackActive boolean. However, that boolean is not available for images and Live Photos. I've tried using an AVRoutePickerView(), which gives me a button for the user to start/stop AirPlay, but I have not found a way to associate it with SwiftUI's Image or my LivePhotoView:

```swift
struct LivePhotoView: UIViewRepresentable {
    var livephoto: PHLivePhoto
    @Binding var playback: Bool
    @AppStorage("playAudio") var playAudio = userDefaults.bool(forKey: "playAudio")

    func makeUIView(context: Context) -> PHLivePhotoView {
        let view = PHLivePhotoView()
        try! AVAudioSession.sharedInstance().setCategory({ playAudio ? .playback : .soloAmbient }())
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ lpView: PHLivePhotoView, context: Context) {
        lpView.livePhoto = livephoto
        if playback {
            lpView.isMuted = false
            try! AVAudioSession.sharedInstance().setCategory({ playAudio ? .playback : .soloAmbient }())
            try! AVAudioSession.sharedInstance().setActive(true)
            lpView.isMuted = false
            lpView.startPlayback(with: .full)
        } else {
            lpView.stopPlayback()
        }
    }
}

struct RouteButtonView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.tintColor = .gray
        routePickerView.prioritizesVideoDevices = true
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {
        // No update needed
    }
}
```

Am I missing something? If it's a system internal API only available to the Photos app, why?

Mike
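One direction that may be worth exploring, sketched here under assumptions rather than as a confirmed solution: when the user enables screen mirroring, iOS exposes the AirPlay display as a second UIScreen, and an app can place its own UIWindow on it. The `ExternalDisplayController` name and the image-view content are illustrative, and this is not the mechanism the Photos app appears to use.

```swift
import UIKit

// Sketch: render our own content on an AirPlay/external UIScreen.
// Relies on the user turning on screen mirroring; the controller
// name and the UIImageView content are hypothetical.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    func startObserving() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                self?.show(on: screen)
        }
        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.externalWindow = nil  // tear down when AirPlay ends
        }
    }

    private func show(on screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        let imageView = UIImageView(frame: screen.bounds)
        imageView.contentMode = .scaleAspectFit
        // Hypothetical: set imageView.image (or host a PHLivePhotoView)
        // with whatever the SwiftUI view is currently showing.
        let vc = UIViewController()
        vc.view = imageView
        window.rootViewController = vc
        window.isHidden = false
        externalWindow = window
    }
}
```

Note that UIScreen-based external display handling is superseded by the scene lifecycle on recent iOS versions, so this sketch is best treated as a starting point.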
1 reply · 1 boost · 533 views · May ’24
WatchConnectivity with simulator does not connect
I'm hoping to avoid updating my devices to the beta software, and I've made good progress with upgrading to Swift 2 and getting my iPhone app and watch app to watchOS 2. I'm at the point where I must deal with mirroring my Core Data from the iPhone to the watch.

The problem is I cannot get the iPhone simulator and the Apple Watch simulator to agree on the WatchConnectivity statuses. I've written a bare-bones test program that shows all the status fields from both sides, and this is what it reports:

I've had this problem with Xcode 7 beta 4, and just upgraded to beta 5 with no difference. I've restarted the simulators, reset content and settings, and rebooted the Mac. Nothing helps. For a few hours earlier this week, in beta 4, iOS was indicating installed and I even saw the URL. I was getting error 7005, which was fixed by resetting the simulators. Then I succeeded in sending, and the watch simulator received one test dictionary, even though the iOS simulator reported the watch as not reachable. But it did not remain stable long enough for me to work on serializing the Core Data records.

I've tried various suggestions on this forum. For example: run the WatchKit scheme first, then control-run the iOS app.
Has anyone reliably used the simulators to transmit a WCApplicationContext to the watch?

Code snippets from the iOS app:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    if (WCSession.isSupported()) {
        print("WCSession is Supported")
        let session = WCSession.defaultSession()
        session.delegate = self
        session.activateSession()
        updateStatus(statusReport)
    }
}

func updateAppleWatch() {
    print("updateAppleWatch")
    let session = WCSession.defaultSession()
    print("Paired: \(session.paired) Installed: \(session.watchAppInstalled) Reachable: \(session.reachable)")
    if session.watchAppInstalled {
        print("watchAppInstalled")
        print("  watch directory: \(session.watchDirectoryURL)")
        let theDict = [
            "0": ["title": "t1", "subtitle": "s1"],
            "1": ["title": "t2", "subtitle": "s2"]]
        do {
            print("Sending \(theDict)")
            try session.updateApplicationContext(theDict)
            sendReport.text = "Send \(sendCount++) Succeeded"
        } catch let err as NSError {
            print("updateApplicationContext error \(err)")
            sendReport.text = "\(err)"
        }
    }
}
```

Code snippets from the watch app:

```swift
override func awakeWithContext(context: AnyObject?) {
    super.awakeWithContext(context)
    let session = WCSession.defaultSession()
    session.delegate = self
    session.activateSession()
    print("Paired: \(session.paired)\nInstalled: \(session.watchAppInstalled)\nReachable: \(session.reachable)\nURL: \(session.watchDirectoryURL)\nDelegate: \(session.delegate)")
}

func session(session: WCSession, didReceiveApplicationContext applicationContext: [String: AnyObject]) {
    print(applicationContext)
}
```
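Since the eventual goal is serializing Core Data records into the application context, here is a minimal sketch of that step, written in the same Swift 2 / WatchConnectivity style as the snippets above. The entity name "Item" and its "title"/"subtitle" string attributes are assumptions for illustration.

```swift
import CoreData

// Sketch: flatten managed objects into a plist-safe dictionary suitable
// for updateApplicationContext. Entity "Item" and its attributes are
// hypothetical; only plist types (String, NSNumber, NSDate, NSData,
// arrays/dictionaries of those) survive the transfer.
func applicationContextFromCoreData(context: NSManagedObjectContext) -> [String: AnyObject] {
    let request = NSFetchRequest(entityName: "Item")
    var payload = [String: AnyObject]()
    do {
        let items = try context.executeFetchRequest(request) as! [NSManagedObject]
        for (index, item) in items.enumerate() {
            payload["\(index)"] = [
                "title": item.valueForKey("title") as? String ?? "",
                "subtitle": item.valueForKey("subtitle") as? String ?? ""]
        }
    } catch {
        print("Fetch failed: \(error)")
    }
    return payload
}
```

The resulting dictionary can be passed straight to `session.updateApplicationContext(_:)` in the existing `updateAppleWatch()` above.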
8 replies · 0 boosts · 8.6k views · Aug ’15