Posts

Post not yet marked as solved
0 Replies
7 Views
Hello! We have an app that uses the SpeechKit framework, specifically local on-device speech recognition of audio files in a user-selected language. Up until recently it worked as expected. However, after updating one of our testing devices to iOS 17.4.1, we found that local recognition on it stopped working completely. The error we are getting has code 102 and its localised description reads: "Failed to access assets". That sounds just like a rare though known issue in previous iOS versions. The workaround was inconvenient for our users, but at least it worked: they had to go to the system settings and toggle the dictation setting in the keyboard section. Right now no tweaks of this sort appear to fix the situation. We even tried resetting the device settings (not a factory reset, though). The error persists. It appears on one of our devices 100% of the time, halting the local recognition process. It sometimes shows on other devices for some particular languages too, but not for other languages.

As this is a UX-breaking bug for our app, today I decided to check the Console app logs at the moment of the recognition attempt. There are lots of errors with code 1101, which from our research appear to be general notifications about local recognition setup problems. Filtering out the 1101 errors leaves some interesting lines that are (almost) never mentioned on any searchable webpage. I assume they come from the private API calls that the SpeechKit framework executes under the hood:

default localspeechrecognition -[UAFAssetSet assetNamed:]_block_invoke 9067C4F1-0B29-4A57-85DD-F8740DF7C344: No assets in asset set com.apple.siri.understanding
default localspeechrecognition -[UAFAssetSet assetNamed:] 9067C4F1-0B29-4A57-85DD-F8740DF7C344: Returning com.apple.siri.asr.assistant from source none
error localspeechrecognition -[SFEntitledAssetManager _assetWithAssetConfig:regionId:] No asset found with name: com.apple.siri.asr.assistant, asset set: com.apple.siri.understanding, usage: <private>
error localspeechrecognition +[LSRConnection modelRootWithLanguage:clientID:modelOverrideURL:returningAssetType:error:] Fetch asset error (null)
error localspeechrecognition -[LSRConnection prepareRecognizerWithLanguage:recognitionOverrides:modelOverrideURL:anyConfiguration:task:clientID:error:] modelRoot is nil (null)
default OurApp [0x113e96d40] invalidated because the current process cancelled the connection by calling xpc_connection_cancel()

It looks like some language-model-related problems appeared after the device was updated to 17.4.1. Settings -> General -> Keyboard -> Dictation Languages appears to be configured correctly and the dictation toggle is on; we tried tweaking all these settings, rebooting the device, and resetting the device settings. However, the log lines still tell us that something is wrong with the private resources of the SpeechKit framework. We are very concerned, as speech recognition is the core of our application's logic, and we don't understand the possible scale of this faulty behaviour (rare occurrences / some users / all users?) or how we can fix it to give our users the expected behaviour.
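For reference, here is a minimal sketch of the on-device recognition path being described, assuming the post refers to Apple's Speech framework (import Speech); the locale identifier, the file URL, and the assumption that speech-recognition authorization has already been granted are placeholders, not details from the post:

import Speech

// Minimal sketch, assuming authorization was already granted and the language is
// chosen by the user elsewhere. If the on-device model assets are missing, the
// task reports an error like the "Failed to access assets" (code 102) above.
func recognizeLocally(fileURL: URL, localeID: String) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: localeID)),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is not available for \(localeID)")
        return
    }
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true   // force the local model, no server fallback
    _ = recognizer.recognitionTask(with: request) { result, error in
        if let error = error {
            print("Recognition failed:", error.localizedDescription)
        } else if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}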
Posted
by
Post not yet marked as solved
0 Replies
7 Views
We're developing our first App Clip and having issues with accessing the keychain. At the moment, we've been running the App Clip from Xcode with the environment variable _XCAppClipURL set to a URL the App Clip responds to.

This is the failing call:

var addquery: [CFString: Any] = [
    kSecClass: kSecClassGenericPassword,
    kSecAttrLabel: "secretKey",
    kSecAttrAccount: userID]

if let accessGroup = accessGroup {
    addquery[kSecAttrAccessGroup] = accessGroup
}

SecItemDelete(addquery as CFDictionary)

addquery[kSecAttrAccessible] = kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly
addquery[kSecUseDataProtectionKeychain] = true
addquery[kSecValueData] = key

let status = SecItemAdd(addquery as CFDictionary, nil)
guard status == errSecSuccess else {
    throw MercuryError(message: "Failed storing db encryption key: \(status)")
}

The status that is reported has the value -34018, which I haven't tried to compare against the massive list of constants. The trouble seems to be kSecAttrAccessGroup, to which we're passing our App Group value, e.g. group.com.bundle.different.string as shown below. Both the full app and the clip belong to this group.

We've also tried setting kSecAttrAccessGroup to the keychain access group set up for the full app, but we can't seem to add a keychain sharing entitlement to the App Clip... at least not in the UI. So we're unsure if setting the parameter to that value is supposed to work in this case.

If instead we avoid setting kSecAttrAccessGroup altogether, the call to SecItemAdd succeeds in the App Clip, and the full app seems to be able to later access the stored key. But this is less safe, right? Given that previous versions of the full app are currently using the App Group for this call, changing it would be a disruption for our users. For completeness, the full app works with the App Group passed in or not, as always.

Setup Stuff

Full App Bundle ID is something like: com.bundle.ident
App Clip has the default suffix added: com.bundle.ident.Clip

App Clip Entitlements:

<dict>
    <key>com.apple.developer.associated-domains</key>
    <array>
        <string>webcredentials:domain.com?mode=developer</string>
        <string>webcredentials:domain2.com?mode=developer</string>
        <string>applinks:domain.com?mode=developer</string>
        <string>applinks:domain2.com?mode=developer</string>
    </array>
    <key>com.apple.developer.parent-application-identifiers</key>
    <array>
        <string>$(AppIdentifierPrefix)com.bundle.ident</string>
    </array>
    <key>com.apple.security.application-groups</key>
    <array>
        <string>group.com.bundle.different.string</string>
    </array>
</dict>

I don't seem to be able to add the keychain-access-groups key to this one, but, given that we're scoping the keychain access to the App Group, it shouldn't matter(?).

Relevant Full App Entitlements:

<dict>
    <key>com.apple.developer.associated-domains</key>
    <array>
        <string>webcredentials:domain.com?mode=developer</string>
        <string>webcredentials:domain2.com?mode=developer</string>
        <string>applinks:domain.com?mode=developer</string>
        <string>applinks:domain2.com?mode=developer</string>
    </array>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.application-groups</key>
    <array>
        <string>group.com.bundle.different.string</string>
    </array>
    <key>keychain-access-groups</key>
    <array>
        <string>$(AppIdentifierPrefix)com.bundle.third.string</string>
    </array>
</dict>

What's missing here is com.apple.developer.associated-appclip-app-identifiers, but I understand that's added automatically when I archive, so it shouldn't be relevant here.
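For context, here is a minimal sketch of the read-back side, using the same placeholder names (userID, accessGroup) and assuming the full app queries with the same item attributes; this is illustrative only, not the poster's actual code:

import Foundation
import Security

// Fetch the generic-password item the App Clip stored, optionally scoped to the
// shared App Group access group.
func loadKey(userID: String, accessGroup: String?) -> Data? {
    var query: [CFString: Any] = [
        kSecClass: kSecClassGenericPassword,
        kSecAttrLabel: "secretKey",
        kSecAttrAccount: userID,
        kSecUseDataProtectionKeychain: true,
        kSecReturnData: true,
        kSecMatchLimit: kSecMatchLimitOne
    ]
    if let accessGroup = accessGroup {
        query[kSecAttrAccessGroup] = accessGroup
    }
    var item: CFTypeRef?
    let status = SecItemCopyMatching(query as CFDictionary, &item)
    return status == errSecSuccess ? (item as? Data) : nil
}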
Posted
by
Post not yet marked as solved
0 Replies
10 Views
I am really stuck. I uploaded my Mac Catalyst app and the binary passes validation beforehand. I submit it for review. After being in "Waiting for Review" for a couple of minutes it is rejected with "Invalid Binary", and an email arrives saying "ITMS-90053: This bundle is invalid - The bundle identifier is already in use by a different software package." The only app using the same bundle identifier is the iOS version, to which I added the Mac platform.
Posted
by
Post not yet marked as solved
0 Replies
17 Views
Has anyone else been experiencing issues with the WeatherKit API recently? Since Saturday we've been seeing about 1.5% of our requests time out. Apple hasn't posted anything on their status page https://developer.apple.com/system-status/ and the issue doesn't seem to be getting any better.
Posted
by
Post not yet marked as solved
0 Replies
16 Views
Currently I am trying to create some shortcuts for my iOS 17 app. The AppIntent looks like this:

class PostClass {
    var request: URLRequest

    public init() {
        let url = URL(string: "http://myurl")!
        var request = URLRequest(url: url)
        request.httpMethod = "POST"

        struct Message: Encodable {
            let device_type: String
        }

        let message = Message(device_type: "device")
        let data = try! JSONEncoder().encode(message)
        request.httpBody = data
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        self.request = request
    }
}

var activateDevice = PostClass()

@available(iOS 17, *)
struct ActivateIntent: AppIntent {
    static let title: LocalizedStringResource = "Activate"

    func perform() async throws -> some IntentResult {
        let task = URLSession.shared.dataTask(with: activateDevice.request) { data, response, error in
            let statusCode = (response as! HTTPURLResponse).statusCode
            if statusCode == 200 {
                print("SUCCESS")
            } else {
                print("FAILURE")
            }
        }
        task.resume()
        return .result()
    }
}

Unfortunately, the shortcut throws an error after every second execution. I looked into the .ips file and saw the following traceback:

Thread 2 Crashed:
0   Runner                   0x1028ddd30 closure #1 in ActivateIntent.perform() + 388
1   CFNetwork                0x1a6f39248 0x1a6f2a000 + 62024
2   CFNetwork                0x1a6f57210 0x1a6f2a000 + 184848
3   libdispatch.dylib        0x1adced13c _dispatch_call_block_and_release + 32
4   libdispatch.dylib        0x1adceedd4 _dispatch_client_callout + 20
5   libdispatch.dylib        0x1adcf6400 _dispatch_lane_serial_drain + 748
6   libdispatch.dylib        0x1adcf6f64 _dispatch_lane_invoke + 432
7   libdispatch.dylib        0x1add01cb4 _dispatch_root_queue_drain_deferred_wlh + 288
8   libdispatch.dylib        0x1add01528 _dispatch_workloop_worker_thread + 404
9   libsystem_pthread.dylib  0x201dd4f20 _pthread_wqthread + 288
10  libsystem_pthread.dylib  0x201dd4fc0 start_wqthread + 8

Is there any way to locate the error with this information? Does it have something to do with my code not being thread-safe? Or is there an internal bug? Grateful for any help, thanks in advance!
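For comparison, a hedged sketch of an async/await variant of perform() (not a confirmed fix): it avoids the force cast on response, which is nil when the request fails, and keeps the network call inside perform() so the intent does not return before the request completes. It reuses the activateDevice instance defined above:

import AppIntents
import Foundation

// Sketch only; names and behaviour follow the code above.
@available(iOS 17, *)
struct ActivateIntentSketch: AppIntent {
    static let title: LocalizedStringResource = "Activate"

    func perform() async throws -> some IntentResult {
        let (_, response) = try await URLSession.shared.data(for: activateDevice.request)
        if (response as? HTTPURLResponse)?.statusCode == 200 {
            print("SUCCESS")
        } else {
            print("FAILURE")
        }
        return .result()
    }
}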
Posted
by
Post not yet marked as solved
0 Replies
24 Views
Hello, we recently added a build plugin to our Swift package. We have an Xcode project that uses the Swift package as a dependency, and we build that Xcode project from the command line using xcodebuild build-for-testing. After we added the build plugin to the Swift package, xcodebuild now fails and says this:

If I remove the build plugin from the Swift package, then xcodebuild build-for-testing builds successfully. It is also worth noting that when we run the xcodebuild build-for-testing command we pass in multiple -destination flags. If we pass one -destination flag, then xcodebuild succeeds even with the build plugin. So the issue seems to be the combination of multiple -destination flags and a build plugin. This sounds like a bug within xcodebuild. Any thoughts or ideas on why this isn't working would be helpful!

Here is an example of what our full xcodebuild command looks like:

xcodebuild build-for-testing -project "SomeProject" -scheme "SomeScheme" -configuration Debug -destination 'SomeDestination1' -destination 'SomeDestination2' -destination 'SomeDestination3' -disableAutomaticPackageResolution -onlyUsePackageVersionsFromResolvedFile -skipPackageUpdates -skipPackagePluginValidation -allowProvisioningUpdates DEVELOPMENT_TEAM=SomeTeam

Thank you!
Posted
by
Post not yet marked as solved
1 Reply
26 Views
Whenever I open a Unix-domain socket (e.g. /var/run/usbmuxd) I get the following errors in the Xcode console:

nw_socket_set_common_sockopts [C13:1] setsockopt SO_NECP_CLIENTUUID failed [22: Invalid argument]
Type: Error | Timestamp: 2024-04-18 15:48:44.813556-04:00 | Process: TH Dev | Library: Network | Subsystem: com.apple.network | Category: connection | TID: 0x425e2

nw_socket_set_common_sockopts setsockopt SO_NECP_CLIENTUUID failed [22: Invalid argument]
Type: Error | Timestamp: 2024-04-18 15:48:44.813682-04:00 | Process: TH Dev | Library: Network | Subsystem: com.apple.network | Category: | TID: 0x425e2

nw_socket_copy_info [C13:1] getsockopt TCP_INFO failed [102: Operation not supported on socket]
Type: Error | Timestamp: 2024-04-18 15:48:44.814484-04:00 | Process: TH Dev | Library: Network | Subsystem: com.apple.network | Category: connection | TID: 0x425e2

nw_socket_copy_info getsockopt TCP_INFO failed [102: Operation not supported on socket]
Type: Error | Timestamp: 2024-04-18 15:48:44.814523-04:00 | Process: TH Dev | Library: Network | Subsystem: com.apple.network | Category: | TID: 0x425e2

While communication to and from the socket seems to work, the operations leading to these errors shouldn't be attempted if the socket doesn't support them.
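For reference, a minimal sketch of the Network-framework call pattern that produces log lines like these; the path matches the example above, while the use of .tcp parameters over the Unix-domain endpoint is an assumption about how the connection is opened:

import Network

// Sketch: connect to a Unix-domain socket with NWConnection. The logged
// SO_NECP_CLIENTUUID / TCP_INFO messages come from inside the framework and,
// as noted in the post, do not stop the connection from working.
let connection = NWConnection(to: .unix(path: "/var/run/usbmuxd"), using: .tcp)
connection.stateUpdateHandler = { state in
    print("state:", state)
}
connection.start(queue: .main)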
Posted
by
Post not yet marked as solved
0 Replies
27 Views
Please help me renew my developer membership. I clicked on the renew membership button and filled out the payment form. I then received an email, over two weeks ago now, saying that Apple acknowledged my order, but my app is still off the App Store and my membership is not renewed. Please, someone help me. Thanks.
Posted
by
Post not yet marked as solved
0 Replies
23 Views
I am trying to create a near real-time drawing of waveform data from within a SwiftUI app. The data is streaming in from the hardware and I've verified that the draw(in ctx: CGContext) override in my custom CALayer is getting called. I have added this custom CALayer class as a sublayer to a UIView instance that I am making available via the UIViewRepresentable protocol. The only time I see updated output from the CALayer is when I rotate the device and layout happens (I assume). How can I force SwiftUI to update every time I render new data in my CALayer? More Info: I'm porting an app from the Windows desktop. Previously, I tried to make this work by simply generating a new UIImage from a CGContext every time I wanted to update the display. I quickly exhausted memory with that technique because a new context is being created every time I call UIGraphicsImageRenderer(size:).image { context in }. What I really wanted was something equivalent to a GDI WritableBitmap. Apparently this animal allows a programmer to continuously update and re-use the contents. I could not figure out how to do this in Swift without dropping down to the old CGBitmapContext stuff written in C and even then I wasn't sure if that would give me a reusable context that I could output in SwiftUI each time I refreshed it. CALayer seemed like the answer. I welcome any feedback on a better way to do what I'm trying to accomplish.
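Here is a minimal sketch of the setup described above, with illustrative names (WaveformLayer, WaveformUIView, WaveformView); the one assumption worth calling out is that the layer calls setNeedsDisplay() whenever new samples arrive, which asks Core Animation to invoke draw(in:) again without waiting for a layout pass:

import SwiftUI
import UIKit

final class WaveformLayer: CALayer {
    var samples: [Float] = [] {
        didSet { setNeedsDisplay() }          // request a redraw for every data update
    }

    override func draw(in ctx: CGContext) {
        guard samples.count > 1 else { return }
        ctx.setStrokeColor(UIColor.systemBlue.cgColor)
        let stepX = bounds.width / CGFloat(samples.count - 1)
        ctx.move(to: CGPoint(x: 0, y: bounds.midY))
        for (i, s) in samples.enumerated() {
            ctx.addLine(to: CGPoint(x: CGFloat(i) * stepX,
                                    y: bounds.midY - CGFloat(s) * bounds.midY))
        }
        ctx.strokePath()
    }
}

final class WaveformUIView: UIView {
    let waveformLayer = WaveformLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        waveformLayer.contentsScale = UIScreen.main.scale
        layer.addSublayer(waveformLayer)
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        waveformLayer.frame = bounds
    }
}

struct WaveformView: UIViewRepresentable {
    var samples: [Float]

    func makeUIView(context: Context) -> WaveformUIView { WaveformUIView() }
    func updateUIView(_ uiView: WaveformUIView, context: Context) {
        uiView.waveformLayer.samples = samples   // triggers setNeedsDisplay via didSet
    }
}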
Posted
by
Post not yet marked as solved
0 Replies
21 Views
I am trying to migrate some of my MapKit code to the Maps Server API. My JWT is fine and I can use it within the API Playground without issue. However, when I try to implement a test using PHP and cURL to get my Maps token, I receive a 401 Not Authorized error. My code is below:

<?php
$URL = "https://maps-api.apple.com/v1/token";
$accesstoken = "<MY TOKEN HERE>";
$authorization = 'Authorization: Bearer'.$accesstoken;
$headers = [];

$ch = curl_init($URL);
curl_setopt($ch, CURLOPT_HTTPHEADER, [$authorization]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_HEADERFUNCTION,
    function($curl, $header) use (&$headers) {
        $len = strlen($header);
        $header = explode(':', $header, 2);
        if (count($header) < 2) // ignore invalid headers
            return $len;
        $headers[strtolower(trim($header[0]))][] = trim($header[1]);
        return $len;
    }
);

$result = curl_exec($ch);
$status_code = curl_getinfo($ch, CURLINFO_HTTP_CODE); // get status code
curl_close($ch);

print_r($headers);
echo($result);
echo($status_code);
?>

Now the exact response that I am receiving is:

Array
(
    [date] => Array ( [0] => Thu, 18 Apr 2024 18:32:15 GMT )
    [content-type] => Array ( [0] => application/json;charset=utf8 )
    [content-length] => Array ( [0] => 51 )
    [connection] => Array ( [0] => keep-alive )
    [cache-control] => Array ( [0] => max-age=0 )
    [x-rid] => Array ( [0] => c9507281-bc32-46ac-be3b-dc59e97e7fed )
    [strict-transport-security] => Array ( [0] => max-age=31536000; includeSubDomains; )
)
{"error":{"message":"Not Authorized","details":[]}}
401

Any help getting this to work would be appreciated.
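For comparison only, a small Swift/URLSession sketch of the same token request (the token value is a placeholder, and this is not offered as a diagnosis of the PHP code); the Authorization header it builds has the conventional shape of the word Bearer, a single space, then the JWT:

import Foundation

// Sketch: fetch a Maps token by sending the JWT in the Authorization header.
func fetchMapsToken(accessToken: String) async throws -> (Int, String) {
    var request = URLRequest(url: URL(string: "https://maps-api.apple.com/v1/token")!)
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
    let (data, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    return (status, String(data: data, encoding: .utf8) ?? "")
}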
Posted
by
Post not yet marked as solved
1 Reply
31 Views
I'm implementing a bitonic sort in Metal with a Swift app. This requires hundreds of kernel dispatch calls, one for each of the swap stages, each touching the whole array; the work required by the GPU per dispatch is small. I haven't been able to get this to run fast enough from Swift, and it seems to be due to a high overhead for each dispatch call. I rewrote the test program in Objective-C with a super-simple kernel function and it runs 25x faster from Objective-C!

Kernel function:

kernel void fill(device uint8_t *array [[buffer(0)]],
                 const device uint32_t &N [[buffer(1)]],
                 const device uint8_t &value [[buffer(2)]],
                 uint i [[thread_position_in_grid]])
{
    if (i < N) {
        array[i] = value;
    }
}

The Swift code is:

func fill(pso: MTLComputePipelineState, buffer: MTLBuffer, N: Int, passes: Int) {
    guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }
    let gridSize = MTLSizeMake(N, 1, 1)
    var threadGroupSize = pso.maxTotalThreadsPerThreadgroup
    if (threadGroupSize > N) {
        threadGroupSize = N
    }
    let threadgroupSize = MTLSizeMake(threadGroupSize, 1, 1)

    for pass in 0..<passes {
        guard let computeEncoder = commandBuffer.makeComputeCommandEncoder() else { return }
        var value: UInt8 = UInt8(pass)
        var NN: UInt32 = UInt32(N)
        computeEncoder.setComputePipelineState(pso)
        computeEncoder.setBuffer(buffer, offset: 0, index: 0)
        computeEncoder.setBytes(&NN, length: MemoryLayout<UInt32>.size, index: 1)
        computeEncoder.setBytes(&value, length: MemoryLayout<UInt8>.size, index: 2)
        computeEncoder.dispatchThreadgroups(gridSize, threadsPerThreadgroup: threadgroupSize)
        computeEncoder.endEncoding()
    }
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let commandQueue = device.makeCommandQueue()!
let funcFill = library.makeFunction(name: "fill")!
let pso = try? device.makeComputePipelineState(function: funcFill)

var N = 16384
let passes = 100
let buffer = device.makeBuffer(length: N, options: [.storageModePrivate])!

for _ in 1...10 {
    let startTime = DispatchTime.now()
    fill(pso: pso!, buffer: buffer, N: N, passes: passes)
    let endTime = DispatchTime.now()
    let elapsedTime = endTime.uptimeNanoseconds - startTime.uptimeNanoseconds
    print("Elapsed time:", Float(elapsedTime)/1_000_000, "ms")
}

and the Objective-C code (which should be almost identical) is:

void fill(id<MTLCommandQueue> commandQueue, id<MTLComputePipelineState> funcPSO, id<MTLBuffer> A, uint32_t N, int passes)
{
    id<MTLCommandBuffer> commandBuffer = [commandQueue commandBuffer];
    MTLSize gridSize = MTLSizeMake(N, 1, 1);
    NSUInteger threadGroupSize = funcPSO.maxTotalThreadsPerThreadgroup;
    if (threadGroupSize > N) {
        threadGroupSize = N;
    }
    MTLSize threadgroupSize = MTLSizeMake(threadGroupSize, 1, 1);

    for (uint8_t pass = 0; pass < passes; pass++) {
        id<MTLComputeCommandEncoder> computeEncoder = [commandBuffer computeCommandEncoder];
        [computeEncoder setComputePipelineState:funcPSO];
        [computeEncoder setBuffer:A offset:0 atIndex:0];
        [computeEncoder setBytes:&N length:sizeof(uint32_t) atIndex:1];
        [computeEncoder setBytes:&pass length:sizeof(uint8_t) atIndex:2];
        [computeEncoder dispatchThreads:gridSize threadsPerThreadgroup:threadgroupSize];
        [computeEncoder endEncoding];
    }
    [commandBuffer commit];
    [commandBuffer waitUntilCompleted];
}

int main()
{
    NSError *error;
    id<MTLDevice> device = MTLCreateSystemDefaultDevice();
    id<MTLLibrary> library = [device newDefaultLibrary];
    id<MTLCommandQueue> commandQueue = [device newCommandQueue];
    id<MTLFunction> funcFill = [library newFunctionWithName:@"fill"];
    id<MTLComputePipelineState> pso = [device newComputePipelineStateWithFunction:funcFill error:&error];

    // Prepare data
    int N = 16384;
    int passes = 100;
    id<MTLBuffer> bufferA = [device newBufferWithLength:N options:MTLResourceStorageModePrivate];

    for (int it = 1; it <= 10; it++) {
        CFTimeInterval startTime = CFAbsoluteTimeGetCurrent();
        fill(commandQueue, pso, bufferA, N, passes);
        CFTimeInterval duration = CFAbsoluteTimeGetCurrent() - startTime;
        NSLog(@"Elapsed time: %.1f ms", 1000*duration);
    }
}

The Swift output is:

Elapsed time: 89.35556 ms
Elapsed time: 63.243744 ms
Elapsed time: 62.39568 ms
Elapsed time: 62.183224 ms
Elapsed time: 63.741913 ms
Elapsed time: 63.59463 ms
Elapsed time: 62.378654 ms
Elapsed time: 61.746098 ms
Elapsed time: 61.530384 ms
Elapsed time: 60.88774 ms

The Objective-C output is:

2024-04-18 19:27:45.704 compute_test[3489:92754] Elapsed time: 3.6 ms
2024-04-18 19:27:45.706 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.709 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.712 compute_test[3489:92754] Elapsed time: 2.6 ms
2024-04-18 19:27:45.714 compute_test[3489:92754] Elapsed time: 2.7 ms
2024-04-18 19:27:45.717 compute_test[3489:92754] Elapsed time: 2.8 ms
2024-04-18 19:27:45.720 compute_test[3489:92754] Elapsed time: 2.8 ms
2024-04-18 19:27:45.723 compute_test[3489:92754] Elapsed time: 2.7 ms
2024-04-18 19:27:45.726 compute_test[3489:92754] Elapsed time: 2.5 ms
2024-04-18 19:27:45.728 compute_test[3489:92754] Elapsed time: 2.5 ms

I compile the Swift code for Release, optimised for speed. I can't believe there should be a difference here, so what could be different, and what might I be doing wrong?

thanks
Adrian
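Not a confirmed explanation for the Swift/Objective-C gap, but as a point of comparison, here is a sketch (same names as the Swift code above) that encodes all passes through a single compute command encoder instead of creating one encoder per pass; with the default serial dispatch type, later dispatches in the same encoder still observe earlier writes:

import Metal

// Sketch only: one encoder, many dispatches, reusing `commandQueue` from above.
func fillSingleEncoder(pso: MTLComputePipelineState, buffer: MTLBuffer, N: Int, passes: Int) {
    guard let commandBuffer = commandQueue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    let gridSize = MTLSizeMake(N, 1, 1)
    let threadgroupSize = MTLSizeMake(min(pso.maxTotalThreadsPerThreadgroup, N), 1, 1)
    encoder.setComputePipelineState(pso)
    encoder.setBuffer(buffer, offset: 0, index: 0)
    var NN = UInt32(N)
    encoder.setBytes(&NN, length: MemoryLayout<UInt32>.size, index: 1)
    for pass in 0..<passes {
        var value = UInt8(pass)
        encoder.setBytes(&value, length: MemoryLayout<UInt8>.size, index: 2)
        encoder.dispatchThreads(gridSize, threadsPerThreadgroup: threadgroupSize)
    }
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}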
Posted
by
Post not yet marked as solved
1 Reply
39 Views
I'm building a macOS menu bar app (build target is 14.0) and need a Settings window as part of it. I define the window as a standalone scene in my @main block as follows:

struct xyzApp: App {
    var body: some Scene {
        MenuBarExtra {
            MenubarView()
        } label: {
            Label("XYZ", image: "xyz")
        }
        .menuBarExtraStyle(.window)

        Window("Settings", id: "settings-window") {
            SettingsView()
        }
        .windowResizability(.contentSize)
    }
}

The Settings view looks like this:

var body: some View {
    TabView {
        Form {
        }
        .tabItem {
            Label("Tab1", systemImage: "gear")
        }
        Form {
        }
        .tabItem {
            Label("Tab2", systemImage: "gear")
        }
    }
}

However, the TabView is not being rendered correctly: there's no image and the sizing is wrong. I tested the same code in a regular app with a Settings() declaration in the @main block and it works fine. Any pointers on what I'm doing wrong would be very helpful. Thanks!
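For comparison, a hedged sketch of the Settings-scene variant mentioned above, declared alongside the MenuBarExtra (names are illustrative, and this is not presented as the fix); on macOS the Settings scene gets the standard preferences-window treatment, and whether it behaves differently from the Window scene in a menu bar app is exactly the open question:

import SwiftUI

@main
struct xyzApp: App {
    var body: some Scene {
        MenuBarExtra {
            MenubarView()
        } label: {
            Label("XYZ", image: "xyz")
        }
        .menuBarExtraStyle(.window)

        // Settings scene instead of an explicit Window scene
        Settings {
            SettingsView()
        }
    }
}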
Posted
by
Post not yet marked as solved
1 Reply
44 Views
I know Apple updated their policy related to sign-in (see https://developer.apple.com/news/?id=f1v8pyay, "More flexibility for sign in options in apps" section), but the wording of the guidelines (https://developer.apple.com/app-store/review/guidelines/#login-services) is a bit difficult to understand:

Apps that use a third-party or social login service (such as Facebook Login, Google Sign-In, Sign in with Twitter, Sign In with LinkedIn, Login with Amazon, or WeChat Login) to set up or authenticate the user's primary account with the app must also offer as an equivalent option another login service with the following features:
- the login service limits data collection to the user's name and email address;
- the login service allows users to keep their email address private as part of setting up their account; and
- the login service does not collect interactions with your app for advertising purposes without consent.

As far as I can tell, FB, Google, Amazon, etc. do not offer these protections. Would Sign in with Apple still be required in this case?
Posted
by
Post not yet marked as solved
0 Replies
27 Views
I'm trying to build a project with a moderately complex Reality Composer Pro project, but am unable to because my Mac mini (2023, 8GB RAM) keeps running out of memory. I'm wondering if there are any known memory leaks in realitytool; the tool is taking up 20-30 GB (!) of memory during builds. I have a Mac Pro for content creation, which is why I didn't go for more RAM on the mini – it was supposed to just be a build machine for Apple Silicon compatibility, as my Pro is Intel. But, I'm kinda stuck here. I have a scene that builds fine, but any time I add a USD – in this case a tree asset – with lots of instances, or a lot of geometry, I run into the memory issue. I've tried greatly simplifying the model, but even a 2MB USD results in the crash. I'm failing to see how adding a 2MB asset would cause the memory use of realitytool to balloon so much during builds. If someone from Apple is willing to look, I can provide the scene – but it's proprietary, so I can't just post it publicly here.
Posted
by
Post marked as solved
1 Reply
36 Views
Hi All, Our "Account Holder" has invited a new user to our corporate dev account in order for them to participate in our new round of TestFlight testing. When they follow the link in the invitation and attempt to "Create Your Apple ID" and enter the special code displayed on the screen correctly they receive a dialog: Your request could not be completed at this time. Your account cannot be created at this time. They've been trying over the last few days so it seems unlikely this is a backend issue (but who knows). I also (in desperation) tried resending them an invite and they had the exact same result. Hopefully someone here has seen (or is seeing) this issue and perhaps has an idea of how to resolve this. Thanks very much in advance, Eric
Posted
by
Post not yet marked as solved
0 Replies
28 Views
Hello, I'm trying to get a better understanding of the SMS.db tables and fields. Does anybody have a document that describes each table and the fields within that table? I'd like to understand what each field in the Messages table is used for, but it would be a bonus if this info is also available for the other tables contained in SMS.db as well.
Posted
by
Post not yet marked as solved
1 Reply
48 Views
Hi everyone, I'm working on submitting my app to the App Store and need to capture screenshots for the 5.5-inch iPhone display size (iPhone 6 Plus, 7 Plus, and 8 Plus). Unfortunately, Xcode 17.4 doesn't offer simulators for these devices. I've searched Apple's support resources but haven't found any solutions. Would anyone have suggestions on how to generate the required 5.5-inch screenshots for App Store submission? Thanks in advance for your help!
Posted
by
Post not yet marked as solved
0 Replies
37 Views
I want to distribute my app as unlisted; the review team also recommended this to me. On the 12th of April I submitted the request and got an automated email about waiting a maximum of 3 business days. I have been worrying about this the whole time, so I submitted another request yesterday. The review team always replies very fast, within hours, but the Apple support team processing my request confuses me. Is anybody here having the same trouble? How much time have you been waiting for feedback?

P.S.: I can wait, but I don't see any progress and am afraid my request isn't being processed, because the notification said a maximum of 3 days of waiting, but at least a week has gone by.
Posted
by
Post not yet marked as solved
0 Replies
29 Views
The description says this event will be raised when "An identifier for a process that notifies endpoint security that it is updating a file." What does this mean? Similarly, when will the ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_MATERIALIZE event be raised? Do these events get raised if a cloud provider sync app like Google Drive/Dropbox/OneDrive that uses the File Provider framework syncs data? In my Endpoint Security app I have registered for these events but I didn't receive any; I do receive other Endpoint Security events like ES_EVENT_TYPE_NOTIFY_CLONE etc.
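For reference, a minimal Swift sketch of subscribing to the two events in question with the EndpointSecurity C API; this assumes the com.apple.developer.endpoint-security.client entitlement, running as root, and omits es_delete_client and error handling:

import Dispatch
import EndpointSecurity

// Sketch: subscribe to the file-provider update/materialize notify events.
var client: OpaquePointer?
let result = es_new_client(&client) { _, message in
    switch message.pointee.event_type {
    case ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_UPDATE:
        print("file provider update")
    case ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_MATERIALIZE:
        print("file provider materialize")
    default:
        break
    }
}
guard result == ES_NEW_CLIENT_RESULT_SUCCESS, let esClient = client else {
    fatalError("es_new_client failed: \(result)")
}
let events: [es_event_type_t] = [ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_UPDATE,
                                 ES_EVENT_TYPE_NOTIFY_FILE_PROVIDER_MATERIALIZE]
es_subscribe(esClient, events, UInt32(events.count))
dispatchMain()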
Posted
by
Post not yet marked as solved
0 Replies
28 Views
I have tested with TestFlight, but after I changed some code and Info.plist, it suddenly says "Invalid Binary". Here are my Info.plist changes, which I got from my GitHub repo.

Diff:

--- a/ios/Runner/Info.plist
+++ b/ios/Runner/Info.plist
@@ -2,22 +2,14 @@
 <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
 <plist version="1.0">
 <dict>
-    <key>LSApplicationQueriesSchemes</key>
-    <array>
-        <string>https</string>
-    </array>
-    <key>NSPhotoLibraryUsageDescription</key>
-    <string>Enlingo needs access to your photo library to save your profile picture.</string>
-    <key>NSCameraUsageDescription</key>
-    <string>Enlingo needs access to your camera to take a profile picture.</string>
-    <key>NSMicrophoneUsageDescription</key>
-    <string>Enlingo needs access to your microphone to record your voice.</string>
+    <key>ITSAppUsesNonExemptEncryption</key>
+    <false/>
     <key>CADisableMinimumFrameDurationOnPhone</key>
     <true/>
     <key>CFBundleDevelopmentRegion</key>
     <string>$(DEVELOPMENT_LANGUAGE)</string>
     <key>CFBundleDisplayName</key>
-    <string>Enlingo</string>
+    <string>EnLingo</string>
     <key>CFBundleExecutable</key>
     <string>$(EXECUTABLE_NAME)</string>
     <key>CFBundleIdentifier</key>
@@ -27,7 +19,18 @@
     <key>CFBundleLocalizations</key>
     <array>
         <string>en</string>
+        <string>ja</string>
         <string>ko</string>
+        <string>hi</string>
+        <string>es</string>
+        <string>fr</string>
+        <string>de</string>
+        <string>pt</string>
+        <string>en</string>
+        <string>vi</string>
+        <string>zh_CN</string>
+        <string>zh_TW</string>
+        <string>zh</string>
     </array>
     <key>CFBundleName</key>
     <string>enlingo</string>
@@ -54,8 +57,18 @@
     <string>$(FLUTTER_BUILD_NUMBER)</string>
     <key>FLTEnableImpeller</key>
     <false/>
+    <key>LSApplicationQueriesSchemes</key>
+    <array>
+        <string>https</string>
+    </array>
     <key>LSRequiresIPhoneOS</key>
     <true/>
+    <key>NSCameraUsageDescription</key>
+    <string>Enlingo needs access to your camera to take a profile picture.</string>
+    <key>NSMicrophoneUsageDescription</key>
+    <string>Enlingo needs access to your microphone to record your voice.</string>
+    <key>NSPhotoLibraryUsageDescription</key>
+    <string>Enlingo needs access to your photo library to save your profile picture.</string>
     <key>NSSpeechRecognitionUsageDescription</key>
     <string>Enlingo needs access to your microphone to record your voice.</string>
     <key>UIApplicationSupportsIndirectInputEvents</key>
@@ -69,15 +82,10 @@
     <key>UISupportedInterfaceOrientations</key>
     <array>
         <string>UIInterfaceOrientationPortrait</string>
-        <string>UIInterfaceOrientationLandscapeLeft</string>
-        <string>UIInterfaceOrientationLandscapeRight</string>
     </array>
     <key>UISupportedInterfaceOrientations~ipad</key>
     <array>
         <string>UIInterfaceOrientationPortrait</string>
-        <string>UIInterfaceOrientationPortraitUpsideDown</string>
-        <string>UIInterfaceOrientationLandscapeLeft</string>
-        <string>UIInterfaceOrientationLandscapeRight</string>
     </array>
 </dict>
 </plist>

Any suggestion or help would be appreciated... And one more question: I tested with StoreKit in Xcode using a copied scheme. Is it relevant to the "Invalid Binary" that I have a StoreKit certificate and configuration? I know it sounds ridiculous – clutching at straws....
Posted
by
