I am using UIImagePickerController to let users pick videos to use in my app. However, I noticed that the best quality I am able to get is 1280x720, probably because of the compression that always happens.
These videos are 4K, and I would like to get that resolution, or at least 1080p.
Here is my code for creating the picker:
let picker = UIImagePickerController()
picker.delegate = self
picker.videoQuality = .typeHigh
picker.sourceType = .photoLibrary
picker.mediaTypes = UIImagePickerController.availableMediaTypes(for: .photoLibrary) ?? []
assert(!picker.mediaTypes.isEmpty)
picker.mediaTypes = ["public.movie"]
picker.allowsEditing = true
present(picker, animated: true, completion: nil)
Modifying allowsEditing or videoQuality does not change anything.
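One related API I found is videoExportPreset; I would expect something like this to preserve the original quality (sketch only - whether the passthrough preset actually avoids the 1280x720 transcode is an assumption on my part):
// Sketch: ask the picker not to transcode the picked video at all.
// AVAssetExportPresetPassthrough comes from AVFoundation.
picker.videoExportPreset = AVAssetExportPresetPassthrough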
Thanks for the help!
Hello,
does PHPickerViewController offer any options to specify what resolution the selected video should be? In other words, can it do automatic compression like UIImagePickerController with the videoExportPreset setting?
Or is this supposed to be handled in the app itself? Say I always want Full HD or 4K videos (if available), how would I go about achieving that?
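If it has to be handled in the app, I imagine re-exporting the picked video myself with AVAssetExportSession, roughly like this (sketch; the preset is just an example, and there is also AVAssetExportPreset3840x2160 for 4K):
import AVFoundation

// Sketch: re-export a picked video at a fixed resolution.
func export(videoAt url: URL, to destination: URL, completion: @escaping (Error?) -> Void) {
    let asset = AVAsset(url: url)
    guard let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1920x1080) else {
        completion(nil) // preset not available for this asset
        return
    }
    session.outputURL = destination
    session.outputFileType = .mov
    session.exportAsynchronously {
        completion(session.error)
    }
}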
Thanks
Hello!
I am playing around with PHPickerViewController, and so far I have been able to get the selected images by loading them into UIImage instances, but I don't know how to get the selected video.
Below is the relevant part of my implementation of func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):
let provider = result.itemProvider
guard provider.hasItemConformingToTypeIdentifier(AVFileType.mov.rawValue) else { return }
provider.loadItem(forTypeIdentifier: AVFileType.mov.rawValue, options: nil) { (fileURL, error) in
    if let error = error {
        print(error)
        return
    }
    guard let videoURL = fileURL as? URL else { return }
    DispatchQueue.main.async {
        let fm = FileManager.default
        let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
        try! fm.copyItem(at: videoURL, to: destination)
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true, completion: nil)
    }
}
I get a crash when trying to copy the item. It says the source file does not exist, but the path looks real to me:
"The file “3C2BCCBC-4474-491B-90C2-93DF848AADF5.mov” couldn’t be opened because there is no such file." I also tried skipping the copy and passing the URL to AVPlayer directly, but nothing would play.
I am testing this on a simulator.
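Would loadFileRepresentation(forTypeIdentifier:) be the right call instead of loadItem? My understanding (an assumption on my part) is that the URL it provides is only valid until the completion handler returns, so the copy would have to happen inside it. A sketch of what I mean, using UTType.movie from UniformTypeIdentifiers:
provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { fileURL, error in
    guard let fileURL = fileURL, error == nil else { return }
    let fm = FileManager.default
    let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
    try? fm.removeItem(at: destination)
    do {
        // Copy before the handler returns; the source file is temporary.
        try fm.copyItem(at: fileURL, to: destination)
    } catch {
        print(error)
        return
    }
    DispatchQueue.main.async {
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true, completion: nil)
    }
}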
Thanks for the help!
Hello,
so I started using Compositional Layout for cases where I would previously use either a table view or a collection view with the flow layout.
While Compositional Layout works great for complex layouts, I have a bit of an issue building simple screens, for example one header cell and then three tappable cells below it.
let itemSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1), heightDimension: height)
let layoutItem = NSCollectionLayoutItem(layoutSize: itemSize)
let layoutGroup = NSCollectionLayoutGroup.vertical(layoutSize: itemSize, subitems: [layoutItem])
I am using this setup to create the only group that gets passed to NSCollectionLayoutSection, and it works great, but I don't know what the best way to specify the height is. I am using .estimated(200) for now, and in the end this (correctly) results in a header about 300 points tall and buttons 45 points each. I just feel like I am not using Compositional Layout sizing properly.
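For context, this is what I imagine a more explicit setup would look like, with a separate estimated size per kind of item (all the numbers are placeholders):
// Sketch: a self-sizing header item plus self-sizing "button" items,
// each with its own estimated height, stacked in one vertical group.
let headerSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1), heightDimension: .estimated(300))
let headerItem = NSCollectionLayoutItem(layoutSize: headerSize)

let buttonSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1), heightDimension: .estimated(45))
let buttonItem = NSCollectionLayoutItem(layoutSize: buttonSize)

let groupSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1), heightDimension: .estimated(435))
let layoutGroup = NSCollectionLayoutGroup.vertical(layoutSize: groupSize, subitems: [headerItem, buttonItem, buttonItem, buttonItem])
let section = NSCollectionLayoutSection(group: layoutGroup)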
Thanks for the help.
Hello,
I initially started the discussion over at Swift Forums - https://forums.swift.org/t/issue-with-connect-proxy-some-connections-dont-work-probably-sockets/45575 - because I am using SwiftNIO, but it appears that it makes more sense here.
I have an app with a NetworkExtension of the packet tunnel provider type, so I am using NEPacketTunnelProvider. As part of the extension, I run a SwiftNIO server with a connect proxy which is used to route the traffic.
This works great in almost all cases, but apps like Signal or WhatsApp don't work. They don't display any kind of "no connection" indicator, but messages aren't sent or received.
Over at Swift Forums I got an answer from Quinn “The Eskimo!” that I think points to the actual problem:
My experience is that a lot of these ‘chat’ apps explicitly bind their connections to the WWAN interface, which means they don’t use the default route and thus aren’t seen by the packet tunnel provider.
My packet tunnel provider is configured to use the default routes for IPv4 and IPv6.
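Concretely, the relevant part of my settings looks roughly like this (the addresses are placeholders):
// Sketch of the default-route configuration I am describing.
let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "127.0.0.1")

let ipv4 = NEIPv4Settings(addresses: ["192.168.0.2"], subnetMasks: ["255.255.255.255"])
ipv4.includedRoutes = [NEIPv4Route.default()]
settings.ipv4Settings = ipv4

let ipv6 = NEIPv6Settings(addresses: ["fd00::2"], networkPrefixLengths: [128])
ipv6.includedRoutes = [NEIPv6Route.default()]
settings.ipv6Settings = ipv6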
So my question is how to configure it so that it also works for the chat apps. There are a couple of apps from the App Store that I tried which use this same approach, and while they are active, Signal works fine.
Yes, I know this solution is not ideal and not the main intended use case for a packet tunnel, but it is the only option available on iOS.
Thanks for the help! Happy to clarify further.
Hello, I am working on a NEPacketTunnelProvider and I am not entirely sure I correctly understand NEPacketTunnelNetworkSettings.
For example, here is what the docs say about the ipv4Settings property:
This property contains the IPv4 routes specifying what IPv4 traffic to route to the tunnel, as well as the IPv4 address and netmask to assign to the TUN interface.
So it sounds like unless I set all possible routes here, the tunnel will not handle all the traffic?
Currently I have this:
let ipv4Settings = NEIPv4Settings(
    addresses: ["192.169.89.1"],
    subnetMasks: ["255.255.255.255"]
)
This seems to work pretty well, both on WiFi and cellular. In the past I tried various other addresses, and even manually included all the IPv4 routes, but I never noticed any effect on the tunnel.
Then there is the includedRoutes property.
The routes that specify what IPv4 network traffic will be routed to the TUN interface.
So is this basically another way to set the address, like in the NEIPv4Settings initializer?
This seems to work best when I don't set anything. I tried setting all the routes, but that did not change anything. The only difference is when I set includedRoutes to NEIPv4Route.default(); then some apps stop working while the tunnel is active.
This is strange, because even setting all the available routes plus the default one doesn't fix this "issue".
What is the relation between these properties? Is it best not to set includedRoutes if the tunnel works fine?
And lastly, what about dnsSettings? It looks like another optional property. Does it make sense to manually point DNS at, say, 1.1.1.1?
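To make the question concrete, this is the shape of configuration I am asking about (values are placeholders, and the comments reflect my current understanding, which may be wrong):
// addresses/subnetMasks describe the TUN interface itself, while
// includedRoutes - as I understand it - decide which destinations get
// routed into the tunnel. dnsSettings is the part I am unsure about.
let ipv4Settings = NEIPv4Settings(
    addresses: ["192.169.89.1"],
    subnetMasks: ["255.255.255.255"]
)
ipv4Settings.includedRoutes = [NEIPv4Route.default()]

let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "127.0.0.1")
settings.ipv4Settings = ipv4Settings
settings.dnsSettings = NEDNSSettings(servers: ["1.1.1.1"])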
Hello,
I am working on a small app to easily create time-relative reminders, meaning I can quickly create a reminder that will remind me about something, say, 45 minutes from now.
I want to add configurable shortcuts so users can use this app via Siri and the Shortcuts app.
I have created the Intents.intentdefinition file, and its (so far only) shortcut correctly displays in the Shortcuts app with the parameters.
But I am unsure how I should handle it now. Some tutorials and docs mention the delegate methods in SceneDelegate, while others point to an Intents Extension which is supposed to handle the shortcut:
override func restoreUserActivityState(_ activity: NSUserActivity) {
}
The shortcut I want does not need more user interaction than providing the two input parameters, so the simplest way to handle it would be ideal.
Ideally, the work would happen in the background without opening my app, because after the reminder is added there is nothing to do in my app. Is that possible?
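From what I gathered so far, an Intents Extension would look something like this - sketch only; AddReminderIntent stands for the class generated from my intent definition, and the handling details are my assumptions:
import Intents

// Sketch of an Intents Extension entry point. My understanding is that
// the extension runs in the background, so the intent can be handled
// without opening the app.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        guard intent is AddReminderIntent else {
            fatalError("Unexpected intent type")
        }
        return AddReminderIntentHandler()
    }
}

// AddReminderIntentHandling is the protocol Xcode generates from the
// intent definition (name assumed; parameter resolution methods omitted).
class AddReminderIntentHandler: NSObject, AddReminderIntentHandling {
    func handle(intent: AddReminderIntent, completion: @escaping (AddReminderIntentResponse) -> Void) {
        // Create the reminder from the two input parameters here.
        completion(AddReminderIntentResponse(code: .success, userActivity: nil))
    }
}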
Hello,
I am working on a NetworkExtension that uses NEPacketTunnelProvider.
Is there any option to somehow see all the traffic that gets sent to my extension?
Some apps have issues working over my tunnel, so I want to check the connections and see what is happening.
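The only thing I can think of is logging inside my own read loop in the NEPacketTunnelProvider subclass, roughly like this (sketch):
// Sketch: log basic info about each packet the system hands to the tunnel.
func readLoop() {
    packetFlow.readPackets { packets, protocols in
        for (packet, proto) in zip(packets, protocols) {
            NSLog("packet: \(packet.count) bytes, protocol family \(proto)")
        }
        self.readLoop()
    }
}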
Hello,
I noticed an issue with my VPN configuration created with NETunnelProviderManager. When the user has multiple apps that use VPN configurations and another app's configuration is the active one, I cannot activate my network extension.
I am getting this error: NEVPNError.Code.configurationDisabled
For ObjC it's NEVPNErrorConfigurationDisabled
An error code indicating the VPN configuration associated with the VPN manager isn’t enabled.
So when this happens, I need to open Settings -> VPN and select my app's profile.
How can I do this programmatically? Other apps are able to re-enable their VPN profile if another one was selected.
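Is it something along these lines? This is what I would try (sketch, untested):
// Sketch: reload the manager, flip isEnabled back on, and save.
NETunnelProviderManager.loadAllFromPreferences { managers, error in
    guard error == nil, let manager = managers?.first else { return }
    manager.isEnabled = true
    manager.saveToPreferences { saveError in
        if let saveError = saveError {
            print("Re-enabling failed: \(saveError)")
        }
    }
}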
Hello,
in my use case I want to use the matchDomains property so that my VPN (NEPacketTunnelProvider) handles traffic for just some apps and not everything that happens on the device.
But setting matchDomains to anything other than [""] doesn't seem to work properly.
It works for websites in Safari but in my testing not for other apps. Let's use Instagram as an example:
let proxySettings = NEProxySettings()
proxySettings.matchDomains = ["instagram.com"]
With these settings, using the Instagram app doesn't send traffic to my VPN. However, if I set something like "theverge.com" as the domain, its traffic does get sent to my app.
According to the docs, the matchDomains uses suffixes:
If the destination host name of a HTTP connection shares a suffix with one of these strings then the proxy settings will be used for the HTTP connection. Otherwise the proxy settings will not be used.
I also tried wildcards like *.instagram.com without much luck.
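One theory I have is that the app simply talks to other domains entirely (CDN hosts and the like), so a longer list of suffixes might be needed; something like this, where the extra domains are guesses on my part:
// Sketch: matchDomains uses suffix matching, so no wildcards should be
// needed. The exact set of domains the Instagram app uses is a guess.
let proxySettings = NEProxySettings()
proxySettings.matchDomains = ["instagram.com", "cdninstagram.com", "fbcdn.net"]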
How would I go about this? Are there any internal limits on how many domains I can match like this?
Thanks
Hello,
I am using apply(_:completionHandler:) on NEHotspotConfigurationManager.shared to prompt the user to join a particular WiFi network.
In my testing it works every time, but via Sentry I am getting a lot of errors with codes from the NEHotspotConfigurationErrorDomain domain.
In particular I am getting NEHotspotConfigurationErrorInternal and NEHotspotConfigurationErrorUnknown.
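For reference, my call site is essentially this (SSID and passphrase are placeholders):
// Sketch of the call that intermittently fails; credentials are placeholders.
let configuration = NEHotspotConfiguration(ssid: "example-ssid", passphrase: "example-pass", isWEP: false)
NEHotspotConfigurationManager.shared.apply(configuration) { error in
    guard let error = error as NSError? else { return }
    // These are the codes Sentry reports: internal and unknown.
    print("Hotspot error \(error.domain) code \(error.code)")
}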
It doesn't appear to be tied to a particular iOS version or device.
The WiFi network should be pretty much the same too - this is an app that connects to the Nintendo Switch game console.
From some anecdotal reports it looks like connecting via Camera.app (by scanning a QR code) does work for the WiFi connection.
Is it possible to debug this further?
Hello,
I am trying to use AVAudioFile to save an audio buffer to a .wav file. The buffer is of type [Float].
Currently I am able to create the .wav files and even play them, but they are silent - I cannot hear any sound.
private func saveAudioFile(using buffer: [Float]) {
    let fileUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(UUID().uuidString).wav")
    let fileSettings = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 15600,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    guard let file = try? AVAudioFile(forWriting: fileUrl, settings: fileSettings, commonFormat: .pcmFormatInt16, interleaved: true) else {
        print("Cannot create AudioFile")
        return
    }
    guard let bufferFormat = AVAudioFormat(settings: fileSettings) else {
        print("Cannot create buffer format")
        return
    }
    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: bufferFormat, frameCapacity: AVAudioFrameCount(buffer.count)) else {
        print("Cannot create output buffer")
        return
    }
    for i in 0..<buffer.count {
        outputBuffer.int16ChannelData!.pointee[i] = Int16(buffer[i])
    }
    outputBuffer.frameLength = AVAudioFrameCount(buffer.count)
    do {
        try file.write(from: outputBuffer)
    } catch {
        print(error.localizedDescription)
        print("Write to file failed")
    }
}
Where should I be looking first for the problem? Is it a format issue?
I am getting the data from the microphone with the AVAudioEngine.
Its format is created like this:
let outputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(15600), channels: 1, interleaved: true)!
And here is the installTap implementation with the buffer callback:
input.installTap(onBus: 0, bufferSize: AVAudioFrameCount(sampleRate * 2), format: inputFormat) { (incomingBuffer, time) in
    DispatchQueue.global(qos: .background).async {
        let pcmBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: AVAudioFrameCount(outputFormat.sampleRate * 2.0))
        var error: NSError? = nil
        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            outStatus.pointee = AVAudioConverterInputStatus.haveData
            return incomingBuffer
        }
        formatConverter.convert(to: pcmBuffer!, error: &error, withInputFrom: inputBlock)
        if error != nil {
            print(error!.localizedDescription)
        } else if let pcmBuffer = pcmBuffer, let channelData = pcmBuffer.int16ChannelData {
            let channelDataPointer = channelData.pointee
            self.buffer = stride(from: 0, to: self.windowLengthSamples, by: 1).map { Float(channelDataPointer[$0]) / 32768.0 }
            onBufferUpdated(self.buffer)
        }
    }
}
The onBufferUpdated block is what provides the [Float] for the saveAudioFile method above.
I have tried some experiments with different output formats, but those ended up producing unplayable audio files.
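One thing I notice while writing this up: the tap normalizes the samples to -1...1 (dividing by 32768), but saveAudioFile casts those floats straight to Int16, which would truncate nearly everything to zero. If that is the cause, the write loop would need to undo the scaling - a sketch of what I mean (my assumption):
for i in 0..<buffer.count {
    // Scale the normalized -1...1 floats back to the Int16 range.
    let clamped = max(-1.0, min(1.0, buffer[i]))
    outputBuffer.int16ChannelData!.pointee[i] = Int16(clamped * Float(Int16.max))
}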
Hello,
I am unable to use UIApplication.shared.open to open a URL to my app's iCloud Drive folder, which I use to export files to.
I am using this method to get the URL:
url(forUbiquityContainerIdentifier: nil)?.appendingPathComponent("Documents")
But when testing this URL with UIApplication.shared.canOpenURL I am getting back false.
-canOpenURL: failed for URL: "file:///private/var/mobile/Library/Mobile%20Documents/iCloud~MyApp/Documents/" - error: "The operation couldn’t be completed. (OSStatus error -10814.)"
Since my app's local Documents folder can be opened by prefixing the URL with shareddocuments://, I tried the same for the iCloud Drive URL, but it didn't work either.
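Concretely, the scheme swap I tried looks like this (sketch of my attempt):
// Swap the file:// scheme for shareddocuments:// so the Files app
// opens the folder. This works for the local Documents folder, but
// not for the iCloud Drive URL.
func openInFilesApp(_ folderURL: URL) {
    var components = URLComponents(url: folderURL, resolvingAgainstBaseURL: false)!
    components.scheme = "shareddocuments"
    guard let url = components.url, UIApplication.shared.canOpenURL(url) else {
        print("Cannot open \(folderURL)")
        return
    }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}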
Is this even possible?
Hello,
I have successfully implemented a NEPacketTunnelProvider network extension in an iOS app, and it works fine most of the time.
By working fine I mean it starts, stops (it is configured to disconnect on sleep), and handles network traffic as expected.
However, I have a few reports that sometimes it doesn't start correctly: it hangs on "Connecting..." in Settings -> VPN.
As far as I can tell, it stays stuck even after waiting for minutes.
Re-installing either the VPN provider extension or the entire app fixes the problem.
What could be causing such a random and very rare issue? It doesn't seem to be tied to a single iOS version, for example.
Hello,
I am trying to play around with the Live Text API according to these docs - https://developer.apple.com/documentation/visionkit/enabling_live_text_interactions_with_images?changes=latest_minor
But it always logs this error: [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
I am running this on a UIImage instance that I got from VNDocumentCameraViewController.
This is my current implementation that I run after the scanned image is displayed:
private func setupLiveText() {
    guard let image = imageView.image else {
        return
    }
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)
    Task {
        let configuration = ImageAnalyzer.Configuration([.text])
        let analyzer = ImageAnalyzer()
        do {
            let analysis = try await analyzer.analyze(image, configuration: configuration)
            DispatchQueue.main.async {
                interaction.analysis = analysis
            }
        } catch {
            print(error.localizedDescription)
        }
    }
}
Despite that logged error, the analyze call itself does not throw and returns a non-nil analysis object, but setting it on the interaction does nothing.
I am testing this on an iPhone SE 2020, which has the A13 chip; this feature requires A12 and up.
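One thing I have not been able to rule out: whether preferredInteractionTypes needs to be set explicitly before anything becomes visible. That would make the setup look like this (my assumption):
let interaction = ImageAnalysisInteraction()
// Assumption: explicitly opting in to the interaction types may be
// required for the analysis to actually display.
interaction.preferredInteractionTypes = .automatic
imageView.addInteraction(interaction)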