We are trying to capture an image in RAW format and save the image data in JPEG format (at a resolution of 1080*1920), but we are facing issues with this conversion. As per the Apple developer documentation, a RAW image is captured at the maximum possible resolution, without any processing applied.
To retrieve the RAW image, an isRawPhoto check is added in photoOutput(_:didFinishProcessingPhoto:error:).
But there is no option to set the resolution of the RAW photo while capturing it. When configuring the camera session we can set sessionPreset as needed, but to capture a RAW image we have to set it to .photo (which produces the highest-resolution photo output). We also tried capturing at a sessionPreset resolution of 1080*1920, but because the session has to be configured for RAW capture, that preset is not applied during photo capture.
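For context, here is a minimal sketch of our capture path (the class and property names are placeholders; `photoOutput` is assumed to be an AVCapturePhotoOutput attached to a session whose preset is .photo):

import AVFoundation

final class RawCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    var rawData: Data?

    func capture(using photoOutput: AVCapturePhotoOutput) {
        // RAW formats are only offered when the session preset is .photo.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else { return }
        let settings = AVCapturePhotoSettings(
            rawPixelFormatType: rawFormat,
            processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard photo.isRawPhoto else { return }
        // The RAW buffer arrives at the sensor's full resolution; we found
        // no AVCapturePhotoSettings key to request a smaller RAW size.
        rawData = photo.fileDataRepresentation()
    }
}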
Is there a way to transfer photos from an iPhone to an iPad using a Lightning to USB-C cable? I want to connect the iPhone as an external/USB device and show all its image folders, the way a USB drive appears when connected to the iPad.
I was able to do this with USB storage, but not with another iOS device such as an iPhone.
Is there any way to do this?
I am trying to connect an iPhone 7 as an external device to an iPad Pro 11 (2020) using a Lightning to USB-C cable. My aim is to transfer photos from the iPhone 7 to the iPad Pro 11, but I want to see the iPhone 7 as an external device so I can browse all the photos on it.
If a USB drive is connected to the iPad, I am able to get all the folders inside it using UIDocumentPicker (see the sketch below). But when I connect the iPhone 7, I have to manually import photos through the Photos app.
Is there a way to connect an iPhone 7 as a USB or external device to the iPad Pro 11?
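For reference, this is roughly how we browse a connected USB drive today (a sketch; the presenting view controller is assumed to implement UIDocumentPickerDelegate):

import UIKit
import UniformTypeIdentifiers

func browseExternalStorage(from host: UIViewController & UIDocumentPickerDelegate) {
    // Works for USB mass storage; a connected iPhone never shows up
    // as a browsable volume in this picker.
    let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.folder])
    picker.delegate = host
    picker.allowsMultipleSelection = false
    host.present(picker, animated: true)
}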
Is there a way to change the fNumber (aperture) of the iPad Pro 11 front camera programmatically? I am using builtInTrueDepthCamera for the front camera.
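For context, this is how we read the aperture today; as far as we can tell it is exposed read-only and we have found no setter (a sketch):

import AVFoundation

if let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front) {
    // lensAperture is a read-only property on AVCaptureDevice.
    print("Front camera aperture: f/\(device.lensAperture)")
}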
I am using the iPad Pro 11 front camera, with device type builtInTrueDepthCamera. Is there a way for a user to tap on the live view (AVCaptureDevice preview) and have the camera focus at the tapped point?
I also want to set a custom exposure at the focus point.
So, is there any way to change both the focus and the exposure to the tapped location for the front camera of the iPad Pro 11?
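For reference, this is the standard AVFoundation approach I have been looking at (a sketch; `device` is assumed to be the TrueDepth AVCaptureDevice and `previewLayer` the AVCaptureVideoPreviewLayer showing the live view). One caveat: front cameras often report isFocusPointOfInterestSupported == false, which may be the real blocker here.

import AVFoundation
import UIKit

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    // Convert the tap from layer coordinates to the device's (0,0)-(1,1) space.
    let layerPoint = gesture.location(in: gesture.view)
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.autoExpose) {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}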
I am using AVCapturePhoto to capture an image. In didFinishProcessingPhoto I get the image data using fileDataRepresentation(). But when I convert this data to a UIImage, it loses most of its metadata.
I need to draw a bezier path on the UIImage and still keep the metadata.
Is there any way to do this?
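One approach I am considering (a sketch, not verified end-to-end): draw on the pixels with UIKit, then re-attach the original metadata with ImageIO instead of round-tripping through UIImage alone.

import ImageIO
import UIKit
import UniformTypeIdentifiers

func annotatedData(from photoData: Data, drawing path: UIBezierPath) -> Data? {
    // Read the original metadata before touching the pixels.
    guard let source = CGImageSourceCreateWithData(photoData as CFData, nil),
          let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil),
          let image = UIImage(data: photoData) else { return nil }

    // Render the bezier path over the original image.
    let renderer = UIGraphicsImageRenderer(size: image.size)
    let annotated = renderer.image { _ in
        image.draw(at: .zero)
        UIColor.red.setStroke()
        path.stroke()
    }
    guard let cgImage = annotated.cgImage else { return nil }

    // Write the annotated pixels out together with the original properties.
    let output = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(
        output as CFMutableData, UTType.jpeg.identifier as CFString, 1, nil) else { return nil }
    CGImageDestinationAddImage(dest, cgImage, metadata)
    return CGImageDestinationFinalize(dest) ? output as Data : nil
}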
I have created a custom framework (MyFramework.framework) and I am trying to set the ASLR (PIE) flag for it. But when I use otool, it does not show this flag, although the PIE flag is shown for any other IPA. Is there a way to set the PIE flag for a framework and verify with otool that it has been applied?
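For reference, this is the check we run (our framework path; otool -hv prints the Mach-O header, where the PIE flag would appear):

otool -hv MyFramework.framework/MyFramework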
Earlier we were using the provisioning profile automatically created by Xcode for generating the IPA. Now we have decided to use a manual provisioning profile created in the Apple Developer Portal.
The manual provisioning profiles were created and imported into Xcode successfully, and we can create the IPA.
But we are observing strange behaviour when we test the IPA near the profile's expiry date (by changing the date on the iPad):
sometimes the IPA keeps working past the expiry date of the provisioning profile, and sometimes it stops working before the expiry date.
For example, we tested by setting the iPad's date and time to values both within and beyond the profile's expiry date:
Tested on 1st August 2022
IPA A (Expiry date - 8th July 2023) -> Works till 31st July 2023
IPA B (Expiry date - 1st August 2023) -> Works till 31st July 2023
Tested on 2nd August 2022
IPA A (Expiry date - 8th July 2023) -> Works till 31st July 2023
IPA B (Expiry date - 1st August 2023) -> Works till 1st August 2023
IPA C (Expiry date - 2nd August 2023) -> Works till 1st August 2023
We also tested setting the iPad's time to exactly match the profile's expiry date and time, but that didn't work for us either.
Does anyone know the reason for this strange expiry-date behaviour of provisioning profiles?
Or could someone guide us on how to create and test such profiles with future expiry dates? (The command we use to read the expiry from a profile is shown below.)
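For reference, we decode the profile and read its ExpirationDate with the standard macOS tooling (the profile filename here is ours):

security cms -D -i MyProfile.mobileprovision | grep -A1 ExpirationDate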
I am using AVCaptureDevice.ExposureMode.locked and setWhiteBalanceModeLocked for capturing photos. On the live camera view everything looks right, i.e. the exposure on the face does not change with the background light. But the image received in didFinishProcessing comes out blackish.
The end goal is to capture an image whose exposure/brightness does not change with the colour of clothes or the background light.
Code used for reference: AVCamManual
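This is roughly what we lock before capturing, following AVCamManual's approach (a sketch; `device` is assumed to be the active AVCaptureDevice):

import AVFoundation

func lockExposureAndWhiteBalance(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    // Freeze exposure at the current duration/ISO so the captured
    // photo should match what the preview shows.
    if device.isExposureModeSupported(.custom) {
        device.setExposureModeCustom(duration: device.exposureDuration,
                                     iso: device.iso,
                                     completionHandler: nil)
    }
    // Lock white balance at the current gains.
    if device.isWhiteBalanceModeSupported(.locked) {
        device.setWhiteBalanceModeLocked(with: device.deviceWhiteBalanceGains,
                                         completionHandler: nil)
    }
    device.unlockForConfiguration()
}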
I have a CSV file with values ranging between 0 and 1. When I convert these CSV values to an MLMultiArray and then to a UIImage, the result shows up grey, although the image is actually coloured. Is there a step missing, or do I have to do something more to get this image in colour?
Sample CSV values: [0.556862745 0.62745098 0.811764706]
Code
func parseCSV(data: String) -> [Float] {
    var finalArray = [Float]()
    let rows = data.components(separatedBy: "\n")
    for row in rows {
        let columns = row.components(separatedBy: ",")
        if columns.count == 3 {
            let r = columns[0]
            let g = columns[1]
            // Strip the carriage return left over from Windows line endings.
            let b = columns[2].replacingOccurrences(of: "\r", with: "")
            // Skip malformed rows instead of force-unwrapping.
            if let r = Float(r), let g = Float(g), let b = Float(b) {
                finalArray.append(r)
                finalArray.append(g)
                finalArray.append(b)
            }
        }
    }
    return finalArray
}
// Interleaved RGB values in a [1, height, width, 3] array.
let m = try MLMultiArray(shape: [1, 768, 512, 3], dataType: .double)
for (index, element) in data.enumerated() {
    m[index] = NSNumber(value: element)
}

let model: ModelInput = {
    do {
        let config = MLModelConfiguration()
        return try ModelInput(configuration: config)
    } catch {
        fatalError("Couldn't create Prediction model")
    }
}()

let op = try model.prediction(input: ModelInput(input1: m))
let opvalue = op.featureValue(for: "Output")
let multiArray = opvalue!.multiArrayValue!
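For what it's worth, a grey result usually means all three channels end up with the same value at render time. This is the conversion I am testing, which treats the array as interleaved RGB in [1, H, W, 3] and scales 0...1 to 0...255 (the helper name and layout are my assumptions):

import UIKit
import CoreML

func image(from array: MLMultiArray, width: Int, height: Int) -> UIImage? {
    // RGBX output buffer; the fourth byte stays at 255 and is skipped below.
    var pixels = [UInt8](repeating: 255, count: width * height * 4)
    for y in 0..<height {
        for x in 0..<width {
            let src = (y * width + x) * 3
            let dst = (y * width + x) * 4
            pixels[dst]     = UInt8(max(0, min(1, array[src].doubleValue)) * 255)     // R
            pixels[dst + 1] = UInt8(max(0, min(1, array[src + 1].doubleValue)) * 255) // G
            pixels[dst + 2] = UInt8(max(0, min(1, array[src + 2].doubleValue)) * 255) // B
        }
    }
    let cgImage: CGImage? = pixels.withUnsafeMutableBytes { buffer in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: width * 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue)
        else { return nil }
        return context.makeImage()
    }
    return cgImage.map { UIImage(cgImage: $0) }
}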
We want to animate images using the animationImages property of UIImageView in our app.
There are 45 HEIF images with dimensions of 1668x2388.
Below is the code I am using to animate the images.
let imgViewAnimation = UIImageView()
imgViewAnimation.animationImages = images
imgViewAnimation.animationDuration = 5.0
imgViewAnimation.animationRepeatCount = 1
imgViewAnimation.startAnimating()

Timer.scheduledTimer(withTimeInterval: 5.0 + 0.5, repeats: false) { _ in
    DispatchQueue.main.async {
        self.showAlert()
    }
}
The issue I am facing is that it takes more than 5.0 seconds (the animationDuration) to display all the images,
so the alert is shown before all the images have appeared.
We see this only with a few sets of images; other sets work fine.
Is this caused by the HEIF format, or by the memory and CPU cost of decoding this many large images?
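One thing we are experimenting with (an assumption on our part, not a confirmed fix): force-decoding each HEIF frame up front so decode cost cannot stall the animation mid-flight. On iOS 15+ UIImage's preparingForDisplay() serves the same purpose.

import UIKit

func predecodedImages(_ images: [UIImage]) -> [UIImage] {
    images.map { image in
        // Drawing the image once forces full decompression of the HEIF data.
        let renderer = UIGraphicsImageRenderer(size: image.size)
        return renderer.image { _ in image.draw(at: .zero) }
    }
}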
We are trying to access and copy some files (e.g., a video file) from another PC on the local network using the SMB protocol.
We found some third-party libraries for this, such as AMSMB2.
But we want to try the built-in Apple file-management features described in https://developer.apple.com/videos/play/wwdc2019/719/
We were able to select a file from the SMB server manually using the document picker in the app. We also obtained, in the debugger, the URL that gets generated under the "Shared" section of the Files app.
The URL I get from the document picker is -> /private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/testuser/iOS/SMB_ShareFolder
Now we want to avoid making the user select the file manually.
We want to open the path "/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/testuser/iOS/SMB_ShareFolder" directly
as soon as the document picker opens, so that the user can select a file straight away. But it is not working: the picker opens the normal Files app with all folders.
I get the error below when trying to access the shared URL directly through the document picker:
"Error - CFURLResourceIsReachable failed because it was passed a URL which has no scheme"
Here is the code I tried for opening the shared folder directly:
// URL(string:) on a bare path produces a URL with no scheme,
// which is what the console error below complains about.
let url = URL(string: "/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/TestUser/iOS/SMB_ShareFolder")
let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: [UTType.folder])
documentPicker.delegate = self
documentPicker.allowsMultipleSelection = false
documentPicker.modalPresentationStyle = .custom
documentPicker.definesPresentationContext = true
documentPicker.directoryURL = url!
documentPicker.transitioningDelegate = customTransitioningDelegate
present(documentPicker, animated: true, completion: nil)
In the console I get:
CFURLResourceIsReachable failed because it was passed a URL which has no scheme
2024-07-05 17:49:38.501059+0530 VideoImportPOC[1327:336989] [DocumentManager] revealDocumentAtURL encountered an error: Error Domain=NSCocoaErrorDomain Code=262 "The file couldn’t be opened because the specified URL type isn’t supported."
Can you please advise whether it is possible to access files directly this way, or suggest any alternatives?
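The "no scheme" error points at the URL construction itself: URL(string:) on a bare path carries no file:// scheme. This is what we plan to try next (a sketch; whether the picker will honour this smbclientd-backed directory is exactly what we are unsure about):

// URL(fileURLWithPath:) yields a proper file:// URL, unlike URL(string:) on a bare path.
let folderURL = URL(fileURLWithPath: "/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/testuser/iOS/SMB_ShareFolder",
                    isDirectory: true)
documentPicker.directoryURL = folderURL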
I want to convert a CGPoint into an SCNVector3. I am using ARFaceTrackingConfiguration for face tracking.
Below is my code to convert an SCNVector3 to a CGPoint:
let point = faceAnchor.verticeAndProjection(to: sceneView, facePoint: 0) // index of the vertex
print(point, faceAnchor.geometry.vertices[0])
which prints the following values:
CGPoint = (350.564453125, 643.4456787109375)
SIMD3<Float>(0.014480735, 0.01397189, 0.04508282)
extension ARFaceAnchor {
    // Struct to store the 3D vertex and the 2D projection point.
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    // Project the given face vertex into the view's 2D coordinate space.
    func verticeAndProjection(to view: ARSCNView, facePoint: Int) -> CGPoint {
        let point = SCNVector3(geometry.vertices[facePoint])
        let col = SIMD4<Float>(SCNVector4())
        let pos = SIMD4<Float>(SCNVector4(point.x, point.y, point.z, 1))
        // Move the anchor-local vertex into world space, then project it.
        let pworld = transform * simd_float4x4(col, col, col, pos)
        let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
        return CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
    }
}
extension matrix_float4x4 {
    /// The position (translation) component of the transform matrix.
    public var position: SCNVector3 {
        return SCNVector3(self[3][0], self[3][1], self[3][2])
    }
}
Now I want to convert the same CGPoint back to an SCNVector3.
I tried the code below, but it does not give the expected value, which is SIMD3<Float>(0.014480735, 0.01397189, 0.04508282):
let projectedOrigin = sceneView.projectPoint(SCNVector3Zero)
let unproject = sceneView.unprojectPoint(SCNVector3(point.x, point.y, CGFloat(projectedOrigin.z)))
let vector = SCNVector3(unproject.x, unproject.y, unproject.z)
Is there any way to convert a CGPoint to an SCNVector3? I cannot use hitTest because this CGPoint does not lie on a node; it is somewhere in the face area.
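Two details seem to matter here, sketched below under my assumptions (`worldPoint` and `faceAnchor` are placeholders): unprojectPoint needs the projected depth of the same point, not of the world origin, and its result is in world coordinates, while the expected vertex value is in the face anchor's local space.

// Reuse the depth that projectPoint produced for this particular point.
let projected = sceneView.projectPoint(worldPoint)   // x, y in pixels; z = normalized depth
let recoveredWorld = sceneView.unprojectPoint(SCNVector3(projected.x, projected.y, projected.z))

// Convert the world-space result back into the anchor's local space,
// since vertices[0] is expressed relative to the face anchor.
let worldV = SIMD4<Float>(recoveredWorld.x, recoveredWorld.y, recoveredWorld.z, 1)
let localV = simd_mul(faceAnchor.transform.inverse, worldV)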