We are trying to build a simple image capture app using AVFoundation and AVCaptureDevice.
Custom settings are used for the exposure point and bias.
But when an image is captured with the front camera, the image captured by the app does not match the one from the native camera app.
The image captured from the app includes more area than the native app's.
There is also a difference in tilt angle between the two images.
So, is there any way to capture an image exactly the same as the native camera using AVFoundation and AVCaptureDevice?
(Screenshots attached: Native / Custom)
We are trying to build a video recording app using AVFoundation and AVCaptureDevice.
No custom settings such as ISO or exposure duration are used; all settings are kept on auto.
But when video is captured using the front camera at 1080x1920, the video captured by the app does not match the native front camera's.
In Settings I have set the video format to 30 fps at 1080x1920.
The video captured from the app includes more area than the native app's, and some values such as ISO and exposure duration do not match.
So, is there any way to capture video exactly the same as the native camera using AVFoundation and AVCaptureDevice?
I have attached screenshots from the videos for reference.
(Screenshots attached: Native / AVCapture)
Is there any way to programmatically calculate CPU usage details in the app, either periodically or in a specific method that probably has high usage?
For example, according to Xcode my app uses 128% and 253% respectively.
Can we get these figures programmatically, as well as other CPU usage details?
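For context, a minimal sketch of the kind of figure we are after, using the Mach thread APIs (an assumption on our side, not a confirmed approach): summing thread_basic_info.cpu_usage across the task's threads gives a percentage comparable to Xcode's gauge, which can exceed 100% on multi-core devices.

import Foundation

// Sketch: approximate the app's total CPU usage (in percent; may exceed
// 100% on multi-core devices) by summing per-thread usage via Mach APIs.
func appCPUUsage() -> Double {
    var threadList: thread_act_array_t?
    var threadCount = mach_msg_type_number_t(0)
    guard task_threads(mach_task_self_, &threadList, &threadCount) == KERN_SUCCESS,
          let threads = threadList else { return 0 }
    defer {
        // Release the thread list the kernel handed back to us.
        let size = vm_size_t(threadCount) * vm_size_t(MemoryLayout<thread_t>.stride)
        vm_deallocate(mach_task_self_, vm_address_t(UInt(bitPattern: threads)), size)
    }
    var total = 0.0
    for i in 0..<Int(threadCount) {
        var info = thread_basic_info()
        var count = mach_msg_type_number_t(THREAD_INFO_MAX)
        let kr = withUnsafeMutablePointer(to: &info) {
            $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
                thread_info(threads[i], thread_flavor_t(THREAD_BASIC_INFO), $0, &count)
            }
        }
        // Skip threads we failed to query and idle threads.
        guard kr == KERN_SUCCESS, info.flags & TH_FLAGS_IDLE == 0 else { continue }
        // cpu_usage is scaled by TH_USAGE_SCALE (1000 = 100%).
        total += Double(info.cpu_usage) / Double(TH_USAGE_SCALE) * 100.0
    }
    return total
}

Polling this from a timer would give the periodic readings; host_processor_info offers a system-wide view if that is also needed.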
Is there any way to place 3D objects in a video, maybe using ARKit or MetalKit?
I have tried extracting frames from the video, drawing a cube using an SCNNode, rendering it into a UIImage, and then gathering all the images to create a video.
But this is not a feasible solution, as it creates a huge memory spike and ultimately triggers a memory warning.
So is there any other way to draw 3D objects onto a video file?
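For comparison, here is a rough sketch of a streaming variant of what we tried: instead of collecting every rendered UIImage, each frame is read, composited, and appended to an AVAssetWriter inside an autoreleasepool, so only one frame is alive at a time. compositeScene(onto:at:) is a hypothetical helper (e.g. built around SCNRenderer) and the writer settings are assumptions:

import AVFoundation
import SceneKit

func overlayScene(from sourceURL: URL, to outputURL: URL) throws {
    let asset = AVURLAsset(url: sourceURL)
    let track = asset.tracks(withMediaType: .video)[0]
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: track.naturalSize.width,
        AVVideoHeightKey: track.naturalSize.height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: writerInput, sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    while reader.status == .reading {
        guard let sample = readerOutput.copyNextSampleBuffer() else { break }
        autoreleasepool {
            // Crude back-pressure for the sketch; production code would use
            // writerInput.requestMediaDataWhenReady(on:using:) instead.
            while !writerInput.isReadyForMoreMediaData { usleep(10_000) }
            guard let frame = CMSampleBufferGetImageBuffer(sample) else { return }
            let time = CMSampleBufferGetPresentationTimeStamp(sample)
            // Hypothetical helper: renders the SceneKit content over the
            // frame and returns a new CVPixelBuffer.
            let composited = compositeScene(onto: frame, at: time)
            _ = adaptor.append(composited, withPresentationTime: time)
        }
    }
    writerInput.markAsFinished()
    writer.finishWriting { }
}

The autoreleasepool around each iteration is what avoids the spike we hit: every frame's buffers are released before the next one is read.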
I want to convert a CGPoint into an SCNVector3. I am using ARFaceTrackingConfiguration for face tracking.
Below is my code to convert an SCNVector3 to a CGPoint:
let point = faceAnchor.verticeAndProjection(to: sceneView, facePoint: faceAnchor.geometry.vertices[0])
print(point, faceAnchor.geometry.vertices[0])
which prints the following values:
CGPoint = (350.564453125, 643.4456787109375)
SIMD3<Float>(0.014480735, 0.01397189, 0.04508282)
extension ARFaceAnchor {
    // Struct to store the 3D vertex and its 2D projection point.
    struct VerticesAndProjection {
        var vertex: SIMD3<Float>
        var projected: CGPoint
    }

    // Project the given face vertex into the view's 2D coordinates.
    func verticeAndProjection(to view: ARSCNView, facePoint: Int) -> CGPoint {
        let point = SCNVector3(geometry.vertices[facePoint])
        // Build a matrix whose last column is the vertex in homogeneous
        // coordinates, so (transform * matrix).position is the vertex in
        // world space.
        let col = SIMD4<Float>(SCNVector4())
        let pos = SIMD4<Float>(SCNVector4(point.x, point.y, point.z, 1))
        let pworld = transform * simd_float4x4(col, col, col, pos)
        let vect = view.projectPoint(SCNVector3(pworld.position.x, pworld.position.y, pworld.position.z))
        return CGPoint(x: CGFloat(vect.x), y: CGFloat(vect.y))
    }
}

extension matrix_float4x4 {
    /// The translation component (fourth column) of the transform matrix.
    public var position: SCNVector3 {
        SCNVector3(self[3][0], self[3][1], self[3][2])
    }
}
Now I want to convert the same CGPoint back to an SCNVector3.
I tried the code below, but it does not give the expected value, which is SIMD3<Float>(0.014480735, 0.01397189, 0.04508282):
let projectedOrigin = sceneView.projectPoint(SCNVector3Zero)
let unproject = sceneView.unprojectPoint(SCNVector3(point.x, point.y, CGFloat(projectedOrigin.z)))
let vector = SCNVector3(unproject.x, unproject.y, unproject.z)
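For reference, the round trip only closes when unprojectPoint receives the projected depth of the same point, not of the world origin: the origin sits at a different distance from the camera, so its z places the result on the wrong slice of the camera ray. A minimal sketch, in terms of the vect computed inside verticeAndProjection above:

// vect = view.projectPoint(worldPoint); its z is that point's own normalized depth.
let recovered = sceneView.unprojectPoint(SCNVector3(vect.x, vect.y, vect.z))
// recovered ≈ the world-space vertex (pworld.position); it would still need
// transforming by the inverse of the anchor's transform to match the
// anchor-local value geometry.vertices[0].

Note that the expected value SIMD3<Float>(0.014480735, ...) is in the face anchor's local space, so even a correct unprojection yields the world-space point until it is converted back into the anchor's coordinate space.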
Is there any way to convert a CGPoint to an SCNVector3? I cannot use hitTest, because this CGPoint does not lie on a node; it lies somewhere in the face area.
We are trying to access and copy some files (e.g., a video file) from another PC on the local network using the SMB protocol.
We found some third-party libraries for this, such as AMSMB2.
But we want to try Apple's built-in features, like the file management covered in https://developer.apple.com/videos/play/wwdc2019/719/
We were able to manually select a file from the SMB server using the document picker in the app, and in the debugger we got its URL, which is generated under the "Shared" section in the Files app.
The URL I get from the document picker is /private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/testuser/iOS/SMB_ShareFolder
Now we want to spare the user the manual selection: we want the picker to open directly at "/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/testuser/iOS/SMB_ShareFolder" as soon as it appears, so the user can select the file right away. But it is not working; it opens the normal Files app with all folders and fails with "CFURLResourceIsReachable failed because it was passed a URL which has no scheme".
Here is the code I tried for opening this shared folder directly:
let url = URL(string: "/private/var/mobile/Library/LiveFiles/com.apple.filesystems.smbclientd/asd0QUsers/TestUser/iOS/SMB_ShareFolder")
let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: [UTType.folder])
documentPicker.delegate = self
documentPicker.allowsMultipleSelection = false
documentPicker.modalPresentationStyle = .custom
documentPicker.definesPresentationContext = true
documentPicker.directoryURL = url!
documentPicker.transitioningDelegate = customTransitioningDelegate
present(documentPicker, animated: true, completion: nil)
I get this error in the console:
CFURLResourceIsReachable failed because it was passed a URL which has no scheme
2024-07-05 17:49:38.501059+0530 VideoImportPOC[1327:336989] [DocumentManager] revealDocumentAtURL encountered an error: Error Domain=NSCocoaErrorDomain Code=262 "The file couldn’t be opened because the specified URL type isn’t supported."
Can you please provide input on whether it is possible to access files directly in this way, or suggest any alternative?
We want to animate images using the animationImages property of UIImageView in our app.
There are 45 HEIF images with dimensions of 1668x2388.
Below is the code I am using to animate the images:
let imgViewAnimation = UIImageView()
imgViewAnimation.animationImages = images
imgViewAnimation.animationDuration = 5.0
imgViewAnimation.animationRepeatCount = 1
imgViewAnimation.startAnimating()
Timer.scheduledTimer(withTimeInterval: 5.0 + 0.5, repeats: false) { _ in
    DispatchQueue.main.async {
        self.showAlert()
    }
}
The issue I am facing is that it takes more than 5.0 seconds (animationDuration) to display all the images,
so the alert is shown before all the images have appeared.
We face this issue only with a few sets of images; other sets work fine.
Is this issue due to the HEIF images used, or due to the memory and CPU load of using a large number of images?
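If lazy HEIF decoding is the cause, one mitigation sketch is to force-decode every frame before starting the animation, so no decode work lands inside the 5-second window (images is the array assigned above):

// Draw each image once into an offscreen renderer so UIImageView
// animates already-decoded bitmaps instead of decoding HEIF mid-flight.
let decoded: [UIImage] = images.map { image in
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in image.draw(at: .zero) }
}
imgViewAnimation.animationImages = decoded

Be aware of the memory cost, though: at 1668x2388 a decoded frame is roughly 16 MB, so 45 frames is on the order of 700 MB, which also points toward the memory/CPU explanation; downscaling the frames in the same pass may be necessary.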
I have a CSV file with values ranging between 0 and 1. When I convert these CSV values to an MLMultiArray and then to a UIImage, it shows a grey image, but the image is actually coloured. Is there a step missing, or do I have to do something more to get this image to be coloured?
Sample CSV values: [0.556862745 0.62745098 0.811764706]
Code:
func parseCSV(data: String) -> [Float] {
    var finalArray = [Float]()
    let rows = data.components(separatedBy: "\n")
    for row in rows {
        let columns = row.components(separatedBy: ",")
        if columns.count == 3 {
            let r = columns[0]
            let g = columns[1]
            // Strip the carriage return left over from CRLF line endings.
            let b = columns[2].replacingOccurrences(of: "\r", with: "")
            finalArray.append(Float(r)!)
            finalArray.append(Float(g)!)
            finalArray.append(Float(b)!)
        }
    }
    return finalArray
}
let m = try MLMultiArray(shape: [1, 768, 512, 3], dataType: .double)
for (index, element) in data.enumerated() {
    m[index] = NSNumber(value: element)
}

let model: ModelInput = {
    do {
        let config = MLModelConfiguration()
        return try ModelInput(configuration: config)
    } catch {
        fatalError("Couldn't create Prediction model")
    }
}()

let op = try model.prediction(input: ModelInput(input1: m))
let opvalue = op.featureValue(for: "Output")
let multiArray = opvalue!.multiArrayValue!
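For completeness, here is a hedged sketch of the conversion step where the grey usually creeps in: building the CGImage with interleaved RGB channels. It assumes multiArray holds values in 0...1 with shape [1, height, width, 3]; rendering the same data through a single-channel (greyscale) context is one way to end up with a grey image.

import CoreML
import UIKit

// Sketch: turn an interleaved-RGB MLMultiArray (values 0...1) into a
// coloured UIImage via an RGBA byte buffer.
func image(from array: MLMultiArray, width: Int, height: Int) -> UIImage? {
    var pixels = [UInt8](repeating: 255, count: width * height * 4) // RGBA; alpha ignored
    for i in 0..<(width * height) {
        for c in 0..<3 {
            let value = array[i * 3 + c].doubleValue
            pixels[i * 4 + c] = UInt8(max(0, min(255, value * 255)))
        }
    }
    guard let provider = CGDataProvider(data: Data(pixels) as CFData),
          let cgImage = CGImage(width: width, height: height,
                                bitsPerComponent: 8, bitsPerPixel: 32,
                                bytesPerRow: width * 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipLast.rawValue),
                                provider: provider, decode: nil,
                                shouldInterpolate: false, intent: .defaultIntent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}

With the shape above, the call would be image(from: multiArray, width: 512, height: 768).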
I am using AVCaptureDevice.ExposureMode.locked and setWhiteBalanceModeLocked for capturing photos. The live view/camera view shows it correctly, i.e., the exposure on the face does not change with the background light. But when I capture an image, a blackish image arrives in didFinishProcessing.
The end goal is to capture an image whose exposure or brightness does not change with the colour of clothes or the background light.
Code used for reference: AVCamManual
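For reference, this is roughly the locking pattern in use, a sketch following AVCamManual (device is the active AVCaptureDevice; locking white balance at the current gains is an assumption):

try device.lockForConfiguration()
if device.isExposureModeSupported(.locked) {
    device.exposureMode = .locked
}
if device.isWhiteBalanceModeSupported(.locked) {
    // Freeze white balance at the current gains.
    device.setWhiteBalanceModeLocked(with: device.deviceWhiteBalanceGains,
                                     completionHandler: nil)
}
device.unlockForConfiguration()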
Earlier we used the automatically created provisioning profile from Xcode for generating the IPA. Now we have decided to use a manual provisioning profile created in the Apple Developer Portal.
The manual provisioning profiles were created and imported into Xcode successfully, and we can create the IPA.
But we observed strange behaviour when we tested the IPAs near the expiry date (by changing the date on the iPad):
sometimes an IPA keeps working past the expiry date of its provisioning profile, and sometimes it stops working before the expiry.
For example,
we tested by setting the iPad's date and time within and beyond the profile's expiry date:
Tested on 1st August 2022
IPA A (expiry date 8th July 2023) -> works till 31st July 2023
IPA B (expiry date 1st August 2023) -> works till 31st July 2023
Tested on 2nd August 2022
IPA A (expiry date 8th July 2023) -> works till 31st July 2023
IPA B (expiry date 1st August 2023) -> works till 1st August 2023
IPA C (expiry date 2nd August 2023) -> works till 1st August 2023
We also tested changing the iPad's time to match the profile's exact expiry date and time, but that didn't work for us either.
Does anyone know the reason for this strange expiry-date behaviour of provisioning profiles?
Or could someone guide us on a way to create and test such profiles with future expiry dates?
I have created a custom framework (MyFramework.framework) and am trying to set the ASLR (PIE) flag for it. But when I inspect it with otool, the flag is not shown, whereas the PIE flag does show for other IPAs. Is there a way to set the PIE flag for a framework and to check with otool that it has been applied?
I am using AVCapturePhoto to capture an image. In didFinishProcessingPhoto I get the image data using fileDataRepresentation. But when I convert this data to a UIImage, it loses most of its metadata.
I need to draw a bezier path on the UIImage and still keep the metadata.
Is there any way to do this?
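For context, a sketch of the re-encoding step where we suspect the metadata is lost: UIImage carries no EXIF, so the original properties have to be copied across explicitly with ImageIO. The names photoData and editedImage are assumptions standing in for the fileDataRepresentation() output and the image with the bezier path drawn on it:

import ImageIO
import UniformTypeIdentifiers

// Re-encode the edited image, re-attaching the metadata dictionary
// read from the original capture data.
func encode(_ editedImage: UIImage, preservingMetadataFrom photoData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(photoData as CFData, nil),
          let metadata = CGImageSourceCopyPropertiesAtIndex(source, 0, nil),
          let cgImage = editedImage.cgImage else { return nil }
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        output as CFMutableData, UTType.jpeg.identifier as CFString, 1, nil) else { return nil }
    // Copy the EXIF/GPS/TIFF properties onto the new frame.
    CGImageDestinationAddImage(destination, cgImage, metadata)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}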
I am using the iPad Pro 11 front camera; the device type is trueDepthCamera. Is there a way for a user to tap on the live view (AVCaptureDevice) and have the camera focus at the tapped point?
I also want to set a custom exposure at the focus point.
So is there any way to change both the focus and the exposure to the tapped location for the front camera on the iPad Pro 11?
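For anyone trying to reproduce this, here is a sketch of the standard tap-to-focus/expose pattern being attempted (devicePoint would come from AVCaptureVideoPreviewLayer.captureDevicePointConverted(fromLayerPoint:)); the capability checks are the part in question, since the front camera may not support a focus point of interest at all:

import AVFoundation

// Sketch: move focus and exposure to a tapped point, guarded by the
// device's capability flags.
func focusAndExpose(at devicePoint: CGPoint, on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    if device.isFocusPointOfInterestSupported, device.isFocusModeSupported(.autoFocus) {
        device.focusPointOfInterest = devicePoint
        device.focusMode = .autoFocus
    }
    if device.isExposurePointOfInterestSupported, device.isExposureModeSupported(.autoExpose) {
        device.exposurePointOfInterest = devicePoint
        device.exposureMode = .autoExpose
    }
    device.unlockForConfiguration()
}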