Hi everyone,
I am wondering what settings the camera(s) were using at the time they were calibrated.
For instance, one aspect that is easy to find is the reference resolution of the images used when the intrinsics were calibrated, which can be retrieved via intrinsicMatrixReferenceDimensions. That confirms the principal point is referenced to the resolution that was in use while the calibration was performed.
However, recently I saw that there are focusing modes that potentially displace the lens' physical position. Settings like:
AutoFocusRangeRestriction: none, near, far
setFocusModeLocked: Locks the lens position at the specified value, and sets the focus mode to a locked state.
My concern lies in the impact these focusing lens displacements have on the intrinsic matrix parameters: if the lens is displaced, the parameters no longer describe the camera, since the lens position has changed w.r.t. the lensPosition [0-1] that was set when they were calibrated.
If my understanding is correct, AutoFocusRangeRestriction only restricts the range over which the system is allowed to auto-focus and does not set a specific lens position.
Conversely, setFocusModeLocked does indeed fix the lensPosition to a specific value [0-1].
In simple words, what focus lensPosition were the cameras set to when they were calibrated for intrinsics?
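For reference, this is the kind of setup I am referring to; a minimal sketch in which the specific lensPosition value is just an arbitrary example, not a known calibration setting:

    import AVFoundation

    // Sketch only: lock the lens at an arbitrary normalized position before capturing,
    // then check the resolution the intrinsics are referenced to.
    func lockLens(of device: AVCaptureDevice, at lensPosition: Float) throws {
        guard device.isLockingFocusWithCustomLensPositionSupported else { return }
        try device.lockForConfiguration()
        device.setFocusModeLocked(lensPosition: lensPosition) { _ in
            // The lens has settled at the requested position (normalized 0...1).
        }
        device.unlockForConfiguration()
    }

    // Assuming calibration data is delivered with a capture,
    // e.g. via AVCapturePhoto.cameraCalibrationData:
    func logCalibration(_ calibration: AVCameraCalibrationData) {
        // Resolution the intrinsic matrix (and principal point) is referenced to.
        print("Reference dimensions:", calibration.intrinsicMatrixReferenceDimensions)
        print("Intrinsic matrix:", calibration.intrinsicMatrix)
    }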
Posts under iPhone tag
200 Posts
Hi All,
I am developing a braille keyboard for iOS. When I test the typing function in Notes, the screen updates very slowly after typing; we expect the response to be essentially instant. I am not sure how to set up the VoiceOver settings.
Is there any documentation on this issue?
Or is there any guideline for this?
I bought a new iPhone 16 recently and connected it to my car (Hyundai Venue), but I cannot see WhatsApp. I researched and found some forum threads, but the suggested steps are not workable or not suitable for the latest iOS version.
I have updated both iOS and WhatsApp; nothing has helped to resolve this.
Note: Earlier I used a Pixel phone, and I was able to see WhatsApp and make calls.
Hello,
I'm sporadically getting the unknown CoreHaptics error described in the title when trying to create a CHHapticPatternPlayer from an .ahap file; unfortunately, it does not occur in a repeatable manner.
This occurs in a SpriteKit game, running on both iOS 17 and 18.
Anyone know what the error means, since it's apparently undocumented?
Note that the haptics engine is started right before creating the pattern, and the start request doesn't appear to fail...
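For reference, the player is created roughly like this; a simplified sketch of my setup in which the .ahap file name is just illustrative:

    import Foundation
    import CoreHaptics

    // Simplified sketch; "Pattern.ahap" stands in for the real file name.
    func makePlayer(with engine: CHHapticEngine) throws -> CHHapticPatternPlayer {
        guard let url = Bundle.main.url(forResource: "Pattern", withExtension: "ahap") else {
            throw CocoaError(.fileNoSuchFile)
        }
        try engine.start()                                  // the start request does not appear to fail
        let pattern = try CHHapticPattern(contentsOf: url)  // load the pattern from the .ahap file
        return try engine.makePlayer(with: pattern)         // the unknown error is thrown here, sporadically
    }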
Thanks,
D.
I recently encountered an issue with Xcode 16.2 while attempting to integrate Settings.bundle into a new app. I added Settings.bundle as a new file (using the provided template), but when I ran the app (the default "Hello World" project), the expected three default controls (Name, Enabled, Slider) did not appear in the app's settings.
To troubleshoot, I downgraded my system to macOS Sonoma 14.7.2 and Xcode 15.4 (on a 2023 Mac Mini, M2). After this downgrade, everything worked as expected. With a new project, adding Settings.bundle, and running the app, the settings entry for the app appeared, including the three default fields.
This behavior suggests a potential issue or incompatibility with Xcode 16.2.
Xcode 16.2 cannot debug on iOS 18.2.
Xcode 16.2
iOS 18.2
Created a new project directly:
Xcode -> Create New Project -> Multiplatform -> Application -> App
Selected a physical device -> Run
error: attach by pid '1050' failed -- attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
Logging Error: Failed to initialize logging system due to time out. Log messages may be missing. If this issue persists, try setting IDEPreferLogStreaming=YES in the active scheme actions environment variables.
I recently updated my iPhone 15 Pro to iOS 18.2, but I deleted the Image Playground app by mistake. When I go to the App Store to download it, it says the app isn't available in my country. Any and all feedback is appreciated. Thank you.
Good evening. I have come up with an amazing and revolutionary idea that is going to change the entire world for the better.
I'm a 23-year-old Design Engineer from Greece, and I would like to get in direct contact with the Apple Design Team in order to discuss this project and, in the best-case scenario, sell it to you.
The foldable iPhone is not a good idea. It's not revolutionary, and it's going to leave a big black mark on Apple's history.
With my idea, the whole world will once again see the greatness of the biggest company in history.
We have working code to fetch step data from HealthKit after requesting the necessary permissions. However, we’ve encountered an issue specific to one device, the iPhone 16 Pro Max. When querying the data, we do not receive a response, and the code enters an infinite loading state without completing the request.
The user who is facing this issue has tried logging in on another device, and it works fine. On the problematic device (iPhone 16 Pro Max), the request does not complete.
For reference, I’ve included the code below. Resolving this issue is crucial, so we would appreciate any guidance on what steps we can take to troubleshoot or resolve the problem on this specific device.
Please note that the device has granted permission to access HealthKit data.
static let healthStore = HKHealthStore()

static func limitReadFromHealthKitBetweenDates(fromDate: Date, toDate: Date = Date(), completion: @escaping ([HKStatistics]) -> Void) {
    guard let stepsQuantityType = HKQuantityType.quantityType(forIdentifier: .stepCount) else { return }

    // Exclude manually entered samples.
    let ignoreUserEntered = HKQuery.predicateForObjects(withMetadataKey: HKMetadataKeyWasUserEntered, operatorType: .notEqualTo, value: true)

    let now = toDate
    var interval = DateComponents()
    interval.day = 1

    var calendar = Calendar.current
    calendar.locale = Locale(identifier: "en_US_POSIX")

    // Anchor the daily buckets at midnight of the end date.
    var anchorComponents = calendar.dateComponents([.day, .month, .year], from: now)
    anchorComponents.hour = 0
    let anchorDate = calendar.date(from: anchorComponents) ?? Date()

    let query = HKStatisticsCollectionQuery(quantityType: stepsQuantityType,
                                            quantitySamplePredicate: ignoreUserEntered,
                                            options: [.cumulativeSum],
                                            anchorDate: anchorDate,
                                            intervalComponents: interval)

    query.initialResultsHandler = { _, results, error in
        guard let results = results else {
            print("Error returned from resultHandler: \(String(describing: error?.localizedDescription))")
            return
        }
        print(results)

        var statisticsArray: [HKStatistics] = []
        results.enumerateStatistics(from: fromDate, to: now) { statistics, _ in
            statisticsArray.append(statistics)
            // The completion handler only fires when a bucket's end date matches `now` at day granularity.
            if statistics.endDate.getddmmyyyyslashGMT == now.getddmmyyyyslashGMT {
                completion(statisticsArray)
            }
        }
    }

    healthStore.execute(query)
}
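For completeness, this is roughly how we invoke it; the 7-day range is an arbitrary example, and the call assumes it is made from within the same type that declares the method:

    // Illustrative usage; the date range is a placeholder.
    let from = Calendar.current.date(byAdding: .day, value: -7, to: Date())!
    limitReadFromHealthKitBetweenDates(fromDate: from) { statistics in
        let totalSteps = statistics
            .compactMap { $0.sumQuantity()?.doubleValue(for: .count()) }
            .reduce(0, +)
        print("Steps in range: \(totalSteps)")
    }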
Please note that the code works on all devices except the problematic one. Could you please guide me on the next steps to resolve this issue?
After the iOS 18.2 update, I can no longer see some websites as I could see them on the desktop (for example Jeostok . Com). I have tried this in many browsers; Safari, Chrome, Firefox, Opera, and Edge all give the same result. The home page is not displayed properly. It displayed properly before the update, and now it does not. This is my own website. I frankly do not know how to solve this problem; if anyone knows, I am waiting for your help.
There is a display issue when browsing wireless networks in the dropdown menu.
In iOS 18.1.1, with the Wi-Fi switch in the off state:
Step 1: Open the notification dropdown, and the bug shown in the first image appears;
it takes some time before it displays normally.
https://www.reddit.com/r/iphone/comments/16vo3l9/ios_17_volume_slider_bug/
Please fix this problem. It still exists in iOS 17.7.2. My device is an iPhone 15 Pro.
This bug affects all iOS 17 versions and has not been resolved to date.
I was online waiting for a rep from LifeLock and saw an OTP code redacted. Whoever has hacked my phone does this with all verifications for emails, passwords, etc. I don't know what to do. I don't know how to deal with this stuff. Someone please help me. My phone has been taken over by these people.
When I try to install packages through SPM, it doesn't find the Package.swift file, and with Pods I have problems with Foundation. Trying to install Firebase shows that many pods have issues with double quotes. I don't know if it's my machine, since everything is up to date, or something else, but I'm having a lot of problems and I haven't been able to get my first application working. I don't want to give up, and I would like some advice; since you have managed it, I would appreciate some help. I have followed tutorials and looked at cases of people with similar problems, but it always throws some error.
My machine is an M1 Mac.
On iOS 18, we have found that some pages will consistently crash when displayed. It's puzzling that if I modify different parts of the code, this kind of crash does not occur on iOS 18. This has had a great impact on my development, and I don't know whether I can still use this API in the future. Can you help me with this dilemma? Thanks.
When my iPhone 15 Pro Max was upgraded to iOS 18.1.1, it could no longer connect to the hotspot of my IoT device (Android 5.1), while my iPhone 12 (iOS 18.1.1) has no issues.
Both the 15 Pro Max and the iPhone 12 work well with another device (Android 10.0).
What I have already tried:
1. Forget Network (and re-add the desired Wi-Fi network)
2. Reset Network Settings (under Settings > General > Transfer or Reset iPhone)
3. Turn Airplane Mode on, then off after a few seconds
4. Restart the iPhone
5. Reset all settings
6. Disable VPN
7. Turn off the Settings option that rotates my Wi-Fi address
Has anyone had similar issues?
Hi everyone,
I am working on a 3D reconstruction project.
Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone.
One consideration is that I want this app to run even if there is no LiDAR, as long as there are at least two cameras on the back. If there is a LiDAR, that is something I have considered working on later in the course of the project.
I am using an AVCaptureSession with the two back cameras as AVCaptureDevices:
builtInWideAngleCamera
builtInUltraWideCamera
The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g., builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:
Extrinsic Matrix (Ultra-Wide to Wide):
[0.9999968, 0.0008149305, -0.0023960583, 0.0]
[-0.0008256607, 0.9999896, -0.0044807075, 0.0]
[0.002392382, 0.0044826716, 0.99998707, 0.0]
[-14.277955, -8.135408e-10, -0.3359985, 0.0]
The extrinsic matrix is of the form [R | t]. The rotational part seems correct, but the translational vector is ALL ZEROS, which would suggest the cameras physically overlap; also, the last element is not 1 (homogeneous coordinates).
Has anyone encountered this 'issue' before?
Is there a flaw in my reasoning or something I might be missing?
Any comments are very much appreciated.
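For reference, this is roughly how I split the matrix; a minimal sketch assuming the extrinsics come from AVCameraCalibrationData.extrinsicMatrix (my understanding is that it is a 3-row by 4-column [R | t] matrix that simd stores as four 3-element columns, with the translation in millimeters, but please correct me if that is wrong):

    import AVFoundation
    import simd

    // Sketch: split the documented 3x4 [R | t] extrinsic matrix into rotation and translation.
    func splitExtrinsics(_ calibration: AVCameraCalibrationData) -> (rotation: simd_float3x3, translation: SIMD3<Float>) {
        let m = calibration.extrinsicMatrix   // matrix_float4x3, stored column-wise
        let rotation = simd_float3x3(columns: (m.columns.0, m.columns.1, m.columns.2))
        let translation = m.columns.3         // my understanding: millimeters
        return (rotation, translation)
    }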
I am building a video conferencing app using LiveKit in Flutter and want to implement Picture-in-Picture (PiP) mode on iOS. My goal is to display a view showing the speaker's initials or avatar during PiP mode. I successfully implemented this functionality on Android but am struggling to achieve it on iOS.
I am using a MethodChannel to communicate with the native iOS code. Here's the Flutter-side code:
import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

class PipController {
  static const _channel = MethodChannel('pip_channel');

  static Future<void> startPiP() async {
    try {
      await _channel.invokeMethod('enterPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error starting PiP: $e");
      }
    }
  }

  static Future<void> stopPiP() async {
    try {
      await _channel.invokeMethod('exitPiP');
    } catch (e) {
      if (kDebugMode) {
        print("Error stopping PiP: $e");
      }
    }
  }
}
On the iOS side, I am using AVPictureInPictureController. Since it requires an AVPlayerLayer, I had to include a dummy video URL to initialize the AVPlayer. However, this results in the dummy video’s audio playing in the background, but no view is displayed in PiP mode.
Here’s my iOS code:
import Flutter
import UIKit
import AVKit

@main
@objc class AppDelegate: FlutterAppDelegate {
    var pipController: AVPictureInPictureController?
    var playerLayer: AVPlayerLayer?

    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
        let pipChannel = FlutterMethodChannel(name: "pip_channel", binaryMessenger: controller.binaryMessenger)

        pipChannel.setMethodCallHandler { [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) in
            if call.method == "enterPiP" {
                self?.startPictureInPicture(result: result)
            } else if call.method == "exitPiP" {
                self?.stopPictureInPicture(result: result)
            } else {
                result(FlutterMethodNotImplemented)
            }
        }

        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    private func startPictureInPicture(result: @escaping FlutterResult) {
        guard AVPictureInPictureController.isPictureInPictureSupported() else {
            result(FlutterError(code: "UNSUPPORTED", message: "PiP is not supported on this device.", details: nil))
            return
        }

        // Set up the AVPlayer
        let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
        let playerLayer = AVPlayerLayer(player: player)
        self.playerLayer = playerLayer

        // Create a dummy view
        let dummyView = UIView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
        dummyView.isHidden = true
        window?.rootViewController?.view.addSubview(dummyView)
        dummyView.layer.addSublayer(playerLayer)
        playerLayer.frame = dummyView.bounds

        // Initialize PiP Controller
        pipController = AVPictureInPictureController(playerLayer: playerLayer)
        pipController?.delegate = self

        // Start playback and PiP
        player.play()
        pipController?.startPictureInPicture()
        print("Picture-in-Picture started")
        result(nil)
    }

    private func stopPictureInPicture(result: @escaping FlutterResult) {
        guard let pipController = pipController, pipController.isPictureInPictureActive else {
            result(FlutterError(code: "NOT_ACTIVE", message: "PiP is not currently active.", details: nil))
            return
        }
        pipController.stopPictureInPicture()
        playerLayer = nil
        self.pipController = nil
        result(nil)
    }
}

extension AppDelegate: AVPictureInPictureControllerDelegate {
    func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP started")
    }

    func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
        print("PiP stopped")
    }
}
Questions:
How can I implement PiP mode on iOS without using a video URL (or AVPlayerLayer)?
Is there a way to display a custom UIView (like a speaker’s initials or an avatar) in PiP mode instead of requiring a video?
Why does PiP not display any view, even though the dummy video URL is playing in the background?
I am new to iOS development and would greatly appreciate any guidance or alternative approaches to achieve this functionality. Thank you!
Dear Apple Support Team,
I hope this message finds you well.
I am writing to seek clarification on a specific aspect of Wi-Fi connectivity related to the iPhone 16 series running iOS 18.0. We have encountered an issue where the iPhone 16 series devices fail to connect to Wi-Fi networks, and this failure subsequently affects other devices running iOS 18.0.
To better understand the root cause of this issue, I would like to inquire about the differences in the "authentication and encryption" processes between the iPhone 16 series running iOS 18.0 and other devices running iOS 18.0. Specifically, are there any changes or updates in the Wi-Fi authentication and encryption mechanisms that are unique to the iPhone 16 series?
Understanding these differences will greatly assist us in diagnosing and resolving the connectivity issues we are experiencing.
Thank you for your assistance. I look forward to your prompt response.
Best regards,
WJohn