What approach should be used instead in iOS 14?
Something about the Picker label seems to have changed in iPadOS 15. The following View uses the version of Picker's init() marked for deprecation, but raising the deployment target from 14.5 to 15.0 and switching to the new initializer doesn't change the behavior. The first GIF shows the expected behavior we've been seeing until 15.0. Notice in the second GIF that the label is hidden and that the frame modifier is ignored. An extra Text view has been added to show that the binding is working as expected.
iPadOS 14.5 target / Xcode 12.5.1
iPadOS 15.0 target / Xcode 13.0 Beta 5
struct OuterView: View {
    // @StateObject (not @State) is the correct wrapper for an
    // ObservableObject the view owns and observes.
    @StateObject var colorViewModel = ColorViewModel()

    var body: some View {
        ColorView(colorViewModel: colorViewModel)
    }
}

class ColorViewModel: ObservableObject {
    @Published var colors = ["yellow", "green", "blue"]
    @Published var selectedColor = "Select a color"
}

struct ColorView: View {
    @ObservedObject var colorViewModel: ColorViewModel

    var body: some View {
        VStack {
            Picker(selection: $colorViewModel.selectedColor,
                   label: Text(colorViewModel.selectedColor)
                       .frame(maxWidth: .infinity, alignment: .leading)
                       .padding()
            ) {
                ForEach(colorViewModel.colors, id: \.self) { color in
                    Text(color)
                }
            }
            .pickerStyle(MenuPickerStyle())
            .border(Color.black, width: 2)

            // Extra Text view to confirm the binding updates.
            Text(colorViewModel.selectedColor)
        }
        .padding()
    }
}

struct ColorView_Previews: PreviewProvider {
    static var previews: some View {
        OuterView()
    }
}
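One workaround that is sometimes suggested for this iPadOS 15 behavior (an assumption on my part, not a confirmed fix) is to stop relying on the Picker's own label and instead wrap the Picker in a Menu, supplying the styled label to the Menu. A sketch using the same view model as above:

```swift
// Hedged workaround sketch: MenuPickerStyle appears to ignore the
// Picker label on iPadOS 15, so the label is moved to an enclosing Menu.
Menu {
    Picker(selection: $colorViewModel.selectedColor, label: EmptyView()) {
        ForEach(colorViewModel.colors, id: \.self) { color in
            Text(color)
        }
    }
} label: {
    Text(colorViewModel.selectedColor)
        .frame(maxWidth: .infinity, alignment: .leading)
        .padding()
}
.border(Color.black, width: 2)
```

This keeps the selection binding intact while giving full control over the visible label and its frame.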
In the function processLastArData(), a command buffer is committed and the output of the last MPS kernel is read immediately, without calling waitUntilCompleted() on the buffer. What am I missing?
https://developer.apple.com/documentation/arkit/environmental_analysis/displaying_a_point_cloud_using_scene_depth?language=objc
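For context, the usual ways to make an MPS result safe to read are to block on the command buffer or to read it in a completion handler. A minimal sketch (the function and texture names here are hypothetical, not taken from the sample):

```swift
import Metal
import MetalPerformanceShaders

func runAndRead(queue: MTLCommandQueue,
                kernel: MPSUnaryImageKernel,
                source: MTLTexture, destination: MTLTexture) {
    guard let commandBuffer = queue.makeCommandBuffer() else { return }
    kernel.encode(commandBuffer: commandBuffer,
                  sourceTexture: source,
                  destinationTexture: destination)

    // Option 1: block the current thread until the GPU finishes.
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    // `destination` is now safe to read.

    // Option 2 (instead of waiting): register a handler BEFORE commit()
    // and read `destination` asynchronously when the GPU is done.
    // commandBuffer.addCompletedHandler { _ in
    //     // `destination` is safe to read here.
    // }
    // commandBuffer.commit()
}
```

If the sample reads the output with neither of these, it may be relying on a later synchronization point that isn't obvious from processLastArData() itself.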
M1 iPad Pro with iPadOS 16 Beta 3
Xcode 14.0 beta 3
In a freshly created Xcode 14 beta 2 app using the Augmented Reality App template with Content Technology set to Metal, ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing returns a 60 fps, 1920 x 1440 video format.
As a result, session.captureHighResolutionFrame fails to deliver a high-resolution frame.
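For reference, the capture flow being described looks roughly like this (the helper function is hypothetical; the APIs are the iOS 16 beta ones named above):

```swift
import ARKit

// Hypothetical helper showing the intended high-resolution capture flow.
func configureAndCapture(session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if let format = ARWorldTrackingConfiguration
        .recommendedVideoFormatForHighResolutionFrameCapturing {
        // On the M1 iPad Pro under iPadOS 16 beta 3 this reportedly
        // returns only 1920 x 1440 @ 60 fps.
        configuration.videoFormat = format
    }
    session.run(configuration)

    session.captureHighResolutionFrame { frame, error in
        if let frame = frame {
            let width = CVPixelBufferGetWidth(frame.capturedImage)
            print("Captured frame width: \(width) px")
        } else if let error = error {
            print("High-res capture failed: \(error)")
        }
    }
}
```

If the recommended format is no larger than the standard capture format, it would explain why the completion handler never receives a genuinely high-resolution frame.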
Seeing a flurry of these in Xcode 15 Beta 8 while debugging an ARKit app:
<<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)