Posts

Too many empty "required" UIView.init(coder:) methods
Hi, I have a lot of UIViews where the compiler forces me to add an init(coder:) initializer, like this:

    class FooView: UIView /* or a UIView subclass */ {
        ...
        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
        ...
    }

It claims it's required, but my program runs fine without it, and I do not create the views from an archive. This makes me wonder if something is wrong with the design of the library, or with the concept of a 'required' initializer. What do people think? Does it make sense, or is this a wart? If so, can it be fixed?

Rob
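One way to contain the boilerplate (a sketch, not from the post): Swift lets a subclass inherit all initializers, including required ones, when it declares no designated initializers of its own, so the stub can live once in a shared base class. CodeOnlyView is an invented name.

    import UIKit

    // Hypothetical base class that absorbs the archive initializer once.
    class CodeOnlyView: UIView {
        override init(frame: CGRect) {
            super.init(frame: frame)
        }

        @available(*, unavailable, message: "not loaded from a nib or storyboard")
        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }
    }

    // A subclass with no designated initializers of its own inherits
    // required init?(coder:) automatically; no stub needed here.
    class FooView: CodeOnlyView {
        // ...
    }

The @available(*, unavailable) marking also makes accidental storyboard use a compile-time error at call sites instead of only a runtime crash.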
Replies: 5 · Boosts: 1 · Views: 2.5k · Mar ’17
Custom UIView over specific text in WKWebView?
Hi, is there a way to embed a UIView inside the HTML laid out by WKWebView? Or to get the exact screen coordinates of an element, so that an overlaid sibling UIView could appear to be part of the HTML? I've got some rich text that I'm currently rendering with the WKWebView, and I'd like to create an animation around some of the text, using a UIView, not JavaScript/HTML. I may also try UITextView, since the text is pretty simple.

Rob
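For the coordinates route, a sketch (my own; the element id "highlight-me" is invented): ask the page for the element's getBoundingClientRect() and place the overlay there.

    import UIKit
    import WebKit

    // Sketch: find an element's viewport rect via JavaScript and position
    // an overlay UIView on top of it. Assumes default zoom, where CSS
    // pixels map 1:1 to WKWebView points.
    func overlay(_ overlayView: UIView, on webView: WKWebView) {
        let js = """
        (function() {
            var r = document.getElementById('highlight-me').getBoundingClientRect();
            return [r.left, r.top, r.width, r.height];
        })()
        """
        webView.evaluateJavaScript(js) { result, _ in
            guard let box = result as? [Double], box.count == 4 else { return }
            overlayView.frame = CGRect(x: box[0], y: box[1],
                                       width: box[2], height: box[3])
            webView.addSubview(overlayView)
        }
    }

The overlay would also need repositioning whenever the page scrolls or the layout changes, so this really only approximates "part of the HTML".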
Replies: 1 · Boosts: 0 · Views: 679 · Mar ’18
HelloTriangle in Swift: error about depth attachment pixel format
Hi, I'm trying to port this to Swift: https://developer.apple.com/documentation/metal/hello_triangle

To get it to run I had to add the following line to the renderer class. Any idea why it's not needed in the Objective-C version?

    pipelineStateDescriptor.depthAttachmentPixelFormat = metalView.depthStencilPixelFormat

Without it I get an error:

    -[MTLDebugRenderCommandEncoder validateFramebufferWithRenderPipelineState:]:1232:
    failed assertion `For depth attachment, the render pipeline's pixelFormat
    (MTLPixelFormatInvalid) does not match the framebuffer's pixelFormat
    (MTLPixelFormatDepth32Float).'

Rob
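For context, a sketch of where that line fits when building the pipeline state from an MTKView. The shader function names are placeholders; metalView is the view from the post. The underlying rule is that an MTKView with a non-invalid depthStencilPixelFormat attaches a depth buffer to its render pass, so the pipeline has to declare a matching format.

    import MetalKit

    func makePipelineState(device: MTLDevice, metalView: MTKView,
                           library: MTLLibrary) throws -> MTLRenderPipelineState {
        let descriptor = MTLRenderPipelineDescriptor()
        // "vertexShader" / "fragmentShader" are assumed names.
        descriptor.vertexFunction = library.makeFunction(name: "vertexShader")
        descriptor.fragmentFunction = library.makeFunction(name: "fragmentShader")
        descriptor.colorAttachments[0].pixelFormat = metalView.colorPixelFormat
        // Must match the depth texture MTKView attaches to the render pass.
        descriptor.depthAttachmentPixelFormat = metalView.depthStencilPixelFormat
        return try device.makeRenderPipelineState(descriptor: descriptor)
    }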
Replies: 3 · Boosts: 0 · Views: 5.3k · Mar ’18
How to save and load MTLTexture, getting back same RGB values?
Hi, I need to save and load Metal textures to a file. Example code is below. I'm noticing the RGB values change as the texture gets saved and reloaded:

    metal texture pixel, RGBA: 42, 79, 12, 95
    after save and reload:     66, 88, 37, 95

I thought it might be a colorspace issue, but the colorspaces I've tried all had the same problem. genericRGBLinear got close, but there's got to be a way to save the RGB data and get it back exactly, no?

thanks,
Rob

Code:

    // saving...
    let ciCtx = CIContext()
    let ciImage = CIImage(mtlTexture: metalTexture, options: [:])!
    // [ ... transform to flip y-coordinate ... ]
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let cgImage = ciCtx.createCGImage(ciImage, from: fullRect,
                                      format: kCIFormatRGBA8,
                                      colorSpace: colorSpace)!
    let imageDest = CGImageDestinationCreateWithData(mData, kUTTypePNG, 1, nil)!
    CGImageDestinationAddImage(imageDest, cgImage, nil)
    CGImageDestinationFinalize(imageDest)

    // loading...
    let src = CGImageSourceCreateWithData(imgData, nil)!
    let img = CGImageSourceCreateImageAtIndex(src, 0, nil)!
    let loader = MTKTextureLoader(device: self.metalDevice)
    let texture = try! loader.newTexture(cgImage: img, options: [:])
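If exact bytes matter more than having a standard image file, one sketch (mine, not from the post) skips CoreImage and PNG entirely, since CIContext can apply color management during encode, and round-trips the raw pixel data instead:

    import Metal

    // Save: copy the texture's raw bytes into a Data blob. Assumes a
    // 4-byte-per-pixel format (.rgba8Unorm / .bgra8Unorm) and a storage
    // mode the CPU can read (a .private texture needs a blit first).
    func rawData(from texture: MTLTexture) -> Data {
        let bytesPerRow = texture.width * 4
        var bytes = [UInt8](repeating: 0, count: bytesPerRow * texture.height)
        texture.getBytes(&bytes, bytesPerRow: bytesPerRow,
                         from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                         mipmapLevel: 0)
        return Data(bytes)
    }

    // Load: make a texture of the same size/format and copy the bytes back.
    func texture(from data: Data, width: Int, height: Int,
                 device: MTLDevice) -> MTLTexture? {
        let desc = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .rgba8Unorm, width: width, height: height,
            mipmapped: false)
        guard let tex = device.makeTexture(descriptor: desc) else { return nil }
        data.withUnsafeBytes { buf in
            tex.replace(region: MTLRegionMake2D(0, 0, width, height),
                        mipmapLevel: 0, withBytes: buf.baseAddress!,
                        bytesPerRow: width * 4)
        }
        return tex
    }

You'd need to store width/height (and ideally the pixel format) alongside the blob, but the bytes come back bit-identical.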
Replies: 8 · Boosts: 1 · Views: 6.2k · Apr ’18
Upgrading app's UIDocument format from Data to FileWrapper?
Hi, I had a document-based iOS app working, but want to change it so it saves to a package. Seems like it's better when big chunks of a file may not be changing.

In Xcode, under Target > Info > Exported UTI > Conforms To, I had "public.data, public.content". If I change that to "com.apple.package", then I can't open my old files to upgrade them. But if I *add* "com.apple.package", then the app opens both kinds, as desired. I wonder if having it conform to all three of those types is going to cause other problems.

Rob
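On the reading/writing side, a minimal sketch of what the UIDocument overrides might look like for a package that can still ingest the old flat format (the file names and properties here are invented):

    import UIKit

    class MyDocument: UIDocument {
        var metadata = Data()   // placeholder contents
        var media = Data()

        // Return a directory file wrapper; with packages, unchanged
        // chunks can keep their existing wrappers and skip rewriting.
        override func contents(forType typeName: String) throws -> Any {
            let files = [
                "metadata.json": FileWrapper(regularFileWithContents: metadata),
                "media.bin": FileWrapper(regularFileWithContents: media),
            ]
            return FileWrapper(directoryWithFileWrappers: files)
        }

        override func load(fromContents contents: Any,
                           ofType typeName: String?) throws {
            if let dir = contents as? FileWrapper {
                // New package format.
                metadata = dir.fileWrappers?["metadata.json"]?.regularFileContents ?? Data()
                media = dir.fileWrappers?["media.bin"]?.regularFileContents ?? Data()
            } else if let data = contents as? Data {
                // Legacy flat-file format: migrate on load.
                metadata = data
            }
        }
    }

Since load(fromContents:ofType:) hands you Any either way, branching on FileWrapper vs. Data is what lets the multi-conformance UTI open both generations of files.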
Replies: 3 · Boosts: 0 · Views: 1.3k · Jul ’18
iCloud appears to have destroyed my local Documents and Desktop
[edited down from my previous long-winded post]

I'm using the latest Catalina beta (7). I tried to turn on “Documents and Desktop” under System Preferences > Internet Accounts > iCloud options. It seems to have failed, and the checkbox does not stay on. It also seems to have deleted all my local files from ~/Documents and ~/Desktop, without actually moving them to iCloud. The files in iCloud Drive have cloud icons and can’t be opened.

Are the files in some hidden staging area? I can’t believe the system would delete them for real without successfully uploading them to iCloud first. If not, I hope my Time Machine backup works…
Replies: 5 · Boosts: 0 · Views: 2.7k · Aug ’19
UI Test localization screenshots fail - they're between screens
I'd like to use the new localization screenshots and test plans feature to take screenshots in different languages, for the App Store. I end up with images that are half one screen and half another, like it's some kind of timing issue. My test code is below. Is it missing something that would give it the right timing on the screenshots?

I wrote a UI test and set "Localization Screenshots" to "On" in the test plan's settings. The UI test walks through a few screens, and the resulting test report has a few image files attached labeled "Localization screenshot". Some images are split so that the left side is one view controller and the right side is another, like it captured a push navigation transition. Another image has two overlaid screens, each half faded.

I'm running in the simulator. My test code looks like:

    func testTakeScreenshots() {
        let app = XCUIApplication()
        app.launch()

        // At workouts page
        app.tables["workouts"].cells.element(boundBy: 0).tap()

        // At first workout
        app.navigationBars.buttons["edit"].tap()

        // At workout edit screen, click first exercise
        app.tables.cells.element(boundBy: 0).tap()
        ...
    }
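A sketch of the same test with explicit waits added (my guess at the cause, not a confirmed fix): wait for an element on the destination screen before each tap, so no screenshot lands mid-transition.

    import XCTest

    class ScreenshotTests: XCTestCase {
        func testTakeScreenshots() {
            let app = XCUIApplication()
            app.launch()

            let workouts = app.tables["workouts"]
            XCTAssertTrue(workouts.waitForExistence(timeout: 5))
            workouts.cells.element(boundBy: 0).tap()

            // Waiting for the pushed screen's nav-bar button means the
            // push animation has finished before the next tap.
            let edit = app.navigationBars.buttons["edit"]
            XCTAssertTrue(edit.waitForExistence(timeout: 5))
            edit.tap()

            let firstExercise = app.tables.cells.element(boundBy: 0)
            XCTAssertTrue(firstExercise.waitForExistence(timeout: 5))
            firstExercise.tap()
        }
    }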
Replies: 1 · Boosts: 0 · Views: 1.2k · Sep ’19
iOS 13, iPad Pro now says hardware does not support read-write texture?
I have some code that used to run on my iPad Pro. Today I compiled it for iOS 13, with Xcode 11, and I get errors like this:

    validateComputeFunctionArguments:834: failed assertion
    `Compute Function(merge_layer): Shader uses texture(outTexture[1]) as
    read-write, but hardware does not support read-write texture of this
    pixel format.'

The pixel format is showing as MTLPixelFormatBGRA8Unorm. That's what I expected. The debugger says the device has no support for writeable textures:

    (lldb) p device.readWriteTextureSupport
    (MTLReadWriteTextureTier) $R25 = tierNone

Did some devices lose support for texture writing in iOS 13?

Rob
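As far as I can tell from the Metal feature-set tables (worth verifying against the current docs): read-write texture access is tiered, BGRA8Unorm is not a read-write format on any tier, tier 1 covers the 32-bit single-channel formats, and tier 2 adds RGBA8Unorm and friends. If that's right, the iOS 13 debug layer may simply be enforcing a rule that previously went unchecked. A sketch of a capability check:

    import Metal

    // Pick a pixel format the device can actually use for read-write
    // access, per the feature-set tables as I understand them.
    func readWriteFormat(for device: MTLDevice) -> MTLPixelFormat? {
        switch device.readWriteTextureSupport {
        case .tier2: return .rgba8Unorm   // tier 2 adds rgba8Unorm et al.
        case .tier1: return .r32Float     // tier 1: r32Float/r32Uint/r32Sint
        default:     return nil           // no read-write support at all:
                                          // fall back to separate read and
                                          // write textures in the kernel
        }
    }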
Replies: 12 · Boosts: 0 · Views: 4.7k · Oct ’19
Simple layout still fails
I'm surprised this simple code still doesn't work on iOS 13.3 / Xcode 11.3. On my iPhone SE it's almost all off screen. It's just two pieces of text, side by side, and two pickers, side by side. Anyone know a workaround?

    struct ContentView: View {
        @State var choice: Int = 10
        @State var choice2: Int = 10

        var body: some View {
            return VStack {
                HStack {
                    Text("Some text here")
                    Spacer()
                    Text("Foo baz")
                }
                HStack {
                    Picker(selection: self.$choice, label: Text("C1")) {
                        ForEach(0..<10) { n in Text("\(n) a").tag(n) }
                    }
                    Picker(selection: self.$choice2, label: Text("C2")) {
                        ForEach(0..<10) { n in Text("\(n) b").tag(n) }
                    }
                }
            }
        }
    }
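The workaround I've seen suggested most often (hedged: behavior varies by iOS version): give each picker half of the row explicitly and clip it, since the underlying UIPickerView otherwise insists on its full natural width.

    import SwiftUI

    struct ContentView: View {
        @State var choice: Int = 0
        @State var choice2: Int = 0

        var body: some View {
            VStack {
                HStack {
                    Text("Some text here")
                    Spacer()
                    Text("Foo baz")
                }
                HStack {
                    Picker(selection: $choice, label: Text("C1")) {
                        ForEach(0..<10) { n in Text("\(n) a").tag(n) }
                    }
                    // Constrain the wheel to its share of the row, then
                    // clip the overflow it draws outside that frame.
                    .frame(maxWidth: .infinity)
                    .clipped()

                    Picker(selection: $choice2, label: Text("C2")) {
                        ForEach(0..<10) { n in Text("\(n) b").tag(n) }
                    }
                    .frame(maxWidth: .infinity)
                    .clipped()
                }
            }
        }
    }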
Replies: 4 · Boosts: 1 · Views: 3.6k · Dec ’19
How does SwiftUI update if objectWillChange fires *before* change
I'm wondering how SwiftUI updates work in connection with ObservableObjects. If a SwiftUI View depends on an `ObservableObject`, the object's `objectWillChange` publisher fires, and SwiftUI learns about the change, *before* the change happens. At that point SwiftUI can't re-render the View, right? The new property values aren't there yet. So what does SwiftUI do? Does it schedule the change for later? That doesn't quite make sense either: how can it know when the object will be ready to be used in a new rendering of the UI?

~ Rob
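A small playground-style experiment (mine; just Combine and Foundation, no SwiftUI) that shows why a will-change signal is enough: the subscriber only marks work for later, and anything that runs on the next run-loop turn already sees the new value. As far as I can tell that's what SwiftUI does too: it marks the view dirty and re-reads `body` after the mutation has landed.

    import Combine
    import Foundation

    final class Model: ObservableObject {
        @Published var count = 0
    }

    let model = Model()
    var bag = Set<AnyCancellable>()

    model.objectWillChange.sink { _ in
        print("willChange, count is still \(model.count)")      // old value
        DispatchQueue.main.async {
            print("next run-loop turn, count is \(model.count)") // new value
        }
    }.store(in: &bag)

    model.count = 1
    // Prints (given a running main run loop):
    //   willChange, count is still 0
    //   next run-loop turn, count is 1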
Replies: 3 · Boosts: 0 · Views: 4.7k · Jan ’20
Header docs have more than Xcode doc viewer - a bug?
As an example, I wanted to know how to init a `Slider`. I know two ways of finding documentation, but they give different results. Is that a bug?

1. Go to Xcode's Help menu > Developer Documentation, and search for "Slider". The docs for `init` do not describe the parameters.
2. Control-click on `Slider` in the code editor window, then "Jump to Definition", and I see the `Slider` struct and some extensions. There *are* documentation comments describing the parameters to `init`, like the `onEditingChanged` parameter.

(Xcode version 11.3.1 (11C504))
Replies: 2 · Boosts: 0 · Views: 440 · Jan ’20
popover, smaller size?
I'm showing a popover with a short list of items to choose from. I'd like it to be pretty small, but I can't seem to control the size. It's too big, with a lot of extra space around its content. Anyone know how to control the popover size?

    VStack {
        Button(action: { self.showPickerPopup = true }) {
            Text(pickerTypeButtonText)
        }.popover(isPresented: $showPickerPopup) {
            List(self.choosablePickerTypes) { ptype in
                Text(ptype.displayName).onTapGesture {
                    print("picked \(ptype)")
                    self.showPickerPopup = false
                }
            }.frame(minWidth: 0.0, minHeight: 0.0)
             .frame(width: 200, height: 300)
            // ^^^-- these frame calls are ineffective
        }
        Spacer()
    }
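A fallback sketch I'd try (the helper is hypothetical; the APIs are standard UIKit): present the content from UIKit instead, where the popover's size is whatever preferredContentSize says.

    import SwiftUI
    import UIKit

    // Present a SwiftUI view in a real UIPopoverPresentationController so
    // an explicit size can be set. `sourceView` anchors the popover arrow.
    func presentPopover<Content: View>(_ content: Content,
                                       from sourceView: UIView,
                                       in presenter: UIViewController) {
        let host = UIHostingController(rootView: content)
        host.modalPresentationStyle = .popover
        host.preferredContentSize = CGSize(width: 200, height: 300)
        if let pop = host.popoverPresentationController {
            pop.sourceView = sourceView
            pop.sourceRect = sourceView.bounds
        }
        presenter.present(host, animated: true)
    }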
Replies: 0 · Boosts: 0 · Views: 749 · Jan ’20
Can't use protocols with SwiftUI models?
I've been using protocols to help model a hierarchy of different object types. As I try to convert my app to use SwiftUI, I'm finding that protocols don't work with the ObservableObject that you need for SwiftUI models. I wonder if there are some techniques to get around this, or if people are just giving up on "protocol oriented programming" when describing their SwiftUI models? Example code is below. The main problem: it seems impossible to have a View with a model of protocol `P1` that conditionally shows a subview with more properties if that model also conforms to protocol `P2`.

For example, I'm creating a drawing/painting app, so I have "Markers" which draw on the canvas. Markers have different properties like color, size, shape, and the ability to work with gradients. Modeling these properties with protocols seems ideal; you're not restricted to a single-inheritance class hierarchy. But there is no way to test and down-cast the protocol...

    protocol Marker: ObservableObject {
        var name: String { get set }
    }

    protocol MarkerWithSize: Marker {
        var size: Float { get set }
    }

    class BasicMarker: MarkerWithSize {
        init() {}
        @Published var name: String = "test"
        @Published var size: Float = 1.0
    }

    struct ContentView<MT: Marker>: View {
        @ObservedObject var marker: MT

        var body: some View {
            VStack {
                Text("Marker name: \(marker.name)")
                if marker is MarkerWithSize {
                    // This next line fails.
                    // Error: Protocol type 'MarkerWithSize' cannot conform to
                    // 'MarkerWithSize' because only concrete types can conform
                    // to protocols
                    MarkerWithSizeSection(marker: marker as! MarkerWithSize)
                }
            }
        }
    }

    struct MarkerWithSizeSection<M: MarkerWithSize>: View {
        @ObservedObject var marker: M

        var body: some View {
            VStack {
                Text("Size: \(marker.size)")
                Slider(value: $marker.size, in: 1...50)
            }
        }
    }

Thoughts?
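One escape hatch (a sketch adapted to this example, sensitive to Swift-version quirks, and all the Sized* names are invented): do the runtime check against a parallel protocol that has no associated-type requirements, and give the subview a concrete, type-erased ObservableObject.

    import SwiftUI
    import Combine

    // A class-bound protocol with no Self/associated-type requirements,
    // so `is` / `as?` checks compile where casts to MarkerWithSize do not.
    protocol SizedMarker: AnyObject {
        var size: Float { get set }
        var sizeWillChange: AnyPublisher<Void, Never> { get }
    }

    extension BasicMarker: SizedMarker {
        var sizeWillChange: AnyPublisher<Void, Never> {
            objectWillChange.eraseToAnyPublisher()
        }
    }

    // Concrete ObservableObject adapter the section view can observe.
    final class SizedMarkerModel: ObservableObject {
        private let wrapped: SizedMarker
        private var cancellable: AnyCancellable?

        init(_ wrapped: SizedMarker) {
            self.wrapped = wrapped
            // Re-publish the wrapped marker's changes as our own.
            cancellable = wrapped.sizeWillChange.sink { [weak self] in
                self?.objectWillChange.send()
            }
        }

        var size: Float {
            get { wrapped.size }
            set { wrapped.size = newValue }
        }
    }

    struct SizeSection: View {
        @ObservedObject var model: SizedMarkerModel

        var body: some View {
            VStack {
                Text("Size: \(model.size)")
                Slider(value: $model.size, in: 1...50)
            }
        }
    }

ContentView's body can then keep the post's runtime check, but against the castable protocol: if marker is SizedMarker { SizeSection(model: SizedMarkerModel(marker as! SizedMarker)) }. The adapter gets rebuilt each render, which is tolerable here because all real state lives in the wrapped marker.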
Replies: 3 · Boosts: 1 · Views: 8.8k · Feb ’20
onTapGesture is working for finger, but not Pencil
Here's a weird one. Do I need to do something different to handle Pencil taps, or is this a bug? The following code shows a standard `Button` and a few custom Views called `SelectButtons`. The `Button` registers a tap with a Pencil or finger, as expected. The `SelectButtons` only work with my finger, on my iPad Pro; Pencil taps don't register.

    struct ContentView: View {
        let nums = [1, 2, 3]
        @State var selected: Int = 1

        var body: some View {
            VStack {
                Button(action: { print("Test button tapped") }) {
                    Text("Test button")
                }
                ForEach(nums, id: \.self) { num in
                    HStack {
                        SelectButton(num: num, current: self.$selected)
                        Text("Number \(num)")
                    }
                }
            }
        }
    }

    struct SelectButton: View {
        let num: Int
        @Binding var current: Int

        var isSelected: Bool { current == num }

        var body: some View {
            ZStack {
                Circle().inset(by: 5).stroke()
                Circle().inset(by: 7).fill(isSelected ? Color.blue : Color.clear)
            }.frame(width: 40, height: 40)
            .onTapGesture {
                print("SelectButton.onTapGesture \(self.num)")
                self.current = self.num
            }
        }
    }
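A workaround sketch, on the theory (unconfirmed) that `Button` has deeper touch handling than a bare tap gesture: make SelectButton an actual Button with a plain style, keeping the same drawing.

    import SwiftUI

    struct SelectButton: View {
        let num: Int
        @Binding var current: Int

        var isSelected: Bool { current == num }

        var body: some View {
            Button(action: { self.current = self.num }) {
                ZStack {
                    Circle().inset(by: 5).stroke()
                    Circle().inset(by: 7).fill(isSelected ? Color.blue : Color.clear)
                }
                .frame(width: 40, height: 40)
                // Make the whole circle hit-testable, not just the
                // stroked and filled pixels.
                .contentShape(Circle())
            }
            .buttonStyle(PlainButtonStyle())   // suppress default chrome
        }
    }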
Replies: 2 · Boosts: 0 · Views: 870 · Feb ’20
Where to put "adapter" model object, created by View?
Maybe this is too abstract or vague, but I'm wondering if the pattern is familiar to anyone. I have a color gradient editor view with a color picker subview that edits the color of the currently selected gradient stop. The design that makes sense to me is to have the gradient editor create a model object for its color picker: a sort of adapter model that presents the current stop to that picker. (Color pickers take an ObservableObject model that conforms to a certain protocol.) But I can't seem to make this work in SwiftUI. It seems you can only have two kinds of model/state: global objects passed in to the root of the View hierarchy as @ObservedObject or @EnvironmentObject, or simple value types wrapped in @State.
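For what it's worth, a sketch of one arrangement that stays inside those rules (all names invented): let the editor's own model object create and own the adapter, so no View ever has to construct model state itself, and its identity stays stable across view updates.

    import SwiftUI
    import Combine

    protocol ColorPickerModel: ObservableObject {
        var color: Color { get set }
    }

    // Adapter: presents the currently selected stop as a picker model.
    final class SelectedStopColorModel: ColorPickerModel {
        // unowned: the editor model owns (and outlives) this adapter,
        // so a strong back-reference would create a retain cycle.
        private unowned let editor: GradientEditorModel
        init(editor: GradientEditorModel) { self.editor = editor }

        var color: Color {
            get { editor.stops[editor.selectedStop].color }
            set {
                objectWillChange.send()
                editor.stops[editor.selectedStop].color = newValue
            }
        }
    }

    final class GradientEditorModel: ObservableObject {
        struct Stop { var location: Double; var color: Color }
        @Published var stops = [Stop(location: 0, color: .red)]
        @Published var selectedStop = 0

        // Owned by the model, not created inside a View's body.
        lazy var pickerModel = SelectedStopColorModel(editor: self)
    }

The gradient editor view then just hands model.pickerModel to the picker subview as its @ObservedObject, so from SwiftUI's point of view it is still an object passed in from above.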
Replies: 0 · Boosts: 0 · Views: 282 · Feb ’20