
13b2 ignoring CF_RETURNS_NOT_RETAINED?
I just downloaded and tried Xcode 13b2 on our iOS project. I've been using b1 with a surprising amount of success, switching back to 12.5.1 for builds. But 13b2 seems to have an issue. In our AVCapture code, you get handed an AVCapturePhoto, which has accessors declared CF_RETURNS_NOT_RETAINED, which Swift has been importing as Unmanaged. If you try to call .takeUnretainedValue() on one of these, the compiler now complains that CGImage has no such member; it appears to be ignoring the Obj-C annotation. I'm also unable to view the generated Swift interface for that Obj-C header: I get "Couldn't generate Swift Representation" followed by "Error (from SourceKit): Could not load the stdlib module". Anyone else run into this? I filed FB9211460 about it.
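For concreteness, here's a minimal sketch of the call pattern in question, using cgImageRepresentation() as one such accessor (assuming the photo arrives via an AVCapturePhotoCaptureDelegate; the commented line is the form that compiled under 12.5.1):

```swift
import AVFoundation

func cgImage(from photo: AVCapturePhoto) -> CGImage? {
    // Xcode 12.5 SDK: cgImageRepresentation() imports as Unmanaged<CGImage>?,
    // so this is the call that compiled there:
    //     return photo.cgImageRepresentation()?.takeUnretainedValue()

    // Xcode 13b2 SDK: the same accessor appears to import as CGImage? directly,
    // which is why .takeUnretainedValue() no longer compiles:
    return photo.cgImageRepresentation()
}
```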
Replies: 1 · Boosts: 0 · Views: 1.1k · Jun ’21
Xcode 13 templates don't add built products to navigator
I created a new macOS app project in Xcode 13 beta (SwiftUI, unit tests, no Core Data) and noticed that it doesn't add the traditional "Products/MyApp.app" group and file to the navigator, and I can't figure out how to add them via the UI. At least one solution online edits the pbxproj file directly; I'd rather not do that. Is there a sanctioned way to add this? Is not having it a bug?
Replies: 0 · Boosts: 0 · Views: 540 · Jun ’21
SwiftUI previews fail with "'main' attribute cannot be used in a module that contains top-level code"
As in the locked question here - https://developer.apple.com/forums/thread/674534, I'm constantly running into this error: "Compiling failed: 'main' attribute cannot be used in a module that contains top-level code". Once it starts, it's not clear how to stop it. No changes were made to my AppDelegate (it's a mostly-UIKit app that I'm adding some SwiftUI views to). It compiles just fine with a regular build, but the SwiftUI preview can't build it. This is proving to be a real hindrance. I can sometimes clear the condition by cleaning the build and test results and relaunching Xcode, but not always. I filed FB9104575 and included the diagnostics.
Replies: 0 · Boosts: 0 · Views: 539 · May ’21
Coalescing @Published changes?
I've got the following code that updates a @Published var messages: OrderedSet<Message> property:

```swift
public func add(messages inMsgs: [IncomingMessage]) {
    for msg in inMsgs {
        let msg = Message(fromIncoming: msg, user: user)
        self.messages.append(msg)
    }
    self.messages.sort()
}
```

In my SwiftUI view, however, .onChange(of: self.stream.messages) gets called three times each time a single message is added. I tried operating on a local copy of self.messages and then just setting self.messages = local, but that didn't change anything. Maybe the issue is on the SwiftUI side? In any case, how are published updates to a property coalesced?
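For reference, a rough sketch of the two coalescing approaches I've been experimenting with. The array-backed stand-in types and the 16 ms debounce interval are assumptions for illustration, not our real model:

```swift
import Combine
import SwiftUI

final class MessageStream: ObservableObject {
    @Published private(set) var messages: [String] = []   // simplified stand-in for Message

    // Approach 1: build the whole batch first, then assign once, so a single
    // objectWillChange emission covers both the appends and the sort.
    func add(_ incoming: [String]) {
        var updated = messages
        updated.append(contentsOf: incoming)
        updated.sort()
        messages = updated
    }
}

struct MessagesView: View {
    @ObservedObject var stream: MessageStream

    var body: some View {
        List(stream.messages, id: \.self) { Text($0) }
            // Approach 2: coalesce on the view side by debouncing the publisher
            // instead of reacting to every intermediate emission.
            .onReceive(stream.$messages.debounce(for: .milliseconds(16), scheduler: RunLoop.main)) { msgs in
                print("coalesced update, \(msgs.count) messages")
            }
    }
}
```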
Replies: 4 · Boosts: 0 · Views: 1.3k · May ’21
SwiftUI macOS app show new document on launch
As of 11.3, DocumentGroup defaults to showing the Open panel on launch. From the release notes: "DocumentGroup apps now show an Open panel on launch, even when iCloud isn’t in use. (66446310)" Apparently not showing it was previously considered a bug. Thing is, I don't like this behavior and don't want it, especially while I'm working on my app; I want it to automatically create a new document. Is there any way to set that?
Replies: 0 · Boosts: 0 · Views: 410 · Apr ’21
SwiftUI macOS: tracking the mouse in a view?
I've got a macOS SwiftUI app that displays an image, currently using Image. I need to display information about the pixel under the mouse pointer (its position, color, etc.) in some text fields at the bottom of the window, but I can't find an appropriate event handler to attach to Image. Traditionally I would have used mouseEntered, mouseExited, and mouseMoved. Those are available on NSHostingView, but that means dropping down to AppKit. I found onHover(), which takes the place of mouseEntered and mouseExited, but it doesn't work for me (the perform closure is never called), and it doesn't report movement or position anyway.
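In case it helps frame the question, this is the kind of AppKit bridge I'm trying to avoid: a minimal NSViewRepresentable wrapping an NSTrackingArea, to be overlaid on the Image. The type and callback names are mine, and it's only a sketch:

```swift
import SwiftUI
import AppKit

struct MouseTrackingView: NSViewRepresentable {
    var onMove: (CGPoint?) -> Void          // nil when the mouse exits

    func makeNSView(context: Context) -> TrackingNSView {
        let view = TrackingNSView()
        view.onMove = onMove
        return view
    }

    func updateNSView(_ nsView: TrackingNSView, context: Context) {
        nsView.onMove = onMove
    }

    final class TrackingNSView: NSView {
        var onMove: ((CGPoint?) -> Void)?

        override func updateTrackingAreas() {
            super.updateTrackingAreas()
            for area in trackingAreas { removeTrackingArea(area) }
            addTrackingArea(NSTrackingArea(
                rect: bounds,
                options: [.mouseMoved, .mouseEnteredAndExited, .activeInKeyWindow],
                owner: self,
                userInfo: nil))
        }

        override func mouseMoved(with event: NSEvent) {
            // Report the location in this view's coordinate space.
            onMove?(convert(event.locationInWindow, from: nil))
        }

        override func mouseExited(with event: NSEvent) {
            onMove?(nil)
        }
    }
}

// Usage (roughly): Image(nsImage: img).overlay(MouseTrackingView { location in /* update fields */ })
```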
Replies: 4 · Boosts: 2 · Views: 6.4k · Apr ’21
Adjusting L16 pixel format values in custom CIFilter/CIKernel
Is there documentation describing the semantics of a Metal CIKernel function? I have image data where each pixel is a signed 16-bit integer. I need to convert that into any number of color values, starting with a simple shift from signed to unsigned (e.g. the data in one image ranges from about -8,000 to +20,000, and I want to simply add 8,000 to each pixel's value).

I've got a basic filter working, but it treats the pixel values as floating point, I think. I've tried using both sample_t and sample_h types in my kernel, and simple arithmetic:

```metal
extern "C" coreimage::sample_h heightShader(coreimage::sample_h inS, coreimage::destination inDest)
{
    coreimage::sample_h r = inS + 0.1;
    return r;
}
```

This has an effect, but I don't really know what's in inS. Is it a vector of four float16? What are the minimum and maximum values? They seem to be clamped to 1.0 (and perhaps -1.0). Well, I've told CI that my input image is CIFormat.L16, which is 16-bit luminance, so I imagine it's interpreting the bits as unsigned?

Anyway, where is this documented, if anywhere (the correspondence between input image pixel format and the actual values that get passed to a filter kernel)? Is there a type that lets me work on the integer values? This document - https://developer.apple.com/metal/MetalCIKLReference6.pdf implies that I can only work with floating-point values, but it doesn't tell me how they're mapped. Any help would be appreciated. Thanks.
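For completeness, this is roughly how I'm loading and applying the kernel on the Swift side. The metallib name and the use of CIColorKernel are specific to my setup, and the comment about normalization reflects my observation rather than anything documented:

```swift
import CoreImage

func shifted(image input: CIImage) throws -> CIImage? {
    // Kernel compiled with the -fcikernel flags into the app's default metallib.
    guard let url = Bundle.main.url(forResource: "default", withExtension: "metallib") else { return nil }
    let data = try Data(contentsOf: url)
    let kernel = try CIColorKernel(functionName: "heightShader", fromMetalLibraryData: data)
    // Whatever the source pixel format (L16 here), the kernel seems to receive
    // float samples normalized/clamped to [0, 1] rather than raw integer values.
    return kernel.apply(extent: input.extent, arguments: [input])
}
```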
Replies: 0 · Boosts: 0 · Views: 580 · Apr ’21
Best practices for displaying very large images (macOS SwiftUI)
I’m writing an app that, among other things, displays very large images (e.g. 106,694 x 53,347 pixels). These are GeoTIFF images, in this case containing digital elevation data for a whole planet. I will eventually need to be able to draw polygons on the displayed image. There was a time when one would use CATiledLayer, but I wonder what is best today.

I started this app in Swift/Cocoa, but I'm toying with the idea of starting over in SwiftUI (my biggest hesitation is that I have yet to upgrade to Big Sur).

The image data I have is in strips, with an integral number of image rows per strip. Strips are not guaranteed to be contiguous in the file. Pixel formats vary, but in the motivating use case they're 16 bits per pixel, with the values signifying meters. As a first approximation, I can simply display these values as a 16 bpp grayscale image.

Is setting up a Core Image pipeline the right approach? As I understand it, that should give me some automatic memory management, right? I’m hoping to find out the best approach before I spend a lot of time going down the wrong path.
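To make the starting point concrete, this is roughly how I imagine getting one strip of the 16-bit data into Core Image. It assumes the strip has already been read and assembled into a row-major Data buffer; the function name and the nil color space are my choices:

```swift
import CoreImage

// Wraps one decoded strip (or tile) of 16-bit grayscale elevation samples in a CIImage.
// bytesPerRow is width * 2 because each pixel is a single UInt16 luminance value.
func grayscaleImage(pixels: Data, width: Int, height: Int) -> CIImage {
    CIImage(bitmapData: pixels,
            bytesPerRow: width * MemoryLayout<UInt16>.size,
            size: CGSize(width: width, height: height),
            format: .L16,
            colorSpace: nil)   // nil: treat as raw grayscale, skip color matching
}
```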
Replies: 0 · Boosts: 0 · Views: 610 · Apr ’21
Xcode misses breakpoints
I have some code calling this method:

```swift
mutating func readOffset() -> UInt64 {
    let offset: UInt64
    debugLog("readOffset")
    switch (self.formatVersion) {
    case .v42:
        let offset32: UInt32 = self.reader.get()
        offset = UInt64(offset32)
    case .v43:
        offset = self.reader.get()
    }
    return offset
}
```

If I put a breakpoint on the switch statement, Xcode never stops there, and if the debugLog() call is commented out, I can't even step into the function at the call site; it just runs to the next breakpoint in my code, wherever that happens to be. If I put the breakpoint on debugLog(), it stops at the breakpoint. If I put breakpoints at the self.reader.get() calls, it stops at those breakpoints AND I can step into it. This is a unit test targeting macOS, and optimization is -Onone. Xcode 12.4 (12D4e) on Catalina 10.15.7 (19H524).
Replies: 2 · Boosts: 0 · Views: 692 · Apr ’21
Implementing a complex vector art app in SwiftUI (macOS)?
For years I've poked at a little personal project, an electronic schematic capture app. It's basically a specialized version of something like Illustrator or OmniGraffle, in that you create graphical objects from primitives, instantiate them onto the canvas, and connect them with polylines. I'm very new to SwiftUI, but I'm wondering if it makes sense to build a new custom view to handle drawing this canvas as a "native" SwiftUI view. I know it's possible to wrap NSViews in SwiftUI, but if SwiftUI can handle it, I'd like to just reimplement it. There are a number of requirements that complicate things:

- The view lives inside a scroll view (or at least, it has bounds that usually extend beyond the window).
- The view contains custom graphics and text.
- Some graphical elements span large portions of the canvas (e.g. the polylines connecting components).
- The number of individual elements can be quite high (performance concerns). Quadtrees are often used to help with this.
- It zooms.
- Marquee selection.
- Mouse down, drag, and up change the model in significant and varied ways.
- Hovering can change the appearance of some items.

Can SwiftUI handle all this? I tried to find an example or documentation, but was not having much luck. Almost everything is iOS-focused, so precise and nuanced mouse handling is uncommon.
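To make the question concrete, here's the shape of what I'm imagining as a pure SwiftUI canvas using only macOS 11-era APIs (Path, onHover, DragGesture, ScrollView). Element and SchematicModel are simplified stand-ins, and the drag handlers are left model-specific:

```swift
import SwiftUI

struct Element: Identifiable {
    let id = UUID()
    var path: Path
}

final class SchematicModel: ObservableObject {
    @Published var elements: [Element] = []
    @Published var zoom: CGFloat = 1
    let canvasSize = CGSize(width: 4000, height: 4000)
    func dragChanged(_ value: DragGesture.Value) { /* marquee, move, wire... */ }
    func dragEnded(_ value: DragGesture.Value) { /* commit the change to the model */ }
}

struct SchematicCanvas: View {
    @ObservedObject var model: SchematicModel
    @State private var hoveredID: Element.ID?

    var body: some View {
        ScrollView([.horizontal, .vertical]) {
            ZStack {
                ForEach(model.elements) { element in
                    element.path
                        .stroke(hoveredID == element.id ? Color.accentColor : Color.primary,
                                lineWidth: 1)
                        .onHover { inside in hoveredID = inside ? element.id : nil }
                }
            }
            .frame(width: model.canvasSize.width, height: model.canvasSize.height)
            .scaleEffect(model.zoom)
            .gesture(
                DragGesture(minimumDistance: 0)
                    .onChanged { model.dragChanged($0) }
                    .onEnded { model.dragEnded($0) }
            )
        }
    }
}
```

The open questions for me are whether something like this scales to thousands of elements, and whether hit-testing a stroked Path is precise enough for selection and hovering.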
Replies: 0 · Boosts: 0 · Views: 411 · Jan ’21
Encoding UTF-16LE character for USB Product String/Display in USBProber
I'm developing a little USB device for use with macOS, and the name includes the non-ASCII character ū (LATIN SMALL LETTER U WITH MACRON, U+016B; UTF-8: C5 AB). My source file is UTF-8 encoded, but as I understand it, USB uses UTF-16LE encoding for all its strings. GCC (which I'm using to compile the code for the device) doesn't implement the \u Unicode code point escape, so I tried "productname\xc5\xab", which causes USB Prober to report the Product String as "productname\u016b". Is that just USB Prober not properly rendering the string? Or am I still not encoding it correctly?
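To double-check what the descriptor should contain, here's a quick scratch calculation (in Swift, just on the Mac side) comparing the UTF-16LE code units USB expects against the UTF-8 bytes I embedded in the firmware string:

```swift
import Foundation

let name = "ū"

// UTF-16LE code units, which is what a USB string descriptor carries.
let utf16le = name.data(using: .utf16LittleEndian)!
print(utf16le.map { String(format: "%02X", $0) }.joined(separator: " "))  // "6B 01"

// UTF-8 bytes, which is what "\xc5\xab" embeds in the source string.
let utf8 = name.data(using: .utf8)!
print(utf8.map { String(format: "%02X", $0) }.joined(separator: " "))     // "C5 AB"
```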
Replies: 0 · Boosts: 0 · Views: 748 · Jan ’21
Sandbox issues opening SQLite associated/sidecar files
I’ve been having a heckuva time getting macOS (Catalina) to let my sandboxed app open the associated -wal and -shm files. Googling for answers, it seems macOS should already know about these, but if not, I can add NSIsRelatedItemType document types. That didn't seem to work for me, though. Is there something more I need to do? (If I put the main SQLite file inside the app's container, SQLite can open the associated files just fine.) Here's my Info.plist:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDevelopmentRegion</key>
  <string>$(DEVELOPMENT_LANGUAGE)</string>
  <key>CFBundleDocumentTypes</key>
  <array>
    <dict>
      <key>CFBundleTypeName</key>
      <string>SQLiteDocument</string>
      <key>CFBundleTypeRole</key>
      <string>Editor</string>
      <key>LSHandlerRank</key>
      <string>Default</string>
      <key>LSItemContentTypes</key>
      <array>
        <string>org.sqlite.sqlite3</string>
      </array>
      <key>NSDocumentClass</key>
      <string>$(PRODUCT_MODULE_NAME).Document</string>
    </dict>
    <dict>
      <key>CFBundleTypeExtensions</key>
      <array>
        <string>sqlite-shm</string>
        <string>sqlite-wal</string>
        <string>sqlite-journal</string>
      </array>
      <key>CFBundleTypeName</key>
      <string>Support Type</string>
      <key>CFBundleTypeRole</key>
      <string>Editor</string>
      <key>NSIsRelatedItemType</key>
      <true/>
    </dict>
  </array>
  <key>CFBundleExecutable</key>
  <string>$(EXECUTABLE_NAME)</string>
  <key>CFBundleIconFile</key>
  <string></string>
  <key>CFBundleIdentifier</key>
  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
  <key>CFBundleInfoDictionaryVersion</key>
  <string>6.0</string>
  <key>CFBundleName</key>
  <string>$(PRODUCT_NAME)</string>
  <key>CFBundlePackageType</key>
  <string>$(PRODUCT_BUNDLE_PACKAGE_TYPE)</string>
  <key>CFBundleShortVersionString</key>
  <string>1.0</string>
  <key>CFBundleVersion</key>
  <string>1</string>
  <key>LSMinimumSystemVersion</key>
  <string>$(MACOSX_DEPLOYMENT_TARGET)</string>
  <key>NSMainStoryboardFile</key>
  <string>Main</string>
  <key>NSPrincipalClass</key>
  <string>NSApplication</string>
  <key>UTExportedTypeDeclarations</key>
  <array>
    <dict>
      <key>UTTypeConformsTo</key>
      <array>
        <string>public.database</string>
        <string>public.data</string>
      </array>
      <key>UTTypeDescription</key>
      <string>SQLite3</string>
      <key>UTTypeIcons</key>
      <dict/>
      <key>UTTypeIdentifier</key>
      <string>org.sqlite.sqlite3</string>
      <key>UTTypeTagSpecification</key>
      <dict>
        <key>public.filename-extension</key>
        <array>
          <string>sqlite</string>
        </array>
      </dict>
    </dict>
  </array>
  <key>UTImportedTypeDeclarations</key>
  <array>
    <dict>
      <key>UTTypeConformsTo</key>
      <array>
        <string>public.database</string>
        <string>public.data</string>
      </array>
      <key>UTTypeDescription</key>
      <string>SQLite3</string>
      <key>UTTypeIcons</key>
      <dict/>
      <key>UTTypeIdentifier</key>
      <string>org.sqlite.sqlite3</string>
      <key>UTTypeTagSpecification</key>
      <dict>
        <key>public.filename-extension</key>
        <array>
          <string>sqlite</string>
        </array>
      </dict>
    </dict>
  </array>
</dict>
</plist>
```
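In case the plist alone isn't enough, here's the other half of the Related Items mechanism as I understand it (my reading of the docs, not verified): the sandbox only extends access to "mydb.sqlite-wal" while an NSFilePresenter whose primaryPresentedItemURL is the user-chosen database is registered, one presenter per sidecar file. A minimal sketch, with my own names:

```swift
import Foundation

final class SidecarPresenter: NSObject, NSFilePresenter {
    let primaryPresentedItemURL: URL?     // the user-chosen .sqlite file
    let presentedItemURL: URL?            // the -wal / -shm / -journal sidecar
    let presentedItemOperationQueue = OperationQueue.main

    init(databaseURL: URL, sidecarSuffix: String) {
        primaryPresentedItemURL = databaseURL
        presentedItemURL = URL(fileURLWithPath: databaseURL.path + "-" + sidecarSuffix)
        super.init()
    }
}

// Hypothetical usage: register before opening the database, remove when done.
// let wal = SidecarPresenter(databaseURL: dbURL, sidecarSuffix: "wal")
// NSFileCoordinator.addFilePresenter(wal)
// ... open the database ...
// NSFileCoordinator.removeFilePresenter(wal)
```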
Replies: 0 · Boosts: 0 · Views: 476 · Jan ’21
CloudKit subscriptions in custom keyboard extension?
I'm developing a custom keyboard, and I would like certain data to sync across the user's devices. In the containing app, that's straightforward, and I already have my keyboard and containing app sharing data. But if the user makes a change on one device while the custom keyboard is displayed on another device, I don't see any straightforward way of letting that custom keyboard know so it can update its display accordingly.

I think I can have the container app get updated and then inform the custom keyboard via the filesystem, but this is exceedingly clunky. Since CloudKit subscriptions require responding to push notifications, I'm not sure there's a better way. Is there?
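The clunky alternative I keep circling back to is a straight pull whenever the keyboard comes on screen, something like the sketch below. The container identifier and record type are placeholders, and it assumes the keyboard has open access so it can reach the network:

```swift
import CloudKit
import UIKit

final class KeyboardViewController: UIInputViewController {
    private let database = CKContainer(identifier: "iCloud.com.example.shared").privateCloudDatabase

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Pull the latest shared records each time the keyboard appears,
        // instead of relying on a push-driven subscription.
        let query = CKQuery(recordType: "SharedSetting", predicate: NSPredicate(value: true))
        database.perform(query, inZoneWith: nil) { records, error in
            guard let records = records, error == nil else { return }
            DispatchQueue.main.async {
                // Update the keyboard's display from the fetched records.
                print("fetched \(records.count) records")
            }
        }
    }
}
```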
Replies: 0 · Boosts: 0 · Views: 364 · Apr ’20
Creating an AVCapture device on macOS?
I'd like to make a macOS app that composites one or more video camera streams into an output stream that can then be used by another app as a capture device for an AVCaptureSession. Is this possible? Looking through the AVFoundation and Core Media docs, there's nothing obvious. I realize I may need to create a driver; that's fine, but I don't really know where to start.
Replies: 0 · Boosts: 0 · Views: 431 · Mar ’20