Can anyone tell me whether it is possible to write a USB driver for a third-party USB camera, or will Apple only allow the hardware manufacturer to develop drivers? And if Apple is OK with that, is there anything legal that might prevent one from doing so? Finally, are the hardware interfaces proprietary, or are they likely to be using standard interfacing hardware? I am currently trying to contact the manufacturer, but that may take a while.
How do you select items in a grid view? There appears to be no equivalent to List(selection:).
Does anyone know if it is possible to select items in the new grid view? Is there a List(selection:) equivalent API?
The following code seems to fail to list anything in the list view:
var body: some View {
    List(selection: $controller.selectedObject) {
        ForEach(self.controller.thumbNails, id: \.self) { object in
            Group {
                if hasImage {
                    Text("Image")
                } else {
                    Text("No Image")
                }
            }
        }
    }
}
I assume this is a bug because the same code works under Xcode 11.5.
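In the meantime I have been tracking the selection myself with a tap gesture. A minimal sketch of that workaround (ThumbNail here is just a stand-in for my model type, and the highlight styling is my own, not an official selection API):

import SwiftUI

struct ThumbNail: Hashable {
    let id: Int
    let hasImage: Bool
}

struct ThumbnailGrid: View {
    let thumbNails: [ThumbNail]
    // Selection is tracked manually because the grid has no selection: parameter.
    @State private var selection = Set<ThumbNail>()

    var body: some View {
        LazyVGrid(columns: [GridItem(.adaptive(minimum: 100))]) {
            ForEach(thumbNails, id: \.self) { object in
                Text(object.hasImage ? "Image" : "No Image")
                    .padding(4)
                    .background(selection.contains(object) ? Color.accentColor.opacity(0.3) : Color.clear)
                    .onTapGesture {
                        // Toggle membership to mimic multiple selection.
                        if selection.contains(object) {
                            selection.remove(object)
                        } else {
                            selection.insert(object)
                        }
                    }
            }
        }
    }
}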
Is it possible to use the tracking separator on macOS with SwiftUI?
It seems hard to believe that this could be so difficult. Is the standard AppKit option of something like a square button with type 'toggle' not available under SwiftUI, or is there a modifier somewhere that can be used to achieve the same thing?
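For what it's worth, the closest I have come is a custom ToggleStyle that draws its own button-like background; the styling values here are guesses on my part rather than a known recipe:

import SwiftUI

// Approximates a square toggle button by styling a Button inside a ToggleStyle.
struct SquareButtonToggleStyle: ToggleStyle {
    func makeBody(configuration: Configuration) -> some View {
        Button {
            configuration.isOn.toggle()
        } label: {
            configuration.label
                .padding(4)
                .background(configuration.isOn ? Color.accentColor.opacity(0.4) : Color.clear)
                .overlay(RoundedRectangle(cornerRadius: 4).stroke(Color.secondary))
        }
        .buttonStyle(PlainButtonStyle())
    }
}

// Usage: Toggle("Grid", isOn: $showGrid).toggleStyle(SquareButtonToggleStyle())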
Any reason the .toolbar modifier doesn't work in the macOS 11 beta? Is there some global setting required in the app to enable it? No toolbar is being shown at all.
I am trying to get the same toolbar item behaviour as Xcode 12 and Safari, but nothing I have tried produces the same size icons or the mouse-over background. Is there a step-by-step guide that explains exactly what the procedure is and which types of toolbar items are supposed to be used? I have tried toolbar items with images, and custom views with different types of buttons, but to no avail. It's hard to believe it can be this hard, and the WWDC 2020 talk's suggestion that adopting the new look takes only a few simple steps, if any, seems completely wrong.
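For reference, this is roughly the shape of what I have been trying, based on my reading of the session (addItem() is just a placeholder action):

import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationView {
            List {
                Text("Sidebar")
            }
            Text("Detail")
                .toolbar {
                    // An icon-only button; I expected this to pick up the
                    // Big Sur sizing and mouse-over background automatically.
                    ToolbarItem(placement: .primaryAction) {
                        Button(action: addItem) {
                            Label("Add", systemImage: "plus")
                        }
                    }
                }
        }
    }

    private func addItem() { }
}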
How are you supposed to navigate these forums? There seems to be no way to find your previous posts or any replies to them.
Hi, is there any way to connect the DTK to the LG 5K display other than USB-C, which appears to run at a lower resolution?
I am trying to implement an image editing application using Core Image that allows the user to apply filters and effects much like Photos does. I can't find any current examples that show how to render an image in an MTKView and allow a real-time view of changes to filter parameters. All the examples seem to run on the main thread, which results in everything blocking and a rather poor user experience. Apple's Photos app (macOS) allows you to drag the sliders and see the result in real time.
I found some articles about calling the Metal draw(in:) API from background threads and managed to implement something that appears to work. Is there any sample code showing how to use Metal to provide real-time filtering of images with Core Image?
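For context, here is a stripped-down version of what I currently have; the class and property names are mine, and I am not at all sure the threading or scaling handling is correct:

import MetalKit
import CoreImage

final class FilteredImageView: MTKView, MTKViewDelegate {
    private lazy var commandQueue = device!.makeCommandQueue()!
    private lazy var ciContext = CIContext(mtlDevice: device!)

    // Setting a new filtered image triggers a redraw.
    var image: CIImage? {
        didSet { needsDisplay = true }
    }

    init() {
        super.init(frame: .zero, device: MTLCreateSystemDefaultDevice())
        framebufferOnly = false        // Core Image needs to write into the drawable texture
        enableSetNeedsDisplay = true   // only redraw when the image changes
        isPaused = true
        delegate = self
    }

    required init(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) { }

    func draw(in view: MTKView) {
        guard let image = image,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }
        // No scaling here; the image is rendered at the drawable's origin.
        ciContext.render(image,
                         to: drawable.texture,
                         commandBuffer: commandBuffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}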
Thanks
I guess Apple won't respond to this, but does anyone know what the timeline might be for Apple to provide support for Sony Alpha RAW files?
Or what is the typical timeframe for releasing an update that provides this support?
I have an application that uses a Transformable property in Core Data to store NSAttributedStrings, and I get the following warning:
Object.property is using a nil or insecure value transformer. Please switch to NSSecureUnarchiveFromDataTransformerName or a custom NSValueTransformer subclass of NSSecureUnarchiveFromDataTransformer [2]
NSSecureUnarchiveFromDataTransformerName does not support archiving and unarchiving of NSAttributedStrings, so as I understand it I have to create a custom transformer, register it in the AppDelegate, and enter the transformer class name in the property's details in the Core Data model.
Below is the custom transformer class; however, I get an error when trying to decode existing attributed strings. Can anyone shed any light on this? Why is the new unarchiver unable to handle the NSFileWrapper, given that it is a property of the NSTextAttachment and works fine with the deprecated unarchiver?
Is this a bug or intentional ?
Is there some way to add support for unarchiving the NSFileWrapper ?
@implementation NSAttributedStringValueTransformer

+ (Class)transformedValueClass {
    return [NSAttributedString class];
}

+ (void)initialize {
    [NSValueTransformer setValueTransformer:[[self alloc] init] forName:@"NSAttributedStringValueTransformer"];
}

+ (BOOL)allowsReverseTransformation {
    return YES;
}

- (NSData *)transformedValue:(NSAttributedString *)value {
    NSError *error;
    NSData *stringAsData = [NSKeyedArchiver archivedDataWithRootObject:value requiringSecureCoding:NO error:&error];
    if (error != nil) {
        NSLog(@"Error encoding attributed string: %@", error.localizedDescription);
        return nil;
    }
    return stringAsData;
}

- (NSAttributedString *)reverseTransformedValue:(NSData *)value {
    NSError *error;
    /* This works: */
    /* NSAttributedString *string = [NSKeyedUnarchiver unarchiveObjectWithData:value]; */
    /* This fails with the error "The data couldn't be read because it isn't in the correct format." */
    NSAttributedString *string = [NSKeyedUnarchiver unarchivedObjectOfClass:[NSAttributedString class] fromData:value error:&error];
    if (error != nil) {
        NSLog(@"Error decoding attributed string: %@", error);
        return nil;
    }
    return string;
}

@end
Resulting Error:
Error decoding attributed string:
Error Domain=NSCocoaErrorDomain Code=4864 "value for key 'NSFileWrapper' was of unexpected class 'NSFileWrapper (0x1c9d40d48) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/Foundation.framework]'. Allowed classes are '{(
"NSURL (0x1c9d1b988) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSAttributedString (0x1c9d36668) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/Foundation.framework]",
"NSFont (0x1c9e1a3c8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
"NSDictionary (0x1c9d1adf8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSArray (0x1c9d1ab28) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"NSColor (0x1c9ec2198) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIKitCore.framework]",
"NSTextAttachment (0x1c9e1aad0) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
"NSGlyphInfo (0x1c9e198d8) [/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIFoundation.framework]",
...
I would like to get arrays of red, green and blue histogram data from the output of the CIAreaHistogram filter. My current approach is not working.
According to the docs, CIAreaHistogram returns an image whose width equals the bin count (256 in my case) and whose height is 1, so each pixel contains the counts of the RGB values for that bin.
if let areaHistogram = self.areaHistogramFilter(ciImage) {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(areaHistogram.extent.size.width),
                                     Int(areaHistogram.extent.size.height),
                                     kCVPixelFormatType_32ARGB, attrs, &pixelBuffer)
    guard status == kCVReturnSuccess else {
        return
    }
    self.hContext.render(areaHistogram, to: pixelBuffer!)
    CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
    let int32Buffer = unsafeBitCast(CVPixelBufferGetBaseAddress(pixelBuffer!), to: UnsafeMutablePointer<UInt32>.self)
    let int32PerRow = CVPixelBufferGetBytesPerRow(pixelBuffer!)
    var data = [Int]()
    for i in 0..<256 {
        /* Get BGRA value for pixel */
        let BGRA = int32Buffer[i]
        data.append(Int(BGRA))
        let red = (BGRA >> 16) & 0xFF
        let green = (BGRA >> 8) & 0xFF
        let blue = BGRA & 0xFF
        os_log("data[\(i)]:\(BGRA) red: \(red) green: \(green) blue: \(blue)")
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
}
This results in zeros:
data[0]:0 red: 0 green: 0 blue: 0
data[1]:0 red: 0 green: 0 blue: 0
...
data[255]:134678783 red: 7 green: 7 blue: 7
This alternative similarly produces a bunch of zeros:
let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer!)
let buffer = baseAddress!.assumingMemoryBound(to: UInt8.self)
for i in stride(from: 0, to: 256 * 4, by: 4) {
    let blue = buffer[i]
    let green = buffer[i + 1]
    let red = buffer[i + 2]
    os_log("data[\(i)]: red: \(red) green: \(green) blue: \(blue)")
}
Or this variation, which seems simpler:
var red = [UInt8]()
var green = [UInt8]()
var blue = [UInt8]()
for i in 0..<256 {
    // Get BGRA value for pixel
    let BGRA = int32Buffer[i]
    withUnsafeBytes(of: BGRA.bigEndian) {
        red.append($0[0])
        green.append($0[1])
        blue.append($0[2])
    }
}
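One more variation I want to try (not sure it's the answer): render the histogram into a float bitmap instead of an 8-bit pixel buffer, on the theory that the bin counts come out as float values and don't survive the conversion to 8-bit. This reuses the same areaHistogramFilter and hContext names as above:

if let areaHistogram = self.areaHistogramFilter(ciImage) {
    let binCount = 256
    // Four Float components (RGBA) per bin.
    var bitmap = [Float](repeating: 0, count: binCount * 4)
    bitmap.withUnsafeMutableBytes { ptr in
        self.hContext.render(areaHistogram,
                             toBitmap: ptr.baseAddress!,
                             rowBytes: binCount * 4 * MemoryLayout<Float>.size,
                             bounds: CGRect(x: 0, y: 0, width: binCount, height: 1),
                             format: .RGBAf,
                             colorSpace: nil)
    }
    var red = [Float](), green = [Float](), blue = [Float]()
    for bin in 0..<binCount {
        red.append(bitmap[bin * 4])
        green.append(bitmap[bin * 4 + 1])
        blue.append(bitmap[bin * 4 + 2])
    }
}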
Is it possible to change the drag-and-drop preview image? I am currently using the following code, but the previewImageHandler never gets called and the image is always just a rendering of the visible portion of the grid view.
...
grid
    .drag(if: isDraggable, data: {
        return self.dragData()
    })
...
func dragData() -> NSItemProvider {
    let itemProvider = NSItemProvider(object: fileService)
    itemProvider.previewImageHandler = { (handler, _, _) -> Void in
        os_log("previewImageHandler called")
        if let image = NSImage(named: "film") {
            handler?(image as NSSecureCoding?, nil)
        } else {
            let error = NSError(domain: "", code: 001, userInfo: [NSLocalizedDescriptionKey: "Unable to create preview image"])
            handler?(nil, error)
        }
    }
    return itemProvider
}

struct Draggable: ViewModifier {
    let condition: Bool
    let data: () -> NSItemProvider

    @ViewBuilder
    func body(content: Content) -> some View {
        if condition {
            content.onDrag(data)
        } else {
            content
        }
    }
}

extension View {
    public func drag(if condition: Bool, data: @escaping () -> NSItemProvider) -> some View {
        self.modifier(Draggable(condition: condition, data: data))
    }
}