I’m trying to build a detail view screen where a user can edit the contents of an object and save it, but I’m struggling to figure out how best to initialize the view.
I’m passing in a “MailFilter” object (which I decoded from a web API) to this view (from a List item’s NavigationLink), and am creating state for each of the fields which need to be editable, such that I can use them with a SwiftUI form. I have a custom initializer to set these parameters (and make sure they aren’t nil—as they can be in the actual filter object).
struct FilterView: View {
    var filter: MailFilter

    @State var from: String
    @State var to: String

    init(_ filter: MailFilter) {
        self.filter = filter
        from = filter.criteria.from ?? ""
        to = filter.criteria.to ?? ""
    }

    var body: some View {
        Form {
            Section(header: Text("From")) {
                TextField("From", text: $from)
            }
            Section(header: Text("To")) {
                TextField("To", text: $to)
            }
        }
    }
}
However, this approach doesn’t seem to work. I’m given errors that I’m trying to use self. before initializing the variables—in the initializer! Is there a different approach I should be taking to get my object’s data into this detail view?
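(Editor's note for readers: the usual way around this particular compiler error is to initialize the `State` property wrappers themselves via `State(initialValue:)`, rather than assigning through the wrapped properties before `self` is fully initialized. A minimal sketch, assuming `MailFilter` has a `criteria` value with optional `from`/`to` strings as described above:)

```swift
import SwiftUI

// Sketch only: MailFilter/criteria are assumed to match the post above.
struct FilterView: View {
    var filter: MailFilter

    @State private var from: String
    @State private var to: String

    init(_ filter: MailFilter) {
        self.filter = filter
        // Initialize the underlying State wrappers directly, so nothing
        // reads or writes a wrapped value before `self` is ready.
        _from = State(initialValue: filter.criteria.from ?? "")
        _to = State(initialValue: filter.criteria.to ?? "")
    }

    var body: some View {
        Form {
            Section(header: Text("From")) {
                TextField("From", text: $from)
            }
            Section(header: Text("To")) {
                TextField("To", text: $to)
            }
        }
    }
}
```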
I've been looking to apply some filters to my live camera feed, and have been following the guidance of the AVCamFilter sample project - https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avcamfilter_applying_filters_to_a_capture_stream - which uses an MTKView to render the filtered frames live.
Within this project, there is some code in PreviewMetalView.swift (in setupTransform()) which scales the X and Y dimensions of the preview down, such that the entire preview is visible on screen:
if textureWidth > 0 && textureHeight > 0 {
    switch textureRotation {
    case .rotate0Degrees, .rotate180Degrees:
        scaleX = Float(internalBounds.width / CGFloat(textureWidth))
        scaleY = Float(internalBounds.height / CGFloat(textureHeight))

    case .rotate90Degrees, .rotate270Degrees:
        scaleX = Float(internalBounds.width / CGFloat(textureHeight))
        scaleY = Float(internalBounds.height / CGFloat(textureWidth))
    }
}
// Resize aspect ratio.
resizeAspect = min(scaleX, scaleY)
if scaleX < scaleY {
    scaleY = scaleX / scaleY
    scaleX = 1.0
} else {
    scaleX = scaleY / scaleX
    scaleY = 1.0
}
I am, instead, trying to scale the image so that it is zoomed in and fills the view ("Aspect Fill", like the Snapchat camera view). But since I'm not familiar with scaling textures for MetalKit to render, I'm not entirely sure how to scale this so that part of the preview extends beyond the view bounds. What is the best approach here? (I imagine this is just some simple multiplication that is eluding me, but I'd love any help I can get.)
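(Editor's note: one sketch that may work, assuming the vertex transform consumes `scaleX`/`scaleY` exactly as in the sample, where 1.0 means "fills that axis": flip the aspect-fit logic so the larger of the two scales wins. The dominant axis is normalized to 1.0 and the other axis ends up greater than 1.0, so the texture overflows and gets cropped. This is untested against the actual sample project.)

```swift
// Aspect fill instead of aspect fit: keep the larger scale, not the smaller.
resizeAspect = max(scaleX, scaleY)
if scaleX > scaleY {
    // Width needs more magnification: fill X exactly, let Y overflow (> 1.0).
    scaleY = scaleX / scaleY
    scaleX = 1.0
} else {
    // Height needs more magnification: fill Y exactly, let X overflow.
    scaleX = scaleY / scaleX
    scaleY = 1.0
}
```

Note the aspect ratio `scaleY / scaleX` comes out the same as in the fit case, so the image isn't distorted; the only change is which axis is pinned to 1.0. If `resizeAspect` is used elsewhere in the sample for point conversion, it may need the same min-to-max adjustment.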