Hey, making the jump into SwiftUI and hit a hurdle. For my app's home screen I want a struct that sifts through various data states and returns the appropriate View based on a condition. In UIKit I would usually use some sort of Manager object to review the data states and return a class that inherits from UIView, but trying to do the same in SwiftUI is causing me problems. Example here:
struct HomeScreenViewDirector {
    func fetchView() -> View {
        if someDataState {
            return PopulatedHomeScreen()
        } else {
            return EmptyHomeScreen()
        }
    }
}
struct EmptyHomeScreen: View {
    var body: some View {
        HStack {
            Text("Empty State")
        }
        .navigationBarTitle("Empty State")
    }
}
struct PopulatedHomeScreen: View {
    var body: some View {
        Text("Populated State")
            .navigationBarTitle("Populated State")
    }
}
Attempting to do this generates the error
Protocol 'View' can only be used as a generic constraint because it has Self or associated type requirements
on the -> View return type. From some research it looks like I could change this to some View and then wrap all my returns in AnyView, but that feels like the wrong approach.
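For reference, the pattern I keep seeing suggested instead of AnyView is @ViewBuilder, which lets each branch return a different concrete View type. A minimal sketch reusing the structs above (someDataState here is just a placeholder Bool, not a real model):

```swift
import SwiftUI

struct HomeScreen: View {
    // Placeholder for whatever the real data state check is
    var someDataState: Bool

    // @ViewBuilder allows the branches to return different
    // concrete View types without erasing them to AnyView.
    @ViewBuilder
    private var content: some View {
        if someDataState {
            PopulatedHomeScreen()
        } else {
            EmptyHomeScreen()
        }
    }

    var body: some View {
        content
    }
}
```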
Any tips on how and why to handle this?
Cheers!
Hey, I wonder if someone can point me in the right direction.
I am building a Mac audio app that needs to perform the following actions:
Select an audio input device
Show a live audio graph of the device input (think GarageBand / Logic)
Capture the audio for replay
Output the sound, or make itself selectable as an input device for another app (similar to how plugins work for GarageBand, Logic, etc.)
I have looked into using the following so far:
AudioKit
AVFoundation / AVCaptureDevice
AudioKit
This framework looks amazing and has sample apps for most of the things I want to do. However, it seems that it will only accept the audio input that the Mac has selected in System Settings. This is a non-starter for me, as I want the user to be able to choose the device in app (like GarageBand or Neural Dsp plugins do).
Using AudioEngine I can get the available input devices, but everything I have found suggests they can't be changed in app. Here's the code to display them:
struct InputDevicePicker: View {
    @State var device: Device
    var engine: AudioEngine

    var body: some View {
        Picker("Input: \(device.deviceID)", selection: $device) {
            ForEach(getDevices(), id: \.self) {
                Text($0.name)
            }
        }
        .pickerStyle(MenuPickerStyle())
        .onChange(of: device, perform: setInputDevice)
    }

    func getDevices() -> [Device] {
        AudioEngine.inputDevices.compactMap { $0 }
    }

    func setInputDevice(to device: Device) {
        // set the input device on the AudioEngine
    }
}
Alternatively
AVFoundation
This has a nice API for listing devices and setting the input, but when it comes to working with the data the delegate provides, I don't have the first clue how I would handle it in terms of creating an audio graph and saving the data for replay. Here's the delegate method for reference:
extension Recorder: AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("Audio data received")
        // Needs to save audio here
        // Needs to play through speakers or other audio source
        // Needs to show audio graph
    }
}
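For what it's worth, here's the sort of thing I've been sketching with AVAudioEngine instead: installTap(onBus:) hands you AVAudioPCMBuffers directly, which seem easier to write to a file and meter than CMSampleBuffers from a capture session. This is only a rough sketch under assumptions (default input device, single-channel RMS metering for the graph, file path supplied by the caller), not a tested implementation:

```swift
import AVFoundation

final class InputMonitor {
    private let engine = AVAudioEngine()
    private var file: AVAudioFile?

    func start(recordingTo url: URL) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        // Write captured buffers to disk for later replay
        file = try AVAudioFile(forWriting: url, settings: format.settings)

        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            // Save audio for replay
            try? self?.file?.write(from: buffer)

            // Compute a simple RMS level from channel 0 to drive a live graph
            if let samples = buffer.floatChannelData?[0] {
                let n = Int(buffer.frameLength)
                var sum: Float = 0
                for i in 0..<n { sum += samples[i] * samples[i] }
                let rms = n > 0 ? (sum / Float(n)).squareRoot() : 0
                // Feed `rms` into the UI, e.g. via a @Published property
                _ = rms
            }
        }

        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        file = nil
    }
}
```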
It would be great if someone with experience using these could advise whether this is possible with either, and where I can look for examples / guidance.
Any help, pointers, or lifesavers would be appreciated.
Thanks if you got this far!