
Improve display grid of SwiftUI Charts
Hello all! I'm implementing a view that shows a grid of different histograms as a final report to the user. It could have ~100 rows and ~10 columns. I'm using a LazyVGrid and it looks like this (note: the example contains only 3 rows). It takes less than a second to render the grid, but you can feel the app being blocked for a few ms. I was wondering if any of you know how to render the chart views asynchronously so that, at least, the interface is not blocked. Thanks!
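A minimal sketch of one way to keep the interface responsive, assuming the expensive part is preparing each histogram's data and can be moved off the main thread: each grid cell shows a placeholder until its data arrives from a detached task. Bin, makeBins, and the fake workload are hypothetical stand-ins for the real report data.

import SwiftUI
import Charts

struct Bin: Identifiable {
    let id: Int
    let value: Double
}

// Hypothetical stand-in for the real histogram computation; the detached
// task keeps the work off the main actor.
func makeBins(seed: Int) async -> [Bin] {
    await Task.detached(priority: .userInitiated) {
        (0..<10).map { Bin(id: $0, value: Double(($0 + seed) % 7 + 1)) }
    }.value
}

struct AsyncHistogramCell: View {
    let seed: Int
    @State private var bins: [Bin]?

    var body: some View {
        Group {
            if let bins {
                Chart(bins) { bin in
                    BarMark(x: .value("Bin", bin.id), y: .value("Count", bin.value))
                }
            } else {
                ProgressView() // placeholder while the data is prepared
            }
        }
        .frame(height: 80)
        .task { bins = await makeBins(seed: seed) }
    }
}

struct ReportGridView: View {
    private let columns = Array(repeating: GridItem(.flexible()), count: 10)

    var body: some View {
        ScrollView {
            // LazyVGrid only instantiates cells as they scroll into view.
            LazyVGrid(columns: columns) {
                ForEach(0..<1000, id: \.self) { index in
                    AsyncHistogramCell(seed: index)
                }
            }
        }
    }
}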
0 replies · 0 boosts · 500 views · Nov ’23
NavigationSplitView + FetchResults
Hello, I'm using a NavigationSplitView and feeding the sidebar from FetchedResults. In the detail view, the user can update the entity's name field, and after that the sidebar becomes useless, selecting random rows and hiding some of them, like these pictures:

I could reproduce the issue using a sample project. If you want to test it, create a simple project in Xcode (iOS app, with "Use Core Data" and "Host in CloudKit" checked), add a field "name String?" to the Item entity, and use this code in ContentView.swift:

import SwiftUI
import CoreData

extension Binding {
    public func replaceNil<T>(_ defaultValue: T) -> Binding<T> where Value == Optional<T> {
        return .init {
            wrappedValue ?? defaultValue
        } set: {
            wrappedValue = $0
        }
    }
}

struct Header: View {
    @ObservedObject var item: Item

    var body: some View {
        VStack {
            if let name = item.name, !name.isEmpty {
                Text(name)
            } else {
                Text("---")
                    .foregroundColor(.secondary)
            }
        }
    }
}

struct ItemDetail: View {
    @ObservedObject var item: Item

    var body: some View {
        VStack {
            Spacer()
            if let timestamp = item.timestamp {
                Text(timestamp, formatter: itemFormatter)
            } else {
                Text("---")
                    .foregroundColor(.secondary)
            }
            Button("Update") {
                item.timestamp = Date.now
            }
            TextField("Name", text: $item.name.replaceNil(""))
            Spacer()
        }
        .padding()
    }
}

struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @State private var selected: Item.ID?

    @FetchRequest(
        sortDescriptors: [
            SortDescriptor(\.name, order: .forward),
            SortDescriptor(\.timestamp, order: .forward),
        ],
        animation: .default)
    private var items: FetchedResults<Item>

    var body: some View {
        NavigationSplitView {
            List(selection: $selected) {
                ForEach(items) { item in
                    Header(item: item)
                }
                .onDelete(perform: deleteItems)
            }
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    EditButton()
                }
                ToolbarItem {
                    Button(action: addItem) {
                        Label("Add Item", systemImage: "plus")
                    }
                }
            }
        } detail: {
            if let selected, let index = items.firstIndex(where: { $0.id == selected }) {
                ItemDetail(item: items[index])
            } else {
                Text("Select something")
            }
        }
    }

    private func addItem() {
        withAnimation {
            do {
                let newItem = Item(context: viewContext)
                newItem.name = "Name"
                newItem.timestamp = Date()
                try viewContext.save()
                selected = newItem.id
            } catch {
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }

    private func deleteItems(offsets: IndexSet) {
        withAnimation {
            offsets.map { items[$0] }.forEach(viewContext.delete)
            do {
                try viewContext.save()
            } catch {
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }
}

private let itemFormatter: DateFormatter = {
    let formatter = DateFormatter()
    formatter.dateStyle = .short
    formatter.timeStyle = .medium
    return formatter
}()

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView().environment(\.managedObjectContext, PersistenceController.preview.container.viewContext)
            .previewInterfaceOrientation(.landscapeLeft)
    }
}

Any ideas?
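A hedged note, not a confirmed fix: since the rows re-sort when the name changes, one direction worth trying is keying the selection on the stable NSManagedObjectID rather than the default Identifiable id. Relative to the sample above, the change would look like this:

// Hedged sketch, untested against this exact bug: selection keyed on
// NSManagedObjectID, which stays stable while the rows re-sort.
@State private var selected: NSManagedObjectID?

List(selection: $selected) {
    ForEach(items) { item in
        Header(item: item)
            .tag(item.objectID) // explicit, stable selection tag
    }
    .onDelete(perform: deleteItems)
}

// ...and in the detail column, resolve the item with:
// items.first(where: { $0.objectID == selected })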
2 replies · 0 boosts · 855 views · Feb ’23
View like the camera app
Hello all, I would like to understand how to create a SwiftUI view like the official Camera app. When the device orientation changes, the view does not animate; only the buttons rotate in place (4:3, 1x...). The Camera app view is composed of the flash and Live Photo buttons, the camera preview, the configuration buttons, and the big shutter button. In portrait it flows from top to bottom; in landscape, from left to right. Also, when the last-photo view is shown, it is adapted to the current orientation, as if the camera preview had been rendered in that same device orientation. Ideas? Thanks!
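A hedged sketch of one common technique (not necessarily how the Camera app actually does it): lock the interface to portrait so the layout never animates, then counter-rotate only the controls when the device orientation changes. Locking the scene's supported orientations is not shown here; OrientationButton is a hypothetical name.

import SwiftUI
import UIKit
import Combine

struct OrientationButton: View {
    let title: String
    let action: () -> Void
    @State private var angle: Angle = .zero

    var body: some View {
        Button(title, action: action)
            .rotationEffect(angle)
            .animation(.easeInOut(duration: 0.3), value: angle)
            .onAppear { UIDevice.current.beginGeneratingDeviceOrientationNotifications() }
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                // Counter-rotate the control so it stays upright while the
                // rest of the (portrait-locked) layout does not move.
                switch UIDevice.current.orientation {
                case .landscapeLeft: angle = .degrees(90)
                case .landscapeRight: angle = .degrees(-90)
                case .portraitUpsideDown: angle = .degrees(180)
                case .portrait: angle = .zero
                default: break // face up/down: keep the previous angle
                }
            }
    }
}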
2 replies · 0 boosts · 2.4k views · Dec ’22
Accessibility PLAY button: suppress the speech or add a delay?
Hello, I'm developing an app that plays a short audio clip when you push a button. When I test with VoiceOver and "push" the button, the voice says "Play button, playing..." over the audio, and it is impossible to hear the clip. What do you think is the best approach? I would like to delay my audio, but I could not find a callback for when VoiceOver has finished speaking the sentence. Is it correct to remove the accessibility label for this kind of button (play/pause/stop)? Thanks!
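As far as I know there is no public callback for when VoiceOver finishes reading a control's own label, so one pragmatic sketch is to delay playback only when VoiceOver is running. The one-second value is an assumption to tune by ear, not a documented figure.

import AVFoundation
import UIKit

func playSound(using player: AVAudioPlayer) {
    // UIAccessibility.isVoiceOverRunning is a real UIKit API; the delay
    // below is a guessed value, not an Apple-documented one.
    let delay: TimeInterval = UIAccessibility.isVoiceOverRunning ? 1.0 : 0.0
    DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
        player.play()
    }
}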
2 replies · 0 boosts · 903 views · Jan ’22
"Principal Class" in SwiftUI Life cycle
Hello, in the latest release, the "Principal Class" setting is not working anymore. I'm using my own UIApplication subclass to implement a kiosk mode in my app (after five minutes with no events, it returns to the root view). The new SwiftUI life cycle is not using my UIApplication subclass. Is there a trick to make it work? Thanks!
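A hedged sketch of the classic fallback, assuming losing the SwiftUI life cycle is acceptable: remove @main and boot through UIApplicationMain from a main.swift, which lets you name the UIApplication subclass again. KioskApplication, AppDelegate, and ContentView are hypothetical names, and the idle-timer logic is only indicated.

// main.swift
import SwiftUI
import UIKit

class KioskApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)
        // Restart the five-minute idle timer here on every event.
    }
}

class AppDelegate: NSObject, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        window = UIWindow(frame: UIScreen.main.bounds)
        window?.rootViewController = UIHostingController(rootView: ContentView())
        window?.makeKeyAndVisible()
        return true
    }
}

// Top-level code is allowed because this file is main.swift.
UIApplicationMain(
    CommandLine.argc,
    CommandLine.unsafeArgv,
    NSStringFromClass(KioskApplication.self),
    NSStringFromClass(AppDelegate.self)
)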
0 replies · 0 boosts · 546 views · Oct ’21
NavigationView title iOS 15
Hello, NavigationView just changed and I'm a bit lost. Could you tell me the correct way to show the title with a (material) background when the content is a map? Thanks!!!

import SwiftUI
import MapKit

struct ExampleStationListView_Previews: PreviewProvider {
    static var previews: some View {
        NavigationView {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationBarTitle("Select Station", displayMode: .inline)
        }
    }
}
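On iOS 15 the bar's scrollEdgeAppearance is transparent by default, and a map gives it no scrollable content to push under the bar, so the title floats with no background. A sketch of one fix using the UIKit appearance proxy; run it before the NavigationView is built (e.g. in the App's init), and note the function name is mine:

import SwiftUI
import UIKit

func configureNavigationBarMaterial() {
    let appearance = UINavigationBarAppearance()
    appearance.configureWithDefaultBackground() // the system blur material
    UINavigationBar.appearance().standardAppearance = appearance
    UINavigationBar.appearance().scrollEdgeAppearance = appearance
}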
0 replies · 0 boosts · 668 views · Sep ’21
UIColor(Color.accentColor).getRed(_:green:blue:alpha:) not working
Hello! I have this code in my main app:

.onAppear {
    var r: CGFloat = 0.0
    var g: CGFloat = 0.0
    var b: CGFloat = 0.0
    var a: CGFloat = 0.0
    UIColor(Color.accentColor).getRed(&r, green: &g, blue: &b, alpha: &a)
    print("\(r), \(g), \(b), \(a)")
}

If I use Color.red, the output is:

0.9999999403953552, 0.23137253522872925, 0.18823528289794922, 1.0

But if I use Color.accentColor, I get:

0.0, 0.4784314036369324, 0.9999999403953552, 1.0

instead of:

0.3059999942779541, 0.16499999165534973, 0.5180000066757202, 1.0

which is the value set in Assets.xcassets. It is not only Color.accentColor: all the colors defined in the xcassets behave this way. For example, a color with any/dark appearances will always return the "any" components. Details: https://github.com/heltena/SwiftUI-GetComponents Thanks!
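A sketch of a workaround, assuming the color lives in the asset catalog under Xcode's default "AccentColor" name: read it through UIKit directly and resolve it against an explicit trait collection, which does honor the any/dark variants.

import UIKit

// Resolve an asset-catalog color for a specific appearance before
// extracting its components. "AccentColor" is Xcode's default asset
// name for the global accent color; adjust it if yours differs.
func accentComponents(for style: UIUserInterfaceStyle) -> (r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat)? {
    guard let named = UIColor(named: "AccentColor") else { return nil }
    let resolved = named.resolvedColor(with: UITraitCollection(userInterfaceStyle: style))
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    guard resolved.getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
    return (r, g, b, a)
}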
0 replies · 1 boost · 1.3k views · Aug ’21
UIActivityView + Video --> Couldn't send message
Hello all! I’m working on Earthtunes, an app to listen to the solid Earth. Ad: it is available in the App Store if you want to take a look (I’ll be happy to hear from you!). The point is that this app generates a video to send a sound (not only earthquakes, but any cut of the sound of a seismometer) by text, WhatsApp... and sometimes it raises an error saying it couldn’t send the video. I could not find the reason for it. I’m posting here [1] a simplified project that shows this error, and I would like to ask if you know what is going on. It seems related to the video, but I’m always using the same codec, and sometimes it works, sometimes it does not. Thanks a lot!!! [1] https://github.com/heltena/ExportTunes
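A hedged sketch, not a confirmed diagnosis: one intermittent cause of "Couldn't send message" with generated videos is handing Messages a URL that disappears before the send completes, so it can be worth copying the export to a longer-lived location before presenting the share sheet. The file name here is hypothetical.

import UIKit

func presentShareSheet(for exportedURL: URL, from presenter: UIViewController) throws {
    // Copy the export somewhere that outlives the exporter's scratch space.
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let stableURL = caches.appendingPathComponent("shared-tune.mov")
    try? FileManager.default.removeItem(at: stableURL) // replace any previous copy
    try FileManager.default.copyItem(at: exportedURL, to: stableURL)

    let activity = UIActivityViewController(activityItems: [stableURL], applicationActivities: nil)
    presenter.present(activity, animated: true)
}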
2 replies · 0 boosts · 1.2k views · Aug ’21
AVFoundation / C5 note is not playing properly
Hello, I'm trying to play some waves I'm downloading from a seismometer and the sound is not good. I decided to create a simple wave (C5 note, 523.25 Hz) and play it, and it does not sound right either. Here is my code:

import AVFoundation
import Combine

class ContentViewModel: ObservableObject {
    let audioEngine: AVAudioEngine
    let player: AVAudioPlayerNode

    let data: [Double]
    let sampleRate: Double

    init() {
        let sinFrequency: Double = 523.25 /* C5 */
        let sampleRate: Double = 44100
        let seconds: Double = 5

        let range = 0 ..< Int(seconds * sampleRate)
        self.data = range.map { sin(2.0 * .pi * Double($0) * sinFrequency / sampleRate) }
        self.sampleRate = sampleRate

        audioEngine = AVAudioEngine()
        let _ = audioEngine.mainMixerNode
        audioEngine.prepare()
        try! audioEngine.start()
        try! AVAudioSession.sharedInstance().setCategory(.playback)

        self.player = AVAudioPlayerNode()
        audioEngine.attach(player)
    }

    func copyBuffer<T: FixedWidthInteger>(data: [Double], buffer: AVAudioPCMBuffer, channelData: UnsafePointer<UnsafeMutablePointer<T>>) {
        buffer.frameLength = buffer.frameCapacity
        let buffData = data.map { T(Double(T.max) * $0) }
        memcpy(channelData[0], buffData, Int(buffer.frameCapacity) * MemoryLayout<T>.size)
    }

    enum BufferType {
        case int16
        case int32
    }

    func createBuffer(for type: BufferType) -> AVAudioPCMBuffer {
        switch type {
        case .int16:
            guard
                let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
                let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
                let channelData = buffer.int16ChannelData
            else {
                fatalError()
            }
            copyBuffer(data: data, buffer: buffer, channelData: channelData)
            return buffer
        case .int32:
            // NOTE: this branch is identical to .int16 as posted; it still
            // uses .pcmFormatInt16 and int16ChannelData.
            guard
                let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
                let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
                let channelData = buffer.int16ChannelData
            else {
                fatalError()
            }
            copyBuffer(data: data, buffer: buffer, channelData: channelData)
            return buffer
        }
    }

    func play(for type: BufferType) {
        let buffer = createBuffer(for: type)

        let linkFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
        audioEngine.connect(player, to: audioEngine.mainMixerNode, format: linkFormat)
        audioEngine.prepare()
        audioEngine.mainMixerNode.outputVolume = 0.5
        player.scheduleBuffer(buffer, at: nil, options: .interrupts, completionHandler: nil)
        if !player.isPlaying {
            player.play()
        }
    }
}

You can hear the real note by searching for "Middle C Sine Wave for Ten Hours - 261.6 hertz" on YouTube (the title is wrong; that video is actually C5). Could you please tell me why my sound does not sound like the real C5 note? Thanks!!!

You can create a simple ContentView in Swift with this code:

import SwiftUI

struct ContentView: View {
    @StateObject var viewModel = ContentViewModel()

    var body: some View {
        VStack {
            Spacer()
            HStack {
                Button("Play Int16") {
                    viewModel.play(for: .int16)
                }
                Button("Play Int32") {
                    viewModel.play(for: .int32)
                }
            }
            Spacer()
        }
    }
}
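One detail worth flagging (hedged, from reading the code rather than testing it): the buffers are built as .pcmFormatInt16, but the player is connected to the mixer with the standard Float32 format, and AVAudioPlayerNode expects scheduled buffers to match the node's output format. A sketch of building the buffer in the standard Float32 format instead:

import AVFoundation

// Build the sine-wave buffer in the same standard (Float32) format used
// for the player-to-mixer connection, avoiding the Int16/Float32 mismatch.
func makeFloatBuffer(from data: [Double], sampleRate: Double) -> AVAudioPCMBuffer? {
    guard
        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1),
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(data.count)),
        let channel = buffer.floatChannelData?[0]
    else { return nil }

    buffer.frameLength = buffer.frameCapacity
    for (i, sample) in data.enumerated() {
        channel[i] = Float(sample) // samples are already in -1.0 ... 1.0
    }
    return buffer
}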
1 reply · 0 boosts · 1.1k views · Oct ’20