Hello all!
I'm implementing a view that shows a grid of histograms as a final report to the user. It could have ~100 rows and ~10 columns. I'm using a LazyVGrid, and it looks like this:
Note: the example contains only 3 rows.
It takes less than a second to render the grid, but you can feel the app being blocked for a few ms.
I was wondering if any of you know how to render the chart views asynchronously so that, at least, the interface isn't blocked.
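For reference, the only approach I've sketched so far (my own assumption, with illustrative names, not a proven fix) is to compute each histogram's data off the main actor and show a placeholder until it's ready:

import SwiftUI
import Charts

// Hypothetical grid cell: defers the expensive binning work to a detached
// task and shows a spinner until the data arrives.
struct AsyncHistogramCell: View {
    let rowIndex: Int
    @State private var bins: [Int]?

    var body: some View {
        Group {
            if let bins {
                Chart {
                    ForEach(Array(bins.enumerated()), id: \.offset) { item in
                        BarMark(x: .value("Bin", item.offset),
                                y: .value("Count", item.element))
                    }
                }
            } else {
                ProgressView()
            }
        }
        .frame(height: 80)
        .task {
            // Simulated expensive computation; the real one would bin my report data.
            bins = await Task.detached(priority: .userInitiated) {
                (0 ..< 20).map { _ in Int.random(in: 0 ... 100) }
            }.value
        }
    }
}

This keeps the LazyVGrid cells cheap to create, but I'm not sure it's the right pattern.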
Thanks!
Hello all,
I would like to understand how to create a SwiftUI view like the official Camera app's. When the device orientation changes, the view doesn't animate; only the buttons rotate (4:3, 1x, ...).
The Camera app's view is composed of the flash and Live Photo buttons, the camera preview, the configuration buttons, and the big shutter button. In portrait they run from top to bottom; in landscape, from left to right.
Also, when the last-picture thumbnail is shown, it adapts to the current orientation, as if the camera preview had been rendered in that same device orientation.
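The only idea I've come up with (an assumption on my part; surely not how the Camera app actually does it) is to lock the layout to portrait and counter-rotate just the controls when the device rotates:

import SwiftUI
import Combine
import UIKit

// Sketch: the container stays fixed; each control gets this modifier and
// rotates in place, mimicking the "only the buttons rotate" effect.
struct RotatesWithDeviceOrientation: ViewModifier {
    @State private var angle: Angle = .zero

    func body(content: Content) -> some View {
        content
            .rotationEffect(angle)
            .onAppear {
                UIDevice.current.beginGeneratingDeviceOrientationNotifications()
            }
            .onReceive(NotificationCenter.default.publisher(for: UIDevice.orientationDidChangeNotification)) { _ in
                switch UIDevice.current.orientation {
                case .portrait: withAnimation { angle = .zero }
                case .landscapeLeft: withAnimation { angle = .degrees(90) }
                case .landscapeRight: withAnimation { angle = .degrees(-90) }
                case .portraitUpsideDown: withAnimation { angle = .degrees(180) }
                default: break // face up/down, unknown: keep the last angle
                }
            }
    }
}

extension View {
    func rotatesWithDeviceOrientation() -> some View {
        modifier(RotatesWithDeviceOrientation())
    }
}

But I don't know how to get the "preview rendered in the old orientation" effect for the last-pictures view with this approach.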
Ideas?
Thanks!
Hello,
I'm using a NavigationSplitView and feeding the sidebar from FetchedResults. In the detail view, the user can update the entity's name field, and then the sidebar becomes useless, selecting random rows and hiding others, as in these pictures:
I could reproduce the issue with a sample project. If you want to test it yourself, create a simple project in Xcode: iOS App, with "Use Core Data" and "Host in CloudKit" checked.
Then add a "name: String?" attribute to the Item entity and use this code in ContentView.swift:
import SwiftUI
import CoreData

extension Binding {
    public func replaceNil<T>(_ defaultValue: T) -> Binding<T> where Value == Optional<T> {
        return .init {
            wrappedValue ?? defaultValue
        } set: {
            wrappedValue = $0
        }
    }
}

struct Header: View {
    @ObservedObject var item: Item

    var body: some View {
        VStack {
            if let name = item.name, !name.isEmpty {
                Text(name)
            } else {
                Text("---")
                    .foregroundColor(.secondary)
            }
        }
    }
}

struct ItemDetail: View {
    @ObservedObject var item: Item

    var body: some View {
        VStack {
            Spacer()
            if let timestamp = item.timestamp {
                Text(timestamp, formatter: itemFormatter)
            } else {
                Text("---")
                    .foregroundColor(.secondary)
            }
            Button("Update") {
                item.timestamp = Date.now
            }
            TextField("Name", text: $item.name.replaceNil(""))
            Spacer()
        }
        .padding()
    }
}

struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @State private var selected: Item.ID?
    @FetchRequest(
        sortDescriptors: [
            SortDescriptor(\.name, order: .forward),
            SortDescriptor(\.timestamp, order: .forward),
        ],
        animation: .default)
    private var items: FetchedResults<Item>

    var body: some View {
        NavigationSplitView {
            List(selection: $selected) {
                ForEach(items) { item in
                    Header(item: item)
                }
                .onDelete(perform: deleteItems)
            }
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    EditButton()
                }
                ToolbarItem {
                    Button(action: addItem) {
                        Label("Add Item", systemImage: "plus")
                    }
                }
            }
        } detail: {
            if let selected, let index = items.firstIndex(where: { $0.id == selected }) {
                ItemDetail(item: items[index])
            } else {
                Text("Select something")
            }
        }
    }

    private func addItem() {
        withAnimation {
            do {
                let newItem = Item(context: viewContext)
                newItem.name = "Name"
                newItem.timestamp = Date()
                try viewContext.save()
                selected = newItem.id
            } catch {
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }

    private func deleteItems(offsets: IndexSet) {
        withAnimation {
            offsets.map { items[$0] }.forEach(viewContext.delete)
            do {
                try viewContext.save()
            } catch {
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }
}

private let itemFormatter: DateFormatter = {
    let formatter = DateFormatter()
    formatter.dateStyle = .short
    formatter.timeStyle = .medium
    return formatter
}()

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView().environment(\.managedObjectContext, PersistenceController.preview.container.viewContext)
            .previewInterfaceOrientation(.landscapeLeft)
    }
}
Any ideas?
Wouldn't it be great to port RealityKit's Object Capture to the iPad Pro with M1 and capture objects directly on it?
I think there is no info about it, but am I missing something? Is there a plan to bring it to iOS soon?
Best!
Hello,
I'm developing an app that plays a short audio clip when you push a button. When I test with VoiceOver and "push" the button, the voice says "Play button, playing..." over the audio I'm playing, and it is impossible to hear it.
What do you think is the best approach?
I would like to delay my audio, but I couldn't find a callback for when VoiceOver has finished speaking the sentence.
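The closest thing I've found (a guess at a workaround, not a documented pattern for this exact case) is to post my own announcement and wait for announcementDidFinishNotification before starting playback:

import UIKit

// Sketch: play() is whatever starts the audio. Announce first, then start
// playback once VoiceOver reports the announcement finished (or was interrupted).
final class AnnouncementGate {
    private var observer: NSObjectProtocol?

    func announceThenPlay(_ text: String, play: @escaping () -> Void) {
        guard UIAccessibility.isVoiceOverRunning else {
            play() // no VoiceOver: start immediately
            return
        }
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.announcementDidFinishNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            if let observer = self?.observer {
                NotificationCenter.default.removeObserver(observer)
                self?.observer = nil
            }
            play()
        }
        UIAccessibility.post(notification: .announcement, argument: text)
    }
}

As far as I can tell, though, that notification only covers announcements I post myself, not the automatic "Play button, playing..." speech.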
Is it correct to remove the accessibility label for this kind of button (play/pause/stop)?
Thanks!
Hello,
Since the last release, the "Principal class" setting is not working anymore. I'm using my own UIApplication subclass to implement a "kiosk mode" in my app (after five minutes without events, it returns to the root view).
With the new release, the SwiftUI life cycle is not using my UIApplication subclass.
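For context, a simplified sketch of what my subclass does (names are illustrative):

import UIKit

// Custom UIApplication: every touch event resets an idle timer; after five
// minutes without touches, the app is told to pop back to the root view.
class KioskApplication: UIApplication {
    private var idleTimer: Timer?

    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)
        if event.type == .touches {
            idleTimer?.invalidate()
            idleTimer = Timer.scheduledTimer(withTimeInterval: 5 * 60, repeats: false) { _ in
                NotificationCenter.default.post(name: .kioskDidTimeOut, object: nil)
            }
        }
    }
}

extension Notification.Name {
    static let kioskDidTimeOut = Notification.Name("kioskDidTimeOut")
}

With the UIKit life cycle I could pass NSStringFromClass(KioskApplication.self) as the principal class argument to UIApplicationMain, but I can't find the equivalent hook for a SwiftUI @main App type.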
Is there a trick to do it?
Thanks!
Hello,
NavigationView just changed and I'm a bit lost. Could you tell me the correct way to show the title with a (material) background when the content is a map?
Thanks!!!
import SwiftUI
import MapKit

struct ExampleStationListView_Previews: PreviewProvider {
    static var previews: some View {
        NavigationView {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationBarTitle("Select Station",
                                    displayMode: .inline)
        }
    }
}
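In case it helps frame the question, this is what I've been trying with the new iOS 16 modifiers (assuming toolbarBackground is the intended replacement, which may be wrong):

import SwiftUI
import MapKit

struct StationListAttempt: View {
    var body: some View {
        NavigationStack {
            Map(mapRect: .constant(MKMapRect.world))
                .ignoresSafeArea()
                .navigationTitle("Select Station")
                .navigationBarTitleDisplayMode(.inline)
                // Assumption: force a visible material behind the inline title.
                .toolbarBackground(.ultraThinMaterial, for: .navigationBar)
                .toolbarBackground(.visible, for: .navigationBar)
        }
    }
}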
Hello!
I have this code on my main app:
.onAppear {
    var r: CGFloat = 0.0
    var g: CGFloat = 0.0
    var b: CGFloat = 0.0
    var a: CGFloat = 0.0
    UIColor(Color.accentColor).getRed(&r, green: &g, blue: &b, alpha: &a)
    print("\(r), \(g), \(b), \(a)")
}
If I'm using "Color.red", the output is:
0.9999999403953552, 0.23137253522872925, 0.18823528289794922, 1.0
But, if I'm using Color.accentColor:
0.0, 0.4784314036369324, 0.9999999403953552, 1.0
Instead of:
0.3059999942779541, 0.16499999165534973, 0.5180000066757202, 1.0
which is the one set in Assets.xcassets.
It's not only Color.accentColor but all the colors defined in the xcassets: for example, a color with Any/Dark appearances will always return the Any components.
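For what it's worth, the workaround I've been experimenting with (assuming the catalog color is the default "AccentColor" asset; adjust the name if yours differs) is to load the UIColor by name and resolve it against an explicit trait collection:

import UIKit

// Resolving the named catalog color for a concrete appearance returns the
// expected components, instead of the placeholder I get through Color.
let catalogColor = UIColor(named: "AccentColor")!
let resolved = catalogColor.resolvedColor(with: UITraitCollection(userInterfaceStyle: .light))

var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
resolved.getRed(&r, green: &g, blue: &b, alpha: &a)
print("\(r), \(g), \(b), \(a)")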
Details:
https://github.com/heltena/SwiftUI-GetComponents
Thanks!
Hello all!
I’m working on Earthtunes, an app for listening to the solid Earth. (Shameless plug: it’s available on the App Store if you want to take a look; I’ll be happy to hear from you!)
The point is that this app generates a video to share a sound (not only earthquakes, but any cut of a seismometer’s signal) by text, WhatsApp, and so on. Sometimes it raises an error saying that it could not send the video, and I could not find the reason for it. I’m posting here [1] a simplified project that reproduces the error, and I would like to ask if you know what is going on.
It seems related to the video itself, but I’m always using the same codec, and sometimes it works, sometimes it doesn’t.
Thanks a lot!!!
[1] https://github.com/heltena/ExportTunes
Hi all,
I'm looking for an algorithm to generate random numbers in a kernel shader function, similar to "curand" for CUDA, but I couldn't find one. Is there some interesting library?
Thank you so much!
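In the meantime, the best candidate I've found is a stateless integer hash such as the PCG variant from Jarzynski & Olano's "Hash Functions for GPU Rendering". Here it is sketched in Swift; since it's only integer ops, it should port almost line-for-line to a Metal kernel function, seeding each thread with its thread id:

// PCG-style stateless hash: the same input always gives the same output,
// so a kernel thread can seed it with its thread id (plus a per-frame offset).
func pcgHash(_ input: UInt32) -> UInt32 {
    let state = input &* 747796405 &+ 2891336453
    let word = ((state >> ((state >> 28) &+ 4)) ^ state) &* 277803737
    return (word >> 22) ^ word
}

// Map the hash to [0, 1), similar to what curand_uniform returns.
func uniformFloat(seed: UInt32) -> Float {
    Float(pcgHash(seed)) * (1.0 / 4294967296.0)
}

But I'd love to know if there is a ready-made library for this.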
Hello,
I'm trying to play some waveforms I'm downloading from a seismometer, and the sound is not good.
I decided to create a simple wave (a C5 note, 523.25 Hz) and play it, and it doesn't work either.
Here is my code:
import AVFoundation
import Combine
class ContentViewModel: ObservableObject {
		let audioEngine: AVAudioEngine
		let player: AVAudioPlayerNode
		
		let data: [Double]
		let sampleRate: Double
		init() {
				let sinFrequency: Double = 523.25	/* C5 */
				let sampleRate: Double = 44100
				let seconds: Double = 5
				
				let range = 0 ..< Int(seconds * sampleRate)
				self.data = range.map { sin(2.0 * .pi * Double($0) * sinFrequency / sampleRate) }
				self.sampleRate = sampleRate
				
				audioEngine = AVAudioEngine()
				let _ = audioEngine.mainMixerNode
				audioEngine.prepare()
				try! audioEngine.start()
				try! AVAudioSession.sharedInstance().setCategory(.playback)
				
				self.player = AVAudioPlayerNode()
				audioEngine.attach(player)
		}
		func copyBuffer<T: FixedWidthInteger>(data: [Double], buffer: AVAudioPCMBuffer, channelData: UnsafePointer<UnsafeMutablePointer<T>>) {
				buffer.frameLength = buffer.frameCapacity
				let buffData = data.map { T(Double(T.max) * $0) }
				memcpy(channelData[0], buffData, Int(buffer.frameCapacity) * MemoryLayout<T>.size)
		}
		enum BufferType {
				case int16
				case int32
		}
		
		func createBuffer(for type: BufferType) -> AVAudioPCMBuffer {
				switch type {
				case .int16:
						guard
								let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: 1, interleaved: false),
								let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
								let channelData = buffer.int16ChannelData
						else {
								fatalError()
						}
						copyBuffer(data: data, buffer: buffer, channelData: channelData)
						return buffer
				case .int32:
						guard
								let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt32, sampleRate: sampleRate, channels: 1, interleaved: false),
								let buffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: UInt32(data.count)),
								let channelData = buffer.int32ChannelData
						else {
								fatalError()
						}
						copyBuffer(data: data, buffer: buffer, channelData: channelData)
						return buffer
				}
		}
		
		func play(for type: BufferType) {
				let buffer = createBuffer(for: type)
				
				let linkFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
				audioEngine.connect(player, to: audioEngine.mainMixerNode, format: linkFormat)
				audioEngine.prepare()
				audioEngine.mainMixerNode.outputVolume = 0.5
				player.scheduleBuffer(buffer, at: nil, options: .interrupts, completionHandler: nil)
				if !player.isPlaying {
						player.play()
				}
		}
}
You can hear the reference note by searching for "Middle C Sine Wave for Ten Hours - 261.6 hertz" on YouTube (the title is wrong; that video actually plays C5).
Could you please tell me why my sound does not sound like the real C5 note?
Thanks!!!
You can create a simple ContentView in SwiftUI with this code:
import SwiftUI
struct ContentView: View {
		@StateObject var viewModel = ContentViewModel()
		
		var body: some View {
				VStack {
						Spacer()
						HStack {
								Button("Play Int16") {
										viewModel.play(for: .int16)
								}
								Button("Play Int32") {
										viewModel.play(for: .int32)
								}
						}
						Spacer()
				}
		}
}