Post

Replies

Boosts

Views

Activity

UITextView will not scroll to bottom under specific circumstances
Within my UIViewController I have a UITextView which I use to dump current status and info into. Obviously, every time I add text to the UITextView I would like it to scroll to the bottom. So I've created this function, which I call from the UIViewController whenever I have new data:

func updateStat(status: String, tView: UITextView) {
    db.status = db.status + status + "\n"
    tView.text = db.status
    let range = NSMakeRange(tView.text.count - 1, 0)
    tView.scrollRangeToVisible(range)
    tView.flashScrollIndicators()
}

The only thing that does not work is the tView.scrollRangeToVisible. However, if from the UIViewController I call:

updateStat(status: "...new data...", tView: mySession)
let range = NSMakeRange(mySession.text.count - 1, 0)
mySession.scrollRangeToVisible(range)

then the UITextView's scrollRangeToVisible does work. I'm curious if anyone knows why this works when called within the UIViewController, but not when called from a function?

P.S. I have also tried the updateStat function as an extension to UIViewController, but that doesn't work either.
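One possible explanation (an assumption, not something stated in the post): when the scroll happens in the same pass that sets tView.text, the text view may not yet have laid out the new content, so scrollRangeToVisible targets a stale content size. A minimal sketch that forces layout and defers the scroll to the next run-loop pass (db is assumed to be the poster's data object):

```swift
import UIKit

func updateStat(status: String, tView: UITextView) {
    db.status += status + "\n"
    tView.text = db.status
    // Let the text view lay out its new content first...
    tView.layoutIfNeeded()
    // ...then scroll on the next run-loop pass, when the
    // content size reflects the appended text.
    DispatchQueue.main.async {
        let range = NSRange(location: tView.text.count - 1, length: 0)
        tView.scrollRangeToVisible(range)
        tView.flashScrollIndicators()
    }
}
```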
1
0
595
Aug ’22
Do I have to build in support for user scrolling through a UITextView object?
I am trying to add a UITextView within my app to output data to. Naturally the data will eventually be bigger than the size of the UITextView, and the view is a set size, so I would like the user to be able to scroll through its content. However, I cannot scroll through the content in the app. Am I supposed to build the scrolling function myself? Seems weird that I would have to do that, but I cannot seem to find the answer to this on the web. I've also noticed that no vertical scroll bar shows up when the text count is larger than the size of the object, which makes me wonder if I am missing a property or two.

func createStatusField() -> UITextView {
    let myStatus = UITextView(frame: CGRect(x: 50, y: 50, width: 100, height: 300))
    myStatus.autocorrectionType = .no
    myStatus.text = "hello there"
    myStatus.backgroundColor = .secondarySystemBackground
    myStatus.textColor = .secondaryLabel
    myStatus.font = UIFont.preferredFont(forTextStyle: .body)
    myStatus.layer.zPosition = 1
    myStatus.isScrollEnabled = true
    myStatus.showsVerticalScrollIndicator = true
    return myStatus
}
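UITextView is a UIScrollView subclass, so scrolling comes built in; no custom scrolling code should be needed. A common cause of the symptom described (an assumption about this code, since the post doesn't show how the view is added) is touches never reaching the text view: interaction disabled somewhere in the hierarchy, or a sibling view overlapping it. A sketch of the properties worth checking:

```swift
import UIKit

func createStatusField() -> UITextView {
    let myStatus = UITextView(frame: CGRect(x: 50, y: 50, width: 100, height: 300))
    myStatus.autocorrectionType = .no
    myStatus.text = "hello there"
    myStatus.backgroundColor = .secondarySystemBackground
    myStatus.textColor = .secondaryLabel
    myStatus.font = UIFont.preferredFont(forTextStyle: .body)
    myStatus.isScrollEnabled = true
    myStatus.showsVerticalScrollIndicator = true
    // Scrolling only works if touches reach the view: interaction
    // must be on for this view AND every superview above it.
    myStatus.isUserInteractionEnabled = true
    myStatus.isEditable = false   // read-only log; still scrollable
    return myStatus
}
```

Also worth noting: the scroll indicator only flashes during a scroll; it does not stay visible just because the content overflows.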
3
0
623
Aug ’22
Questions about Xcode 13.4.1 and supported iOSs
I already posted about Xcode 13.4.1 not supporting the iPhone's iOS 15.6, but the answer raised even more questions. If the latest version of Xcode (13.4.1) won't support iOS 15.6, why should I think an earlier version of Xcode would? What is the real solution to getting Xcode to run apps on that iOS? GitHub does not have device support files past 15.5? Does Xcode automatically update its supported iOS files behind the scenes? Is there a planned date for Xcode to support iOS 15.6? Thank you
0
0
584
Aug ’22
How to go back to a prior version of Xcode? Version 13.4.1 won't work with iPhone iOS 15.6
So I've found out from other posts that Xcode 13.4.1 won't debug apps on iPhones with iOS 15.6. The solution that everyone seems to agree on is to go back to Xcode 13.3.1. While I am downloading the xip file for that version, I want to check first on how to install the older version. I don't need to mess things up any worse than they are now.
2
0
1.6k
Aug ’22
Am unable to add an AVAudioMixerNode to downsize my recording
Am at the beginning of a voice recording app. I store incoming voice data into a buffer array, and write 50 of them to a file. That code works fine; see Sample One. However, I would like the recorded files to be smaller, so in Sample Two I try to add an AVAudioMixerNode to downsample the audio. But that code gives me two errors. The first error comes when I call audioEngine.attach(downMixer); the debugger prints nine of these:

throwing -10878

The second error is a crash when I try to write to audioFile. Of course they might all be related, so am looking to include the mixer successfully first. But I do need help, as I am just trying to piece these all together from tutorials, and when it comes to audio, I know less than anything else.

Sample One

// these two lines are in the init of the class that contains this function...
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1
    node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            audioFile = makeFile(format: recordingFormat, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    mainView?.setLabelText(tag: 4, text: "write error")
                    stopRecording()
                }
            }
            // ...cleanup buffer code
        }
    })
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        print("oh catch \(error)")
    }
}

Sample Two

// these two lines are in the init of the class that contains this function
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1
    // new code
    let format16KHzMono = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 11025.0, channels: 1, interleaved: true)
    let downMixer = AVAudioMixerNode()
    audioEngine.attach(downMixer)
    // installTap on the mixer rather than the node
    downMixer.installTap(onBus: 0, bufferSize: 8192, format: format16KHzMono, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            // use a different format in creating the audioFile
            audioFile = makeFile(format: format16KHzMono!, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    stopRecording()
                }
            }
            // ...cleanup buffers...
        }
    })
    let format = node.inputFormat(forBus: 0)
    // new code
    audioEngine.connect(node, to: downMixer, format: format)                             // use default input format
    audioEngine.connect(downMixer, to: audioEngine.outputNode, format: format16KHzMono)  // use new audio format
    downMixer.outputVolume = 0.0
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        print("oh catch \(error)")
    }
}
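One possible reading of the error (an assumption on my part; -10878 is kAudioUnitErr_FormatNotSupported): AVAudioEngine mixers and taps generally work in non-interleaved Float32, so asking for an interleaved Int16 format both at the connection and at the tap can be rejected. A sketch of an alternative: downsample in float through the mixer, and let the file handle any integer conversion. Names like makeFile mirror the post and are assumed to exist:

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let node = audioEngine.inputNode
let downMixer = AVAudioMixerNode()
audioEngine.attach(downMixer)

// Float32, non-interleaved: the layout mixers and taps expect.
let downFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: 11025.0,
                               channels: 1,
                               interleaved: false)!

// The engine performs the sample-rate conversion across this connection.
audioEngine.connect(node, to: downMixer, format: node.inputFormat(forBus: 0))
audioEngine.connect(downMixer, to: audioEngine.mainMixerNode, format: downFormat)
downMixer.outputVolume = 0.0   // tap only; don't play the mic back

// Tap in the same float format the mixer outputs.
downMixer.installTap(onBus: 0, bufferSize: 8192, format: downFormat) { buffer, _ in
    // buffer is 11025 Hz mono Float32 here; append / write to an
    // AVAudioFile whose settings request Int16 for the on-disk savings.
}
```

This is a sketch under those assumptions, not a verified fix for the crash.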
0
0
1.1k
Aug ’22
Detecting background volume level in Swift
Am trying to distinguish the differences in volume between background noise and someone speaking, in Swift. Previously, I had come across a tutorial which had me looking at the power levels in each channel. It came out as the code listed in Sample One, which I call within the installTap closure. It was OK, but the variance between the background and the intended voice to record wasn't that great. Sure, it could have been the math used to calculate it, but since I have no experience with audio data, it was like reading another language. Then I came across another demo. Its code was much simpler, and the difference in values between background noise and speaking voice was much greater, therefore much more detectable. It's listed here as Sample Two, which I also call within the installTap closure. My issue here is wanting to understand what is happening in the code. In all my experience with other languages, voice was something I never dealt with before, so this is way over my head. Not looking for someone to explain this to me line by line. But if someone could let me know where I can find decent documentation so I can better grasp what is going on, I would appreciate it. Thank you

Sample One

func audioMetering(buffer: AVAudioPCMBuffer) {
    // buffer.frameLength = 1024
    let inNumberFrames: UInt = UInt(buffer.frameLength)
    if buffer.format.channelCount > 0 {
        let samples = buffer.floatChannelData![0]
        var avgValue: Float32 = 0
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames)
        var v: Float = -100
        if avgValue != 0 {
            v = 20.0 * log10f(avgValue)
        }
        self.averagePowerForChannel0 = (self.LEVEL_LOWPASS_TRIG * v) + ((1 - self.LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel0)
        self.averagePowerForChannel1 = self.averagePowerForChannel0
    }
    if buffer.format.channelCount > 1 {
        let samples = buffer.floatChannelData![1]
        var avgValue: Float32 = 0
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames)
        var v: Float = -100
        if avgValue != 0 {
            v = 20.0 * log10f(avgValue)
        }
        self.averagePowerForChannel1 = (self.LEVEL_LOWPASS_TRIG * v) + ((1 - self.LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel1)
    }
}

Sample Two

private func getVolume(from buffer: AVAudioPCMBuffer, bufferSize: Int) -> Float {
    guard let channelData = buffer.floatChannelData?[0] else {
        return 0
    }
    let channelDataArray = Array(UnsafeBufferPointer(start: channelData, count: bufferSize))
    var outEnvelope = [Float]()
    var envelopeState: Float = 0
    let envConstantAtk: Float = 0.16
    let envConstantDec: Float = 0.003
    for sample in channelDataArray {
        let rectified = abs(sample)
        if envelopeState < rectified {
            envelopeState += envConstantAtk * (rectified - envelopeState)
        } else {
            envelopeState += envConstantDec * (rectified - envelopeState)
        }
        outEnvelope.append(envelopeState)
    }
    // 0.007 is the low pass filter to prevent
    // getting the noise entering from the microphone
    if let maxVolume = outEnvelope.max(), maxVolume > Float(0.015) {
        return maxVolume
    } else {
        return 0.0
    }
}
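For what it's worth, a rough reading of the two samples (my interpretation, not authoritative): Sample One averages the magnitudes of the samples (vDSP_meamgv), converts that average to decibels with 20·log10, and low-pass filters the result across calls; Sample Two is a classic envelope follower, which rectifies each sample and chases it with a fast attack and slow decay, so it reacts quickly to speech onsets but ignores brief noise. A plain-Swift sketch of that envelope idea on an arbitrary float array:

```swift
import Foundation

// Envelope follower: track the rectified signal quickly on the way up
// (attack) and slowly on the way down (decay), then report the peak.
func peakEnvelope(_ samples: [Float],
                  attack: Float = 0.16,
                  decay: Float = 0.003) -> Float {
    var state: Float = 0
    var peak: Float = 0
    for s in samples {
        let rectified = abs(s)            // magnitude only; sign carries no loudness
        let k = state < rectified ? attack : decay
        state += k * (rectified - state)  // one-pole smoothing toward the sample
        peak = max(peak, state)
    }
    return peak
}
```

The search terms that tend to turn up good explanations of this are "envelope follower", "peak detector", and "RMS vs peak metering".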
1
0
2.2k
Jul ’22
Can I calculate the bufferSize on .installTap to equal X seconds
Below is a quick snippet of where I record audio. I would like to get a sampling of the background audio so that later I can filter out background noise. I figure 10 to 15 seconds should be a good amount of time. Although I am assuming that it can change depending on the iOS device, the format returned from .inputFormat is:

<AVAudioFormat 0x600003e8d9a0: 1 ch, 48000 Hz, Float32>

Based on the format info, is it possible to make bufferSize for .installTap be just the right size for whatever time I wish to record? I realize I can create a timer for 10 seconds, stop recording, paste the files I have together, etc. But if I can avoid all that extra coding it would be nice.

let node = audioEngine.inputNode
let recordingFormat = node.inputFormat(forBus: 0)
makeFile(format: recordingFormat)
node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
    audioMetering(buffer: buffer)
    print("\(self.averagePowerForChannel0) \(self.averagePowerForChannel1)")
    if self.averagePowerForChannel0 < -50 && self.averagePowerForChannel1 < -50 {
        ...
    } else {
        ...
    }
    do {
        ... write audio file
    } catch {
        return
    }
})
audioEngine.prepare()
do {
    try audioEngine.start()
} catch let error {
    print("oh catch \(error)")
}
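The arithmetic itself is simple (10 s at 48 000 Hz is 480 000 frames), but installTap's documentation describes bufferSize as a request the implementation may not honor, so a tap for a fixed duration is usually done by counting frames across callbacks rather than by one giant buffer. A sketch of that approach (my approach, not from the post):

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let node = audioEngine.inputNode
let format = node.inputFormat(forBus: 0)

// frames needed = sample rate * seconds, whatever the device's rate is
let secondsWanted = 10.0
let framesWanted = AVAudioFramePosition(format.sampleRate * secondsWanted)
var framesSeen: AVAudioFramePosition = 0

node.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, _ in
    framesSeen += AVAudioFramePosition(buffer.frameLength)
    // ...write buffer to the background-noise file...
    if framesSeen >= framesWanted {
        node.removeTap(onBus: 0)   // ~10 s captured; stop tapping
    }
}
```

This keeps everything in one file with no timers or pasting, at the cost of one counter.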
2
0
963
Jul ’22
Where can I call register(_:forCellReuseIdentifier:) if I am not using a UITableViewController?
Currently I am placing a UITableView inside my UIViewController, with multiple other items. I am not using a UITableViewController. However, I am unable to register the identifier "masterlist", or should I say I am not sure where I can register it. I am not using a storyboard, and if I am still to register it in my UIViewController's viewDidLoad, I am unable to figure out the proper syntax. Is this something I can do?

class BeginNewCustomer: UIViewController {
    let myList = createTblView()

    override func viewDidLoad() {
        super.viewDidLoad()
        myList.delegate = self
        myList.dataSource = self
    }

    func createTblView() -> UITableView {
        let myTable = MyTableView(frame: CGRect(x: 0, y: 0, width: 0, height: 0))
        myTable.backgroundColor = .white
        return myTable
    }
}

extension BeginNewInv: UITableViewDelegate, UITableViewDataSource {
    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return dbClass.invsList.count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = UITableViewCell()
        // let cell = tableView.dequeueReusableCell(withIdentifier: "masterlist", for: indexPath)
        var config = cell.defaultContentConfiguration()
        // .... fill the configuration
        cell.contentConfiguration = config
        return cell
    }
}
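register(_:forCellReuseIdentifier:) is plain UITableView API, so it works fine without a storyboard or UITableViewController; viewDidLoad is a common place to call it. A minimal sketch of the pattern (assuming an all-programmatic setup; names are illustrative):

```swift
import UIKit

class BeginNewCustomer: UIViewController, UITableViewDataSource, UITableViewDelegate {
    let myList = UITableView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register the cell class against the identifier once, up front.
        myList.register(UITableViewCell.self, forCellReuseIdentifier: "masterlist")
        myList.dataSource = self
        myList.delegate = self
        view.addSubview(myList)
    }

    func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return 0 // replace with the real count
    }

    func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // Dequeue now succeeds because the identifier was registered above.
        return tableView.dequeueReusableCell(withIdentifier: "masterlist", for: indexPath)
    }
}
```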
3
0
631
Jul ’22
How can I get the text of a label using the new UIContentConfiguration
Am trying to port away from .textLabel.text, as per Apple, but I can't seem to find the way to get back the text I set. I now set my cells this way:

let cell = UITableViewCell()
var config = cell.defaultContentConfiguration()
config.text = dbClass.nameList[indexPath.row].clientName
config.textProperties.color = .black
config.textProperties.alignment = .center
cell.contentConfiguration = config

But when I try to get the text of a selected cell (once the user hits the OK button), I can't seem to reverse engineer how to get the text:

let cell = myClientList.cellForRow(at: myInvsList.indexPathForSelectedRow!)
let config = cell?.contentConfiguration
dbClass.curMasterinvList = ?????? // can't find what property to read here

I even tried:

let config = cell?.defaultContentConfiguration()

hoping that the text would be in there, but the text there is blank, so am assuming that's just the standard config before I've changed it. I have Googled as much as possible, but can't seem to find this very simple need.
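The snag is that contentConfiguration is typed as the UIContentConfiguration protocol, which has no text property; the concrete type returned by defaultContentConfiguration() is UIListContentConfiguration, so a cast gets it back. A sketch (table-view names follow the post and are assumed to be UITableViews):

```swift
import UIKit

if let indexPath = myInvsList.indexPathForSelectedRow,
   let cell = myClientList.cellForRow(at: indexPath),
   let config = cell.contentConfiguration as? UIListContentConfiguration {
    dbClass.curMasterinvList = config.text   // the text set at configure time
}
```

That said, reading the model directly, e.g. dbClass.nameList[indexPath.row].clientName, is often the more robust route, since it doesn't depend on the cell still being on screen.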
4
0
893
Jul ’22
How can I find my app created log on my iPhone?
I am using the code below to create my own debug log for my app. On the simulator, I have no problem viewing that log: I simply print out the documents directory in the debugger, then open it in my Finder. However, I do not know how to access the created log on my iPhone itself. Even if I go to Window -> Devices and Simulators, when I look at my app's container it's empty. I would like to be able to access the file from any actual device in the future. Am I using the wrong directory? I even used allDomainsMask in place of userDomainMask below, but to no avail.

debugFileURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("myApp.log")
if let handle = try? FileHandle(forWritingTo: dbClass.debugFileURL!) {
    handle.seekToEndOfFile()                // moving pointer to the end
    handle.write(text.data(using: .utf8)!)  // adding content
    handle.closeFile()                      // closing the file
} else {
    try! text.write(to: dbClass.debugFileURL!, atomically: false, encoding: .utf8)
}
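The Documents directory is the right place; what's usually missing (general UIKit knowledge, not something the post confirms applies here) is exposing it. Setting the Info.plist keys UIFileSharingEnabled ("Application supports iTunes file sharing") and LSSupportsOpeningDocumentsInPlace ("Supports opening documents in place") to YES makes the app's Documents folder visible in the Files app on the device and in Finder when the phone is connected. With that in place, the same logging code finds its file:

```swift
import Foundation

// Same Documents-directory log the post builds, with the path printed
// so it can be matched against what Files / Finder shows.
let logURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("myApp.log")
print("log lives at:", logURL.path)
```

Downloading the full app container from Devices and Simulators (the gear menu under the installed app) is another route that doesn't need the plist keys.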
1
0
447
Jul ’22
Changing my entities in CoreData gave me a NSCocoaErrorDomain Code=134140 and I can't go back
Updated info below.

I went and changed one of the entities in the Core Data model of my app. All my entities have Codegen set to "Manual". So I deleted all four files (for two entities), cleaned the build folder, and regenerated the Core Data files with Editor -> Create NSManagedObject Subclass. Now every time I run the app I get a fatalError in the following code in the AppDelegate:

lazy var persistentContainer: NSPersistentContainer = {
    let container = NSPersistentContainer(name: "Invoice_Gen")
    container.loadPersistentStores(completionHandler: { (storeDescription, error) in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    })
    return container
}()

The error being:

[error] error: addPersistentStoreWithType:configuration:URL:options:error: returned error NSCocoaErrorDomain (134140)

Even if I remove the files for the Core Data entities, and comment out anything related to them code-wise, I still get this crash. If someone has any idea of whether I have to delete something else, or whatever, I would so appreciate it. This one has me more stumped than anything before it.

Update: The change I made was to turn one of the entities' attributes from String to Int. When I changed it back, everything worked. From my research on Google there is something about the mapping model, but I cannot find it at all.
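A possible reading (my interpretation of the error code, worth verifying): 134140 is NSMigrationMissingMappingModelError, i.e. the existing store on the device no longer matches the model and Core Data couldn't infer how to migrate it. A String-to-Int attribute change typically cannot be handled by lightweight migration, so it would need a new model version plus a custom mapping model; during development, deleting the app (and its old store) is the quick way out. A sketch of the container setup with migration options made explicit:

```swift
import CoreData

lazy var persistentContainer: NSPersistentContainer = {
    let container = NSPersistentContainer(name: "Invoice_Gen")
    if let desc = container.persistentStoreDescriptions.first {
        // These default to true, but making them explicit documents that
        // lightweight migration is expected; it covers additive changes,
        // not type changes like String -> Int.
        desc.shouldMigrateStoreAutomatically = true
        desc.shouldInferMappingModelAutomatically = true
    }
    container.loadPersistentStores { _, error in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    }
    return container
}()
```

For a shipped app, the safer pattern is Editor -> Add Model Version before editing entities, so the old version stays around for migration.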
1
0
1.3k
Jun ’22
Is there an order for response, data, and error in URLSession delegates?
Posted this on Stack Overflow, but after 2+ weeks got only 10 views and no responses.

I have my URLSession delegates working, but am curious about the order in which I get back response, data, and error. Right now am testing purposeful errors on my server side, so I can check the response if it's not 200. If it's not, I do not need to process the data, etc. So I could use a flag, but I have to be sure that I'll always get response, data, and error in a specific order. Does such an order exist?

extension UploadInv: UIDocumentPickerDelegate, URLSessionDataDelegate, URLSessionTaskDelegate, URLSessionDelegate {

    // Error received
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        if let err = error {
            print("Error: \(err.localizedDescription)")
        }
    }

    // Response received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive response: URLResponse, completionHandler: @escaping (URLSession.ResponseDisposition) -> Void) {
        completionHandler(.allow)
        DispatchQueue.main.async { [self] in
            if let httpResponse = response as? HTTPURLResponse {
                if httpResponse.statusCode != 200 {
                    self.myStatus.text = "Error \(String(httpResponse.statusCode))"
                    myBackButt.isEnabled = true
                }
            }
        }
    }

    // Data received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        DispatchQueue.main.async { [self] in
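To the best of my understanding of the URLSessionDataDelegate contract (worth confirming against Apple's docs): for a data task the order is fixed — didReceive response fires once before any data, didReceive data fires zero or more times, and didCompleteWithError is always the final call, with a nil error on success. That makes the flag approach safe. A sketch:

```swift
import Foundation

final class UploadDelegate: NSObject, URLSessionDataDelegate {
    private var shouldProcessBody = false

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive response: URLResponse,
                    completionHandler: @escaping (URLSession.ResponseDisposition) -> Void) {
        // Runs before any data: decide here whether the body matters.
        shouldProcessBody = (response as? HTTPURLResponse)?.statusCode == 200
        completionHandler(.allow)
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        guard shouldProcessBody else { return }   // safe: response already seen
        // ...accumulate data...
    }

    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        // Always the last delegate call for the task.
        if let error = error { print("Error: \(error.localizedDescription)") }
    }
}
```

One caveat: a transport-level failure (no connectivity, timeout) can mean didCompleteWithError arrives without any response callback at all, so the flag should default to "don't process".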
3
0
952
May ’22