Hi there.
I want to listen for every text change in a UITextView. The setup is trivial.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    NotificationCenter.default.addObserver(
        self,
        selector: #selector(textViewDidChangeWithNotification(_:)),
        name: UITextView.textDidChangeNotification,
        object: nil
    )
}

@objc private func textViewDidChangeWithNotification(_ notification: Notification) {
    print("Text: \(String(describing: inputTextView.text))")
}
It works fine in most cases, but then I ran into some of UITextInput's black-box magic.
Step 1: Type 'I'.
Step 2 (the important one): Select all the text by double-tapping the field.
Step 3: Select 'If' from the word suggestions.
And there is no notification for that new 'If'. This is the important part for my task. On the other hand, if the caret is at the end of the previously typed 'I' and I select 'If', I do receive a notification about the change. So the only difference is whether the word is fully selected or not when tapping the 'If' suggestion.
Is there any way to observe ALL text changes?
Of course, I know that I can get the text by using:
func textViewDidEndEditing(_ textView: UITextView)
but I need to observe all changes in real time as the user types. The other option I always see suggested is:
func textView(_ textView: UITextView, shouldChangeTextIn range: NSRange, replacementText text: String) -> Bool
but this method is bad practice for observing changes. If you type two spaces in a row, iOS replaces the first space with a period and, of course, does not report that replacement through this method — and it has plenty of other problems as well.
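To make the problem concrete: that delegate method only hands you a proposed edit as a range plus replacement text, so to track the text you have to reconstruct it yourself. A minimal Foundation-only sketch of that reconstruction (plain strings stand in for the UITextView, which is an assumption for illustration):

```swift
import Foundation

// Sketch: how a shouldChangeTextIn-style edit composes into the new text.
// The delegate reports `range` + `replacementText`; applying it is on you.
func applyingEdit(to text: String, range: NSRange, replacement: String) -> String {
    return (text as NSString).replacingCharacters(in: range, with: replacement)
}

// Typing "f" after "I" with the caret at the end:
let afterTyping = applyingEdit(to: "I", range: NSRange(location: 1, length: 0), replacement: "f")
print(afterTyping) // "If"
```

The catch is exactly the one described above: system rewrites such as the double-space-to-period substitution mutate the text without routing through the delegate at all, so any text you reconstruct this way silently drifts out of sync.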
So is there any way to observe ALL text changes?
Hi there! My goal is to trim audio quite precisely. I'm facing a strange issue when exporting with AVAssetExportSession. The code is pretty straightforward:

import UIKit
import AVFoundation
import PlaygroundSupport
let asset: AVURLAsset = AVURLAsset(url: Bundle.main.url(forResource: "tmp", withExtension: "aac")!)
print(asset)
let timeRange = CMTimeRange(
    start: CMTime(seconds: 20.0, preferredTimescale: asset.duration.timescale),
    end: CMTime(seconds: 25.0, preferredTimescale: asset.duration.timescale)
)
let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough)!
exportSession.outputFileType = .m4a
let fm = FileManager.default
let tmpDirURL = fm.temporaryDirectory.appendingPathComponent("cut.m4a")
try? fm.removeItem(at: tmpDirURL)
exportSession.outputURL = tmpDirURL
print(tmpDirURL)
exportSession.timeRange = timeRange
exportSession.exportAsynchronously {
    switch exportSession.status {
    case .completed:
        print("completed")
    default:
        print("exportSession: \(exportSession.error?.localizedDescription ?? "error")")
    }
}
When I analyzed the results in Audacity, I saw that the file is trimmed with some error, which is critical for me. Aligning by peaks (by eye), I see roughly a 500 ms error in this particular case. The error varies and repeats across the different files I've tried. I've also tried AVMutableComposition, with the same result. Am I doing something wrong, or missing something? I want files to be cut exactly at the times I set via the timeRange property of AVAssetExportSession.
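For scale: CMTime stores time as an integer value over an integer timescale, and with a passthrough preset the exporter cannot re-encode, so cuts can only land on compressed-packet boundaries. A back-of-the-envelope sketch (plain integers stand in for CMTime; the 44.1 kHz sample rate and 1024-sample AAC packet size are assumptions about the source file) showing that boundary snapping alone is far smaller than the ~500 ms drift observed:

```swift
import Foundation

// CMTime-style conversion: seconds -> integer value at a given timescale.
func timeValue(seconds: Double, timescale: Int) -> Int {
    return Int((seconds * Double(timescale)).rounded())
}

let timescale = 44100                                        // assumed sample rate
let start = timeValue(seconds: 20.0, timescale: timescale)   // 882000
let end   = timeValue(seconds: 25.0, timescale: timescale)   // 1102500

// Assumed AAC packet size: 1024 samples per packet. Snapping a passthrough
// cut to the nearest packet boundary costs at most one packet of audio.
let aacPacketSamples = 1024
let maxSnapError = Double(aacPacketSamples) / Double(timescale)
print(start, end, maxSnapError)                              // error ≈ 0.023 s
```

So even if the cut points are quantized to packet boundaries, that would account for roughly 23 ms, not 500 ms — which is what makes the observed offset puzzling.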