My iPhone 11 already has iOS 17 beta 6,
but "Personal Voice" does not appear.
Is this feature not available on the iPhone 11, even though it is running iOS 17 beta 6?
Another question: in an app I build myself, can I use AVSpeechUtterance to capture synthesized audio spoken with my Personal Voice into a file or buffer?
Thank you!
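For the second question: on devices and OS versions that support Personal Voice, iOS 17 adds an authorization request and a voice trait for it, and AVSpeechSynthesizer.write() can deliver the audio as buffers instead of playing it. The sketch below is a minimal, untested-on-device example; the helper name writePersonalVoice and the completion handling are illustrative, not an official recipe.

```swift
import AVFoundation

// Hypothetical helper: request Personal Voice access, then collect one
// utterance's audio as PCM buffers instead of speaking it aloud.
// Requires iOS 17+ and a device that supports Personal Voice.
func writePersonalVoice(_ text: String, completion: @escaping ([AVAudioPCMBuffer]) -> Void) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { completion([]); return }
        // Personal voices carry the .isPersonalVoice trait.
        guard let voice = AVSpeechSynthesisVoice.speechVoices()
            .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) else {
            completion([])
            return
        }
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = voice
        var buffers: [AVAudioPCMBuffer] = []
        // In real code, keep a strong reference to the synthesizer
        // (e.g. as a property) so it is not deallocated mid-synthesis.
        let synthesizer = AVSpeechSynthesizer()
        synthesizer.write(utterance) { buffer in
            if let pcm = buffer as? AVAudioPCMBuffer, pcm.frameLength > 0 {
                buffers.append(pcm)
            } else {
                // A zero-length buffer marks the end of the utterance.
                completion(buffers)
            }
        }
    }
}
```

Note that Personal Voice is only offered on newer hardware, which may explain why the setting does not appear on an iPhone 11 even with iOS 17 installed.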
Hello,
I'm combining audio from AVSpeechSynthesizer.write() with some photos into a video.
I tried a very long text, enough to produce a video of around 50 minutes.
While saving the video to the gallery, the app would freeze until the save finished.
In other cases, the app would crash and I would have to build and run it again.
I tried using PHPhotoLibrary.shared().performChanges() instead of UISaveVideoAtPathToSavedPhotosAlbum, but the app still froze until the video was saved to the gallery, or it crashed and would not recover.
Here's the code:
private let synthesizer = AVSpeechSynthesizer()
private var counterImage = 0
let semaphore = DispatchSemaphore(value: 0)

init(_ texts: [String]) {
    Misc.obj.lData.removeAll()
    Misc.obj.selectedPhotos.append(createBlueImage(CGSize(width: 100, height: 100)))
    Misc.obj.selectedPhotos.append(createBlueImage(CGSize(width: 100, height: 100)))
    super.init()
    synthesizer.delegate = self
    DispatchQueue.global().async {
        do {
            try self.nextText(texts)
            msgRelatos("Completed.")
        } catch {
            msgErro(error.localizedDescription)
        }
    }
}

func nextText(_ texts: [String]) throws {
    var audioBuffers = [CMSampleBuffer]()
    var videoBuffers = [CVPixelBuffer]()
    var lTime = [0.0]
    for text in texts {
        var time = Double.zero
        var duration = AVAudioFrameCount.zero
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "pt-BR")
        utterance.rate = 0.2
        synthesizer.write(utterance) { buffer in
            if let buffer = buffer as? AVAudioPCMBuffer, let sampleBuffer = buffer.toCMSampleBuffer(presentationTime: .zero) {
                audioBuffers.append(sampleBuffer)
                duration += buffer.frameLength
                time += Double(buffer.frameLength) / buffer.format.sampleRate
            }
        }
        // Wait for the delegate's didFinish callback before moving on.
        semaphore.wait()
        if Misc.obj.selectedPhotos.indices.contains(counterImage) {
            let image = Misc.obj.selectedPhotos[counterImage]
            //let annotatedImage = image.addTexto(texts[quantTxt])
            let pixelBuffer = image.toCVPixelBuffer()
            videoBuffers.append(pixelBuffer!)
            lTime.append(time)
            // Advance counterImage, wrapping around the photo list
            counterImage += 1
            if counterImage == Misc.obj.selectedPhotos.count {
                counterImage = 0
            }
        }
    }
    let url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("*****/output.mp4")
    try FileManager.default.createDirectory(at: url.deletingLastPathComponent(), withIntermediateDirectories: true)
    if FileManager.default.fileExists(atPath: url.path()) {
        try FileManager.default.removeItem(at: url)
    }
    let audioProvider = SampleProvider(buffers: audioBuffers)
    let videoProvider = SampleProvider(buffers: videoBuffers, lTempo: lTime)
    let audioInput = createAudioInput(audioBuffers: audioBuffers)
    let videoInput = createVideoInput(videoBuffers: videoBuffers)
    let adaptor = createPixelBufferAdaptor(videoInput: videoInput)
    let assetWriter = try AVAssetWriter(outputURL: url, fileType: .mp4)
    assetWriter.add(videoInput)
    assetWriter.add(audioInput)
    assetWriter.startWriting()
    assetWriter.startSession(atSourceTime: .zero)
    let writerQueue = DispatchQueue(label: "Asset Writer Queue")
    videoInput.requestMediaDataWhenReady(on: writerQueue) {
        if let buffer = videoProvider.getNextBuffer() {
            adaptor.append(buffer, withPresentationTime: videoProvider.getPresentationTime())
        } else {
            videoInput.markAsFinished()
            if audioProvider.isFinished() {
                self.semaphore.signal()
            }
        }
    }
    audioInput.requestMediaDataWhenReady(on: writerQueue) {
        if let buffer = audioProvider.getNextBuffer() {
            audioInput.append(buffer)
        } else {
            audioInput.markAsFinished()
            if videoProvider.isFinished() {
                self.semaphore.signal()
            }
        }
    }
    semaphore.wait()
    assetWriter.finishWriting {
        switch assetWriter.status {
        case .completed:
            msgRelatos("Completed.")
            UISaveVideoAtPathToSavedPhotosAlbum(url.path, nil, nil, nil)
        case .failed:
            if let error = assetWriter.error {
                msgErro("Error: \(error.localizedDescription)")
            } else {
                msgRelatos("Nothing was recorded.")
            }
        default:
            msgRelatos("Unknown writer status.")
        }
    }
}

extension TesteFala: AVSpeechSynthesizerDelegate {
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        semaphore.signal()
    }
}
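One likely cause of the freeze is blocking a thread with semaphore.wait() while also saving the file synchronously. A minimal sketch of an asynchronous save, assuming url is the MP4 file that AVAssetWriter produced (the function name and printed messages are illustrative):

```swift
import Photos

// Save a finished MP4 to the photo library asynchronously, so the UI
// thread is never blocked while the export completes.
func saveVideoToGallery(_ url: URL) {
    // .addOnly is enough when the app only writes new assets.
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized || status == .limited else { return }
        PHPhotoLibrary.shared().performChanges({
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }) { success, error in
            // Report the result on the main queue; no semaphore needed.
            DispatchQueue.main.async {
                if success {
                    print("Saved.")
                } else {
                    print("Save failed: \(error?.localizedDescription ?? "unknown error")")
                }
            }
        }
    }
}
```

With this shape, the completion handler replaces the semaphore for the save step, and a long export can no longer freeze the app while the gallery write is in flight.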
Hello!
I have a few questions.
I don't use GitHub, since I am deafblind and a Braille display user; I avoid taking on extra tools, as accessibility tends to complicate things.
If I provide a link, should the project be temporary, or must I leave it there permanently?
As for a crash log, I don't have one, but the project is very small.
I don't have an error message either.
Thank you!
Hello!
See if this link works.
Hello,
I tried to open the code-level support request twice, but I am not receiving the automatic email.
Hello,
I am not receiving the automatic email when I send messages to Developer Technical Support.
Please help me.
Hello,
I have already sent the attachment via email. Now I am just waiting for a response.
Hello,
I am deaf-blind and use a wheelchair. I received guidance from Apple Developer Support that is beyond my capacity. It involved viewing a panel in Xcode to fix a bug in my app, but this panel is not accessible. I need special attention from Apple Support, and I'm worried they may not assist me fully in this. I can follow instructions related to source code, but not ones that rely on Xcode panels. Please inform Support that I am deaf-blind, use a Braille display, and am mainly limited to working with source code text. Panels like Instruments > Allocations are challenging in Braille. Please help me!
Hello,
So much inaccessibility. The tags are not accessible for selection on a Braille display, and I am a Braille display user. I tried manually adding the backtick to create the code block, but it didn't work. It's very frustrating.
I'm facing a memory issue in my app. I was asked to reduce the project size, which I did, but the issue wasn't resolved. I need someone who can truly focus on this. I've tried everything with my app: many, many workarounds. The main project is much larger than this small segment.
I need special assistance, because this disability isn't just a "difficulty"; it's a real barrier. From my experiences over the past several months, after more than a year as a visually and hearing-impaired developer, I'd go as far as to say that Apple's preparedness for developers with disabilities is poor and in urgent need of significant improvements.
Hello,
Yes, I've already tried that. It's giving an error when I try to write with AVAssetWriter. How would a while loop look when writing the video, placing the content from the source video into an output video?
Hello,
I actually want an example of two while loops after the configuration is set up: one loop for the AVAssetReader, handling both image and audio, and then one loop for the output video using AVAssetWriter. I understood the configuration part; I just need the loops for the source and destination videos, an example of both, handling both image and audio.
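A minimal sketch of those copy loops, assuming an AVAssetReader and AVAssetWriter that are already configured and started (startReading(), startWriting(), startSession(atSourceTime:)). The helper name copyTrack and the done callback are illustrative; call it once for the video output/input pair and once for the audio pair:

```swift
import AVFoundation

// Copy samples from one reader track output into one writer input.
// The while loop feeds the writer only while it can accept more data;
// AVFoundation re-invokes the outer block whenever the input is ready again.
func copyTrack(from output: AVAssetReaderTrackOutput,
               to input: AVAssetWriterInput,
               queue: DispatchQueue,
               done: @escaping () -> Void) {
    input.requestMediaDataWhenReady(on: queue) {
        while input.isReadyForMoreMediaData {
            if let sample = output.copyNextSampleBuffer() {
                input.append(sample)
            } else {
                // nil means the reader has no more samples (or stopped).
                input.markAsFinished()
                done()
                return
            }
        }
    }
}
```

Call copyTrack for the video pair and the audio pair on separate serial queues, and invoke assetWriter.finishWriting only after both done callbacks have fired (for example by counting completions or using a DispatchGroup).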