I need to show a plain keyboard with no smart prompts at the top and no emoji or mic icon at the bottom.
The default keyboard state is:
As you can see, there are smart prompts at the top and extra buttons at the bottom.
After I changed the keyboard type with textView.keyboardType = .asciiCapable, it looks like this:
Much better, but there is still a mic button at the bottom, and it looks like there is no way to hide it.
Is there a way?
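For reference, here is a minimal sketch of my setup so far, together with the other input traits I know about that affect the smart bar (these are standard UITextInputTraits properties; whether anything can remove the mic button is exactly what I'm asking):

// minimal sketch: suppress as much keyboard "smartness" as possible;
// none of these is documented to hide the dictation (mic) button
textView.keyboardType = .asciiCapable      // what I already applied
textView.autocorrectionType = .no          // should hide the predictive (QuickType) bar
textView.spellCheckingType = .no
textView.smartQuotesType = .no
textView.smartDashesType = .no
textView.smartInsertDeleteType = .no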
I have two apps that play audio. I need to handle the case where the first app gets interrupted: its sound should stop, and when the interruption ends the app should resume playback.
There is a great example with an exact implementation of this: https://stackoverflow.com/a/71678102/5709159
The trick is that when the interrupting app finishes its playback, it has to fire this event:
try? AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
and the first app (the one that should continue playing) has to catch this event like this:
self.observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: nil,
    queue: nil
) { [weak self] n in
    guard let self = self else { return } // legal in Swift 4.2
    // Extract the interruption type from the notification payload
    let why = n.userInfo![AVAudioSessionInterruptionTypeKey] as! UInt
    let type = AVAudioSession.InterruptionType(rawValue: why)!
    switch type {
    case .began:
        print("interruption began:\n\(n.userInfo!)")
    case .ended:
        print("interruption ended:\n\(n.userInfo!)")
        // Check whether the system says we should resume playback
        guard let opt = n.userInfo![AVAudioSessionInterruptionOptionKey] as? UInt else { return }
        let opts = AVAudioSession.InterruptionOptions(rawValue: opt)
        if opts.contains(.shouldResume) {
            print("should resume")
            print("Is playing: \(self.player.isPlaying)")
            print("is I prepared to play: \(self.player.prepareToPlay())")
            print("bp tried to resume play: did I? \(self.player.play())")
        } else {
            print("not should resume")
        }
    @unknown default:
        fatalError()
    }
}
The problem is that this line, self.player.prepareToPlay(), returns false.
What am I missing here?
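For completeness, here is the variant of the .ended branch I plan to try next — a sketch, assuming the session has to be re-activated before prepareToPlay() can succeed (this is my guess, not something the linked answer states):

if opts.contains(.shouldResume) {
    // assumption: re-activate our own session first, then prepare and play
    try? AVAudioSession.sharedInstance().setActive(true)
    if self.player.prepareToPlay() {
        self.player.play()
    }
}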
I use AVAudioEngine to play sound in my app. Everything works properly when the app moves from the foreground to the background and vice versa; the only problem is that I need a way to notify the other music streaming apps after I finish my playback, so they can continue to play.
I'll give an example: I open Spotify (a music streaming app) and tap play, so music starts. Then I open my app and tap play; the music from Spotify pauses and the music from my app starts. Then I close my app and nothing happens. The expected result is that after I close my app, Spotify continues playing from the moment it was paused. I assume there is some kind of event I should fire after I finish, which other streaming apps can catch in order to continue.
Am I right?
*It could be Apple Music instead of Spotify, or any other app.
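For reference, this is the kind of event I suspect I need to fire after stopping my engine — a sketch, assuming deactivating the shared AVAudioSession with .notifyOthersOnDeactivation is the mechanism other apps listen for (stopPlayback(), player, and engine are my own names here):

func stopPlayback() {
    player.stop()   // AVAudioPlayerNode
    engine.stop()   // AVAudioEngine
    // tell the system we are done, so other apps receive an
    // interruption-ended notification with the .shouldResume option
    try? AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
}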
I don't understand what is wrong with my project. Everything worked without any issues, and then at some moment a popup started to appear (attached).
I tried restarting my phone and my laptop, closing and reopening Xcode, and creating a new target; I even updated my device to iOS 15.4. Nothing helps.
What am I missing?
I have a native static lib that exposes a few .h files as its API. The dir structure looks like this:
3rdParties/
    MyAPIHeader.h
    my.modulemap
My modulemap file looks like this:
module MyAPIHeader {
    header "MyAPIHeader.h"
    export *
}
Everything works well, until I need to add another API header to my modulemap that does not reside in the same dir:
anotherProjectDir/
    AnotherAPIHeader.h
3rdParties/
    MyAPIHeader.h
    my.modulemap
Now my modulemap file looks like this:
module MyAPIHeader {
    header "MyAPIHeader.h"
    export *
}

module AnotherAPIHeader {
    header "AnotherAPIHeader.h"
    export *
}
Then, when I try to use AnotherAPIHeader functions, I get this error in the module map file:
Header 'AnotherAPIHeader.h' not found
I tried setting a relative path to the .h file in the module map (doesn't work), then I tried setting the path to the header file in the target settings (doesn't work).
To sum up: when the .h file included in the module map resides in the same dir as the module map, it works; when the .h file resides in another dir, I can find no way to point the module map at it with a relative path.
What am I missing?
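For clarity, this is the relative-path variant I tried. As far as I understand, header paths in a module map resolve relative to the module map file itself, so I expected something like this to work (the ../ prefix is my attempt, not a documented recipe):

module AnotherAPIHeader {
    header "../anotherProjectDir/AnotherAPIHeader.h"
    export *
}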
The problem is that if I create an SCNReferenceNode and try to get its referenceURL
...
let refNode: SCNReferenceNode = SCNReferenceNode()
let curUrl = refNode.referenceURL
...
in order to check whether it is nil, I get an exception:
Thread 1: EXC_BREAKPOINT (code=1, subcode=0x186d06dd0)
And it looks like there is no way to work around this.
P.S. I know there is an option to pass a URL to the constructor, like SCNReferenceNode(url: myURL), but I have to use an approach where I create the SCNReferenceNode first, before the reference URL is available.
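For illustration, the only crash-free pattern I can see is deferring the node's creation until the URL exists, which does not fit my flow — a sketch (incomingURL is a hypothetical placeholder):

var refNode: SCNReferenceNode?            // no node yet, so nothing to crash on
// ...later, once the URL is finally known:
if let url = incomingURL {
    refNode = SCNReferenceNode(url: url)  // failable init; nil for an invalid URL
    refNode?.load()
}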
I cloned the project from the repo and built it; as a result, I see a lot of warnings that are not related to my laptop at all.
The warning says there is no such file at the path
\Users\alekseytimoshchenko\Library\...
But there is no such user (alekseytimoshchenko) on my laptop at all. It looks like a hardcoded path (instead of a relative one) is defined somewhere in the settings, but I checked all the paths and everything seems fine.
What is a possible problem here?
I need to get Data from a [Float]. To do this, I use the following approach:
var vertexData: [Float] = [...]
let p_vertexData: Data = Data(buffer: UnsafeBufferPointer(start: vertexData, count: VDataCount))
It works well, except for the annoying warning I get:
Initialization of 'UnsafeBufferPointer' results in a dangling buffer pointer
So I changed it to the recommended approach:
let p_vertexData: Data = withUnsafeBytes(of: vertexData) { Data($0) }
But this way it doesn't work correctly. I don't know exactly what is going on, but the result (where I consume the Data) is not what I expect.
The question is: how do I get the Data the proper way?
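My current guess is that the free function withUnsafeBytes(of:) exposes the bytes of the array value itself (essentially its buffer pointer), not the element storage. Here is a sketch of the variant I would expect to be correct — calling withUnsafeBytes on the array instance, so the closure sees the elements (my understanding, not yet verified against my rendering code):

var vertexData: [Float] = [1.0, 2.0, 3.0]  // illustrative values
// the closure receives a raw buffer over the element storage,
// so Data copies the actual Float contents
let p_vertexData: Data = vertexData.withUnsafeBytes { Data($0) }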
I found a kind of weird phenomenon. Here is a very simple example: I have an ARSCNView that is empty, just a root node and a camera, so the camera feed is open but there is nothing else on the screen.
There is a method that I call from viewDidLoad() as printAllScene(node: scnView.scene.rootNode) in order to print all the nodes in the scene:
private func printAllScene(node: SCNNode) {
    print("node: \(node), parent is: \(String(describing: node.parent))")
    for curNode in node.childNodes {
        printAllScene(node: curNode)
    }
}
This recursive method prints all the nodes in the scene. When I call it in viewDidLoad(), I get this output:
>>> 1
node: <SCNNode: 0x283313300 | 1 child>, parent is: nil
node: <SCNNode: 0x283318500 | camera=<SCNCamera: 0x1065261c0> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 1 child>)
<<<
Everything is OK here, but when I call the same print method from a button tap a few seconds later, I get another output:
>>> 3
node: <SCNNode: 0x283313300 | 3 children>, parent is: nil
node: <SCNNode: 0x283318500 pos(-0.002828 0.016848 0.034329) rot(0.229921 0.895326 -0.381477 0.336716) scale(1.000000 1.000000 1.000000) | camera=<SCNCamera: 0x1065261c0> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 3 children>)
node: <SCNNode: 0x283336d00 | light=<SCNLight: 0x103973dd0 | type=probe> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 3 children>)
node: <SCNNode: 0x283324800 pos(-0.007079 -0.074922 0.031407) rot(0.000000 -1.000000 0.000000 1.474911) | light=<SCNLight: 0x103985120 | type=probe> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 3 children>)
<<<
Now it already has 3 children instead of one. How?? Then, after a few more seconds, I tap again and get yet another output:
>>> 3
node: <SCNNode: 0x283313300 | 4 children>, parent is: nil
node: <SCNNode: 0x283318500 pos(0.000974 0.011970 0.039510) rot(0.231398 0.878019 -0.418974 0.304382) scale(1.000000 1.000000 1.000000) | camera=<SCNCamera: 0x1065261c0> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 4 children>)
node: <SCNNode: 0x283336d00 | light=<SCNLight: 0x103973dd0 | type=probe> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 4 children>)
node: <SCNNode: 0x283324800 pos(-0.007079 -0.074922 0.031407) rot(0.000000 -1.000000 0.000000 1.474911) | light=<SCNLight: 0x11aceec70 | type=probe> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 4 children>)
node: <SCNNode: 0x28332ac00 pos(-0.007079 -0.074922 0.031407) rot(0.000000 -1.000000 0.000000 1.474911) | light=<SCNLight: 0x11b908130 | type=probe> | no child>, parent is: Optional(<SCNNode: 0x283313300 | 4 children>)
<<<
It has 4 children, interesting huh?
It should also be mentioned that, according to the documentation, there is a property automaticallyUpdatesLighting that seems very close to the right solution. I set it in viewDidLoad like this: scnView.automaticallyUpdatesLighting = false, but it makes no difference in my case; I still see the light nodes being added.
Could it be related to the configuration? For example, does ARKit create light nodes that match the real-world light ambiance in order to simulate it in the scene, or something like that?
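In case it is relevant, these are the configuration-side switches I plan to test, assuming the probe lights come from ARKit's light estimation / environment texturing (a hypothesis, not something I have confirmed):

let configuration = ARWorldTrackingConfiguration()
configuration.isLightEstimationEnabled = false  // stop estimating real-world light
configuration.environmentTexturing = .none      // no automatic environment probes
scnView.automaticallyUpdatesLighting = false    // already tried, shown for context
scnView.session.run(configuration)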
I need to place an image in the center of the SCNScene background. To do this, I use the following approach:
...
let image = UIImage(named: "my_image")
let scene = SCNScene()
scene.background.contents = image
...
But the image gets stretched to fill the entire screen; what I need is for it to just sit in the center.
I tried different approaches, without luck. For example, creating this extension was my last attempt:
extension SCNScene {
    func addBackground(imageName: String = "YOUR DEFAULT IMAGE NAME", contentMode: UIView.ContentMode = .scaleToFill) {
        // setup the UIImageView
        let backgroundImageView = UIImageView(frame: UIScreen.main.bounds)
        backgroundImageView.image = UIImage(named: imageName)
        backgroundImageView.contentMode = contentMode
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        // adding NSLayoutConstraints
        let leadingConstraint = NSLayoutConstraint(item: backgroundImageView, attribute: .leading, relatedBy: .equal, toItem: self, attribute: .leading, multiplier: 1.0, constant: 0.0)
        let trailingConstraint = NSLayoutConstraint(item: backgroundImageView, attribute: .trailing, relatedBy: .equal, toItem: self, attribute: .trailing, multiplier: 1.0, constant: 0.0)
        let topConstraint = NSLayoutConstraint(item: backgroundImageView, attribute: .top, relatedBy: .equal, toItem: self, attribute: .top, multiplier: 1.0, constant: 0.0)
        let bottomConstraint = NSLayoutConstraint(item: backgroundImageView, attribute: .bottom, relatedBy: .equal, toItem: self, attribute: .bottom, multiplier: 1.0, constant: 0.0)
        self.background.contents = backgroundImageView
        NSLayoutConstraint.activate([leadingConstraint, trailingConstraint, topConstraint, bottomConstraint])
    }
}
UPD
Actual result is:
Desired result is:
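Another direction I am considering — a sketch, assuming the scene background honors SCNMaterialProperty's contentsTransform and wrap modes (I have not confirmed that it does, which may be the catch):

scene.background.contents = UIImage(named: "my_image")
// clamp so the image is not tiled outside its own rect
scene.background.wrapS = .clamp
scene.background.wrapT = .clamp
// scale the texture coordinates so the image shrinks toward the center;
// 2.0 is an arbitrary, illustrative factor
scene.background.contentsTransform = SCNMatrix4MakeScale(2.0, 2.0, 1.0)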
I need to load the file (https://tmptetaviplayerpoc.blob.core.windows.net/temp-files/alisa/alise_000.tet). To do this, I use the standard iOS approach:
...
if let loadPath: URL = self.URL {
    let start = DispatchTime.now()
    let (data, _, error) = loadPath.download()
    let end = DispatchTime.now()
...
The download speed I get is weird: the file size is 3738940 bytes (3.7 MB), the number of frames is 20, and the download time is 6.600698917 sec.
So I did the math: the file holds 20 frames, so one second of 30 fps playback needs 1.5 such files, i.e. 3.7 MB × 1.5 × 8 ≈ 44 Mbps. That means a 44 Mbps connection should be enough to keep up with playback, BUT I ran a speed test and my connection is 180 Mbps.
So the question is: what could be the reason that, with a 180 Mbps connection, it takes 6.6 sec to download this file?
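To double-check the timing independently of the download() helper, I am going to time the same fetch with a plain URLSession data task — a sketch (timedDownload is my own hypothetical name):

// baseline: time a raw URLSession download of the same file
func timedDownload(from fileURL: URL) {
    let start = DispatchTime.now()
    URLSession.shared.dataTask(with: fileURL) { data, _, error in
        let seconds = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000_000
        print("bytes: \(data?.count ?? 0), error: \(String(describing: error)), took \(seconds) s")
    }.resume()
}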
Here is the problem: I wrote a static lib (Swift + native) and want to use it in an app.
I added the lib to the app and did all the preparations (linkage in the settings, etc.), and eventually I can use my lib in the app; everything works. BUT the app doesn't see the API methods. What I mean is: for example, there is a method for creating an object, MyLib.createObj(). On the one hand this method works, but on the other hand there is no hint for it when I type MyLib. I expect (as usual) that after I type the dot it will show me all possible methods, right? Or, if I type MyLib.cre, I expect it to autocomplete to MyLib.createObj(). But with my lib I have to type every letter blindly, with no autocompletion or list of possible methods.
It feels like the app doesn't see the API file (the one with the list of all the lib's methods).
If I missed something let me know.
I am new to iOS development. I got a project to maintain, and the problem is that I can't find where in the storyboard a given variable belongs. For example, in my ViewController there is this line:
...
@IBOutlet weak var imgBottomFade: UIImageView!
...
So I need to find this UI view in the storyboard. I open the storyboard, but there are more than 10 screens, each of them pretty sophisticated, and I start clicking on each view on each screen and checking whether its name equals imgBottomFade, one view after another, one screen after another... pretty annoying. I believe Xcode must have some way to find out where a UI view in the storyboard is connected to the controller.
Let me know what I missed.
I have an iPhone 12 64 GB. I got a notification that the storage was almost full, so I tried to delete a few apps from the phone, but it was weird: they stayed on the screen as if nothing happened... Then I decided to reboot the phone (turn it off and on), and now I see this screen (screenshot attached).
Sometimes (about once every 3 minutes) it shows the home screen for a second and then goes back to the loading state...
What could be a possible reason for such behavior and how to fix it?
I have a kind of trivial requirement: I need to seek within sound playback. The problem is that I don't have a local sound file; instead I get a pointer to the sound data (along with other params) from my internal native lib.
Here is the method I use to convert the UnsafeMutableRawPointer to an AVAudioPCMBuffer:
...
var byteCount: Int32 = 0
var buffer: UnsafeMutableRawPointer?
defer {
    buffer?.deallocate()
    buffer = nil
}
if audioReader?.getAudioByteData(byteCount: &byteCount, data: &buffer) ?? false && buffer != nil {
    let audioFormat = AVAudioFormat(standardFormatWithSampleRate: Double(audioSampleRate), channels: AVAudioChannelCount(audioChannels))!
    if let pcmBuf = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: AVAudioFrameCount(byteCount)) {
        let monoChannel = pcmBuf.floatChannelData![0]
        pcmFloatData = [Float](repeating: 0.0, count: Int(byteCount))
        // >>> Convert UnsafeMutableRawPointer to an [Int16] array
        let int16Ptr: UnsafeMutablePointer<Int16> = buffer!.bindMemory(to: Int16.self, capacity: Int(byteCount))
        let int16Buffer: UnsafeBufferPointer<Int16> = UnsafeBufferPointer(start: int16Ptr, count: Int(byteCount) / MemoryLayout<Int16>.size)
        let int16Arr: [Int16] = Array(int16Buffer)
        // <<<
        // Int16 ranges from -32768 to 32767 -- convert and scale to Float values between -1.0 and 1.0
        var scale = Float(Int16.max) + 1.0
        vDSP_vflt16(int16Arr, 1, &pcmFloatData[0], 1, vDSP_Length(int16Arr.count)) // Int16 to Float
        vDSP_vsdiv(pcmFloatData, 1, &scale, &pcmFloatData[0], 1, vDSP_Length(int16Arr.count)) // divide by scale
        memcpy(monoChannel, pcmFloatData, MemoryLayout<Float>.size * Int(int16Arr.count))
        pcmBuf.frameLength = UInt32(int16Arr.count)
        usagePlayer.setupAudioEngine(with: audioFormat)
        audioClip = pcmBuf
    }
}
...
So, at the end of the method you can see the line audioClip = pcmBuf, where the prepared pcmBuf is assigned to a local variable.
Then all I need is to start it like this:
...
/*player is AVAudioPlayerNode*/
player.scheduleBuffer(buf, at: nil, options: .loops)
player.play()
...
And that's it; now I can hear the sound. But let's say I need to seek forward by 10 sec. To do this, I would need to stop() the player node and schedule a new AVAudioPCMBuffer, this time with a 10 sec offset.
The problem is that there is no method that takes an offset, neither on the player node side nor on AVAudioPCMBuffer.
For example, if I were working with a file (instead of a buffer), I could use this method:
...
player.scheduleSegment(
    file,
    startingFrame: seekFrame,
    frameCount: frameCount,
    at: nil
)
...
There, at least, you have the startingFrame: seekFrame and frameCount: frameCount params.
But in my case I don't use a file, I use a buffer, and the problem is that there are no such params in the buffer-based API.
It looks like I can't implement seek logic if I use AVAudioPCMBuffer.
What am I doing wrong?
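The closest thing I can come up with is emulating a seek by copying the tail of the already-prepared buffer into a new AVAudioPCMBuffer and scheduling that — a sketch using the names from this post (audioClip, player); the approach itself is my own idea, not a documented API:

func seek(toSeconds seconds: Double) {
    guard let source = audioClip else { return }
    let offset = AVAudioFramePosition(seconds * source.format.sampleRate)
    let remaining = AVAudioFrameCount(max(0, Int64(source.frameLength) - offset))
    guard remaining > 0,
          let tail = AVAudioPCMBuffer(pcmFormat: source.format, frameCapacity: remaining) else { return }
    // copy frames [offset ..< frameLength] for every channel
    for ch in 0..<Int(source.format.channelCount) {
        memcpy(tail.floatChannelData![ch],
               source.floatChannelData![ch] + Int(offset),
               Int(remaining) * MemoryLayout<Float>.size)
    }
    tail.frameLength = remaining
    player.stop()
    player.scheduleBuffer(tail, at: nil, options: .loops)
    player.play()
}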