I'm wondering if I need to submit a bug report for this, or if this is really what Apple intends for SwiftUI. Ideally I'll find out I'm wrong.
I have a .searchable List that triggers my search function from the .onSubmit(of: .search) modifier. When I do this, there seems to be no way to perform the same behavior as the dismissSearch environment value from inside that closure. I have several interchangeable views inside this List's container, all listening to my ViewModel's isSearchingActive published property. I set ViewModel.isSearchingActive by observing changes to the child views' isSearching environment value, and the children observe ViewModel.isSearchingActive to call the dismissSearch environment value. This does not seem to work.
Here is a minimal reproducible example as a playground. I am hoping I do not need to file a bug report; I hope I am just wrong... Thanks in advance!
import SwiftUI
import PlaygroundSupport

struct SearchingExample: View {
    @State var searchText = ""
    @State var didSubmit = false

    var body: some View {
        NavigationStack {
            SearchedView(didSubmit: $didSubmit)
                .searchable(text: $searchText)
                .onSubmit(of: .search) {
                    didSubmit = true
                }
        }
    }
}

struct SearchedView: View {
    @Environment(\.isSearching) var isSearching
    @Environment(\.dismissSearch) var dismissSearch
    @Binding var didSubmit: Bool

    var body: some View {
        VStack {
            Text(isSearching ? "Searching!" : "Not searching.")
            Button(action: { dismissSearch() }, label: {
                Text("Dismiss Search")
            })
            Button(action: {
                // Return-key stand-in for the playground
            }, label: {
                Image(systemName: "paperplane")
            })
            .frame(width: 30, height: 30)
            .keyboardShortcut(.defaultAction)
            if didSubmit {
                Text("You Submitted Search!")
                    .onAppear {
                        Task { @MainActor in
                            try await Task.sleep(nanoseconds: 3_000_000_000)
                            self.didSubmit = false
                        }
                    }
            }
        }
    }
}

PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.setLiveView(SearchingExample())
I've found a handful of other, seemingly unanswered posts about the Network framework not receiving messages on UDP connections. @Eskimo, whom I perceive as a legend of the Apple forums, even posted some nearly identical code about a year and a half ago that he said worked, while recent posts comment that they are having the same experience as I am.

What I'm doing: I want to switch my networking code to the Network framework's NWConnection, then send and receive on multiple (3) streams to a non-Apple device that I connect to over WiFi with NEHotspotConfigurationManager. This worked before with SwiftSocket and CocoaAsyncSocket.

Problem: NWConnection.receiveMessage(completion:) doesn't seem to get called for UDP. Here is my code. Any ideas what's missing or wrong?

class UDPClient {
    var connection: NWConnection
    var address: NWEndpoint.Host
    var port: NWEndpoint.Port
    var delegate: UDPListener?
    private var listening = true

    var resultHandler = NWConnection.SendCompletion.contentProcessed { NWError in
        guard NWError == nil else {
            print("ERROR! Error when data (Type: Data) sending. NWError: \n \(NWError!)")
            return
        }
    }

    init?(address newAddress: String, port newPort: Int32, listener isListener: Bool = true) {
        guard let codedAddress = IPv4Address(newAddress),
              let codedPort = NWEndpoint.Port(rawValue: NWEndpoint.Port.RawValue(newPort)) else {
            print("Failed to create connection address")
            return nil
        }
        address = .ipv4(codedAddress)
        port = codedPort
        listening = isListener
        connection = NWConnection(host: address, port: port, using: .udp)
        connect()
    }

    func connect() {
        connection.stateUpdateHandler = { newState in
            switch newState {
            case .ready:
                print("State: Ready")
                if self.listening { self.listen() }
            case .setup:
                print("State: Setup")
            case .cancelled:
                print("State: Cancelled")
            case .preparing:
                print("State: Preparing")
            default:
                print("ERROR! State not defined!\n")
            }
        }
        connection.start(queue: .global())
    }

    func send(_ data: Data) {
        connection.send(content: data, completion: resultHandler)
    }

    private func listen() {
        while listening {
            connection.receiveMessage { data, context, isComplete, error in
                print("Receive isComplete: " + isComplete.description)
                guard let data = data else {
                    print("Error: Received nil Data")
                    return
                }
                print("Data Received")
            }
        }
    }
}
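For what it's worth, one thing that stands out in listen() is the while loop around an asynchronous call: it schedules receives in a tight loop rather than waiting for one to complete. The pattern Apple's samples use is to call receiveMessage once and re-arm it from inside the completion handler. Here is a framework-free sketch of that re-arm pattern; MockTransport and collectAll are hypothetical names standing in for NWConnection, not real API:

```swift
import Foundation

// Hypothetical stand-in for NWConnection: it hands over one queued datagram
// per receiveMessage call (the completion runs synchronously here for simplicity).
final class MockTransport {
    private var inbox: [Data]
    init(inbox: [Data]) { self.inbox = inbox }
    func receiveMessage(_ completion: @escaping (Data?) -> Void) {
        completion(inbox.isEmpty ? nil : inbox.removeFirst())
    }
}

// The re-arm pattern: issue ONE receive, and issue the next receive from
// inside the completion handler -- no while loop around the receive call.
func collectAll(from transport: MockTransport) -> [Data] {
    var collected: [Data] = []
    func receiveNext() {
        transport.receiveMessage { data in
            guard let data = data else { return } // stop when nothing is queued
            collected.append(data)
            receiveNext()                         // re-arm for the next datagram
        }
    }
    receiveNext()
    return collected
}
```

With the real NWConnection the completion fires asynchronously on the connection's queue, so the "recursion" never grows the stack; each receive simply schedules the next.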
I connect to a device's WiFi hotspot to control the external device. The device's WiFi does not provide internet/data for the iOS device. The popup asking whether the user wants to use cellular data disrupts the use of my app and can cause serious problems by interrupting the user's control of the device.
Is there a way to prevent it from happening, prompt it to happen immediately, or handle it myself and tell NEHotspotConfigurationManager whether or not to use cellular data, since the WiFi network won't be providing internet access?
I am processing an H.264-encoded video stream from a non-Apple IoT device. I want to record bits of this video stream.
I'm getting an error when I try to save to the photo gallery:
The operation couldn’t be completed. (PHPhotosErrorDomain error 3302.)
My Code, let me know if I need to share more:
private func beginRecording() {
    self.handlePhotoLibraryAuth()
    self.createFilePath()
    guard let videoOutputURL = self.outputURL,
          let vidWriter = try? AVAssetWriter(outputURL: videoOutputURL, fileType: AVFileType.mp4),
          self.formatDesc != nil else {
        print("Warning: No Format For Video")
        return
    }
    let vidInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: nil, sourceFormatHint: self.formatDesc)
    guard vidWriter.canAdd(vidInput) else {
        print("Error: Can't add video writer input")
        return
    }
    let sourcePixelBufferAttributes = [
        kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32ARGB),
        kCVPixelBufferWidthKey as String: NSNumber(value: 1280),  // width/height must be numbers, not strings
        kCVPixelBufferHeightKey as String: NSNumber(value: 720)] as [String: Any]
    self.videoWriterInputPixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: vidInput,
        sourcePixelBufferAttributes: sourcePixelBufferAttributes)
    vidInput.expectsMediaDataInRealTime = true
    vidWriter.add(vidInput)
    guard vidWriter.startWriting() else {
        print("Error: Can't write with vid writer")
        return
    }
    vidWriter.startSession(atSourceTime: CMTimeMake(value: self.videoFrameCounter, timescale: self.videoFPS))
    self.videoWriter = vidWriter
    self.videoWriterInput = vidInput
    print("Recording Video Stream")
}
Save the Video
private func saveRecordingToPhotoLibrary() {
    let fileManager = FileManager.default
    guard fileManager.fileExists(atPath: self.path) else {
        print("Error: The file \(self.path) does not exist, so it cannot be moved to the camera roll")
        return
    }
    print("The file \(self.path) has been saved into the documents folder and is ready to be moved to the camera roll")
    // This is what fails:
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: self.path))
    }) { completed, error in
        guard completed else {
            print("Error: Cannot move the video \(self.path) to the camera roll, error: \(String(describing: error?.localizedDescription))")
            return
        }
        print("Video \(self.path) has been moved to the camera roll")
    }
}
When recording ends, we save the video:
private func endRecording() {
    guard let vidInput = videoWriterInput, let vidWriter = videoWriter else {
        print("Error, no video writer or video input")
        return
    }
    vidInput.markAsFinished()
    if !vidInput.isReadyForMoreMediaData {
        vidWriter.finishWriting {
            print("Finished Recording")
            guard vidWriter.status == .completed else {
                print("Warning: The Video Writer status is not completed, status: \(vidWriter.status.rawValue)")
                print(vidWriter.error.debugDescription)
                return
            }
            print("VideoWriter status is completed")
            self.saveRecordingToPhotoLibrary()
        }
    }
}
What I am trying to do
Record a stream of CMSampleBuffers that lack timestamps.
Problem
I am trying to add timestamps, and appending the re-timed sample buffer fails.
Steps to reproduce:
Get the video stream from an external camera, H.264-encoded in .mp4 format
Create a file path to save the video and handle auth/permissions
Create a variable frameDuration = CMTime(seconds: 1.0/30.0, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
Create a variable for the next presentation timestamp, nextPTS: CMTime = CMTimeMake(value: 0, timescale: 0)
Set up the AVAssetWriter using CMSampleBufferGetFormatDescription(mySampleBuffer)
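The re-timing bookkeeping above is just rational arithmetic: each appended frame is stamped with nextPTS, and nextPTS then advances by frameDuration. A framework-free sketch of that bookkeeping, using a minimal CMTime-like value type (RationalTime is my hypothetical stand-in, not CoreMedia):

```swift
import Foundation

// Hypothetical stand-in for CMTime: `value` counts of 1/timescale seconds.
struct RationalTime: Equatable {
    var value: Int64
    var timescale: Int32
    var seconds: Double { Double(value) / Double(timescale) }
    static func + (lhs: RationalTime, rhs: RationalTime) -> RationalTime {
        // Assume matching timescales for this sketch
        precondition(lhs.timescale == rhs.timescale)
        return RationalTime(value: lhs.value + rhs.value, timescale: lhs.timescale)
    }
}

// 30 fps: each frame lasts 1/30 s; with a timescale of 600 that is 20 ticks.
let frameDuration = RationalTime(value: 20, timescale: 600)
var nextPTS = RationalTime(value: 0, timescale: 600)

// Re-time three frames the way appendFrame does: stamp, then advance.
var stamps: [Double] = []
for _ in 0..<3 {
    stamps.append(nextPTS.seconds) // presentationTimeStamp for this frame
    nextPTS = nextPTS + frameDuration
}
```

The same invariant holds with real CMTime values: after appending N frames, nextPTS should equal N times frameDuration.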
Setup AVAssetWriter Func
private func setupAssetWriter(format formatDescription: CMFormatDescription?) -> Bool {
    // Allocate the writer object with our output file URL
    guard let videoWriter = try? AVAssetWriter(outputURL: URL(fileURLWithPath: self.path), fileType: AVFileType.mp4),
          formatDescription != nil else {
        print("Error: No Format For Video to create AVAssetWriter")
        return false
    }
    self.assetWriter = videoWriter
    // Initialize a new input for video to receive sample buffers for writing.
    // Passing nil for outputSettings instructs the input to pass through appended
    // samples, doing no processing before they are written.
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil, sourceFormatHint: formatDescription)
    videoInput.expectsMediaDataInRealTime = true
    guard videoWriter.canAdd(videoInput) else {
        print("Error: Cannot add Video Input to AVAssetWriter")
        return false
    }
    videoWriter.add(videoInput)
    // Initiate a sample-writing session at time 0
    self.nextPTS = CMTime.zero
    videoWriter.startWriting()
    videoWriter.startSession(atSourceTime: CMTime.zero)
    self.assetWriter = videoWriter
    self.assetWriterInput = videoInput
    return true
}
Append frames as they come in
func appendFrame(_ sampleBuffer: CMSampleBuffer) {
    // Set up the AVAssetWriter using the format description from the first sample buffer captured
    if self.assetWriter == nil {
        let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer)
        guard self.setupAssetWriter(format: formatDescription) else {
            print("Error: Failed to set up asset writer")
            return
        }
    }
    // Re-time the sample buffer -- in this sample, frameDuration is set to 30 fps
    var timingInfo = CMSampleTimingInfo.invalid // a way to get an instance without providing 3 CMTime objects
    timingInfo.duration = self.frameDuration
    timingInfo.presentationTimeStamp = self.nextPTS
    var sbufWithNewTiming: CMSampleBuffer? = nil
    guard CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                                sampleBuffer: sampleBuffer,
                                                sampleTimingEntryCount: 1,
                                                sampleTimingArray: &timingInfo,
                                                sampleBufferOut: &sbufWithNewTiming) == noErr else {
        print("Error: Failed to set up CMSampleBufferCreateCopyWithNewTiming")
        return
    }
    // Append the sample buffer if we can, and increment the presentation time
    guard let writeInput = self.assetWriterInput, writeInput.isReadyForMoreMediaData else {
        print("Error: AVAssetWriterInput not ready for more media")
        return
    }
    guard let sbufWithNewTiming = sbufWithNewTiming else {
        print("Error: sbufWithNewTiming is nil")
        return
    }
    if writeInput.append(sbufWithNewTiming) {
        self.nextPTS = CMTimeAdd(self.frameDuration, self.nextPTS)
    } else {
        let error = self.assetWriter!.error
        NSLog("Error: Failed to append sample buffer: \(error!)")
    }
}
The append fails, and this NSLog fires:
NSLog("Error: Failed to append sample buffer: \(error!)")
with the error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-17507), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x283cf7810 {Error Domain=NSOSStatusErrorDomain Code=-17507 "(null)"}}
I can't find any references to code -17507 online.
My app is getting rejected over and over for what I consider a bogus reason. After 5 days of back and forth I finally cancelled the submission and will resubmit it in a few hours, hoping to get a new reviewer.
I followed the rules.
This app was published before, using the same information, on a different account (which is now gone, so I am moving the app over).
I updated the background images and colors to use SwiftUI. The app was published on a new Apple developer account. Same app, no new features (actually fewer features).
I provided an equal-quality video to demo the features and how I use WiFi.
I also provided the video from the previous app review plus a new video to show the updated UI. Three videos in total, all showing the same drone controller with different colors.
https://youtu.be/hDjZirlId_Q
What can I even do? This is absurd, because now I'm going to miss out on Christmas downloads of my app. Is there any way to escalate or get a new reviewer?
I wanted to use SwiftUI to manage dynamic forms in one of my apps. Dynamic forms aren't compatible with an enum-based approach, yet every usage demo and blog post on iOS 15's FocusState uses an enum.
The official docs use an enum.
Here is my example without an enum, so you can see that it's possible. But should it be done?
//  Focus State Example
//
//  Created by Michael Robert Ellis on 12/7/21.
//
import SwiftUI

struct MyObject: Identifiable, Equatable {
    var id: String
    public var value: String

    init(name: String, value: String) {
        self.id = name
        self.value = value
    }
}

struct ContentView: View {
    @State var myObjects: [MyObject] = [
        MyObject(name: "aa", value: "1"),
        MyObject(name: "bb", value: "2"),
        MyObject(name: "cc", value: "3"),
        MyObject(name: "dd", value: "4")
    ]
    @State var focus: MyObject?

    var body: some View {
        ScrollView(.vertical) {
            VStack {
                Text("Header")
                ForEach(self.myObjects) { obj in
                    Divider()
                    FocusField(displayObject: obj, focus: $focus, nextFocus: {
                        guard let index = self.myObjects.firstIndex(of: $0) else {
                            return
                        }
                        self.focus = myObjects.indices.contains(index + 1) ? myObjects[index + 1] : nil
                    })
                }
                Divider()
                Text("Footer")
            }
        }
    }
}

struct FocusField: View {
    @State var displayObject: MyObject
    @FocusState var isFocused: Bool
    @Binding var focus: MyObject?
    var nextFocus: (MyObject) -> Void

    var body: some View {
        TextField("Test", text: $displayObject.value)
            .onChange(of: focus, perform: { newValue in
                self.isFocused = newValue == displayObject
            })
            .focused(self.$isFocused)
            .submitLabel(.next)
            .onSubmit {
                self.nextFocus(displayObject)
            }
    }
}
Am I being dumb? Am I missing something or am I on to something?
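One way to build confidence in the non-enum approach: the next-focus computation in ContentView can be pulled out into a plain function, which makes it easy to sanity-check without any UI. A sketch using the same array-index logic (nextFocus(after:in:) is my hypothetical name, and this MyObject is a simplified copy):

```swift
import Foundation

// Simplified copy of the post's model type.
struct MyObject: Identifiable, Equatable {
    var id: String
    var value: String
}

// Given the object whose field was just submitted, return the next object
// to focus, or nil when the last field was submitted (same logic as the
// nextFocus closure passed to FocusField).
func nextFocus(after current: MyObject, in objects: [MyObject]) -> MyObject? {
    guard let index = objects.firstIndex(of: current) else { return nil }
    return objects.indices.contains(index + 1) ? objects[index + 1] : nil
}
```

Keeping this logic out of the view body also means the form can grow or shrink at runtime without touching any focus plumbing, which is exactly what the enum approach can't do.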
The beginRecording() func and the endRecording() func matter the most. interpretRawFrameData is just decoding the video stream the entire time, and I want to record parts of the stream. In the decodeFrameData func I attempt, and fail, to append a CMSampleBuffer to my AVAssetWriterInput object.

import Foundation
import VideoToolbox
import AVFoundation
import Photos
typealias FrameData = [UInt8]
protocol VideoFrameDecoderDelegate {
    func receivedDisplayableFrame(_ frame: CVPixelBuffer)
}

class VideoFrameDecoder {
    static var delegate: VideoFrameDecoderDelegate?
    private var formatDesc: CMVideoFormatDescription?
    private var decompressionSession: VTDecompressionSession?
    var isRecording: Bool = false {
        didSet { isRecording ? beginRecording() : endRecording() }
    }
    var outputURL: URL?
    var path = ""
    var videoWriter: AVAssetWriter?
    var videoWriterInput: AVAssetWriterInput?
    func interpretRawFrameData(_ frameData: inout FrameData) {
        var naluType = frameData[4] & 0x1F
        if naluType != 7 && formatDesc == nil { return }
        // Replace the start code with the size
        var frameSize = CFSwapInt32HostToBig(UInt32(frameData.count - 4))
        memcpy(&frameData, &frameSize, 4)
        // The start indices for nested packets. Default to 0.
        var ppsStartIndex = 0
        var frameStartIndex = 0
        var sps: [UInt8]?
        var pps: [UInt8]?
        // SPS parameters
        if naluType == 7 {
            for i in 4..<40 {
                if frameData[i] == 0 && frameData[i+1] == 0 && frameData[i+2] == 0 && frameData[i+3] == 1 {
                    ppsStartIndex = i // Includes the start header
                    sps = Array(frameData[4..<i])
                    // Set naluType to the nested packet's NALU type
                    naluType = frameData[i+4] & 0x1F
                    break
                }
            }
        }
        // PPS parameters
        if naluType == 8 {
            for i in ppsStartIndex+4..<ppsStartIndex+34 {
                if frameData[i] == 0 && frameData[i+1] == 0 && frameData[i+2] == 0 && frameData[i+3] == 1 {
                    frameStartIndex = i
                    pps = Array(frameData[ppsStartIndex+4..<i])
                    // Set naluType to the nested packet's NALU type
                    naluType = frameData[i+4] & 0x1F
                    break
                }
            }
            guard let sps = sps,
                  let pps = pps,
                  createFormatDescription(sps: sps, pps: pps) else {
                print("===== ===== Failed to create formatDesc")
                return
            }
            guard createDecompressionSession() else {
                print("===== ===== Failed to create decompressionSession")
                return
            }
        }
        if (naluType == 1 || naluType == 5) && decompressionSession != nil {
            // If this is successful, the callback will be called.
            // The callback will send the full decoded and decompressed frame to the delegate.
            decodeFrameData(Array(frameData[frameStartIndex...frameData.count - 1]))
        }
    }
    private func decodeFrameData(_ frameData: FrameData) {
        let bufferPointer = UnsafeMutableRawPointer(mutating: frameData)
        // Replace the start code with the size of the NALU
        var frameSize = CFSwapInt32HostToBig(UInt32(frameData.count - 4))
        memcpy(bufferPointer, &frameSize, 4)
        // Create a memory location to store the processed image
        var outputBuffer: CVPixelBuffer?
        var blockBuffer: CMBlockBuffer?
        var status = CMBlockBufferCreateWithMemoryBlock(
            allocator: kCFAllocatorDefault,
            memoryBlock: bufferPointer,
            blockLength: frameData.count,
            blockAllocator: kCFAllocatorNull,
            customBlockSource: nil,
            offsetToData: 0,
            dataLength: frameData.count,
            flags: 0,
            blockBufferOut: &blockBuffer)
        // Return if there was an error allocating the processed image location
        guard status == kCMBlockBufferNoErr else { return }
        // Do some image processing
        var sampleBuffer: CMSampleBuffer?
        let sampleSizeArray = [frameData.count]
        status = CMSampleBufferCreateReady(
            allocator: kCFAllocatorDefault,
            dataBuffer: blockBuffer,
            formatDescription: formatDesc,
            sampleCount: 1,
            sampleTimingEntryCount: 0,
            sampleTimingArray: nil,
            sampleSizeEntryCount: 1,
            sampleSizeArray: sampleSizeArray,
            sampleBufferOut: &sampleBuffer)
        // Return if there was an error
        if let buffer = sampleBuffer,
           status == kCMBlockBufferNoErr {
            let attachments: CFArray? = CMSampleBufferGetSampleAttachmentsArray(buffer, createIfNecessary: true)
            if let attachmentsArray = attachments {
                let dic = unsafeBitCast(CFArrayGetValueAtIndex(attachmentsArray, 0), to: CFMutableDictionary.self)
                CFDictionarySetValue(
                    dic,
                    Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                    Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
                // Decompress with VideoToolbox
                var flagOut = VTDecodeInfoFlags(rawValue: 0)
                status = VTDecompressionSessionDecodeFrame(
                    decompressionSession!,
                    sampleBuffer: buffer,
                    flags: [],
                    frameRefcon: &outputBuffer,
                    infoFlagsOut: &flagOut)
                // Record CMSampleBuffer with AVFoundation
                if isRecording,
                   let vidInput = videoWriterInput,
                   vidInput.isReadyForMoreMediaData {
                    print("Appended: \(vidInput.append(buffer))")
                }
            }
        }
    }
    private func createFormatDescription(sps: [UInt8], pps: [UInt8]) -> Bool {
        let pointerSPS = UnsafePointer(sps)
        let pointerPPS = UnsafePointer(pps)
        let dataParamArray = [pointerSPS, pointerPPS]
        let parameterSetPointers = UnsafePointer(dataParamArray)
        let sizeParamArray = [sps.count, pps.count]
        let parameterSetSizes = UnsafePointer(sizeParamArray)
        let status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
            allocator: kCFAllocatorDefault,
            parameterSetCount: 2,
            parameterSetPointers: parameterSetPointers,
            parameterSetSizes: parameterSetSizes,
            nalUnitHeaderLength: 4,
            formatDescriptionOut: &formatDesc)
        return status == noErr
    }
    private func createDecompressionSession() -> Bool {
        guard let desc = formatDesc else { return false }
        if let session = decompressionSession {
            VTDecompressionSessionInvalidate(session)
            decompressionSession = nil
        }
        let decoderParameters = NSMutableDictionary()
        let destinationPixelBufferAttributes = NSMutableDictionary()
        var outputCallback = VTDecompressionOutputCallbackRecord()
        outputCallback.decompressionOutputCallback = callback
        outputCallback.decompressionOutputRefCon = UnsafeMutableRawPointer(Unmanaged.passUnretained(self).toOpaque())
        let status = VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: desc,
            decoderSpecification: decoderParameters,
            imageBufferAttributes: destinationPixelBufferAttributes,
            outputCallback: &outputCallback,
            decompressionSessionOut: &decompressionSession)
        return status == noErr
    }
    private var callback: VTDecompressionOutputCallback = { (
        decompressionOutputRefCon: UnsafeMutableRawPointer?,
        sourceFrameRefCon: UnsafeMutableRawPointer?,
        status: OSStatus,
        infoFlags: VTDecodeInfoFlags,
        imageBuffer: CVPixelBuffer?,
        presentationTimeStamp: CMTime,
        duration: CMTime) in
        guard let newImage = imageBuffer,
              status == noErr else {
            // -12909 is a bad-video error; nothing too bad unless there's no feed
            if status != -12909 {
                print("===== Failed to decompress. VT Error \(status)")
            }
            return
        }
        // print("===== Image successfully decompressed")
        delegate?.receivedDisplayableFrame(newImage) // use the unwrapped buffer rather than force-unwrapping imageBuffer
    }
    private func handlePhotoLibraryAuth() {
        if PHPhotoLibrary.authorizationStatus() != .authorized {
            PHPhotoLibrary.requestAuthorization { _ in }
        }
    }

    private func createFilePath() {
        let fileManager = FileManager.default
        let urls = fileManager.urls(for: .documentDirectory, in: .userDomainMask)
        guard let documentDirectory: NSURL = urls.first as NSURL? else {
            fatalError("documentDir Error")
        }
        guard let videoOutputURL = documentDirectory.appendingPathComponent("iTello-\(Date()).mp4") else {
            return
        }
        outputURL = videoOutputURL
        path = videoOutputURL.path
        if FileManager.default.fileExists(atPath: path) {
            do {
                try FileManager.default.removeItem(atPath: path)
            } catch {
                print("Unable to delete file: \(error) : \(#function).")
                return
            }
        }
    }
    private func beginRecording() {
        handlePhotoLibraryAuth()
        createFilePath()
        guard let videoOutputURL = outputURL,
              let vidWriter = try? AVAssetWriter(outputURL: videoOutputURL, fileType: AVFileType.mp4) else {
            fatalError("AVAssetWriter error")
        }
        if formatDesc == nil { print("Warning: No Format For Video") }
        let vidInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: nil, sourceFormatHint: formatDesc)
        guard vidWriter.canAdd(vidInput) else {
            print("Error: Can't add video writer input")
            return
        }
        vidInput.expectsMediaDataInRealTime = true
        vidWriter.add(vidInput)
        guard vidWriter.startWriting() else {
            print("Error: Can't write with vid writer")
            return
        }
        vidWriter.startSession(atSourceTime: CMTime.zero)
        self.videoWriter = vidWriter
        self.videoWriterInput = vidInput
        print("Recording Video Stream")
    }
    private func saveRecordingToPhotoLibrary() {
        let fileManager = FileManager.default
        guard fileManager.fileExists(atPath: self.path) else {
            print("Error: The file \(self.path) does not exist, so it cannot be moved to the camera roll")
            return
        }
        print("The file \(self.path) has been saved into the documents folder and is ready to be moved to the camera roll")
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: URL(fileURLWithPath: self.path))
        }) { completed, error in
            guard completed else {
                print("Error: Cannot move the video \(self.path) to the camera roll, error: \(String(describing: error?.localizedDescription))")
                return
            }
            print("Video \(self.path) has been moved to the camera roll")
        }
    }
    private func endRecording() {
        guard let vidInput = videoWriterInput, let vidWriter = videoWriter else {
            print("Error, no video writer or video input")
            return
        }
        vidInput.markAsFinished()
        if !vidInput.isReadyForMoreMediaData {
            vidWriter.finishWriting {
                print("Finished Recording")
                guard vidWriter.status == .completed else {
                    print("Warning: The Video Writer status is not completed, status: \(vidWriter.status.rawValue)")
                    print(vidWriter.error.debugDescription)
                    return
                }
                print("VideoWriter status is completed")
                self.saveRecordingToPhotoLibrary()
                self.videoWriterInput = nil
                self.videoWriter = nil
            }
        }
    }
}

I'm getting AVFoundation error -11800, and the writer is failing for some unknown reason. I'm streaming an H.264/MP4 video from a non-Apple product. The video decodes and displays properly but does not record: the append operation fails every time, and I can't determine why from the error.
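As an aside, the SPS/PPS scan in interpretRawFrameData is just a search for the four-byte Annex B start code (00 00 00 01); factoring that search into a small helper makes the slicing logic easy to verify in isolation. A framework-free sketch (startCodeIndex(in:from:) is my hypothetical name, not API from the post):

```swift
import Foundation

// Find the index of the next 0x00 0x00 0x00 0x01 Annex B start code at or
// after `start`, or nil if none exists. This mirrors the linear scan that
// the SPS/PPS slicing in interpretRawFrameData performs inline.
func startCodeIndex(in data: [UInt8], from start: Int) -> Int? {
    guard start >= 0, start <= data.count - 4 else { return nil }
    for i in start...(data.count - 4) {
        if data[i] == 0, data[i+1] == 0, data[i+2] == 0, data[i+3] == 1 {
            return i
        }
    }
    return nil
}

// Example: start code, an SPS byte (0x67 -> NALU type 7), filler, then a
// second start code followed by a PPS byte (0x68 -> NALU type 8). The low
// 5 bits of the byte after a start code give the NALU type, just like
// `frameData[4] & 0x1F` in the post.
let frame: [UInt8] = [0, 0, 0, 1, 0x67, 0xAA, 0, 0, 0, 1, 0x68]
let next = startCodeIndex(in: frame, from: 4) // index of the second start code
```

Testing the slicing this way also makes the hard-coded scan bounds (4..<40, ppsStartIndex+4..<ppsStartIndex+34) easier to reason about, since out-of-range inputs return nil instead of crashing.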
My users get a prompt that disrupts the use of my app.
The prompt appears more than 10 seconds into using a remote-controlled IoT device via WiFi. The device is a drone, so it's not plausible to pause, and it's absurd to make users wait that long for the prompt before they can begin flight safely.
The Wi-Fi network "MySSID" does not appear to be connected to the Internet
Do you want to temporarily use cellular data?
Use Cellular Data
Keep Trying Wi-Fi
I connect with NEHotspotConfigurationManager
https://developer.apple.com/documentation/networkextension/nehotspotconfigurationmanager
hotspotConfig = NEHotspotConfiguration(ssid: mySSID)
guard let hotspotConfig = hotspotConfig else {
    print("Error while connecting to WiFi")
    return
}
hotspotConfig.joinOnce = true
NEHotspotConfigurationManager.shared.removeConfiguration(forSSID: ssid)
NEHotspotConfigurationManager.shared.apply(hotspotConfig) { [self] (error) in
    if let error = error {
        print(self.handleConnectionError(error))
        completion(false)
    } else if self.wifiConnectionInfo() != nil {
        completion(true)
    } else {
        completion(false)
    }
}
How do I prevent this? Or is there any workaround?