On my purchase page I use RevenueCat to make the initial purchase, and in my SettingsVC I have the in-app purchase API (as required) so the user can later resubscribe:
// ...
try await AppStore.showManageSubscriptions(in: window as! UIWindowScene)
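(For reference, a sketch of how the scene can be resolved without the force cast; this assumes the call is made from a view controller that is currently installed in a window:)

// Sketch: resolve the UIWindowScene from the presenting view controller
// instead of force-casting (assumes the view is in a window at call time).
if let scene = view.window?.windowScene {
    try await AppStore.showManageSubscriptions(in: scene)
}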
I followed these directions and these directions. Using an iPhone 8 simulator I logged into iCloud as a sandbox tester, e.g. sandboxtest%test.com, then logged into its Settings, loaded my app, and made a purchase; the subscription went through and eventually expired. While in the iPhone 8 I checked my SettingsVC > in-app purchase API and it said Expired Nov 5, 2023 ... Select an option to resubscribe. So it worked. I sent sandboxtest%test.com and the password to App Review and got the rejection below:
Guideline 2.1 - Performance - App Completeness
We discovered one or more bugs in your app. Specifically, your app displayed an error page when the Manage Subscription tab was tapped. We found that while you have submitted in-app purchase products for your app, the in-app purchase functionality is not present in your binary. If you would like to include in-app purchases in your app, you will need to upload a new binary that incorporates the in-app purchase API to enable users to make a purchase.
They sent me a screenshot and the in-app purchase API said Cannot Connect - Retry.
I later used my actual device and tried these 3 approaches:
1- While logged in as myself, without using any Sandbox Account, delete the app, run it again, then log into my app with sandboxtest%test.com
2- While logged in as myself, use a different sandbox tester such as whatever%test.com to log into the Sandbox Account, delete the app, run it again, then log into my app with sandboxtest%test.com
3- While logged in as myself, use sandboxtest%test.com for the Sandbox Account, delete the app, run it again, then log into my app with sandboxtest%test.com
For all 3, RevenueCat prints the initial subscription and the expiration date, but for some reason when I go to the in-app purchase API, it always returns You do not have any subscriptions.
What's strange is when I go back to the iPhone 8 while still logged in as sandboxtest%test.com, the in-app purchase API still shows Expired Nov 5, 2023 ... Select an option to resubscribe.
I'm a bit lost here because logging into an actual device sends an SMS, so I don't see how giving the App Reviewer the sandboxtest%test.com/pw info to log into the device and iCloud will help them make a purchase, because they can't receive the SMS. I would assume they would only need sandboxtest%test.com, but that doesn't work for them.
Any advice?
I've been able to center on the middle of a 16:9 landscape video, crop it, and then create a 9:16 portrait version of the video, similar to how Apple does it in the Photos album.
The only issue is the resulting portrait video isn't centered in the middle of the screen (images below).
How can I get the resulting portrait video in the center of the screen?
func createExportSession(for videoURL: URL) {
    let asset = AVURLAsset(url: videoURL)
    let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
    exporter.videoComposition = turnHorizontalVideoToPortraitVideo(asset: asset)
    exporter.outputURL = // ...
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously { [weak self] in
        // ...
        // exporter.outputURL is eventually wrapped in an AVURLAsset and played inside an AVPlayer
    }
}
func turnHorizontalVideoToPortraitVideo(asset: AVURLAsset) -> AVVideoComposition {
    let track = asset.tracks(withMediaType: AVMediaType.video)[0]
    let renderSize = CGSize(width: 720, height: 1280)

    var transform1 = track.preferredTransform
    transform1 = transform1.concatenating(CGAffineTransform(rotationAngle: CGFloat(90.0 * .pi / 180)))
    transform1 = transform1.concatenating(CGAffineTransform(translationX: track.naturalSize.width, y: 0))

    let transform2 = CGAffineTransform(translationX: track.naturalSize.height, y: (track.naturalSize.width - track.naturalSize.height) / 2)
    let transform3 = transform2.rotated(by: CGFloat(Double.pi / 2)).concatenating(transform1)

    let translate = CGAffineTransform(translationX: renderSize.width, y: renderSize.height)
    let rotateFromUpsideDown = translate.rotated(by: CGFloat(Double.pi)) // without this the portrait video is always upside down
    let finalTransform = transform3.concatenating(rotateFromUpsideDown)

    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    transformer.setTransform(finalTransform, at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    instruction.layerInstructions = [transformer]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.renderSize = renderSize
    videoComposition.instructions = [instruction]
    return videoComposition
}
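(In case it helps, this is the kind of centering adjustment I've been experimenting with; a sketch on my part, assuming the final transform's bounding box just needs a horizontal shift, with transformedRect/xOffset being names I introduced:)

// Sketch: compute where the final transform actually lands the video,
// then shift it so it sits horizontally centered in renderSize.
let transformedRect = CGRect(origin: .zero, size: track.naturalSize).applying(finalTransform)
let xOffset = (renderSize.width - transformedRect.width) / 2 - transformedRect.minX
let centeredTransform = finalTransform.concatenating(CGAffineTransform(translationX: xOffset, y: 0))
transformer.setTransform(centeredTransform, at: .zero)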
Initial horizontal video:
Resulting portrait video after running the above code. The portrait video is incorrectly centered:
This is the way that it should be centered:
In my app I use WKWebView and the user can go to different websites like google, wikipedia, vimeo, etc. The issue is if the user goes to https://www.youtube.com and taps a thumbnail to play a video, it doesn't autoplay, because I don't have the videoId (e.g. "https://youtu.be/MpvshzR6tNk"); I just have the youtube website url.
For example:
func loadRequest() {
    let strThatUserEntered = "https://youtube.com"
    guard let url = URL(string: strThatUserEntered) else { return }
    let request = URLRequest(url: url)
    wkWebView.load(request)
    wkWebView.allowsBackForwardNavigationGestures = true
}
Now when the user selects a random thumbnail, the video loads, the youtube play button appears, and when it's pressed I get: An error occurred, please try again later (this video definitely works).
How can I enable autoplay on any selected youtube thumbnail if I don't have the videoID?
code:
override func viewDidLoad() {
    super.viewDidLoad()
    let webConfiguration = WKWebViewConfiguration()
    webConfiguration.allowsInlineMediaPlayback = true
    webConfiguration.mediaTypesRequiringUserActionForPlayback = []
    wkWebView = WKWebView(frame: .zero, configuration: webConfiguration)
    wkWebView.navigationDelegate = self
    wkWebView.uiDelegate = self
    // pin wkWebView anchors to screen
    loadRequest()
}
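(For context, this is roughly how I've been trying to observe which video gets tapped; a sketch, assuming YouTube routes thumbnail taps through watch?v= URLs, which may not hold for every tap:)

// Sketch: inspect navigations inside youtube.com and pull out a watch?v= ID.
func webView(_ webView: WKWebView, decidePolicyFor navigationAction: WKNavigationAction,
             decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
    if let url = navigationAction.request.url,
       let components = URLComponents(url: url, resolvingAgainstBaseURL: false),
       let videoId = components.queryItems?.first(where: { $0.name == "v" })?.value {
        print("tapped video id:", videoId)
    }
    decisionHandler(.allow)
}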
I followed the Ray Wenderlich tutorial to merge videos. The finished result is 1 merged video where portrait videos are at the top of the screen and landscape videos are at the bottom of the screen. In the image below the portrait video plays first and then the landscape video plays after it. The landscape video is from the Photos Library.
code:
let mixComposition = AVMutableComposition()
let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
let audioCompositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))

var count = 0
var insertTime = CMTime.zero
var instructions = [AVMutableVideoCompositionInstruction]()

for videoAsset in arrOfAssets {
    let audioTrack = videoAsset.tracks(withMediaType: .audio)[0]
    do {
        try videoCompositionTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
        try audioCompositionTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: audioTrack, at: insertTime)

        let layerInstruction = videoCompositionInstruction(videoCompositionTrack!, asset: videoAsset, count: count)
        let compositionInstruction = AVMutableVideoCompositionInstruction() // renamed so it doesn't shadow the function below
        compositionInstruction.timeRange = CMTimeRangeMake(start: insertTime, duration: videoAsset.duration)
        compositionInstruction.layerInstructions = [layerInstruction]
        instructions.append(compositionInstruction)

        insertTime = CMTimeAdd(insertTime, videoAsset.duration)
        count += 1
    } catch { }
}

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = instructions
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
videoComposition.renderSize = CGSize(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
// ...
exporter.videoComposition = videoComposition
AVMutableVideoCompositionLayerInstruction:
func videoCompositionInstruction(_ track: AVCompositionTrack, asset: AVAsset, count: Int) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let assetTrack = asset.tracks(withMediaType: .video)[0]
    let transform = assetTrack.preferredTransform
    let assetInfo = orientationFromTransform(transform)

    var scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.width
    if assetInfo.isPortrait {
        scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.height
        let scaleFactor = CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio)
        instruction.setTransform(assetTrack.preferredTransform.concatenating(scaleFactor), at: .zero)
    } else {
        let scaleFactor = CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio)
        var concat = assetTrack.preferredTransform.concatenating(scaleFactor)
            .concatenating(CGAffineTransform(translationX: 0, y: UIScreen.main.bounds.width / 2))
        if assetInfo.orientation == .down {
            let fixUpsideDown = CGAffineTransform(rotationAngle: CGFloat(Double.pi))
            let windowBounds = UIScreen.main.bounds
            let yFix = assetTrack.naturalSize.height + windowBounds.height
            let centerFix = CGAffineTransform(translationX: assetTrack.naturalSize.width, y: yFix)
            concat = fixUpsideDown.concatenating(centerFix).concatenating(scaleFactor)
        }
        instruction.setTransform(concat, at: .zero)
    }

    if count == 0 {
        instruction.setOpacity(0.0, at: asset.duration)
    }
    return instruction
}
Orientation:
func orientationFromTransform(_ transform: CGAffineTransform) -> (orientation: UIImage.Orientation, isPortrait: Bool) {
    var assetOrientation = UIImage.Orientation.up
    var isPortrait = false
    let tfA = transform.a
    let tfB = transform.b
    let tfC = transform.c
    let tfD = transform.d

    if tfA == 0 && tfB == 1.0 && tfC == -1.0 && tfD == 0 {
        assetOrientation = .right
        isPortrait = true
    } else if tfA == 0 && tfB == -1.0 && tfC == 1.0 && tfD == 0 {
        assetOrientation = .left
        isPortrait = true
    } else if tfA == 1.0 && tfB == 0 && tfC == 0 && tfD == 1.0 {
        assetOrientation = .up
    } else if tfA == -1.0 && tfB == 0 && tfC == 0 && tfD == -1.0 {
        assetOrientation = .down
    }
    return (assetOrientation, isPortrait)
}
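(For completeness, here's the kind of explicit centering math I've been considering for the landscape branch instead of the hardcoded translationY above; this is my own sketch, not from the tutorial:)

// Sketch: center a scaled landscape track vertically in the render size
// by computing the offset instead of hardcoding it.
let renderSize = CGSize(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
let scaledHeight = assetTrack.naturalSize.height * scaleToFitRatio
let yOffset = (renderSize.height - scaledHeight) / 2
let centered = assetTrack.preferredTransform
    .concatenating(CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio))
    .concatenating(CGAffineTransform(translationX: 0, y: yOffset))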
I have an audio url (.m4a) that I create using AVAudioRecorder. I want to share that audio on Instagram, so I convert the audio to a video. The issue is after the conversion: when I save the video url to the Files app using the UIActivityViewController, I can replay the video, see the playback time (e.g. 7 seconds), and hear the audio with no problem. A black screen with a sound icon appears.
But when I save the same exact converted audio-video file to the Photos Library using the UIActivityViewController, inside the Photos Library the video shows the 7 seconds but nothing plays, the video is all gray, and the sound icon doesn't show.
Why is the video successfully saving/playing in the Files app but saving and not playing in the Photos Library?
I tried setting the exporter.outputFileType as both .mov and .mp4 and the issue is exactly the same.
let asset: AVURLAsset = AVURLAsset(url: audioURL)
let mixComposition = AVMutableComposition()
guard let compositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID()) else { return }

let track = asset.tracks(withMediaType: .audio)
guard let assetTrack = track.first else { return }

do {
    try compositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: assetTrack.timeRange.duration), of: assetTrack, at: .zero)
} catch {
    print(error.localizedDescription)
}

guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else { return }
let dirPath = NSTemporaryDirectory().appending("\(UUID().uuidString).mov")
let outputFileURL = URL(fileURLWithPath: dirPath)
exporter.outputFileType = .mov // I also tried .mp4
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
exporter.exportAsynchronously {
    switch exporter.status {
    case .completed:
        guard let videoURL = exporter.outputURL else { return }
        // present UIActivityViewController to save videoURL, then save it to the Photos Library via 'Save Video'
    default:
        break // ...
    }
}
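(A check that might narrow this down; my assumption is that Photos expects an actual video track, while Files will happily play an audio-only .mov/.mp4 container:)

// Sketch: inspect the exported file; a passthrough export of an
// audio-only composition contains no video track at all.
let exported = AVURLAsset(url: outputFileURL)
print("video tracks:", exported.tracks(withMediaType: .video).count) // likely 0 for this export
print("audio tracks:", exported.tracks(withMediaType: .audio).count)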
In both situations below I'm successfully connected to WiFi. Using Safari and Chrome (I have a VPN on Chrome) with my Mac I can successfully connect to google, apple, and youtube.
When I use an actual device to connect to any website using WKWebView everything works fine.
But when I try to connect to any website (including google, apple, and youtube) using WKWebView and the simulator I get this error.
func webView(_ webView: WKWebView, didFailProvisionalNavigation navigation: WKNavigation!, withError error: Error) {
    let code = error._code
    print(code) // prints 1200
    print(error.localizedDescription) // An SSL error has occurred and a secure connection to the server cannot be made
}
Why can I successfully connect using a real device but not the simulator? Btw this just started happening today. It never happened before and I didn't change any Xcode settings.
I found an answer that said to go to Settings > Developer > Allow HTTP Services in the simulator and toggle it on. It still doesn't work whether it's on or off.
I also tried this, which works fine on a real device using WKWebView:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
But on the simulator using WKWebView the website still doesn't show and I get the error:
The certificate for this server is invalid. You might be connecting to a server that is pretending to be "www.google.com" which could put your confidential information at risk.
I'm running Xcode 12.4, testing on iOS 13 and 14.
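(To gather more detail, a sketch of logging the failing TLS challenge; this only inspects it and doesn't bypass anything:)

// Sketch: log which host/method the simulator is rejecting, then let the
// system apply its default certificate handling.
func webView(_ webView: WKWebView, didReceive challenge: URLAuthenticationChallenge,
             completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
    print("host:", challenge.protectionSpace.host,
          "method:", challenge.protectionSpace.authenticationMethod)
    completionHandler(.performDefaultHandling, nil)
}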
How can a user be logged in/iCloud Available, but still Not Authenticated?
When I fetch a record, the CKErrorCode is NotAuthenticated and it prints Couldn't get an authentication token
let predicate = NSPredicate(format: "id == %@", id)
let query = CKQuery(recordType: RecordType.Media, predicate: predicate)
CKContainer.default().publicCloudDatabase.perform(query, inZoneWith: nil) { (ckRecords, error) in
    if let err = error {
        let code = err._code
        // error code is 9 - case NotAuthenticated /* Not authenticated (writing without being logged in, no user record) */
        return
    }
    // ...
}
But when I check to see if the user is logged in, I use the code below, and in both situations it prints iCloud Available
1-
if FileManager.default.ubiquityIdentityToken != nil {
    print("iCloud Available")
} else {
    print("iCloud Unavailable")
}
2-
CKContainer.default().accountStatus { (accountStatus, error) in
    switch accountStatus {
    case .available:
        print("iCloud Available")
    case .noAccount:
        print("No iCloud account")
    case .restricted:
        print("iCloud restricted")
    case .couldNotDetermine:
        print("Unable to determine iCloud status")
    @unknown default:
        print("Unable to determine iCloud status")
    }
}
FYI, this seems to happen in this order.
1- I'm home on WiFi, logged into iCloud, and everything works fine.
2- I leave my house and switch to a hotspot; I'm still logged into iCloud and everything works fine.
3- I come back into my house and switch back to WiFi; of course I'm still logged in, and then the above issue occurs. It's as if it wants me to log in again even though I'm already logged in and it says iCloud Available.
This issue is happening on both the real device and the simulator.
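(Something I've been trying around this, on the assumption that the auth token goes stale across the network switch: listen for account changes and re-check status before fetching.)

// Sketch: re-check the CloudKit account whenever the system reports a change.
NotificationCenter.default.addObserver(forName: .CKAccountChanged, object: nil, queue: .main) { _ in
    CKContainer.default().accountStatus { status, error in
        print("account status after change:", status.rawValue)
    }
}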
UPDATE
I just found this post; it seems lots of devs are having this same problem.
When I try to access the cloud database it says No Containers
I have everything set up correctly. Everything in the pics below with the gray outline has the same exact identifier, iCloud.com.myCo.myApp.
Entitlements:
Xcode:
developer.apple:
I did a deep clean, closed Xcode, reopened it, deep cleaned again, and rebuilt. Still No Containers.
This is a paid account.
I have several properties inside my CoreData Entity file named UnsavedModel, here are some of them.
I created my own file with a CollectionView, and when I read the newBodyText and newHttpsStr properties it crashes. The correct values are being fed into both properties, which are just Strings, but it only crashes on those two properties, none of the others. If I don't give those properties a value, everything works fine. What's the issue here?
The problem occurs in this order
HomeVC > buttonPress immediately push on UnsavedVC
UnsavedVC (cells correctly load) > backButtonPressed immediately pop back to HomeVC
HomeVC > buttonPress immediately push on UnsavedVC
CRASH
FYI, I have no problem writing to those properties. But when I try to delete those 2 properties it crashes, yet I have no problem deleting any of the other properties.
I've also gotten a crash on Thread 1 Queue: com.apple.main-thread (serial) and objc_msgSend for the same push/pop/push issue.
code:
class UnsavedController: UIViewController, UICollectionViewDataSource, UICollectionViewDelegateFlowLayout {
    var datasource = [CopyCoreDataModel]()

    override func viewDidLoad() {
        super.viewDidLoad()
        fetchData()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.setNavigationBarHidden(false, animated: false)
    }

    func fetchData() {
        guard let appDelegate = UIApplication.shared.delegate as? AppDelegate else { return }
        let context = appDelegate.persistentContainer.viewContext
        // I also tried context.performAndWait { ... run the do-try-catch fetchRequest in here ... }
        let fetchRequest: NSFetchRequest<UnsavedModel> = UnsavedModel.fetchRequest()
        do {
            let results = try context.fetch(fetchRequest)
            for result in results {
                guard let id = result.value(forKey: "id") as? String else { continue }
                let isContained = datasource.contains(where: { $0.id ?? "" == id })
                if !isContained {
                    let copy = CopyCoreDataModel(id: id, unsavedModel: result)
                    datasource.append(copy)
                }
            }
            collectionView.reloadData()
        } catch {
            print(error)
        }
    }
}
// CopyCoreDataModel is necessary because I run some functions based on certain properties
class CopyCoreDataModel {
    var fileUrl: URL?
    var id: String?
    var unsavedModel: UnsavedModel?
    var postDate: Double?
    // otherProperties of type String, Double, and Boolean
    var micUrl: String?
    var newBodyText: String?
    var newHttpsStr: String?

    init(id: String, unsavedModel: UnsavedModel) {
        self.id = id
        self.unsavedModel = unsavedModel
        // self.otherProperties = unsavedModel.otherProperties // run some function on some of the other properties. These all work perfectly fine
        if let micUrl = unsavedModel.micUrl { // works perfectly fine
            self.micUrl = micUrl
            // function to get micURL from FileManager that eventually sets self.fileUrl
        }
        if let newBodyText = unsavedModel.newBodyText { // crash occurs here only if it has a value
            self.newBodyText = newBodyText
        }
        if let newHttpsStr = unsavedModel.newHttpsStr { // crash occurs here only if it has a value
            self.newHttpsStr = newHttpsStr
        }
    }
}
func writeData(micURL: URL) {
    guard let appDelegate = UIApplication.shared.delegate as? AppDelegate else { return }
    let context: NSManagedObjectContext = appDelegate.persistentContainer.viewContext
    let entity: NSEntityDescription = NSEntityDescription.entity(forEntityName: "UnsavedModel", in: context)!
    let object: NSManagedObject = NSManagedObject(entity: entity, insertInto: context)
    object.setValue(UUID().uuidString, forKey: "id")
    object.setValue(micURL.path, forKey: "micUrl")
    object.setValue("abc", forKey: "newBodyText")
    object.setValue("https...", forKey: "newHttpsStr")
    // set other properties
    do {
        try context.save()
    } catch let error as NSError {
        print("could not save. \(error), \(error.userInfo)")
    }
}
I also set up a CoreDataManager sharedInstance class and accessed the context through there (let context = CoreDataManager.sharedInstance.persistentContainer.viewContext), but the same issue occurs.
I also changed the CopyCoreDataModel's initializer to use a dict of the k/v from the unsavedModel, but the crash still occurs.
class CopyCoreDataModel {
    // same exact properties
    init(id: String, dict: [String: Any]) {
        // set the properties using the values from the dict
    }
}
func fetchData() {
    let context = // ...
    let fetchRequest = // ...
    do {
        let results = try context.fetch(fetchRequest)
        for result in results {
            guard let id = result.value(forKey: "id") as? String else { continue }
            let isContained = datasource.contains(where: { $0.id ?? "" == id })
            if !isContained {
                let dict = createDictFromUnsavedModel(unsavedModel: result)
                let copy = CopyCoreDataModel(id: id, dict: dict)
                datasource.append(copy)
            }
        }
        collectionView.reloadData()
    } catch {
    }
}

func createDictFromUnsavedModel(unsavedModel: UnsavedModel) -> [String: Any] {
    var dict = [String: Any]()
    // set dict using k/v from unsavedModel
    return dict
}
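(One pattern I'm experimenting with, on the assumption that the crash comes from touching a managed object outside its context's queue after the pop/push: read and copy values inside performAndWait before they leave the block.)

// Sketch: read managed-object values only on the context's queue and copy
// them into plain Swift values before using them elsewhere.
var copies = [CopyCoreDataModel]()
context.performAndWait {
    if let results = try? context.fetch(fetchRequest) {
        for result in results {
            guard let id = result.value(forKey: "id") as? String else { continue }
            copies.append(CopyCoreDataModel(id: id, unsavedModel: result))
        }
    }
}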
I'm using peer-to-peer and I successfully connect one client to another, but when I send the echo to the individual connection, the receiving client doesn't get a response.
Initial send
let data = NSKeyedArchiver. // ...
let message = NWProtocolWebSocket.Metadata(opcode: .text)
let context = NWConnection.ContentContext(identifier: "send", metadata: [message])
connection.send(content: data, contentContext: context, isComplete: true, completion: .contentProcessed({ (error) in
    if let error = error { return }
    print("Sent")
}))
Receive data and send Echo response
func receivedIncoming(_ connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, context, isComplete, error) in
        if let err = error {
            print(err) // never gets hit
            return
        }
        if let data = data, !data.isEmpty {
            // do something with data
            if let color = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? UIColor {
                // this should only run on the other device once echo is received after 5 secs
                self?.view.backgroundColor = color
            }
            let randomColor = UIColor.random // func create random color
            let colorData = randomColor.encode() // func encode color to data
            DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
                connection.send(content: colorData, completion: .idempotent)
                // I've also tried
                let message = NWProtocolWebSocket.Metadata(opcode: .text)
                let context = NWConnection.ContentContext(identifier: "send", metadata: [message])
                connection.send(content: colorData, contentContext: context, isComplete: true, completion: .contentProcessed({ (error) in
                    if let error = error { return }
                    print("Color data sent") // this always prints
                }))
            }
        } else {
            print("data is empty") // never gets hit
        }
    }
}
NWConnection
weak var delegate: PeerConnectionDelegate?
var connection: NWConnection?

// Outgoing Connection
init(endPoint: NWEndpoint, delegate: PeerConnectionDelegate) {
    self.delegate = delegate
    let tcpOptions = NWProtocolTCP.Options()
    tcpOptions.enableKeepalive = true
    tcpOptions.keepaliveIdle = 2
    let parameters = NWParameters(tls: nil, tcp: tcpOptions)
    parameters.includePeerToPeer = true
    parameters.allowLocalEndpointReuse = true
    connection = NWConnection(to: endPoint, using: parameters)
    startOutgoingConnection()
}

// Incoming Connection
init(connection: NWConnection, delegate: PeerConnectionDelegate) {
    self.delegate = delegate
    self.connection = connection
    startIncomingConnection()
}

func startIncomingConnection() {
    connection?.stateUpdateHandler = { [weak self] (nwConnectionState) in
        switch nwConnectionState {
        case .ready:
            guard let self = self, let connection = self.connection else { return }
            self.delegate?.receivedIncoming(connection)
        default:
            break // ...
        }
    }
}
Why is the echo data being sent but not received?
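(For reference, the receive pattern I'm checking my code against; my assumption is that each side, including the outgoing connection, needs a receive that re-arms itself in order to ever see the echo:)

// Sketch: a receive loop that re-arms after every message, so a reply
// arriving later is still delivered to this side of the connection.
func receiveLoop(on connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, _, isComplete, error) in
        if let data = data, !data.isEmpty {
            // handle the message
        }
        if error == nil && !isComplete {
            self?.receiveLoop(on: connection)
        }
    }
}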
I'm new to networking, so maybe I'm misunderstanding how endPoint information is gathered.
Device_A is browsing and discovers Device_B, or the other way around; it doesn't matter, because they will both discover each other and send data to open a connection. Because the Network framework does not resolve IP addresses (https://developer.apple.com/forums/thread/129644), when a connection is first made I use either the remoteEndPoint (connection.currentPath?.remoteEndpoint // fe80::9821:7fff:fcea:74c4%awdl0.10059) or the endPoint description (connection.endpoint.debugDescription // myApp (2)._myApp._tcplocal.) as a uniqueID for the connection.
I send the data and some other info across the wire; if successful, I then place either endPoint inside an ivar dictionary as the key, with another value, so that I know which device to map a response to once a device responds back.
Right now I only have 2 devices connected. The issue is when I receive a response from Device_B, I'm actually getting the incorrect endPoint information. For some reason I keep getting the endPoint info from Device_A back. It seems like the same endPoint information is getting sent twice, once to Device_A and then again to Device_B. This exact same thing occurs on Device_B but in reverse. I'm confused as to why this is happening.
For example, Device_A first discovers itself; the remoteEndPoint is fe80::9821:7fff:fcea:74c4%awdl0.10059, and it sends the data. When Device_A receives its own message, I filter it out using the userId and I see the same endPoint info. But when Device_A discovers Device_B, the remoteEndPoint is fe80::9821:7fff:fcea:74c4%awdl0.27788. When I receive a response from Device_B, the endPoint information shows the first one from Device_A, fe80::9821:7fff:fcea:74c4%awdl0.10059. The same remoteEndPoint info is duplicated. The same exact thing happens if I use endpoint.debugDescription. This issue occurs on both devices.
NWBrowser:
browser.browseResultsChangedHandler = { (results, changes) in
    for change in changes {
        switch change {
        case .added(let browseResult):
            switch browseResult.endpoint {
            case .service(let name, let type, _, _):
                let connection = PeerConnection(to: browseResult.endpoint)
                // ...
            default:
                break
            }
        default:
            break
        }
    }
}
PeerConnection:
var connection: NWConnection?

init(to endPoint: NWEndpoint) {
    // tcpOptions ...
    // params ...
    // initialize connection and delegate that sends out data
    connection?.stateUpdateHandler = { (nwConnectionState) in
        switch nwConnectionState {
        case .ready:
            self.delegate.sendOutgoing(self.connection!)
        default:
            break
        }
    }
}
Send Data:
var dict = [String: String]()
var endPointArr = [String]()

func sendOutgoing(_ connection: NWConnection) {
    guard let endPoint = connection.currentPath?.localEndpoint?.debugDescription else { return }
    // or: connection.currentPath?.remoteEndpoint?.debugDescription

    // encode the endPoint and currentUserId with some other info, then send it across the wire;
    // if successful, make the endPoint a key inside a dictionary and add it to an array
    // to keep track of which endPoints were received
    connection.send(content: encodedData, contentContext: context, isComplete: true, completion: .contentProcessed({ [weak self] (error) in
        if let error = error { return }
        self?.dict[endPoint] = someUniqueValue
        self?.endPointArr.append(endPoint)
    }))
}
Receiving a response
connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, context, isComplete, error) in
    if let err = error { return }
    if let data = data, !data.isEmpty {
        self?.received(data)
    }
}

func received(_ data: Data) {
    guard let retrievedData = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? MyModel else { return }
    guard let endPoint = retrievedData.endPoint as? String, let userID = retrievedData.userId as? String else { return }
    print(endPoint) // fe80::9821:7fff:fcea:74c4%awdl0.10059

    if userID == Auth.auth().currentUser?.uid {
        dict[endPoint] = nil
        return
    }

    endPointArr.forEach { (endPoint) in
        print(endPoint) // prints both fe80::9821:7fff:fcea:74c4%awdl0.10059 and fe80::9821:7fff:fcea:74c4%awdl0.27788
    }

    // this never runs because the key just got set to nil above, because Device_B has the same endPoint info
    for (key, value) in dict where key == endPoint {
        print("key=\(key) : value=\(value)") // someUniqueValue
        // show a response indicating which device has this endPoint
        break
    }
}
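(An alternative I'm considering, as an assumption rather than a known fix: key each connection with a locally generated ID instead of the endpoint string, since the endpoint debugDescriptions clearly repeat across my two connections.)

// Sketch: mint a UUID per connection and send that in the payload in place
// of (or alongside) the endpoint debugDescription.
class PeerConnection {
    let connectionID = UUID().uuidString
    // ... existing init/handlers; include connectionID in the encoded data
}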
I'm using NWConnection to send my discoveryTokens. When an outgoing connection is found, I have the following code:
let session = NISession()
guard let token = session.discoveryToken else {
    print("discoveryToken is nil")
    return
}
guard let data = try? NSKeyedArchiver.archivedData(withRootObject: token, requiringSecureCoding: true) else {
    return
}
// send data ...
Why is it that the discoveryToken sometimes returns nil? Is there a remedy for this?
This has happened on multiple occasions, and because of it nothing is sent. There doesn't seem to be a workaround, nor any documentation on why this occurs.
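(The only guard I've found so far, sketched under the assumption that unsupported hardware or the simulator is one way to end up with a nil token:)

// Sketch: bail out early on devices that can't run Nearby Interaction.
guard NISession.isSupported else {
    print("Nearby Interaction is not supported on this device")
    return
}
let session = NISession()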
I'm using Bonjour, NWConnection, NWBrowser, and NWListener along with the NearbyInteraction framework. The NearbyInteraction framework works best when the devices are within 9 meters of each other (https://developer.apple.com/documentation/nearbyinteraction/initiating_and_maintaining_a_session): "NI works best when the peer devices are: Within nine meters of each other."
Is there any way that I can set Bonjour to not discover devices beyond 9 meters, or only within 9 meters?
I'm using Bonjour, NWListener, NWBrowser, NWConnection, and peer-to-peer to connect devices. Even though I'm not using Multipeer, I know that it is built on top of Bonjour, and from what I've read the Multipeer range is 20 - 50 meters depending on what's in between the devices, etc. For this question I'm assuming that is the same range for these four APIs.
If deviceA is within 50 meters of deviceB, then there will be an automatic discovery/connection. If deviceA moves outside the range to 51 meters, the connection will be lost.
If deviceA moves back within 50 meters, will there be automatic discovery/connection between them again?
The reason I ask is because another developer who has networking experience told me that I have to add a timer to the view controller and fire it off every few seconds so that discovery can keep occurring. I haven't found anything on the forums or Stack Overflow that gives any examples of having to use a timer with Bonjour for discovery to work. I don't know if I need to add a timer or not to rediscover devices that were once connected, got disconnected, and want to connect again.
I'm far from a Network expert and that is why I ask this question.