You need to set the flag isCameraAssistanceEnabled = true on the NINearbyAccessoryConfiguration (or on the NINearbyPeerConfiguration if you are connecting to another iPhone or Apple Watch).
You then use this configuration to start the NISession like this:
niSession.run(configuration)
Note that the isCameraAssistanceEnabled flag is only available from iOS 16, as you mentioned, but you do not need it to get the direction from the NISession. It is only used to improve the spatial awareness of the Nearby Interaction framework, and everything will work without it.
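For reference, here is a minimal sketch of that setup for the peer-to-peer case. The class name and the peerDiscoveryToken parameter are placeholders; you still need to exchange discovery tokens with the other device over your own data channel, and your app's Info.plist needs the NSNearbyInteractionUsageDescription key.

```swift
import NearbyInteraction

// Minimal sketch: start an NISession with camera assistance enabled when available.
final class InteractionManager: NSObject, NISessionDelegate {
    private var niSession: NISession?

    // `peerDiscoveryToken` is the NIDiscoveryToken received from the other device.
    func startSession(with peerDiscoveryToken: NIDiscoveryToken) {
        let session = NISession()
        session.delegate = self

        let configuration = NINearbyPeerConfiguration(peerToken: peerDiscoveryToken)
        if #available(iOS 16.0, *) {
            // Optional: fuses ARKit camera data to improve spatial awareness.
            configuration.isCameraAssistanceEnabled = true
        }

        session.run(configuration)
        niSession = session
    }

    // Distance and direction updates for the other device arrive here.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        // `distance` is in meters; `direction` is a unit vector (nil until available).
        print("distance:", object.distance ?? .nan,
              "direction:", object.direction.map { "\($0)" } ?? "pending")
    }
}
```

Direction updates work without camera assistance, as noted above; the flag only improves how quickly and reliably they become available.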
Take a look at the sample projects in the documentation:
https://developer.apple.com/documentation/nearbyinteraction/implementing_interactions_between_users_in_close_proximity
https://developer.apple.com/documentation/nearbyinteraction/implementing_spatial_interactions_with_third-party_accessories
No, the NISession is only used to get data about the position of the other device. The Nearby Interaction framework works in only one direction (it delivers spatial information about the other device to your phone); it is not a general-purpose data channel.
If you want to send data to the other device (iPhone, Apple Watch, or a UWB accessory), you need to use a standard communication framework such as Core Bluetooth or Multipeer Connectivity.
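For example, a minimal sketch of sending arbitrary data (such as your own discovery token or app state) over Multipeer Connectivity, assuming `mcSession` is an already-connected MCSession and peer discovery/invitation is handled elsewhere:

```swift
import MultipeerConnectivity

// Sends a payload to every currently connected peer over an existing MCSession.
func send(_ payload: Data, over mcSession: MCSession) {
    guard !mcSession.connectedPeers.isEmpty else { return }
    do {
        try mcSession.send(payload, toPeers: mcSession.connectedPeers, with: .reliable)
    } catch {
        print("Failed to send data: \(error)")
    }
}
```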