Hello,
I am attempting to perform a Diffie-Hellman key exchange with a server running on .NET.
However, the secret key I am creating on the client side does not match the secret key on the server side, which I have available for testing purposes.
I can import the server secret key as a SymmetricKey, and if I use it to seal and open a box, it works. However, if I seal the box with my client key, I cannot open it with the server's shared key.
I create the SymmetricKey like this:
let sharedHash = SHA256.self
let sharedInfo = serverPublicKey.rawRepresentation
let sharedLength = 32
let symmetricKey = sharedSecret.x963DerivedSymmetricKey(
    using: sharedHash,
    sharedInfo: Data(),
    outputByteCount: sharedLength)
The server key is created on the .NET side like this:
bob.KeyDerivationFunction = ECDiffieHellmanKeyDerivationFunction.Hash;
bob.HashAlgorithm = CngAlgorithm.Sha256;
bobPublicKey = bob.PublicKey.ToByteArray();
bobKey = bob.DeriveKeyMaterial(CngKey.Import(Alice.alicePublicKey, CngKeyBlobFormat.EccPublicBlob));
My assumption is that the keys should be the same. Is that correct?
How can I find out what format the server key is in? The .NET documentation is not particularly precise on that.
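If I read the documentation correctly, .NET's Hash key derivation function computes Hash(SecretPrepend || Z || SecretAppend), which with no prepend/append is just SHA-256 over the raw shared secret Z, whereas CryptoKit's X9.63 KDF also hashes a counter block and the sharedInfo, which would explain the mismatch. This is a minimal sketch of what I think would mimic the server-side derivation (assuming the server sets no SecretPrepend/SecretAppend; the function name is made up):
```
import CryptoKit
import Foundation

// Sketch only: mimic .NET's ECDiffieHellmanKeyDerivationFunction.Hash, which
// (with no SecretPrepend/SecretAppend) is plain SHA-256 over the raw shared
// secret. CryptoKit's x963DerivedSymmetricKey additionally hashes a counter
// block and the sharedInfo, so its output is different by design.
func dotNetStyleSymmetricKey(from sharedSecret: SharedSecret) -> SymmetricKey {
    let rawSecret = sharedSecret.withUnsafeBytes { Data($0) }
    return SymmetricKey(data: SHA256.hash(data: rawSecret))
}
```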
You can find a Playground of my code, and if you search for the ECDiffieHellmanCng class, you will find an example of what .NET does.
Any help is appreciated
After I realized that the download site now tries to open a popup, I was able to download Xcode 15 Beta and the selected Simulator SDKs.
Now, no matter where I unpack and start Xcode 15, it wants to re-download these files.
How is this supposed to work? Where should I put the downloaded and uncompressed files?
The Environment
We have a rather large project, where the main app is ReactNative/Expo, and there are some native extensions and components.
We are building two flavors, development and production, and these are deployed as separate applications with separate bundle IDs. Let's call them com.example.dev and com.example.prod. These flavors are built via schemes. For each flavor, four variants exist ("debug", "release", "appstore", and "automation").
The two flavors thus have different shared group identifiers (group.com.example.dev and group.com.example.prod), and we use a shared container to share data between these apps.
The Main Objective
We want to share the group identifier between the main app and its various extensions. If you have a better idea than what we are describing, please absolutely say so. I'd rather solve the problem than care about the details.
The Solution
We are putting the setting into a react native .env file, which is read by the react native parts and is basically structured like an .xcconfig file. For example, the debug.env file looks like this:
BUILD_CONFIG=debug
SHARED_GROUP_NAME_IOS=group.com.example.dev
Then, there is a debug.xcconfig file, which includes the Pods .xcconfig and the debug.env file:
#include "Pods/Target Support Files/Pods-App/Pods-App.debug.xcconfig"
#include "../../react-native-config/debug.env"
Side note: People who know about ReactNative files may interject that these environment variables should be available because there is a build setting INFOPLIST_PREFIX_HEADER = ${CONFIGURATION_BUILD_DIR}/../GeneratedInfoPlistDotEnv.h
This, however, does not seem to work for me, neither in this project nor in a sample project.
Finally, the Info.plist contains:
<key>SharedGroupName</key>
<string>$(SHARED_GROUP_NAME_IOS)</string>
The idea then is that the app and the extensions can simply retrieve this value using Bundle.main.object(forInfoDictionaryKey: "SharedGroupName").
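For illustration, this is roughly the lookup we have in mind (a sketch; the function name is made up, only the SharedGroupName key is from the actual setup):
```
import Foundation

// Read the group identifier from Info.plist and resolve the shared container.
func sharedContainerURL() -> URL? {
    guard let groupName = Bundle.main.object(forInfoDictionaryKey: "SharedGroupName") as? String,
          !groupName.isEmpty else {
        return nil
    }
    return FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: groupName)
}
```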
The Issue
The SharedGroupName key remained empty in the generated plist, so I took a look at the Preprocessed-Info.plist in the build output, and found something extremely curious:
<key>SharedGroupName</key>
<string>$(group.com.example.dev)</string>
So the variable was substituted with another variable! And thus, it ultimately gets resolved to an empty string.
Am I doing it wrong? In the build settings, the variable seems to resolve correctly, so at which point does it break?
How can I possibly debug this better?
Is this a bug?
Thanks for any hints or insights you might have
Primary Objective
My Widget displays pictures from a list. If I load in all the pictures when setting up the timeline, I run into a memory issue (EXC_RESOURCE RESOURCE_TYPE_MEMORY), because Widgets have a strict limit on how much memory they can use.
My Solution
Instead of storing a UIImage in my timeline entry, I store the URL to the picture, and load it with a .task attached to the view.
This works when the view is displayed in the app, and it is also shown in the Preview window.
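For context, this is roughly what the entry and the view look like (heavily simplified; the names differ from the actual sample):
```
import SwiftUI
import UIKit
import WidgetKit

// Simplified sketch of the approach described above.
struct PictureEntry: TimelineEntry {
    let date: Date
    let imageURL: URL   // store the URL instead of a UIImage
}

struct PictureEntryView: View {
    let entry: PictureEntry
    @State private var image: UIImage?

    var body: some View {
        Group {
            if let image = image {
                Image(uiImage: image).resizable().scaledToFit()
            } else {
                Image(systemName: "gear")   // placeholder
            }
        }
        .task {
            // Loads fine in the app and in the preview,
            // but the widget never re-renders with the result.
            if let data = try? Data(contentsOf: entry.imageURL) {
                image = UIImage(data: data)
            }
        }
    }
}
```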
The Problem
The image is apparently loaded, but the Widget is never updated with it: not when running in the simulator, not on device, and not in the Preview canvas.
The Question
Obviously: Am I doing it wrong? Do I need to update my timeline more conservatively? Can I load the pictures on demand? Is there a different way of doing this?
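One alternative I have been considering (a sketch, not part of the linked sample): keep the images in the timeline entries, but downsample them with ImageIO while building the timeline, so that all entries together stay within the Widget's memory budget.
```
import ImageIO
import UIKit

// Sketch: create a downsampled UIImage without decoding the full-size picture.
// maxPixelSize would be tuned to the widget family being rendered.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}
```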
The Sample
Sample code illustrating this can be found here: https://github.com/below/WidgetSample
Hello,
Apparently, HMAccessorySetupManager.performMatterEcosystemAccessorySetup has been removed from HMAccessorySetupManager.h.
Is there any more information on this? Has it moved to a different Framework, or has this kind of functionality ("setting up Matter accessories with a Matter Partner Ecosystem App") been removed from iOS?
In the Xcode 14 Release candidate, I get a No such module 'Matter' error when attempting to import Matter.
Has Matter been removed from the iOS 16 RC?
What are my options other than waiting for an iOS 16.1 Beta?
Sorry, I did not have a catchier title:
Prelude
Avid followers of my Twitter account know: I was looking for an easy, preferably human editable container for documents, and — being the old Mac developer I am — I went with bundles, stored in the Documents folder.
The Problem
At some point in my app, the user can choose which file to use. And then, of course, on the next launch, that file should be used. Selecting the file works nicely, including reading from the created bundle.
The problem occurs when the app is relaunched and reads the file URL's absoluteString from UserDefaults. That works, and I can create a URL from it. But creating a Bundle from that URL fails, and I don't get any error. This happens on iOS 15 and 16, on device and in the simulator.
Is this some permissions thing? Is the URL not formatted correctly? What is going on?
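One workaround I am considering (a sketch, assuming the app container path can change between launches or installs, which would invalidate a stored absolute URL): persist only the bundle's name relative to the Documents directory and rebuild the full URL at launch. The function and key names here are made up.
```
import Foundation

// Store just the name of the selected bundle...
func storeSelectedBundle(at url: URL) {
    UserDefaults.standard.set(url.lastPathComponent, forKey: "SelectedBundleName")
}

// ...and rebuild the URL from the current Documents directory on launch.
func selectedBundleURL() -> URL? {
    guard let name = UserDefaults.standard.string(forKey: "SelectedBundleName") else { return nil }
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return documents.appendingPathComponent(name)
}
```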
To Reproduce
I have created a sample App here: https://github.com/below/BundleSample
Launch the App, in the Simulator or on device
Select "Copy Bundle" (You can "Test Resource Bundle", but it does not matter)
Finally, select "Test Documents Bundle". You should see a picture
Close the app, either by stopping it in Xcode or force quit
Restart the App
Expected Result
You see the picture
Actual Result
You see the "Gear" placeholder
As I currently live in ReactNative hell, I like to flesh out all my native iOS demos and samples to the max, including things like accessibility. Recently, I wrote a very simple demo containing a map, and I stumbled upon some issues I was unable to resolve. I think they represent very general use cases, so I would be happy if any of you had any ideas.
The condensed source code for the issues can be found on GitHub
Issue 1: The Phantom Overlay
To reproduce: Run the app on a device, and make sure that VoiceOver is on. Swipe right to get to the annotations.
Expected Result: The title of the annotation is read.
Actual Result: The title of the annotation is read twice.
What I know: For every annotation on the map view, there is also an overlay, an MKCircle, generated by an MKCircleRenderer. When this overlay is not present, the title is — correctly — only read once.
What I have tried: In ViewController.swift, lines 54 and 92, I have set both the overlay's and the renderer's isAccessibilityElement property to false. This does not fix the issue (probably because neither of them is the actual view).
The overlay should never be an accessible element. Any information should be encoded in the annotation (e.g. "There is a 10m region around this marker")
Issue 2: The Unknown Trait
While it is correct that the title of the annotation should be read, there is no indication that the annotation can be clicked or interacted with. I have set annotationView.accessibilityTraits = [.button], but this does not change anything. My expectation would be "Cologne Cathedral, Button" or a similar hint that the item is clickable.
Issue 3: The Unreachable Callout
With VoiceOver active, tap on an annotation. I have taken some hints from Stack Overflow and tried to disable accessibility on the annotation and enable it on the callout. This leads to the callout being reachable somehow, but it is absolutely not obvious if you cannot see the screen.
How can I indicate to the VoiceOver user that now a callout is being shown?
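One idea I have not verified (purely a sketch, not in the sample): when an annotation is selected, post a layout-changed notification to move the VoiceOver focus to the callout content. Here detailCalloutAccessoryView stands in for whatever view actually represents the callout.
```
import MapKit
import UIKit

class MapAccessibilityDelegate: NSObject, MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, didSelect view: MKAnnotationView) {
        guard UIAccessibility.isVoiceOverRunning else { return }
        // Move the VoiceOver cursor to the callout (or the annotation view itself
        // if no detail accessory was configured).
        let target: Any = view.detailCalloutAccessoryView ?? view
        UIAccessibility.post(notification: .layoutChanged, argument: target)
    }
}
```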
A Working Extra: The Annotation Rotor
The app also contains a custom rotor, to go through the annotations one by one, without also reading the default Points-Of-Interest on the map. Interestingly (or maybe rather as expected), the title of the annotation is correctly only read once.
I would be extremely happy to get some feedback on these issues; it sounds like most of them could be rather common.
Thanks!
Alex
Sorry, I did not have a better title for this, I hope it gets clearer with code.
The idea is to build a simple facade for Persistent Storage of some objects:
class PersistantStorage<T, Codable, Identifiable> {
    func store(_ object: T) throws { }
    func objectFor(_ key: T.ID) -> T? {
        return nil
    }
}
As you can see, there is the generic type T, which is constrained to Codable and Identifiable. Now I want to use the latter constraint to define my objectFor method, but the compiler complains:
'ID' is not a member type of type 'T'
How would I do this? Or is this completely the wrong approach?
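For clarity, what I am trying to express is a single generic parameter T constrained to both protocols; I assume the syntax should be something like this, unless the whole approach is wrong:
```
// A single generic parameter T, constrained to Codable & Identifiable,
// instead of three separate generic parameters.
class PersistantStorage<T: Codable & Identifiable> {
    func store(_ object: T) throws { }

    func objectFor(_ key: T.ID) -> T? {
        // T.ID should be available here because T is Identifiable.
        return nil
    }
}
```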
Thanks
Alex
Today I was reminded that apps on the Mac do not have to be Mach-O Executables: They can be scripts!
For example, a Python script.
But I found out that if the app is started directly from the Terminal, it launches as ARM64, whereas if it is started from the Finder (i.e., via launchd), it is launched as Intel.
Is there an Info.plist key that fixes this?
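The only candidates I have come across so far are LSArchitecturePriority and LSRequiresNativeExecution, but I have no idea whether LaunchServices even consults them for a script-based app, so treat this purely as a guess:
```
<key>LSArchitecturePriority</key>
<array>
    <string>arm64</string>
    <string>x86_64</string>
</array>
<key>LSRequiresNativeExecution</key>
<true/>
```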
We have our own backend, running on our own server.
When iCloud Private Relay is enabled, the requests to the server time out.
How can we find out what precisely is failing? TLS 1.3 is enabled on the backend. This is the error we see:
finished with error [-1001] Error Domain=NSURLErrorDomain Code=-1001 "The request timed out." UserInfo={_kCFStreamErrorCodeKey=-2102, NSUnderlyingError=0x281b0a3d0 {Error Domain=kCFErrorDomainCFNetwork Code=-1001 "(null)" UserInfo={_kCFStreamErrorCodeKey=-2102, _kCFStreamErrorDomainKey=4}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <CBAE9DA5-9988-4E55-A732-B759842E411F>.<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask <CBAE9DA5-9988-4E55-A732-B759842E411F>.<1>"
), NSLocalizedDescription=The request timed out.,
Hello,
On macOS, is it possible to see a virtual camera, such as OBS (https://obsproject.com), as a capture device?
I see that, for example, Google Chrome can use this camera, but using AVCaptureDevice.DiscoverySession I am unable to see it.
Am I doing something wrong?
import AVFoundation

var deviceTypes: [AVCaptureDevice.DeviceType] = [.builtInMicrophone, .builtInWideAngleCamera]
#if os(OSX)
deviceTypes.append(.externalUnknown)
#else
deviceTypes.append(contentsOf: [.builtInDualCamera, .builtInDualWideCamera, .builtInTelephotoCamera, .builtInTripleCamera, .builtInTrueDepthCamera, .builtInUltraWideCamera])
#endif

let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: deviceTypes,
                                                        mediaType: nil,
                                                        position: .unspecified)
result = discoverySession.devices.map { device in
    device.localizedName
}
Building my app with Xcode 13, a dependency threw an error related to the changes in CoreBluetooth: some properties are now returned as optionals in the iOS 15 SDK.
Wanting to fix that, I would like to make sure that the library works with both the iOS 15 SDK and earlier SDKs, but the usual availability markers do not help me:
func peripheralManager(_ peripheral: CBPeripheralManager, didAdd service: CBService, error: Error?) {
    let peripheral: CBPeripheral
    // So what I would like is something like this,
    // but this does not work :(
    if #available(iOS 15, *) {
        guard let nonOptional = service.peripheral else {
            return
        }
        peripheral = nonOptional
    } else {
        peripheral = service.peripheral
    }
    let result: CBPeripheral = peripheral
    print("\(result)")
}
This code causes the compiler to emit an error on both Xcode 13 and Xcode 12.
Is there any good way to solve this?
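One direction I have been wondering about (a sketch, not verified): since the signature changes with the SDK rather than at runtime, a compile-time check might work where #available cannot. Xcode 13 ships Swift 5.5, so #if compiler(>=5.5) would distinguish the two toolchains, assuming the library is always built with the Xcode version that matches the SDK:
```
import CoreBluetooth

func peripheralManager(_ peripheral: CBPeripheralManager, didAdd service: CBService, error: Error?) {
    let resolved: CBPeripheral
    #if compiler(>=5.5)
    // Xcode 13 / iOS 15 SDK: service.peripheral is optional.
    guard let nonOptional = service.peripheral else { return }
    resolved = nonOptional
    #else
    // Xcode 12 and earlier: service.peripheral is non-optional.
    resolved = service.peripheral
    #endif
    print("\(resolved)")
}
```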
I would like to declare an @EnvironmentObject as a protocol, namely as
protocol AccessTokenProvider: Authenticator {
    func accessToken() -> AccessTokenPublisher
}
where Authenticator is
public class Authenticator: NSObject, ObservableObject
However, when I add this to the environment as
var authenticator: AccessTokenProvider = MyAuthenticator()
[…]
ContentView().environmentObject(authenticator)
the compiler throws an error at:
struct ContentView: View {
    // error: property type 'AccessTokenProvider' does not match that of the
    // 'wrappedValue' property of its wrapper type 'EnvironmentObject'
    @EnvironmentObject var authenticator: AccessTokenProvider
Is this even a good idea? If so, what am I doing wrong?
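One workaround I am considering (a sketch with simplified types; AccessTokenPublisher is replaced by a plain publisher here): make the view generic over the protocol, so that @EnvironmentObject gets a concrete ObservableObject type at the point of use.
```
import SwiftUI
import Combine

public class Authenticator: NSObject, ObservableObject { }

protocol AccessTokenProvider: Authenticator {
    func accessToken() -> AnyPublisher<String, Never>
}

final class MyAuthenticator: Authenticator, AccessTokenProvider {
    func accessToken() -> AnyPublisher<String, Never> {
        Just("token").eraseToAnyPublisher()
    }
}

// The view is generic over the protocol, so the property wrapper sees a
// concrete type that is known to be an ObservableObject.
struct ContentView<Provider: AccessTokenProvider>: View {
    @EnvironmentObject var authenticator: Provider

    var body: some View {
        Text("Authenticated content")
    }
}

// Usage: the concrete type has to be spelled out somewhere, e.g.
// ContentView<MyAuthenticator>().environmentObject(MyAuthenticator())
```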
Context: Authorized Network requests.
I have a process (a publisher, to be precise), which will give me a fresh access token for a request.
To do that, I may need user interaction, i.e. I need to show a view, an alert, or something like that.
How can I do that in SwiftUI? Can I just (modally) display something without explicitly being in a View body? Would I have to pass in a "superview" to do this, or do you have other ideas?
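The only approach I can think of so far looks like this sketch (names are illustrative): the token process flips a published flag when it needs the user, and a view near the root observes that flag and presents the UI modally.
```
import SwiftUI

final class InteractionCoordinator: ObservableObject {
    // The token-refresh publisher would set this when it needs the user.
    @Published var needsUserInteraction = false
}

struct RootView: View {
    @EnvironmentObject var coordinator: InteractionCoordinator

    var body: some View {
        Text("App content")
            .sheet(isPresented: $coordinator.needsUserInteraction) {
                Text("Sign-in UI goes here")
            }
    }
}
```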
Thanks
Alex