Apple Silicon

Build apps, libraries, frameworks, plug-ins, and other executable code that run natively on Apple silicon.

Posts under Apple Silicon tag

67 Posts
Cannot install Python from source on Apple Silicon M1
I am trying to install Python from source according to the README using:

```
./configure
make            <-- error happens here
make test
sudo make altinstall
```

However, I cannot complete the make command, since it fails with:

```
Undefined symbols for architecture arm64:
  "_libintl_bindtextdomain", referenced from:
      __locale_bindtextdomain in _localemodule.o
  "_libintl_dcgettext", referenced from:
      __locale_dcgettext in _localemodule.o
  "_libintl_dgettext", referenced from:
      __locale_dgettext in _localemodule.o
  "_libintl_gettext", referenced from:
      __locale_gettext in _localemodule.o
  "_libintl_setlocale", referenced from:
      __locale_setlocale in _localemodule.o
      __locale_localeconv in _localemodule.o
  "_libintl_textdomain", referenced from:
      __locale_textdomain in _localemodule.o
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [Programs/_freeze_module] Error 1
```

Looks like make is somehow using the wrong architecture. I just don't know why. Does anyone have an idea?
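For what it's worth, the undefined `_libintl_*` symbols all belong to GNU gettext, so the linker is most likely finding a gettext built for the wrong architecture (or none at all) rather than building the wrong architecture overall. A minimal sketch of a possible fix, assuming gettext comes from Homebrew under its default Apple Silicon prefix `/opt/homebrew`:

```sh
# Assumption: Homebrew is installed at /opt/homebrew (Apple Silicon default).
brew install gettext

# Point CPython's configure/link step at the arm64 gettext.
export CPPFLAGS="-I/opt/homebrew/opt/gettext/include"
export LDFLAGS="-L/opt/homebrew/opt/gettext/lib"

make distclean   # discard objects built with the old settings
./configure
make
```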
Replies: 2 · Boosts: 0 · Views: 709 · Activity: Sep ’23
Unable to use bfloat on M1 Ultra
I have the higher-end M1 Mac Studio, and I have had a lot of success with Metal pipelines. However, I tried to compile a compute pipeline that uses the bfloat type, and it seems to have no idea what that is. Error:

```
program_source:10:55: error: unknown type name 'bfloat'; did you mean 'float'?
```

Is there an OS update that is necessary for this support?
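One data point: per the Metal Shading Language specification, bfloat is an MSL 3.1 feature (the Xcode 15 / macOS Sonoma toolchain generation), so runtime compilation on an older OS rejects the type name exactly like this. A hedged sketch that gates on the MSL version macro and falls back to half where bfloat isn't available:

```metal
#include <metal_stdlib>
using namespace metal;

// bfloat only exists from MSL 3.1 onward; degrade to half otherwise.
#if defined(__METAL_VERSION__) && (__METAL_VERSION__ >= 310)
typedef bfloat storage_t;
#else
typedef half storage_t;
#endif

kernel void scale_by_two(device const storage_t *in  [[buffer(0)]],
                         device storage_t       *out [[buffer(1)]],
                         uint gid [[thread_position_in_grid]])
{
    // Compute in float; store in the reduced-precision type.
    out[gid] = storage_t(float(in[gid]) * 2.0f);
}
```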
Replies: 1 · Boosts: 0 · Views: 452 · Activity: Aug ’23
Error with GPU JIT function with GPU tensor UNIMPLEMENTED: DefaultDeviceAssignment not supported for Metal Client.
Hi everyone, I'm trying to test some functionality of jax-metal and got this error. Any help please?

```python
import jax
import jax.numpy as jnp
import numpy as np

def f(x):
    y1 = x + x*x + 3
    y2 = x*x + x*x.T
    return y1 * y2

x = np.random.randn(3000, 3000).astype('float32')
jax_x_gpu = jax.device_put(jnp.array(x), jax.devices('METAL')[0])
jax_x_cpu = jax.device_put(jnp.array(x), jax.devices('cpu')[0])
jax_f_gpu = jax.jit(f, backend='METAL')
jax_f_gpu(jax_x_gpu)
```

```
---------------------------------------------------------------------------
XlaRuntimeError                           Traceback (most recent call last)
Cell In[1], line 17
     13 jax_x_cpu = jax.device_put(jnp.array(x), jax.devices('cpu')[0])
     15 jax_f_gpu = jax.jit(f, backend='METAL')
---> 17 jax_f_gpu(jax_x_gpu)
    [... skipping hidden 5 frame]
File ~/.virtualenvs/jax-metal/lib/python3.11/site-packages/jax/_src/pjit.py:817, in _create_sharding_with_device_backend(device, backend)
    814 elif backend is not None:
    815   assert device is None
    816   out = SingleDeviceSharding(
--> 817       xb.get_backend(backend).get_default_device_assignment(1)[0])
    818 return out

XlaRuntimeError: UNIMPLEMENTED: DefaultDeviceAssignment not supported for Metal Client.
```
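Reading the traceback, the failure comes from passing `backend='METAL'` to `jax.jit`, which makes JAX ask the Metal client for a default device assignment it doesn't implement. A hedged workaround sketch (not confirmed jax-metal guidance): drop the `backend` argument and let the jitted function follow the placement of its inputs, which were already committed to the Metal device with `device_put`:

```python
import jax
import jax.numpy as jnp
import numpy as np

def f(x):
    y1 = x + x*x + 3
    y2 = x*x + x*x.T
    return y1 * y2

x = np.random.randn(3000, 3000).astype('float32')

# Commit the input to the Metal device explicitly (requires jax-metal).
x_gpu = jax.device_put(jnp.array(x), jax.devices('METAL')[0])

# jit WITHOUT backend=...: the compiled function runs where its inputs
# are committed, so the Metal client is never asked for a default
# device assignment.
f_gpu = jax.jit(f)
f_gpu(x_gpu)
```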
Replies: 0 · Boosts: 0 · Views: 541 · Activity: Aug ’23
OpenGL stutter on Apple Silicon
We use an in-house OpenGL app to provide the out-the-window visuals for our flight simulators. The app is cross-platform, but until now the Mac version was only used by desktop researchers, not in our primary sim labs. Now we are attempting to replace some Windows boxes with Mac Studios. We can easily maintain a high framerate, and visual quality is excellent, but we are finding that the graphics stutter a bit during high yaw rates (which quickly force new assets into view). I've eliminated unnecessary processes, tried raising my priority via pthread_set_qos_class_self_np() or thread_policy_set(), and reduced texture quality, all of which helped, but none of it eliminated the problem. For background: we use framebuffers, we have a very large texture database (90 GB), and the render code runs in the main thread (not a secondary thread). What might I be missing?
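Given the 90 GB texture database and the single-threaded render loop, one likely culprit is synchronous texture creation on the render thread as new assets come into view. A minimal sketch of streaming uploads from a loader thread through a second context that shares objects with the render context (assumes CGL and a GL 3.2 core profile for fence syncs; in a real app you would keep one loader context alive for the thread's lifetime rather than creating one per texture):

```c
#include <OpenGL/OpenGL.h>   /* CGL */
#include <OpenGL/gl3.h>

/* Runs on a loader thread: upload one texture via a context that
   shares its object space with the render context. */
GLuint stream_texture(CGLContextObj renderCtx, CGLPixelFormatObj pix,
                      const void *pixels, GLsizei w, GLsizei h)
{
    CGLContextObj loaderCtx;
    CGLCreateContext(pix, renderCtx, &loaderCtx);  /* shared objects */
    CGLSetCurrentContext(loaderCtx);

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* Block here, on the loader thread, until the upload completes,
       so the render thread never stalls on first use. */
    GLsync done = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
    glClientWaitSync(done, GL_SYNC_FLUSH_COMMANDS_BIT, 1000000000ull);
    glDeleteSync(done);

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(loaderCtx);
    return tex;  /* hand the name to the render thread via a queue */
}
```

The other half of the fix is prefetching: kick off these uploads for assets just outside the view frustum, biased by yaw rate, so they finish before the renderer needs them.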
Replies: 0 · Boosts: 0 · Views: 615 · Activity: Aug ’23
Maximize memory read bandwidth on M1 Ultra/M2 Ultra
I am in the process of developing a matrix-vector multiplication kernel. While conducting performance evaluations, I've noticed that on M1/M1 Pro/M1 Max, the kernel demonstrates an impressive memory bandwidth utilization of around 90%. However, when executed on the M1 Ultra/M2 Ultra, this figure drops to approximately 65%. My suspicion is that this discrepancy is attributed to the dual-die architecture of the M1 Ultra/M2 Ultra. It's plausible that the necessary data might be stored within the L2 cache of the alternate die. Could you kindly provide any insights or recommendations for mitigating the occurrence of on-die L2 cache misses on the Ultra chips? Additionally, I would greatly appreciate any general advice aimed at enhancing memory load speeds on these particular chips.
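Not an answer to the cross-die question specifically, but one general, hedged lever for load-bound kernels is issuing wider per-thread loads so each thread keeps more memory requests in flight. For a matrix-vector kernel, that means reading rows as float4 (or wider). A sketch, assuming row-major storage with the column count divisible by 4:

```metal
#include <metal_stdlib>
using namespace metal;

// y = A * x, one thread per output row.
// float4 loads issue 16 bytes per instruction, which generally helps
// keep DRAM bandwidth saturated on load-limited kernels.
kernel void matvec(device const float4 *A     [[buffer(0)]],  // rows * cols/4
                   device const float4 *x     [[buffer(1)]],  // cols/4
                   device float        *y     [[buffer(2)]],  // rows
                   constant uint       &cols4 [[buffer(3)]],  // cols / 4
                   uint row [[thread_position_in_grid]])
{
    device const float4 *rowPtr = A + row * cols4;
    float acc = 0.0f;
    for (uint i = 0; i < cols4; ++i) {
        acc += dot(rowPtr[i], x[i]);
    }
    y[row] = acc;
}
```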
Replies: 0 · Boosts: 0 · Views: 640 · Activity: Aug ’23
DeepFaceLab: macOS GUI port
I've been trying to get the bash/script version of DeepFaceLab to work on Apple Silicon Macs, but this was originally a Windows project that even now has non-existent support for macOS/Apple Silicon. I am thinking of converting everything into a native macOS app using Swift, specifically optimized for Apple Silicon GPUs. Here's what I got from ChatGPT. Any help/advice on how to do this would be greatly appreciated. I don't have any Swift programming experience, but I have experience with some coding and can generally figure things out. I know that this is probably not feasible for a single individual with little programming experience, but I wanted to throw this out there to see what others think. Thank you.

Here's a high-level overview of the steps involved in porting DeepFaceLab to Swift with a graphical UI:

1. Understand DeepFaceLab: Thoroughly study the DeepFaceLab project, its Python scripts, and the overall architecture to grasp its functionality and dependencies.
2. Choose a Swift framework: Decide on the UI framework you want to use for the macOS app. SwiftUI is Apple's latest UI framework and works across all Apple platforms, including macOS. Alternatively, you can use AppKit for a more traditional approach.
3. Rewrite Python in Swift: Convert the Python code from DeepFaceLab into Swift. You'll need to rewrite all the image processing, deep learning, and video manipulation code in Swift, potentially using third-party Swift libraries or native macOS frameworks.
4. Deep learning integration: Replace the Python-based deep learning library used in DeepFaceLab with a Swift-compatible deep learning framework (see the sketch after this list). TensorFlow and PyTorch both offer Swift APIs, but you may need to adapt the specific model implementation to Swift.
5. Image processing: Find equivalent Swift libraries or frameworks for the image processing tasks used in DeepFaceLab.
6. UI development: Design and implement the graphical user interface using SwiftUI or AppKit. You'll need to create views, controls, and navigation elements to interact with the underlying Swift code.
7. Integration: Connect the Swift code with the UI components, ensuring that actions in the GUI trigger the appropriate Swift functions and display results back to the user.
8. Testing and debugging: Rigorously test the Swift application and debug any issues that arise during the porting process.
9. Optimization: Ensure that the Swift app performs efficiently and effectively on macOS devices.
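On the deep learning step: the practical GPU path on Apple Silicon is usually Metal Performance Shaders Graph (or converting the models to Core ML) rather than Swift bindings for TensorFlow/PyTorch. Purely as an illustration (the layer, shapes, and values here are made up, and DeepFaceLab's real models would need conversion), this is roughly what MPSGraph plumbing looks like in Swift:

```swift
import Foundation
import Metal
import MetalPerformanceShadersGraph

// Illustrative only: a toy y = relu(x · W) layer standing in for a
// real (converted) DeepFaceLab model.
let mtlDevice = MTLCreateSystemDefaultDevice()!
let device = MPSGraphDevice(mtlDevice: mtlDevice)
let graph = MPSGraph()

let x = graph.placeholder(shape: [1, 4], dataType: .float32, name: "x")
let w = graph.placeholder(shape: [4, 2], dataType: .float32, name: "w")
let y = graph.reLU(with: graph.matrixMultiplication(primary: x, secondary: w, name: nil),
                   name: nil)

// Wrap Swift arrays as graph inputs.
func tensorData(_ values: [Float], shape: [NSNumber]) -> MPSGraphTensorData {
    MPSGraphTensorData(device: device,
                       data: values.withUnsafeBufferPointer { Data(buffer: $0) },
                       shape: shape,
                       dataType: .float32)
}

let feeds = [x: tensorData([1, 2, 3, 4], shape: [1, 4]),
             w: tensorData([1, 0, 0, 1, 1, 0, 0, 1], shape: [4, 2])]
let result = graph.run(feeds: feeds, targetTensors: [y], targetOperations: nil)
// result[y] now holds the 1x2 output tensor.
```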
Replies: 0 · Boosts: 0 · Views: 468 · Activity: Jul ’23
Multi-threaded Audio Unit plug-in on Apple Silicon
I am developing a multi-threaded instrument plug-in for Audio Unit v2. It is a software synthesizer that is proven to work on Intel Macs and has been converted to run natively on Apple Silicon. I see a problem when using Logic Pro on Apple Silicon Macs:

1. Insert the synthesizer on an instrument track.
2. Make sure no track exists other than the one just created.
3. Put the track into recording mode.

When these steps are followed, Logic Pro's performance meter shows the load concentrated on one specific core, far exceeding the total load seen when the work is spread across cores. The load is continuous and goes away when another track is created and selected. That the load concentrates on a particular core is understandable as a specification; however, the magnitude of the load is abnormal. In fact, when the peak exceeds 100%, it produces audible noise. In this situation the Activity Monitor included with macOS does not show any increase in the usage of a specific CPU core, and the Time Profiler included with Xcode did not identify any location that takes a large amount of time. We examined various experimental programs and found a positive correlation between the frequency of thread switches in the multi-threaded sections (a mutex is used for the thread switch) and the peak of this CPU spike.

In summary, we suspect that performance gets worse when multi-threaded processing ends up on a single core. Is there any solution to this problem at the developer level, or at the customer level in Logic Pro?

Environment: MacBook Pro 16-inch, 2021; CPU: Apple M1 Max; OS: macOS 12.6.3; memory: 32 GB; Logic Pro 10.7.9; built-in speaker; audio buffer size: 32 samples. (Attached screenshots: the performance meter before symptoms occurred, and the performance meter with symptoms in the recording condition.)
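A hedged avenue to investigate (API names from WWDC20's audio workgroups material; verify against the AudioToolbox headers): on Apple Silicon, helper real-time threads that never join the host's audio workgroup are scheduled outside the audio I/O deadline, which matches the symptom of one overloaded core that the profilers can't see. An AUv2 plug-in publishes an observer block through kAudioUnitProperty_RenderContextObserver, caches the workgroup the host hands it, and joins its DSP worker threads to that workgroup:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <os/workgroup.h>

/* Workgroup handed to us by the host; written from the observer block,
   read by the DSP worker threads. */
static os_workgroup_t g_host_workgroup = NULL;

/* Return this block from the AU's GetProperty handler for
   kAudioUnitProperty_RenderContextObserver; the host invokes it whenever
   the render context (and thus the workgroup) changes. */
static AURenderContextObserver observer = ^(const AudioUnitRenderContext *ctx) {
    g_host_workgroup = ctx->workgroup;
};

/* Each DSP worker joins the host workgroup before its real-time loop so
   the scheduler treats it as part of the audio I/O deadline. */
static void *dsp_worker(void *arg) {
    os_workgroup_t wg = g_host_workgroup;
    os_workgroup_join_token_s token;
    if (wg != NULL && os_workgroup_join(wg, &token) == 0) {
        /* ... per-buffer DSP, synchronized with the render callback ... */
        os_workgroup_leave(wg, &token);
    }
    return NULL;
}
```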
Replies: 0 · Boosts: 0 · Views: 609 · Activity: Jul ’23
How can I test my native macOS app on Intel-based hardware when I developed it on Apple Silicon (M2 chip)?
I've written a native app for macOS on my MacBook Air (with the Apple M2 chip). Now I need to test it on an Intel-based CPU. When I build my app in Xcode, it is supposed to cover both ARM64 and x86-64 architectures in a single Mach-O binary, but when I send it to my customer, he tells me that the app works on Apple Silicon but crashes on his Intel-based Mac. So I'm looking for ways to test-run my app on an Intel-based platform and see what is wrong there. (I obviously don't want to buy a separate Mac just for that.) I know that one can use Azure to spin up a Windows or Linux VM and open it via a web browser, but it doesn't seem to support macOS. How can I run an Intel-based macOS in a virtual environment? Or do you have any other suggestions?
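Short of real Intel hardware, a partial option is exercising the x86_64 slice of the universal binary under Rosetta 2 on the M2 itself. It's not a perfect substitute for a real Intel Mac (Rosetta hides some differences), but it often reproduces architecture-specific crashes. A sketch:

```sh
# Rosetta 2 must be installed once per machine.
softwareupdate --install-rosetta

# Confirm the binary is actually universal.
lipo -archs MyApp.app/Contents/MacOS/MyApp
# expected output: x86_64 arm64

# Force the x86_64 slice to run under Rosetta 2.
arch -x86_64 MyApp.app/Contents/MacOS/MyApp
```

Recent Xcode versions can also offer a "My Mac (Rosetta)" run destination for debugging the x86_64 slice directly; if it's hidden, look for a "Show Rosetta Destinations" toggle in the Product > Destination menu.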
Replies: 3 · Boosts: 0 · Views: 1.2k · Activity: Jul ’23
AVSpeechSynthesisVoice.speechVoices() - different behavior on Mac (Designed for iPhone) vs. iOS, and MANY errors when checking .audioFileSettings properties
We recently started working on getting an iOS app to work on Macs with Apple Silicon as a "Designed for iPhone" app and are having issues with speech synthesis. Specifically, voices returned by AVSpeechSynthesisVoice.speechVoices() do not all work on the Mac. When we build an utterance and attempt to speak, the synthesizer falls back on a default voice and says some very odd text about voice parameters (that is not in the utterance speech text) before it does say the intended speech. Here is some sample code to set up the utterance and speak:

```swift
func speak(_ text: String, _ settings: AppSettings) {
    let utterance = AVSpeechUtterance(string: text)
    if let voice = AVSpeechSynthesisVoice(identifier: settings.selectedVoiceIdentifier) {
        utterance.voice = voice
        print("speak: voice assigned \(voice.audioFileSettings)")
    } else {
        print("speak: voice error")
    }
    utterance.rate = settings.speechRate
    utterance.pitchMultiplier = settings.speechPitch
    do {
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playback, mode: .default, options: .duckOthers)
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
        self.synthesizer.speak(utterance)
        return
    } catch let error {
        print("speak: Error setting up AVAudioSession: \(error.localizedDescription)")
    }
}
```

When running the app on the Mac, this is the kind of error we get with "com.apple.eloquence.en-US.Rocko" as the selectedVoiceIdentifier:

```
speak: voice assgined [:]
2023-05-29 18:00:14.245513-0700 A.I.[9244:240554] [aqme] AQMEIO_HAL.cpp:742  kAudioDevicePropertyMute returned err 2003332927
2023-05-29 18:00:14.410477-0700 A.I.[9244:240554] Could not retrieve voice [AVSpeechSynthesisProviderVoice 0x6000033794f0] Name: Rocko, Identifier: com.apple.eloquence.en-US.Rocko, Supported Languages ( "en-US" ), Age: 0, Gender: 0, Size: 0, Version: (null)
[the same "Could not retrieve voice" line repeats five more times]
2023-05-29 18:00:14.416804-0700 A.I.[9244:240554] [AXTTSCommon] Audio Unit failed to start after 5 attempts.
2023-05-29 18:00:14.416974-0700 A.I.[9244:240554] [AXTTSCommon] VoiceProvider: Could not start synthesis for request SSML Length: 140, Voice: [AVSpeechSynthesisProviderVoice 0x6000033794f0] Name: Rocko, Identifier: com.apple.eloquence.en-US.Rocko, Supported Languages ( "en-US" ), Age: 0, Gender: 0, Size: 0, Version: (null), converted from tts request [TTSSpeechRequest 0x600002c29590] <speak><voice name="com.apple.eloquence.en-US.Rocko">How much wood would a woodchuck chuck if a wood chuck could chuck wood?</voice></speak> language: en-US footprint: premium rate: 0.500000 pitch: 1.000000 volume: 1.000000
2023-05-29 18:00:14.428421-0700 A.I.[9244:240360] [VOTSpeech] Failed to speak request with error: Error Domain=TTSErrorDomain Code=-4010 "(null)". Attempting to speak again with fallback identifier: com.apple.voice.compact.en-US.Samantha
```

When we run AVSpeechSynthesisVoice.speechVoices(), "com.apple.eloquence.en-US.Rocko" is absolutely in the list, but it fails to speak properly. Notice that the line:

```swift
print("speak: voice assigned \(voice.audioFileSettings)")
```

shows:

```
speak: voice assgined [:]
```

An empty .audioFileSettings seems to be a common factor for the voices that do not work properly on the Mac. For voices that do work, we see this kind of output and values in the .audioFileSettings:

```
speak: voice assigned ["AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 16, "AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 0, "AVSampleRateKey": 22050, "AVLinearPCMIsNonInterleaved": 0, "AVNumberOfChannelsKey": 1]
```

So we added a function to check the .audioFileSettings for each voice returned by AVSpeechSynthesisVoice.speechVoices():

```swift
// The voices are set in init(); each voice is paired with its
// audioFileSettings count (0 until checked).
var voices: [(AVSpeechSynthesisVoice, Int)] =
    AVSpeechSynthesisVoice.speechVoices().map { ($0, 0) }
...
func checkVoices() {
    DispatchQueue.global().async { [weak self] in
        guard let self = self else { return }
        let checkedVoices = self.voices.map { ($0.0, $0.0.audioFileSettings.count) }
        DispatchQueue.main.async {
            self.voices = checkedVoices
        }
    }
}
```

That looks simple enough, and it does work to identify which voices have no data in their .audioFileSettings. But we have to run it asynchronously, because on a real iPhone device it takes more than 9 seconds and produces a tremendous amount of error spew to the console:

```
2023-06-02 10:56:59.805910-0700 A.I.[17186:910118] [catalog] Query for com.apple.MobileAsset.VoiceServices.VoiceResources failed: 2
2023-06-02 10:56:59.971435-0700 A.I.[17186:910118] [catalog] Query for com.apple.MobileAsset.VoiceServices.VoiceResources failed: 2
2023-06-02 10:57:00.122976-0700 A.I.[17186:910118] [catalog] Query for com.apple.MobileAsset.VoiceServices.VoiceResources failed: 2
2023-06-02 10:57:00.144430-0700 A.I.[17186:910116] [AXTTSCommon] MauiVocalizer: 11006 (Can't compile rule): regularExpression=\Oviedo(?=, (\x1b\\pause=\d+\\)?Florida)\b, message=unrecognized character follows \, characterPosition=1
2023-06-02 10:57:00.147993-0700 A.I.[17186:910116] [AXTTSCommon] MauiVocalizer: 16038 (Resource load failed): component=ttt/re, uri=, contentType=application/x-vocalizer-rettt+text, lhError=88602000
2023-06-02 10:57:00.148036-0700 A.I.[17186:910116] [AXTTSCommon] Error loading rules: 2147483648
... This goes on and on and on ...
```

There must be a better way?
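Not a fix for the underlying Mac bug, but one way to make the expensive scan tolerable (a sketch, under the assumption that audioFileSettings is stable for a given voice identifier and OS version): run it once, off the main thread, and cache the usable identifiers so later launches skip the slow query. The cache key below is hypothetical:

```swift
import AVFoundation

/// Returns identifiers of voices whose audioFileSettings is non-empty,
/// caching the (slow) scan result per OS version in UserDefaults.
func usableVoiceIdentifiers() -> Set<String> {
    let osVersion = ProcessInfo.processInfo.operatingSystemVersionString
    let cacheKey = "usableVoices-\(osVersion)"   // hypothetical cache key
    if let cached = UserDefaults.standard.stringArray(forKey: cacheKey) {
        return Set(cached)
    }
    let usable = AVSpeechSynthesisVoice.speechVoices()
        .filter { !$0.audioFileSettings.isEmpty }   // the slow part
        .map(\.identifier)
    UserDefaults.standard.set(usable, forKey: cacheKey)
    return Set(usable)
}
```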
Replies: 5 · Boosts: 1 · Views: 1.9k · Activity: Sep ’23
SwiftUI Stepper Crashes (EXC_BAD_ACCESS) on My Mac (Designed for iPhone) but works fine on iOS device/simulator?
I've been working on an iOS project for the iPhone and would like to support running it on macOS computers with Apple Silicon. In Targets / Supported Destinations we added "Mac (Designed for iPhone)", but experienced Thread 1: EXC_BAD_ACCESS crashes immediately when we tried to run it. We've isolated it down to Stepper UI elements in our view. Starting a new project and just trying to present a single Stepper in the ContentView, we get the same crash. Here is code that presents the issue:

```swift
// ContentView.swift
import SwiftUI

struct ContentView: View {
    @State var someValue = 5

    var body: some View {
        VStack {
            Stepper("Stepper", value: $someValue, in: 0...10)
        }
    }
}
```

When run from Xcode on an iOS device or the simulator, it runs fine. Trying to run it on the Mac, it crashes here:

```swift
// Stepper_01App.swift
import SwiftUI

@main // <-- Thread 1: EXC_BAD_ACCESS (code=2, address=0x16a643f70)
struct Stepper_01App: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```

Xcode 14.3 (14E222b), macOS Ventura 13.3.1 (a), Mac mini M2. Target: Mac (Designed for iPhone). We have verified that the same code crashes on all the Apple Silicon Macs we have access to. Searching the Internet and the Apple Developer forums, I don't find other reports, so I kind of feel there must be some level of either user error or system/project misconfiguration going on. If any iOS app that used Steppers just crashed when run on a Mac, it seems like this would be a big deal. If anyone has input or can point out what we need to do differently, it would be appreciated!
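Until the underlying crash is understood, one hedged workaround is swapping in a hand-rolled stepper only when the app detects it is an iOS app running on a Mac, keeping the native Stepper everywhere else. A sketch:

```swift
import SwiftUI

/// Minimal stand-in for Stepper, used only when running as an iOS app
/// on macOS (where the native control crashes for us).
struct SafeStepper: View {
    let label: String
    @Binding var value: Int
    let range: ClosedRange<Int>

    var body: some View {
        if ProcessInfo.processInfo.isiOSAppOnMac {
            HStack {
                Text(label)
                Button("-") { value = max(range.lowerBound, value - 1) }
                Text("\(value)")
                Button("+") { value = min(range.upperBound, value + 1) }
            }
        } else {
            Stepper(label, value: $value, in: range)
        }
    }
}

// Usage: SafeStepper(label: "Stepper", value: $someValue, range: 0...10)
```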
Replies: 12 · Boosts: 1 · Views: 1.2k · Activity: Feb ’24
iOS Simulators not listed as Run Destinations (Xcode 14.3)
For all my iOS projects, only simulators running iOS 16.4 are listed as run destinations, although I've installed the iOS 13 simulator and the corresponding entries are listed under "Devices & Simulators". I've toggled "Show run destination" from "Automatic" to "Always", to no avail. The deployment target is e.g. iOS 13, and I'm running Xcode Version 14.3 (14E222b) on a 14" MBP with Apple Silicon. As a stopgap I'm booting the simulator manually and installing apps via "xcrun simctl install booted APP.app" to allow some basic testing, but that's not a sustainable solution. Any help is much appreciated! Mattes
Replies: 6 · Boosts: 4 · Views: 10k · Activity: Jul ’23
Unable to change Photos permission of iPad app on Mac
Users can run our apps on Macs with Apple Silicon via the "iPad Apps on Mac" feature. The apps use PHPhotoLibrary.requestAuthorization(for: .addOnly, handler: callback) to request write-only access to the user's photo library during image export. This works as intended on macOS, but a huge problem arises when the user denies access (by accident or intentionally) and later decides that they want us to add their image to Photos: there is no way to grant this permission again. In System Preferences → Privacy & Security → Photos, the app is simply not listed; in fact, none of the "iPad Apps on Mac" apps appear here. Not even tccutil reset all my.bundle.id works; it just reports tccutil: Failed to reset all approval status for my.bundle.id. Uninstalling, restarting the Mac, and reinstalling the app doesn't work either. The system seems to remember the initial decision. Is this an oversight in the integration of those apps with macOS, or are we missing something fundamental here? Is there maybe a way to prompt the user again?
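One thing worth trying (hedged; it's unclear whether it helps for "iPad Apps on Mac" specifically): resetting just the Photos TCC service rather than all, since tccutil takes a service name with an optional bundle identifier:

```sh
# Reset only the Photos permission for one app.
tccutil reset Photos my.bundle.id

# Or reset the Photos service for every app.
tccutil reset Photos
```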
Replies: 3 · Boosts: 3 · Views: 1.1k · Activity: Sep ’23
Virtualization framework: connection invalid when starting VM
Hi. Sorry if this question has been answered in another post; if it has, I can't find it. My device is a MacBook Pro 16-inch, M1, 2021. I tried to create a VM using this guide from Apple, following it with a Debian image. Everything worked fine until the machine appeared to be stuck at some point during the installation. I chose my languages, then got another prompt asking me to install something, but I can't remember precisely the step at which I thought it was frozen (I think it was the GNOME install). Because the machine was not responding for several minutes (I might have been too hurried), I quit the process by simply clicking the Quit button in the VM window. The problem is that from that point onward, I can't load any VM anymore. The build succeeds in Xcode, and the machine starts but immediately quits with this response in the Xcode logs:

```
Virtual machine successfully started.
Guest did stop virtual machine.
2023-02-02 22:22:45.413600+0100 GUILinux[22984:380971] [client] No error handler for XPC error: Connection invalid
```

I just can't understand why. I tried to delete everything and follow the guide again, but it doesn't work. I will add that it's my first time using Xcode, and I might have missed something obvious.
Replies: 3 · Boosts: 1 · Views: 1.3k · Activity: Aug ’23
Menus work fine on iPad and Mac Catalyst but crash on Apple Silicon
Hi, I have an iPad app that has menus, like:

```swift
CommandGroup(replacing: .help) {
    Button("Help") { showHelp = true }
        .keyboardShortcut("/")
}
```

They work fine on iPad, and also when compiled for Mac Catalyst, but on an Apple Silicon Mac the app crashes when the menu items are selected, with errors like:

```
[General] -[_UIEditMenuInteractionMenuController propertyList]: unrecognized selector sent to instance 0x600000190540
```

I did not use storyboards and only use SwiftUI. Any suggestions? Note: of course the best solution would be to compile for Mac Catalyst, but the app has some other issues when run under Mac Catalyst, so I can only release it as an iPad app.
Replies: 5 · Boosts: 0 · Views: 1.3k · Activity: Oct ’23
SecureTransport Generates SSL Continuation Message Instead of TLS Client Hello on M1
I maintain a cross-platform, client-side network library for persistent TCP connections targeting Win32, Darwin, and FreeBSD platforms. I recently upgraded to a Mac Studio with M1 Max (Ventura 13.1) from a late-2015 Intel MacBook Pro (Monterey 12.6.2), and I've encountered a discrepancy between the two. For secure TCP connections my lib uses WolfSSL across all platforms, but it also supports use of system-provided security libraries; on Darwin platforms this is SecureTransport. Yes, I am aware SecureTransport is deprecated in favor of Network. I intend to attempt to integrate with Network later, but for now my architecture dictates that I use similar C-style callbacks akin to WolfSSL, OpenSSL, MbedTLS, etc. On the first call to SSLHandshake, the SecureTransport write callback generates 151 bytes for my TLS 1.2 connection to example.com:443 on both platforms. However, while on the Intel MBP I am able to continue with the full handshake, on the M1 I immediately receive 0 bytes with EOF. In Wireshark on the Intel MBP, the 151 bytes are observed as a TLS 1.2 Client Hello, while on the M1 they are observed as an SSL continuation message, and that is the last message observed.
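For what it's worth, when Wireshark shows an "SSL continuation message" instead of a Client Hello, the first bytes on the wire usually did not parse as a TLS record header; on the sending side that is often a write-callback contract bug (reporting more bytes consumed than were actually sent, so the record goes out truncated). A hedged sketch of the contract SSLSetIOFuncs expects from the write side, using only types and error codes from SecureTransport's public headers:

```c
#include <Security/SecureTransport.h>
#include <sys/socket.h>
#include <errno.h>

/* Write callback registered via SSLSetIOFuncs.
   connection wraps the socket fd. */
static OSStatus tls_write(SSLConnectionRef connection,
                          const void *data, size_t *dataLength)
{
    int fd = *(const int *)connection;
    size_t want = *dataLength;
    size_t sent = 0;

    while (sent < want) {
        ssize_t n = send(fd, (const char *)data + sent, want - sent, 0);
        if (n > 0) {
            sent += (size_t)n;
        } else if (n < 0 && errno == EAGAIN) {
            /* Report partial progress; SecureTransport will call back
               with the remaining bytes of the same record. */
            *dataLength = sent;
            return errSSLWouldBlock;
        } else {
            *dataLength = sent;
            return errSSLClosedAbort;
        }
    }
    *dataLength = sent;
    return noErr;
}
```

The key detail is returning errSSLWouldBlock with *dataLength set to the partial count, so SecureTransport resumes the same record instead of emitting a malformed one.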
Replies: 11 · Boosts: 0 · Views: 1.6k · Activity: Nov ’23