I plan to create a simple motion graphics software for macOS that animates text, basic shapes, and handles audio. I'll use SwiftUI for the UI.
What are the commonly used technologies for rendering animated graphics? Core Animation is suitable for UI animations, but not for exporting animations or controlling them precisely.
Basic requirements:
Timeline user interface
Animation of text and basic shapes
Viewer in the SwiftUI GUI with transport controls (play, pause, scrub, …)
Export to video file
Is Metal or Core Graphics typically used directly? I want to keep it as simple as possible.
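For context, here is a minimal sketch of what one possible export path could look like if frames are rendered offscreen with Core Graphics and fed to AVAssetWriter. The drawFrame helper, the output settings, and the pixel format are illustrative assumptions, not a recommendation of a specific architecture:
import AVFoundation
import CoreGraphics

// Hypothetical renderer: draws one frame of the composition at a given time.
func drawFrame(at time: Double, in context: CGContext, size: CGSize) {
    context.setFillColor(CGColor(gray: 0, alpha: 1))
    context.fill(CGRect(origin: .zero, size: size))
    // ... draw animated text and shapes for `time` here ...
}

func exportVideo(to url: URL, size: CGSize, duration: Double, fps: Int32 = 30) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)
    for frame in 0..<Int(duration * Double(fps)) {
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferCreate(nil, Int(size.width), Int(size.height),
                            kCVPixelFormatType_32ARGB, nil, &pixelBuffer)
        guard let buffer = pixelBuffer else { continue }
        CVPixelBufferLockBaseAddress(buffer, [])
        if let cgContext = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                     width: Int(size.width), height: Int(size.height),
                                     bitsPerComponent: 8,
                                     bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                     space: CGColorSpaceCreateDeviceRGB(),
                                     bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            drawFrame(at: Double(frame) / Double(fps), in: cgContext, size: size)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])
        while !input.isReadyForMoreMediaData {          // simplistic back-pressure
            Thread.sleep(forTimeInterval: 0.001)
        }
        adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(frame), timescale: fps))
    }
    input.markAsFinished()
    writer.finishWriting { }    // completion handling omitted in this sketch
}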
I have an oval UIBezierPath with a moving SKSpriteNode. I stop its motion and record the stopped position. I then restart the motion and want it to resume where it stopped.
Works great if motion is not stopped. Movement is great around entire oval Path.
Also works great as long as this stop-restart sequence occurs along the top half of the oval UIBezierPath. However, I have problems along the bottom half of this Path -- it stops okay, but the restart position is not where it previously stopped.
My method to create this oval UIBezierPath is as follows:
func createTrainPath() {
    trainRect = CGRect(x: tracksPosX - tracksWidth/2,
                       y: tracksPosY - tracksHeight/2,
                       width: tracksWidth,
                       height: tracksHeight)
    // these methods come from @DonMag
    trainPoints = generatePoints(inRect: trainRect,
                                 withNumberOfPoints: nbrPathPoints)
    trainPath = generatePathFromPoints(trainPoints!,
                                       startingAtIDX: savedTrainIndex)
} // createTrainPath
My method to stop this motion is as follows:
func stopFollowTrainPath() {
    guard (myTrain != nil) else { return }
    myTrain.isPaused = true
    savedTrainPosition = myTrain.position
    // also from @DonMag
    savedTrainIndex = closestIndexInPath(trainPath,
                                         toPoint: savedTrainPosition) ?? 0
} // stopFollowTrainPath
Finally, I call this to re-start this motion:
func startFollowTrainPath() {
    var trainAction = SKAction.follow(trainPath.cgPath,
                                      asOffset: false,
                                      orientToPath: true,
                                      speed: thisSpeed)
    trainAction = SKAction.repeatForever(trainAction)
    myTrain.run(trainAction, withKey: runTrainKey)
    myTrain.isPaused = false
} // startFollowTrainPath
Again, great if motion is not stopped. Movement is great around entire oval Path.
Again, there is no problem stopping and then restarting along the top half of the oval; the problem occurs along the bottom half.
Is there something I need to do within GameScene's update method that I am missing? For example, do I need to reconstruct my UIBezierPath every time my node moves between the top half and the bottom half, to account for the fact that the node is traveling in the opposite direction from the top half?
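Not from the original thread, but here is a rough sketch of one way the restart could be tied to the saved index, assuming the helpers shown above (generatePathFromPoints, savedTrainIndex, savedTrainPosition) and assuming SKAction.follow always begins at the first point of the path it is given:
func restartFollowTrainPath() {
    guard let points = trainPoints else { return }
    // Rebuild the path so its first point is the one closest to where the node
    // stopped, since the follow action starts from the beginning of its path.
    trainPath = generatePathFromPoints(points, startingAtIDX: savedTrainIndex)
    myTrain.position = savedTrainPosition
    var trainAction = SKAction.follow(trainPath.cgPath,
                                      asOffset: false,
                                      orientToPath: true,
                                      speed: thisSpeed)
    trainAction = SKAction.repeatForever(trainAction)
    myTrain.run(trainAction, withKey: runTrainKey)
    myTrain.isPaused = false
}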
I decided to use a club to kick a ball and let it roll on the turf in RealityKit, but right now the ball only slides and does not roll.
I add collision to the turf (static), the club (kinematic), and the ball (dynamic), and set some parameters: radius and mass.
From these parameters I calculate linear damping and inertia; I also use the time between frames and the club position to calculate speed. The code looks like this:
let radius: Float = 0.025
let mass: Float = 0.04593 // mass, in kg
var inertia = 2/5 * mass * pow(radius, 2)
let currentPosition = entity.position(relativeTo: nil)
let distance = distance(currentPosition, rgfc.lastPosition)
let deltaTime = Float(context.deltaTime)
let speed = distance / deltaTime
let C_d: Float = 0.47 // drag coefficient
let linearDamping = 0.5 * 1.2 * pow(speed, 2) * .pi * pow(radius, 2) * C_d // linear damping (1.2 is the air density)
entity.components[PhysicsBodyComponent.self]?.massProperties.inertia = SIMD3<Float>(inertia, inertia, inertia)
entity.components[PhysicsBodyComponent.self]?.linearDamping = linearDamping
// force
let acceleration = speed / deltaTime
let forceDirection = normalize(currentPosition - rgfc.lastPosition)
let forceMultiplier: Float = 1.0
let appliedForce = forceDirection * mass * acceleration * forceMultiplier
entityCollidedWith.addForce(appliedForce, at: rgfc.hitPosition, relativeTo: nil)
I also tried applyImpulse instead of addForce, like:
let linearImpulse = forceDirection * speed * forceMultiplier * mass
No matter how I adjust friction (static, dynamic) and restitution, with either addForce or applyImpulse the ball only slides. How can I solve this problem?
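Not part of the original post, but for illustration, one possible direction is to give the ball explicit spin when it is struck instead of relying only on a linear force. This sketch reuses the values from the code above (radius, mass, inertia, speed, forceDirection, entityCollidedWith) and assumes the ball entity conforms to HasPhysicsBody (for example a ModelEntity); for rolling contact the angular speed is speed / radius:
// Spin axis for rolling in the travel direction on a ground plane with +Y up.
let up = SIMD3<Float>(0, 1, 0)
let spinAxis = normalize(cross(up, forceDirection))
let angularSpeed = speed / radius                        // rolling without slipping: ω = v / r
let angularImpulse = spinAxis * angularSpeed * inertia   // L = I · ω
// entityCollidedWith is assumed to be the dynamic ball entity from the code above.
entityCollidedWith.applyLinearImpulse(forceDirection * mass * speed, relativeTo: nil)
entityCollidedWith.applyAngularImpulse(angularImpulse, relativeTo: nil)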
It’s great that we’ll be able to use Metal custom renderers in passthrough mode on visionOS.
https://developer.apple.com/wwdc24/10092
This is a lot of complicated set-up, however. It’s also unclear how occlusion and custom algorithms / raytracing will work in tandem with scene understanding. May we have a project template and/or sample? Preferably with the C API and not just Swift. This would be much appreciated and helpful to everyone who wants this set-up. I’d like to see the whole process.
Thank you for introducing this feature!
I would like to take YCbCr CVPixelBuffers from AVCaptureVideoDataOutput, apply some processing in RGB space, render to an MTKView, and pass to AVAssetWriter for recording.
Right now, I'm doing this all manually – deswing the incoming data if necessary, choose the right matrix to convert to RGB, apply processing, etc. I also have to convert back to YCbCr before feeding the frames to AVAssetWriter because encoding performs much better if I do. Is there any efficient, built-in way to achieve the same?
I can't use AVCaptureVideoPreviewLayer, since I need to do some further processing before display. I can't use AVCaptureVideoDataOutput's videoSettings to get automatic BGRA conversion because that would lose bit depth for 10 bit video formats (and isn't available on all formats anyway).
I see these Accelerate functions, but they seemingly don't use the GPU, nor do they support all the formats and bit depths I'd need.
I found reference to some undocumented MTLPixelFormats that seem to do exactly what I want, but I don't want to rely on something like this unless it's explicitly endorsed. This would also incur an RGB/YCbCr conversion on every texture read and write, right?
Is there anything I'm missing here?
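For what it's worth, here is a minimal sketch of one common building block: wrapping the luma and chroma planes of a YCbCr pixel buffer as Metal textures via CVMetalTextureCache, so the RGB conversion and processing can happen in a shader. The 8-bit pixel formats below are illustrative assumptions (10-bit biplanar formats would need .r16Unorm / .rg16Unorm plus a scale factor), and this does not address converting back to YCbCr for AVAssetWriter:
import CoreVideo
import Metal

// Wrap the planes of a biplanar 4:2:0 YCbCr CVPixelBuffer as Metal textures.
func lumaChromaTextures(from pixelBuffer: CVPixelBuffer,
                        cache: CVMetalTextureCache) -> (luma: MTLTexture, chroma: MTLTexture)? {
    func planeTexture(_ index: Int, _ format: MTLPixelFormat) -> MTLTexture? {
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, index)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, index)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(nil, cache, pixelBuffer, nil,
                                                  format, width, height, index, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }
    guard let luma = planeTexture(0, .r8Unorm),
          let chroma = planeTexture(1, .rg8Unorm) else { return nil }
    return (luma, chroma)
}

// Create the cache once per device, then wrap each incoming buffer.
let device = MTLCreateSystemDefaultDevice()!
var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(nil, nil, device, nil, &textureCache)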
Dear Apple Support Team,
I recently purchased an iPad Pro 2022 and updated it to iOS 18.2. However, I am experiencing an issue while using Call of Duty Mobile. The Game Mode activates randomly and sometimes does not activate at all. Additionally, when the Game Mode is on, the game crashes unexpectedly, causing an unstable experience. I kindly request that you address this issue in upcoming iOS updates.
Thank you for your attention and support.
Best regards,
[samadBg]
SceneKit SCNScene, macOS 15.1, Xcode 16.0
SceneView(scene: …, options: [.autoenablesDefaultLighting, .allowsCameraControl])
there is:
.rootNode.cameraNode
.rootNode.camera
.rootNode.nestedChildNodes each with its own animation
When the object is animated and dragged with the mouse to change the viewpoint, I can't return the view to the previous viewpoint.
I have reinstated a clone of the original cameraNode, restored the positions of all childNodes, and removed and re-activated all animations... in vain.
I have also cloned, removed, and replaced .rootNode.camera, in vain.
The documentation states the camera is "attached" to an SCNNode but does not say how. I make no declaration to associate .rootNode.cameraNode with .rootNode.camera, yet if either is absent there is no scene to view.
What am I missing?
Thanks
I want to implement the ability to apply Lightroom Preset (.xmp file) to an image in my app, but am running into difficulties. How can I configure things like color grading, curve, etc. in Swift?
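I can't speak to full .xmp parsing, but as a sketch of the "curve" part: once the preset's tone-curve points have been parsed into normalized values (the parsing itself is assumed and not shown), Core Image's CIToneCurve filter can apply a five-point curve:
import CoreImage
import CoreImage.CIFilterBuiltins

// Apply a five-point tone curve (points are (input, output) pairs in 0...1).
func applyToneCurve(to image: CIImage, points: [CGPoint]) -> CIImage? {
    guard points.count == 5 else { return nil }
    let filter = CIFilter.toneCurve()
    filter.inputImage = image
    filter.point0 = points[0]
    filter.point1 = points[1]
    filter.point2 = points[2]
    filter.point3 = points[3]
    filter.point4 = points[4]
    return filter.outputImage
}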
In a macOS project with RealityKit and SwiftUI, adding an OrthographicCameraComponent causes a crash in both Xcode Preview and at runtime.
import SwiftUI
import RealityKit
struct ContentView: View {
    var body: some View {
        RealityView { content in
            var camera = Entity()
            var component = OrthographicCameraComponent()
            component.scale = 5
            camera.position = [0, 0, 5]
            camera.components.set(component)
            content.add(camera)
            content.add(ModelEntity(mesh: .generateSphere(radius: 1)))
        }
    }
}

#Preview {
    ContentView()
}
Has anyone faced this issue or knows a fix?
I am writing an app to create 3D objects with curved surfaces such as a metal cabinet knob using SceneKit and Model I/O. I want the surfaces to be smooth so that edges between adjacent polygon faces are not visible. According to the documentation for MDLMesh.addNormals(withAttributeNamed: creaseThreshold:), a positive creaseThreshold value lower than 1.0 will interpolate sharper angles between faces into smooth surfaces. I have not been able to get this to work, and I need help with it.
The lines of code where the problem occurs are shown here.
let mesh = MDLMesh(scnGeometry: surfaceGeometry)
// mesh.addNormals(withAttributeNamed: "MDLVertexAttributeNormal", creaseThreshold: 0.9)
surfaceGeometry = SCNGeometry(mdlMesh: mesh)
When the code is executed with the middle line commented out, the knob object is rendered as shown in the screenshot. When that line is not commented out, the mesh is altered and the SCNNode for the knob is created with no errors, but the node is not rendered.
The questions I have are: (1) What changes do I need to make to the code so that the node will be rendered with a smooth surface? And (2) what is the recommended way of smoothing a curved surface so that edges between faces are not visible?
The full code for the function and a screenshot of the faceted knob object are attached.
![]("https://developer.apple.com/forums/content/attachment/a17feca7-ed6f-440c-add6-760a1cbf8778" "title=Screenshot cabinet knob with faceted surface.png;width=790;height=568")
code-block
func cabinetKnob() -> SCNNode {
    let controlPoints: [(x: Float, y: Float)] = [
        (0.728,-0.237), (0.176,-0.06), (0.202,0.475), (0.989,0.842),
        (-0.066,1.093), (-0.726,0.787) ]
    let pairs = bsplinePath(controlPoints)
    var knobProfile = [SCNVector3]()
    for (x,y) in pairs {
        knobProfile += [ SCNVector3(x: CGFloat(x), y: CGFloat(y), z: 0) ]
    }
    let nProfiles = 64
    // create knob by rotating knobProfile about y-axis
    let aIncrement: CGFloat = 2 * CGFloat.pi / CGFloat(nProfiles)   // ~6 degrees
    var angle: CGFloat = 0
    var knobVertices = knobProfile.map( { $0 } )
    angle = 0
    for _ in 1...nProfiles {
        angle += aIncrement
        // rotate knobProfile about y-axis
        knobVertices += knobProfile.map( { $0.rotate(about: .y, by: angle) } )
    }
    let source = SCNGeometrySource(vertices: knobVertices)
    var indices = [[UInt16]]()
    var i: UInt16 = 0
    var j: UInt16 = UInt16(knobProfile.count)   // 1st vertex of next profile
    for k in 0...nProfiles {
        var stripIndices = [UInt16]()
        if k == nProfiles { j = 0 }
        for _ in 0...knobProfile.count-1 {
            stripIndices += [i, j]
            i += 1; j += 1
        }
        indices += [stripIndices]
    }
    let elements: [SCNGeometryElement] = indices.map( {
        SCNGeometryElement(indices: $0, primitiveType: .triangleStrip) } )
    var surfaceGeometry = SCNGeometry(sources: [source], elements: elements)
    let mesh = MDLMesh(scnGeometry: surfaceGeometry)
    // mesh.addNormals(withAttributeNamed: "MDLVertexAttributeNormal", creaseThreshold: 0.9)
    surfaceGeometry = SCNGeometry(mdlMesh: mesh)
    let aluminum = SCNMaterial()
    aluminum.lightingModel = SCNMaterial.LightingModel.physicallyBased
    aluminum.diffuse.contents = NSColor(srgbRed: 0.95, green: 0.95, blue: 0.95, alpha: 1.0)
    aluminum.roughness.contents = 0.2
    aluminum.metalness.contents = 0.9
    aluminum.isDoubleSided = true
    surfaceGeometry.materials = [ aluminum ]
    let node = SCNNode(geometry: surfaceGeometry)
    return node
}
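For reference, and just as an assumption about the intended usage: the first parameter of addNormals(withAttributeNamed:creaseThreshold:) takes the Model I/O attribute-name constant MDLVertexAttributeNormal (whose value is "normal"), not the literal string "MDLVertexAttributeNormal". A minimal variation of the three relevant lines would be:
let mdlMesh = MDLMesh(scnGeometry: surfaceGeometry)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.9)   // the constant, not its name as a string
surfaceGeometry = SCNGeometry(mdlMesh: mdlMesh)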
Ever since the release of iOS 18, we've been seeing a new crash related to calling jpegData(compressionQuality:). From the reports, this isn't crashing during foreground usage of the app; instead, the user is prompted about a background app crash upon foregrounding. The stacks from crash reports show the crash happening from a variety of callers, but all go through jpegData(compressionQuality:), down through [HDRImageConverter_Metal init], and end up in a pthread mutex lock call when it crashes.
Attached is a sample crash report from 18.2 (22C5125e), but we've been seeing this since the first iOS 18 release.
Did something change around these calls in iOS 18 that prohibits their use in the background?
crash.txt
Hi,
I tried to save game progress using the official Apple plugin for Unity, but a crash happened when I activated "iCloud Documents" in Capabilities, and when I deactivated it this error message appeared:
Code=27 Domain=GKErrorDomain Description=The requested operation could not be completed because you are not signed in to iCloud or have not enabled iCloud Drive. (UbiquityContainerUnavailable)
Authentication with Game Center works fine using the Core plugin, but nothing works correctly when I use the GameKit plugin.
Note:
I have already activated iCloud for the app identifier.
Technical information:
Unity version: 2022.3.47f1 LTS
Xcode 16
Swift 6
GameKit-3.0.0 (Apple Unity plugin)
Core-3.1.5 (Apple Unity plugin)
I am working on adding synchronized physical properties to EntityEquipment in TabletopKit, allowing seamless coordination during GroupActivities sessions between players.
Setting EntityEquipment's state to DieState is not an option, because it doesn't support custom collision shapes.
I have also tried adding PhysicsBodyComponent and CollisionComponent to EntityEquipment's Entity. However, the main issue is that the position of EntityEquipment itself does not synchronize with the Entity's physics body, resulting in two separate instances of one object.
struct PlayerPawn: EntityEquipment {
    let id: ID
    let entity: Entity
    var initialState: BaseEquipmentState

    init(id: ID, entity: Entity) {
        self.id = id
        let massProperties = PhysicsMassProperties(mass: 1.0)
        let material = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.5)
        let shape = ShapeResource.generateBox(size: [0.4, 0.2, 0.2])
        let physicsBody = PhysicsBodyComponent(massProperties: massProperties, material: material, mode: .dynamic)
        let collisionComponent = CollisionComponent(shapes: [shape])
        entity.components.set(physicsBody)
        entity.components.set(collisionComponent)
        self.entity = entity
        initialState = .init(parentID: .tableID, pose: .init(position: .init(), rotation: .zero), entity: self.entity)
    }
}
I’d appreciate any guidance on the recommended approach to adding synchronized physical properties to EntityEquipment.
I am trying to install the game-porting-toolkit using
brew -v install apple/apple/game-porting-toolkit
but this fails each time because of a dependency on a deprecated openssl version:
Fetching dependencies for apple/apple/game-porting-toolkit: cmake, ninja, apple/apple/game-porting-toolkit-compiler, openssl1.1
...
...
Error: openssl@1.1 has been disabled because it is not supported upstream! It was disabled on 2024-10-24.
Is there a way to override this dependency or use a newer version of openssl for the check?
Every now and then my SceneKit game app crashes and I have no idea why. The SCNView has an overlaySKScene, so it might also be SpriteKit's fault.
The stack trace is
#0 0x0000000241c1470c in jet_context::set_fragment_texture(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, jet_texture*) ()
#27 0x000000010572fd40 in _pthread_wqthread ()
Does anyone have an idea where I could start debugging this, without being able to consistently reproduce it?
Even when the action is run on the main thread, the following code causes a crash on iOS, but not on macOS. The game launches with a simple yellow rectangle, and when it finishes fading out and should be removed from the overlay scene, the app crashes.
The code can be pasted into the file GameController.swift of Xcode's default project for Multiplatform macOS and iOS game.
import SceneKit
import SpriteKit
@MainActor
class GameController: NSObject {
    let scene: SCNScene
    let sceneRenderer: SCNSceneRenderer

    init(sceneRenderer renderer: SCNSceneRenderer) {
        sceneRenderer = renderer
        scene = SCNScene(named: "Art.scnassets/ship.scn")!
        super.init()
        sceneRenderer.scene = scene
        renderer.overlaySKScene = SKScene(size: CGSize(width: 500, height: 500))
        DispatchQueue.main.async {
            let node = SKShapeNode(rect: CGRect(x: 100, y: 100, width: 100, height: 100))
            node.fillColor = .yellow
            node.run(.sequence([
                .fadeOut(withDuration: 1),
                .removeFromParent()
            ]))
            renderer.overlaySKScene!.addChild(node)
        }
    }
}
The Xcode console shows this stacktrace:
*** Assertion failure in -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:], UIApplication.m:3246
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Call must be made on main thread'
*** First throw call stack:
(
0 CoreFoundation 0x00000001804ae0f8 __exceptionPreprocess + 172
1 libobjc.A.dylib 0x0000000180087db4 objc_exception_throw + 56
2 Foundation 0x0000000180d17058 _userInfoForFileAndLine + 0
3 UIKitCore 0x00000001853cf678 -[UIApplication _performAfterCATransactionCommitsWithLegacyRunloopObserverBasedTiming:block:] + 376
4 UIKitCore 0x000000018553f7a0 -[_UIFocusUpdateThrottle scheduleProgrammaticFocusUpdate] + 300
5 UIKitCore 0x0000000184e2e22c -[UIFocusSystem _requestFocusUpdate:] + 548
6 UIKitCore 0x0000000184e2dfa4 -[UIFocusSystem requestFocusUpdateToEnvironment:] + 76
7 UIKitCore 0x0000000184e2e864 -[UIFocusSystem _focusEnvironmentWillDisappear:] + 408
8 SpriteKit 0x00000001a3d472f4 _ZL12_removeChildP6SKNodeS0_P7SKScene + 240
9 SpriteKit 0x00000001a3d473b0 -[SKNode removeChild:] + 80
10 SpriteKit 0x00000001a3d466b8 -[SKNode removeFromParent] + 128
11 SpriteKit 0x00000001a3d1678c -[SKRemove updateWithTarget:forTime:] + 64
12 SpriteKit 0x00000001a3d1b740 _ZN11SKCSequence27cpp_updateWithTargetForTimeEP7SKCNoded + 84
13 SpriteKit 0x00000001a3d20e3c _ZN7SKCNode6updateEdf + 156
14 SpriteKit 0x00000001a3d20f20 _ZN7SKCNode6updateEdf + 384
15 SpriteKit 0x00000001a3d26fb8 -[SKScene _update:] + 464
16 SpriteKit 0x00000001a3cf3168 -[SKSCNRenderer _update:] + 80
17 SceneKit 0x000000019c932bf0 -[SCNMTLRenderContext renderSKSceneWithRenderer:overlay:atTime:] + 60
18 SceneKit 0x000000019c9ebd98 -[SCNRenderer _drawOverlaySceneAtTime:] + 204
19 SceneKit 0x000000019cb1a1c0 _ZN3C3D11OverlayPass7executeERKNS_10RenderArgsE + 60
20 SceneKit 0x000000019c8e05ec _ZN3C3D13__renderSliceEPNS_11RenderGraphEPNS_10RenderPassERtRKNS0_9GraphNodeERPNS0_5StageENS_10RenderArgsEbRPU27objcproto16MTLCommandBuffer11objc_object + 2660
21 SceneKit 0x000000019c8e18ac _ZN3C3D11RenderGraph7executeEv + 3808
22 SceneKit 0x000000019c9ed26c -[SCNRenderer _renderSceneWithEngineContext:sceneTime:] + 756
23 SceneKit 0x000000019c9ed544 -[SCNRenderer _drawSceneWithNewRenderer:] + 208
24 SceneKit 0x000000019c9ed9fc -[SCNRenderer _drawScene:] + 40
25 SceneKit 0x000000019c9edce4 -[SCNRenderer _drawAtTime:] + 500
26 SceneKit 0x000000019ca87950 -[SCNView _drawAtTime:] + 368
27 SceneKit 0x000000019c943b74 __83-[NSObject(SCN_DisplayLinkExtensions) SCN_setupDisplayLinkWithQueue:screen:policy:]_block_invoke + 44
28 SceneKit 0x000000019ca50600 -[SCNDisplayLink _displayLinkCallbackReturningImmediately] + 132
29 libdispatch.dylib 0x000000010239173c _dispatch_client_callout + 16
30 libdispatch.dylib 0x0000000102394c14 _dispatch_continuation_pop + 756
31 libdispatch.dylib 0x00000001023aa4e0 _dispatch_source_invoke + 1736
32 libdispatch.dylib 0x00000001023997f0 _dispatch_lane_serial_drain + 340
33 libdispatch.dylib 0x000000010239a774 _dispatch_lane_invoke + 420
34 libdispatch.dylib 0x00000001023a71a8 _dispatch_root_queue_drain_deferred_wlh + 324
35 libdispatch.dylib 0x00000001023a6604 _dispatch_workloop_worker_thread + 488
36 libsystem_pthread.dylib 0x000000010242bb74 _pthread_wqthread + 284
37 libsystem_pthread.dylib 0x000000010242a934 start_wqthread + 8
)
libc++abi: terminating due to uncaught exception of type NSException
Am I doing something wrong?
Hi guys! Is there any way to get a frame at a certain time? I'm writing a plug-in and want to use the 2 frames before and the 2 frames after the current frame in order to render the final image.
I've been running my SceneKit game for many weeks in Xcode without performance issues. The game itself is finished, so I thought I could go on with publishing it on the App Store, but when archiving it in Xcode and running the archived app, I noticed that it seriously hangs.
The hangs only seem to happen when I run the game in fullscreen mode. I tried disabling Game Mode, but the hangs still happen. Only when I run in windowed mode does the game run smoothly.
Instruments confirms that there are many serious hangs, but it also reports that CPU usage is quite low during those hangs, on average about 15%. From what I know, hangs happen when the main thread is busy, but how can that be when CPU usage is so low, and why does it only happen in fullscreen mode for release builds?
I'm displaying a GKGameCenterViewController after successfully authenticating and on iOS 18.0 and 18.1, I get a black screen. As a sanity check GKLocalPlayer.local.isAuthenticated is also returning true. The same code works just fine on iOS 17. Is there something that needs to be done on iOS 18 and above?
I am building a MacOS desktop app (https://anukari.com) that is using Metal compute to do real-time audio/DSP processing, as I have a problem that is highly parallelizable and too computationally expensive for the CPU.
However, it seems that because of the way I am using the GPU, even when my app is fully compute-limited, the OS never increases the power/performance state. Because this is a real-time audio synthesis application, it's a huge problem not to be able to take advantage of the full clock speeds the GPU is capable of: the app can't keep up with real time.
I discovered this issue while profiling the app using Instrument's Metal tracing (and Game tracing) modes. In the profiling configuration under "Metal Application" there is a drop-down to select the "Performance State." If I run the application under Instruments with Performance State set to Maximum, it runs amazingly well, and all my problems go away.
For comparison, when I run the app on its own, outside of Instruments, the expensive GPU computation it's doing takes around 2x as long to complete, meaning that the app performs half as well.
I've done a ton of work to micro-optimize my Metal compute code, based on every scrap of information from the WWDC videos, etc. A problem I'm running into is that I think that the more efficient I make my code, the less it signals to the OS that I want high GPU clock speeds!
I think part of why the OS is confused is that in most use cases, my computation can be done using only a small number of Metal threadgroups. I'm guessing that the OS heuristics see that only a small fraction of the GPU is saturated and fail to scale up the power/clock state.
I'm not sure what to do here; I'm in a bit of a bind. One possibility is that I intentionally schedule busy work -- spin threadgroups just to waste energy and signal to the OS that I need higher clock speeds. This is obviously a really bad idea, but it might work.
Is there any other (better) way for my app to signal to the OS that it is doing real-time latency-sensitive computation on the GPU and needs the clock speeds to be scaled up?
Note that Game Mode is not really an option, as my app also runs as an AU plugin inside hosts like GarageBand, so it can't be made fullscreen, etc.