I have done plenty of research, and I am aware of this post on the exact same question; I tried that exact solution. However, it does not work: it throws the fatal error "Couldn't create font from data". It seems like Swift cannot find my font.
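For context, here is roughly the helper from the linked answer as I adapted it ("MyCustomFont" is a stand-in for my font's actual file name; the fatalError below is the one being hit):

import SwiftUI
import CoreText

// Adapted from the linked solution. "MyCustomFont" stands in for
// my actual .otf file's name.
func registerCustomFont(named name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "otf"),
          let data = try? Data(contentsOf: url),
          let provider = CGDataProvider(data: data as CFData),
          let font = CGFont(provider) else {
        fatalError("Couldn't create font from data") // this is where it dies
    }
    var error: Unmanaged<CFError>?
    CTFontManagerRegisterGraphicsFont(font, &error)
}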
I am currently using Xcode to create a SwiftUI project in an app playground (.swiftpm) file. This file can then be moved over to the macOS Playgrounds app, the iPadOS Playgrounds app, etc. I found a custom .otf font that I want to use in my project, but I am having trouble using it.
If you have any experience with these, you will know that .swiftpm playgrounds lack an Info.plist file, so you cannot simply do what you would do in an Xcode project to use custom fonts (which involves a .plist).
How can I do this? I would really like a solution in SwiftUI, and I really wished the linked solution would work, as it seems like a great one. My only thought now is that I am putting the fonts in the wrong directory, since .swiftpm files do not come with a Resources folder, unlike an older macOS .playground file, for example. I will attach a picture of my file structure here in case that is helpful. I know for certain that the official PostScript names of my files match the names of the font variations in the screenshot (I double-checked in Mac's Font Book). I have been trying to fix this for long enough to know that I am definitely not misspelling anything, and I know the file is not corrupt because I can use it in other software and I got it from a reliable source.
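As a diagnostic, I suppose I could also dump every font the process can actually see at runtime, with something like this (just a debugging idea, not part of my app):

import UIKit

// Debugging idea: print every font family and face visible at runtime,
// to confirm whether the custom font was actually registered.
for family in UIFont.familyNames.sorted() {
    print(family, UIFont.fontNames(forFamilyName: family))
}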
If anyone knows how to do this, I would really appreciate your help. Feel free to look at the previous solution linked above. Thanks! :)
As an additional note, the Font.swift file is completely empty, so you aren't missing anything by not seeing it. And yes, I have tried moving the fonts to many, many places, including the Assets.xcassets folder. Nothing has worked so far.
Hi! I am trying to animate a button:
struct ButtonView: View {
    @State private var isPressed = false

    var label: String
    let action: () -> Void

    var body: some View {
        Text(label)
            /* There are lots of customizations here; no need to include them */
            .onTapGesture {
                isPressed = true
                action()
            }
            .scaleEffect(isPressed ? 1.25 : 1.0)
            .animation(.easeInOut.repeatCount(2, autoreverses: true), value: isPressed)
    }
}
This creates an animation that scales the button up and then back to its original size after a single tap. In addition, it performs a custom action called action(), but that isn't really relevant here; I just want to be thorough.
The problem is that when I test this and tap the button, it runs close to what you would expect, but with an additional, unwanted step. When I tap the button, it becomes large, then returns to its original size, and then it jumps back to the large size with no animation.
I believe SwiftUI is trying to keep all of the state (isPressed) in sync: it starts out false, becomes true when the button is tapped, so the ternary in .scaleEffect evaluates to 1.25, which makes the button larger. The animation then reverses to make it small again... but then it must make the size larger again (1.25) because... why? I feel like I am getting confused here.
How can I make the button simply get larger and then go back to its original size? What am I misunderstanding here? Please feel free to let me know if there are any clarifications needed. I appreciate the help! :)
Hint: Realize that this isn't a SwiftUI Button type, but rather just a custom shape that I made.
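For what it's worth, one direction I've wondered about is driving the two phases explicitly instead of relying on repeatCount, roughly like this (an untested sketch; the .animation(_:value:) modifier would be removed):

    .onTapGesture {
        action()
        // Animate the scale up, then back down once the first phase ends.
        withAnimation(.easeInOut(duration: 0.15)) {
            isPressed = true
        }
        withAnimation(.easeInOut(duration: 0.15).delay(0.15)) {
            isPressed = false
        }
    }
    .scaleEffect(isPressed ? 1.25 : 1.0)

Is that the right way to think about it, or is there a cleaner fix?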
Hello!
I have a few questions about using RealityKit on iPadOS and Mac Catalyst.
I currently have a model in Cinema 4D that has no materials added to it. I need to add a couple of simple materials to everything.
My goal is to move my model from Cinema 4D to my RealityKit app. The model can load fine in the app. However, I'm having issues with the materials.
First, when I export from Cinema 4D as a USD, the materials don't seem to come with it. I thought materials came with the model. I just get a pink striped model (no materials), which is not ideal. So that ruled out making materials in Cinema 4D for me.
Here's a test model with materials added:
Here's what it looks like when exported as a USDA in Reality Composer Pro. It looks the same when exported as a USDZ:
I made sure the materials option was checked when exporting from Cinema 4D, so I don't know what I am doing wrong. So I thought of another idea instead of making the materials in Cinema 4D.
What if I used Apple's new Reality Composer Pro app to add materials to my model? You can add physically based materials or even custom shader materials with nodes.
I thought that would work, and it does. When I export the model as a USDZ with physically based materials, they appear fine and work in my app.
However, what about custom shader materials?
Whenever I play with a custom shader material and apply it to my model, I am left with problems.
Look at this image. This is my model in Reality Composer Pro with two types of materials added from the app. The water and sand on the beach are created with physically based materials in the app. No nodes. The gold/metal ball is created with a custom shader material with nodes. Looks fine, right?
When I drag an exported USDZ of this into my Xcode project, it even looks good in the Xcode preview of the file under Resources. (Note that I am not adding the USDZ to an .rkassets bundle as Apple suggests, because .rkassets folders are only available for visionOS; this is an iPadOS + Catalyst app.)
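For reference, this is a simplified version of how I load the model in the app ("BeachScene" is a stand-in for my USDZ's actual name):

import RealityKit

// Simplified loading code; "BeachScene" stands in for my file's real name.
let modelEntity = try Entity.load(named: "BeachScene")
let anchor = AnchorEntity(world: .zero)
anchor.addChild(modelEntity)
arView.scene.addAnchor(anchor) // arView is my existing ARView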
However, when I run the actual app, only the physically based materials actually display correctly:
Besides the lighting in the scene, the physically based materials look good. However, the metal ball that used a custom shader material? It looks gray.
Why? How can I fix this?
I guarantee this is not a problem with my app or its lighting setup, etc. I tried loading a custom shader material in the visionOS simulator and it worked there! But not here.
I know Reality Composer Pro seems to be very focused on visionOS right now, but this is still just a USDZ file. Why aren't these custom shaders working?
I've been working on this problem for more than 24 hours, so I've tried everything I could think of.
I thought Reality Composer Pro would be a good place to do the textures, since it would be less error-prone than moving materials from Cinema 4D to Xcode, and I kind of proved that with my second photo.
For RealityKit on iPadOS + Catalyst, how should I be applying materials? What am I doing wrong?
P.S. Yes, this is a non-AR project with a virtual camera, which is possible with RealityKit.
Thanks for the help! :)
I have the following code:
import SwiftUI

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            GeometryReader { geometry in
                LocationPickerView()
                    .frame(maxWidth: geometry.size.width, maxHeight: geometry.size.height)
                    .ignoresSafeArea()
            }
        }
    }
}
My goal is to have LocationPickerView take up the entire screen regardless of the orientation of the device. LocationPickerView is a custom view I built:
import SwiftUI
import RealityKit

struct LocationPickerView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView()
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}
However, I have a problem: the safe area is not being ignored. At first glance, I thought this was because the frame had already been set to a "constant" value from geometry.
So, I thought I would flip the order of where I ignore the safe area and where I set the frame. This only resulted in much weirder behavior. At first, the view does take up the entire screen. However, when I rotate the device to another orientation, the view keeps its original height and width without updating, despite the GeometryReader.
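For clarity, this is the reordered version I'm describing (same code as above, with the two modifiers swapped):

GeometryReader { geometry in
    LocationPickerView()
        .ignoresSafeArea()
        .frame(maxWidth: geometry.size.width, maxHeight: geometry.size.height)
}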
What is happening here? I am sure this is expected behavior, but I don't understand why, or how to fix it. I am trying to keep things as simple as possible, so I thought I would apply the frame and safe-area modifiers in my main struct, in an attempt to keep that code out of every view.
This feels like an easy fix, but I can't figure it out.
Hello!
I'm having a very odd problem. I was trying to open a USD file in Xcode so I could then open it in Reality Composer Pro, which I had been doing without a problem for a number of weeks. However, I can't do that now. Every time I try to open a USD, Xcode briefly opens and then crashes. And every time I try to open Reality Composer Pro from Xcode's Developer Tools menu, the app's icon bounces up and down, it opens for a second (the little dot appears in the Dock), and then it just doesn't open.
I have no idea what I did. I've been using Xcode 15.2, and all of a sudden it just doesn't work anymore. The only thing I can think of is that I used an online GLB-to-USD converter and then tried opening the resulting USD, but that website had worked for me before. Plus, when I try to open other files, like USDA, it still doesn't work, so I don't think it's specific to one file type.
I tried updating to macOS Sonoma 14.3.1, but that didn't fix it. Xcode is downloaded from the Mac App Store, and I am not using any beta software. I tried the usual restart, clean build folder, etc., but nothing works.
I am really confused... it just stopped working all of a sudden. Any fixes? I am on a very tight deadline, and this app is crucial to my work.
Thanks! :)
Is it possible to render an SF Symbol on a plane in RealityKit? That is, render it as a flat 2D image without any depth?
I thought of MeshResource and generateText, but you cannot interpolate SF Symbols into a string (their literal text interpretation displays instead).
Perhaps rendering it as a texture on a plane? Any thoughts?
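To make the texture idea concrete, here's a rough, untested sketch of what I have in mind: rasterize the symbol with UIKit, then use the image as the base color of an unlit material on a plane (the function name and sizes are just placeholders):

import RealityKit
import UIKit

// Untested sketch: rasterize an SF Symbol to a CGImage, then use it
// as the texture of an unlit material on a flat plane.
func symbolPlane(named symbolName: String) throws -> ModelEntity {
    let config = UIImage.SymbolConfiguration(pointSize: 256)
    guard let cgImage = UIImage(systemName: symbolName, withConfiguration: config)?.cgImage else {
        fatalError("No such SF Symbol: \(symbolName)")
    }
    let texture = try TextureResource.generate(from: cgImage, options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    let mesh = MeshResource.generatePlane(width: 0.2, height: 0.2)
    return ModelEntity(mesh: mesh, materials: [material])
}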
Hello!
I’m trying to make a material in RealityKit that has a basic gradient. I am making an iPadOS app.
A few thoughts:
I cannot use Reality Composer Pro to do this because the Shader Graph tool only works for visionOS.
I cannot use a Metal file to create a shader because I am using a .swiftpm (app playgrounds) file targeted for Swift Playgrounds. Metal files don’t seem to work on Swift Playgrounds (it’s a Swift playground, after all).
I would prefer not to use image textures for a simple thing like this; that would take up storage. I wish it were as easy as applying a .baseColor with a UIColor, but UIColor does not support gradients.
What are my options? I know my requirements are likely not typical, but I really need to try to not break those.
I looked into CustomMaterial from RealityKit, but once again, those take Metal shaders. It's an amazing tool, but I sadly cannot use it because a Swift Playground doesn't seem to work with Metal files, at least as far as I can tell.
I've briefly researched MetalKit. Could that help me out?
Let’s say I have a simple box in RealityKit. How would I apply a simple gradient to it given my constraints?
I really appreciate the help.
P.S. This is a SwiftUI project for reference.
EDIT:
Could I create a texture without images, perhaps by rendering a view into a texture and applying it? How would I do this? What are the pros and cons of this? (I've put a rough sketch of this idea below.)
Another thought: could I just use MetalKit to create the gradient and apply it using CustomMaterial?
I will say that I'm kind of at a last resort; I am trying to create fairly straightforward materials. The two most important ones would be a two-color gradient material and a transparent water material with a bit of refraction and perhaps a little reflection.
But these are not meant to be photorealistic at all. They're meant to be fairly "2D"/simple, which makes me wonder if I could just load in a texture. The only issue is that I don't know how I would do the water.
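To illustrate the "view into a texture" idea from the edit above, here's a rough, untested sketch for the gradient half: draw a two-color linear gradient into an image with CoreGraphics, then wrap it in a material (all names here are placeholders, and I still have no idea how I'd do the water this way):

import RealityKit
import UIKit

// Untested sketch: render a two-color linear gradient into a CGImage,
// then use it as the texture of a material that can go on a box.
func gradientMaterial(top: UIColor, bottom: UIColor) throws -> UnlitMaterial {
    let size = CGSize(width: 256, height: 256)
    let image = UIGraphicsImageRenderer(size: size).image { context in
        let colors = [top.cgColor, bottom.cgColor] as CFArray
        let space = CGColorSpaceCreateDeviceRGB()
        let gradient = CGGradient(colorsSpace: space, colors: colors, locations: nil)!
        context.cgContext.drawLinearGradient(
            gradient,
            start: .zero,
            end: CGPoint(x: 0, y: size.height),
            options: []
        )
    }
    let texture = try TextureResource.generate(from: image.cgImage!, options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    return material
}

// Applying it to a simple box:
// let box = ModelEntity(mesh: .generateBox(size: 0.3),
//                       materials: [try gradientMaterial(top: .systemBlue, bottom: .white)])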
I have a transparent video that I created as a test. It's a .mov file exported as HEVC (H.265) with an alpha channel, which means it has a transparent background.
I am using it for VideoMaterial in RealityKit, which now has support for transparent videos.
I just loaded it like you would think:
import AVFoundation
import RealityKit

private static var playerLooper: AVPlayerLooper? = nil // property of the class

// Load the video from the bundle and set up a looping queue player.
let video = AVURLAsset(url: Bundle.main.url(forResource: "transparentVideo", withExtension: "mov")!)
let playerItem = AVPlayerItem(asset: video)
let queuePlayer = AVQueuePlayer()
playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: playerItem)

// Put the video on a plane via VideoMaterial.
let material = VideoMaterial(avPlayer: queuePlayer)
let myMesh = MeshResource.generatePlane(width: 300, depth: 90)
let myModel = ModelEntity(mesh: myMesh, materials: [material])
myModel.position = .init(x: 0, y: -22.48, z: 75)
queuePlayer.play()
There is one issue.
When I load the material in, it has a gray outline around it:
Everything along the edge of the transparency has a gray background, which is really odd. I've tried multiple sources and multiple ways of exporting/making the video, and I'm pretty sure it is not my video; I think it is something in RealityKit.
How can I get rid of this?
(the yellow circle is the only content in the video, it was just a test)