For all the AVP devs out there: what cloud service are you using to load content in your app with extremely low latency? I tried CloudKit and it did not work well at all. Latency was super bad :/
Firebase looks like the most promising option at this point??
Wish Apple would create an ultra low latency cloud service for streaming high quality content such as USDZ files and scenes made in Reality Composer Pro.
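In the meantime, one stopgap that helps regardless of which backend you pick is caching USDZ payloads on device, so the network round trip only happens once per asset. A minimal sketch (the remote URL and file name are placeholders for whatever host/CDN you use):

```swift
import Foundation
import RealityKit

// Download a USDZ once, then serve it from the local caches directory
// on every subsequent launch.
func loadCachedModel(named name: String, remote: URL) async throws -> Entity {
    let caches = try FileManager.default.url(for: .cachesDirectory,
                                             in: .userDomainMask,
                                             appropriateFor: nil,
                                             create: true)
    let local = caches.appendingPathComponent("\(name).usdz")

    // Only hit the network if the file isn't cached yet.
    if !FileManager.default.fileExists(atPath: local.path) {
        let (temp, _) = try await URLSession.shared.download(from: remote)
        try FileManager.default.moveItem(at: temp, to: local)
    }
    return try await Entity(contentsOf: local)
}
```

This doesn't fix first-load latency, but it makes every load after that local-disk fast, which takes a lot of pressure off the cloud provider choice.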
Anyone else get these warnings when using UITabBarController in visionOS? Will they be a problem when I submit my visionOS app for App Review?
import SwiftUI
import UIKit

struct HomeScreenWrapper: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UITabBarController {
        let tabBarController = UITabBarController()

        // Home view controller
        let homeVC = UIHostingController(rootView: HomeScreen())
        homeVC.tabBarItem = UITabBarItem(title: "Home", image: UIImage(systemName: "house"), tag: 0)

        // Brands view controller
        let brandsVC = UIHostingController(rootView: BrandSelection())
        brandsVC.tabBarItem = UITabBarItem(title: "Brands", image: UIImage(systemName: "bag"), tag: 1)

        tabBarController.viewControllers = [homeVC, brandsVC]
        return tabBarController
    }

    func updateUIViewController(_ uiViewController: UITabBarController, context: Context) {
        // Update the UI if needed
    }
}

struct HomeScreenWrapper_Previews: PreviewProvider {
    static var previews: some View {
        HomeScreenWrapper()
    }
}
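If the warnings come from bridging UIKit, one way to sidestep them entirely is the native SwiftUI TabView, which visionOS renders as the system tab bar with no UIKit wrapper. A sketch, assuming `HomeScreen` and `BrandSelection` are the same views used above:

```swift
import SwiftUI

// Native SwiftUI alternative to UITabBarController: TabView renders
// as the system tab bar on visionOS with no UIKit bridging.
struct HomeScreenTabs: View {
    var body: some View {
        TabView {
            HomeScreen()
                .tabItem { Label("Home", systemImage: "house") }
            BrandSelection()
                .tabItem { Label("Brands", systemImage: "bag") }
        }
    }
}
```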
Anyone download this yet and have your System Data **** up in size? Lesson learned to never download a beta of anything. Can barely work in Xcode 16 now, another beta I should not have downloaded :/
Any way to reduce the size of System Data properly?
Thanks!
Anyone else getting the issue below when using Shopify's SDK for iOS? Seems odd they just released a new version and it already has an issue embedded in the package.
Within Xcode, when initializing the client via the code below, do I put this code in my App declaration Swift file, or can it live in any other Swift file in my app? Thanks!
let client = Graph.Client(
    shopDomain: "your-shop-name.myshopify.com",
    apiKey: "your-storefront-access-token",
    locale: Locale(identifier: "en_US")
)
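On placement: it doesn't have to live in the App declaration file. One common pattern is a small service type in its own Swift file that owns the client. A sketch against the Mobile Buy SDK API shown above (the domain and token strings are placeholders):

```swift
import Foundation
import Buy

// Any Swift file in the app target works; a singleton keeps exactly
// one Graph.Client alive for the lifetime of the app.
final class ShopifyService {
    static let shared = ShopifyService()

    let client = Graph.Client(
        shopDomain: "your-shop-name.myshopify.com",
        apiKey: "your-storefront-access-token",
        locale: Locale(identifier: "en_US")
    )

    private init() {}
}
```

Call sites then reference `ShopifyService.shared.client` from anywhere in the app.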
Integrating Apple Pay for a Shopify store via headless ecomm: Shopify on the backend, an iOS + visionOS app on the front end. Shopify won't admit that they're in the wrong here, BUT they are. The CSR file you download from Shopify doesn't use the encryption method Apple accepts, so you have to work some OpenSSL magic in your Mac terminal to create the encryption Apple Developer requires. Open ChatGPT-4 or later and type in the below; that solves the issue on that front.
"I'm trying to upload a certificate signing request file to create a merchant identity certificate for Apple Pay. I'm getting the following error message from Apple, "CSR algorithm/size incorrect. Expected: RSA(2048)". What does this mean and how do I fix the CSR file? I've uploaded the CSR file for reference."
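For anyone hitting the same error, the OpenSSL step boils down to generating your own RSA-2048 key and CSR instead of using Shopify's download, since Apple expects RSA(2048). A sketch; the file names and subject fields are placeholders:

```shell
# Generate an RSA-2048 private key and a CSR in one step.
# Apple's merchant identity certificate upload expects RSA(2048).
openssl req -new -newkey rsa:2048 -nodes \
  -keyout applepay.key -out applepay.csr \
  -subj "/C=US/O=Your Company/CN=merchant.com.yourcompany.app"

# Sanity-check the request before uploading to Apple Developer.
openssl req -in applepay.csr -noout -verify -subject
```

Upload the resulting `.csr` to Apple Developer, and keep the `.key` file safe; it pairs with the merchant identity certificate Apple issues.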
My issue now is that Shopify gives me an error when attempting to upload the Merchant ID certificate from Apple Developer. Simply stating, "An error occurred while trying to save the certificate."
Anyone else get this far and run into this issue? Thanks
I need to offer Apple Pay within my visionOS app for ecomm transactions. Does this functionality exist yet for Apple Vision Pro apps? Below is the current documentation and I do not see visionOS listed :/ Just want to confirm.
Apple Pay Developer Doc
Thanks!
Anyone out there incorporate UE5 assets into their app yet? Epic will be offering support for the full immersion style, which is so rad.
Building in visionOS, and one of my SwiftUI views keeps causing Xcode to crash. The root issue is in the preview code; I'm stuck on how to update it to stop Xcode from crashing. When I run the simulator the app works perfectly, no bugs or issues. Any advice on how to update the preview code would be very helpful :)
import SwiftUI
import RealityKit
import RealityKitContent

struct BrandImage: View {
    @State private var currentIndex: Int = 0
    @Environment(\.openWindow) private var openWindow
    @EnvironmentObject var sharedAppState: SharedAppState

    var brand: [BrandEcommData]
    var initialBrand: BrandEcommData

    init(brand: [BrandEcommData], initialBrand: BrandEcommData) {
        self.brand = brand
        self.initialBrand = initialBrand
        if let initialIndex = brand.firstIndex(where: { $0.id == initialBrand.id }) {
            _currentIndex = State(initialValue: initialIndex)
        }
    }

    var body: some View {
        HStack(spacing: 0) {
            ZStack {
                ForEach(0..<brand.count, id: \.self) { index in
                    if index == currentIndex {
                        VStack {
                            Text(brand[index].brand)
                                .padding(.top, 5)
                            brand[index].image
                                .resizable()
                                .scaledToFit()
                        }
                        .transition(.scale)
                    }
                }
                HStack {
                    Button(action: {
                        withAnimation {
                            self.currentIndex = (self.currentIndex - 1 + brand.count) % brand.count
                            sharedAppState.currentModelId = brand[currentIndex].id
                        }
                    }) {
                        Image(systemName: "arrow.left.circle.fill")
                            .font(.largeTitle)
                            .foregroundStyle(.linearGradient(
                                colors: [.black, .gray],
                                startPoint: .top,
                                endPoint: .bottom))
                    }
                    .padding(.leading, 20)
                    Spacer()
                    Button(action: {
                        withAnimation {
                            self.currentIndex = (self.currentIndex + 1) % brand.count
                            sharedAppState.currentModelId = brand[currentIndex].id
                        }
                    }) {
                        Image(systemName: "arrow.right.circle.fill")
                            .font(.largeTitle)
                            .foregroundStyle(.linearGradient(
                                colors: [.black, .gray],
                                startPoint: .top,
                                endPoint: .bottom))
                    }
                    .padding(.trailing, 20)
                }
                VStack {
                    HStack {
                        Spacer()
                        Button(action: {
                            openWindow(id: "volumetric")
                        }) {
                            Image(systemName: "cube.transparent")
                                .font(.title)
                                .padding()
                                .foregroundStyle(.linearGradient(
                                    colors: [.black, .gray],
                                    startPoint: .top,
                                    endPoint: .bottom))
                        }
                    }
                    Spacer()
                }
            }
            .frame(maxWidth: .infinity)

            Rectangle()
                .frame(width: 2)
                .foregroundStyle(.linearGradient(
                    colors: [.black, .gray],
                    startPoint: .top,
                    endPoint: .bottom))

            VStack {
                Text(brand[currentIndex].name)
                    .font(.title2)
                Text(brand[currentIndex].itemDetail)
                    .font(.subheadline)
                Text(brand[currentIndex].itemDescription)
                    .padding()
                Text(brand[currentIndex].price)
            }
        }
        .onAppear {
            sharedAppState.currentModelId = initialBrand.id
        }
    }
}

#Preview {
    if let initialBrand = ecommdata?.first {
        BrandImage(brand: ecommdata!, initialBrand: initialBrand)
    } else {
        Text("Unable to load 3D Asset")
    }
}
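For the preview crash specifically, two things stand out: the force unwrap of `ecommdata`, and the fact that `BrandImage` reads an `@EnvironmentObject` that the preview never injects, which crashes at render time. A safer preview might look like this (a sketch; it assumes `SharedAppState` has an argument-free initializer):

```swift
#Preview {
    // Unwrap without force, and inject the environment object the
    // view's @EnvironmentObject property expects.
    if let ecommdata, let initialBrand = ecommdata.first {
        BrandImage(brand: ecommdata, initialBrand: initialBrand)
            .environmentObject(SharedAppState())
    } else {
        Text("Unable to load 3D Asset")
    }
}
```

The simulator likely works because the app's scene injects `SharedAppState` at launch, something the preview has to do for itself.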
Anyone else having issues where you can see your 3D models in the Xcode preview, but when you run the simulator all you see is the ProgressView spinner? Below is my Model3D code, which loads fine in the preview but only shows the spinner in the simulator. Any suggestions?
import SwiftUI
import RealityKit

struct Three_D_Ad: View {
    var body: some View {
        VStack {
            Image("Fender")
            Text("Fender Stratocaster")
                .font(.title)
            Text("Legendary Sound")
            Model3D(named: "Fender") { model in
                model
                    .resizable()
                    .scaledToFit()
                    .scaleEffect(0.75)
            } placeholder: {
                ProgressView()
            }
        }
    }
}

#Preview {
    Three_D_Ad()
}
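One thing worth checking when the preview works but the simulator shows only the spinner: if "Fender" lives in the RealityKitContent package rather than the app's main bundle, Model3D needs the package bundle passed explicitly. A sketch, assuming the standard RealityKitContent setup from the visionOS template:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct Three_D_Ad_Bundled: View {
    var body: some View {
        // realityKitContentBundle points Model3D at the Swift package
        // holding the USD assets instead of the main app bundle.
        Model3D(named: "Fender", bundle: realityKitContentBundle) { model in
            model
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()
        }
    }
}
```

Without the bundle argument, Model3D searches only the main bundle, which can succeed in previews yet fail in the simulator.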
Anyone else having issues in Xcode preview where when you tap on a button to open a volume window nothing appears in the Xcode preview window?
For example, if you download the Hello World Xcode project and try to open the globe volume after tapping the toggle button, the globe volume window never opens.
Running the latest version of Xcode 15.3 beta.
I'm trying to use Model3D similarly to how I've used Image in the data structure below for my visionOS app, and I keep getting the following error message.
Reference to generic type 'Model3D' requires arguments in <...>
Here is my data structure code.
import Foundation
import SwiftUI
import RealityKit

struct BrandEcommData: Hashable, Codable, Identifiable {
    var id: Int
    var brand: String
    var name: String
    var category: String
    var itemDetail: String
    var price: String
    var itemDescription: String
    var imageName: String
    var ThreeDitem: String

    var image: Image {
        Image(imageName)
    }

    var volume: Model3D {
        Model3D(ThreeDitem)
    }
}
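The error happens because Model3D, unlike Image, is generic over its content and placeholder views, so it can't be written as a bare property type. One workaround is to keep only the asset name stored and build the view where it's used, letting `some View` hide the generic arguments. A sketch:

```swift
import SwiftUI
import RealityKit

// Returning `some View` lets the compiler infer Model3D's generic
// arguments from the closure-based initializer, so no explicit
// <...> arguments are needed.
extension BrandEcommData {
    var volumeView: some View {
        Model3D(named: ThreeDitem) { model in
            model
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()
        }
    }
}
```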
I've set up a NavigationLink so users can tap on the image and get taken to another view, in this case FenderXperience. I can't for the life of me figure out how to make this oval bar behind the image disappear. Is this an Xcode preview issue? I'm running Xcode 15.2.
struct BrandImage: View {
    var body: some View {
        VStack(alignment: .leading) {
            NavigationStack {
                Text("Ralph Lauren")
                    .font(.title)
                    .padding(.bottom, -5)
                Image("RL")
                    .resizable()
                    .scaledToFit()
                    .frame(width: 300, height: 200)
                    .overlay(
                        RoundedRectangle(cornerRadius: 0)
                            .stroke(Color.white, lineWidth: 2))
                Text("Fender")
                    .font(.title)
                    .padding(.bottom, -5)
                NavigationLink(destination: FenderXperience()) {
                    Image("Fender II")
                        .resizable()
                        .scaledToFit()
                        .frame(width: 300, height: 95)
                        .overlay(
                            RoundedRectangle(cornerRadius: 0)
                                .stroke(Color.white, lineWidth: 2))
                }
                Text("Burton")
                    .font(.title)
                    .padding(.bottom, -5)
                Image("Burton")
                    .resizable()
                    .scaledToFit()
                    .frame(width: 300, height: 200)
                    .overlay(
                        RoundedRectangle(cornerRadius: 0)
                            .stroke(Color.white, lineWidth: 2))
                Text("Ray Ban")
                    .font(.title)
                    .padding(.bottom, -5)
                Image("Ray Ban")
                    .resizable()
                    .scaledToFit()
                    .frame(width: 300, height: 200)
                    .overlay(
                        RoundedRectangle(cornerRadius: 0)
                            .stroke(Color.white, lineWidth: 2))
                Text("Levi's")
                    .font(.title)
                    .padding(.bottom, -5)
                Image("Levis II")
                    .resizable()
                    .scaledToFit()
                    .frame(width: 300, height: 200)
                    .overlay(
                        RoundedRectangle(cornerRadius: 0)
                            .stroke(Color.white, lineWidth: 2))
            }
        }
        .padding(.leading, 50)
    }
}

#Preview {
    BrandImage()
}
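That oval is the system button background visionOS draws around a NavigationLink's label; it isn't a preview artifact. It usually disappears with a plain button style. A sketch of just the affected link:

```swift
NavigationLink(destination: FenderXperience()) {
    Image("Fender II")
        .resizable()
        .scaledToFit()
        .frame(width: 300, height: 95)
}
.buttonStyle(.plain) // suppresses the default bordered/oval chrome
```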
What type of media is being used in the full immersion scenes Apple is showing as examples? Are they 360 photos?
Running on iOS 17 Beta 6 and getting the issue below.
Conformance of 'ObjectCaptureSession.CaptureState' to protocol 'Equatable' was already stated in the type's module '_RealityKit_SwiftUI'
Operator function '==' will not be used to satisfy the conformance to 'Equatable'
'ObjectCaptureSession.CaptureState' declares conformance to protocol 'Equatable' here
Please help!