We are working on enabling the Enterprise API for our developer account, but that option is not showing. We are referring to the link below from Apple:
https://developer.apple.com/help/account/get-started/apple-developer-enterprise-program-api/
We cannot find "Apple Developer Enterprise Program API configuration" inside the Integrations tab of our developer account.
I have attached a screenshot of our developer account. Please guide us!
I don't get a cameraFrame from cameraFrameUpdates in my Vision Pro app. Why is nothing arriving, and where am I going wrong in the code? Please guide me.
for await cameraFrame in cameraFrameUpdates { print("cameraFrame:: \(cameraFrame)") }
var body: some View {
VStack {
// Show the processed frame once it's available; otherwise fall back to the placeholder image.
if let finalImage {
finalImage
.resizable()
.scaledToFit()
} else {
image
.resizable()
.scaledToFit()
}
}
.task {
if #available(visionOS 2.0, *) {
guard CameraFrameProvider.isSupported else {
print("CameraFrameProvider not supported.")
return
}
let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions: [.left])
let cameraFrameProvider = CameraFrameProvider()
// Main camera access must be authorized, or cameraFrameUpdates yields nothing.
await arkitSession.queryAuthorization(for: [.cameraAccess])
do {
try await arkitSession.run([cameraFrameProvider])
} catch let sessionError as ARKitSession.Error {
print("ARKitSession.run() failed with session error: \(sessionError)")
return
} catch {
print("ARKitSession.run() returned a non-session error: \(error)")
return
}
guard let cameraFrameUpdates = cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
print("Failed to get an async sequence for the first format.")
return
}
print("cameraFrameUpdates:: \(cameraFrameUpdates)")
for await cameraFrame in cameraFrameUpdates {
print("cameraFrame:: \(cameraFrame)")
print("Camera Frame ::: LEFT :: \(cameraFrame.sample(for: .left))")
guard let leftSample = cameraFrame.sample(for: .left) else {
print("CameraFrameProviderSample - Nil camera frame left sample")
print("CameraFrameProviderSample - Nil camera frame left sample")
continue
}
self.pixelBuffer = leftSample.pixelBuffer
print(" ======== PIXEL BUFFER ::: \(self.pixelBuffer) ========")
self.finalImage = self.setImage()
}
} else {
// Fallback on earlier versions
}
}
}
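Note that, as far as I know, main camera frames require the enterprise main-camera-access entitlement and are only delivered on a physical device, not in the Simulator, so the for-await loop can legitimately produce nothing without them. Separately, the setImage() call above isn't shown anywhere; below is a minimal sketch of one way such a helper could turn the stored CVPixelBuffer into a SwiftUI Image via Core Image, assuming pixelBuffer and finalImage are @State properties on this view, as the code implies.

// Inside the same view (imports assumed: SwiftUI, CoreImage, UIKit).
// Hypothetical helper matching the self.setImage() call above. A shared
// CIContext avoids the cost of creating a new context for every frame.
private let ciContext = CIContext()

func setImage() -> Image? {
    guard let pixelBuffer else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return Image(uiImage: UIImage(cgImage: cgImage))
}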
I want to show the Vision Pro camera view in my application window. I wrote some code based on Apple's sample, but I am stuck on CVPixelBuffer. How do I convert the pixel buffer into a video frame?
Button("Camera Feed") {
Task{
if #available(visionOS 2.0, *) {
let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions:[.left])
let cameraFrameProvider = CameraFrameProvider()
let arKitSession = ARKitSession()
// Query camera-access authorization before running the session.
await arKitSession.queryAuthorization(for: [.cameraAccess])
do {
try await arKitSession.run([cameraFrameProvider])
} catch {
return
}
guard let cameraFrameUpdates =
cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
return
}
for await cameraFrame in cameraFrameUpdates {
guard let mainCameraSample = cameraFrame.sample(for: .left) else {
continue
}
//====
print("=========================")
print(mainCameraSample.pixelBuffer)
print("=========================")
// self.pixelBuffer = mainCameraSample.pixelBuffer
}
} else {
// Fallback on earlier versions
}
}
}
I want to convert "mainCameraSample.pixelBuffer" in to video. Could you please guide me!!
I want to open a popup view controller or sheet to invite users to SharePlay, but I get an error when I try to do so.
struct ContentView: View {
@State private var showDialog = false
private let activityManager = GroupActivityManager()
var body: some View {
VStack {
Button(action: {
Task {
let outcome = await activityManager.startSharing()
if outcome == .needsDialog {
showDialog = true
}
}
}, label: {
Label(title: {
Text("Start SharePlay")
}, icon: {
Image(systemName: "shareplay")
})
})
.buttonStyle(.borderedProminent)
}
.sheet(isPresented: $showDialog, content: {
GroupActivityShareSheet {
DemoAppActivity()
}
})
.padding()
}
}
I am using the code below to present the sheet:
struct GroupActivityShareSheet<Activity: GroupActivity>: UIViewControllerRepresentable {
let preparationHandler: () async throws -> Activity
func makeUIViewController(context: Context) -> UIViewController {
GroupActivitySharingController(preparationHandler: preparationHandler)
}
func updateUIViewController(_ uiViewController: UIViewControllerType, context: Context) {}
}
I got the error below while opening the popup:
Cannot run query <_EXQuery: 0x6000021a2620>
Failed to lookup mobile extension with query <_EXQuery: 0x6000021a2620> on <_GroupActivities_UIKit.PeoplePickerController: 0x600002c62480>
Failed to fetch config for hostViewController <_GroupActivities_UIKit.PeoplePickerController: 0x600002c62480>
Failed to build remote hostViewController for <_GroupActivities_UIKit.GroupActivitySharingController: 0x105f6f530>
Failed to fetch extensionViewController
Calling -viewDidAppear: directly on a view controller is not supported, and may result in out-of-order callbacks and other inconsistent behavior. Use the -beginAppearanceTransition:animated: and -endAppearanceTransition APIs on UIViewController to manually drive appearance callbacks instead. Make a symbolic breakpoint at UIViewControllerAlertForAppearanceCallbackMisuse to catch this in the debugger. View controller: <_GroupActivities_UIKit.GroupActivitySharingController: 0x105f6f53
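For what it's worth, the "Failed to lookup mobile extension" lines suggest GroupActivitySharingController could not load its remote people-picker UI; this failure is commonly reported in the Simulator, where the SharePlay sharing extension isn't available, so it is worth retrying on a device signed in to FaceTime. As an alternative to presenting the controller yourself, here is a hedged sketch that lets the system drive activation (DemoAppActivity is the activity type from the post above):

import GroupActivities

func startSharingViaSystem() async {
    let activity = DemoAppActivity()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        // The system believes sharing can start; activate the activity.
        _ = try? await activity.activate()
    case .activationDisabled:
        // No FaceTime call or SharePlay context is available.
        break
    case .cancelled:
        break
    @unknown default:
        break
    }
}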
Hello,
I want to capture video from the Vision Pro camera in a visionOS app. I am referring to this Apple video (https://developer.apple.com/videos/play/wwdc2024/10139/) and its code. My steps are as follows:
import ARKit
Add the com.apple.developer.arkit.main-camera-access.allow = true entitlement (this key belongs in the app's .entitlements file, not Info.plist)
Run the code below:
func loadCameraFeed() async {
// Main Camera Feed Access Example
let formats = CameraVideoFormat.supportedVideoFormats(for: .main, cameraPositions:[.left])
let cameraFrameProvider = CameraFrameProvider()
let arKitSession = ARKitSession()
var pixelBuffer: CVPixelBuffer?
// Check camera-access authorization before running the provider.
await arKitSession.queryAuthorization(for: [.cameraAccess])
do {
try await arKitSession.run([cameraFrameProvider])
} catch {
return
}
guard let cameraFrameUpdates =
cameraFrameProvider.cameraFrameUpdates(for: formats[0]) else {
return
}
print(cameraFrameUpdates)
for await cameraFrame in cameraFrameUpdates {
print(cameraFrame)
guard let mainCameraSample = cameraFrame.sample(for: .left) else {
continue
}
pixelBuffer = mainCameraSample.pixelBuffer
}
}
I want to convert "pixelBuffer" into video streaming and show it in a frame like iOS.
Please guide me on how to achieve my next step. I am blank after this code.
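One way to show the live feed "like iOS" is to hand each pixel buffer to an AVSampleBufferDisplayLayer. The sketch below is my own, not from the WWDC sample; it assumes AVSampleBufferDisplayLayer is available on your target, and wrapping PixelBufferView for SwiftUI (e.g. via UIViewRepresentable) is left out.

import AVFoundation
import UIKit

// A UIView backed by AVSampleBufferDisplayLayer; call enqueue(_:) with each
// CVPixelBuffer produced by the camera loop above.
final class PixelBufferView: UIView {
    override class var layerClass: AnyClass { AVSampleBufferDisplayLayer.self }
    private var displayLayer: AVSampleBufferDisplayLayer { layer as! AVSampleBufferDisplayLayer }

    func enqueue(_ pixelBuffer: CVPixelBuffer) {
        // Wrap the pixel buffer in a CMSampleBuffer, which the layer requires.
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescriptionOut: &formatDescription)
        guard let formatDescription else { return }
        var timing = CMSampleTimingInfo(duration: .invalid,
                                        presentationTimeStamp: .invalid,
                                        decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescription: formatDescription,
                                                 sampleTiming: &timing,
                                                 sampleBufferOut: &sampleBuffer)
        // With no source timestamps, mark the frame for immediate display.
        guard let sampleBuffer,
              let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: true),
              CFArrayGetCount(attachments) > 0 else { return }
        let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
        CFDictionarySetValue(dict,
                             Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                             Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
        displayLayer.enqueue(sampleBuffer)
    }
}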
I am trying to create a demo of a spatial meeting that uses Personas, and have referred to the Apple videos, but I am not getting a clear idea of how it works.
Could anyone guide me through the process step by step? Any code for learning would be appreciated.
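For the spatial-Persona side specifically, the WWDC23 session "Build spatial SharePlay experiences" is the main reference; the core step is configuring the session's SystemCoordinator before joining. A minimal sketch, where MyMeetingActivity stands in for your own GroupActivity type:

import GroupActivities

// Configure spatial participation for each incoming group session, then join it.
func observeSessions() async {
    for await session in MyMeetingActivity.sessions() {
        if let coordinator = await session.systemCoordinator {
            var config = SystemCoordinator.Configuration()
            config.supportsGroupImmersiveSpace = true        // allow a shared immersive space
            config.spatialTemplatePreference = .sideBySide   // seat Personas side by side
            coordinator.configuration = config
        }
        session.join()
    }
}

The activity itself is started elsewhere (for example with activity.activate() during a FaceTime call); spatial Personas then follow the template the coordinator was configured with.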
I am using the code below to get a thumbnail from a USDZ model with QuickLookThumbnailing, but I don't get the proper output.
guard let url = Bundle.main.url(forResource: resource, withExtension: withExtension) else{
print("Unable to create url for resource.")
return
}
let request = QLThumbnailGenerator.Request(fileAt: url, size: size, scale: 10.0, representationTypes: .all)
let generator = QLThumbnailGenerator.shared
generator.generateRepresentations(for: request) { thumbnail, type, error in
DispatchQueue.main.async {
if let thumbnail {
// With .all, this closure runs multiple times: the generic icon usually
// arrives first, then the real thumbnail (if generation succeeds).
self.thumbnailImage = Image(uiImage: thumbnail.uiImage)
} else {
print("Thumbnail generation failed: \(String(describing: error))")
}
}
}
}
Below is a screenshot of the selected model:
Below is the thumbnail image; it does not show the guitar, only the generic USDZ icon.
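One thing to check: with representationTypes: .all the completion handler fires several times with increasingly good representations, and the first delivery is typically the generic file icon, so if generation of the real thumbnail then fails, the icon is all you ever see. A sketch that requests only the final thumbnail instead, reusing url, size, and thumbnailImage from the code above:

let request = QLThumbnailGenerator.Request(fileAt: url,
                                           size: size,
                                           scale: 2.0, // the display scale; 10.0 requests a very large bitmap
                                           representationTypes: .thumbnail)
QLThumbnailGenerator.shared.generateBestRepresentation(for: request) { representation, error in
    DispatchQueue.main.async {
        if let representation {
            self.thumbnailImage = Image(uiImage: representation.uiImage)
        } else {
            print("Thumbnail generation failed: \(String(describing: error))")
        }
    }
}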
I want to get a thumbnail image from a USDZ model on visionOS, but the image comes back without materials applied. Here is my code:
import Foundation
import SceneKit
import SceneKit.ModelIO
class ARQLThumbnailGenerator {
private let device = MTLCreateSystemDefaultDevice()!
/// Create a thumbnail image of the asset with the specified URL at the specified
/// animation time. Supports loading of .scn, .usd, .usdz, .obj, and .abc files,
/// and other formats supported by ModelIO.
/// - Parameters:
/// - url: The file URL of the asset.
/// - size: The size (in points) at which to render the asset.
/// - time: The animation time to which the asset should be advanced before snapshotting.
func thumbnail(for url: URL, size: CGSize, time: TimeInterval = 0) -> UIImage? {
let renderer = SCNRenderer(device: device, options: [:])
renderer.autoenablesDefaultLighting = true
if (url.pathExtension == "scn") {
let scene = try? SCNScene(url: url, options: nil)
renderer.scene = scene
} else {
let asset = MDLAsset(url: url)
// Textures are not loaded automatically; without this call the converted
// scene renders with no materials applied.
asset.loadTextures()
let scene = SCNScene(mdlAsset: asset)
renderer.scene = scene
}
let image = renderer.snapshot(atTime: time, with: size, antialiasingMode: .multisampling4X)
self.saveImageFileInDocumentDirectory(imageData: image.pngData()!)
return image
}
func saveImageFileInDocumentDirectory(imageData: Data) {
let fileName = UUID().uuidString + "image.png"
let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
let fileURL = documentsDirectory.appendingPathComponent(fileName)
try? imageData.write(to: fileURL, options: [])
}
}
Hello, I am loading a model from the bundle, and it loads successfully. Now I am scaling the model using the GestureExtension from Apple's demo code (https://developer.apple.com/documentation/realitykit/transforming-realitykit-entities-with-gestures?changes=_8).
@State private var selectedEntityName : String = ""
@State private var modelEntity: ModelEntity?
var body: some View {
contentView
.task {
do {
modelEntity = try await ModelEntity.loadArcadeMachine()
} catch {
fatalError(error.localizedDescription)
}
}
}
@ViewBuilder
private var contentView: some View {
if let modelEntity {
RealityView { content, attachments in
modelEntity.position = SIMD3<Float>(x: 0, y: -0.3, z: -5)
print(modelEntity.transform.scale)
modelEntity.transform.scale = [0.006, 0.006, 0.006]
content.add(modelEntity)
if let percentTextAttachment = attachments.entity(for: "percentage") {
percentTextAttachment.position = [0, 50, 0]
modelEntity.addChild(percentTextAttachment)
}
} update: { content, attachments in
// I want to read the updated scale value here and show it in the RealityView attachment text.
} attachments: {
Attachment(id: "percentage") {
Text("\(modelEntity.name) \(Int(modelEntity.scale.x * 100)) %")
.font(.system(size: 5000))
.background(.red)
}
}
// This modifier adds the gesture support
.installGestures()
} else {
ProgressView()
}
}
}
Below is the relevant code from the GestureExtension:
let state = EntityGestureState.shared
guard canScale, !state.isDragging else { return }
let entity = value.entity
if !state.isScaling {
state.isScaling = true
state.startScale = entity.scale
}
let magnification = Float(value.magnification)
entity.scale = state.startScale * magnification
state.magnifyValue = magnification
magnifyScale = Double(magnification)
print("Entity Name ::::::: \(entity.name)")
print("Scale ::::::: \(entity.scale)")
print("Magnification ::::::: \(magnification)")
print("StartScale ::::::: \(state.startScale)")
> This "magnification" value I need to use in RealityView class. How can i Do it? Could you please guide it.
}
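Changes to entity.scale don't trigger a SwiftUI update by themselves, so the attachment text never refreshes. One way to bridge the value, sketched under the assumption that you can modify the sample's EntityGestureState class to be @Observable (Swift 5.9 / visionOS 1 and later):

import Observation

// Making the sample's gesture-state singleton observable lets any SwiftUI
// view that reads magnifyValue re-render when the gesture writes it.
@Observable
final class EntityGestureState {
    static let shared = EntityGestureState()
    var magnifyValue: Float = 1.0
    // ... the sample's other tracking properties (isDragging, isScaling,
    // startScale, and so on) remain unchanged ...
}

// In the attachment, read the shared state directly; the Text re-renders
// whenever the magnify gesture updates magnifyValue:
Attachment(id: "percentage") {
    Text("\(modelEntity.name) \(Int(EntityGestureState.shared.magnifyValue * 100)) %")
        .background(.red)
}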
I want to lay out a collection in a curved view with fixed paging on Vision Pro. How can I do this?
I want the photo library to show only spatial videos when it opens from my app. How can I achieve this?
One more thing: if I select a video using the photo library, how can I identify whether the selected video is a spatial video or not?
self.presentPicker(filter: .videos)
/// - Tag: PresentPicker
private func presentPicker(filter: PHPickerFilter?) {
var configuration = PHPickerConfiguration(photoLibrary: .shared())
// Set the filter type according to the user’s selection.
configuration.filter = filter
// Set the mode to avoid transcoding, if possible, if your app supports arbitrary image/video encodings.
configuration.preferredAssetRepresentationMode = .current
// Set the selection behavior to respect the user’s selection order.
configuration.selection = .ordered
// Set the selection limit to enable multiselection.
configuration.selectionLimit = 1
let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
present(picker, animated: true)
}
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
picker.dismiss(animated: true) {
// do something on dismiss
}
guard let provider = results.first?.itemProvider else {return}
provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
if let error {
print(error)
return
}
// receiving the video-local-URL / filepath
guard let url = url else {return}
// create a new filename
let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
// create new URL
let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
print(newUrl)
print("===========")
// copy item to APP Storage
//try? FileManager.default.copyItem(at: url, to: newUrl)
// self.parent.videoURL = newUrl.absoluteString
}
}
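On the second question: once you have the local file URL, you can ask AVFoundation whether the movie contains stereo multiview (MV-HEVC) video, which is how spatial videos are encoded. A hedged sketch, assuming .containsStereoMultiviewVideo is available on your deployment target (iOS 17 / visionOS 1 and later); I am not aware of a documented PHPickerFilter that restricts the picker itself to spatial videos.

import AVFoundation

// Returns true when the movie at `url` carries a stereo multiview (MV-HEVC)
// video track, i.e. when it is a spatial video.
func isSpatialVideo(at url: URL) async -> Bool {
    let asset = AVURLAsset(url: url)
    guard let videoTracks = try? await asset.loadTracks(withMediaType: .video) else {
        return false
    }
    return videoTracks.contains { $0.hasMediaCharacteristic(.containsStereoMultiviewVideo) }
}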