WWDC23 Support


Ask for help with session viewing, lab appointments, and more.

Posts under WWDC23 Support tag

11 Posts
Post not yet marked as solved
0 Replies
285 Views
The "Create a great spatial playback experience" video from WWDC23 shows examples of a 3D stream of a Prehistoric Planet clip within the Destination Video sample app. In the documentation for Destination Video, it shows a screenshot with that clip in it and references playback of 3D video as a function of the sample app. However, the Destination Video app does not include the 3D video clip and the clip of it referenced here, https://developer.apple.com/streaming/examples/, does not work. I am hoping to use that 3D sample clip to test playback of 3D content. Is that something that can be fixed soon and included in the Destination Video app? Thanks!
Posted by mparrott. Last updated.
Post not yet marked as solved
2 Replies
486 Views
Hey, I am a winner of the WWDC20 Swift Student Challenge. I was recently checking my account hoping to find the Events tab with my winner badge, but the whole section is gone. I was wondering if anyone could help me find the updated or equivalent version of the Events section on this website :)
Posted. Last updated.
Post not yet marked as solved
8 Replies
921 Views
I'm hoping someone might point me in the right direction to fix an issue I'm having with a Map view. I'm using the latest Xcode 15, targeting iOS 17. I have a map view where the user can tap on the map to create a new annotation marker. However, after adding the onTapGesture handler, my MapUserLocationButton isn't triggered; instead it drops a pin behind where the button is. The compass button still works, which is odd. Here's the code for the view (with a few bits removed to keep it concise):

struct LocationsMapView: View {
    @Environment(\.modelContext) private var context
    @Query private var locations: [SavedLocation]
    @Binding var path: NavigationPath

    @State private var position: MapCameraPosition = .automatic
    @State private var selectedLocation: SavedLocation?
    @State var placedCreateLocationPin = false
    @State var createLocationCoord: CLLocationCoordinate2D? = nil

    var body: some View {
        GeometryReader { proxy in
            MapReader { reader in
                Map(position: $position, selection: $selectedLocation) {
                    if let pl = createLocationCoord {
                        if placedCreateLocationPin {
                            Annotation("New Location", coordinate: pl) {
                                Image("PinIcon")
                                    .resizable()
                                    .aspectRatio(contentMode: .fit)
                                    .frame(width: 44, height: 44)
                            }
                        }
                    }
                }
                .mapStyle(.hybrid)
                .safeAreaInset(edge: .bottom) {
                    if placedCreateLocationPin {
                        Button {
                            if let pl = createLocationCoord {
                                let vm = CreateLocationViewModel(location: pl)
                                path.append(vm)
                                placedCreateLocationPin = false
                            }
                        } label: {
                            Text("Create Location")
                        }
                    }
                }
                .mapControls {
                    MapCompass()
                    MapUserLocationButton()
                }
                .onTapGesture(perform: { screenCoord in
                    placedCreateLocationPin.toggle()
                    let pl = reader.convert(screenCoord, from: .local)
                    print("tapped location: \(pl?.latitude ?? 0.0) \(pl?.longitude ?? 0.0)")
                    createLocationCoord = pl
                })
            }
        }
    }
}
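One workaround that may help, sketched below, is to take MapUserLocationButton out of .mapControls and place it as an overlay tied to the map through a map scope, so the button sits above the map and receives taps before the map's onTapGesture. This is only a sketch under that assumption, not a confirmed fix; the SavedLocation and view-model types from the post are omitted.

import SwiftUI
import MapKit

// Sketch: the user-location button is placed as an overlay bound to the map via a
// map scope instead of .mapControls, so taps on it never reach the map's tap gesture.
struct ScopedMapView: View {
    @Namespace private var mapScope
    @State private var position: MapCameraPosition = .automatic
    @State private var tappedCoordinate: CLLocationCoordinate2D?

    var body: some View {
        MapReader { reader in
            Map(position: $position, scope: mapScope)
                .mapStyle(.hybrid)
                .onTapGesture { screenCoord in
                    // Convert the screen point to a map coordinate as in the original view.
                    tappedCoordinate = reader.convert(screenCoord, from: .local)
                }
                .overlay(alignment: .bottomTrailing) {
                    // Drawn above the map, so it is hit-tested before the tap gesture.
                    MapUserLocationButton(scope: mapScope)
                        .padding()
                }
        }
        .mapScope(mapScope)
    }
}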
Posted. Last updated.
Post marked as solved
2 Replies
549 Views
Hi, I'm trying to load some sample 3D models into my app, and I cannot figure out how to make lighting work. The model looks properly lit in Reality Composer Pro and in the preview, but looks extremely dark when loaded. Any ideas on how to fix it? The model is the Earth one you can get in Reality Composer Pro by clicking Add and searching for Earth. No other modifications have been made. Thanks!

var body: some View {
    Model3D(named: "EarthTest", bundle: realityKitContentBundle) { model in
        model.resizable()
            .scaledToFit()
            .rotation3DEffect(Rotation3D(angle: .degrees(isRotating), axis: .y))
            .onAppear {
                withAnimation(.linear(duration: 30).repeatForever(autoreverses: false)) {
                    isRotating += 359
                }
            }
    } placeholder: {
        ProgressView()
    }
}
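One approach that may brighten the model, sketched below, is to load the entity in a RealityView and attach an image-based light. This is only a sketch under assumptions: it presumes an environment resource named "ImageBasedLight" has been added to the RealityKitContent bundle, and the resource name and load calls are placeholders to adapt, not a confirmed fix for Model3D.

import SwiftUI
import RealityKit
import RealityKitContent

// Sketch: lighting a loaded model with an image-based light in RealityView.
// "EarthTest" is the model from the post; "ImageBasedLight" is an assumed
// environment resource name that would need to exist in the content bundle.
struct LitEarthView: View {
    var body: some View {
        RealityView { content in
            guard let earth = try? await Entity(named: "EarthTest", in: realityKitContentBundle),
                  let environment = try? await EnvironmentResource(named: "ImageBasedLight")
            else { return }

            // Attach the image-based light and make the entity receive it.
            earth.components.set(ImageBasedLightComponent(source: .single(environment)))
            earth.components.set(ImageBasedLightReceiverComponent(imageBasedLight: earth))

            content.add(earth)
        }
    }
}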
Posted by JuanSE0. Last updated.
Post marked as Apple Recommended
631 Views
I am having trouble getting the new mirroring session API to send data to the companion device. When starting a workout on my watch, I call startMirroringToCompanionDevice and then handle it in my iOS workout manager via the workoutSessionMirroringStartHandler. I set the delegate there and can confirm that it is not nil within that function, but when I try to send some data across I get an error saying the remote session delegate was never set up. I noticed the same behaviour in the WWDC demo and have been unable to find a solution that lets me send data across the mirrored session, even though I am able to control the workout session state (pause, resume, end) on both iPhone and Apple Watch. Has anyone else encountered this issue? Does anyone have a solution?
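For reference, a minimal sketch of the pattern described in the post is below. It assumes the watchOS 10 / iOS 17 HealthKit mirroring APIs and omits authorization, session setup, and error handling; the payload contents and class names are placeholders.

import HealthKit

// Sketch of the mirrored-session data path described in the post.
// Watch side: start mirroring, then send data to the iPhone.
final class WatchWorkoutManager {
    func sendStats(from session: HKWorkoutSession) async {
        do {
            try await session.startMirroringToCompanionDevice()
            let payload = Data("elapsed:42".utf8) // placeholder payload
            try await session.sendToRemoteWorkoutSession(data: payload)
        } catch {
            print("Mirroring send failed: \(error)")
        }
    }
}

// iPhone side: adopt the mirrored session and receive the data.
final class PhoneWorkoutManager: NSObject, HKWorkoutSessionDelegate {
    let healthStore = HKHealthStore()

    func startObserving() {
        healthStore.workoutSessionMirroringStartHandler = { [weak self] mirroredSession in
            guard let self else { return }
            mirroredSession.delegate = self // must be set before data arrives
        }
    }

    func workoutSession(_ workoutSession: HKWorkoutSession,
                        didReceiveDataFromRemoteWorkoutSession data: [Data]) {
        for item in data {
            print("Received: \(String(decoding: item, as: UTF8.self))")
        }
    }

    // Required HKWorkoutSessionDelegate methods.
    func workoutSession(_ workoutSession: HKWorkoutSession,
                        didChangeTo toState: HKWorkoutSessionState,
                        from fromState: HKWorkoutSessionState,
                        date: Date) {}

    func workoutSession(_ workoutSession: HKWorkoutSession,
                        didFailWithError error: Error) {}
}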
Posted by sebrob. Last updated.
Post not yet marked as solved
0 Replies
800 Views
@Apple: Would it be possible to get the source of the sample app shown in the WWDC 2023 session "Explore pie charts and interactivity in Swift Charts" (https://developer.apple.com/wwdc23/10037)? In particular, I'm interested in learning how to get the dates of the bar marks currently visible in a scrolling chart. You can see that in the video at 9:12 on the iPhone screen showing the demo app.
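Until the sample is published, one way to approximate this, sketched below, is to bind the chart's scroll position to a date and derive the visible range from that date plus the visible domain length. The SalesEntry data type and the 30-day window are made up for illustration.

import SwiftUI
import Charts

// Hypothetical data type, for illustration only.
struct SalesEntry: Identifiable {
    let id = UUID()
    let day: Date
    let count: Int
}

// Sketch: tracking which dates are currently visible in a scrollable bar chart
// by combining chartScrollPosition(x:) with chartXVisibleDomain(length:).
struct ScrollableSalesChart: View {
    let entries: [SalesEntry]

    private let visibleLength: TimeInterval = 30 * 24 * 3600 // 30 days
    @State private var scrollPosition = Date.now.addingTimeInterval(-30 * 24 * 3600)

    private var visibleRange: ClosedRange<Date> {
        scrollPosition...scrollPosition.addingTimeInterval(visibleLength)
    }

    var body: some View {
        VStack(alignment: .leading) {
            Text("Visible: \(visibleRange.lowerBound.formatted(date: .abbreviated, time: .omitted)) to \(visibleRange.upperBound.formatted(date: .abbreviated, time: .omitted))")
            Chart(entries) { entry in
                BarMark(
                    x: .value("Day", entry.day, unit: .day),
                    y: .value("Sales", entry.count)
                )
            }
            .chartScrollableAxes(.horizontal)
            .chartXVisibleDomain(length: visibleLength)
            .chartScrollPosition(x: $scrollPosition)
        }
    }
}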
Posted by Besti. Last updated.
Post not yet marked as solved
1 Reply
1.3k Views
Xcode Server says goodbye in the Xcode 15 beta.

Hello Apple: As iOS developers and users of Xcode Server, we hope that Apple can retain the functionality of Xcode Server, or provide an API for it and open source the code, so that Xcode Server can continue to operate and serve iOS developers. Additionally, Apple's recommendation of Xcode Cloud has raised some concerns among us.

We are very grateful to Apple for creating this excellent tool that helps us automate building, testing, and deploying iOS applications, improving our development efficiency and quality. However, we are disappointed by the removal of Xcode Server from the latest Xcode 15 version. At the same time, we also have some doubts about the Xcode Cloud replacement Apple proposes.

Firstly, we hope Apple can recognize that Xcode Server is an essential tool for iOS developers. Many development teams rely on Xcode Server for continuous integration and delivery, enabling them to quickly build, test, and deploy new versions of their applications. If Xcode Server is removed, developers will need to spend more time and effort on these tasks, greatly reducing development efficiency and quality.

Secondly, although Apple has proposed Xcode Cloud, we believe it has some problems. Xcode Cloud is cloud-hosted, which brings limitations for private networks and data security, and it lacks the REST API that Xcode Server provides, making it difficult to integrate with other systems. Also, Xcode Cloud requires payment, which may be a considerable expense for individual developers and small teams.

Therefore, we hope Apple will reconsider and retain the functionality of Xcode Server, or provide an API for it and open source the code, so that Xcode Server can continue to operate and serve iOS developers. At the same time, we hope Apple will continue to improve Xcode Cloud and address these problems to provide better cloud services for iOS developers. Thanks
Posted by rbbtsn0w. Last updated.
Post not yet marked as solved
0 Replies
1.3k Views
While using Vision Pro, there could be a situation where the battery pack is yanked from the device by a door handle or just a wave of the arm. What happens in those situations: does the Vision Pro turn off instantly, or is there some sort of backup? I am curious because, when building productivity apps like Word, I want to know whether data would be lost. If there is a backup, is there a notification that developers can use to react and save the current state and other data?
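Independent of how the hardware behaves, a common defensive pattern, sketched below, is to save state whenever the SwiftUI scene phase leaves .active, so an abrupt shutdown loses as little as possible. The saveDocument() call is a placeholder for the app's own persistence, not a guarantee against sudden power loss.

import SwiftUI

// Sketch: persisting work whenever the scene stops being active, so an
// unexpected power loss costs at most the changes since the last phase change.
@main
struct ProductivityApp: App {
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .onChange(of: scenePhase) { _, newPhase in
            if newPhase == .inactive || newPhase == .background {
                saveDocument() // placeholder for the app's own save logic
            }
        }
    }

    private func saveDocument() {
        // Write unsaved changes to disk here.
    }
}

struct ContentView: View {
    var body: some View { Text("Document") }
}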
Posted by TeddyWon. Last updated.