I'm preparing my submission for the Swift Student Challenge, and I have a couple of questions regarding the development environment.
Am I allowed to use Xcode to program my scene, or do I have to use Swift Playgrounds?
Can I use iPadOS 18 for development? I noticed that Swift Playgrounds currently only supports up to iPadOS 17.5, but I would like to use RealityView, which is only available starting from iPadOS 18.
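For context, the kind of view I want to build looks roughly like this (a minimal sketch; SceneView is a placeholder name):

import SwiftUI
import RealityKit

// Minimal sketch of the scene I have in mind; RealityView is iPadOS 18+ only.
struct SceneView: View {
    var body: some View {
        if #available(iOS 18.0, *) {
            RealityView { content in
                // Place a simple sphere entity in the scene.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
                content.add(sphere)
            }
        } else {
            Text("RealityView requires iPadOS 18 or later.")
        }
    }
}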
I appreciate any clarification on this. Thanks in advance!
Hi, I encountered an issue after the latest Swift Playgrounds update.
I'm using an iPad Pro 3rd Gen; it's my first time reporting a bug, so hopefully I'm on the right platform.
When I create a new Swift file or folder, its name automatically reverts to the default upon creation.
I tested this on a few existing projects, and it's the same for all of them.
Initially, when I created a new project to verify, its files and folders could be renamed without reverting.
But after testing again, the same issue seems to be happening there as well.
I've tried restarting my iPad, but the problem persists.
So I thought I'd report it, and from my search this seems to be the right platform for it. Thanks.
My M2 Pro Mac mini running macOS 14.7 just updated to Swift Playgrounds 4.6.1. Now the Learn to Code & Build Apps screen is limited in size and can't be zoomed. There is no way to scroll horizontally, so I can't view the whole screen.
I am preparing a submission for the Swift Student Challenge. I have created a RealityContent folder using Reality Composer Pro. How can I import this folder into my app playground (.swiftpm) project in Swift Playgrounds so that it becomes a usable package?
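To make the question concrete, here is the shape of manifest I would expect to need. This is only a sketch: the names (MyApp, AppModule, RealityContent) are placeholders, it assumes the Reality Composer Pro folder sits next to Package.swift and defines a library product of the same name, and the iOSApplication parameters are from memory of what Swift Playgrounds generates, so they are worth verifying.

// swift-tools-version: 5.8
import PackageDescription
import AppleProductTypes

let package = Package(
    name: "MyApp",
    platforms: [.iOS("17.0")],
    products: [
        .iOSApplication(
            name: "MyApp",
            targets: ["AppModule"],
            displayVersion: "1.0",
            bundleVersion: "1",
            supportedDeviceFamilies: [.pad, .phone],
            supportedInterfaceOrientations: [.portrait]
        )
    ],
    dependencies: [
        // Local path to the Reality Composer Pro package folder.
        .package(path: "RealityContent")
    ],
    targets: [
        .executableTarget(
            name: "AppModule",
            dependencies: [
                .product(name: "RealityContent", package: "RealityContent")
            ]
        )
    ]
)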
I am interested in participating in the Swift Student Challenge. My application contains a significant amount of augmented reality (AR) content and therefore needs access to the camera. Clearly, if a reviewer uses the Simulator or runs the app on a Mac, they will not be able to experience the AR functionality; a real iPad is required for the AR camera experience.
However, https://developer.apple.com/forums/thread/773530 mentions that the plan is to evaluate Xcode app playgrounds within the Simulator, and I also noticed the statement "Note: Xcode app playgrounds are executed in Simulator" on the submission page. It therefore appears that reviewers are limited to using the Simulator or running my application on a Mac.
In light of this, I am seeking guidance on how to enable a reviewer to use a real iPad to experience the AR camera functionality. Otherwise, I may need to reconsider my approach and drop AR altogether.
It was mentioned in the Swift Student Challenge that outstanding winners will have the opportunity to visit Apple Park in the United States. However, as a challenger from China who is not currently in the U.S., if I receive the outstanding award I will need to apply for a visa to travel to Apple Park. Since I am under 18, my guardian would also need to apply for one. Therefore, I would like to know whether Apple provides visa assistance for outstanding winners and their guardians from China, or whether we are responsible for applying for the visas on our own.
I was developing my app in Xcode and saw that the requirements say "your submission must be an app playground (.swiftpm)".
I reckon I can develop in Xcode and then copy those files into the Playgrounds app and make some changes for it to work.
Also, I made my project in landscape mode in Xcode. In Playgrounds, can I lock the display orientation through the Package.swift file, or should I continue making my app in landscape mode and ask players to change their viewing orientation via a popup?
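For what it's worth, the app playground manifest's iOSApplication product does take a supportedInterfaceOrientations parameter, so a landscape-only lock would look roughly like this (an excerpt only; parameter names are from memory and worth checking against the manifest Swift Playgrounds generates):

// Excerpt from a .swiftpm Package.swift; the rest of the manifest is as generated.
products: [
    .iOSApplication(
        name: "MyGame",                    // placeholder name
        targets: ["AppModule"],
        displayVersion: "1.0",
        bundleVersion: "1",
        supportedDeviceFamilies: [.pad, .phone],
        supportedInterfaceOrientations: [
            .landscapeRight,               // landscape only; no .portrait entry
            .landscapeLeft
        ]
    )
]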
I use Swift Playgrounds on iPad to run an intro to programming class with high school students. I've created some custom playground books based on Apple's guides to provide some simple sandboxes to learn basic coding.
Students simply click a button on my website, which downloads my playground books so they can easily create new sandboxes.
With Swift Playgrounds 4.6.1 on both Mac and iPad, clicking the link only opens Swift Playgrounds and does not download my playground books! There used to be an "Add a Subscription URL" button, but it is no longer present on the Learn to Code page or anywhere else in the app that I can find:
Is it intentional that this functionality is removed? Playground Books still appear to work, and documentation for subscription feeds is still available.
I start another session with my students next Monday, so I need to know as soon as possible whether I need to plan to work around this myself.
Thanks,
Mark Schmidt
Just downloaded the latest Swift Playgrounds for Mac (4.6). The Welcome window does not let you scroll horizontally via mouse, trackpad, or any other means, so you can only see the first two Learn to Code resources; the same goes for the App Gallery and Extend Your App sections.
In an app playground Xcode project there is no Targets menu in the UI. When I try to use the model, it says the model is not in scope. When I did this in a regular project, it automatically generated a Swift class and had no errors because it had a target, but I see no place to add a target in an app playground.
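A workaround sketch, assuming the root cause is that app playgrounds never run Core ML's class generation: compile the model in a regular Xcode project, bundle the resulting .mlmodelc as a resource, and load it through MLModel directly (MyModel is a placeholder name):

import CoreML

// Sketch: load a compiled Core ML model (.mlmodelc) bundled as a resource,
// since app playgrounds do not generate a typed model class.
guard let modelURL = Bundle.main.url(forResource: "MyModel", withExtension: "mlmodelc") else {
    fatalError("MyModel.mlmodelc not found in the bundle")
}
let model = try MLModel(contentsOf: modelURL)
// Inputs and outputs then go through MLFeatureProvider instead of a generated class.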
I am trying to run TinyLlama directly in Swift Playgrounds for iOS. I have tried multiple solutions, like libraries (LLM.swift, swift-transformers, ...), which never worked due to import issues, and I also tried importing an exported mlmodel.
For the latter, I followed the article about Llama 3.1 on Core ML. It was hard to understand how to do inference with it, but I was able to export an mlpackage, which I then placed in an Xcode project to generate the mlmodelc (compiled model) and the model class. I had to go with the first version described in the article, without optimizations, as I got errors during model loading with the flexible input shapes. I was able to run the model for one-token generation.
But my biggest problem is that, although the mlmodelc is only 550 MiB, the model loads 24+ GiB of memory, far exceeding what is available on an iOS device.
Is there a way to do LLM inference in Swift Playgrounds at a reasonable speed (even 1 token/s would be sufficient)?
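One thing that might be worth trying, purely as an untested assumption on my part: constraining the compute units when loading the compiled model, since the backend choice can change peak memory:

import CoreML

// Sketch (untested): restrict compute units at load time.
// "TinyLlama" is a placeholder for the bundled .mlmodelc resource name.
guard let modelURL = Bundle.main.url(forResource: "TinyLlama", withExtension: "mlmodelc") else {
    fatalError("compiled model not found in the bundle")
}
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine
let model = try MLModel(contentsOf: modelURL, configuration: config)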
I intend to participate in the Swift Student Challenge 2025. The rules state that an app playground submission should be experienceable within three minutes, but my work does not meet this requirement:
Create an interactive scene in an app playground that can be experienced within three minutes.
Initially, my work was not intended for the Challenge but for the App Store. However, I decided to submit it to the Challenge, and both my work and I meet the Challenge's requirements. The catch is that my work is a complete application, which makes it impossible for the judges to experience all of it within three minutes; it may take more time. Does this have any impact?
I am having issues with playgrounds exported from Xcode. When I try to open my exported playground file, I get the following message: "Couldn't load settings from contents.xcplayground"
Xcode Version: Version 16.2 (16C5032a)
Steps to reproduce
Create new playground in Xcode.
File->Export
Open exported file.
The issue still persists after reinstalling Xcode.
Hello, I have an issue with importing some .mp3 files into a Swift playground project (in Xcode, not in the Playgrounds app). They worked fine in the Xcode project, but for some reason the playground isn't able to find them. I imported them exactly the same way as I did in the Xcode project.
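For reference, this is how I load them (a sketch; "song" is a placeholder file name):

import AVFoundation

// In an Xcode playground, files in the Resources folder surface through Bundle.main.
guard let url = Bundle.main.url(forResource: "song", withExtension: "mp3") else {
    fatalError("song.mp3 not found in the playground's Resources")
}
let player = try AVAudioPlayer(contentsOf: url)
player.play()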
Hey there,
I'm having a quite interesting bug in Swift Playgrounds.
I am trying to run my app with the following code snippet, which does not compile in Swift Playgrounds yet compiles in Xcode (note: this is an app playground):
if #available(iOS 18.0, *) {
    // Simple check to get the indices of other items that fall on the same date as `date`.
    let indices = data!.indices(where: { item in
        let sameMonth = Calendar.current.component(.month, from: item.time) == Calendar.current.component(.month, from: date)
        let sameYear = Calendar.current.component(.year, from: item.time) == Calendar.current.component(.year, from: date)
        let sameDay = Calendar.current.component(.day, from: item.time) == Calendar.current.component(.day, from: date)
        return sameDay && sameMonth && sameYear
    })
}
However, the indices(where:) call seems to stop the app from compiling (only in Swift Playgrounds; it works perfectly fine in Xcode).
I am getting the following error:
Cannot call value of non-function type 'Range<Array<Int>.Index>' (aka 'Range<Int>')
Please let me know if you have any insight regarding this issue.
-ColoredOwl
I just wonder if it's possible to add push notifications to an app made in Swift Playgrounds, or if it always has to be exported to Xcode first.
I keep trying to use the app, but every time I click the "Get Started with Code" module it crashes and produces an error log, attached.
Crash log
Trying to add a Reality Composer Pro project into my Swift playground app. I can't figure out what name to use when importing the package.
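For reference, what I'm attempting looks like this (a sketch; it assumes the default package name from Xcode's Reality Composer Pro template, RealityKitContent, and the bundle constant that template generates, so the names in your project may differ):

import RealityKit
import RealityKitContent

// Sketch: load a scene from the Reality Composer Pro package's bundle.
// The generated package typically exposes `realityKitContentBundle`.
func loadScene() async throws -> Entity {
    try await Entity(named: "Scene", in: realityKitContentBundle)
}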
The following code crashes (SIGSEGV in lldb-rpc-server) when run as Swift 6, but runs correctly when run as Swift 5 (it's from "Metal by Tutorials"):
import PlaygroundSupport
import MetalKit

print("start")

// Set up the GPU device and a view to render into.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("GPU is not supported")
}
let frame = CGRect(x: 0, y: 0, width: 600, height: 600)
let view = MTKView(frame: frame, device: device)
view.clearColor = MTLClearColor(red: 1, green: 1, blue: 0.8, alpha: 1)

// Build a sphere mesh with Model I/O and convert it for MetalKit.
let allocator = MTKMeshBufferAllocator(device: device)
let mdlMesh = MDLMesh(sphereWithExtent: [0.75, 0.75, 0.75], segments: [100, 100], inwardNormals: false, geometryType: .triangles, allocator: allocator)
let mesh = try MTKMesh(mesh: mdlMesh, device: device)

guard let commandQueue = device.makeCommandQueue() else {
    fatalError("Could not create a command queue")
}

// A minimal pass-through vertex shader and solid-red fragment shader.
let shader = """
#include <metal_stdlib>
using namespace metal;

struct VertexIn {
    float4 position [[attribute(0)]];
};

vertex float4 vertex_main(const VertexIn vertex_in [[stage_in]]) {
    return vertex_in.position;
}

fragment float4 fragment_main() {
    return float4(1, 0, 0, 1);
}
"""

print("A")
let library = try device.makeLibrary(source: shader, options: nil)
let vertexFunction = library.makeFunction(name: "vertex_main")
let fragmentFunction = library.makeFunction(name: "fragment_main")

// Build the render pipeline from the shader pair and the mesh's vertex layout.
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineDescriptor.vertexFunction = vertexFunction
pipelineDescriptor.fragmentFunction = fragmentFunction
print("X")
pipelineDescriptor.vertexDescriptor = MTKMetalVertexDescriptorFromModelIO(mesh.vertexDescriptor)
let pipelineState = try device.makeRenderPipelineState(descriptor: pipelineDescriptor)

// Encode a single indexed draw call for the sphere.
guard let commandBuffer = commandQueue.makeCommandBuffer(),
      let renderPassDescriptor = view.currentRenderPassDescriptor,
      let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)
else {
    fatalError()
}
renderEncoder.setRenderPipelineState(pipelineState)
renderEncoder.setVertexBuffer(mesh.vertexBuffers[0].buffer, offset: 0, index: 0)

guard let submesh = mesh.submeshes.first else {
    fatalError()
}
renderEncoder.drawIndexedPrimitives(type: .triangle, indexCount: submesh.indexCount, indexType: submesh.indexType, indexBuffer: submesh.indexBuffer.buffer, indexBufferOffset: 0)
renderEncoder.endEncoding()

guard let drawable = view.currentDrawable else {
    fatalError()
}
commandBuffer.present(drawable)
commandBuffer.commit()
print("test")

PlaygroundPage.current.liveView = view
Crash report: https://gist.githubusercontent.com/tumdum/8aa53bc806619c0d21c93a55fae07937/raw/370b00c07b08fff8856f9fc678de9888faa8d06e/crash.log
I'm on macOS 15.1.1 (24B2091) + Xcode 16.2 (16C5032a)
I am building an app playground for SSC '25 in which I want to use the Multipeer Connectivity framework to send and receive data to and from other nearby devices. I also want to use some other open-source packages for some of the features. I just wanted to know whether we are allowed to use them or not.
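For concreteness, the setup I have in mind is the standard advertiser/browser pattern (a minimal sketch; "ssc-demo" is a placeholder service type):

import MultipeerConnectivity
import UIKit

// Minimal sketch of the Multipeer Connectivity session I want to build.
// Service types must be 1-15 lowercase letters, digits, or hyphens.
let peerID = MCPeerID(displayName: UIDevice.current.name)
let session = MCSession(peer: peerID, securityIdentity: nil, encryptionPreference: .required)

// One device advertises itself...
let advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil, serviceType: "ssc-demo")
advertiser.startAdvertisingPeer()

// ...while another browses and invites discovered peers into the session.
let browser = MCNearbyServiceBrowser(peer: peerID, serviceType: "ssc-demo")
browser.startBrowsingForPeers()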