I'm trying to create an app that uses artificial intelligence technology.
One of the models provided on this page (https://developer.apple.com/machine-learning/models/) will be used.
Are there any copyright or legal issues if I create an app using a model provided there and distribute it on the App Store?
I want to use an emoji like this in my iOS app and release it on the App Store.
struct ContentView: View {
    @State private var move = false

    var body: some View {
        VStack {
            Text("👻")
                .font(.largeTitle)
                .position(x: move ? 50 : 400)
            Button("Move") {
                withAnimation(.linear(duration: 2)) {
                    move.toggle()
                }
            }
        }
    }
}
Are there any other problems with this (legal issues with using the emoji, etc.)?
I prefer the coding style below. (The code here is short, of course, but imagine the body growing to many lines.)
struct ContentView: View {
    var body: some View {
        VStack {
            text
            hello
        }
    }

    var text: some View {
        Text("text")
            .padding()
    }

    var hello: some View {
        Button("hello") {
            print("hello")
        }
    }
}
But people seem to prefer the style below, not only when the body is short but also when it is long.
struct ContentView: View {
    var body: some View {
        VStack {
            Text("text")
                .padding()
            Button("hello") {
                print("hello")
            }
        }
    }
}
Which of the two coding styles is more popular?
This code works fine.

struct Cardify: AnimatableModifier {
    init(isFaceUp: Bool) {
        rotation = isFaceUp ? 0 : 180
    }

    var animatableData: Double {
        get { rotation }
        set { rotation = newValue }
    }

    var rotation: Double

    func body(content: Content) -> some View {
        ZStack {
            let shape = RoundedRectangle(cornerRadius: DrawingConstants.cornerRadius)
            if rotation < 90 {
                shape.fill().foregroundColor(.white)
                shape.strokeBorder(lineWidth: DrawingConstants.lineWidth)
            } else {
                shape.fill()
            }
            content
                .opacity(rotation < 90 ? 1 : 0)
        }
        .rotation3DEffect(Angle.degrees(rotation), axis: (0, 1, 0))
    }

    private struct DrawingConstants {
        static let cornerRadius: CGFloat = 10
        static let lineWidth: CGFloat = 3
    }
}
But the code below, where I replaced AnimatableModifier with Animatable and ViewModifier, doesn't work.
struct Cardify: Animatable, ViewModifier {
    init(isFaceUp: Bool) {
        rotation = isFaceUp ? 0 : 180
    }

    var animatableData: Double {
        get { rotation }
        set { rotation = newValue }
    }

    var rotation: Double

    func body(content: Content) -> some View {
        ZStack {
            let shape = RoundedRectangle(cornerRadius: DrawingConstants.cornerRadius)
            if rotation < 90 {
                shape.fill().foregroundColor(.white)
                shape.strokeBorder(lineWidth: DrawingConstants.lineWidth)
            } else {
                shape.fill()
            }
            content
                .opacity(rotation < 90 ? 1 : 0)
        }
        .rotation3DEffect(Angle.degrees(rotation), axis: (0, 1, 0))
    }

    private struct DrawingConstants {
        static let cornerRadius: CGFloat = 10
        static let lineWidth: CGFloat = 3
    }
}
How do I fix it?
I won the challenge.
Will winners be contacted further after winning (for requirements such as student identification)?
If so, I would like to consult with my supervisor first.
As far as I know, the webpage confirming the winning result disappears after a few years.
I want to mention the win when I apply for jobs in a few years, so how can I prove it?
I am a student preparing for the WWDC Swift Student Challenge.
I am trying to build an AR app, but my .rcproject file is not loading in my Swift playground (.swiftpm). What should I do?
My code is below.

    private var game = try! game.loadGame()

Help me...
The flow of the app I'm trying to make is simple:
it calls a weather API every morning at 7:00 and sends a notification if it will rain.
Do I need to create a backend server, or is it possible to schedule this on iOS?
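For the on-device half of the question, here is a minimal sketch (pure Foundation, so it runs anywhere Swift does) of computing the daily 7:00 trigger time. On iOS, the same DateComponents (hour 7, minute 0) could be handed to a repeating UNCalendarNotificationTrigger for a local notification; the notification API call itself is omitted here, and note that waking the app to *fetch* the API at an exact time is not guaranteed on iOS, which is why a server-side push is often used for that part.

```swift
import Foundation

// Sketch: find the next 7:00 AM after a given date. A repeating
// UNCalendarNotificationTrigger (UserNotifications, iOS) accepts
// DateComponents like `target` directly; that call is left out so this
// stays platform-neutral and testable.
func nextSevenAM(after date: Date, calendar: Calendar = .current) -> Date {
    var target = DateComponents()
    target.hour = 7
    target.minute = 0
    // .nextTime rolls forward to tomorrow when today's 7:00 has already passed
    return calendar.nextDate(after: date, matching: target, matchingPolicy: .nextTime)!
}
```

This covers notification *delivery* only; deciding whether it will rain still requires the API call, and exact-time background execution for that call is the part iOS does not promise.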
I watched Stanford CS193p, and I have a question.
When the professor writes code, he writes like this:
struct GameView: View {
    var body: some View {
        gameBody
    }

    var gameBody: some View {
        Button("gameBody") {
            gameBodyFunc()
        }
    }

    private func gameBodyFunc() {
    }
}
Why is the function declared private, but not the gameBody variable?
I am impressed with his coding style and want to learn it.
This is my Python code (file name pythonTest.py):
while True:
    a = input()
    print(a)
I want to run this Python code from a Swift script and get the result.
For example, I would like to enter a command sequence like this:
python3 pythonTest.py, then 1\n2\n3\n, then Ctrl+C.
I found Process(), but input seems to be supplied only once, and I don't know how to handle input() or Ctrl+C.
How can I do that?
If it's not possible in Swift, is it possible in Python?
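One way this can work, as a sketch using Foundation's Process with a stdin pipe (macOS/Linux): write as many lines as you like to the pipe, then close it. Closing stdin sends EOF, which makes Python's input() raise EOFError and the script exit, much like Ctrl+D; Process.interrupt() is what would send the SIGINT that Ctrl+C sends. To keep the sketch free of a Python dependency, /bin/cat stands in for `python3 pythonTest.py` here (both echo each input line back).

```swift
import Foundation

// Sketch: feed several lines to a child process's stdin and collect stdout.
// /bin/cat is a stand-in for `python3 pythonTest.py` (both echo each line).
func runAndFeed(_ lines: [String]) throws -> String {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/bin/cat")
    let input = Pipe()
    let output = Pipe()
    process.standardInput = input
    process.standardOutput = output
    try process.run()
    for line in lines {
        input.fileHandleForWriting.write(Data((line + "\n").utf8))
    }
    // EOF: for the Python script, input() raises EOFError and it exits.
    // process.interrupt() would deliver SIGINT (Ctrl+C) instead.
    input.fileHandleForWriting.closeFile()
    process.waitUntilExit()
    let data = output.fileHandleForReading.readDataToEndOfFile()
    return String(decoding: data, as: UTF8.self)
}
```

To target the real script, the executable would be the python3 path with `["pythonTest.py"]` as Process.arguments; the pipe handling stays the same.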
My code:
func placeObjectAtImageTracking(object: ModelEntity, imageAnchor: ARImageAnchor) -> AnchorEntity {
    let imageAnchorEntity = AnchorEntity(anchor: imageAnchor)
    return imageAnchorEntity
}
In this code, AnchorEntity(anchor: imageAnchor) produces an error when I change the device in the Xcode run destination.
It's fine when I set it to my real phone, but when I change it to the simulator, the following error appears:
No exact matches in call to initializer
How do I solve it?
I use this code in my app:
// MARK: - Protocol
protocol VideoManagerProtocol: AnyObject {
    func didReceive(sampleBuffer: CMSampleBuffer)
}

final class VideoManager: NSObject {

    // MARK: -- Properties

    /// Completion handler for the camera-permission request.
    typealias RequestPermissionCompletionHandler = ((_ accessGranted: Bool) -> Void)

    /// Delegate of VideoManager.
    weak var delegate: VideoManagerProtocol?

    /// An object that manages capture activity.
    let captureSession = AVCaptureSession()

    /// A device that provides input (such as audio or video) for capture sessions.
    // let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) // choosing deviceType and position is better, but I don't know the iPad cameras, so I chose default to be safe
    let videoDevice = AVCaptureDevice.default(for: .video)

    /// A Core Animation layer that displays the video as it's captured.
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        return AVCaptureVideoPreviewLayer(session: captureSession)
    }()

    /// A capture output that records video and provides access to video frames for processing.
    lazy var videoOutput: AVCaptureVideoDataOutput = {
        let output = AVCaptureVideoDataOutput()
        let queue = DispatchQueue(label: "VideoOutput", attributes: .concurrent, autoreleaseFrequency: .inherit)
        output.setSampleBufferDelegate(self, queue: queue)
        output.alwaysDiscardsLateVideoFrames = true
        return output
    }()

    // MARK: -- Methods

    override init() {
        guard let videoDevice = videoDevice, let videoInput = try? AVCaptureDeviceInput(device: videoDevice) else {
            fatalError("No `Video Device` detected!")
        }
        super.init()
        captureSession.addInput(videoInput)
        captureSession.addOutput(videoOutput)
    }

    func startVideoCapturing() {
        self.captureSession.startRunning() // do not call from a global background queue
        // DispatchQueue.global(qos: .background).async {
        //     self.captureSession.startRunning()
        // }
    }

    func stopVideoCapturing() {
        captureSession.stopRunning()
    }

    func requestPermission(completion: @escaping RequestPermissionCompletionHandler) {
        AVCaptureDevice.requestAccess(for: .video) { accessGranted in
            completion(accessGranted)
        }
    }
}

// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate
extension VideoManager: AVCaptureVideoDataOutputSampleBufferDelegate {
    // Note: `didOutput` delivers every captured frame; the original used
    // `didDrop`, which is only called for frames the output discards.
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        delegate?.didReceive(sampleBuffer: sampleBuffer)
    }
}
I tried AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back), but the app still uses only the front camera. How do I use the back camera?
I want to make an Object Capture app on iOS.
I've been looking for Object Capture examples, but there are only macOS ones.
How do I make an Object Capture app on iOS?
I am a student studying AR with ARKit.
I want a variety of assets that can be used commercially (there are limits to what I can make myself...).
Is there any place where I can get assets (3D models, .usdz files, etc.), like Unity's Asset Store?
The resources on the Apple Developer homepage are good but few, and I don't know whether they are commercially usable.