Hi everyone,
I have a question about subscription upgrades and refunds for an app that offers two options:
Premium Yearly - $99.99/year (level 1)
Premium Monthly - $9.99/month (level 2)
If a user has been on the Premium Monthly plan for 11 months (totaling $109.89) and then upgrades to the Premium Yearly plan ($99.99/year), how would the refund/credit work in this situation?
Would the user receive:
A $9.99 credit (difference between $109.89 paid and $99.99 for yearly)?
No credit at all?
I’m using AVFoundation in my iPhone application to encode a video in MP4 format with H.264, which can then be shared or exported.
Do I need to pay a license fee to MPEG LA for using the H.264 format, or are these fees already covered by Apple?
I've read articles suggesting that Apple covers these fees when encoding is done through its native APIs (or via its dedicated encoding hardware), but I haven't found any explicit confirmation of this in the documentation or developer agreements... Did I miss something?
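For reference, this is roughly the encoding path I mean (a simplified sketch; the function name and parameters are placeholders):

import AVFoundation

// Sketch of the encoding path in question: H.264 video in an MP4
// container, produced entirely through Apple's native AVAssetWriter API.
func makeH264Writer(outputURL: URL, width: Int, height: Int) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,  // H.264 codec
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    writer.add(input)
    return (writer, input)
}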
I observe significant performance differences when encoding a video to MP4 (H.264). The code I use is standard (AVAssetWriter, AVAssetWriterInput...).
Here is what I notice when I run the same code on different platforms:
On an iPhone, the video is encoded in 3 seconds (iPhone 13, 14, 15, 16, Pro...).
On a Mac equipped with an M2 Pro, the video is encoded in 50 seconds.
On a Mac equipped with an Intel processor (2.3 GHz 18-core Intel Xeon W), the video is encoded in 2 minutes.
The encoding on an iPhone is very fast thanks to hardware acceleration. However, I don't understand why I don't get similar performance with a Mac M2 Pro, which also has a dedicated hardware component for H.264 encoding (the media engine).
Is hardware acceleration disabled on a Mac?
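For reference, here is a minimal VideoToolbox sketch (my own test code, not from Apple's documentation) that checks whether a session requiring the hardware H.264 encoder can be created on the current machine; the Require key matters on macOS, where a software fallback is otherwise possible (dimensions are placeholders):

import VideoToolbox

// Sketch: ask VideoToolbox for a compression session that *requires*
// the hardware H.264 encoder. If creation fails, this configuration
// would fall back to software encoding on this machine.
func hasHardwareH264Encoder(width: Int32 = 1920, height: Int32 = 1080) -> Bool {
    let spec = [
        kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: kCFBooleanTrue
    ] as CFDictionary
    var session: VTCompressionSession?
    let status = VTCompressionSessionCreate(
        allocator: nil,
        width: width,
        height: height,
        codecType: kCMVideoCodecType_H264,
        encoderSpecification: spec,
        imageBufferAttributes: nil,
        compressedDataAllocator: nil,
        outputCallback: nil,
        refcon: nil,
        compressionSessionOut: &session
    )
    if let session { VTCompressionSessionInvalidate(session) }
    return status == noErr
}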
I have a problem with the following code: I am not being notified of changes to the progress property of my AsyncJob object, which is @Observable... This is a command-line Mac application (the same code works fine in a SwiftUI application).
Have I missed something?
do {
    let job = AsyncJob()
    // Read job.progress inside the tracking closure so it is registered.
    withObservationTracking {
        _ = job.progress
    } onChange: {
        print("Current progress: \(job.progress)")
    }
    let _ = try await job.run()
    print("Done...")
} catch {
    print(error)
}
I also tried this, without any success:
@main
struct MyApp {
    static func main() async throws {
        // my code here
    }
}
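For reference, onChange fires only once per registration, so here is the re-registration pattern I've seen suggested (a sketch, not official API usage):

import Observation

// Sketch of a common workaround: withObservationTracking's onChange
// fires only once, so re-register after each change to keep observing.
func observeContinuously(_ apply: @escaping @Sendable () -> Void,
                         onChange: @escaping @Sendable () -> Void) {
    withObservationTracking {
        apply()
    } onChange: {
        onChange()
        observeContinuously(apply, onChange: onChange)
    }
}

The recursion simply re-arms the tracking that onChange tears down after its single invocation; even so, I would expect my single registration above to fire at least once for the first change.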
I notice price conversion errors when I set the base price of my subscription to €4.99.
Base Price France (€4.99):
€4.99 -> $3.99 U.S.A.
Base Price U.S.A. ($3.99):
$3.99 -> €3.99 France
The prices are inconsistent and incorrect: €4.99 is $5.38 (at 1 EUR = 1.07860 USD today), so the nearest price tier below $5.38 should be $4.99, not $3.99.
Base Price U.S.A.:
$0.99 -> €0.99 France
$1.99 -> €1.99 France
$2.99 -> €2.99 France
$3.99 -> €3.99 France
$4.99 -> €5.99 France -> ????
Base Price France:
€0.99 -> $0.99 U.S.A.
€1.99 -> $1.99 U.S.A.
€2.99 -> $2.99 U.S.A.
€3.99 -> $3.99 U.S.A.
€4.99 -> $3.99 U.S.A. -> ????
Why do €3.99 and €4.99 both convert to the same USD price ($3.99), when the lower price tiers map one-to-one?
I need to put my app into production, and I don't know what to do (should I change the prices manually?).
I am experiencing an issue with how Xcode displays my localization file (String Catalog). Xcode randomly shows the file as raw XML instead of the usual String Catalog editor. How can I restore the normal display?
Is it possible to play a stereoscopic video in MV-HEVC format using a player embedded in an HTML page?
When I open a volumetric window, it appears slightly to the right of its parent (the SwiftUI window). Is this normal? It's not centered on the horizontal axis.
I was expecting the center (x-axis) of the volumetric window to be aligned with the center of the parent SwiftUI window (its x-axis).
Is this a bug in the simulator, or is it an intentional choice by Apple? Some of Apple's documentation shows this offset, which leads me to think it is deliberate... However, I don't understand the reason for this design decision.
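For reference, here is a minimal sketch of the setup I'm describing (the window identifiers and model name are placeholders):

import SwiftUI
import RealityKit

@main
struct VolumesApp: App {
    var body: some Scene {
        // The regular 2D SwiftUI window (the "parent").
        WindowGroup(id: "main") {
            ContentView()
        }

        // The volumetric window; the system decides where it appears
        // relative to the window from which it was opened.
        WindowGroup(id: "volume") {
            Model3D(named: "Scene")  // placeholder model name
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}

struct ContentView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open volume") {
            openWindow(id: "volume")
        }
    }
}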
Are there any technical recommendations for displaying a 3D scene in visionOS?
I am particularly looking to know:
the maximum recommended number of polygons (e.g., for display within a volume vs. throughout the entire space...)
the maximum recommended resolution of an HDR/EXR image used as a lighting source (e.g., 2K?)
the maximum recommended resolution of a PNG or JPEG equirectangular texture used as a 360° background
Has anyone with a headset run these tests?
Thank you!
What is the correct approach to saving the image (e.g., CVPixelBuffer to PNG) obtained after calling the captureHighResolutionFrame method?
"ARKit captures pixel buffers in a full-range planar YCbCr format (also known as YUV) format according to the ITU R. 601-4 standard"
Should I convert the image's color space myself (YCbCr to RGB, e.g., using Metal)?
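One approach I'm considering, which avoids a manual Metal conversion (a sketch; as far as I know, Core Image performs the YCbCr-to-RGB conversion itself when rendering):

import CoreImage

// Sketch: convert the captured YCbCr pixel buffer to PNG data with
// Core Image, which handles the YCbCr -> RGB conversion on render.
func pngData(from pixelBuffer: CVPixelBuffer) -> Data? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB) else { return nil }
    return context.pngRepresentation(of: image, format: .RGBA8, colorSpace: colorSpace)
}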
I do not encounter any problems during the upload, which seems to proceed correctly (success message at the end)...
However, I cannot see the builds in the App Store Connect interface.
I've been trying different options (rebuilding, testing on another computer, clearing caches...) for several hours, without success ;-(
@Apple: I need urgent help, please.