I was wondering what other developers think about one enforced App Clips feature: the banner inviting users to install the full app, which appears on the first launch of an App Clip. What do you think of it?
In my case, my app is a platform that lets users experience visual AR content made by third-party creators. I love App Clips and I think they are a great way for creators to distribute their content, because they can advertise a more focused user experience built around their content alone, rather than sitting next to other creators' work. Some App Clip assets and UI parameters can also be customized by the creator to make the experience more unique.
In a way, creators feel that they have their own app, which is great!
I think this banner is too bold and breaks the branding efforts of creators. Of course the full app has a more complete feature set and acts as a catalog, but I wish I had control over how the invitation to install the full app is presented within the App Clip.
wdyt
Hi,
When I use a local .mp4 video file encoded in HEVC with an alpha channel, played with an AVPlayer as the material of an SCNNode, the transparency is rendered correctly, just as if I had used a .png image with transparency.
The issue is: when I encode this same .mp4 file into an HLS stream using mediafilesegmenter and play it in the same manner, as an SCNNode material with AVPlayer, the transparency is not preserved; the transparent zones are instead filled with opaque black. (The HLS stream itself carries correct transparency, as verified by opening its URL in Safari.)
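For reference, the segmenting was done with something along these lines (flags from memory; the output directory and file names are placeholders):

mediafilesegmenter -f hls-out -i prog_index.m3u8 -t 6 puppets.mp4

where -f is the output directory, -i the index file name, and -t the target segment duration.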
Sample Test:
import UIKit
import ARKit
import AVFoundation

class ViewController: UIViewController {

    private var arView: ARSCNView!
    private var avPlayer: AVPlayer!

    // Inside-out sphere used as the video screen.
    lazy var sphere: SCNNode = {
        let geometry = SCNSphere(radius: 5)
        geometry.isGeodesic = false
        geometry.segmentCount = 64
        geometry.firstMaterial?.lightingModel = .constant
        geometry.firstMaterial?.diffuse.contents = UIColor.clear // fully transparent placeholder until the player is attached
        geometry.firstMaterial?.cullMode = .front // render the inside of the sphere
        return SCNNode(geometry: geometry)
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        setupArView()
        setupArSession()
        setupButton()
    }

    private func setupButton() {
        let button = UIButton()
        button.setTitle("START", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
        button.addTarget(self, action: #selector(createSphere), for: .touchUpInside)
    }

    @objc func createSphere() {
        guard avPlayer == nil else { return }
        addSphere()
    }
}

extension ViewController {

    private func setupArView() {
        arView = ARSCNView()
        arView.backgroundColor = .black
        arView.translatesAutoresizingMaskIntoConstraints = false
        view.insertSubview(arView, at: 0)
        NSLayoutConstraint.activate([
            arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            arView.topAnchor.constraint(equalTo: view.topAnchor),
            arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            arView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
        arView.preferredFramesPerSecond = 60
    }

    private func setupArSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        configuration.environmentTexturing = .none
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.run(configuration)
    }

    private func addSphere() {
        // Swap the two lines below to switch between the HLS stream and the local file.
        // let asset = AVURLAsset(url: URL(string: "https://SOMECLOUDSTORAGE.com/hls-bug/prog_index.m3u8")!)
        let asset = AVURLAsset(url: Bundle.main.url(forResource: "puppets", withExtension: "mp4")!)
        let playerItem = AVPlayerItem(asset: asset)
        // Kept minimal for the repro: the observer is never removed; a real app should remove it.
        playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
        playerItem.isAudioSpatializationAllowed = true
        playerItem.allowedAudioSpatializationFormats = .monoStereoAndMultichannel
        avPlayer = AVPlayer()
        sphere.position = SCNVector3(0, 0, 0)
        arView.scene.rootNode.addChildNode(sphere)
        avPlayer.replaceCurrentItem(with: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == #keyPath(AVPlayerItem.status) {
            let status: AVPlayerItem.Status
            if let statusNumber = change?[.newKey] as? NSNumber {
                status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
            } else {
                status = .unknown
            }
            switch status {
            case .readyToPlay:
                DispatchQueue.main.async {
                    // Start playback and use the player itself as the material contents.
                    self.avPlayer.playImmediately(atRate: 1)
                    self.sphere.geometry?.firstMaterial?.diffuse.contents = self.avPlayer
                }
            case .failed, .unknown:
                break
            @unknown default:
                break
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}
The video file used is the puppets_with_alpha_hevc.mov file from Apple's HEVC with Alpha demo assets, which I re-muxed into an .mp4 container using ffmpeg.
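The re-mux was a plain stream copy, something like:

ffmpeg -i puppets_with_alpha_hevc.mov -c copy puppets.mp4

(-c copy re-muxes without re-encoding, so the HEVC-with-alpha track itself is untouched.)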
To reproduce both scenarios, swap the AVURLAsset between the local .mp4 file and the HLS stream URL (see the commented-out line in addSphere()).
The issue reproduces on an iPhone 11 Pro running iOS 15.
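For completeness, here is a small diagnostic sketch (my own quick check, nothing authoritative) to confirm whether a video track is tagged as HEVC with alpha ('muxa'). Note that for the HLS asset the tracks array may stay empty until the item is ready to play, so you would inspect the assetTrack of the playerItem's tracks instead:

import AVFoundation
import CoreMedia

func hasHEVCWithAlphaTrack(in asset: AVAsset) -> Bool {
    guard let track = asset.tracks(withMediaType: .video).first else { return false }
    for description in track.formatDescriptions {
        // formatDescriptions is [Any]; each element is a CMFormatDescription.
        let formatDescription = description as! CMFormatDescription
        if CMFormatDescriptionGetMediaSubType(formatDescription) == kCMVideoCodecType_HEVCWithAlpha {
            return true
        }
    }
    return false
}

The local .mp4 reports the 'muxa' subtype as expected; it would be interesting to see what the HLS variant reports once its tracks are available.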
This issue has been unresolved for quite some time now, though I have tried everything to get attention: an unsuccessful TSI ticket, a Feedback Assistant bug report that went unanswered, and I even discussed this bug during WWDC 2021 with Shiva Sundar, who is in charge of HEVC development and said it would be looked into.
Hoping this finally gets some attention.
Hi, I know that AVAudioSession can indicate the session output route, such as Bluetooth, headphones, or speakers. I was wondering whether other audio session states, such as the AirPods Pro Transparency (or Noise Cancelling) status, are accessible from code.
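For reference, this is the kind of route information I already get today (a minimal sketch); my question is whether anything beyond this, like the Transparency/ANC state, is exposed:

import AVFoundation

// Inspect the current route's outputs. As far as I can tell, this only
// identifies the device (e.g. .bluetoothA2DP, "AirPods Pro"), not the
// listening mode the user has selected.
let session = AVAudioSession.sharedInstance()
for output in session.currentRoute.outputs {
    print(output.portType.rawValue, output.portName)
}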