If I put the phone flat on a table, the left and right swipe gestures do not work, but the up and down gestures do.
Only when I tilt the iPhone toward vertical do the left and right swipes work.
Tested on two iPhone 7 Plus and two iPhone 13 devices.
Has anyone seen similar behavior? If so, why does it happen?
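Here is a minimal diagnostic sketch (the recognizer wiring and the didSwipe: handler below are illustrative, assuming a standard UISwipeGestureRecognizer setup): log the reported device orientation when a swipe fires. A phone lying flat reports UIDeviceOrientationFaceUp rather than a portrait or landscape orientation, and orientation-dependent code may treat that case differently.

// Diagnostic: log the device orientation whenever a swipe is recognized.
// A phone lying flat reports UIDeviceOrientationFaceUp, not portrait.
- (void)viewDidLoad {
    [super viewDidLoad];
    UISwipeGestureRecognizer *left = [[UISwipeGestureRecognizer alloc]
        initWithTarget:self action:@selector(didSwipe:)];
    left.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.view addGestureRecognizer:left];
}

- (void)didSwipe:(UISwipeGestureRecognizer *)gesture {
    NSLog(@"swipe fired, device orientation = %ld",
          (long)[UIDevice currentDevice].orientation);
}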
The app needs to play remote videos. Sometimes it takes a very long time (~10 seconds) for AVPlayer to load the media and start playing.
So I use a timer to check, and try to play the next video if loading takes more than 5 seconds:
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil]; // line 1
NSArray *keys = @[@"playable"];
mediaLoaded = NO;
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{ // line 2
    mediaLoaded = YES; // line 4
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];
        [self.player playImmediatelyAtRate:playSpeed];
    });
}];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    if (!mediaLoaded) {
        [self playNextVideo]; // line 3
    }
});
So the intended flow is: line 1 (for video 1) -> line 2 (for video 1) -> line 3 (if 5 seconds pass and video 1 is still not playing) -> line 1 (for video 2) -> ...
The problem is that line 2 seems to block line 1: only after line 4 runs, i.e. the completionHandler for video 1 fires after ~10 seconds, does line 2 for video 2 execute.
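One variation I can think of (a sketch, untested; it assumes the stalled load is what serializes the next one) is to cancel the pending load in the timeout path with AVAsset's cancelLoading before moving on:

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    if (!mediaLoaded) {
        [asset cancelLoading]; // abandon the pending key loading for video 1
        [self playNextVideo];  // then move on to video 2
    }
});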
Can anybody give any insight?
Thanks!
I have seen equalizers in apps like Musi and Spotify. I think (but am not sure) they use HLS streaming. If so, how can such an equalizer be implemented for HLS?
I searched and tried several approaches, but so far none works:
AVAudioEngine seems to support only local files;
downloading the .ts segments and merging them into an .mp3 to make playback local cannot guarantee a real-time effect;
MTAudioProcessingTap needs the audio track; for a remote .mp3 I can extract the audio track, but not for HLS (see the sketch below).
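For reference, this is roughly how the tap attaches in the remote .mp3 case (a minimal pass-through sketch; attachTapToItem:audioTrack: is an illustrative helper, and the EQ itself is left as a comment). The same code has no track to work with for an HLS item.

#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

// Process callback: pull the source audio through unchanged; an equalizer
// would filter bufferListInOut in place here (e.g. with vDSP biquads).
static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut,
                       MTAudioProcessingTapFlags *flagsOut) {
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
    // Apply EQ to bufferListInOut->mBuffers[i].mData here.
}

- (void)attachTapToItem:(AVPlayerItem *)item audioTrack:(AVAssetTrack *)audioTrack {
    MTAudioProcessingTapCallbacks callbacks = {0};
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.process = tapProcess;

    MTAudioProcessingTapRef tap = NULL;
    if (MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                   kMTAudioProcessingTapCreationFlag_PostEffects,
                                   &tap) != noErr) {
        return;
    }

    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    params.audioTapProcessor = tap;
    CFRelease(tap); // the input parameters retain the tap

    AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
    mix.inputParameters = @[params];
    item.audioMix = mix; // fine for the .mp3 asset; HLS items expose no audio track
}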
Any suggestions?