
Post not yet marked as solved · 1 Reply · 761 Views
I'm having an interesting issue with unit testing a memory leak. I don't understand why the test fails when the line context!.getView() is uncommented; we suspect it's a Swift/Objective-C interop issue. We can make the test pass as expected using an autorelease pool, but I don't understand the root cause of the problem.

The test:

import XCTest

class MemoryLeakTests: XCTestCase {
    func testExample() throws {
        var view: MockView? = MockView()
        weak var weakRefToView = view
        var context: SomeContext? = SomeContext()
        context!.setView(view!)

        /*
         Uncomment this and it fails 🤷‍♂️🤷‍♂️🤷‍♂️🤷‍♂️
         context!.getView()
         */

        view = nil
        XCTAssertNil(weakRefToView)
    }
}

SomeContext.h:

#import <Foundation/Foundation.h>
#import "MockView.h"

@interface SomeContext : NSObject

- (MockView *)getView;
- (void)setView:(MockView *)view;

@end

SomeContext.m:

#import "SomeContext.h"

@implementation SomeContext {
    __weak MockView *_view;
}

- (MockView *)getView {
    return _view;
}

- (void)setView:(MockView *)view {
    _view = view;
}

@end

MockView.h:

#import <Foundation/Foundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface MockView : NSObject
@end

NS_ASSUME_NONNULL_END

MockView.m:

#import "MockView.h"

@implementation MockView
@end
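For reference, here is a minimal sketch of the autorelease-pool variant that makes the test pass. The test name and the comments are my own annotation of what we suspect is going on, not something we have confirmed:

import XCTest

class MemoryLeakWorkaroundTests: XCTestCase {
    func testExample_withAutoreleasePool() throws {
        var view: MockView? = MockView()
        weak var weakRefToView = view
        let context = SomeContext()

        autoreleasepool {
            context.setView(view!)
            // Suspicion: getView doesn't follow the alloc/new/copy/init naming
            // convention, so ARC hands its return value back autoreleased;
            // draining this pool releases that extra reference before the
            // assertion below runs.
            _ = context.getView()
        }

        view = nil
        XCTAssertNil(weakRefToView) // passes with the pool in place
    }
}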
Post not yet marked as solved · 0 Replies · 1.4k Views
I'm trying to get some clarity on what dictates the format of a CVPixelBuffer. I obtain the buffer for a stream using AVPlayerItemVideoOutput.copyPixelBuffer(), and have seen that on the Simulator the format is BGRA, while on a real device it is 420v. What decides the format of the pixel buffer?

On the Simulator the pixel buffer has the following properties:

<attributes={
    Height = 1080;
    IOSurfaceCoreAnimationCompatibility = 1;
    PixelFormatType = 1111970369;
    Width = 1920;
} propagatedAttachments={
    AlphaChannelIsOpaque = 1;
    CVFieldCount = 1;
    CVImageBufferColorPrimaries = "ITU_R_709_2";
    CVImageBufferTransferFunction = "ITU_R_709_2";
    CVImageBufferYCbCrMatrix = "ITU_R_709_2";
    CVPixelAspectRatio = {
        HorizontalSpacing = 1;
        VerticalSpacing = 1;
    };
    QTMovieTime = {
        TimeScale = 90000;
        TimeValue = 210000;
    };
} nonPropagatedAttachments={
}>

And on the real device:

<attributes={
    ExtendedPixelsBottom = 24;
    ExtendedPixelsLeft = 0;
    ExtendedPixelsRight = 16;
    ExtendedPixelsTop = 0;
    PixelFormatDescription = {
        BitsPerComponent = 8;
        ComponentRange = VideoRange;
        ContainsAlpha = 0;
        ContainsGrayscale = 0;
        ContainsRGB = 0;
        ContainsYCbCr = 1;
        FillExtendedPixelsCallback = {length = 24, bytes = 0x0000000000000000d886649d010000000000000000000000};
        IOSurfaceCoreAnimationCompatibility = 1;
        IOSurfaceCoreAnimationCompatibilityHTPCOK = 1;
        IOSurfaceOpenGLESFBOCompatibility = 1;
        IOSurfaceOpenGLESTextureCompatibility = 1;
        OpenGLESCompatibility = 1;
        PixelFormat = 875704438;
        Planes = (
            {
                BitsPerBlock = 8;
                BlackBlock = {length = 1, bytes = 0x10};
            },
            {
                BitsPerBlock = 16;
                BlackBlock = {length = 2, bytes = 0x8080};
                HorizontalSubsampling = 2;
                VerticalSubsampling = 2;
            }
        );
    };
} propagatedAttachments={
    CVFieldCount = 1;
    CVImageBufferChromaLocationBottomField = Left;
    CVImageBufferChromaLocationTopField = Left;
    CVImageBufferColorPrimaries = "ITU_R_709_2";
    CVImageBufferTransferFunction = "ITU_R_709_2";
    CVImageBufferYCbCrMatrix = "ITU_R_709_2";
    CVPixelAspectRatio = {
        HorizontalSpacing = 1;
        VerticalSpacing = 1;
    };
    QTMovieTime = {
        TimeScale = 90000;
        TimeValue = 1059000;
    };
} nonPropagatedAttachments={
}>
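For context, this is roughly how I set things up and read the format; the URL and the commented-out BGRA request are illustrative placeholders, not my actual configuration:

import AVFoundation
import CoreVideo

let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)

// Passing nil lets the system choose the pixel format (BGRA on the Simulator,
// 420v on my device). Requesting a format explicitly would look like the
// commented-out dictionary below.
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
// let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
//     kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
// ])
item.add(output)

let player = AVPlayer(playerItem: item)
player.play()

// Later, e.g. from a display-link callback:
let itemTime = item.currentTime()
if output.hasNewPixelBuffer(forItemTime: itemTime),
   let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
    // 1111970369 == 'BGRA', 875704438 == '420v'
    print("Pixel format:", CVPixelBufferGetPixelFormatType(pixelBuffer))
}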