
Post not yet marked as solved
3 Replies
2.8k Views
I noticed that some effort has been put into Xcode to support yacc/Bison source files (.y, .ypp, .ymm) and lex/Flex source files (.l, .lpp, .lmm) out of the box: you just add those source files to your Xcode project and Xcode automatically generates the respective parser and scanner for you.

However, there is one issue I encountered: for a Flex/Bison scanner/parser pair, one usually wants Bison to additionally generate the parser's definitions as a separate header file, which on the command line would be either:

bison -d foo.y
mv y.tab.h foo.h

or:

bison --defines=foo.h foo.y

That auto-generated C/C++ header file (in this case foo.h) would then be included e.g. on the scanner side (to avoid declaring the terminal symbols twice on both ends). Is there a convenient way to handle generation of such parser header files with Xcode as well?

As far as I can see at the moment (with Xcode 7), when I look at the auto-generated output files, Xcode actually calls Bison in a way that lets it generate the definitions header, and a y.tab.h file is created accordingly. So if you have exactly one Bison parser, you can get by with the hack of an

#include "y.tab.h"

statement in your respective source files. But since the auto-generated header file will always be named "y.tab.h", what would you do if you had multiple Bison parsers in your project? Shouldn't Xcode automatically do

mv y.tab.h foo.h

for each parser it generates?
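To make the question concrete, here is the kind of custom build rule script I have in mind for .y files, as a sketch only: INPUT_FILE_BASE and DERIVED_FILE_DIR are variables Xcode sets for custom build rules, but they are stubbed here (and the bison call replaced by a stub) so the snippet runs standalone.

```shell
# Sketch of a hypothetical custom Xcode build rule script for *.y files.
# In a real build rule Xcode provides these variables; stub them here:
INPUT_FILE_BASE="foo"
DERIVED_FILE_DIR="$(mktemp -d)"

# The real invocation would be something like:
#   bison --defines="$DERIVED_FILE_DIR/$INPUT_FILE_BASE.tab.h" \
#         --output="$DERIVED_FILE_DIR/$INPUT_FILE_BASE.tab.c" \
#         "$INPUT_FILE_PATH"
# Stub: pretend bison -d produced its default y.tab.c / y.tab.h:
touch "$DERIVED_FILE_DIR/y.tab.c" "$DERIVED_FILE_DIR/y.tab.h"

# Give each parser uniquely named outputs so multiple grammars can coexist:
mv "$DERIVED_FILE_DIR/y.tab.h" "$DERIVED_FILE_DIR/$INPUT_FILE_BASE.tab.h"
mv "$DERIVED_FILE_DIR/y.tab.c" "$DERIVED_FILE_DIR/$INPUT_FILE_BASE.tab.c"
ls "$DERIVED_FILE_DIR"
```

With a rule like this, each grammar's scanner could simply #include "foo.tab.h" instead of the shared "y.tab.h".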
Posted by Crudebyte. Last updated.
Post not yet marked as solved
3 Replies
1.4k Views
When using Metal on iOS devices, I see a fairly high base CPU load from three of Metal's API calls. Even with no-op (entirely empty) shader functions and no buffers/textures allocated or assigned at all, the Metal API calls cause a constant CPU load of 20% (main thread) on an iPad Pro 4. Is this normal? I only see this issue on real iOS devices; if I run the same Metal code on macOS (or in the iOS simulator on macOS), there is no measurable CPU load at all.

The heaviest Metal API calls are, in this order:

renderCommandEncoderWithDescriptor: (40%)
[commandBuffer commit]; (22%)
[self nextDrawable]; (17%)

The render function of my very simple test code looks like this:

- (void)renderFrame:(CADisplayLink*)dlink {
    // semaphore previously initialized with 3
    dispatch_semaphore_wait(semaphoreRenderFrame, DISPATCH_TIME_FOREVER);
    @autoreleasepool {
        id<MTLCommandBuffer> commandBuffer = commandQueue.commandBuffer; // 8%
        id<CAMetalDrawable> drawable = [self nextDrawable]; // 17%
        if (!drawable) return;
        renderPass.colorAttachments[0].texture = drawable.texture;
        id<MTLRenderCommandEncoder> commander =
            [commandBuffer renderCommandEncoderWithDescriptor:renderPass]; // 40%
        [commander setRenderPipelineState:renderPipeline];
        [commander drawPrimitives:MTLPrimitiveTypeTriangle
                      vertexStart:0 vertexCount:nVertices instanceCount:1];
        [commander endEncoding];
        [commandBuffer presentDrawable:drawable];
        __block dispatch_semaphore_t semaphore = semaphoreRenderFrame;
        [commandBuffer addCompletedHandler:^(id<MTLCommandBuffer> buffer) {
            dispatch_semaphore_signal(semaphore);
        }];
        [commandBuffer commit]; // 22%
    }
}

So there is no data transferred between CPU and GPU, and no work on the shaders. Simply nothing.

Configuration:

Frame capturing disabled.
Metal API validation disabled.
Metal fast math enabled.
Compiled in release mode.
All sanitizers disabled.
The widget class derives directly from CAMetalLayer, so this is not using MTKView.
iPad Pro 4 (iOS 14.2).
Xcode 12.2.

Any ideas appreciated!
Posted by Crudebyte. Last updated.
Post not yet marked as solved
0 Replies
997 Views
I'm wondering: is Metal debugging not supported with the iOS simulator? When running a Metal app with Xcode on a real iOS device, clicking the camera icon on the debugger toolbar brings up the Metal debugger (a.k.a. the Frame Capture Debugger) for further investigation of issues with Metal shaders. That works with real iOS devices: https://developer.apple.com/documentation/metal/shader_authoring/developing_and_debugging_metal_shaders However, when running the same Metal app in the iOS simulator, the camera icon is grayed out, so the Metal debugger is unavailable for some reason. Is this simply not supported by the simulator, or does something else have to be set up?
Posted by Crudebyte. Last updated.
Post not yet marked as solved
0 Replies
363 Views
Could somebody share with me where Xcode's user-specified font preferences are stored? That is, the settings configurable under Xcode -> Preferences -> Fonts & Colors. Every time a new major Xcode version is released (this time Xcode 12), the user's font preferences are gone. Obviously I would like to avoid setting them up manually over and over again, once per year, especially since I have quite a bunch of adjustments and also several different themes for different monitors and purposes.
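In case it helps others searching: on my own setup, user-defined themes appear to live as .xccolortheme files under ~/Library/Developer/Xcode/UserData/FontAndColorThemes (an observation from one machine, so the path may differ between Xcode versions). A quick sketch to locate them:

```shell
# Hypothetical location of user-defined Xcode themes (.xccolortheme plists);
# backing up this directory should let you carry themes across Xcode upgrades.
THEME_DIR="$HOME/Library/Developer/Xcode/UserData/FontAndColorThemes"
echo "$THEME_DIR"
ls "$THEME_DIR" 2>/dev/null || echo "(no custom themes found)"
```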
Posted by Crudebyte. Last updated.
Post not yet marked as solved
8 Replies
13k Views
Is there still no way (however creative) to set Xcode's own environment variables from a "Run Script" build phase script?

As you might know, the problem with "Run Script" build phase scripts has always been that those scripts are forked as separate processes by Xcode. So a script inherits Xcode's environment variables, but any changes the script makes to those environment variables will not propagate back to Xcode.

Is there, in the meantime, a way to force scripts to run directly within Xcode's build process instead of being forked? Or is there some RPC that would allow modifying Xcode's environment variables from a script?
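To illustrate the fork problem and the only workaround I know of (a sketch; the handoff file path and variable names are hypothetical): a child process can never mutate its parent's environment, but one script phase can persist values to a file in the build directory that a later script phase sources.

```shell
# Simulate two Xcode "Run Script" phases that cannot share environment directly.
# DERIVED_FILE_DIR is set by Xcode during builds; fall back to a temp dir here:
SHARED_ENV="${DERIVED_FILE_DIR:-$(mktemp -d)}/shared-env.sh"  # hypothetical handoff file

# Phase 1: an `export` here would only affect this forked process, never Xcode
# itself, so persist the value to a file instead:
echo 'export MY_BUILD_FLAG=enabled' > "$SHARED_ENV"

# Phase 2 (a *later* script phase): pick the value back up by sourcing the file:
. "$SHARED_ENV"
echo "$MY_BUILD_FLAG"
```

This only hands values forward between script phases; it still does not touch Xcode's own build settings, which is exactly what the question is about.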
Posted by Crudebyte. Last updated.