Posts

Post not yet marked as solved · 1 Reply · 1.3k Views
Hello, I want to programmatically generate a random patterned image (think wallpaper). The method of generation is yet to be established and is unimportant for this question. The output will be either a UIImage or a CIImage object.

Then I want to take a random area of that pattern, effectively a crop, and make one of any number of alterations to it. A non-exhaustive list might include: rotation, swapping colours, other colour effects, shift or scroll. The affected area might also be one of any number of shapes: square, rectangle, triangle, circle, oval. Essentially, I want to take a uniform pattern and break or disrupt it in random ways.

I've been Googling how I might do this and everything seems to point to the Core Image framework, particularly its filters, although it's less clear to me how to do transformations, crops and composite images.

My question, simply, is: am I on the right path pursuing Core Image, or are there other ways to achieve these effects?

Thanks
Jim
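To illustrate the sort of pipeline I have in mind, here is a rough sketch using Core Image (untested, and it assumes the pattern has a finite extent; the square crop, hue rotation and quarter turn are just stand-ins for whatever random alterations I settle on, and disruptPattern is a name I made up):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Takes a full-size pattern, picks a random square region, recolours and
// rotates that region, then composites it back over the original.
func disruptPattern(_ pattern: CIImage) -> CIImage {
    // Pick a random square crop inside the pattern's extent.
    let extent = pattern.extent
    let side = min(extent.width, extent.height) * 0.25
    let origin = CGPoint(
        x: extent.minX + CGFloat.random(in: 0...(extent.width - side)),
        y: extent.minY + CGFloat.random(in: 0...(extent.height - side)))
    let cropRect = CGRect(origin: origin, size: CGSize(width: side, height: side))

    // Crop, then apply a colour alteration (a random hue rotation here).
    let hue = CIFilter.hueAdjust()
    hue.inputImage = pattern.cropped(to: cropRect)
    hue.angle = Float.random(in: 0...(2 * .pi))
    guard let recoloured = hue.outputImage else { return pattern }

    // Rotate the patch a quarter turn about its own centre, keeping it in place.
    let centre = CGPoint(x: cropRect.midX, y: cropRect.midY)
    let rotation = CGAffineTransform(translationX: centre.x, y: centre.y)
        .rotated(by: .pi / 2)
        .translatedBy(x: -centre.x, y: -centre.y)
    let rotated = recoloured.transformed(by: rotation).cropped(to: cropRect)

    // Composite the altered patch over the untouched pattern.
    return rotated.composited(over: pattern)
}
```

Non-rectangular shapes could presumably be handled the same way by masking the patch (e.g. with the blend-with-mask filter) before compositing, but I haven't tried that.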
Posted by dcjams.
Post marked as solved · 1 Reply · 1.3k Views
Hello, I want to commission a designer to create some bitmap assets for an app. The designer has no experience specific to iOS, and I am new to iOS development too; I come from a web background.

I understand what pixel density is, and that for a given point size there may be more than a single pixel per point. This is the origin of the @2x and @3x suffixes. Nonetheless, if the designer is working with bitmaps in, say, Photoshop, and the densest screen I expect my app to be seen on has 3x pixel density, then surely he needs to work with an artboard or canvas measured in actual pixels?

For example, the iPhone 13 has a logical width and height of 390 x 844, while an iPhone 6's is 375 x 667. If I understand correctly, these are points, so not a huge difference. But the actual pixel counts are very different: 1170 x 2532 versus 750 x 1334. So if I want my assets to look good on both, should he be working with a 1170 x 2532 pixel canvas? The largest the app is likely to use, in other words.

Before I dipped my toe into iOS development, I had assumed most iOS assets were vector based to account for these issues, but I was surprised to learn that SVG support is a fairly recent thing and that, in the various guidelines I've read, everything seems to point towards iOS apps using bitmap assets. Regardless, the graphic assets for this app wouldn't suit vector graphics.

In short, how should I instruct my designer with regard to asset sizes? Thanks
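To make the arithmetic concrete, here is a tiny sketch of the point-to-pixel relationship I'm describing (the sizes are just the examples above; pixelSize is a helper I made up, and the only real API used is UIScreen's scale):

```swift
import UIKit

// A design size in points becomes a bitmap size in pixels at a given scale
// factor (1x, 2x or 3x).
func pixelSize(forPointSize points: CGSize, scale: CGFloat) -> CGSize {
    CGSize(width: points.width * scale, height: points.height * scale)
}

// iPhone 13: 390 x 844 points at 3x -> 1170 x 2532 pixels.
let iPhone13Pixels = pixelSize(forPointSize: CGSize(width: 390, height: 844), scale: 3)

// iPhone 6: 375 x 667 points at 2x -> 750 x 1334 pixels.
let iPhone6Pixels = pixelSize(forPointSize: CGSize(width: 375, height: 667), scale: 2)

// At runtime the device reports its own scale factor; an asset catalog with
// 1x/2x/3x slots lets UIImage(named:) pick the matching bitmap automatically.
let currentScale = UIScreen.main.scale
```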
Posted by dcjams.
Post not yet marked as solved · 5 Replies · 2.9k Views
My app has been rejected with no explanation other than 'binary rejected' and 'unresolved ios issue'. There are no further details and no indication I can respond or reply. I'm new to this process. I've Googled extensively and not found any answers for this specific scenario. What am I supposed to do next?
Posted by dcjams.
Post not yet marked as solved · 0 Replies · 954 Views
Hello

Question: What are the best practices for developing with Bluetooth and the camera connection kit accessory without always sideloading to a physical device, given the limitations of the iPhone simulator in this regard? MIDI is the specific application.

Background: I'm a front-end dev with 15 years' experience moving into app development, which is really exciting. The Apple dev environment is so cosy compared to the chaos of the JavaScript ecosystem! Apple seem to want you to succeed.

I'm developing an app that uses MIDI. It isn't time critical; not a sequencer, in other words. The app will be able to connect to external MIDI devices over both Bluetooth (there are various Bluetooth-enabled MIDI interfaces now on the market) and the more traditional route of getting MIDI into an iOS device via the camera connection kit accessory (MIDI over USB).

Thanks
Jim
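For context, this is roughly the CoreMIDI setup I'm planning (an untested sketch; it needs iOS 14+ for the MIDIEventList-based input port, and MIDIInput is just a name I made up). My understanding is that once a Bluetooth interface is paired, or a USB interface is plugged in through the camera connection kit, it simply appears as another source, so the receiving code doesn't care about the transport:

```swift
import CoreMIDI

// Minimal CoreMIDI receive setup: one client, one input port, connected to
// every source currently known to the system.
final class MIDIInput {
    private var client = MIDIClientRef()
    private var inputPort = MIDIPortRef()

    func start() {
        var status = MIDIClientCreateWithBlock("MyMIDIClient" as CFString, &client) { _ in
            // Setup-changed notifications land here (e.g. a Bluetooth
            // interface appearing); a real app would re-scan sources.
        }
        guard status == noErr else { return }

        status = MIDIInputPortCreateWithProtocol(client, "Input" as CFString, ._1_0, &inputPort) { eventList, _ in
            // Incoming MIDI arrives here as a MIDIEventList.
            _ = eventList
        }
        guard status == noErr else { return }

        // Connect the port to every available source, whatever its transport.
        for index in 0..<MIDIGetNumberOfSources() {
            _ = MIDIPortConnectSource(inputPort, MIDIGetSource(index), nil)
        }
    }
}
```

For pairing Bluetooth MIDI devices from inside the app, CoreAudioKit's CABTMIDICentralViewController appears to be the standard route, though I haven't tried it yet.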
Posted by dcjams.