Posts

Post not yet marked as solved
1 Reply
855 Views
I have an iOS app with a WidgetKit extension. The app stores lists of whatever data the user wishes, and the widget shows a few of those items, such as upcoming birthdays. The app and widget extension use Core Data + CloudKit, sharing the same database, which is currently about 350 MB.

When the widget tries to load, the size of the database seems to push the widget over 30 MB of memory (about 45 MB when run in the simulator). On a physical device (iPhone 13 Pro Max), exceeding 30 MB makes the widget crash while updating its content. This is very frustrating: the iOS app uses about 50 MB of memory and is perfectly fine with it, yet I can't use the same data to power my widget.

I have considered keeping some kind of temporary in-memory snapshot of the data for my widgets, updated by the app, but the data changes regularly and you'd expect it to change without opening the app. For example, a widget showing a list of upcoming birthdays should update each day, rather than rely on the user opening the app first.

Is it possible to raise or remove the 30 MB widget memory limit? Or is there some way to make Core Data use far less memory? Memory usage shoots up the moment I call loadPersistentStores; I'm guessing it's loading some kind of indexing data. I'm not even trying to read any data before it maxes out: the widget only sets up Core Data, calls loadPersistentStores, and then boom (sketch below).

I've tried lots of googling, searching these forums, and ChatGPT, and none have given me answers that actually work. I'm hoping somebody has answers that they've actually tried and know to work. Many thanks for any help, as this is super frustrating!
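For context, this is roughly all the widget does before memory maxes out. The model name and app-group identifier here are illustrative placeholders, not my real ones:

```swift
import CoreData

// Illustrative sketch of the widget's Core Data setup; "MyModel" and the
// app-group identifier are placeholders for my real names.
let container = NSPersistentCloudKitContainer(name: "MyModel")

// Point the store at the database shared with the app via an app group.
let storeURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.myapp")!
    .appendingPathComponent("MyModel.sqlite")
container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]

container.loadPersistentStores { _, error in
    // Memory jumps well past the 30 MB widget limit right here,
    // before any fetch request has run.
    if let error = error {
        fatalError("Failed to load store: \(error)")
    }
}
```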
Posted by moontiger. Last updated.
Post marked as solved
3 Replies
883 Views
Hi. I have created my own custom SiriKit intent, and in the handler I return the response code .continueInApp, which opens my app in response. I've worked out that I actually want to open my app in the background, and .handleInApp seems to be the correct response code for that. However, .handleInApp is not an option in the response-code enum that Xcode generates for my intent: the auto-generated code includes .continueInApp, .success, etc., but no .handleInApp.

My deployment target is set to iOS 15.5 everywhere I can find, so I really can't figure out why .handleInApp is not included in the generated code. I've even created a brand-new workspace and project, totally separate from my main one, and a fresh SiriKit intent definition there still doesn't generate code that includes .handleInApp. Is there something I need to enable to make .handleInApp appear as an enum option? My current handler is sketched below.
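For reference, my handler currently looks roughly like this. "MyTaskIntent" and its response type are illustrative placeholders for the classes Xcode generates from my .intentdefinition file; the commented-out line is what I want to be able to write:

```swift
import Intents

// "MyTaskIntent" / "MyTaskIntentResponse" are placeholders for the
// auto-generated classes from my .intentdefinition file.
class MyTaskIntentHandler: NSObject, MyTaskIntentHandling {
    func handle(intent: MyTaskIntent,
                completion: @escaping (MyTaskIntentResponse) -> Void) {
        // This compiles and launches the app in the foreground:
        completion(MyTaskIntentResponse(code: .continueInApp, userActivity: nil))

        // This is what I want, but .handleInApp is missing from the
        // generated response-code enum, so it doesn't compile:
        // completion(MyTaskIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```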
Posted by moontiger. Last updated.
Post not yet marked as solved
0 Replies
924 Views
Hi. I'd like to do a flood fill on images, either UIImage or CGImage, and was wondering whether Apple's standard frameworks provide a built-in way to do this. That is, take a bitmap image, specify a point and a color, and have it fill the surrounding area with that color, no matter what shape the area is. I've seen a few examples of algorithm code that do this, but they're quite large and complicated, so I'm trying to avoid them. A sketch of what I mean is below.
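Just to be concrete, this is the sort of hand-rolled code I'm hoping to avoid: a minimal 4-way flood fill over raw RGBA pixels. It's entirely my own sketch, not a framework API:

```swift
import UIKit

// A minimal 4-way flood fill over raw RGBA8 pixels. My own illustration
// of the technique; not a framework API.
func floodFilled(_ image: UIImage, atX startX: Int, y startY: Int,
                 r: UInt8, g: UInt8, b: UInt8, a: UInt8 = 255) -> UIImage? {
    guard let cg = image.cgImage else { return nil }
    let width = cg.width, height = cg.height
    guard (0..<width).contains(startX), (0..<height).contains(startY),
          let ctx = CGContext(data: nil, width: width, height: height,
                              bitsPerComponent: 8, bytesPerRow: 0,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }

    // Draw the source image into a buffer we can read and mutate.
    ctx.draw(cg, in: CGRect(x: 0, y: 0, width: width, height: height))
    guard let base = ctx.data else { return nil }
    let px = base.assumingMemoryBound(to: UInt8.self)
    let rowBytes = ctx.bytesPerRow

    func pixel(_ x: Int, _ y: Int) -> (UInt8, UInt8, UInt8, UInt8) {
        let i = y * rowBytes + x * 4
        return (px[i], px[i + 1], px[i + 2], px[i + 3])
    }

    let target = pixel(startX, startY)
    let fill = (r, g, b, a)
    guard target != fill else { return image }

    // Iterative fill (explicit stack) to avoid deep recursion on big areas.
    var stack = [(startX, startY)]
    while let (x, y) = stack.popLast() {
        guard x >= 0, x < width, y >= 0, y < height, pixel(x, y) == target
        else { continue }
        let i = y * rowBytes + x * 4
        (px[i], px[i + 1], px[i + 2], px[i + 3]) = fill
        stack.append((x + 1, y)); stack.append((x - 1, y))
        stack.append((x, y + 1)); stack.append((x, y - 1))
    }

    guard let out = ctx.makeImage() else { return nil }
    return UIImage(cgImage: out)
}
```

Usage would be something like `floodFilled(image, atX: 10, y: 10, r: 255, g: 0, b: 0)`; a built-in, optimized equivalent of this is what I'm after.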
Posted by moontiger. Last updated.
Post not yet marked as solved
5 Replies
2.8k Views
Since upgrading my iPad mini to iOS 12.2 and Xcode to 10.2, I've had the following problem: SKTexture.cgImage() returns an image the size of the visible part of the image, rather than the original image the SKTexture was created with. I need to get a CGImage the size of the texture, not just the visible pixels.

A very simple example: I have a 32x32 PNG image where the bottom 16 rows are clear (alpha of 0) and the top 16 rows are a solid color (a rectangle filling the top half). I have imported it into an .xcassets file in Xcode, and I have a tile set (.sks file) where the PNG is a texture of a tile definition. If I look at the properties of the PNG in Xcode, it says 32x32px; the tile definition also says 32x32px. Everything is good so far.

At runtime I need to examine the pixels of the CGImage of the resulting texture. I do this by loading the SKTexture from the SKTileSet made in the Xcode designer and calling SKTexture.cgImage(). At that point, the SKTexture object reports a size of 32x32 as expected, but the CGImage that comes back is 32x16. It would appear that something has decided that only the top 16 rows are actually used (i.e. visible) and therefore the CGImage should be 32x16, not 32x32 like the SKTexture or the original PNG used to create the texture.

This is a problem for two reasons:

1. I want an image that starts as 32x32 to stay 32x32, for consistency. I'm showing these textures in an SKTileMapNode, and the ones it thinks are 32x16 get stretched undesirably in the tile map cells; e.g. my 32x16 rectangle gets stretched to double height to fill the 32x32 squares.
2. I can't even find out where within the 32x32 texture the 32x16 area sits. It would be nice if something could at least tell me "yes, it's coming back as 32x16, but that's because we're offsetting it by 0,0".

So the question is either:

A. How do I get the CGImage from the SKTexture without it being automatically cropped, i.e. a 32x32 SKTexture always returning a 32x32 CGImage?
OR
B. How do I stop images being automatically cropped when applied to SKTextures?

Many thanks for any pointers or help. This is driving me crazy, as it has broken my entire game in many ways! I've included a minimal repro below.
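For reference, the repro looks roughly like this ("MyTileSet" is an illustrative placeholder for my actual .sks tile set resource; I'm just grabbing the first tile definition's texture):

```swift
import SpriteKit

// "MyTileSet" is a placeholder for my real tile set resource name.
guard let tileSet = SKTileSet(named: "MyTileSet"),
      let texture = tileSet.tileGroups.first?
          .rules.first?.tileDefinitions.first?.textures.first
else { fatalError("tile set or texture not found") }

print(texture.size())                  // (32.0, 32.0), as expected
let cgImage = texture.cgImage()
print(cgImage.width, cgImage.height)   // 32 16, cropped to the visible rows
```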
Posted by moontiger. Last updated.
Post not yet marked as solved
1 Reply
2.6k Views
Hi. I'm working on my first SceneKit (3D) game and I can't figure out an easy way to paint 3D textures for it, to be used as diffuse and normal images, etc.

I have Cheetah 3D on the Mac, which lets me paint textures straight onto 3D models, or create a texture image from the unwrapped model. But painting with my MacBook trackpad isn't very natural (and therefore not easy), and the textures Cheetah 3D exports aren't always unwrapped in a way that makes them easy to paint afterwards. For example, painting along seams is a problem because you have to match both sides of the seam.

What I'm hoping for is an iPad app I can use with my Apple Pencil to draw directly onto a 3D model (presumably in .dae format) so that it affects the texture image mapped to that object. I'd then save that texture image, drag it into Xcode, and use it in SceneKit.

Can anybody please point me in the direction of any such iPad apps? Searching has failed me. Also, what do other people use to paint their textures? Do they just paint the 2D unwrapped versions, or do they use apps like Cheetah 3D, Blender, etc.? If there isn't an iPad/Pencil app for this, it seems like a huge gap in the market: such an app at a reasonable price would be a boon for indie developers.
Posted by moontiger. Last updated.