From reading the description of SKView's texture(from:crop:), you would think this method could return a sub-texture of a node's full texture. For example, if the node is the entire scene, texture(from:crop:) should let you specify a rectangle anywhere within the scene to be cropped and returned as a texture.

The benefit would be memory: instead of a huge texture for the whole scene, you could get a much smaller texture representing just the rectangle of interest. A much smaller snapshot obviously takes fewer resources, and if snapshots are taken frequently, repeatedly capturing the full scene will quickly get the app terminated for excessive memory usage.

After lots of experimentation (and searching), I can't make sense of the API parameters. Maybe this method works and getting a sub-rectangle SKTexture snapshot of an SKScene is possible, but the documented behavior doesn't match what I see. Specifically, the crop CGRect parameter is a mystery. How are its origin, width, and height supposed to be specified to get a sub-rectangle of the node (e.g., when the node is the scene)? Are the CGRect's dimensions supposed to be in view coordinates, scene coordinates, pixels, points, unit coordinates (between 0 and 1), or some other flavor?

The obvious interpretation is that the CGRect's origin would be (as the documentation states) in coordinates relative to the specified node (in my case the scene), and that its width and height would also be in scene coordinates. Or possibly the width and height need to be based on the scene's fully rendered SKTexture size, which seems to be double the scene's view size (e.g., a scene with a view of width 375 and height 667 results in an SKTexture of width 750 and height 1334).

This texture(from:crop:) issue has come up before: https://forums.developer.apple.com/message/52852#52852
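To make the two competing interpretations concrete, here is a sketch of the crop rect each one would produce. It is pure CGRect math with no SpriteKit; the 375x667-point view and 750x1334-pixel texture sizes are the ones observed above, used here purely for illustration:

```swift
import Foundation

// Sizes observed in the question: scene matches the view's size in points,
// and the rendered texture comes back at 2x in pixels.
let sceneSize = CGSize(width: 375, height: 667)
let textureSize = CGSize(width: 750, height: 1334)

// Interpretation 1: the crop is in scene coordinates (points).
// Bottom 20% strip of the scene, full width:
let cropInScenePoints = CGRect(x: 0, y: 0,
                               width: sceneSize.width,
                               height: sceneSize.height * 0.2)

// Interpretation 2: the crop is in rendered-texture pixels (2x the points).
let scale = textureSize.width / sceneSize.width // 2.0 for these sizes
let cropInTexturePixels = CGRect(x: cropInScenePoints.origin.x * scale,
                                 y: cropInScenePoints.origin.y * scale,
                                 width: cropInScenePoints.width * scale,
                                 height: cropInScenePoints.height * scale)
```

The same "bottom strip" ends up as a 375-point-wide rect under the first reading but a 750-pixel-wide rect under the second, which is exactly the ambiguity the question is about.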
I just discovered that if you add a root node to the scene (rather than capturing the scene itself), and then add your child nodes to that root node, cropping with this method works. Note that you need a CGRect whose origin is based on doubling the scene's view size (the view's frame size). So if the root node is placed at the center of the screen with the default anchorPoint of (0.5, 0.5), the bottom-left-most point on the screen will be at (-view.bounds.width, -view.bounds.height):
guard let view = self.view else { return }
// Bottom-left corner of the screen in the root node's crop coordinates
let x = -view.bounds.width
let y = -view.bounds.height
// Captures a sub-rect along the bottom portion of the scene, 20% of the scene's height
let rect = CGRect(x: x, y: y, width: -2 * x, height: -y * 0.2)
let tex = view.texture(from: self.rootNode, crop: rect)
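As a pure-math sketch of the workaround's coordinate convention (an assumption inferred from the observations above, not documented behavior), a helper that maps a crop expressed in unit coordinates (0...1, origin at the scene's bottom-left) onto this root-node crop space might look like:

```swift
import Foundation

/// Maps a unit-space crop (0...1, origin at the scene's bottom-left) to the
/// crop-rect convention observed with the root-node workaround: the screen
/// spans (-viewSize.width, -viewSize.height) to (viewSize.width,
/// viewSize.height) in the root node's crop coordinates. This is a sketch of
/// observed behavior, not documented API.
func cropRect(unitRect: CGRect, viewSize: CGSize) -> CGRect {
    let fullWidth = viewSize.width * 2
    let fullHeight = viewSize.height * 2
    return CGRect(x: -viewSize.width + unitRect.origin.x * fullWidth,
                  y: -viewSize.height + unitRect.origin.y * fullHeight,
                  width: unitRect.width * fullWidth,
                  height: unitRect.height * fullHeight)
}

// A strip along the bottom of a 375x667 view, full width:
let rect = cropRect(unitRect: CGRect(x: 0, y: 0, width: 1, height: 0.1),
                    viewSize: CGSize(width: 375, height: 667))
```

For these inputs the helper reproduces the rect built by hand above: origin (-375, -667), size 750 wide by roughly 133 tall.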
Note: I wonder whether the SKScene itself can still be used with this texture(from:crop:) method, so long as the CGRect's origin is somehow adjusted.
BTW, I filed an Apple bug # titled "SKTexture texture(from:crop:) broken and/or needs better documentation".
On the points vs pixels comments in your original question, it's points for positions and sizes, but textures are returned as pixels, which (on most devices) means a doubling of the points.
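To illustrate the points-vs-pixels relationship, here is a minimal sketch. The 2.0 scale factor is an assumption matching most Retina devices; a real app would read it from UIScreen.main.scale or the view's contentScaleFactor rather than hard-coding it:

```swift
import Foundation

// Assumption: a 2x device. In practice, query UIScreen.main.scale
// (or the view's contentScaleFactor) instead of hard-coding this.
let scale: CGFloat = 2.0

// Positions and sizes in SpriteKit are in points...
let sceneSizeInPoints = CGSize(width: 375, height: 667)

// ...but the texture returned by texture(from:crop:) is sized in pixels.
let textureSizeInPixels = CGSize(width: sceneSizeInPoints.width * scale,
                                 height: sceneSizeInPoints.height * scale)
```

For the 375x667-point view in the question this yields a 750x1334-pixel texture, matching the sizes the poster observed.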
I'm not sure if this is important to you and what you're doing, but an SKView is what's needed to make texture(from:) really sing. If you try using an SKView as the host for your texture-making experiments, it might well get you closer to the results you want. You may be doing this already; I'm commenting on the fact that there's no mention of the SKView hosting the node you're cropping, etc.
Did you ever get a resolution for this issue? It's 2019 and this bug is still present. If you use the SKScene itself as the node to crop and generate a texture from, the texture is always the same regardless of the origin of the crop CGRect. The workaround of creating a dedicated root node still works.
This is definitely still a bug.