Pixels or points?

I have a collectionView with 16 cells of size 50 × 50 and a minimum spacing of 10.

The collection view's scroll view frame is (82.0, 118.0, 240.0, 193.0).


I set tags 0 to 15 on the cells.

I'm trying to compute the position of each cell when view has scrolled.


    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        for cell in collectionView.visibleCells {
            let point = cell.convert(cell.frame.origin, to: self.view)
            print("cell tag", cell.tag, "--> cell.frame.origin y", cell.frame.origin.y, "point y", point.y)
        }
    }


I get the following results (point y is the position relative to the view controller's view):


    cell tag 2 --> cell.frame.origin y 0.0 point y 118.0
    cell tag 4 --> cell.frame.origin y 60.0 point y 238.0
    cell tag 6 --> cell.frame.origin y 60.0 point y 238.0
    cell tag 8 --> cell.frame.origin y 120.0 point y 358.0
    cell tag 10 --> cell.frame.origin y 120.0 point y 358.0
    cell tag 12 --> cell.frame.origin y 180.0 point y 478.0
    cell tag 14 --> cell.frame.origin y 180.0 point y 478.0
    cell tag 1 --> cell.frame.origin y 0.0 point y 118.0
    cell tag 3 --> cell.frame.origin y 0.0 point y 118.0
    cell tag 5 --> cell.frame.origin y 60.0 point y 238.0
    cell tag 7 --> cell.frame.origin y 60.0 point y 238.0
    cell tag 9 --> cell.frame.origin y 120.0 point y 358.0
    cell tag 11 --> cell.frame.origin y 120.0 point y 358.0
    cell tag 13 --> cell.frame.origin y 180.0 point y 478.0
    cell tag 15 --> cell.frame.origin y 180.0 point y 478.0
    cell tag 0 --> cell.frame.origin y 0.0 point y 118.0


Cells 0, 1, 2, 3 seem OK: y 0.0, point y 118.0. 118.0 is the top of the scroll view, hence the top of the first row at this stage.


The second row should be y 60.0, point y 178.0: 178.0 = 118.0 + 50.0 (first row height) + 10.0 (vertical spacing).

But for cells 4, 5, 6, 7 I get y 60.0, point y 238.0: 60.0 is correct, but 50 + 10 has been multiplied by 2.


Similarly, for cells 8, 9, 10, 11, I should get y 120.0, point y 238.0: 238.0 = 118.0 + 2 × 50.0 (first two row heights) + 2 × 10.0 (twice the vertical spacing).

But I get cell.frame.origin y 120.0, point y 358.0: 120.0 is correct, but 120 (2 × 50 + 2 × 10) has been multiplied by 2.


Is it due to the Retina display, with cell.convert counting pixels instead of points?

Accepted Reply


I think you're off in the weeds here. The idea that some API might be "accidentally" returning pixels instead of points isn't plausible.


In fact, it looks like you have a pretty glaring bug in your code:


            let point = cell.convert(cell.frame.origin, to: self.view)


Assuming 'cell' is a subclass of UIView, its frame is expressed in the bounds coordinate system of its superview, but "cell.convert" expects a point in the bounds coordinate system of 'cell' itself. You should likely be using "cell.bounds.origin" instead.
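A minimal sketch of the corrected loop (assuming the same collectionView property as in the question). Both forms pass convert a point that is actually in the source view's own coordinate system, so the offset is no longer added twice:

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        for cell in collectionView.visibleCells {
            // bounds.origin is in the cell's own coordinate system,
            // which is what convert(_:to:) expects as its input.
            let point = cell.convert(cell.bounds.origin, to: self.view)

            // Equivalent: let the superview do the conversion, since
            // frame.origin is expressed in the superview's coordinates.
            let point2 = collectionView.convert(cell.frame.origin, to: self.view)

            print("cell tag", cell.tag, "point y", point.y, point2.y)
        }
    }

With this change the second row should print point y 178.0 (118.0 + 50.0 + 10.0), as you expected.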


The other thing to be careful of is that a UIView's "frame" property is meaningless if the view has a transform applied to it, and you may not be entirely in control of whether there is a transform or not. It's safest to avoid using the "frame" property at all, and instead use a combination of the "center" and "bounds" properties — keeping in mind that "center" is in frame coordinates (aka the superview's bounds coordinates), while "bounds" is in bounds coordinates of the view itself.
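For example, a transform-safe version of the same computation might look like this (a sketch; the top-left is derived from center and bounds rather than frame, so it stays meaningful when the cell itself is transformed):

    for cell in collectionView.visibleCells {
        guard let superview = cell.superview else { continue }
        // center is in the superview's coordinates; bounds gives the size.
        let topLeft = CGPoint(x: cell.center.x - cell.bounds.width / 2,
                              y: cell.center.y - cell.bounds.height / 2)
        // Convert from the superview, since topLeft is in its coordinates.
        let point = superview.convert(topLeft, to: self.view)
        print("cell tag", cell.tag, "point y", point.y)
    }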

Replies


Remember, pixels are counted, while points are measured.


>Is it due to Retina display


As a Retina display is simply a display with twice the pixel density (pixels per inch) of a non-Retina screen, I think yes, it is due to that.