I have two released apps -- ReefScan and ReefBuild -- based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated photogrammetry session requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of PhotogrammetrySession been broken by this update? It worked very well previously, so I would hate to see a regression to requiring LiDAR. Most of my users do not have it.
To clarify: the "LiDAR capabilities" requirement refers to both the camera and the SoC (primarily the GPU).
Put another way, only iOS devices with a sufficiently powerful GPU can run a contemporary PhotogrammetrySession. Those iOS devices also happen to have a LiDAR camera.
We agree that LiDAR scanning isn't strictly required for photogrammetry in general, but we have found that it significantly improves results and have therefore made it a requirement for contemporary RealityKit-based object capture.
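Given that support varies by device, a practical approach is to gate object-capture features on a runtime check rather than assuming any particular hardware. A minimal sketch, using RealityKit's `PhotogrammetrySession.isSupported` property (the fallback branch is an assumption about your app's design, not part of the API):

```swift
import RealityKit

// Check at runtime whether this device meets RealityKit's
// photogrammetry hardware requirements before offering the feature.
if PhotogrammetrySession.isSupported {
    // Safe to create a session and reconstruct a model on-device.
    print("On-device photogrammetry is supported.")
} else {
    // Hypothetical fallback: e.g. collect images here and send them
    // to a Mac or server for reconstruction instead.
    print("On-device photogrammetry is not supported; using fallback.")
}
```

Checking this property up front lets the same codebase ship to non-LiDAR devices while disabling only the on-device reconstruction path.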