I need to tell whether two images differ only in unimportant details.
For example, one image may be only slightly different from another because:
- it is a little stretched,
- or it contains a 99% portion of the other (only missing some pixels at the borders).

Is there an efficient algorithm to achieve this? Any suggestions are welcome.
Your criteria are a bit fuzzy:
- If only the borders are missing, nearly all pixels are the same. In that case, an algorithm like this could work: https://stackoverflow.com/questions/6488732/how-does-one-compare-one-image-to-another-to-see-if-they-are-similar-by-a-certai
- But if the image is stretched, all pixels will differ. This approach could handle it: https://stackoverflow.com/questions/71615277/image-similarity-in-swift
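For the first case, a minimal sketch of the "nearly all pixels the same" check, assuming grayscale images stored as 2-D lists of 0–255 values (the function name and the tolerance are my own choices, not from the linked answers):

```python
def fraction_identical(img_a, img_b, tol=5):
    """Fraction of same-position pixels whose values differ by at most tol.

    Assumes both images have the same dimensions; tol absorbs small
    compression noise. Returns a value in [0, 1].
    """
    same = total = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if abs(pa - pb) <= tol:
                same += 1
    return same / total
```

A value very close to 1.0 would suggest the images are the same apart from a few border pixels; this check fails as soon as any stretching shifts the pixel grid.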
I would also try to compute a similarity distance:
- split the images into small squares (20×20 for instance), which gives a few hundred areas;
- compute the average color of each area;
- set thresholds: similarity is the percentage of squares whose average colors differ by less than 10%; if similarity > 90%, the images are "similar". Tune the 10% and 90% values to suit your needs.
- A more sophisticated variant would compare the pixel histograms of the squares: https://developer.apple.com/forums/thread/725550?answerId=746000022#746000022
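The steps above can be sketched like this, assuming grayscale images as 2-D lists of 0–255 values; the function names, the 10% color tolerance, and the 90% pass ratio are the tunable values mentioned above, not a fixed recipe:

```python
def block_averages(img, block=20):
    """Average value of each block x block tile (partial edge tiles included)."""
    h, w = len(img), len(img[0])
    means = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = [img[r][c]
                    for r in range(y, min(y + block, h))
                    for c in range(x, min(x + block, w))]
            means.append(sum(tile) / len(tile))
    return means

def similar(img_a, img_b, block=20, color_tol=0.10, pass_ratio=0.90):
    """True if enough tiles have close average values.

    A tile pair "matches" when its averages differ by less than
    color_tol * 255; the images count as similar when at least
    pass_ratio of the tiles match. Both thresholds are adjustable.
    """
    a = block_averages(img_a, block)
    b = block_averages(img_b, block)
    if len(a) != len(b):
        return False  # different grids; resize to a common size first
    close = sum(1 for ma, mb in zip(a, b) if abs(ma - mb) < color_tol * 255)
    return close / len(a) >= pass_ratio
```

Because each tile is reduced to one average, small stretches and missing border pixels barely move the per-tile means, which is what makes this more tolerant than a pixel-by-pixel comparison.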
Hope that helps