Find tie points

This tool locates matching points between two overlapping images using the Speeded Up Robust Features (SURF) algorithm (Bay et al., 2008). The SURF algorithm identifies key pixels in an image that are likely to remain identifiable after various radiometric and geometric distortions. The matched points that are identified using this tool can later be used as inputs to an image-to-image registration operation.

The SURF parameters include a threshold value, the number of octaves, and a matching threshold. A lower SURF threshold value will result in a greater number of SURF points being located in each of the two images, and vice versa. The number of identified points is relatively insensitive to the number of octaves, and the default value of 4 is reasonable for most image pairs. The matching threshold is used during the SURF point matching process, i.e. the process of finding potential pairs of corresponding points between the left and right images. It determines whether two SURF points are similar enough in character to be considered corresponding points. The matching process operates as follows. For each SURF point in the left image, the Euclidean distance in attribute space is calculated to every right-image SURF point. When the Euclidean distance of the best-fitting pair (i.e. the smallest attribute distance between left-image and right-image points) divided by the distance of the second-best-fitting pair is less than the matching threshold, the pair is considered a left/right correspondence. Thus, a lower matching threshold applies a more rigorous matching criterion, resulting in fewer corresponding points being located, and vice versa.
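The ratio-test matching described above can be sketched as follows. This is a minimal illustration only, assuming descriptors are supplied as NumPy arrays; the function name and signature are hypothetical and are not part of the plugin's API.

```python
import numpy as np

def match_descriptors(left_desc, right_desc, matching_threshold=0.6):
    """Illustrative sketch of the ratio-test matching described above.

    left_desc, right_desc: 2-D arrays of SURF descriptors, one row per point.
    Returns a list of (left_index, right_index) correspondences.
    """
    matches = []
    if len(right_desc) < 2:
        return matches  # the ratio test needs at least two candidates
    for i, d in enumerate(left_desc):
        # Euclidean distance in attribute space to every right-image point
        dists = np.linalg.norm(right_desc - d, axis=1)
        order = np.argsort(dists)
        best, second = dists[order[0]], dists[order[1]]
        # Accept only if the best match is clearly better than the runner-up
        if second > 0 and best / second < matching_threshold:
            matches.append((i, int(order[0])))
    return matches
```

A lower `matching_threshold` demands that the best candidate be proportionally much closer than the second-best, which is why fewer (but more reliable) correspondences are returned.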

While the above process locates potential corresponding points between the left and right images based on their SURF attributes, the matched point set may still contain outliers that are not true correspondences. To help remove these outlier points, the algorithm calculates a two-dimensional polynomial rectification transformation model, which provides the spatial mapping of one image's SURF points onto the second image's points. Point matches that are not true correspondences will appear as outliers in this transformation model. The user can select the polynomial order of the transformation model and the maximum allowable error value (in pixels), which determines which point matches are removed. Note that 1) a polynomial order greater than 2 is not generally advisable, and 2) this method for removing outlier matching points does not work well where there is extensive topographic relief or there are off-terrain objects (e.g. buildings) in the scene.
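The outlier-removal step can be sketched for the first-order case as a least-squares fit followed by a residual test. This is a simplified illustration under assumed inputs, not the plugin's actual implementation; the function name is hypothetical.

```python
import numpy as np

def remove_outliers(left_pts, right_pts, max_error=2.0):
    """Sketch of outlier screening with a first-order polynomial model.

    left_pts, right_pts: (n, 2) arrays of matched pixel coordinates.
    Returns a boolean mask of matches whose residual is within
    max_error pixels of the fitted transformation.
    """
    # Design matrix for a first-order polynomial: x' = a0 + a1*x + a2*y
    A = np.column_stack([np.ones(len(left_pts)), left_pts[:, 0], left_pts[:, 1]])
    # Fit the x' and y' mappings simultaneously by least squares
    coeffs, *_ = np.linalg.lstsq(A, right_pts, rcond=None)
    predicted = A @ coeffs
    # Residual distance (in pixels) between predicted and observed positions
    residuals = np.linalg.norm(predicted - right_pts, axis=1)
    return residuals <= max_error
```

Because a gross outlier also distorts the fitted model, practical implementations often refit after removal; this single-pass sketch only conveys the idea of using model residuals, compared against the maximum allowable error, to screen matches.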

See Also:

Scripting:

The following is an example of a Python script that uses this tool:

wd = pluginHost.getWorkingDirectory()
leftInputFile = wd + "leftImage.dep"
rightInputFile = wd + "rightImage.dep"
leftOutputFile = wd + "leftPoints.shp"
rightOutputFile = wd + "rightPoints.shp"
threshold = "4.0"
octaves = "4"
matchingValue = "0.6"
removalThreshold = "2.0"
polyOrder = "1"
args = [leftInputFile, rightInputFile, leftOutputFile, rightOutputFile, threshold, octaves, matchingValue, removalThreshold, polyOrder]
pluginHost.runPlugin("FindTiePoints", args, False)

This is a Groovy script also using this tool:

def wd = pluginHost.getWorkingDirectory()
def leftInputFile = wd + "leftImage.dep"
def rightInputFile = wd + "rightImage.dep"
def leftOutputFile = wd + "leftPoints.shp"
def rightOutputFile = wd + "rightPoints.shp"
def threshold = "5.0"
def octaves = "4"
def matchingValue = "0.6"
def removalThreshold = "2.0"
def polyOrder = "1"
args = [leftInputFile, rightInputFile, leftOutputFile, rightOutputFile, threshold, octaves, matchingValue, removalThreshold, polyOrder]
pluginHost.runPlugin("FindTiePoints", args, false)

Credits: