Interferometry is a method of determining the angle of arrival of a scattered signal at a radar receiver. Normally we would only be able to determine the range of a target; for a monostatic radar, each measured range defines a circle of possible target locations centered on the radar. (For a bistatic radar, the locus of possible locations of a target at one specific range forms an ellipse, with the transmitter and receiver at its foci.) Furthermore, the "range" is actually the slant range, measured from the radar on the ground to the target (generally in the sky), so there is additional uncertainty about the elevation of the target. We would like to be able to pinpoint targets in all three dimensions; this is where the interferometry technique comes in.
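The slant-range ambiguity above can be checked with a few lines of arithmetic. This is a minimal sketch with made-up coordinates: two targets at very different heights sit at exactly the same slant range from the radar, so a range measurement alone cannot distinguish them.

```python
import numpy as np

# Illustrative coordinates in metres: x is horizontal distance, z is height.
radar = np.array([0.0, 0.0, 0.0])
low_target = np.array([5000.0, 0.0, 0.0])      # on the ground, 5 km out
high_target = np.array([3000.0, 0.0, 4000.0])  # 3 km out, 4 km up

# Slant range is just the straight-line distance from radar to target.
r_low = np.linalg.norm(low_target - radar)
r_high = np.linalg.norm(high_target - radar)
print(r_low, r_high)  # both 5000.0: identical slant range, different positions
```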

By receiving the scattered signal at two different antennas with a known physical separation, one can apply signal processing techniques (for example, cross-correlating the two received signals) to determine the difference in phase between them. (Phase can be thought of as something like a time or distance delay; a working knowledge of complex numbers helps in understanding it fully.)
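One common way to extract that phase difference, sketched below with simulated data, is to conjugate-multiply the two complex received channels and take the angle of the averaged product: the common signal content cancels, leaving the inter-channel phase. All signal and noise parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
true_phase = 0.7  # radians; the phase offset we want to recover

t = np.arange(n)
signal = np.exp(1j * 0.2 * t)  # common scattered signal seen by both antennas
noise1 = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
noise2 = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

rx1 = signal + noise1
rx2 = signal * np.exp(1j * true_phase) + noise2  # same signal, phase-shifted

# Averaging rx2 * conj(rx1) cancels the common modulation; the angle of
# the average is the estimated phase difference between the channels.
est_phase = np.angle(np.mean(rx2 * np.conj(rx1)))
print(est_phase)  # close to 0.7
```

Averaging before taking the angle, rather than averaging per-sample angles, keeps the estimate well behaved when individual samples are noisy.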

Once the phase difference between the two receiving antennas has been determined, the geometry may be "folded in" via equations relating the phase difference and the antennas' vertical and horizontal separation to the signal's angle of arrival. For a baseline of length d, a plane wave arriving at an angle theta off broadside travels an extra distance d sin(theta) to reach the farther antenna, producing a phase difference of 2*pi*d*sin(theta)/lambda, where lambda is the radar wavelength.
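The phase-to-angle relation can be inverted directly. This sketch assumes an illustrative 5 GHz radar and a half-wavelength baseline (a common choice, since it keeps the arcsine unambiguous over a full 180 degrees of arrival angles):

```python
import numpy as np

wavelength = 0.06       # metres; assumed 5 GHz radar
d = 0.5 * wavelength    # half-wavelength baseline avoids phase ambiguity

def angle_of_arrival(phase_diff, d, wavelength):
    """Invert the interferometer equation for the arrival angle (radians)."""
    return np.arcsin(phase_diff * wavelength / (2 * np.pi * d))

# Forward check: a 30-degree arrival angle produces this phase difference...
theta = np.deg2rad(30.0)
phase = 2 * np.pi * d * np.sin(theta) / wavelength
# ...and the inverse relation recovers the angle.
print(np.rad2deg(angle_of_arrival(phase, d, wavelength)))  # 30.0
```

With a baseline longer than half a wavelength, the measured phase wraps modulo 2*pi and several arrival angles become consistent with it, which is why real systems must resolve that ambiguity separately.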

Depending on the orientation of the two antennas (in particular, whether the baseline runs north-south or east-west) and on the antennas' beam patterns, the resulting angle of arrival may represent either azimuth (picture a clock face on the ground: its hands sweep out an azimuth angle) or elevation. With both a vertical and a horizontal interferometer, a target's range, azimuth angle, and elevation angle may all be determined, thus pinpointing its location in space.
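Putting the three measurements together is then ordinary spherical-to-Cartesian geometry. This sketch assumes conventions not stated in the text: x east, y north, z up, with azimuth measured clockwise from north and elevation measured up from the horizon.

```python
import numpy as np

def locate(slant_range, azimuth_deg, elevation_deg):
    """Convert range plus azimuth/elevation angles to Cartesian position."""
    az = np.deg2rad(azimuth_deg)
    el = np.deg2rad(elevation_deg)
    ground = slant_range * np.cos(el)  # projection onto the ground plane
    x = ground * np.sin(az)            # east
    y = ground * np.cos(az)            # north
    z = slant_range * np.sin(el)       # height above the radar
    return x, y, z

# A target 10 km out in slant range, due east, 30 degrees above the horizon:
x, y, z = locate(10_000.0, 90.0, 30.0)
print(round(x), round(y), round(z))  # 8660 0 5000
```

Note that the horizontal distance to the target (about 8.66 km here) is shorter than the 10 km slant range, which is exactly the elevation ambiguity the interferometer resolves.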