We propose an efficient multiscale image disparity estimation algorithm that estimates the local translations needed to align different regions in two images. The algorithm is based on the dual-tree quaternion wavelet transform (QWT). Each QWT coefficient features a magnitude and three phase angles; we exploit the fact that two of the phase angles are covariant with image shifts in the horizontal and vertical directions. By fusing phase information across multiple scales, we combat the phase unwrapping problem that has traditionally plagued phase-based disparity estimation techniques. The result is a linear-time algorithm that provides reliable estimates of both small and large image shifts. The algorithm is completely image-based; that is, it involves no extraction of feature points or other landmarks. It can be used as a front-end for image processing and computer vision tasks such as image registration, motion estimation, and stereo matching. We present results on a real-world image sequence.
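The phase-shift covariance underlying the method can be illustrated in one dimension with the ordinary Fourier transform, whose phase behaves analogously to the QWT phase angles under translation: a shift of $d$ samples rotates the phase of frequency bin $k$ by $2\pi kd/N$. The following is a minimal sketch of that principle, not the paper's algorithm; the signal, shift, and frequency bin are illustrative choices. It also hints at the unwrapping problem: a coarse (low-frequency) bin yields an unambiguous estimate, whereas at high frequencies the phase wraps for large shifts.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
d = 5                                   # true shift, in samples (illustrative)
f = rng.standard_normal(N)
f = np.convolve(f, np.ones(8) / 8, mode="same")  # smooth the test signal
g = np.roll(f, d)                       # circularly shifted copy: g[n] = f[n - d]

F = np.fft.fft(f)
G = np.fft.fft(g)

# Shift theorem: G[k] = F[k] * exp(-i 2*pi*k*d/N), so the phase of
# F[k] * conj(G[k]) equals 2*pi*k*d/N, from which d can be recovered.
k = 1                                   # coarse bin: phase stays in (-pi, pi] for |d| < N/2
dphi = np.angle(F[k] * np.conj(G[k]))
d_est = dphi * N / (2 * np.pi * k)
print(d_est)                            # recovers the shift d
```

At a fine-scale bin (large `k`), the same formula fails once `2*pi*k*d/N` exceeds `pi`, because `np.angle` wraps into `(-pi, pi]`; fusing estimates across scales, as the abstract describes, resolves this ambiguity while retaining fine-scale precision.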