Kernel matrices that encode the distance (or similarity) between data points are widely used throughout the computational sciences for classification, clustering, and dimensionality reduction. For large datasets, the cost of computing and factorizing such matrices becomes intractable. Thus, instead of operating on the entire matrix, approximate methods such as the Nyström method and the incomplete Cholesky decomposition (ICD) generate a low-rank matrix factorization using only a subset of the matrix columns (or rows). Here, we present an adaptive column sampling strategy for the Nyström method that we dub Accelerated Sequential Incoherence Selection (oASIS). This sampling strategy reveals a missing link between Nyström methods and ICD: we demonstrate that ICD is a special case of the Nyström method equipped with the oASIS adaptive sampling rule. Numerical experiments and theoretical results suggest that oASIS achieves performance comparable to state-of-the-art greedy Nyström methods, but with shorter runtimes and lower memory consumption.
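
The column-sampling idea underlying the Nyström method can be sketched as follows. This is a minimal illustration, not the paper's oASIS rule: it uses uniform random column sampling, and the function name `nystrom_approx` and the RBF-kernel test setup are illustrative assumptions.

```python
import numpy as np

def nystrom_approx(K, idx):
    """Nyström low-rank approximation of a kernel matrix K from a
    subset of its columns: K ~ C W^+ C^T, where C = K[:, idx] holds
    the sampled columns and W = K[idx][:, idx] their intersection."""
    C = K[:, idx]                     # n x m sampled columns
    W = C[idx, :]                     # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy example: RBF kernel on random points, uniform column sampling.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists)                 # 200 x 200 kernel matrix

idx = rng.choice(200, size=40, replace=False)  # sample 40 of 200 columns
K_hat = nystrom_approx(K, idx)

rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

Adaptive schemes such as oASIS replace the uniform choice of `idx` with a greedy rule that picks the next column based on how poorly the current approximation predicts it, so only the sampled columns ever need to be computed.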