The conjugate gradient method is widely used to solve symmetric positive definite linear systems in place of direct methods based on Gaussian elimination, especially for very large systems and when a parallel solution environment is preferred. The method itself can take many iterations to converge; to reduce the number of iterations, i.e., to accelerate the rate of convergence, the linear system at hand is preconditioned. We present implementation results for a previously published approximate inverse preconditioner for the conjugate gradient method. The preconditioner is based on an approximate inverse of the coefficient matrix obtained as a linear combination of matrix-valued Chebyshev polynomials. We implement and test the proposed method on a Sun SMP machine. Because the preconditioner itself consists mainly of matrix-matrix products and the conjugate gradient method consists mostly of matrix-vector products, convincing results are obtained in terms of both speedup and scalability. (c) 2006 Elsevier B.V. All rights reserved.
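As a rough illustration of the idea described above, the following sketch builds an explicit approximate inverse M ≈ A⁻¹ from a truncated Chebyshev expansion of f(x) = 1/x over an interval containing the spectrum of A, and then uses it inside a standard preconditioned conjugate gradient loop. This is a minimal NumPy sketch under assumed details (truncation degree, coefficient computation via Chebyshev nodes, and the function names `chebyshev_inverse` and `pcg` are all our own illustration), not the paper's exact scheme; it does show the structural point made in the abstract: constructing M involves mainly matrix-matrix products, while applying it inside conjugate gradient needs only matrix-vector products.

```python
import numpy as np

def chebyshev_inverse(A, lam_min, lam_max, degree=8):
    """Approximate inv(A) by a truncated Chebyshev series of f(x) = 1/x
    on [lam_min, lam_max] (an illustrative sketch, not the paper's exact
    construction).  Built from matrix-matrix products only."""
    n = A.shape[0]
    # Affine map taking [lam_min, lam_max] to [-1, 1].
    c = (lam_max + lam_min) / 2.0
    h = (lam_max - lam_min) / 2.0
    # Chebyshev coefficients of 1/x from function values at Chebyshev nodes.
    k = degree + 1
    theta = np.pi * (np.arange(k) + 0.5) / k
    x = np.cos(theta)                        # nodes in [-1, 1]
    f = 1.0 / (c + h * x)                    # 1/x sampled on the spectrum
    coef = np.array([2.0 / k * np.sum(f * np.cos(j * theta)) for j in range(k)])
    coef[0] /= 2.0
    # Accumulate M = sum_j coef[j] * T_j(S), with S = (A - c I)/h and the
    # matrix-valued three-term recurrence T_{j+1} = 2 S T_j - T_{j-1}.
    I = np.eye(n)
    S = (A - c * I) / h
    T_prev, T_cur = I, S
    M = coef[0] * I + coef[1] * T_cur
    for j in range(2, k):
        T_next = 2.0 * S @ T_cur - T_prev    # matrix-matrix product
        M += coef[j] * T_next
        T_prev, T_cur = T_cur, T_next
    return M

def pcg(A, b, M, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradient with an explicit approximate
    inverse M: the preconditioning step is a single matrix-vector product."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M @ r
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p                           # matrix-vector product
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it + 1
        z = M @ r                            # apply the preconditioner
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit
```

Since every step of both routines is a dense matrix-matrix or matrix-vector product, both phases map naturally onto a shared-memory parallel machine of the kind the abstract reports results for.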