Interpreting images to infer properties of the 3D world is a central problem in computer vision, and computer vision applications can therefore help people who require assistance. This paper presents a novel stereo-vision-based perception and navigation approach to assist visually impaired people. Frontal-view images of stores in a shopping mall are first searched for logo recognition. Distances to the detected logos (store signboards) are then estimated by stereo matching. Both logo recognition and stereo matching rely on local image features (keypoint descriptors) computed with the Speeded Up Robust Features (SURF) algorithm. The final refined distance is obtained by statistically filtering and averaging the individual keypoint distances derived from the matched keypoint pairs. Experimental results on our self-generated stereo dataset of 28 storefront images captured from various distances and viewpoints demonstrate the performance of the proposed approach.
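The final refinement step, combining per-keypoint stereo distances into a single estimate, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the triangulation formula Z = f·B/d is standard for rectified stereo, but the median-absolute-deviation outlier rule and the focal length and baseline values are illustrative assumptions, since the abstract does not specify the exact statistical filter.

```python
import numpy as np

def keypoint_distances(disparities, focal_px, baseline_m):
    """Triangulate a distance Z = f*B/d for each matched keypoint pair
    (assumes a rectified stereo rig)."""
    d = np.asarray(disparities, dtype=float)
    d = d[d > 0]  # a valid match must have positive disparity
    return focal_px * baseline_m / d

def refined_distance(distances, k=3.0):
    """Statistically filter individual keypoint distances, then average.

    Uses a median-absolute-deviation (MAD) rule to reject mismatched
    keypoint pairs; the paper's exact filter may differ.
    """
    z = np.asarray(distances, dtype=float)
    med = np.median(z)
    mad = np.median(np.abs(z - med))
    inliers = z[np.abs(z - med) <= k * mad] if mad > 0 else z
    return inliers.mean()

# Hypothetical rig: 700 px focal length, 12 cm baseline.
disp = [42.0, 41.5, 42.3, 41.8, 12.0]  # last value is a mismatched pair
z = keypoint_distances(disp, focal_px=700.0, baseline_m=0.12)
print(round(refined_distance(z), 2))  # → 2.0 (metres)
```

The MAD rule is chosen over a mean/standard-deviation cutoff because a single gross mismatch inflates the standard deviation enough to survive its own threshold, whereas the median-based statistics remain stable.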