With the rapid growth of smartphone technology, there is a need to provide secure access to critical data through personal authentication. Existing access mechanisms such as PINs and passwords are vulnerable to shoulder-surfing attacks. Biometric modalities such as fingerprints are being explored in current smartphones as a more secure authentication mechanism. However, fingerprint capture typically requires a dedicated sensor, which adds to the cost of the device and denies such authentication services to existing smartphones. A fingerphoto image captured with the rear camera offers an inexpensive alternative that requires no dedicated sensor. Unlike fingerprints, however, fingerphoto images are captured in uncontrolled environments, including outdoor conditions; fingerphoto matching is therefore prone to challenges such as varying environmental illumination and surrounding background. We propose a novel end-to-end fingerphoto matching pipeline informed by a study of the effect of different environmental conditions on fingerphoto matching. The pipeline comprises the following major contributions: (i) a segmentation technique that extracts the fingerphoto region of interest from varying backgrounds, (ii) an enhancement module that neutralizes illumination imbalance and increases ridge-valley contrast, (iii) a scattering-network-based fingerphoto representation that addresses pose variations, yielding features invariant to geometric transformations, and (iv) a learning-based matching technique that accommodates the large variations occurring in fingerphoto images. To experimentally study challenging conditions such as background and illumination, we create a publicly available fingerphoto dataset, IIITD SmartPhone Fingerphoto Database v1, along with the corresponding live-scan fingerprints.
Experiments performed on this dataset show that the proposed matching pipeline improves performance compared with some of the existing approaches. © 2017 Elsevier Ltd. All rights reserved.