Object tracking is a well-established research domain among computer vision scientists, yet relatively little work has addressed underwater scenarios. This article tackles the problem of visual tracking in the underwater environment with both stationary and nonstationary camera setups. To deal with underwater optical dynamics, a dominant color component-based scene representation is employed in the YCbCr color space. An adaptive approach is devised to select the Walsh-Hadamard (WH) kernels for the efficient extraction of color, edge, and texture strengths, and a new feature, called range strength, is proposed to extract the local-neighborhood intensity variation in underwater sequences using the WH kernel. The likelihoods of these feature strengths are integrated in a particle filter framework to track the object of interest in underwater sequences. The reference feature strengths used in assigning weights to the particles are updated based on the Sørensen distance. The coefficients of the feature strengths are computed such that if one feature fails, its coefficient becomes insignificant, whereas the more suitable features receive higher coefficients. The effectiveness of the proposed scheme is evaluated using the underwater video datasets reefVid, fish4knowledge (F4K), underwaterchangedetection (UWCD), and National Oceanic and Atmospheric Administration (NOAA), and its performance is compared with five recent state-of-the-art tracking schemes. The quantitative analysis of the proposed scheme is carried out using three evaluation measures: overall intersection over union, centroid location error, and average tracking error. The performance of the proposed scheme is quite encouraging on sequences posing haze and degradation, partial occlusion, and camouflage challenges. © 2005-2012 IEEE.
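To make the distance-driven weighting concrete, the sketch below shows the Sørensen (Bray-Curtis) distance between two feature-strength vectors and one plausible way to turn per-feature distances into normalized coefficients so that a failing feature's coefficient shrinks toward zero. The `feature_coefficients` rule is a hypothetical illustration, not the paper's exact formula; only the Sørensen distance definition is standard.

```python
import numpy as np

def sorensen_distance(p, q):
    """Sørensen (Bray-Curtis) distance between two non-negative
    feature-strength vectors: 0 means identical, 1 means disjoint."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.abs(p - q).sum() / (p + q).sum()

def feature_coefficients(distances):
    """Hypothetical adaptive weighting: a feature whose current
    strength stays close to its reference (small distance) gets a
    large coefficient; a failing feature (distance near 1) gets a
    coefficient near zero. Coefficients are normalized to sum to 1."""
    d = np.asarray(distances, dtype=float)
    similarity = 1.0 - d
    return similarity / similarity.sum()
```

For example, if the color feature drifts badly (distance 0.8) while texture stays stable (distance 0.1), the texture coefficient dominates the fused particle likelihood.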