Promoting visual diversity in image search results has been of immense interest in the MIR (multimedia information retrieval) community. However, most of the work in this space, and the test collections used therein, has ignored social tags (user-generated metadata) for image search result diversification on social multimedia platforms such as Flickr and Picasa. Unlike traditional multimedia content, content generated on such social media platforms is usually annotated with a rich set of explicit and implicit human-generated metadata (referred to here as social tags), such as keywords, textual descriptions, category information, the author's profile, and user-to-user and user-to-content interactions, which can be useful for the image search result diversification task. In this paper we demonstrate how existing image search result diversification methods can be extended to incorporate social tag information. Experiments on a real-world dataset show that incorporating social tag features into some popular diversification algorithms yields improvements over the baselines. © 2016 IEEE.