Multi-view stereo, novel view synthesis, and high dynamic range (HDR) imaging are three pertinent areas of concern for high-quality 3D view generation. This paper presents a novel parameterized variety-based model that integrates these different domains into one common framework, with the goal of accommodating multi-view stereo for multiple-exposure input views and rendering photo-realistic HDR images from arbitrary virtual viewpoints for high-quality 3D reconstruction. We extend the parameterized variety approach for rendering presented earlier by Genc and Ponce to handle full perspective cameras. An efficient algebraic framework is proposed to construct an explicit parameterization of the space of all multi-view, multi-exposed images. This characterization of differently exposed views allows the simultaneous recovery of artifact-free HDR images and reliable depth maps from arbitrary camera viewpoints. A high-quality, HDR-textured 3D model of the scene is obtained using these images and the recovered geometry information. © 2012 IEEE.