This paper presents a novel image-variety-based approach that elegantly models the space of a broad class of perspective and non-perspective stereo varieties within a single, unified framework. The concept of the parameterized variety, introduced earlier by Genc and Ponce [1], is extended to represent the nonlinear space of images. An efficient algebraic framework is constructed to parameterize the variety associated with full perspective cameras. The algorithm seeks the manifolds that constrain this six-dimensional variety in order to generate compelling multi-perspective 3D effects from arbitrary virtual viewpoints. Combining the geometric space of multiple uncalibrated perspective views with the appearance space in a globally optimized manner enables numerous applications, especially in content creation for multi-perspective 3DTV. The proposed approach works for uncalibrated static and dynamic scenes containing parallax and unstructured object motion. It even deals seamlessly with images or video sequences that do not share a common origin, thus providing an effective tool for montaging, indexing, and virtual navigation.