The real world is a medley of multisensory information, and so are our experiences, memories, and responses. As embodied beings, we respond to this information endogenously, in ways shaped by self-defining factors. Thus inspired, we attempt a formalization of data structures that facilitate the generation of system-bespoke comprehension granules of the real world. The conceptualized structures encapsulate multisensory inputs (sourced from the real world or from memories), intrinsic and deliberate emotions, messages (bearing intermittent process results, queries, multimodal data, etc.) exchanged across system modules and memory units, and sensorimotor responses to the inputs. The structural schematics are anthropomorphic. These variable-length constructs are theoretically platform-independent, support genericity across data modalities and information inclusion, and provide for the representation of novel sensory data. An epigenome-styled header node on the afferent data units provides for the activation of intuitive 'fight-or-flight' behavior. The documentation includes a flow graph depicting the translation of information across the data structures and the different ways of thinking involved in interpreting a real-world scene or a mind-generated event. The applicability of the structures is analyzed in the context of comprehension in an embodied mind-machine framework and other similar architectures. The studies herein target contributions to the design of generally intelligent man-machine symbiotic systems. © 2016 IEEE.
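A minimal sketch of the kind of variable-length construct described above is given below. All class and field names are illustrative assumptions for exposition, not the paper's actual schema; the sketch only shows how an epigenome-styled header on an afferent data unit could gate a reflexive 'fight-or-flight' response while the granule carries multisensory inputs, emotions, inter-module messages, and sensorimotor responses.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

# Illustrative sketch only: names and fields are assumptions,
# not the schema proposed in the paper.

@dataclass
class EpigenomeHeader:
    # Header node on an afferent data unit; a flag here can trigger
    # an intuitive 'fight-or-flight' path before full comprehension.
    source: str               # e.g. "real-world" or "memory"
    threat_flag: bool = False

@dataclass
class ComprehensionGranule:
    # Variable-length, modality-generic container for one
    # system-bespoke comprehension granule.
    header: EpigenomeHeader
    sensory_inputs: Dict[str, Any] = field(default_factory=dict)  # modality -> data
    emotions: Dict[str, float] = field(default_factory=dict)      # intrinsic/deliberate
    messages: List[Any] = field(default_factory=list)             # inter-module messages
    responses: List[Any] = field(default_factory=list)            # sensorimotor responses

# Hypothetical usage: an alarming real-world scene.
g = ComprehensionGranule(
    header=EpigenomeHeader(source="real-world", threat_flag=True),
    sensory_inputs={"vision": "looming-object", "audio": "loud-noise"},
    emotions={"fear": 0.9},
)
# The header's threat flag short-circuits deliberation with a reflexive response.
if g.header.threat_flag:
    g.responses.append("flee")
```

The design choice illustrated is that the header travels with the data unit itself, so a reflexive response can be activated from the header alone, without inspecting the full multisensory payload.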