In the recent past, owing to the boost in digitization, a huge volume of data (Big Data) related to buildings and architecture has been accumulated. The sources of such data include interviews, blogs, websites, etc. These descriptions contain details about the interior of a building; for humans it is easy to interpret them and imagine the structure and the arrangement of furniture. Automatic synthesis of real-world images from text descriptions has been explored in the computer vision community. However, there has been no such attempt in the area of document images, such as floor plans. Although floor plan synthesis from sketches, as well as data-driven models, has been proposed earlier, this is the first attempt to automatically render building floor plan images from textual descriptions. Here, the input is a natural language description of the internal structure and furniture arrangements within a house, and the output is the corresponding 2D floor plan image. We experimented on publicly available benchmark floor plan datasets and were able to render realistic synthesized floor plan images from descriptions written in English. © 2019 IEEE.