Exponential growth in the Internet of Things (IoT) in recent times has resulted in the generation of huge amounts of sensory data on a daily basis. In the context of smart building monitoring applications, the inherent heterogeneity and dynamics of the system, along with a wide range of quality-of-service requirements across the running applications, make efficient data management challenging. Offline analysis of such large volumes of noisy and redundant data for learning and inference is impractical given the huge overhead it incurs. In this work, we motivate the need for context-adaptive sensing and in-network fusion of sensory data in smart building monitoring applications, towards faster inference and thus more efficient decision-making for automation. To the best of our knowledge, this work is the first to exploit in-network fusion in smart building monitoring systems. We validate the proposed scheme by deploying a campus-scale sensor network, performing in-network data fusion using a Kalman filter, and making the system context-adaptive. © 2019 IEEE.
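To make the fusion step concrete, the following is a minimal sketch of scalar Kalman filtering applied to a noisy sensor stream, of the kind an in-network fusion node might run before forwarding data. The noise variances, readings, and function name are illustrative assumptions, not details taken from the paper's deployment.

```python
def kalman_filter(measurements, q=1e-3, r=0.25, x0=None, p0=1.0):
    """Smooth a sequence of scalar sensor readings with a 1-D Kalman filter.

    q  -- process-noise variance (how fast the true value is assumed to drift)
    r  -- measurement-noise variance (sensor noise); both values are assumed here
    x0 -- initial state estimate (defaults to the first measurement)
    p0 -- initial estimate variance
    """
    x = measurements[0] if x0 is None else x0
    p = p0
    estimates = []
    for z in measurements:
        # Predict: the state model is a constant, so only uncertainty grows.
        p = p + q
        # Update: blend the prediction with the new measurement z.
        k = p / (p + r)          # Kalman gain in [0, 1]
        x = x + k * (z - x)      # convex combination of estimate and reading
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Example: noisy temperature readings (in degrees Celsius) around 22
readings = [22.4, 21.7, 22.9, 21.5, 22.2, 23.0, 21.8]
smoothed = kalman_filter(readings)
```

Because each update is a convex combination of the running estimate and the new reading, the filtered stream stays within the range of the raw data while damping sensor noise, which is what makes it attractive for lightweight in-network fusion on resource-constrained nodes.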