The ability of neural networks to perform pattern recognition, classification, and associative memory is essential to applications such as image and speech recognition, natural language understanding, and decision making. In spiking neural networks (SNNs), information is encoded as sparsely distributed trains of spikes, which enables learning through the spike-timing-dependent plasticity (STDP) mechanism. SNNs can potentially achieve very large-scale implementation and distributed learning due to their inherently asynchronous and sparse inter-neuron communication. In this work, we develop an efficient, scalable, and flexible SNN simulator that supports STDP learning. The simulator is intended for biologically inspired neuron models used for computation rather than for biologically realistic models. We introduce a Bayesian neuron model for SNNs that is capable of online and fully distributed STDP learning. The function of the simulator is validated using two networks representing two different applications: unsupervised feature extraction and inference-based sentence construction.
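To make the STDP mechanism mentioned above concrete, the following is a minimal sketch of the classic pair-based STDP rule, not the Bayesian neuron model developed in this work; the function name and the time-constant and amplitude parameters are illustrative assumptions.

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change (illustrative parameters, in ms).

    delta_t = t_post - t_pre: if the presynaptic spike precedes the
    postsynaptic spike (delta_t > 0), the synapse is potentiated;
    otherwise it is depressed. The magnitude decays exponentially
    with the spike-time difference.
    """
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)
```

In such a rule, causal pre-before-post pairings strengthen a synapse while anti-causal pairings weaken it, which is what allows learning to proceed locally and online from spike timing alone.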