Spike-Timing-Dependent Plasticity (STDP), the canonical learning rule for spiking neural networks (SNNs), is attracting tremendous interest because of its simplicity, efficiency, and biological plausibility. To date, however, multilayer feed-forward networks of spiking neurons are either only partially trained with STDP, pre-trained with traditional deep neural networks that are then converted to deep SNNs, or limited to two-layer networks in which STDP-learnt features are manually labelled. In this work, we present a low-cost, simplified, yet stable STDP rule for layer-wise unsupervised and supervised training of a multilayer feed-forward SNN. We propose to approximate the Bayesian neuron with a Stochastic Integrate-and-Fire (SIF) neuron model and introduce a supervised learning approach that uses teacher neurons to train the classification layer with one neuron per class. Using the proposed STDP rule, we train an SNN with multiple layers of spiking neurons, covering both the feature-extraction and classification layers, for classification of handwritten digits. On the MNIST dataset, our method achieves accuracy comparable to or better than that of manually labelled two-layer networks with the same hidden-layer size. We also analyze the parameter space to provide rationales for parameter fine-tuning, and we describe additional methods that improve resilience to noise and to variations in input intensity. Finally, we propose a Quantized 2-Power Shift (Q2PS) STDP rule, which reduces the implementation cost on digital hardware while achieving comparable performance.
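The abstract does not specify the Q2PS rule itself; as a minimal sketch of the general idea behind quantized power-of-two-shift updates (the function name, arguments, and exact update form below are illustrative assumptions, not the authors' rule), a multiplicative learning rate can be replaced by a bit shift, so digital hardware needs only shifters and adders rather than multipliers:

```python
def q2ps_update(w: int, lr_exp: int, potentiate: bool) -> int:
    """Hypothetical power-of-two-shift STDP weight update.

    Instead of scaling by a real-valued learning rate, the magnitude of
    the weight change is w / 2**lr_exp, computed as a right bit-shift of
    the integer weight. The sign is set by whether the pre/post spike
    timing calls for potentiation or depression.
    """
    delta = w >> lr_exp  # |delta_w| = w / 2**lr_exp, via a single shift
    return w + delta if potentiate else w - delta


# Example: with an integer weight of 64 and shift exponent 3,
# the update magnitude is 64 >> 3 = 8.
w_up = q2ps_update(64, 3, potentiate=True)    # 72
w_down = q2ps_update(64, 3, potentiate=False)  # 56
```

A shift-based update of this kind is a common hardware-friendly approximation of multiplicative plasticity; the actual Q2PS formulation is given in the body of the paper.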