Again, I’m not a machine learner, but I feel there is a misunderstanding about the role of STDP. It is not a learning rule in the sense that backpropagation is: simply presenting some stimuli and applying STDP to the connections onto a target neuron will not make those connections learn the stimulus class. If you present several different stimuli and simply add more output neurons and connections, you will end up with several output neurons that all do the same thing, since they receive exactly the same input and there is no reason for them to do anything else. To do learning in the classification sense, you have to add additional mechanisms, such as competition between the output neurons (which would still be unsupervised learning, i.e. you have to match neurons to your classes afterwards), or a teaching signal, i.e. forcing the output neurons to spike when you want them to.
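To make the symmetry argument concrete, here is a toy NumPy sketch (my own construction, with made-up parameters, not any particular reference model): two leaky integrate-and-fire output neurons with pair-based STDP receive the exact same Poisson input, and since they also start from the same weights, nothing in plain STDP ever breaks the symmetry between them:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 50, 2          # two output neurons, identical input to both
T, dt = 1000, 1.0            # simulate 1000 ms at 1 ms steps
tau = 20.0                   # shared time constant (ms) for traces and membrane
a_plus, a_minus = 0.01, 0.012

# One shared set of Poisson input spikes (~20 Hz per input neuron)
in_spikes = rng.random((T, n_in)) < 0.02

# Both neurons start from the SAME weights; with identical input and
# identical dynamics, their weight updates are identical at every step.
w = np.full((n_out, n_in), 0.5)
x_pre = np.zeros(n_in)       # presynaptic STDP traces
x_post = np.zeros(n_out)     # postsynaptic STDP traces
v = np.zeros(n_out)          # membrane potentials
v_th = 1.0

for t in range(T):
    x_pre *= np.exp(-dt / tau)
    x_post *= np.exp(-dt / tau)
    x_pre[in_spikes[t]] += 1.0

    v = v * np.exp(-dt / tau) + 0.1 * (w @ in_spikes[t])
    out_spikes = v >= v_th
    v[out_spikes] = 0.0
    x_post[out_spikes] += 1.0

    # Pair-based STDP: potentiate on post spikes (pre-before-post),
    # depress on pre spikes (post-before-pre)
    w[out_spikes] += a_plus * x_pre
    w -= a_minus * np.outer(x_post, in_spikes[t].astype(float))
    np.clip(w, 0.0, 1.0, out=w)

# Without any competition, both output neurons learned the same weights
print(np.allclose(w[0], w[1]))  # True
```

This is exactly why something extra (lateral inhibition between the outputs, or a supervised teaching signal) is needed to make the neurons specialize.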
To emphasize my point: there are many simple machine learning data sets that can easily be learned by an MLP + backpropagation (or similar approaches), such as MNIST or Iris. Learning these kinds of tasks with a spiking neural network is far from trivial: simply converting the inputs into Poisson spike trains and feeding them to spiking neurons with STDP will not work. Doing machine learning with SNNs is an area of active research, and STDP can certainly be useful in certain circumstances (in particular for tasks that actually use the timing of spikes), but it is not a general method for learning. The general approaches to doing machine learning tasks with an SNN instead of a classical ANN rather use a variant of the backpropagation algorithm that works with spikes.
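For what it’s worth, the Poisson-encoding step itself is the easy part; everything after it is where the difficulty lies. A minimal sketch of rate-coding an input vector into Poisson spike trains (the function name and parameters are my own, not from any library):

```python
import numpy as np

def poisson_encode(values, duration_ms, max_rate_hz, dt_ms=1.0, rng=None):
    """Rate-code normalized inputs in [0, 1] as Poisson spike trains.

    Each timestep emits a spike with probability rate * dt, the usual
    discrete-time approximation of a Poisson process.
    Returns a boolean array of shape (n_steps, n_inputs).
    """
    rng = rng or np.random.default_rng()
    values = np.asarray(values, dtype=float)
    n_steps = int(duration_ms / dt_ms)
    p = values * max_rate_hz * dt_ms / 1000.0  # spike probability per step
    return rng.random((n_steps, values.size)) < p

# Inputs 0.0, 0.5, 1.0 become ~0, ~50 and ~100 Hz spike trains
spikes = poisson_encode([0.0, 0.5, 1.0], duration_ms=1000, max_rate_hz=100)
print(spikes.mean(axis=0) * 1000)  # empirical rates in Hz
```

An MNIST pixel intensity would be one such value; the point of the paragraph above is that piping these spike trains into STDP synapses does not, by itself, produce a classifier.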