I have been asked many times: why can't you use a single resistor for a number of LEDs in parallel, instead of one resistor per LED?
The main reason is that you can't safely connect diodes in parallel.
When we use one resistor, we set a current limit for the whole diode section. After that, it's up to each individual diode how much of that current flows through it.
The problem is that real-world diodes don't have identical characteristics, so there's a danger that one diode will start conducting while the others barely conduct at all.
So you basically want this:
And in reality you get this:
As you can see, in the first example all diodes conduct equal amounts of current, while in the second example one diode conducts most of the current and the other diodes barely conduct anything at all. The example is a bit exaggerated so that the differences are more obvious, but it nicely demonstrates what happens in the real world.
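To get a feel for how badly the current divides, here is a small numeric sketch. It models three parallel LEDs behind one shared resistor using a simple exponential I-V law, I = Is·exp(V/Vt_eff). The specific numbers (5 V supply, 51 Ω resistor, forward voltages of 1.90/1.95/2.00 V at 20 mA, 50 mV effective thermal voltage) are made-up but plausible values for red LEDs, chosen only for illustration:

```python
import math

# All component values below are illustrative assumptions, not measurements.
V_SUPPLY = 5.0                   # supply voltage (V)
R = 51.0                         # shared resistor (ohms), sized for ~60 mA total
VT_EFF = 0.05                    # effective thermal voltage incl. ideality (V)
VF_AT_20MA = [1.90, 1.95, 2.00]  # per-LED forward voltage at 20 mA (V)

# Saturation currents chosen so each LED carries 20 mA at its listed Vf
i_sat = [0.020 * math.exp(-vf / VT_EFF) for vf in VF_AT_20MA]

def total_led_current(v):
    """Sum of all diode currents at shared node voltage v."""
    return sum(isat * math.exp(v / VT_EFF) for isat in i_sat)

# Bisect for the node voltage where resistor and diodes agree:
# (V_SUPPLY - v) / R == total_led_current(v)
lo, hi = 0.0, V_SUPPLY
for _ in range(100):
    mid = (lo + hi) / 2
    if total_led_current(mid) > (V_SUPPLY - mid) / R:
        hi = mid   # diodes pull more than the resistor supplies: v is too high
    else:
        lo = mid
v_node = (lo + hi) / 2

currents = [isat * math.exp(v_node / VT_EFF) for isat in i_sat]
for vf, i in zip(VF_AT_20MA, currents):
    print(f"LED with Vf={vf:.2f} V carries {i * 1000:.1f} mA")
```

With this model, each 50 mV drop in forward voltage multiplies a diode's share by a factor of e (about 2.7), so the lowest-Vf LED ends up carrying roughly double its intended 20 mA while the highest-Vf LED gets only a few milliamps.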
The above is written with the assumption that you choose the resistor so that it sets the total current to n times the current you want in each diode, where n is the number of diodes, and that this total is larger than the current a single diode can safely conduct. What then happens is that the diode with the lowest forward voltage conducts most of the current and wears out the fastest. After it dies (if it fails as an open circuit), the diode with the next-lowest forward voltage conducts most of the current and dies even faster than the first, and so on until you run out of diodes.
One case I can think of where you can use one resistor to power several diodes is when the maximum current through the resistor is small enough that a single diode can handle the full current by itself. That way no diode will die, but I haven't experimented with this myself, so I can't comment on how good an idea it is.
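Sizing the resistor for that safe variant is simple Ohm's-law arithmetic. The numbers below (5 V supply, 2 V typical forward voltage, 20 mA single-LED limit, three LEDs) are assumptions for illustration only:

```python
# Sizing sketch for the "safe" variant: limit the TOTAL current to what one
# LED can survive on its own. All numbers here are illustrative assumptions.
V_SUPPLY = 5.0      # supply voltage (V)
VF_TYP = 2.0        # typical LED forward voltage (V)
I_MAX_ONE = 0.020   # maximum safe current for a single LED (A)
N_LEDS = 3

# Resistor that caps the total current at one LED's safe limit
r_min = (V_SUPPLY - VF_TYP) / I_MAX_ONE
print(f"Use at least {r_min:.0f} ohms")  # 150 ohms with these numbers

# Best case (perfect sharing): each LED gets only a fraction of the limit
i_each_ideal = I_MAX_ONE / N_LEDS
print(f"Ideal per-LED current: {i_each_ideal * 1000:.1f} mA")  # ~6.7 mA
```

The trade-off is visible in the last line: even if the LEDs share perfectly, each one runs well below its rated current, so they will be noticeably dimmer than LEDs with individual resistors.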