You have to use Ohm's law (V = IR). To figure out the correct resistor value for any LED, you need to know its mA (milliamp) rating.
So, for example, let's say you have an LED that's rated for 20mA (as almost all of them are) and for 3 to 12 volts. Let's say you're going to run it at 12V, because that's what most non-adjustable transformers will produce.
You use the equation "Voltage = Current * Resistance" to figure out the needed resistance to generate the correct current. Current is always measured in Amps, Resistance in Ohms. So, for this example, we have:
12V = 0.020A * Resistance
Solve for resistance to get:
Resistance = 12 / (0.020) = 600 Ohms.
If you were running at 5V, you would have:
5V / (0.020A) = 250 Ohms.
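If you'd rather let a script do the arithmetic, here's a minimal Python sketch of the same calculation; the function name and the example values are just for illustration, and it follows the simple V = IR arithmetic above:

    def resistor_ohms(supply_volts, led_milliamps):
        # Ohm's law: R = V / I, with the LED current converted from mA to amps.
        # (A fussier version would subtract the LED's forward voltage drop
        # from the supply voltage first, but this mirrors the math above.)
        return supply_volts / (led_milliamps / 1000.0)

    print(resistor_ohms(12, 20))  # 600.0 Ohms -- the 12V example
    print(resistor_ohms(5, 20))   # 250.0 Ohms -- the 5V example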
Don't worry too much about exact Ohm values. You'll be hard pressed to find a 600 Ohm resistor. Just use a 470 Ohm and you'll run it "a little hot" (~25 mA), or, to take no risks, use a bigger resistor and run it a little dimmer. Or you can chain multiple resistors in series to get the value you need, since series resistances simply add (see the quick check below).
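As a quick sanity check on those two options (the numbers are just the ones from this post):

    # A 470 Ohm resistor at 12V runs the LED "a little hot":
    print(12 / 470 * 1000)   # ~25.5 mA

    # Or chain two standard resistors in series -- series values simply add:
    print(330 + 270)         # 600 Ohms, right on the calculated target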
You can commonly run LEDs at currents that greatly exceed their rating (up to 5x or 10x). They will burn brighter, but their life will be significantly shortened if they're run that way for extended periods, as I expect we all will run them. To test, just hook up the LED with no resistor and see how long it lasts. (Note: I'm joking.)
Other things to remember: LEDs should produce *no* heat while running. You should feel *absolutely nothing* from them. If you can feel some heat, you're running too much current through there and you need a bigger resistor.
Steve