I believe I read elsewhere that failing to filter the current might result in a rapid, flickering response of the LED. Anyone know what I could do about that? I suppose I should be asking about that as well as the question of whether or not I'll need a resistor to prevent shorts/killed LEDs.
You're assuming that you'll see an LED flicker at 60 times a second. I doubt that — most people can't perceive flicker at that rate.
LEDs aren't rated for voltage and current like an incandescent lamp. They are semiconductors, not resistive devices. The voltage 'rating' of an LED is the voltage it takes to overcome the depletion layer of the P-N junction (otherwise known as the 'forward voltage', or 'Vf'). Don't sweat the deep meaning of that last bit of physics; just understand that once that Vf is reached, the LED conducts as if it were a short in the circuit, and as such needs a current-limiting resistor.
Works like this: Say you have an LED that is 'rated' at 3V (2 to 3V is typical of modern high-brightness LEDs). This means that when the LED is forward-biased, there will be about 3V across it. First you decide how much current you want to run through the LED...most are rated for 20mA max, so let's say we want 15mA through it. With a 12V source voltage, you then calculate the current-limiting resistor: since the LED itself is going to drop about 3V, subtract this from the supply, and use the remainder to calculate the resistor.
R = V/I
  = (12 - 3) / 0.015
  = 9 / 0.015
  = 600 ohms
So with this '3V' LED, a 600 ohm resistor will limit the current to about 15mA with a 12V source.
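If you'd rather let a script do the arithmetic, here's a minimal Python sketch of the same Ohm's Law calculation. The 12V / 3V / 15mA figures are just the example values above, not universal constants:

```python
# Current-limiting resistor via Ohm's Law, using the example values above.
supply_v = 12.0    # source voltage
vf = 3.0           # LED forward voltage drop
i_target = 0.015   # desired LED current, in amps (15mA)

# The resistor sees the supply minus the LED's drop; R = V/I does the rest.
r = (supply_v - vf) / i_target
print(f"Current-limiting resistor: {r:.0f} ohms")  # -> 600 ohms
```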
(You should also check that a standard 1/4W resistor won't be overtaxed. Power P = I² x R = (0.015)² x 600 = 135mW, or about 1/8 of a watt, so a 1/4W resistor is fine. Alternate calculation: P = V²/R = 9²/600 = 81/600 = 135mW — there will be 3V across the LED, so the remaining 9V of the supply voltage MUST be dropped across the resistor according to Kirchhoff's Voltage Law. How about that? Ohm's Law works.)
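Here's the same sanity check in Python, computed both ways to show that they agree (again using only the example figures above):

```python
# Resistor power dissipation, checked two ways with the example values.
i = 0.015          # amps through the resistor
r = 600.0          # ohms
v_r = 12.0 - 3.0   # volts across the resistor (supply minus LED Vf, per KVL)

p_from_current = i**2 * r    # P = I^2 * R
p_from_voltage = v_r**2 / r  # P = V^2 / R
print(f"{p_from_current*1000:.0f} mW vs {p_from_voltage*1000:.0f} mW")  # 135 mW both ways
```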
If you are unsure about the Vf of your LED, you can measure it yourself. A 9V battery, a 1K resistor, and a multimeter will get the job done. Wire it up with some test leads and measure the voltage across the diode when conducting. Vf rises with current, but this method will get you well into the ballpark with a figure that is perfectly useful.
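For the curious, here's a rough Python sketch of what that test rig tells you once you've taken the reading. The 2.1V meter reading is a made-up example, not a measured value:

```python
# Estimate the current flowing in the 9V-battery / 1K-resistor test circuit
# once you've measured Vf across the LED with the multimeter.
battery_v = 9.0
r_test = 1000.0     # the 1K series resistor
vf_measured = 2.1   # example meter reading across the LED (assumed)

i_test = (battery_v - vf_measured) / r_test
print(f"Test current: {i_test*1000:.1f} mA")  # ~6.9 mA, safely under 20mA
```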
With an AC voltage source, things can get tricky. Diodes are supposed to allow current to flow in one direction and oppose it when the polarity reverses (rectification). LEDs are diodes, but they aren't designed for rectification duty. The standard max reverse voltage rating on LEDs has been 5V for many years, and manufacturers don't condone their use as rectifiers, so in most AC applications a rectifier diode is recommended to block the reverse current so the LED doesn't have to. But...
another AK'er who sells LED assemblies has done some experimenting with some wide-dispersion LEDs he has sourced, and found that they can take rather high peak reverse voltages despite the manufacturers' ultra-conservative rating. I don't know whether all LEDs are so conservatively rated, and in my own applications it is usually easy enough to add a diode (but the diode's additional voltage drop must be factored into the current calculation above, as sketched below).
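If you do add a series protection diode, the adjusted calculation might look like this. The ~0.7V drop is a typical figure I've assumed for a small silicon rectifier (e.g. a 1N400x), not a value from any particular datasheet:

```python
# Redo the resistor calculation with a series protection diode in the string.
supply_v = 12.0
vf_led = 3.0
vf_diode = 0.7     # assumed drop for a typical small silicon rectifier
i_target = 0.015

r = (supply_v - vf_led - vf_diode) / i_target
print(f"With protection diode: {r:.0f} ohms")  # ~553 ohms; 560 is the nearest standard value
```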
All this is a moot point if the guy you're buying these from has done all the work for you by building an assembly with a limiting resistor and perhaps a protection diode. If you're simply buying a bare LED, you get to do this yourself. So...it depends on what you've bought, and it kinda sounds like you don't know.