That effect drives me nuts too, but it seems to bother only a minority of people. I've always assumed it's on the order of 50–100Hz; the 500–1000Hz mentioned may be fast enough to appear continuous to sensitive humans, and maybe fish...
The most common source of that flicker these days is LEDs (usually in a string of Christmas lights) driven by half-wave rectified AC, which can be treated as PWM at a 50% duty cycle with the same period as the input signal: 60Hz in the US (Japan uses both 50 and 60Hz, depending on region), 50Hz most other places. That's slow enough to see easily; it would surprise me to learn that PWM at ten times that rate is slow enough to see at all.
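To make the half-wave arithmetic concrete, here's a small sketch (the function name is mine, not from the thread): with half-wave rectification the LED conducts only on one half-cycle, so it flashes once per mains cycle at roughly 50% duty.

```python
# Half-wave rectified mains driving an LED, modeled as ~50% duty PWM.
def halfwave_flicker(mains_hz):
    """Return (flash rate in Hz, approximate on-time per cycle in ms)."""
    flash_hz = mains_hz            # one flash per AC cycle
    on_ms = 1000 / mains_hz / 2    # lit for roughly half of each cycle
    return flash_hz, on_ms

# 60Hz mains: 60 flashes per second, lit ~8.3ms out of every 16.7ms.
print(halfwave_flicker(60))
```

(In reality the LED only conducts above its forward voltage, so the true duty cycle is somewhat below 50%, but the flash rate is what matters for visible flicker.)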
Looking at the code I wrote a while back to drive a four-digit common-cathode LED display (you know, a TV bomb timer!) as a clock, I find that after all the tuning I did to get rid of flicker, I ended up activating each digit for 1ms at a time. Granted, running under Raspbian it's going to flicker from time to time anyway because of multitasking, but that aside, I found no PWM flicker perceptible, even at what is effectively a 20% duty cycle over a 5ms period. (The decimal points are implemented as a fifth "digit".) So I'm going to guess that 0.5kHz and up is fine for most humans. Fish I don't know about; (1) cites a flicker fusion rate between 2 and 40Hz for swordfish, but for goldfish I've no idea, other than to guess that 0.5kHz and up is probably fine for them too.
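The multiplexing numbers above can be sketched like this (the helper is mine, not from the actual clock code): five 1ms slots, four digits plus the decimal-point "digit", give each digit a 20% duty cycle over a 5ms scan period.

```python
# Timing arithmetic for a multiplexed LED display.
def multiplex_stats(slot_ms, n_slots):
    """Return (scan period in ms, per-slot duty cycle, per-slot refresh rate in Hz)."""
    period_ms = slot_ms * n_slots      # one full scan of all slots
    duty = 1 / n_slots                 # fraction of time each slot is lit
    refresh_hz = 1000 / period_ms      # how often each slot relights
    return period_ms, duty, refresh_hz

# Five 1ms slots: 5ms period, 20% duty, each digit relit 200 times a second.
print(multiplex_stats(1, 5))
```

Note the per-digit refresh here works out to 200Hz; what's being argued about in the thread is whether the PWM edges themselves, at 0.5kHz and up, remain visible.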
In my own testing I could see flicker at 10kHz, but only during fast eye movements. This has nothing to do with the flicker fusion threshold; it's a failure of saccadic masking. At 10kHz it's subtle enough not to be annoying at all, but I do find 1kHz slightly annoying, and the commonly used 200Hz is definitely annoying.