The open question for me is why Edison even bothered with DC at all. A DC generator requires a commutator. If there is anything that screams dodgy, that would be it.
Sliding friction. High-current contacts. Reversing dozens of times a second? Seriously?
There used to be these "buttons" (for lack of a better term) that you could drop into the bottom of a lamp socket to make your bulbs last longer. They were actually a diode shaped like a disc (more or less), acting as a half-wave rectifier: it passed only one half of each AC cycle, which roughly halves the power delivered to the lamp and leaves it running on a rough, pulsating DC. So in that application - yes, running a lamp well below its rated voltage does lengthen its life (while reducing its brightness, of course).
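If you want to see where the "half" actually lands, here is a quick back-of-the-envelope sketch (assuming an ideal diode and a purely resistive load, which a real tungsten filament only approximates): the RMS voltage only drops to about 71% of nominal, but the power into the filament drops to about half.

```python
import numpy as np

def rms(v):
    """Root-mean-square of a sampled waveform."""
    return np.sqrt(np.mean(v ** 2))

# One full cycle of 120 V RMS, 60 Hz mains (the exact numbers don't matter for the ratios).
t = np.linspace(0, 1 / 60, 10_000, endpoint=False)
v_full = 120 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)

# The "button" diode, idealized: pass only the positive half-cycles.
v_half = np.where(v_full > 0, v_full, 0.0)

print(f"full-wave RMS: {rms(v_full):5.1f} V")                    # ~120 V
print(f"half-wave RMS: {rms(v_half):5.1f} V")                    # ~85 V (120 / sqrt(2))
print(f"power ratio:   {rms(v_half)**2 / rms(v_full)**2:.2f}")   # ~0.50 into a fixed resistance
```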
There was once the argument that AC would make the lamp filament vibrate (in the presence of the Earth's magnetic field?), and thus if you ran your lamp on DC it would last longer because the filament wasn't being mechanically stressed (I think this was one of the pitches behind those buttons, too).
I think it was later found that the argument had little validity and was more a marketing pitch. That said, a lamp does experience a moment, when the current (AC or DC) is switched on, where the filament "flexes" - partly from magnetic forces, partly from thermal shock as it heats up. A cold tungsten filament has only a small fraction of its hot resistance, so the switch-on inrush current is many times the running current. That flex and shock, repeated over time, mechanically stresses the filament, and it's a major reason why incandescent lamps typically burn out at the moment you turn them on.
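To put a rough (and hypothetical) number on that switch-on surge: for a 100 W / 120 V bulb, assuming a stiff supply and the commonly cited cold-filament resistance of roughly 10 ohms, the inrush works out to something like ten to fifteen times the running current.

```python
# Rough inrush estimate for a hypothetical 100 W / 120 V incandescent bulb,
# assuming a stiff supply (no series impedance) and a purely resistive filament.
volts, watts = 120.0, 100.0

r_hot = volts ** 2 / watts   # hot (operating) resistance: 144 ohms
r_cold = 10.0                # assumed: typical ohmmeter reading on a cold 100 W filament

i_running = volts / r_hot    # ~0.83 A once the filament is at temperature
i_inrush = volts / r_cold    # ~12 A at the instant of switch-on

print(f"running current:  {i_running:.2f} A")
print(f"switch-on inrush: {i_inrush:.0f} A  (~{i_inrush / i_running:.0f}x running)")
```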