It depreciates, but only to a point. An i7 2600 lost about half its value when the next generation came out, yet 5 years later it still sells used for around 40% of its MSRP. The era of older chips becoming downright worthless is over; roughly anything from ~2006 onward has held some value.
That era started later for GPUs, since they matured more recently, but I'd comfortably say anything from the AMD (Radeon HD) 4000 series or the Nvidia 400 series onward has held its value long term.
There is also some "classic" rebound for parts once they drop off the retail market. If a specific component in a system from 2012 dies, be it the motherboard, processor, or RAM, the supply of replacements gradually dwindles as old hardware breaks down, which makes what you have all the more valuable.
I still have a near-complete Nehalem desktop with the original i7 920, which now fetches around $30 used (it depreciated far more than the aforementioned i7 2600 because there were still substantial IPC improvements between their generations), but my X58 Deluxe motherboard can still be sold for about $150-200. In 9 years it has depreciated only about 30% from what I paid for it. Meanwhile the RAM is almost worthless, mostly because DDR3 is still available at retail.
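To put rough numbers on that: here's a quick back-of-the-envelope sketch (the ~$250 board price is my own assumption for illustration; the used prices are the figures quoted above, and the i7 920's ~$285 launch price is from memory).

```python
# Rough value-retention math for the parts mentioned above.
# Original prices are assumptions/recollections, not quotes.

def retention(used_price, original_price):
    """Fraction of the original price a part still fetches used."""
    return used_price / original_price

parts = {
    # name: (assumed original price, current used price)
    "i7 920":     (285, 30),    # launched around $285; sells used for ~$30
    "X58 Deluxe": (250, 175),   # assumed ~$250 new; midpoint of the $150-200 used range
}

for name, (new, used) in parts.items():
    r = retention(used, new)
    print(f"{name}: retains ~{r:.0%} of its price (~{1 - r:.0%} depreciation)")
```

With those assumed prices, the CPU works out to roughly 89% depreciation while the board sits near the ~30% mentioned above.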
This is also why it's usually recommended to either buy new or buy the immediately previous generation on the cheap. Anything older becomes a scarce resource for repair purposes and can hold its value as such.
Also, GPU depreciation is intensified by cryptocurrency mining. Mining is basically GPU torture, and buyers on the used market are (rightly) distrustful of cards that appear to have been used for it.