This is a fundamental microeconomics supply-and-demand pricing question: do you sell an item based on the cost to produce it, or on the price customers are willing to pay for it?
Employment is no different. There is a struggle between the buyers of your labor and you, the seller of your time and expertise. If the highest price a buyer will pay overlaps with the lowest price you will accept, there is a range of possible transaction prices.
As you can imagine, if there are many buyers, and few sellers, then the price is driven to an auction level where buyers must compete with one another for the employee. If there are more sellers than buyers, then the price is driven down towards the cost of that employee (and associated education).
Typically we exist in the middle, where negotiation tactics determine where within that range the price settles. That's why there are policies such as "don't share your salary" - to create information asymmetry favoring the employer - and why people with poor negotiating skills typically earn significantly less than those who demand more.
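To make that "range of possible transaction prices" concrete, here is a minimal sketch (the numbers and the "leverage" knob are purely hypothetical): a deal can only happen in the overlap between the buyer's maximum and the seller's minimum, and negotiation decides where in that overlap it lands.

```python
# Toy illustration of a bargaining range (all numbers are made up).
employer_max = 180_000   # the most the company would pay before choosing another candidate
candidate_min = 140_000  # the least the candidate will accept (their "cost basis")

if employer_max < candidate_min:
    print("No overlap: no deal is possible at any price.")
else:
    # Any salary in [candidate_min, employer_max] is a possible deal.
    # 'leverage' stands in for negotiating power: 0 means the employer
    # captures the whole surplus, 1 means the candidate does.
    leverage = 0.25
    offer = candidate_min + leverage * (employer_max - candidate_min)
    print(f"Bargaining range: {candidate_min:,}-{employer_max:,}; settled offer: {offer:,.0f}")
```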
Generally, when demand for employees is high, companies try to make the market less efficient by creating hurdles to employees leaving, such as the agreements between tech companies not to "poach" from one another and bonus programs that vest over multi-year periods.
This is relevant today because allowing employees to work from home greatly increases the supply of potential employees (both domestically and internationally), pushing accepted wages closer and closer to employees' cost basis. By microeconomic principles alone, it would be surprising if this didn't significantly drive down tech wages.
This is exactly right. I'm surprised that so many people, who are otherwise highly educated and intelligent, struggle with this concept.
There's no universal rule of "fairness" that says how much a company should pay you. It's not dictated by cost-of-living (COL) calculations or logical questions of value.
The company looks at you and your competitors (alternative candidates) and puts forward a number they think is high enough for you to accept, but not so high that they'd be better off snagging an "equivalent" candidate for a lower price.
You win that battle not by whinging about COL adjustments, but by eliminating your alternatives (metaphorically speaking, of course). If the company truly needs what you do, and you're the only one who can do it, then you set the terms.