Is anyone surprised that the 25-year war on corporate paternalism and two-way loyalty, in favor of a quite openly stated reductionist policy of "extract maximum value from employees for the bare minimum they will accept," has left a lot of unhappy workers?
There is a little bit of misunderstanding sometimes when people discuss "inflation". To some, if not most, people, inflation means rising prices. To others, it means an increase in the money supply, which may eventually cause rising prices after some latency, especially if the growth rate of the economy doesn't keep up with the increase in the money supply.
I have no idea whether "rising price inflation" has been underreported, especially by design, by the government. I would concede that there are political and bottom-line reasons to do so (e.g., minimizing increases in salaries/benefits tied to COLA), but again, if this is happening, IMO it is probably more institutional sloth than a top-secret directive from the Federal Reserve bunker.
Overall, I don't have an informed opinion on what the facts of the matter are with regard to underreported inflation - I only have anecdotal evidence.
OTOH, it is pretty clear that "money supply inflation" has increased dramatically in the past few years due to the various policies associated with the bailout.
Whether this will cause "rising price inflation" remains to be seen - there is always latency between a money supply increase and rising prices. In the case of the bailout policies, the latency is pretty large, as the bulk of the money went to securing "toxic" assets and so forth rather than directly into the consumer economy.
The overhang of this increase in money supply/government debt naturally constitutes rising price pressure, but again, how much is anyone's guess, as there are deflationary pressures as well (falling asset prices, and such).
While I respect the pragmatic nature of your argument, I think it's important to understand not just the limit of the system but the underlying mechanisms.
If increasing the money supply is "free", then why not just increase it a whole bunch this year and end poverty?
I think the view you express implicitly puts too much trust in policymakers and assumes the system under study to be more stable than we really have evidence to believe it is. Incidentally it is precisely those two characteristics that led to the crash a few years ago.
You're sort of correct, in that inflation and the money supply can often be related, but generally one would not use the term "inflation" to refer to an increase in the money supply.
In simple terms, though, as the money supply increases, the value of the currency decreases, causing prices to rise (inflation) - but there are more factors affecting prices than you can count, and it's not always as simple as supply goes up -> prices go up.
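As a very rough sketch of the textbook mechanism - the standard quantity-theory identity with its usual symbols, nothing specific to this thread:

    M \cdot V = P \cdot Q

where M is the money supply, V the velocity of money, P the price level, and Q real output. An increase in M only shows up as a higher P if V and Q stay roughly constant, which is exactly why "supply goes up -> prices go up" is too simple on its own.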
Where I think there may be some disagreement is around how prices are recorded and tracked - the methodology behind the calculation of whichever price index is being referenced. The "core" inflation number that often gets cited excludes the prices of fuel (oil) and food, since both are volatile and depend heavily on speculation and a number of other factors outside of "inflation" proper.
The Fed has tools like interest on reserves, reverse repos and term deposit facilities to make sure that all of the excess reserves on bank balance sheets don't turn into actual price increases.
Austrian economists conflate increases in the money supply with price inflation, but the link isn't necessarily there. In fact, mainstream New Keynesians (including conservatives like Greg Mankiw) have predicted for years that increasing the Fed's balance sheet wouldn't increase inflation. They have been right.
But this is due to the larger forces of supply and demand for US Treasury bills. So it's the US's role as an empire that makes its sovereign debt demanded by nations with less solid foundations.
The US enjoys this position largely b/c of its size but also b/c of the frequent policymaking folly that occurs in other nations.
The only thing that can impose true discipline on this process is the existence of competition.... it's the only thing that could decrease demand for US Treasury bills.
It's obvious that we're not currently experiencing hyper-inflation, but the configuration of the world that will allow the trend to continue becomes less certain the further things get extended.
Imagine someone issues you a credit card with a $1 per month minimum payment. As long as you can keep getting the limit increased you'll surely be able to make the payment each month, no matter what other spending you do. The US is able to keep borrowing and borrowing, and its creditors are very lenient b/c they lack a better option.
The debt situation might get out of hand, but that doesn't necessarily mean we're headed for inflation. The US has gone for hundreds of years without a default[1] and I'm sort of proud of that as a US citizen, but if worst came to worst we could always just default - unlike, say, 1920s Germany.
[1] There were a few payments in the 1970s that were slightly late due to a clerical error, but that hardly counts.
I agree that it doesn't necessarily mean we're headed for inflation, but I think the counterfactual is that the US is headed for even more dramatic economic hegemony.
> "I have no idea whether "rising price inflation" has been underreported, especially by design, by the government."
That shouldn't even be an open question anymore. Concern about the accuracy of government reporting of prices is what gave rise to the Billion Prices Project.[1] So unless we posit that MIT is now 'in' on a government conspiracy to suppress price increases, we can put that one to bed.
Do you know how the BPP selected their basket of goods? From what I can tell (reverse engineering from the result), they regressed their collected prices against the CPI -- which therefore makes it not an independent estimator of prices.
(If I'm right, this offers evidence neither for nor against accuracy of CPI reporting -- it just makes BPP mostly irrelevant for this discussion until 10 years or so have passed)
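To make the circularity concrete, here's a toy sketch of what I'm describing (purely hypothetical - I'm only guessing at the BPP's methodology, and the data below is made up): if you pick basket weights by regressing scraped item prices against the CPI, the index you get back tracks the CPI largely by construction.

    # Toy illustration (hypothetical - NOT the BPP's actual methodology):
    # choosing basket weights by regressing item prices on the CPI makes
    # the resulting index agree with the CPI largely by construction.
    import numpy as np

    rng = np.random.default_rng(0)
    months, n_items = 120, 50

    # made-up "official" CPI and made-up scraped item prices
    cpi = 100 * np.cumprod(1 + rng.normal(0.002, 0.003, months))
    prices = np.column_stack([
        100 * np.cumprod(1 + rng.normal(0.002, 0.01, months))
        for _ in range(n_items)
    ])

    # fit basket weights so the weighted basket best reproduces the CPI...
    weights, *_ = np.linalg.lstsq(prices, cpi, rcond=None)
    fitted_index = prices @ weights

    # ...and, unsurprisingly, the fitted index hugs the CPI almost exactly,
    # so it can't serve as an independent check on it.
    print(np.corrcoef(fitted_index, cpi)[0, 1])   # prints a value very close to 1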
I thought that the current mainstream economic view was that inflation, rather than lagging the quantity of money, actually reacted to people's expectations of how much the money supply would increase in the future. That was certainly the case a little while ago when the Swiss central bank announced they wouldn't allow any more appreciation in the franc, and that they would print more francs until they had reached the peg they wanted. The exchange rate hit their target within 15 minutes of the announcement.
EDIT: Wikipedia has a very good article on inflation, and I think that it would do most people a lot of good to read it and understand the various reasons people care about inflation, and where different schools of economics differ on it. http://en.wikipedia.org/wiki/Inflation
I agree with the SCOTUS sentiment, but in other related areas, strict adherence to the law with the underlying intent to remain legal but avoid the spirit of the law has been cracked down upon.
In the case of cash transactions, with the 10k reporting limits, people are now routinely prosecuted for "structuring" their >10k transactions into a series of <10k transactions, even when it is clearly demonstrated that there is no underlying criminal activity. There have been cases where it is just cantankerous believers in privacy being prosecuted.
For my part, I'm on the side of the tax avoiders & believers in privacy, but it wouldn't surprise me to see the government's views/regulations on tax avoidance come to resemble its current policies on currency controls.
I think the early-to-mid '90s, just before the Internet explosion, were a much more interesting time to be a programmer than now. Back then, a bigger percentage of the focus was on using computers to do something - to solve a problem directly (as opposed to implicitly) - and computers were actually getting powerful enough to do things cheaply.
People's cat pictures were a vanishingly small portion of the landscape.
As an example, cheap computing power is one of the reasons that, while at the beginning of the '90s nuclear power was considered a money-losing proposition by electric utilities and everyone was trying to get out from under their plants, by the end of the decade all the utilities were hanging on to their nukes for dear life and trying to figure out how to extend their licenses/service life.
Now, a much bigger percentage of the focus is simply getting information out of one spot, transporting it to another, and dolling it up w/some marketing glitz.
That's not to say that the cool stuff isn't still going on, but that it is a smaller percentage and the profession has been dumbed down significantly (hence brogrammers and all).
Things that didn't exist in the mid-'90s: Stack Overflow, Google, Gmail, Google Docs, OS X, GitHub, Git, torrents, and basically every tool I use to make things with my computer (besides bash).
Yeah, back then you actually had to read documentation (yes, it existed) and more or less know what you were doing. Copy/paste coding and questions from colleagues like "How to connect to datbase, pls help urgently!" were unheard of.
You are certainly welcome to your opinion, but it's unfortunate that you seem so heavily defined by your tools (especially when they aren't even ones you've written).
The thing that I dislike most about the technology business is people who want to use developers as crash test dummies for their random ideas, who think that "good ideas" are the topmost measure of goodness and that, once you have a good idea, all the rest will naturally unfold.
Pivots are good and oftentimes necessary, but I learned years ago that good ideas are a dime a dozen. The ability of an organization to competently execute on a (seemingly) good idea is the much rarer skill, 1% inspiration/99% perspiration and all that.
The OP seems to think this is something that can be managed. I believe that too, but I'm not sure it is something that can be as easily quantified and then learned as a generic management skill as the OP seems to imply.
I spend half my time in Japan because my wife is Japanese. Our 8-year-old is bilingual, my wife is fluent in English but learned as an adult, and my Japanese is rudimentary.
I can sympathize with some of the comments disagreeing with the research, from my perspective in Japan - certainly I feel at a disadvantage due to my language limitations.
However, I think the presumption in the research is that the subjects understand the language, so that this disadvantage is removed. The thesis is that if one is fluent in a second language (or at least not disadvantaged by non-understanding of its linguistic constructs), one is freed of the cultural constraints that attend native speakers of the language.
Going from personal experience observing my wife debate in English - and again, she is fluent - the language is much more of a coldly analytic tool for her: the words all have their nominal meaning. However, over the years, as she has grown more culturally fluent, I've observed this effect declining.
The optimistic viewpoint going into the unknown, i.e., the confidence that the answer is out there somewhere and you will find it, is one of the most important success factors - if not the most important - in any sort of greenfield venture.
I'm an older guy, and even years ago when looking back, things that I just assumed would work out - even though an objective observer would perhaps have disagreed - always seemed to have more or less unfolded according to plan. It is almost as if life is like some evil jungle vine that wraps and immobilizes you more the more you struggle against it.
Here is one personal example - I was born into a family that valued education; it was always assumed that the kids were going to college. This was a little before the now-current assumption that everyone goes to college.
Well, I hated school from junior high onward with an abiding passion and finally quit at the first available opportunity, at the beginning of 10th grade. But it never occurred to me that I wouldn't go to college, and I did, pretty much on schedule w/my peers (and I paid for it..), getting a BS in physics and an MS in Computational Fluids/Mech Engrg.
Again, it just never really occurred to me that I wouldn't go to college as I was bailing from high school, kind of weird in retrospect. But I never even slightly doubted the outcomes - not in a defiant way, either, just a low key assumption.
And again, looking back, I can find many other examples in my life or other people's lives that follow this pattern.
I don't know if there is a term for this mindset, but I see allusions to it both in the OP and in many of the other comments here.
Top 10% - gets the coding job done, on time, with high-quality code. Knows his craft well, but is still essentially a craftsman.
Top 1% - designs systems that will get the job done for the business/larger org goals, or even systems that generate business props. Also, the systems designed are capable of being implemented/maintained by lesser-skilled developers. Of course, knows his craft, but is something of the artist - can see the Buddha in the gearbox.
The 1% guy should easily - as in, no conceptual difficulties - be able to write a compiler/interpreter; that clearly demonstrates whether one is simpatico with the machine, capable of thinking like one. Actually, he should probably love doing this sort of thing, and might have to resist the tendency to incorporate custom command processing components/metadata/languages into every project.
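To be concrete about the kind of exercise I mean (a toy, hypothetical sketch - not any real project): a few dozen lines of recursive descent already get you an arithmetic evaluator, and the 1% guy should find nothing conceptually hard in scaling that idea up.

    # Toy recursive-descent evaluator for +, *, and parentheses --
    # a hypothetical sketch of the kind of exercise meant above.
    import re

    def tokenize(src):
        # split the source into numbers, parens, and operators
        return re.findall(r"\d+|[()+*]", src)

    def parse_expr(tokens):          # expr := term ('+' term)*
        value = parse_term(tokens)
        while tokens and tokens[0] == "+":
            tokens.pop(0)
            value += parse_term(tokens)
        return value

    def parse_term(tokens):          # term := factor ('*' factor)*
        value = parse_factor(tokens)
        while tokens and tokens[0] == "*":
            tokens.pop(0)
            value *= parse_factor(tokens)
        return value

    def parse_factor(tokens):        # factor := NUMBER | '(' expr ')'
        tok = tokens.pop(0)
        if tok == "(":
            value = parse_expr(tokens)
            tokens.pop(0)            # discard the closing ')'
            return value
        return int(tok)

    print(parse_expr(tokenize("2*(3+4)+1")))   # -> 15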
> "The 1% guy should easily - as in, no conceptual difficulties - be able to write a compiler/interpreter; that clearly demonstrates whether one is simpatico with the machine, capable of thinking like one."
You do realize that compilers and programming languages are an actual realm of expertise, right? I can, and do, write a compiler, but I couldn't construct you a web framework from scratch.
Acknowledge others' expertise rather than trying to construct a mountain to stand on.
Sorry, I think I poorly worded my comment. The point I was trying to make was that (imo) the better devs have an extremely deterministic viewpoint towards their code and are always paying attention to all the contextual layers in which they are coding and how their code may affect that context.
The example of compilers/languages sort of follows from this, as this (constant) awareness leads to pattern recognition, which then leads to surmising about leveraging languages, code generation, etc., to take advantage of these patterns & reduce effort/errors.
There are other examples; this is just the one I picked.