Eclipse is way more than just a Java IDE. In fact, I've been using Eclipse for many years for a number of purposes, and 80% of that time not as a Java IDE.
It's actually quite easy to understand, given a sufficient understanding of human nature.
You're trying to dominate a social situation because you think you are right (and, by extension, that there are others who will support you and strengthen your position) and because there's a lack of physical threat.
In the real world, you would not want to cause a ruckus, because there could be consequences and you are not the dominant member of the group. For example, someone could punch you.
If you could, you would want to dominate any situation that you can. We are wired that way.
eh, this is like an obese man complaining that thin women don't want to date him, then when people point out the obvious, he asks why the onus is being pushed onto him instead of onto the thin women who don't find him attractive.
What I find more surprising is the lengths they went to to make git work for them.
It would probably have been better if they had assigned a team of highly skilled engineers to design a distributed source control system from scratch, specifically to solve the problems faced by the Windows team.
After all, `git` itself was developed for Linux because Linus was not satisfied with any of the existing solutions.
Moving Windows development to git sounds like a completely irrational decision from a technical standpoint that was driven mostly by marketing concerns.
I worked at Microsoft before and after my team (not Windows) did the Source Depot -> git transition, and I can say that it was mostly for the better. Source Depot was pretty good as non-distributed VCSs go, but it doesn't hold a candle to git. Some specifics:
- Branching in SD sucked. The only way to work on multiple features at once was either to keep multiple copies of the repo or to constantly fiddle with "package files" that contained your changes, and the whole thing shit the bed if the server went down.
- All the new hires, both from college and from industry, knew git, and there were an increasing number of them who just found it bizarre that Microsoft was still on a proprietary, non-distributed system.
- Higher-ups (correctly, IMO) decided that VSTS/TFS had to have git in order to remain competitive, and that we should be eating our own dogfood. That's partly a marketing concern but also a legitimate technical decision.
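For contrast with the SD "multiple copies of the repo" workflow described above, here is what juggling two features looks like in git: a couple of cheap, entirely local commands. This is a throwaway sketch (branch and commit names are made up; `init -b` needs git >= 2.28):

```shell
set -e
# Throwaway repo just for the demo; no server involved anywhere below.
git init -q -b main demo && cd demo
git config user.email dev@example.com   # local identity just for this repo
git config user.name dev
git commit -q --allow-empty -m "initial"
git switch -q -c feature-a              # first feature, branched locally
git commit -q --allow-empty -m "WIP on feature A"
git switch -q -c feature-b main         # second feature, branched from main
git commit -q --allow-empty -m "WIP on feature B"
git switch -q feature-a                 # hop back instantly; all state is local
```

Both features live side by side in one working copy, and nothing here breaks if a server goes down, because there is no server.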
Marketing concerns by whom? The nonexistent company behind git? GitHub, when MS doesn't use GitHub? Or do you think MS is going to be able to drum up some business just because they use git? Please elaborate!
While I don't agree with OP... you could make a case that switching to Git is a way for Microsoft to make itself more fashionable to the engineering community at large.
The marketing here is: it's cool to work at Microsoft again!
> you could make a case that switching to Git is a way for Microsoft to make itself more fashionable to the engineering community at large.
> The marketing here is: it's cool to work at Microsoft again!
But that makes it a technical decision! If it's easier to attract talented engineers because you're using git, that improves the product. I was trying to make this point in my other comment too: OP is drawing a sharp line between "marketing decisions" and "engineering decisions" and sneering at the former, but actually the boundary can be pretty fuzzy.
Because git is the latest fashion rage and is by itself incapable of managing the size of the Windows code base. This is the ideal combination of reasons for a corporate PHB to decide to use it.
Exhibit A: according to the Windows team blog, GVFS was invented solely as a hack to make git usable by the Windows team.
(Yes, there is some sour grapes and sarcasm in this post)
That’s a lot of fearmongering with no evidence. Do you have profiler data showing that the microseconds needed to pass a message between processes are a significant limiting factor in a program which is rate-limited by human text entry?
I mean, taking your argument seriously would mean everything should be hand-tuned assembly. Obviously we figured out that other factors like time to write, security, portability, flexibility, etc. matter as well and engineering is all about finding acceptable balances between them. Microsoft has been writing developer tools since the 1970s and in the absence of actual evidence I’m going to assume they made a well-reasoned decision.
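For what it's worth, that per-message cost is easy to measure rather than argue about. A minimal sketch using Python's `multiprocessing.Pipe` to time round trips between two processes (numbers will vary by machine and OS; function names are mine, not from any of the tools discussed here):

```python
import time
from multiprocessing import Pipe, Process

def echo(conn, n):
    # Child process: echo every message straight back to the parent.
    for _ in range(n):
        conn.send(conn.recv())
    conn.close()

def measure_round_trips(n=10_000):
    """Return the average seconds per send/recv round trip."""
    parent, child = Pipe()
    p = Process(target=echo, args=(child, n))
    p.start()
    start = time.perf_counter()
    for i in range(n):
        parent.send(i)
        assert parent.recv() == i  # wait for the echo before continuing
    elapsed = time.perf_counter() - start
    p.join()
    return elapsed / n

if __name__ == "__main__":
    per_msg = measure_round_trips()
    print(f"avg round trip: {per_msg * 1e6:.1f} microseconds")
```

On typical hardware this lands in the tens of microseconds per round trip, which is the kind of number you'd then have to weigh against keystroke-scale latency budgets.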
It seems like the person was suggesting that if all processes on your PC used a client/server model with message passing/RPC instead of the existing API model, the idle cores you speak of would not be idle.
While you're right that productivity versus performance is a trade-off, and an editor is not necessarily a high-performance application, it's not clear to me whether future optimizations will close the gap the way optimizing compilers did for C versus assembly.
In any case, that aside, whether LSP actually delivers its core promise of software stability remains to be seen.
> Do you have profiler data showing that the microseconds needed to pass a message between a process is a significant limiting factor
What? I think you didn't get my point. Let me try again.
You can look at a single operation and say "oh, that's nothing, it's so cheap, it only takes a millisecond", even though there's a way to do the same thing in much less time.
So this kind of measurement gives you a rationale to do things the "wrong" way, or shall we say the "slow" way, because you deem the cost insignificant.
Now imagine that everything in the computer is built that way.
Layers upon layers of abstractions.
Each layer made thousands of decisions with the same mindset.
The mindset of sacrificing performance because "well it's easier for me this way".
And it's exactly because of this mindset that you now have a supercomputer doing busy work all the time. You'd think every program on your machine would start instantly because the hardware is so advanced, but nothing works that way. Everything is still slow.
This is not really fearmongering; this is basically the state of software today. _Most_ software runs very slowly without actually doing that much.
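The compounding effect described above can be made concrete: each layer below is individually "free" by micro-benchmark standards, yet stacking thirty of them dominates the cost of the actual work. This is a toy Python sketch of the mindset, not a claim about any real editor or OS:

```python
import timeit

def base(x):
    # The actual work: trivially cheap on its own.
    return x + 1

def add_layer(f):
    # One "negligible" layer of indirection that just forwards the call.
    def wrapper(x):
        return f(x)
    return wrapper

# Stack 30 layers, each of which would look harmless in isolation.
layered = base
for _ in range(30):
    layered = add_layer(layered)

direct_t = timeit.timeit("base(1)", globals=globals(), number=100_000)
layered_t = timeit.timeit("layered(1)", globals=globals(), number=100_000)
print(f"direct:  {direct_t:.4f}s")
print(f"layered: {layered_t:.4f}s  ({layered_t / direct_t:.1f}x slower)")
```

The result is unchanged (`layered(1)` still returns 2), but the time is spent almost entirely on forwarding, which is the "layers upon layers" point in miniature.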
I don't think this is true. Also, any solution that relies on humans just "trying harder" is doomed to failure. History has demonstrated that over and over again.
The technologies that win are the ones that account for that.