It is the MOST trivial thing to measure performance in a Go game.
It is the LEAST trivial thing to measure the "code quality"/"unit of time" metric in a programmer. For example, you might bang out an implementation that looks fine, but 1 guy will say "that will become unmaintainable in 1 year" or "that will be a problem if we ever switch databases" or whatever and sure enough, a year later, the team has to do that... and that 1 guy was fucking RIGHT.
Who's to say how you measure such a thing?
The longer you spend in a career that has ANY creative component, the more you will come to loathe the importance placed on "performance evaluations." In fact, I'd argue that the more advanced you are in those creative jobs, the SLOWER and BETTER you work... the problem is that the newbies won't be able to recognize the "better" portion.
>"that will become unmaintainable in 1 year" or "that will be a problem if we ever switch databases" or whatever and sure enough, a year later, the team has to do that... and that 1 guy was fucking RIGHT.
And sometimes YAGNI, we'll cross that bridge when we come to it, nice problem to have, good enough is good enough, do the simplest thing that could possibly work, premature optimization, perfect is the enemy of good, grass is always greener somewhere else, better a bird in the hand than two in the bush, he's right but it's still not worth the cost right now, etc.
That is the perfect mindset for problems of the type "that will be a problem if we ever switch databases". However, "that will become unmaintainable in 1 year" is a "bridge" you want to cross sooner rather than later. If you have a deadline to meet, fine, but come back and fix it ASAP if the software is something that needs to be maintained. I really can't explain it well; it's just something most developers realize after they've had to maintain an old codebase for a while.
The key point, however, is that that one guy's insight shouldn't be ignored. You can take the words of wisdom and proceed to do nothing about it, but at least you know the cost down the road. All experienced managers I know personally know the cost of unmaintainable software and are more than willing to do something about it if resources allow. They need to, because otherwise it's their ass in the line of fire when a new feature costs a factor of 10 to implement and bugs keep creeping up at the customer, even though they're spending a factor of 10 more on QA.
>It is the MOST trivial thing to measure performance in a Go game.
Actually, among very old games, Go is one of the harder ones to measure performance in mid-game. Measuring performance across any 1v1 game, however, is easy.