A looong time ago I went through Army basic training. As a BS degree holder I was given the job of "book man". This meant I carried around the platoon training book, along with placards to put in signs at various training locations that said what battalion, company, and platoon we were, so that anyone driving by a gun range, motor pool, or classroom could look and see: oh, that's 1st platoon, C company, 1st training battalion.
I've come to believe that the main driver for Agile adoption has become something similar: making visible to outsiders what software development teams are doing, and making progress (or its lack) visible to management. I think that's a completely reasonable expectation. Businesses are paying exorbitant salaries and providing ping pong tables, so why shouldn't they have visibility into what's being done?
Where it becomes toxic is in the areas this post's parent indicates: posturing, one-upmanship, pressure to perform. Effective teams need "safe spaces" to learn, discover, try, and fail. Agile isn't that anymore. Hasn't been for a long time.
> Businesses are paying exorbitant salaries and providing ping pong tables
Businesses are paying market rate salaries to employees who generate them revenue. If anything engineers should be the ones questioning the exorbitant compensation and perks at the top of the org structure.
> why shouldn't they have visibility into what's being done?
Because they shouldn't be micromanaging. They should be setting high level goals and expectations and giving teams autonomy and trust. Intrusive surveillance and low level metrics create perverse incentives that detract from those high level goals.
You use this argument to defend engineers but then question C-level executive salaries. The exact same principles apply, just multiplied many times over.
That’s simply not true; workers and managers are qualitatively different. There is no shortage of examples of managers at companies losing money and marketshare who are nevertheless paid bonuses every year and then leave with a golden parachute.
Trust, but verify. I work on the services side, and we are frequently in the position of begging for more surveillance. If you are really so confident in the speed and quality of your delivery (my org borders on cocky), then the biggest risk we face is being misaligned with client expectations due to poor communication. We always ask for their explicit sign-off on story criteria and feedback on features as soon as they are finished, to make sure we're aligned and to let them see exactly what pace we're setting.
I see what you're getting at, but the Russian proverb that Reagan made famous - "Trust, but verify" https://en.wikipedia.org/wiki/Trust,_but_verify - raises a wry smile each time I read it because it's such an obvious oxymoron. You either trust, or you verify.
There's an earlier version attributed to Mohammed - "Trust in God. But tie your camel first." - so the sentiment has been around for a while.
It feels like it's just a way of reducing cognitive dissonance, which is useful I suppose, but I wish people wouldn't use it because it allows a feeling of resolution without a real resolution of the tension between trusting and verifying.
I don’t think either of these sayings is contradictory in itself. The first could be translated as “I trust your work ethic, but I also know that your human nature will lead you to perform better under supervision”. The second basically says: “God has my back on a number of things, but She also expects me to do my part”.
I work in services, so for us it's more like "Trust, but write explicit assumptions and disclaimers into the contract". When you commit to paying $2M for a custom piece of software, you expect to pay the fee and get the product, not to spy on your vendor and then sue for your money back. But you have to be prepared to protect your investment.
But how can they set goals without any visibility into what a team can typically accomplish in a given time period? How can they identify better performers and worse performers?
I'm not saying agile is the right solution (it probably isn't), but expecting higher-ups to fund a black-box team is kind of naive.
Reporting on random internal stuff instead of the actual problem at hand is the #1 problem I see with corporate reporting, everywhere. I see various combinations of people wasting time measuring:
1) The thing that is easy to measure, typically money or time.
2) The things they "understand", typically people for HR, compliance for legal, money for finance, etc...
3) The things their manager wants to know, no matter how irrelevant that is to executing their own job well.
Meanwhile, what they should be measuring is the quality of the end product or the overall outcome for the external customer.
It doesn't matter one iota if Bob the Developer Guy missed an internal 3-day deadline that John the Manager made up on the spot if the end product is a winner in the market and makes the users ecstatically happy to part with their money.
This happens everywhere, with everybody. For an IT-centric example, the common one I see is:
Helpdesk: "The users are complaining that the app is slow"
Admins: "The load is only 10%, but fine, we'll add more capacity!"
Helpdesk: "The app is still slow!"
Admins: "The load is only 5%! They should have no reason to complain!"
Do you see the issue? No, seriously: do you? Because practically nobody does, in my experience. Take a minute.
What happened here is that the admins measured the thing that is easy for them to measure: the load. There's a cute little bar graph in VMware, or a chart in their network appliance, or whatever. What they should have been measuring is latency from the end-user perspective, but that's hard to measure and practically no product tells you this number out of the box. So their entire process, their reporting, their troubleshooting, their forms, requests, everything becomes focused on the thing that they can see and control. Even if it's pointless, ineffective, and basically a waste of everyone's time and effort.
This happens with developers in exactly the same manner. Software quality is stupid hard to measure. Long term supportability is borderline impossible to measure without a time machine. Technical debt is hard to even explain to a manager, let alone keep tabs on in terms of numbers. So what's easy to measure? Time! Deadlines, sprints, release dates, etc... That's super easy.
That's why inevitably the unimportant internal time metrics become critical to everybody, but the actually important metrics aren't even measured and become invisible to management until it's far too late.
As a manager, I want to measure leading indicators of success and failure. Absolutely, feedback control based on the real output is important. I must measure that! But I’m always looking for ways to predict that, so I can steer more gently. What’s a leading indicator of a crisis? A late team struggling to make a deadline. What’s a leading indicator of that? Mismatch between estimates and performance. I need to know about bad point estimates because if I don’t fix it—I mean fix the PM’s misalignment—they’re going to push the team into something dangerous.
The problem is that these "leading indicators of success and failure" aren't. A late team struggling to meet a deadline might be a sign of imminent failure, or it might be the team working hard to do something that is genuinely difficult to do.
The core problem with Agile (and most forms of software management) is that it massively overweights "first mover" advantage. I keep hearing, as a justification of agile, that software needs to be delivered quickly so that the company can go to market first, and gain marketshare while its competitors are still floundering. But, in practice, that's hardly ever true. I can't name a single product that succeeded solely because it was on the market first. I can name many products that were first to market and failed because they were clunky and difficult to use.
Heck, Apple's entire business model consists of being second to market with a product that is more polished and easier to use than its competition.
Yes, if a developer or team is well and truly stuck (as in spinning their wheels on the same problem, week after week), that's a problem. But you don't need Agile to tell you that. A simple weekly status meeting with incremental demos is sufficient. The only thing Agile does is create a bunch of graphs that allow management to comfort themselves with "story points" and "velocity" so that they don't have to confront the hard reality that they have no idea what it is they want to build.
What about maintaining an environment where the team felt safe to communicate to the PM that there was something wrong?
And for the PM to feel safe enough to communicate to you that something is wrong.
This feels like a Taylorism. Knowledge work isn’t factory work.
I totally understand keeping tabs on delivery speed to enable the team to benchmark themselves, but the act of identifying a problem from that (if there is one) should be the team's responsibility, IMO.
As a manager, my job is to enable other people to do their job the best they can.
People manage things using their judgement and qualitative observations, all the time. We can accuse them of bias, but that has to be weighed against the fidelity of the metrics.
This is a good breakdown of this really common dynamic.
It is recognizable to many people, some of whom use it for their benefit, which can be very effective. When I learned that last fact, a lot of things made more sense.
> What happened here is that the admins measured the thing that is easy for them to measure: the load.
Nah, you have it completely backwards. If the users said “this specific job took 5 minutes today but was only 1 minute yesterday”, that’s actionable; you can e.g. look at what changes were deployed overnight.
But users always say “the system is slow”, even if they have only the vaguest idea of what “the system” is, and even if it’s actually faster than yesterday. It’s not really clear what any sysadmin can do other than spending hours every day painfully extracting the details from the user only to find nothing is wrong. Every day, forever.
That's not true. It's just that most sysadmins don't bother to upskill to find out what they can and should be doing.
> painfully extracting the details from the user
Asking users for any information is a recipe for disaster. Much like witnesses to a murder who can't agree on the most basic details, users inevitably conflate totally unrelated things. E.g.:
"Citrix is slow?"
"Okay, how so... are button presses slow to respond to a click?"
"I couldn't log on. Something to do with my password. It's slow."
"ಠ_ಠ"
So don't ask. Don't rely on your users at all. Build synthetic transaction tests that act like users. Measure end-to-end latency. Sit down with them and watch them work. Don't rely on their verbal feedback, use your own eyes. Use your tools. Measure. Then measure some more.
Conversely, capacity metrics are largely irrelevant in the era of 10 Gbps networks and 64-core server CPUs. Focus on latency. Look for delays. Timeouts. Deadlocks. Firewall packet drops. That kind of thing.
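A synthetic probe like that can be just a few lines. Here's a minimal sketch, assuming an HTTP-facing app; the URL, timeout, and percentile choice are all illustrative, not from any particular system:

```python
# Minimal synthetic-transaction latency probe (illustrative sketch).
# Measures end-to-end latency as a user would experience it, and treats
# failures as data points rather than discarding them.
import time
import urllib.request

def measure_latency(url, timeout=5.0):
    """Return end-to-end latency in seconds for one synthetic request,
    or None if the request failed (failures are signal, not noise)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            pass
        return time.perf_counter() - start
    except Exception:
        return None

def p95(samples):
    """95th percentile of the successful samples; None if all failed."""
    ok = sorted(s for s in samples if s is not None)
    if not ok:
        return None
    return ok[min(len(ok) - 1, int(0.95 * len(ok)))]
```

Run `measure_latency` on a schedule against the same flows users exercise (login, search, save), record every sample, and alert on the p95 rather than on averages; that is the number that tracks what the complainers are actually feeling.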
> only to find nothing is wrong. Every day, forever.
Of course something is wrong! Something is practically always wrong, that's why the users are complaining!
Here's a fun rule of thumb for you: For every 1 user that complained, there are between 100 and 1,000 that had the same issue but shrugged it off and didn't call support.
I got that from a scientific paper. I couldn't believe it, so I measured it in a large 10K-user system. The error-to-call ratio was about 500-800 in ours. It blew my mind, and it blew the minds of a lot of people in IT management.
We started gathering every error, tracking every possible latency measurement we could, and it was a horror show. 30K app crashes per day. I shit you not. That's about 3 per user per day! Data loss. Hangs. Login failure rate of nearly 50%.
It took months to triage the issues, push patches, and apply workarounds. We had to rewrite several components. We eventually got the errors down to fewer than a hundred per day. Believe me, that was a real achievement.
Users were so happy they were begging to be migrated to the new system instead of pushing back and refusing to upgrade.
If the users are complaining, something is probably very wrong and you just don't know it. Go look.
By assigning problems to specific responsible individuals, and noticing the existence/quality of the solutions they produce? If anything, Agile obscures individual performance, in that it treats everyone as fungible and every part of the system as a commons.
Software engineering salaries are not exorbitant. In fact, engineers may be consistently underpaid for the value delivered. A small team can create millions for executives and shareholders. Ping pong tables etc are a sad tool for management to placate the people who drive the company value up, to keep them from unionizing and realizing the power they hold.
> Ping pong tables etc are a sad tool for management to placate the people who drive the company value up
Nowhere is this more apparent than the so-called Hackathon, where you do days of overtime in exchange for a few slices of pizza and worse, are expected to be grateful to management for providing the opportunity to do so! Nothing infantilises the profession more.
Most software projects fail and do not deliver value. Code is not delivered, code is not taken into use, code is not used much. Software is capital intensive and risky. Code can be more valuable than gold but its value is determined by many factors other than skill and effort of the developer.
True, but you can’t figure out what product will fly and which not unless you actually build it. So you are stuck ;-)
Granted, one can filter out 99% of product ideas on a whiteboard, but the remaining 1% is still enormous. And that 1% is actually the stuff that gets started nowadays.
> Software engineering salaries are not exorbitant. In fact, engineers may be consistently underpaid for the value delivered.
What about the value of those who cooked your lunch? Without them, software engineers would starve and die, so that makes them have far more value, no? Do you think that food service workers are also underpaid?
Consider a food service worker in Facebook's cafeteria. In an hour, perhaps they make 30 lunches, which in the best case delight 60 people for an hour: 60 person-hours of delight for an hour of work.
Now contrast a Facebook programmer, who in one hour might be able to fix a bug that has been annoying 0.1% of Facebook's 2.5 billion users, causing them to be frustrated rather than delighted for, say, two minutes a day, for the next three years before the feature gets rewritten. Maybe that sounds trivial, but if so, shut up and multiply: 2.5 million hours of delight per month for 36 months gives you 90 million hours of delight, for the same hour of work.
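The "shut up and multiply" arithmetic above can be checked directly; every number here is the commenter's rough assumption, not real Facebook data:

```python
# Back-of-envelope from the parent comment; all inputs are the
# commenter's assumptions, not measured figures.
users = 2_500_000_000
affected = users * 0.001             # 0.1% hit by the bug -> 2.5M users
minutes_per_day = 2                  # frustration avoided per user per day
hours_per_month = affected * minutes_per_day * 30 / 60
total_hours = hours_per_month * 36   # over the three-year feature lifetime
print(hours_per_month, total_hours)  # 2500000.0 90000000.0
```

The multiplication does check out: 2.5 million hours of delight per month, 90 million hours over three years, from one hour of work.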
So, at a rough estimate, then, the gourmet hacker is 1.5 million times as productive as the gourmet chef. Maybe if I've been overoptimistic it's only a factor of 100,000 or 10,000, but it's huge. And that's how Facebook can be profitable at all despite all the shitty and stupid things they do: capturing even a tiny fraction of the value they produce makes them wildly profitable, as long as they can successfully externalize the harm they do. Some software companies don't even need to externalize their damages.
And that's why software is eating the world.
That isn't the only reason hackers get paid more than foodservice workers. No business is going to pay more for its inputs than it is forced to; that reduces its profits. Hackers also enjoy a dramatically better bargaining position than most foodservice workers, because a hacker with US$3000 has a better BATNA than a cook with $3000: the cook is going to be trying to sell $3 burritos out of an Igloo ice chest outside of concerts while the hacker can buy a laptop, bring up a couple of VPSes on AWS, and spend a few weeks putting together a useful web service, maybe get a few dozen to a few thousand users, but at any rate can easily scale to hundreds of thousands of users. The cook is dependent on someone investing a few hundred thousand dollars (in the US) to have top-quality tools and a good location. Difficult to do yourself unless your family is rich or you graduated from the Cordon Bleu.
This is purely factual reasoning, so it cannot answer normative questions like whether it would be ethically better to pay hackers more or less. It only purports to explain the chains of cause and effect in the world that give rise to that situation, and illuminate the other possibilities inherent in the current state of affairs, and how they might change.
The cook trying to sell $3 burritos "out of an igloo chest" at a concert is a much less favorable example than the successful, entrepreneurial, optimally-executing software engineer he's compared against. The cook could also start a successful YouTube channel and turn into the next cooking superstar, and you could argue that, in ideal circumstances and with optimal execution, their actions have just as much or more impact as the software engineer's. But few know and can execute their optimal path, and thus their (our) reality is much more mundane.
(Also I don't know where you're from that burritos might be sold out of an igloo chest, that just sounds gross)
There's a difference between working in foodservice and making a TV show about it. I was talking about foodservice workers. You can probably build a successful cooking channel without much capital investment, but not a successful restaurant. Starting a successful restaurant requires talent°, equipment, and land; starting a successful website only requires talent.
That's what FAANG are competing against when they hire hackers. And that's still true even though most hackers didn't apply to YC last fall, because they can sign on as employee #2 or employee #20 with someone who did. Because the other factors of production are not scarce enough to matter, jobs for hackers are abundant in a way that jobs for chefs are not. Patents, noncompetes, H1Bs, the anti-poaching conspiracy, and API keys are efforts to change that, but mostly they haven't been very effective.
(What makes you think the cook is a "he"? Both of the people I was basing that sketch on were women.)
° By "talent" I mean "strenuous and persistent effort by highly skilled people", not some kind of inborn genius. You can't start a business by sitting around thinking deep thoughts; you have to work hard. But for a restaurant, working hard isn't nearly enough, and that puts foodservice talent at more of a negotiating disadvantage with respect to investors.
I'm calling bullshit on this, because without the cafeteria worker the whole place would be a cockroach-infested dump with moldy lunches in the fridge.
It's a fallacy to view anyone's work as less valuable when it contributes to the whole; everyone's work is essential in the ecosystem, even if all they're doing is unclogging the toilets of all these knowledge workers' toxic turds.
> Without them, software engineers would starve and die
Software engineers are perfectly well capable of making a sandwich themselves, bringing a packed lunch, or surviving on an empty stomach for a few hours until they get home. Not to mention buying lunch from somewhere else or ordering delivery food if one cafe is gone.
Unless you're positing a world where all food production, farming, fishing, etc. has vanished, then software engineers are in no worse a position than chefs, cooks, baristas, grocery store employees, or anyone else.
> Software engineers are perfectly well capable of making a sandwich themselves
I have an idea! Let's call it DevFood, and make it a requirement for everyone.
I mean, we already expect the software engineers to analyze the requirements, so we don't have to hire analysts; we expect them to test their products, so we don't have to hire testers; and we expect them to do the operations, so we do not have to hire an administrator. So tell me, why do we have to waste money on an extra guy who makes the lunch? DevFood is the future of software development!
People are generally capable of cooking to nourish themselves. Learning enough cooking to cook for yourself can be achieved in one afternoon. The foodworker employment market is completely based on convenience culture. The software engineering employment market is very different to that.
When asking about the value that a certain occupation provides, I think it's fair to compare them to the rest of society and ask about the balance. I'm not bringing up the plight of other employees randomly to dismiss one argument, I'm asking for a defense of the stark difference in economic status.
Sure, but don't just compare to people who are also underpaid. Also compare to management. In many companies, managers get paid a lot more than software developers. That's not necessarily fair.
Right now a lot more meals than usual are being prepared at home, and a lot of people who had jobs doing that are out of work. On the other hand, essentially the entire accounting system for the state hardest hit by coronavirus is run by "nonessential" employees, because that is what we are calling people who can work from home.
Right now, there are definitely lots of people who can't work from home that are enormously essential in the ordinary sense. For instance, say you have a plumbing emergency. But, you know, your unemployment check is pretty essential too.
As someone who had a plumbing emergency I can attest it was essential also. I can also say I am sure we would have been able to get the repair done regardless, as humans want to help (and make money).
It is correct that those who are working remotely are no longer supporting the cooks, child care workers, etc., while also experiencing hardship due to the lack of these services. These hardships, however, are temporary. Our society can't really afford to live with these disparities. I spent a first career in the restaurant industry before learning to work with computers, and I can say for certain that in my case the first career was much harder and had far fewer rewards. However, I don't see eating out as essential, but as a privilege. I am not sure how we walk back this system, but it seems the time has come for more people to raise their own children and cook their own food regardless of their jobs.
I can't figure out what you are trying to say, particularly about plumbers. You seem to be disagreeing with me about something, but your link supports them being essential, which is what I wrote.
There is a division between people whose jobs are secure right now and those who are (or soon will be) out of work.
There is also a division between people who can work remotely and people who have to be at a job site in person.
My point, or one of them, is that these are independent of each other, so all together there are four types of people/jobs.
My apologies, I misinterpreted your remarks as saying plumbers were essential but not able to work. I was therefore defending plumbers' necessity and showing their status based on it. Thank you for clarifying your point.
I do agree with you. Those who are able to work now are certainly lucky (as long as the work allows proper social distancing), whether remote or essential.
My last point was about remote workers, especially those in tech, which I am most familiar with: one should realize how pampered that role has become at the cost of others. Many of the new stresses of these situations, the cooking and child care that used to be done by service workers who are not essential and cannot work remotely, are the same stresses those service workers encounter all the time, not just in a pandemic. If you are able to work remotely now, you should take this time to count your blessings and re-evaluate your own internal definition of struggle and of what you are entitled to as part of society.
Also, as you may have alluded to, remote does not necessarily mean job security; it could be quite the opposite in the long term. It is times like these I wish I was a plumber.
My point was that many lower-wage workers cannot afford the luxury of prepared meals or child care. I was in no way comparing cooking professionally to cooking domestically. As someone who worked every job except baker in the food service industry for a decade, and who changed careers because of the stress, I cannot agree with you more. Restaurant work is hard and, in my opinion, underpaid.
This is not how prices are set. If you don't like how they are set, then you don't like capitalism, where capitalists pay labour, and labour has to work because they need to live.
That's all fine, but let's be clear on how the price of wages is set. It is not "value created - some 'fair value'" for capitalists.
If people could refuse to work because they didn't have to pay rent and had access to the commons to get food or work for themselves, we might see capitalists having to share the spoils more.
Right, CEO salaries skyrocket, not necessarily because of the value they create, but because they can get the board and stockholders to back them more than the people actually creating value in a company.
They can do this in the form of stock buybacks, dividends, etc.
The standard software engineer does not have the power to "bribe" the stockholders in that way.
I quit FAANG and started doing my own thing. I found I hated all the paperwork: dealing with taxes, accounts receivable, accounts payable, business registration, finding clients, meeting with clients, networking for new work, unmovable deadlines, pressure, fear of disappointing a client, or the client being late on payment, or me being late on filing some form my government requires that I had no idea I needed, or a million other things.
I realized that some level of not getting 100% of the value I generate is worth it to have someone else deal with all that stuff.
I'm not saying whether or not engineers are underpaid for the value they create but it's worth something > 0
May I ask which of that you found to be the most difficult?
I've been thinking of going the same route, and I'm confident I can handle the paperwork and deliver, but it's the finding the clients part that worries me the most. Not sure if the problems I see are widespread enough to warrant founding a business and going through all that...
CEOs' decision making has far more impact on the revenue of a company than a single engineer, or many engineers. That's why the market rate is very high for them.
If there were only a handful of qualified engineers available on the market, then they'd get paid CEO-like salaries due to the value they provide and being in high demand. But because there is a whole market of qualified engineers they get paid far less.
It's pretty basic economics and doesn't have much to do with stock buybacks/dividends/etc. Regardless of the value of an individual position, wages will be lower or higher depending on the supply of an occupation. In the big picture, engineers are very replaceable at market price compared to executives, so their wages reflect that.
It may be more demand than supply. Plenty of people would love to be CEOs, and have some ability to run a business and delegate, but you can only have one per company. On the other hand, you can have many engineers. If you could only have one engineer per company, their salaries would likely be much higher.
This would be counter-intuitive, because here low demand results in a high salary, regardless of supply.
Not related to your main point, but would you reconsider calling them 'exorbitant salaries'?
Engineering is one of the only professions that is paid commensurate with the value it provides.
Calling our salaries 'exorbitant' leaves you without superlatives to describe executives who get paid thousands of times our wage, and at least they do work, compared to the owners of capital who are 'paid' incredible money without doing any work at all.
To call an engineer's salary 'exorbitant' you must be comparing it to the salary of people who function as modern serfs and are not paid commensurate with the value they provide, or even enough to live a safe and satisfied life. Being paid enough for modest freedom and security is not 'exorbitant', though, no matter what you're comparing it to.
There are certainly good reasons to make a team's progress visible. PMs/POs (I'm never sure of the difference) will want to know if a project can meet the release date. They will want to know that the need-to-have features are being worked on before the nice-to-have ones. I don't want to say a team should work in total isolation.
I can only say that IME a team is going to be a lot more productive if they are able to own their development process. Good infrastructure, knowledgeable engineers, healthy relationships instead of competitive ones - that's what creates a highly productive team and highly productive engineers.
> will want to know if a project can meet the release date
The number one main headline takeaway axiom of Agile is that this question is completely banned.
Agile is about delivering working software continuously, delivering whatever increment you get working at each timepoint, adapting to observed reality as it comes, and not having deadlines for specific features that might turn out to be impossible or harder than estimated.
That question is banned until a board meeting comes up and your owners ask, "What are your plans for the product next quarter, misters CEO and CTO?"
If you babble on about plans and deadlines being for suckers and how things will happen when they happen, I am not sure they will keep paying you and your team's salaries for long. This new contrarian cargo cult of waving away any kind of medium-term planning or estimation is hurting businesses as much as, if not more than, when all projects were waterfalled and timeboxed to death.
Yes, estimating is hard to get right. No, that does not mean we should simply get rid of it. Engineers do not live in a little bubble of code: there is usually a whole company around them that needs insight into what is getting built and when to expect it. And anybody who believes that asking this question is too much or too intrusive has never worked in a non-engineering position, or thinks too much of themselves.
The mistake here is making your measurements too granular. What happens is the board meets, long- and medium-term plans are made, etc. That's all good. Then every level of management calls in the banners and forces them to pitch how they align their teams and projects to whatever was pitched one level up. Somewhere along the way, you hit the point where planning stops being a positive and starts being a liability.
This process never stops, until you have some toothless PM slave-driving devs because they're 8 levels removed from the strategic planning and have pitched a 40-point/week development rate to their boss because they don't have enough information to do anything else. Anybody who tries to break the pattern is going to find themselves on the receiving end of an unfavourable performance review, in any big business I've ever worked for.
There's a happy middle ground somewhere, where devs are given plenty of space to work but their output is still tied to the business's long-term objectives. How to reliably reach that middle ground is probably an 8-, 9-, or 10-figure question, depending on the business.
Anyone relying on a medium-term estimate about a software project has clearly never been near one.
A software system is an objective reality. It doesn't care what we want from it; it simply is. We can have intuitions about it, and use those intuitions to generate hypotheses about what will work, and ask the compiler / test suite to check them. But with a system of any real complexity, we're going to encounter surprising answers sometimes. We can't know how surprising, or how often, how long it will take to make sense of them, or what the ramifications will be for the rest of the project, until we actually get there.
It's not like we were producing useful estimates and then decided against it. They were always lies. Refusing to provide a medium-range estimate is finally telling the business the truth, that we have no idea. Instead we can tell you what we do know: the functionality we delivered last sprint, and the functionality we're going to work on next sprint. That is what you pay us for. The software actually delivered. Not bullshit promises.
But if you really do want a good faith, best effort forecast, then we need to plan the project in as much detail as possible upfront. Lock down the requirements, design the implementation, break down the work, and schedule it. Map out exactly what we're going to touch, so we can at least have a gut feel about how surprising it's going to be. That is waterfall. Own that, and do it right. Don't skip important steps and dress it up in agile clothing.
"Banned" may not be the right word, but I do agree it's the wrong question. Story points and velocity measurements are about giving a reasonable guess about what's "likely" to be completed at any given point in time in the future, with bigger error bars the further out you look. Agile projects don't "complete" so much as a decision is made to release at a given point, and everyone is happier with agile in an environment where there's an expectation of multiple, continuous releases.
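To make that "likely, with bigger error bars further out" idea concrete: one common way to produce such a forecast is Monte Carlo resampling of historical throughput. This is an illustrative sketch, not anything from the comments above; the `history` numbers and the `forecast` helper are made up for the example.

```python
import random

# Hypothetical historical throughput: story points completed per past sprint.
history = [21, 34, 18, 27, 30, 15, 25, 29]

def forecast(backlog_points, history, trials=10_000, seed=42):
    """Monte Carlo: resample past sprint throughput to estimate how many
    sprints a backlog of `backlog_points` is likely to take."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(trials):
        remaining, sprints = backlog_points, 0
        while remaining > 0:
            # Assume future sprints look like randomly drawn past sprints.
            remaining -= rng.choice(history)
            sprints += 1
        outcomes.append(sprints)
    outcomes.sort()
    # Percentiles are the "error bars": 50th = coin flip, 95th = fairly safe.
    return {p: outcomes[int(trials * p / 100)] for p in (50, 85, 95)}

print(forecast(200, history))
```

The gap between the 50th and 95th percentile is exactly the widening uncertainty the comment describes, and it grows as the backlog (the forecast horizon) gets larger.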
Except they are used wrongly, and they aren't required for agility. Tasks completed per unit of time is an observation; projecting from it is a hypothesis about that projection's usefulness. Story points? Waste.
> making progress (or its lack) visible to management.
The challenge is that management is only looking through a small window and everything from promotions to raises to favours are dished out based on that window.
Anything not in view of that window immediately becomes useless to one's advancement in the company, so the window had better cover everything that matters. It usually doesn't.
It is reasonable but it is not necessarily the smartest way of going about things.
The question of whom things are visible to is an important one. Your comment says "management", and yeah, that's one common way of thinking about it. But managers are just individuals with their own motivations. It is really "the business" that wants visibility in order to best accomplish its goals. There is an assumption that "management" is better aligned with "the business" than the employees being paid those "exorbitant" salaries. But that may or may not be true; managers, too, are just individuals with their own motivations.

So to the extent that managers are better aligned with the business, why is that the case? I see a few things that trend in that direction: compensation that is more tied to the performance of the business (through bonuses or what have you), more visibility into the goals and reasoning behind leadership's decisions, and a better grounding in the dynamics of the business (what is the strategy, how does its market work, how does it make money). The theory is that because of those things (and probably others I'm not thinking of), the managers are better suited to represent the interests of the business, whereas the engineers are just tools those managers can use to further those interests.
That is one way of doing it, probably the most common way, and it definitely seems like management visibility into work is extremely important in such a setup. But another way might be to align the engineers with the business in the same way managers are: pay bonuses based on the performance of the business, give them visibility into goals and decisions, teach them about the strategy of the business and the dynamics of how it makes money. That is, give the engineers ownership in the same way management has ownership, and trust them to make decisions that are in the interest of the business.
To put this another way: if you trust managers to make decisions about trade-offs that are in the best interest of the business, why can't you trust engineers as well? Are the engineers dumber or shiftier than the managers such that they can't be trusted with this?
The general line of argument here is that the managers have spent time and effort as part of their professional development to understand the business, whereas the engineers have dedicated their time to a more technical focus. This is reflected in how they spend their days: the manager spends time in meetings, whereas the engineer spends time coding (a very broad generalization).
Assuming that neither side is doing significant amounts of extracurricular work to bridge the gap, it therefore follows that management understands the business better, because that is explicitly the purpose of their job.
In a best-case scenario, it makes sense to provide engineers etc. with the context to understand the decisions. On the flip side, business decisions are often built on a huge heap of context that is generally invisible to the engineer (unless they spend an equal amount of time building up their business knowledge, sitting in meetings, etc. - but at that point, they're basically a manager?)
It's not to say that engineers can't develop the background, etc. to make the decisions, but that it's not exactly part of the expected job functions, and not something that's explicitly looked for as part of the recruitment process. In a sense, this is a bit of a tautological issue.
Yep, I do think this specialization argument has a lot of merit, and re-reading my comment I think I downplayed it too much.
What I'm saying is: what if we rethink specialization on this front? What if we expect it to be part of everyone's job to have deep understanding of and context on the business? Recognizing that there are significant efficiencies in specialization that we lose with this approach, maybe it still comes out ahead. Maybe the improvement in decision making outweighs the inefficiencies introduced by requiring everyone to be a businessperson.
Personally, I think this is a really great way to run a business, if you can get broad buy in for it. The best places I've worked have been those that most closely approached this structure.
But I will admit that this is biased by having a personality where I neither want to be "just" an engineer, nor give up engineering entirely in order to be a businessperson. I think lots of engineers don't want the business stuff to be part of their job, and I guess I can understand that, even if I lament it.
One interesting question is: if everyone is acting with the business ownership mindset more commonly expected of managers, what are managers supposed to be doing? My answer is that front-line supervisor managers are there to support their reports, help them get and understand the information needed to make the right decisions. At the higher level, managers are there to make decisions on tough cross-team and cross-organizational trade-offs. At the very top level, they are there to set and communicate the overarching strategy of the business, which everyone else should be using to inform their own day to day decisions. I think there is plenty of that kind of work to be done, without expecting managers to be making the day to day decisions.
That's a really interesting idea that I never thought about. Seems to make a lot of sense.
As for the business deserving to know what it is paying for, that is absolutely correct as well.
The problem is that business is not paying for story points. They are paying for actual product and delivery. So when they measure output by story points, you end up with a situation where teams try to maximize story-point delivery as opposed to actual product delivery.
> business is not paying for story points. They are paying for actual product and delivery.
but that's the whole problem: if you constrain time, cost, and result, there's no space left for agility.
the point of agile is to maximize the value of what can be produced by a given team within a given project: you fix time and cost, and you apply agile to surface the stories that are actually important to you and push back on the gold plating.
if you have all three fixed, waterfall works just fine, actually even better.
If outsiders want to see what's going on in an Agile team, all they need to do is run the "working software". They don't need to see all the ugly bits under the covers.
Having working software is line two of the Manifesto. They can run it each week and see what's new.
Even better, they can provide feedback on the working software and the team can then quickly respond to that feedback. In a week or two, when the outsider next looks at the working software they will see the software improved.
The quality of the software is the best metric of whether the process is working, but management has no incentive to blame the process when it breaks down, and every incentive to blame the people instead.
> Making visible to outsiders what software development teams are doing, and making progress (or its lack) visible to management. I think that's a completely reasonable expectation.
The devil is in the details, and in this case literally, because the granularity of progress is too fine-grained to be exposed away from its context. The project manager, or whatever you call the nearest person with power over your time, needs some leeway to organize, ask for a little extra, compensate, forgive, and reward you, without higher powers micromanaging. Freedom is the premise of responsibility.
Absolute evil happens when a customer demands access to an hourly activity log for each developer in the (not on-premises) project.
> I think that's a completely reasonable expectation.
Most often it is not. The problem is that management does not understand shit about software, and they usually don't care to learn. Hence this is about as effective as me asking for transparency regarding a COVID-19 vaccine. Yes, the practitioners could tell me they are in a phase-II trial. But to fully process that information and put it into context, I'd effectively need to study medicine.