Software Development, the Pareto Principle, and the 80% Solution (projectricochet.com)
92 points by twobitshifter on May 2, 2022 | 40 comments


I hate this Pareto principle.

It makes sense when applied correctly but in our company it's viewed as a 'golden rule' and it means whatever they want it to mean. They call it the "80/20 rule" and to our project/program managers it means they only have to do 80% of their job.

Basically the "20%" is always the part they don't want to do or consider too hard. Applied this way, it leads to half-assed solutions that leave out significant use cases, meaning the old solution still has to be kept around. In the end this leads to a messy environment and significant technical debt.

The principle, as I see it, is not meant as a cop-out at all. Rather, it's a warning for project managers to budget and plan carefully, because some features may be much heavier on resources than others. It's not "oops, we stumbled onto a hard problem, oh well, that's the 20%, let's drop it". As the article states:

> 80% of your application (if you could get away with only doing this 80%) could be developed in 20% of the time and budget.

This check "if you could get away with..." is always automatically dropped in our company.

It makes sense as a principle but as a rule it leads to mediocrity and incompetence. I often wonder whether project managers are so consistently bad in other companies too :)

PS: their latest excuse, by the way, is 'agile'. "We can worry about this problem later because we're agile." Not to mention that we don't actually use an agile methodology at all. They just have us log our hours in Jira. Beyond that, our projects are plain waterfall projects. A tool does not a methodology make.


I often use the 80/20 rule to push back against requirements from product managers.

Identify the features that are hard to implement and provide little business value, then cut them out. For the things that are important, I see if there are any adjustments to the requirements that would make them simpler.

It's usually an easy sell because it gets the product out faster. We can always do the nice-to-have stuff as a follow-up, but more than half the time, they realize they never needed it.

If done strategically, it can prevent adding tech debt in the system.


Here's a productivity hack for your managers:

1. Start with a task.

2. Recognize you can complete 80% in 20% of the total time.

3. Define this 80% as a new task.

4. Goto 1.

A significant amount of the project will be done in no time!
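
Tongue in cheek, but the arithmetic behind the joke is easy to check: each round shrinks both the scope and the elapsed time geometrically. A quick sketch in plain Python (illustrative numbers only, not from the article):

  # After k rounds of "redefine the 80% as the new task", you finish
  # 0.8**k of the original scope in 0.2**k of the original time.
  for k in range(1, 6):
      scope_done = 0.8 ** k
      time_spent = 0.2 ** k
      print(f"round {k}: {scope_done:.1%} of the scope in {time_spent:.3%} of the time")

By round 5 you have "finished" about 33% of the project in 0.03% of the time, which is exactly the trap.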


That's effectively where the Pareto principle comes from: many distributions tend to follow a power law [1] [2] [3].

[1]: https://en.wikipedia.org/wiki/Power_law

[2]: https://en.wikipedia.org/wiki/Pareto_distribution

[3]: https://en.wikipedia.org/wiki/Zipf%27s_law
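
For what it's worth, the exact 80/20 split corresponds to a Pareto distribution with shape parameter log4(5) ≈ 1.16; other shapes give other splits. A rough simulation (Python with NumPy, purely illustrative) shows the concentration directly:

  import numpy as np

  rng = np.random.default_rng(0)
  alpha = np.log(5) / np.log(4)                # shape that yields roughly 80/20
  costs = rng.pareto(alpha, size=100_000) + 1  # classic Pareto samples, x_m = 1

  costs.sort()
  top_20 = costs[int(0.8 * costs.size):]       # the heaviest 20% of items
  print(top_20.sum() / costs.sum())            # ~0.8, though noisy: the tail is heavy

With a larger shape parameter the split is milder (say 70/30); with a smaller one it is more extreme, which is part of why the numbers in these discussions never quite agree.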


That's agile! Except step 3 must be done by the stakeholder.


Sounds like 20% of management is wasting 80% of everyone's time.


>80% of your application could be developed in 20% of the time and budget.

Does it work like this? My understanding is that the Pareto principle describes a natural distribution that is observed, not something that is declared and then implemented.

If anything, this would be "80% of the time/cost comes from 20% of the features", which means that if you wanted to cut costs down to 20%, you would have to severely cut down on features, unless you had a foolproof way to identify in advance which features will contribute the most time/cost. But that amounts to asking for a crystal ball.


If it is a natural distribution always present in software projects, it follows that if you got rid of the 20% of the project that was the most expensive in time and money to develop, the remaining 80% would now be 100% of the completed project, and of that new 100%, 20% would again have taken up 80% of the time and money spent developing it.

If this is the case it is not an especially interesting or powerful concept for managerial purposes.


In my experience, skimping on error handling around edge cases will cause 1 to 3 orders of magnitude more time spent on support requests. In the worst case, I've seen an entire dev team spend 20% of its capacity manually editing the database to patch over incomplete code. I saw this go on for 2 years before I upgraded to a better job. Management would never spare the time to fix the underlying issues because that wasn't adding new features.

At another job, I spent an extra month (33% of the dev time) finessing the error handling for a new application. In the first few months, a couple of issues popped up. The following six months needed less than a day's worth of support requests. I've never had an application run so smoothly in my career.


The numbers vary, but getting a working minimum viable product is a tiny fraction of building a business, and transmuting that into a going concern is far more work than getting it kind of working in the first place.

The path is tricky and there's existential danger on both sides: a chasm of endless rewrites on one and a mire of endless technical debt on the other. Either can keep a project from ever reaching a steady state.


> To be sure, it’s not always exactly 80/20. Sometimes it’s 90/10. Sometimes it’s 70/30.

Note that the numbers do not need to add up to 100. Sometimes it's 50/50, 60/50, 95/90, π/e, ...

Sometimes you have no choice but to go 100%, for example if you want to reach the summit of a mountain. It then makes no practical difference that you reached 80% of the height in 20% of the time (or whatever).

To make use of the Pareto principle, and I do it a lot, means permanently re-evaluating one's to-do list and switching tasks when something else seems more promising. This still leaves many questions open, such as: "Promising in the short run or the long run?", etc. Applying the principle is an art, not a science.


The hard problem is not declaring that the Pareto distribution applies; the hard problem is identifying the critical 80%.

If anyone could do that accurately and at low cost, all of our problems would be solved.

(Often what happens, in my experience, is that the team spends 50% of the cost debating which 80% to implement.)


Nice article with many points I agree with from a practical management perspective.

There are hidden dangers to Pareto logic though. Not fully understanding the value of the remaining 80%, and how it in turn is distributed, can lead to suboptimal or catastrophic decisions.

One that strikes me from software engineering, IIRC from Sommerville's analysis, was that 80% of the cost of projects lies in the "maintenance phase". However, with software each work unit breaks down, fractally, into a "building" stage and a "sustenance" stage (debugging, refactoring, integrating...) that looks like overhead. Modern SE kicks that can down the road, so we end up with software that is only 20% finished (but what a wonderful 20% that is), and 80% of the problems turned into deferred externalities: downloading updates, security fixes, and so on.

Another problem happens when we apply Pareto selection iteratively. 20% of 20% of 20% is less than 1%. It's easy to get carried away with brutal rounds of optimisation.
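
To put numbers on it: if the 80/20 split kept holding inside each selected subset (a big if), then n rounds of "keep only the vital 20%" would retain 0.2^n of the work but only 0.8^n of the original value. A trivial check:

  # Hypothetical iterated selection, assuming 80/20 holds at every level.
  for n in range(1, 4):
      print(f"{0.2**n:.1%} of the work carries {0.8**n:.1%} of the original value")

After three rounds you are down to 0.8% of the work, but you have also discarded nearly half the value.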

Not understanding how the low-value margins support the seemingly most productive minority is a classic judgement error, best told by the Aesop fable of the stomach:

" One day it occurred to the Members of the Body that they were doing all the work while the Belly had all the food. So they held a meeting and decided to strike till the Belly consented to its proper share of the work. For a day or two, the Hands refused to take the food, the Mouth refused to receive it, and the Teeth had no work to do. After a day or two the Members began to find that they themselves were in poor condition: the Hands could hardly move, and the Mouth was parched and dry, while the Legs were unable to support the rest. Thus even the Belly was doing necessary work for the Body, and all must work together or the Body will go to pieces."


It becomes really apparent when you develop an app via the easiest possible path on every feature and then get trounced by another company that did it the hard way and in ways customers hadn't even imagined beforehand but now love. You'll be too far behind and too under-featured to ever catch up. Paradigm shifts are how markets are created and destroyed, after all, so it is quite possible to be under-prepared.


> By triangulating these [...] numbers [...], I can tell if a software engineer is truly fast, or just simply slow.

It seems that Mr. Casey Cobb is more interested in micromanaging his engineers and monitoring their performance than in valuing their experience.

Yes, he said that having a good understanding of the topic is crucial, but engineers whose estimates are correct 95% of the time raise a red flag. I guess those engineers have quite a narrow scope to implement things in, so it's quite "easy" to estimate the cost of the feature in question.

Let me tell you from my day-to-day work: I'm honest and transparent about my workload, but because of the wide scope I have to operate in, my estimates are only ever more or less accurate. And that's OK! Because of the planning we do, the communication we have, and the transparency we live by, we can always explain why things go fast or slow. We adjust the goals as required and that's it. No heavy management, no micromanagement, no one sitting on your shoulder just to ask if you're truly fast, or just simply slow.

Our team only has engineers who want to build good software, so we can skip the finger-pointing and the poisonous atmosphere (not that Project Ricochet has that problem).

> ... if you're truly fast, or just simply slow

oh man...


20% of this article holds 80% of the valuable content


There is a financial incentive to over-complicate your project or job because it's job security to make a mess. When I call for parsimony (KISS, YAGNI, DRY, etc.), I am often given the stink-eye by IT staff. It's somehow "supposed to be" complicated.

Warren Buffett often says a key secret to his success is saying "no" to all the temptations, fads, and BS that come his way or come the financial industry's way. In software, lemmings are following the bloat and fads because it's usually the customer that pays the price when software projects careen off the edge of a cliff, not the lemmings. Developers often use production applications as training grounds for the latest fads so they can get a bigger job somewhere else.

Another problem is that web standards are a poor fit for most everyday CRUD, requiring bloated, buggy JS UI libraries to emulate GUI idioms that have been around for 3+ decades. A good stateful GUI markup standard (and GUI browser/plug-in) would be quite helpful. And before you say Flash and Java applets already tried that, they had two problems: first, they tried to be an entire virtual OS instead of just a display engine, overcomplicating themselves; second, they resisted open standards.

I've seen Oracle Forms do small, internal CRUD cheaply and quickly with small amounts of code. It may be aesthetically ugly, but there was something keen about it as far as developer productivity goes. If you tell fashion to go to Hell, you can simplify much of dev.

It was, or acted like, a "GUI browser" in that you install one client and then run a gajillion apps off it, which is something desktop-based IDEs like Visual Basic, Delphi, and Clarion couldn't do. Oracle Forms had warts, but none that couldn't be fixed if somebody cared to fix them. (Oracle messed it up when they converted the client(s) from C to Java, as client-side Java made lots of mistakes.)

And for all the talk of making apps mobile-friendly, too few people use mobile in office work, in my observation. Maybe a roaming sales force, but few others. YAGNI was shot bloody dead by bloating our UI libraries to cater to mobile. Bootstrap's cat-like personality can kiss a pack of hungry dogs.


> And before you say Flash & Java Applets already tried that, they had two problems: [...] Second, they resisted open standards.

Flash was partially opened by Adobe: https://en.wikipedia.org/w/index.php?title=Adobe_Flash&oldid...

The SWF file format was documented by Adobe (and the documentation was freely available), many components of Flash were open-sourced, and the Flex framework was released as open source.


For some reason no other respected company made their own Flash or Flash-like player with it. HTML took off in part because multiple vendors made browsers such that no one vendor could hold the industry hostage if they turned evil(er) one day.

Being "open" on paper and open in practice are often 2 different things.


There were such projects, e.g. Shumway and Gnash:

> https://en.wikipedia.org/wiki/Shumway_(software)

> https://en.wikipedia.org/wiki/Gnash_(software)

It is not Adobe's fault that these open-source projects did not succeed and their development ceased.


It is partly Adobe's fault; they played legal games to scare away competitors. Plus, Flash was already falling out of favor by then. Competitors should have started in the late '90s.

Cloning Flash also meant cloning its security flaws. To simplify the client and leave fewer holes, shift more of the processing burden to the server side: it's easier to patch servers than user clients.


> For example, a proof-of-concept might literally be rewritten entirely once it is validated.

The experienced engineer does this because they have learned that when someone tells them the code is throwaway, they are probably lying.


If I had a dollar for every proof of concept that I've made that's still in production today serving users I would probably have like 15 dollars.


Prolific!


Only 80% of my db inserts make it to disk; the remaining 20% randomly disappear into the void. But this is OK because Pareto Principle. It's just savvy business.


  20% of team members can communicate 80% of problem definitions and solutions.
  80% of team members can only communicate effectively 20% of the time.

  20% of the volume of communications (meetings, emails, docs) properly communicates 80% of the content.
  80% of the meetings are redundant.
  80% of the money spent on a project has 20% ROI.
After a while this stops making sense.


80% of comments on HN are only read by 20% of its users.

80% of comments read by HN users are only interesting to 20% of the users who read them.

20% of beer drinkers consume 80% of beer drunk

20% of people snooze 80% of alarms

80% of Pareto Principle jokes are amusing only 20% of the time

The thing with Pareto is that it always sounds right, whether or not it's true.


> 80% of comments on HN are only read by 20% of its users.

Some interesting comments about the 1% rule and HN: https://news.ycombinator.com/item?id=9219581


I always fall into this infinite-depth problem when I start thinking about the Pareto principle too. I resolve it by treating the principle as a soft guideline for prioritizing work, not as an objective truth about everything.


You can please 80% of the people 20% of the time, or you can please 20% of the people 80% of the time, but you can't please 80% of the people 80% of the time.


Are you saying the statements you made aren't true?


Those ratios probably vary widely in the real world. Many are probably near 80%/20%, while a few may be around 99.9%/0.1% or 50%/50%.

As you get more data points, your generalization fails more and more often, and some failures are severe, adding more planning risk than everything else combined.

Or, in other words, the rule works most of the time. And yet you can't rely on it at all.


There is a common conceit that if you repeat a word often enough it stops making sense, but in this case the statements are, as is often the case with percentages in tech, meant to sound meaningful but are probably not as clear as they could be.


“Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”


80% of observed behaviours only reproduce 20% of the time.


The Pareto principle to me feels like hand-wavy pseudo-scientific garbage used to summarize some distribution and simultaneously justify some action by it. It's like saying things are non-uniform. No shit?


The drip email collector form for the Whitepaper Download is 404!


Setting that up was not part of the 20% effort that delivers 80% of the value ;).


Don’t expect to ever get to the bottom 80% of the priority list, because new items will be added to the top 20% before you finish the initial top 20%.


I worked for Project Ricochet for 3 years, funny to see this here.



