This could be an interesting companion to Helium. Helium-css can be used on live sites to find unused CSS. Combine with this project to remove duplicates too. https://github.com/geuis/helium-css
I'll echo the sibling commenter. We're in the process of rolling out a new site layout at $work and I suspect there are a LOT of leftover CSS rules I can get rid of. Thanks very much for the link!
Personally I used the Chrome dev tools, which have a feature called 'CSS Selector Profiles' under the 'Profiles' tab.
You can 'record' yourself using a website, and Chrome records how many 'hits' each CSS rule gets.
This is better than Dust-me and all the other CSS rule detectors because it lets you use your site in a 'dynamic' way and test all the CSS edge cases (like resizing your browser down, or enabling an error message, etc.).
After using my SaaS for an hour, I found hundreds of rules that just never got used.
It also tells you how many times each rule got used. So I found a number of rules that literally only got used once, and I could often rewrite them into 1-2 bigger rules, further reducing my CSS overhead.
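For example (hypothetical selectors, not my real ones), three one-off rules like

    .promo-banner  { margin-bottom: 12px; color: #c00; }
    .trial-notice  { margin-bottom: 12px; color: #c00; }
    .upgrade-nudge { margin-bottom: 12px; color: #c00; }

can be collapsed into one:

    .promo-banner,
    .trial-notice,
    .upgrade-nudge { margin-bottom: 12px; color: #c00; }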
That sounded really cool. So I opened up Chrome, navigated to YouTube, opened the dev tools, turned on CSS Selector Profiles, and Chrome immediately crashed.
Yeah, and they lie in their first example. The "example code" is nothing like their real demo code. If it was, you'd be able to use their select boxes if you had JS disabled...
Just a side note: it took me a few tries to hit "OSX." My first two tries were "OS X" and "Mac OS X" before I noticed something was coming up and disappearing when I typed "OS X".
Also, Chosen only works with <option> tags, which can be quite annoying. And to me, the Select2 community seems a lot more active, and it's easier to participate in the project.
If you're a Ruby developer I strongly encourage you to check out the source code, and the parsing in particular. It makes heavy use of parslet [1] to build a CSS parser [2]. I'm sure there are edge cases I have missed, but still, the LOC for this codebase is relatively small and fairly readable.
Stay away from the RedundancyAnalyzer though. There be dragons.
CSSO [1] is a tool that removes duplicate declarations during minification instead of just warning about them. It also performs more advanced structural optimizations.
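For instance, one of its structural optimizations is merging rules whose declaration blocks are identical; roughly this kind of rewrite (simplified, hypothetical selectors):

    .header { color: #333; padding: 10px; }
    .footer { color: #333; padding: 10px; }

    /* after csso */
    .header, .footer { color: #333; padding: 10px; }

It will also drop a declaration that is immediately overridden later in the same block.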
I think the purpose of this package is more about making existing source code easier to maintain (applying DRY principles to CSS) than about browser-oriented optimization (the purpose of CSSO).
Here is something I came across at PyCon. The technique uses a genetic algorithm to minimize CSS. The author claims a 10% improvement over a standard CSS minifier.
It seems like, when run in SCSS mode, it expands mixins before running the redundancy check. For instance, two of my selectors both include the same three mixins, which end up expanding into 23 rules, and CSSCSS reports 25 shared rules between them. (Further inspection confirms that exactly 2 normal rules are shared.)
I would think that using mixins should already count as having eliminated the redundancy. A simple solution would be to ignore them; or, more ideally, the redundancy check could treat a mixin just like a normal rule, so that including the same set of mixins in multiple places would itself be detected as redundancy.
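A sketch of what I mean (hypothetical mixin and selectors): in the SCSS the shared declarations live in one place,

    @mixin card-base {
      border: 1px solid #ddd;
      border-radius: 3px;
      padding: 10px;
    }

    .sidebar-widget { @include card-base; }
    .profile-panel  { @include card-base; }

but the compiled CSS that the redundancy check sees is

    .sidebar-widget { border: 1px solid #ddd; border-radius: 3px; padding: 10px; }
    .profile-panel  { border: 1px solid #ddd; border-radius: 3px; padding: 10px; }

so every declaration in the mixin gets reported as shared, even though the source is already DRY.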
This has come up a couple of times. I wouldn't mind moving this topic to GitHub so the conversation doesn't get lost, and others can contribute to it long after this gets buried on HN.
My initial opinion is that even though your SCSS code is consolidated, the resulting CSS code is still duplicated all over the place. To me, that is a code smell, particularly when I need to debug from the web developer tools.