
I think it's a better deterrent to just release software updates often. Of course, you need to give the user a compelling reason to want to update.



In the old days (the '80s and early/mid '90s), when people would distribute small, simple patches to disable protections (i.e., "crack" executable files), a fast release cycle for the software thwarted the simple cracks. That situation did not last long: the crackers started using more sophisticated techniques, like search-string patching and key generators.
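
To illustrate the search-string idea, here's a minimal sketch in Python. Every byte value below is a hypothetical placeholder, not from any real program; the point is just that the patch keys off a byte pattern rather than a fixed file offset, so it keeps working across releases that merely shuffle code around:

    # Minimal sketch of search-string patching: instead of patching a fixed
    # file offset (which breaks with every new release), locate a byte
    # pattern and patch wherever it lands. All byte values here are
    # hypothetical placeholders, not taken from any real program.

    def patch_by_signature(data: bytes, signature: bytes, patch: bytes) -> bytes:
        """Find `signature` anywhere in the binary and overwrite it in place."""
        assert len(signature) == len(patch), "patch must not change file layout"
        offset = data.find(signature)
        if offset == -1:
            raise ValueError("signature not found; the protection code changed")
        return data[:offset] + patch + data[offset + len(patch):]

    with open("app.exe", "rb") as f:  # hypothetical target file
        binary = f.read()

    # Example: turn a conditional jump (74 = JZ rel8) into an unconditional
    # one (EB = JMP rel8) so the "bad serial" branch is never taken.
    patched = patch_by_signature(
        binary,
        signature=bytes.fromhex("85 c0 74 1a 68"),  # test eax,eax / jz ... (hypothetical)
        patch=bytes.fromhex("85 c0 eb 1a 68"),      # test eax,eax / jmp ...
    )

    with open("app_patched.exe", "wb") as f:
        f.write(patched)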

The task of maintaining an ongoing disassembly across multiple release versions of some software is actually straightforward. The "dumb" (but useful) way to do it is to fingerprint all of the subroutines in the old disassembly, then use those fingerprints to identify the corresponding routines in the new disassembly (IDB2PAT). The "smart" way to do it is the graph-theoretic approach of Halvar Flake.
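
A toy sketch of the fingerprinting idea in Python, assuming a disassembler has already exported each function's bytes together with the offsets of its relocated operands (the names and data layout here are mine, not IDB2PAT's):

    # Toy sketch of IDB2PAT-style fingerprint matching. Assumes the
    # disassembler has exported, for each function, its raw bytes plus the
    # offsets of relocated operands (addresses that change between builds).
    # Masking those offsets makes the fingerprint stable across relinking.

    import hashlib

    def fingerprint(func_bytes: bytes, reloc_offsets: set) -> str:
        masked = bytes(0x00 if i in reloc_offsets else b
                       for i, b in enumerate(func_bytes))
        return hashlib.sha256(masked).hexdigest()

    def match_functions(old_funcs: dict, new_funcs: dict) -> dict:
        """Carry names from the old disassembly over to the new one.

        old_funcs: {name: (func_bytes, reloc_offsets)}
        new_funcs: {address: (func_bytes, reloc_offsets)}
        Returns {new_address: old_name} for every exact fingerprint hit;
        whatever is left unmatched gets handled by hand (or by a
        graph-based matcher).
        """
        index = {fingerprint(b, r): name for name, (b, r) in old_funcs.items()}
        return {addr: index[fp]
                for addr, (b, r) in new_funcs.items()
                if (fp := fingerprint(b, r)) in index}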

Anyone in the Anti-Virus or compatibility industries can confirm both the capacity and the need to maintain disassemblies across multiple versions of software.

Pumping out a relentless stream of new versions of your software is no longer a deterrent, and hasn't been for over a decade.


In the '80s/'90s, were software companies able to develop, test, and distribute updates as efficiently as the warez community?


I think it's an unfair question. The creators are always at a disadvantage, since the replicators get to leverage and reuse the creators' efforts.


I'm sorry, I did not mean to be unfair. I was curious whether companies were able to distribute updates pre-broadband. I can remember downloading the twenty-something floppy images for OS/2 over dial-up.


At one point in time, software companies sent updates on magnetic tape through the postal mail. One of the cleverest hacks I've read about was when a group doing penetration testing mailed a fake (backdoored) update tape to the target.

When it comes to the efficiency of distribution, it's best to think of it in terms of the constraints and requirements.

Without a way to duplicate and distribute their products to customers, software companies could not exist, so the capacity to duplicate and the ability to distribute are both requirements.

Those very same duplication and distribution methods used by the company can also be used by others to further (re)distribute additional copies.

The difference is that the software companies operate under the constraint of needing to make a living by selling copies of their products, so there's really no way to make a fair comparison between the efficiency of the companies' methods and that of the people making additional copies. You're essentially comparing farmers to chefs: one produces the food, while the other prepares it.



