
I’d love for others more knowledgeable to chime in, since this feels close to the logical end state for non-user-facing distribution. At a protocol level, content basically becomes a combination of a hash/digest and one or more canonical sources/hubs. This allows any intermediary to cache or serve the content to reduce bandwidth and increase locality, and it could have many different implementations for different environments, taking advantage of local networks as well as public networks in a similar fashion to recursive DNS resolvers. That way you could transparently cache at the host level as well as at, e.g., your cloud provider, to reduce latency and bandwidth.
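A minimal sketch of the idea: because the digest, not the source, authenticates the content, a client can try any chain of intermediaries (host cache, LAN mirror, canonical hub) and accept the first response that matches. All names here (`fetch_verified`, the source names, the toy in-memory "network") are hypothetical illustrations, not any real protocol's API.

```python
import hashlib

def fetch_verified(digest_hex, sources, fetch):
    """Try sources in priority order (e.g. host cache, LAN mirror,
    canonical hub). The digest -- not the source -- is what
    authenticates the content, so any intermediary may serve it."""
    for src in sources:
        blob = fetch(src)
        if blob is not None and hashlib.sha256(blob).hexdigest() == digest_hex:
            return blob  # verified, regardless of who served it
    raise LookupError("no source could serve " + digest_hex)

# Toy in-memory "network": source name -> bytes it would serve.
payload = b"layer-contents"
digest = hashlib.sha256(payload).hexdigest()
network = {
    "local-cache": None,           # cache miss
    "lan-mirror": b"corrupted!!",  # bad copy, rejected by the hash check
    "canonical-hub": payload,      # authoritative copy
}
blob = fetch_verified(digest, ["local-cache", "lan-mirror", "canonical-hub"],
                      network.get)
```

Note that the corrupted mirror is harmless: a bad copy simply fails the hash check and the client falls through to the next source.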


Sounds a lot like BitTorrent.


I’m not super well versed, but I thought BitTorrent’s main contribution was essentially the chunking and the distributed hash table. The different layers of a Docker image are perhaps a good analog.
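The chunking idea can be sketched in a few lines: split a blob into fixed-size pieces and hash each piece, so peers can verify pieces independently as they arrive from different sources. This is only an illustration of the concept (`piece_hashes` is a made-up helper, and SHA-256 stands in for whatever hash a given protocol actually uses).

```python
import hashlib

def piece_hashes(data, piece_size):
    """Split a blob into fixed-size pieces and return one digest per
    piece, so each piece can be verified on its own."""
    return [hashlib.sha256(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

blob = b"AAAAA" + b"BBBBB" + b"CCC"
hashes = piece_hashes(blob, 5)
# Identical pieces hash identically, which is also roughly why shared
# Docker image layers only need to be fetched and stored once.
```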



