Why Scaling Bitcoin with Sharding Is Very Hard (2015) (petertodd.org)
72 points by elmar on Dec 10, 2017 | 23 comments



this is pretty good. Key line for me:

> How do you prove that a coin has validly been spent? First, prove that it hasn’t already been spent! How do you do that if you don’t have the blockchain data? You can’t, and no amount of fancy math can change that.

and also, something Bitcoin devs realise that Bitcoin fans often don't seem to:

> On the other hand, decentralization isn’t cheap: using PayPal is one or two orders of magnitude simpler than the Bitcoin protocol.


> How do you prove that a coin has validly been spent? First, prove that it hasn’t already been spent! How do you do that if you don’t have the blockchain data? You can't

There is actually a trivial way to do it. We could prune ~98% of the blockchain data by committing to the UTXO set; this is called a UTXO set commitment. Here is a very brief overview of how it works (this isn't classic "pruning", and it's something I wrote before I was aware of UTXO set commitments, but it's still a good introduction to the general concept):

https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017...

I'm actually surprised Peter doesn't touch on the subject, since he's been researching this idea himself. IMHO, UTXO set commitments are one of the most promising, if not the most promising, ways to scale the blockchain. Sadly the idea seems under-researched compared to other alternatives.
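
For concreteness, here's a minimal sketch of the commitment itself (the leaf serialization and tree rules below are arbitrary choices I made for illustration, not Bitcoin's actual format): hash every unspent outpoint into a Merkle tree and commit the root, so a pruned node can check compact membership proofs against the root instead of storing the full history.

    # Toy UTXO set commitment. Illustrative only: the leaf format and
    # tree construction are assumptions, not a consensus specification.
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def merkle_root(leaves):
        if not leaves:
            return h(b"")
        level = [h(leaf) for leaf in leaves]
        while len(level) > 1:
            if len(level) % 2:  # duplicate the last node on odd-sized levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    def utxo_commitment(utxo_set):
        # utxo_set: dict of (txid: bytes, vout: int) -> serialized output bytes
        leaves = [txid + vout.to_bytes(4, "little") + out
                  for (txid, vout), out in sorted(utxo_set.items())]
        return merkle_root(leaves)

A node that validates up to the committed root can then be handed just the branch for a particular output to prove it is currently unspent.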


It was the great hope early on, but it turned out to be impractical to implement.


What are the practical issues?

Serialization of the UTXO set? Dealing with large rollbacks? Time spent computing the hash?

None of those seem impractical.


There's some detail here:

> TXO commitments aren't a new idea - the author proposed them years ago in response to UTXO commitments. However it's critical for small miners' orphan rates that block validation be fast, and so far it has proven difficult to create (U)TXO implementations with acceptable performance; updating and recalculating cryptographically hashed merkelized datasets is inherently more work than not doing so. Fortunately if we maintain a UTXO set for recent outputs, TXO commitments are only needed when spending old, archived, outputs. We can take advantage of this by delaying the commitment, allowing it to be calculated well in advance of it actually being used, thus changing a latency-critical task into a much easier average throughput problem.

https://petertodd.org/2016/delayed-txo-commitments
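
As a rough illustration of that delayed-commitment idea (the class, names, and 100-block window below are my assumptions, not Peter Todd's implementation): keep a plain map for recent outputs on the latency-critical path, and only fold outputs older than some window into the merkleized archive, so the expensive recomputation trails well behind the tip.

    # Sketch only: the plain-dict "archive" stands in for the real
    # merkleized TXO structure, and DELAY is a hypothetical window.
    DELAY = 100  # maturity window, in blocks

    class DelayedTXOCommitment:
        def __init__(self):
            self.recent = {}    # outpoint -> (creation_height, output); fast path
            self.archived = {}  # stand-in for the merkleized, committed TXO set

        def add_output(self, outpoint, output, height):
            self.recent[outpoint] = (height, output)

        def advance_tip(self, tip_height):
            # Fold matured outputs into the archive. In a real design this is
            # where the merkle structure gets recomputed, long after the blocks
            # that created these outputs were validated.
            matured = [op for op, (ht, _) in self.recent.items()
                       if ht <= tip_height - DELAY]
            for op in matured:
                self.archived[op] = self.recent.pop(op)

Block validation only ever touches self.recent; spends of archived outputs would carry their own proofs against the delayed commitment.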


Thanks! That is pretty much what I was looking for.


Why couldn't you use a sufficiently large bloom filter to determine if a coin has been spent?


How do you verify the authenticity of the bloom filter?


That’s not really a problem – anyone could verify it by downloading the full blockchain. The important thing is that you don’t need the full blockchain to get a negative answer from a bloom filter, which helps.

The problem happens when you get a positive answer from the bloom filter; i.e. a coin might have been previously spent. As far as I know you then still need the full blockchain to verify.
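
A toy sketch of that asymmetry (the filter size, hash count, and SHA-256-based indexing are arbitrary choices for illustration):

    # A "no" from the filter is a real proof of non-spend; a "yes" may be a
    # false positive and still has to be checked against the actual chain data.
    import hashlib

    class BloomFilter:
        def __init__(self, size_bits=1 << 20, num_hashes=7):
            self.size = size_bits
            self.k = num_hashes
            self.bits = bytearray(size_bits // 8)

        def _positions(self, item: bytes):
            for i in range(self.k):
                d = hashlib.sha256(bytes([i]) + item).digest()
                yield int.from_bytes(d[:8], "little") % self.size

        def add(self, item: bytes):
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def maybe_contains(self, item: bytes) -> bool:
            return all(self.bits[p // 8] & (1 << (p % 8))
                       for p in self._positions(item))

    spent = BloomFilter()
    spent.add(b"some-spent-outpoint")
    spent.maybe_contains(b"never-seen-outpoint")  # usually False: definitive answer
    spent.maybe_contains(b"some-spent-outpoint")  # True: spent, or just a collision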


Some coins do sharding by not using a single blockchain at all, e.g. RaiBlocks [1], which uses a "block-lattice" approach and is secured by proof of stake.

[1] https://raiblocks.net/media/RaiBlocks_Whitepaper.pdf


There's also the Iota "Tangle": https://iota.org/IOTA_Whitepaper.pdf


Sharding isn't any easier with the Tangle.


Proof of stake is not trustless.


Indeed, that's the case. It's an important distinction, but one which may not matter as much as it used to.


It's hard but not necessarily impossible. Here's Ethereum's sharding FAQ:

https://github.com/ethereum/wiki/wiki/Sharding-FAQ


Hi Dennis, a friend recommended you to me for Ethereum smart contract audit, are you open/available for a project?


Willing to talk at least, how do I contact you?


The fact that it isn't working yet, if it ever works at all, is a good indication of the problems faced.


Yes, it's not implemented yet. I think Vitalik said the first phase of sharding is planned to be implemented late next year. So we'll see.



That is not a sharded blockchain and shares the same fundamental restriction: all nodes must verify all blocks.

(though it does offer some nice improvements, we're talking factors of 2-5, not 100-1000)


Bitcoin is over.


... $15,000



