Dynamic blocksize question

As it stands, the blocksize penalty is calculated from the median size of the last 100 blocks (3h20m of blocks). This prevents spamming the network and cramming in as many transactions as possible at peak times, but in the long term it does little more than smooth out the rate of blocksize growth, with no hard limit.
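To make the mechanism concrete, here is a minimal sketch of the quadratic reward penalty as commonly described for Monero (simplified; the 100-block median window and the exact constants are assumptions for illustration):

```python
def penalty(base_reward: float, block_size: int, median_size: int) -> float:
    """Reward penalty for a block larger than the rolling median size.

    No penalty up to the median. Above it, the reward is docked
    quadratically: penalty = base_reward * ((size / median) - 1)^2.
    Blocks larger than 2x the median are rejected outright.
    """
    if block_size <= median_size:
        return 0.0
    ratio = block_size / median_size - 1.0
    if ratio > 1.0:
        raise ValueError("block exceeds 2x median and would be rejected")
    return base_reward * ratio ** 2

# A block 50% over the median forfeits 25% of the base reward:
print(penalty(1.0, 150_000, 100_000))  # 0.25
```

The point of the question follows from this shape: the penalty only prices in *deviation from* the recent median, so if demand grows steadily the median itself can grow without bound.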

Suppose that in the near future our hopes materialise and Monero sees widespread adoption across the world for everyday transactions: what prevents the blocksize from reaching insane levels, so that no one other than data centers can run a node? Wouldn't that pose a massive centralisation problem?

As an example to illustrate my concern: to handle the number of transactions Visa currently processes, each block would have to be about 625MB, if my calculations are correct.
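A back-of-the-envelope check of that figure, with every input an assumption (Visa throughput, average Monero transaction size, and Monero's 2-minute block target):

```python
# All constants are assumed for illustration, not measured values.
VISA_TPS = 2000          # assumed Visa transactions per second
TX_SIZE_BYTES = 2600     # assumed average Monero tx size (~2.6 kB)
BLOCK_TIME_S = 120       # Monero's 2-minute block target

block_size_mb = VISA_TPS * BLOCK_TIME_S * TX_SIZE_BYTES / 1_000_000
print(f"~{block_size_mb:.0f} MB per block")  # ~624 MB
```

Under those assumptions the estimate lands right around the 625MB cited above; different throughput or transaction-size figures would move it proportionally.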

submitted by /u/LuckyJumper
