Answering a Journalist’s Bitcoin Scaling Questions

John Carvalho
9 min read · Feb 14, 2019

Below is an email exchange I had with someone apparently tasked with aggregating text about Twitter fights about Bitcoin stuff. Enjoy!

INTERNET TEXT-AGGREGATING PERSON:

“I’m a writer for a tech website … specifically for its cryptocurrency and blockchain section ... Hope you’re well and all that.

I’m talking to a few Bitcoin community members about Luke Dashjr’s proposed soft fork to decrease the block size that’s gaining traction. Would I be able to get you to let me in on what the hell is going on here?

Is this serious, and if so, what’s the point?

How realistic is it that a soft fork would be initiated to do something like this? I’d just like to try to clear up what the deal is here for readers. Sometime this week would be cool, but I understand if you’re busy.

Thanks”

POPULAR BITCOIN PERSONALITY:

“Thanks for reaching out. I’d be happy to help provide any insight about Luke’s soft fork ideas to reduce block size and my motivations for giving it more attention. To answer your questions so far…

Would I be able to get you to let me in on what the hell is going on here?

Luke has suggested for years now that a smaller block size is necessary to keep Bitcoin secure and decentralized to his standards. I’ve been watching and participating in those conversations, on and off, over time and decided recently that it is something that should be explored further. Although we “won” the Segwit2x/BCash battle, that win came with an effective increase in Bitcoin’s blockspace capacity. This has resulted in increased overhead to running a Bitcoin full node, and I’m just wondering whether we overshot the capacity needs with that change, and whether we actually need the extra space if 2nd layer tech (Lightning Network) and planned optimizations continue to prove themselves.

Is this serious, and if so, what’s the point?

Well, I think Luke is very serious about the concept, but more research is needed to design a proposal with parameters that might have a chance of reaching consensus. The point would be to make running a Bitcoin full node easier in the future for users with more constraints on their ability to dedicate hardware and internet bandwidth to running the full version of Bitcoin, which is the only way to use Bitcoin in a trustless way. For example, if Bitcoin is too difficult to run as a full node, the network will trend to trusting custodians and using wallet software that doesn’t fully validate without trust. This can snowball over time, causing various systemic risks related to centralization, both for new personal users, and for new small business entrepreneurs.

How realistic is it that a soft fork would be initiated to do something like this?

Depends on who you ask! Any change or addition to Bitcoin’s protocol rules is extremely difficult to reach consensus on. In the end, the only way to know for sure is to do sufficient research, make a good proposal, communicate all of the facets and considerations clearly, and supply the new software as an option for people to support and run. Right now, there isn’t a lot of support for blocksize adjustment ideas because many see it as a controversial and sore topic. We all still feel the bruises from the Segwit2x/No2x/BCash debates.

However, the biggest gain from that battle was getting Segwit and Lightning Network technology live. This enabled new efficiencies and scaling capabilities that no other cryptocurrency can achieve when paired with Bitcoin’s unmatched security and decentralization. The Lightning Network has opened a new valve on Bitcoin for instant and extremely low-fee transactions. As I work more and more with the team at Bitrefill.com, I see how great Lightning really is, and I can already see it’s going to be a juggernaut for Bitcoin and onboarding commercial businesses and customers into the Bitcoin ecosystem. That is why I feel a little safer entertaining proposals like Luke’s and sparking public debate on reducing the block size. I just want to make sure we are always keeping the first layer as decentralized and resistant to attack as possible.

In the end, no one person gets to choose the future of Bitcoin, we all have to agree on change for it to happen.

If you have any follow-up questions or would like to talk over voice sometime, let me know!”

INTERNET TEXT-AGGREGATING PERSON:

Thanks so much for the detailed responses. I’ve got some follow-up questions to dive a bit deeper into the specifics:

1. Do you think that 300KB blocks really help decentralisation? Is keeping the chain smaller really important to help get everyone to own their own nodes to ensure they are trusting the least amount of people when they are using Bitcoin as it is meant to be used? Is 300KB small enough to make a difference? This is one tweet that was shared with me.

https://twitter.com/hodlonaut/status/1095442902535176192

2. I’m working with an understanding that more nodes would also reduce the cost of on-chain transactions (smaller block size means less transaction capacity), which would make for higher transaction fees.

To me, it would seem that miners would be keen to support this proposition, as more nodes means more adoption, which increases Bitcoin’s value proposition (increasing their rewards), and they get more transaction fees in the process. Is that a correct line of logic to you?

3. Is there an argument to it being more of a technical barrier that is keeping people from running their own nodes, rather than a higher cost of mining equipment?

4. How does a 300KB block size limit fit in with the Lightning Network in the efforts to scale the network, as well as the Lightning Network’s precursor, SegWit?

5. Some other soft fork proposals that have been highlighted to me are SIGHASH_NOINPUT, Taproot, Graftroot, and Schnorr signatures. Are you interested in any of those as well? If you do issue a formal proposal for a soft fork, do you think the 300KB block size limit is higher or lower in terms of priority for the network?

Thanks for your time, I do appreciate it”

POPULAR BITCOIN PERSONALITY:

1. Do you think that 300KB blocks really help decentralisation? Is keeping the chain smaller really important to help get everyone to own their own nodes to ensure they are trusting the least amount of people when they are using Bitcoin as it is meant to be used? Is 300KB small enough to make a difference? This is one tweet that was shared with me.

“As a general rule, all reduction in Bitcoin’s footprint improves decentralization. I suppose there are diminishing returns at some point, but we aren’t there yet. Much of the work Core developers do is finding ways to optimize the network, minimize the size of transactions, and improve scaling. The whole battle of Segwit2x was basically over this concept. We didn’t want to hard fork and raise the blocksize limit directly because it contributes to centralization. Instead we chose to facilitate a layer 2 and more efficient transacting with Segwit.

Bitcoin’s footprint (chain size, bandwidth, processing load, etc.) has direct effects on the quantity of nodes, particularly at entry levels. For example, my current desktop computer is always close to full because of Bitcoin, putting me literally at the threshold of having to spend another $50+ on drive space and some hours moving my install, or uninstalling it and only keeping my archived copy. This affects small businesses even more: if you run Bitcoin as part of a hosted server setup with customer accounts, Bitcoin will use up a LOT of resources to access the database and perform tasks to actively work with your web wallet or other Bitcoin features. If you try to do too much, you are left having to buy an additional server just for hosting Bitcoin for your business, which, for some small startups, equates to nearly a doubling of expenses.

These may sound like edge cases to people with more resources, but that is the very nature of centralization. Centralization brings efficiency, but it brings censorability and trust requirements with it.
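To put the footprint concern in rough numbers, here is a back-of-the-envelope sketch (my own illustration, not part of the original exchange) of worst-case chain growth per year at a few block sizes, assuming Bitcoin’s ~10-minute block target:

```python
# Back-of-the-envelope chain growth per year at different block sizes.
# Bitcoin targets one block roughly every 10 minutes, so about
# 6 * 24 * 365 = 52,560 blocks per year.
BLOCKS_PER_YEAR = 6 * 24 * 365

def annual_growth_gb(block_size_bytes: int) -> float:
    """Worst-case chain growth in GB/year if every block is full."""
    return BLOCKS_PER_YEAR * block_size_bytes / 1e9

for label, size in [("300 KB proposal", 300_000),
                    ("pre-Segwit 1 MB limit", 1_000_000),
                    ("Segwit worst case ~4 MB", 4_000_000)]:
    print(f"{label}: ~{annual_growth_gb(size):.1f} GB/year")
# 300 KB proposal: ~15.8 GB/year
# pre-Segwit 1 MB limit: ~52.6 GB/year
# Segwit worst case ~4 MB: ~210.2 GB/year
```

These are upper bounds (real blocks are not always full), but they show why a several-fold difference in the limit matters over years of running a node on commodity hardware.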

2. I’m working with an understanding that more nodes would also reduce the cost of on-chain transactions (smaller block size means less transaction capacity), which would make for higher transaction fees.

The quantity of nodes doesn’t have any notable effect on transaction fees. The quantity of nodes improves network propagation, stability, and, assuming the nodes are “real” users, enforcement of the protocol and overall decentralization. A smaller block size could affect transaction fees, but it’s important to understand that blocks will always fill, because demand for blockspace is essentially infinite. They may often appear less than full, but that is only because no one is bothering to fill them with “spam” transactions at the moment. (There’s technically no such thing as true spam in Bitcoin; what I’m referring to is transactions that pay a negligible fee.) This is why we needed layer 2, in the form of the Lightning Network. LN opens up exponential scaling capability by moving transactions off of the permanently stored chain while still anchoring to it in a way that does not require trust or custody, and still leverages the security and censorship resistance of the base layer (plus it allows transactions to be instant and super-cheap!)

To me, it would seem that miners would be keen to support this proposition, as more nodes means more adoption, which increases Bitcoin’s value proposition (increasing their rewards), and they get more transaction fees in the process. Is that a correct line of logic to you?

In theory that is true and I agree. I suspect that Bitcoiners who are scared of this proposal would be surprised that it aligns with miner incentives and might get more support than they are predicting. If Bitcoin is already scaling well, AND allowing instant, cheap transactions on layer 2, that means we can finally onboard merchants and consumers and use it as p2p cash, something that was just an inaccurate meme before but is now true with Lightning. This vastly improves the value of the chain and means we need miners and mining difficulty to scale with that value. If we start having 100x more value transacting on Bitcoin layers, we will need to protect it, and we will need fees high enough to draw in the miners to do it. Or… we at least need the Bitcoin price to go up as a factor to offset whatever is lacking in fees. Luckily Bitcoin has lots of pillars to lean on in its incentive structure, allowing things to load-balance as necessary.

3. Is there an argument to it being more of a technical barrier that is keeping people from running their own nodes, rather than a higher cost of mining equipment?

Is there an argument about UX/UI that refutes this block reduction proposal? No, not in my opinion. However, I do think more attention should be paid to the UI of Bitcoin Core, and that a slicker interface, with more features supported within the UI, would improve stickiness and reduce some mental friction. But the actual steps to loading a node for the first time couldn’t get much more basic.

4. How does a 300KB block size limit fit in with the Lightning Network in the efforts to scale the network, as well as the Lightning Network’s precursor, SegWit?

Well, the actual optimally small limit is unknown, and I think more research is needed. There’s a complexity with Segwit because it changed the way blockspace is measured, from a simple size to a “weight,” with witness data discounted by a factor of 0.25. In the end, what I want to see is this discussion spurring the community to establish good theory and research for determining the minimum blocksize limit within the current Segwit/layer 2 paradigm. People need to get used to using and supporting LN, and to saving L1 blockspace for more settlement-layer style transacting.
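As a rough sketch of the weight accounting being referred to (per BIP 141; the example numbers are my own, not from the exchange): non-witness bytes count 4 weight units each while witness bytes count only 1, which is where the 0.25 discount comes from, and the consensus cap is 4,000,000 weight units.

```python
# Segwit measures blocks in "weight units" (WU), not raw bytes.
# Non-witness bytes count 4 WU each; witness bytes count 1 WU each,
# i.e. witness data is discounted to 0.25 of the cost of other data.
MAX_BLOCK_WEIGHT = 4_000_000  # consensus limit since BIP 141

def block_weight(non_witness_bytes: int, witness_bytes: int) -> int:
    return 4 * non_witness_bytes + witness_bytes

# A block with no witness data hits the cap at exactly 1 MB of raw bytes...
assert block_weight(1_000_000, 0) == MAX_BLOCK_WEIGHT
# ...while a witness-heavy block can exceed 1 MB raw and still fit:
assert block_weight(900_000, 400_000) == MAX_BLOCK_WEIGHT  # 1.3 MB raw
```

This is why any “300KB” proposal has to be translated into weight-unit terms before it can be compared against today’s effective limit.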

5. Some other soft fork proposals that have been highlighted to me are SIGHASH_NOINPUT, Taproot, Graftroot, and Schnorr signatures. Are you interested in any of those as well? If you do issue a formal proposal for a soft fork, do you think the 300KB block size limit is higher or lower in terms of priority for the network?

I personally have no plans to submit or contribute to any proposals in any direct or official way. I am focused on my work at Bitrefill and helping it fulfill a role in demonstrating Bitcoin and LN usage and utility through commercial usage.

The other technology schemes you mentioned are mostly unrelated to the concept of reducing the blocksize. Some improve privacy; some enable more complex utilities and reduce footprint in other ways. They are compatible with, or complementary to, Luke’s priorities.

I think Luke will have trouble getting people all the way down to 300KB. The proper approach is to show research and for him to get more input from experts about what an optimal reduction would be. From a social standpoint, Luke would probably have an easier time packaging this as a blocksize “reversion” and getting the effective limit back down to 1MB.”

INTERNET TEXT-AGGREGATING PERSON:

(completely disregards second round of questions & answers and the deadline of “sometime this week” and posts half an article with errors and misattributions and doesn’t bother getting/finding any quotes from Luke)

POPULAR BITCOIN PERSONALITY:

(posts this blog, which is hopefully more useful and interesting than the article)

--

John Carvalho

This is a blog about Bitcoin dynamics and how people interact with it. I am currently CEO at Synonym.