Discussion:
[bitcoin-dev] Dynamic limit to the block size - BIP draft discussion
Washington Sanchez via bitcoin-dev
2015-09-08 07:45:16 UTC
Permalink
Hi everyone,

I know many of us feel that the last thing the Bitcoin community needs is
another BIP related to the block size, but after a lot of reading and
commenting, I'd like to throw this idea out there.

I've already written it up as a BIP and would like some constructive
feedback/suggestions/alternatives related to some of the variables in my
specification:


Dynamic limit to the block size
===============================

The goal is to dynamically increase the maximum block size conservatively,
but allow meaningful relief to transaction volume pressure in response to
true market demand. The specification follows:

- Every 4032 blocks (~4 weeks), the maximum block size will be increased by
10% *IF* a minimum of 2000 blocks have a size >= 60% of the maximum block
size at that time
+ This works out to a theoretical maximum of 13 increases per year
- The maximum block size can only ever be increased, never decreased

For example, if this rule were instituted on January 1st 2016, with a
present maximum block size of 1 MB, the limit would be increased to 1.1 MB on
January 29th 2016. The theoretical maximum block size at the end of 2016
would be ~3.45 MB, assuming all 13 increases are triggered.

As the maximum block size rises, so does the cost of artificially triggering
an increase in the maximum block size.
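For concreteness, the trigger described above can be sketched in a few lines of Python (illustrative only; the constant and function names are mine, not part of any implementation):

```python
# Sketch of the proposed trigger (illustrative; constant and function
# names are mine, not from any implementation).
EVALUATION_PERIOD = 4032   # blocks, ~4 weeks
THRESHOLD = 2000           # blocks that must exceed the capacity level
CAPACITY_LEVEL = 0.60      # fraction of the current maximum block size
INCREASE_FACTOR = 1.10     # +10% per triggered adjustment

def next_max_block_size(block_sizes, current_max):
    """block_sizes: sizes in bytes of the last EVALUATION_PERIOD blocks."""
    assert len(block_sizes) == EVALUATION_PERIOD
    full_enough = sum(1 for s in block_sizes
                      if s >= CAPACITY_LEVEL * current_max)
    if full_enough >= THRESHOLD:
        return round(current_max * INCREASE_FACTOR)
    return current_max  # under this proposal the limit never decreases
```

With a 1 MB limit, 2000 blocks of 600 kB or more over the evaluation period would raise the limit to 1.1 MB; anything less leaves it unchanged.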


Regards,
Wash


-------------------------------------------
*Dr Washington Y. Sanchez <http://onename.com/drwasho>*
Co-founder, OB1 <http://ob1.io>
Core developer of OpenBazaar <https://openbazaar.org>
@drwasho <https://twitter.com/drwasho>
Btc Drak via bitcoin-dev
2015-09-08 08:49:31 UTC
Permalink
but allow meaningful relief to transaction volume pressure in response to true market demand
If the blocksize can only increase then it's like a market that only goes
up, which is unrealistic. Transaction volume will ebb and flow
significantly. Some people have been looking at transaction volume
charts over time and all they can see is an exponential curve which
they think will go on forever, yet nothing goes up forever and it will
go through significant trend cycles (like everything does). If you
don't want to hurt the fee market, the blocksize has to be elastic and
allow contraction as well as expansion.

_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Ivan Brightly via bitcoin-dev
2015-09-08 12:28:56 UTC
Permalink
This is true, but miners already control block size through soft caps.
Miners are fully capable of producing smaller blocks regardless of the max
block limit, with or without collusion. Arguably, there is no need to ever
reduce the max block size unless technological advances for some reason
reverse course - e.g., WW3 takes a toll on the internet and the average
bandwidth available halves. The likelihood of a significant technology
contraction in the near future seems rather low, and it would be more broadly
problematic for society than for bitcoin specifically.

The only reason for reducing the max block limit other than technology
availability is if you think that this is what will produce the fee market,
which is back to an economic discussion - not a technology scaling
discussion.

Adam Back via bitcoin-dev
2015-09-08 13:13:16 UTC
Permalink
The maximum block-size is one that can be filled at zero-cost by
miners, and so allows some kinds of amplification of selfish-mining
related attacks.

Adam


Ivan Brightly via bitcoin-dev
2015-09-08 13:52:00 UTC
Permalink
Agreed. For this reason, the scaling BIPs which don't allow for easy gaming
such as BIP101, your proposal or Pieter's are preferable for their
predictability and simplicity. Changing the fundamental rules for Bitcoin
is supposed to be hard - why give this power up to a subsection of the
ecosystem in order to make it easier to change or game?
Washington Sanchez via bitcoin-dev
2015-09-08 14:02:51 UTC
Permalink
Post by Adam Back via bitcoin-dev
The maximum block-size is one that can be filled at zero-cost by
miners, and so allows some kinds of amplification of selfish-mining
related attacks
A selfish mining attack would have to be performed for at least 2000 blocks
over a period of 4 weeks in order to achieve a meager 10% increase in the
block size.

If the goal is simply to drive up fees to gain acceptance into the block,
we're in exactly the same position we are in today (as in, nothing stops a
miner from doing this).
If the goal is to increase the block size to push out smaller miners,
they'll have to perform this attack over the course of years and destroy
any economic incentive they have for mining in the first place.

Post by Ivan Brightly via bitcoin-dev
why give this power up to a subsection of the ecosystem in order to make
it easier to change or game
Well, the same could be said for developers trying to predict what the
appropriate block size should be over the next 20 years... it's akin to
a group of bankers trying to predict the appropriate interest rate for
the entire economy. Just as it is impossible to predict the appropriate
hash rate to secure the network, so it goes for the block size. Both need
to adjust dynamically to the scale/adoption of the network.
--
-------------------------------------------
*Dr Washington Y. Sanchez <http://onename.com/drwasho>*
Co-founder, OB1 <http://ob1.io>
Core developer of OpenBazaar <https://openbazaar.org>
@drwasho <https://twitter.com/drwasho>
Adam Back via bitcoin-dev
2015-09-08 14:18:03 UTC
Permalink
A selfish mining attack would have to be performed for at least 2000 blocks over a period of 4 weeks in order to achieve a meager 10% increase in the block size.
You seem to be analysing a different attack - I mean that if someone
has enough hashrate to do a selfish mining attack, then setting up a
system that has no means to reduce block-size risks that, at a point
where there is excess block-size, they can use that free transaction
space to amplify selfish mining instead of collecting transaction
fees.

Adam
Washington Sanchez via bitcoin-dev
2015-09-08 15:10:54 UTC
Permalink
1) It's not really clear to me how that would work, but assuming it does
then it will go into a basket of attacks that are possible but unlikely due
to the economic disincentives to do so.

2) That said, is the Achilles heel of this proposal the lack of a mechanism
to lower the block size?

3) Let me put it another way, I've read that both Gavin and yourself are
favorable to a dynamic limit on the block size. In your view, what is
missing from this proposal, or what variables should be adjusted, to get
the rules to a place where you and other Core developers would seriously
consider it?
--
-------------------------------------------
*Dr Washington Y. Sanchez <http://onename.com/drwasho>*
Co-founder, OB1 <http://ob1.io>
Core developer of OpenBazaar <https://openbazaar.org>
@drwasho <https://twitter.com/drwasho>
Andrew Johnson via bitcoin-dev
2015-09-08 16:46:33 UTC
Permalink
I rather like this idea; I like that we're taking block scaling back to a
technical method rather than a political one. BIP100 is frightening to me as
it gives a disproportionate amount of power to the miners, who can already
control their own blocksize with a soft cap. It also seems silly to worry
about a selfish mining attack if you're going to institute a miner vote
that an entity with that much hashrate can noticeably influence anyway.

101 is better but is still attempting to guess at technological
progression quite far into the future. And then when we do finally hit 8 GB
we will need yet another hard fork if we need to go bigger (or we may need
to do it earlier if the increase schedule isn't aggressive enough). And
who knows how large the ecosystem may be at that time; a hard fork may be
an undertaking of truly epic proportions due to the sheer number of devices
and pieces of embedded firmware that operate on the block chain.

I've done no math on this (posting from mobile) but something similar to
this would be reasonable, I think. Unbounded growth, as Adam points out,
is also undesirable.

Every 4032 blocks (~4 weeks), the maximum block size will be decreased by
10% *IF* a minimum of 2500 blocks have a size <= 40% of the maximum block
size at that time.

This requires a larger threshold to be crossed to move downwards, so that
we hopefully aren't oscillating back and forth constantly. I'll try to do
some blockchain research sometime this week and either back up my
plucked-from-the-air numbers or change them.
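Taken together with the original increase rule, the suggestion above could be sketched as follows (all names are mine and the exact thresholds are placeholders, not a concrete spec; the asymmetric thresholds provide the hysteresis described):

```python
# Hypothetical combination of the original increase rule with the
# contraction rule suggested above. Asymmetric thresholds (50% of blocks
# to go up, ~62% to go down) damp oscillation. Illustrative only.
PERIOD = 4032
UP_THRESHOLD, UP_LEVEL = 2000, 0.60      # from the original proposal
DOWN_THRESHOLD, DOWN_LEVEL = 2500, 0.40  # suggested contraction numbers
STEP = 0.10                              # 10% move either way

def adjust(block_sizes, current_max):
    big = sum(1 for s in block_sizes if s >= UP_LEVEL * current_max)
    small = sum(1 for s in block_sizes if s <= DOWN_LEVEL * current_max)
    if big >= UP_THRESHOLD:
        return round(current_max * (1 + STEP))
    if small >= DOWN_THRESHOLD:
        return round(current_max * (1 - STEP))
    return current_max
```

Note that a period of mid-sized blocks (between 40% and 60% of the limit) triggers neither branch, so the limit holds steady.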

Andrew Johnson
Gavin Andresen via bitcoin-dev
2015-09-08 17:04:16 UTC
Permalink
Post by Washington Sanchez via bitcoin-dev
3) Let me put it another way, I've read that both Gavin and yourself are
favorable to a dynamic limit on the block size. In your view, what is
missing from this proposal, or what variables should be adjusted, to get
the rules to a place where you and other Core developers would seriously
consider it?
I'm not clear on what problem(s) you're trying to solve.

If you want blocks to be at least 60% full, then just specify a simple rule
like "maximum block size is 1.0/0.6 = 1.666 times the average block size
over the last N blocks (applied at every block or every 2016 blocks or
whatever, details don't really matter)".

If you want an upper limit on growth, then just implement a simple rule
like "Absolute maximum block size is 1 megabyte in 2016, 3.45 megabytes in
2017, and increases by a maximum of 3.45 times every year."
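Both rules really are simple enough to state in a couple of lines each; a sketch of the wording above (function names are mine, and these are not concrete proposals):

```python
# Two simple alternative rules, as worded above. Illustrative sketches.

def elastic_cap(recent_sizes, target_fullness=0.6):
    """Max block size = 1/0.6 (~1.666) times the average recent block size."""
    return sum(recent_sizes) / len(recent_sizes) / target_fullness

def absolute_cap(year, base=1_000_000, growth=3.45, start_year=2016):
    """Hard schedule: 1 MB in 2016, growing by at most 3.45x per year."""
    return base * growth ** (year - start_year)
```

Under the first rule, blocks averaging 600 kB would yield a 1 MB cap; under the second, the cap is fixed per calendar year regardless of usage.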

If you want me to take your proposal seriously, you need to justify why 60%
full is a good answer (and why we need a centralized decision on how full
blocks "should" be), and why 3.45 times-per-year is a good answer for
maximum growth (and, again, why we need a centralized decision on that).
--
Gavin Andresen
Washington Sanchez via bitcoin-dev
2015-09-08 23:11:49 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
If you want me to take your proposal seriously, you need to justify why
60% full is a good answer
Sure thing Gavin.

If you want blocks to be at least 60% full...


First off, I do not want blocks to be at least 60% full, so let me try and
explain where I got this number from:

- The idea of this parameter is to set a *triggering level* for an increase
in the block size
- The triggering level is the point where a reasonable medium-term trend
can be observed. That trend is an increase in the transaction volume that,
left unchecked, would fill up blocks.
- Determining the appropriate triggering level is difficult, and it
consists of 3 parameters:
   1. Evaluation period
      - *Period of time where you check to see if the conditions to
      trigger a raise in the block size are true*
      - Ideally you want an increase to occur in response to a real
      increase of transaction volume from the market, and not some
      short-term spam attack.
      - Too short, and spam attacks can be used to trigger multiple
      increases (at least early on). Too long, and the block size doesn't
      increase fast enough for transaction demand.
      - I selected a period of *4032 blocks*
   2. Capacity
      - *The capacity level that a majority of blocks would demonstrate
      in order to trigger a block size increase*
      - The capacity level, in tandem with the evaluation period and
      threshold level, needs to reflect an underlying trend towards
      filling blocks.
      - If the capacity level is too low, block size increases can be
      triggered prematurely. If the capacity level is too high, the
      network could be unnecessarily jammed with transactions before an
      increase can kick in.
      - I selected a capacity level of *60%*.
   3. Threshold
      - *The number of blocks during the evaluation period that are
      above the capacity level in order to trigger a block size increase.*
      - If blocks are getting larger than 60% full over a 4032-block
      period, how many reflect a market-driven increase in transaction
      volume?
      - If the threshold is too low, increases could be triggered
      artificially or prematurely. If the threshold is too high, it gets
      easier for 1-2 mining pools to prevent any increases in the block
      size, or the block size doesn't respond fast enough to a real
      increase in transaction volume.
      - I selected a threshold of *2000 blocks, or ~50%*.
- So in my proposal, if 2000+ blocks have a size >= 60% of the current
limit, this is an indication that real transaction volume has increased
and we're approaching a time where blocks could be filled to capacity
without an increase. The block size increase, 10%, is triggered.

A centralized decision, presumably by Satoshi, was made on the parameters
that alter the target difficulty, rather than an attempt to forecast hash
rates based on his CPU power. He allowed the system to scale to a level
where real market demand would take it. I believe the same approach should
be replicated for the block size. The trick of course is settling on the
right variables. I hope this proposal is a good way to do that.

*Some additional calculations*

Block sizes for each year are *theoretical maximums* if ALL trigger points
are activated in my proposal (unlikely, but anyway).
These calculations assume zero transactions are taken off-chain by third
party processors or the LN, and no efficiency improvements.

- 2015
- 1 MB/block
- 2 tps (conservative factor, also carried on below)
- 0.17 million tx/day
- 2016
- 3.45 MB/block
- 7 tps
- 0.6 million tx/day
- 2017
- 12 MB/block
- 24 tps
- 2 million tx/day
- 2018
- 41 MB/block
- 82 tps
- 7 million tx/day
- 2019
- 142 MB/block
- 284 tps
- 25 million tx/day
- 2020
- 490 MB/block
- 980 tps
- 85 million tx/day

By way of comparison, Alipay (the payment processor for the Alibaba Group's
ecosystem) processes 30 million escrow transactions per day. This gives us
at least 4-5 years to reach the present-day transaction processing capacity
of one corporation... in reality it will take a little longer, as I doubt all
block size triggers will be activated. This also gives us at least 4-5
years to develop efficiency improvements within the protocol, develop the
LN to take many of these transactions off-chain, and for network
infrastructure to be significantly improved (and anything else this
ecosystem can come up with).

(let me know if any of these calculations are off)
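The yearly figures above follow from compounding 13 increases of 10% per year from a 1 MB base; a quick illustrative check, using the same ~2 tps per MB factor:

```python
# Reproduce the theoretical yearly maximums: 13 compounded 10% increases
# per year from a 1 MB starting point, with ~2 tps per MB of block space.
max_mb = 1.0
for year in range(2016, 2021):
    max_mb *= 1.1 ** 13                  # all 13 triggers fire
    tps = 2 * max_mb
    tx_per_day = tps * 86_400            # seconds per day
    print(f"{year}: {max_mb:.2f} MB/block, {tps:.0f} tps, "
          f"{tx_per_day / 1e6:.2f}M tx/day")
```

This yields ~3.45 MB for 2016 and ~490 MB for 2020, matching the figures above.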
Washington Sanchez via bitcoin-dev
2015-09-09 13:10:43 UTC
Permalink
- So in my proposal, if 2000+ *blocks* have a size >= 60% *of the
current limit*, this is an indication that real transaction volume has
increased and we're approaching a time where blocks could be filled to
capacity without an increase. The block size increase, 10%, is triggered.
--
-------------------------------------------
*Dr Washington Y. Sanchez <http://onename.com/drwasho>*
Co-founder, OB1 <http://ob1.io>
Core developer of OpenBazaar <https://openbazaar.org>
@drwasho <https://twitter.com/drwasho>