Discussion:
Fees and the block-finding process
Gavin Andresen via bitcoin-dev
2015-08-07 14:57:23 UTC
1) If "not now", when will it be a good time to let the "market
minimum fee for miners to mine a transaction" rise above zero?
1. If you are willing to wait an infinite amount of time, I think the
minimum fee will always be zero or very close to zero, so I think it's a
silly question.
Which Jorge misinterpreted to mean that I think there will always be at
least one miner willing to mine a transaction for free.

That's not what I'm thinking. It is just an observation based on the fact
that blocks are found at random intervals.

Every once in a while the network will get lucky and we'll find six blocks
in ten minutes. If you are deciding what transaction fee to put on your
transaction, and you're willing to wait until that
six-blocks-in-ten-minutes once-a-week event, submit your transaction with a
low fee.

All the higher-fee transactions waiting to be confirmed will get confirmed
in the first five blocks and, if miners don't have any floor on the fee
they'll accept (they will, but let's pretend they won't), then your
very-low-fee transaction will get confirmed.

In the limit, that logic becomes "wait an infinite amount of time, pay zero
fee."
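Gavin's "once a week" figure can be sanity-checked by treating block discovery as a Poisson process with an average of one block per ten minutes. This is the standard approximation; the numbers below are purely illustrative:

```python
import math

# Block arrivals are roughly Poisson with an average rate of
# one block per ten-minute interval (lambda = 1).
lam = 1.0

# P(6 or more blocks in a single ten-minute window)
p_six_or_more = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                          for k in range(6))

# Number of non-overlapping ten-minute windows in a week
windows_per_week = 7 * 24 * 6

print(p_six_or_more)                     # ~0.00059
print(p_six_or_more * windows_per_week)  # ~0.6 expected per week
```

So a six-blocks-in-ten-minutes burst is indeed expected roughly once a week, consistent with the claim above.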

So... I have no idea what the 'market minimum fee' will be, because I have
no idea how long people will be willing to wait, how many times they'll be
willing to retransmit a low-fee transaction that gets evicted from
memory-limited memory pools, or how much memory miners will be willing to
dedicate to storing transactions that won't confirm for a long time because
they're waiting for a flurry of blocks to be found.
--
Gavin Andresen
Pieter Wuille via bitcoin-dev
2015-08-07 15:16:34 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
Every once in a while the network will get lucky and we'll find six blocks
in ten minutes. If you are deciding what transaction fee to put on your
transaction, and you're willing to wait until that
six-blocks-in-ten-minutes once-a-week event, submit your transaction with a
low fee.
All the higher-fee transactions waiting to be confirmed will get confirmed
in the first five blocks and, if miners don't have any floor on the fee
they'll accept (they will, but let's pretend they won't), then your
very-low-fee transaction will get confirmed.
In the limit, that logic becomes "wait an infinite amount of time, pay
zero fee."
That's only the case when the actual rate of transactions with a non-zero
fee is below what fits in blocks. If the total production rate is higher,
even without a configured floor by miners, a free transaction won't ever be
mined, as there will always be some backlog of non-free transactions. Not
saying that this is a likely outcome - it would inevitably mean that people
are creating transactions without any guarantee that they'll be mined,
which may not be what anyone is interested in. But perhaps there is some
"use" for ultra-low-priority unreliable transactions (... despite DoS
attacks).
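Pieter's backlog argument can be sketched as a toy queue. The block capacity and arrival rate below are illustrative assumptions, not network measurements:

```python
# Toy queue model: if fee-paying transactions arrive faster than blocks
# can clear them, a zero-fee transaction waits forever behind the backlog.

BLOCK_CAPACITY = 4000    # transactions per block (assumed)
PAYING_PER_BLOCK = 4500  # fee-paying arrivals per block interval (assumed)

backlog = 0              # fee-paying transactions still waiting
for block in range(1000):
    backlog += PAYING_PER_BLOCK
    backlog -= min(backlog, BLOCK_CAPACITY)

# The paying backlog grows by 500 per block; miners who sort by fee
# never reach the zero-fee transaction, no matter how long it waits.
print(backlog)  # 500000 after 1000 blocks
```

Even a lucky run of fast blocks only dents the backlog temporarily when sustained paying demand exceeds capacity.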
Post by Gavin Andresen via bitcoin-dev
So... I have no idea what the 'market minimum fee' will be, because I have
no idea how long people will be willing to wait, how many times they'll be
willing to retransmit a low-fee transaction that gets evicted from
memory-limited memory pools, or how much memory miners will be willing to
dedicate to storing transactions that won't confirm for a long time because
they're waiting for a flurry of blocks to be found.
Fair enough, I don't think anyone knows.

I guess my question (and perhaps that's what Jorge is after): do you feel
that blocks should be increased in response to (or for fear of) such a
scenario. And if so, if that is a reason for increase now, won't it be a
reason for an increase later as well? It is my impression that your answer
is yes, that this is why you want to increase the block size quickly and
significantly, but correct me if I'm wrong.
--
Pieter
Gavin Andresen via bitcoin-dev
2015-08-07 15:55:09 UTC
Post by Pieter Wuille via bitcoin-dev
I guess my question (and perhaps that's what Jorge is after): do you feel
that blocks should be increased in response to (or for fear of) such a
scenario.
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.

I take the opinion of smart engineers who actually do resource planning and
have seen what happens when networks run out of capacity very seriously.


Post by Pieter Wuille via bitcoin-dev
And if so, if that is a reason for increase now, won't it be a reason for
an increase later as well? It is my impression that your answer is yes,
that this is why you want to increase the block size quickly and
significantly, but correct me if I'm wrong.
Sure, it might be a reason for an increase later. Here's my message to
in-the-future Bitcoin engineers: you should consider raising the maximum
block size if needed and you think the benefits of doing so (like increased
adoption or lower transaction fees or increased reliability) outweigh the
costs (like higher operating costs for full-nodes or the disruption caused
by ANY consensus rule change).
--
Gavin Andresen
Pieter Wuille via bitcoin-dev
2015-08-07 16:28:47 UTC
Post by Gavin Andresen via bitcoin-dev
Post by Pieter Wuille via bitcoin-dev
I guess my question (and perhaps that's what Jorge is after): do you feel
that blocks should be increased in response to (or for fear of) such a
scenario.
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.
This is a fundamental disagreement then. I believe that the demand is
infinite if you don't set a fee minimum (and I don't think we should), and
it just takes time for the market to find a way to fill whatever is
available - the rest goes into off-chain systems anyway. You will run out
of capacity at any size, and acting out of fear of that reality does not
improve the system. Whatever size blocks are actually produced, I believe
the result will either be something people consider too small to be
competitive ("you mean Bitcoin can only do 24 transactions per second?"
sounds almost the same as "you mean Bitcoin can only do 3 transactions per
second?"), or something that is very centralized in practice, and likely
both.
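The back-of-envelope behind the "3 vs 24 transactions per second" comparison can be reproduced; the ~550-byte average transaction size is an assumption chosen to match the commonly cited 1 MB figure:

```python
# Rough throughput for a given block size limit, assuming a ~550-byte
# average transaction and one block per 600 seconds (both assumptions).

AVG_TX_BYTES = 550
BLOCK_INTERVAL_S = 600

def tx_per_second(block_size_bytes):
    return block_size_bytes / AVG_TX_BYTES / BLOCK_INTERVAL_S

print(round(tx_per_second(1_000_000), 1))  # ~3.0 with 1 MB blocks
print(round(tx_per_second(8_000_000), 1))  # ~24.2 with 8 MB blocks
```

Either way the figure is orders of magnitude below conventional payment networks, which is the point being made above.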
Post by Gavin Andresen via bitcoin-dev
Post by Pieter Wuille via bitcoin-dev
And if so, if that is a reason for increase now, won't it be a reason for
an increase later as well? It is my impression that your answer is yes,
that this is why you want to increase the block size quickly and
significantly, but correct me if I'm wrong.
Sure, it might be a reason for an increase later. Here's my message to
in-the-future Bitcoin engineers: you should consider raising the maximum
block size if needed and you think the benefits of doing so (like increased
adoption or lower transaction fees or increased reliability) outweigh the
costs (like higher operating costs for full-nodes or the disruption caused
by ANY consensus rule change).
In general that sounds reasonable, but it's a dangerous precedent to make
technical decisions based on fear of economic change...
--
Pieter
Ryan Butler via bitcoin-dev
2015-08-07 17:47:22 UTC
Interesting position there, Pieter... you fear more people actually using
bitcoin. The fewer on-chain transactions, the lower the velocity, and the
lower the value of the network. I would be careful what you ask for,
because you could end up with nothing left to root the security of these
off-chain transactions in, and then neither will exist.

Nobody ever said you wouldn't run out of capacity at any size. It's quite
the fallacy to conclude from that statement that the block size should
remain far below a capacity it can easily sustain, one that would bring
more users/velocity/value to the system. The outcomes of those two
scenarios are asymmetric. A higher block size can support more users and
volume.

Raising the blocksize isn't out of fear. It's the realization that we are
at a point where we can raise it and support more users and transactions
while keeping the downsides (centralization, etc.) to a minimum.
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Mark Friedenbach via bitcoin-dev
2015-08-07 18:25:39 UTC
Please don't put words into Pieter's mouth. I guarantee you that everyone
working on Bitcoin, in their heart of hearts, would prefer that everyone in
the world be able to use the Bitcoin ledger for whatever purpose, if there
were no cost.

But like any real-world engineering issue, this is a matter of tradeoffs.
At the extreme, it is simply impossible to scale Bitcoin to the
terabyte-sized blocks that would be necessary to service the entire
world's financial transactions. Not without entirely sacrificing the
protection of policy neutrality achieved through decentralization. And as
that is Bitcoin's only advantage over traditional consensus systems, you
would have to wonder what the point of such an endeavor would be.

So *somewhere* you have to draw the line, and transactions below that level
are simply pushed into higher-level or off-chain protocols.

The issue, as Pieter and Jorge have been pointing out, is that technical
discussion over where that line should be has been missing from this debate.
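To put rough numbers on the scaling point above: even Visa-scale traffic alone implies enormous blocks. The transaction rate and size below are illustrative assumptions:

```python
# Rough scale check, under stated assumptions: ~2,000 tx/s sustained
# (Visa-scale, assumed), ~550-byte average transaction (assumed),
# one block per 600 seconds.

TX_PER_SECOND = 2_000
AVG_TX_BYTES = 550
BLOCK_INTERVAL_S = 600

block_bytes = TX_PER_SECOND * AVG_TX_BYTES * BLOCK_INTERVAL_S
print(block_bytes / 1e9)  # ~0.66 GB per block for Visa-scale alone
```

Servicing "the entire world's financial transactions" would be orders of magnitude beyond even this, which is where the terabyte figure comes from.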

Ryan Butler via bitcoin-dev
2015-08-07 18:57:43 UTC
Who said anything about scaling bitcoin to Visa levels now? We're talking
about an increase now that scales into the future at a rate consistent
with technological progress.

Pieter himself said "So, I think the block size should follow technological
evolution...".

The blocksize increase proposals have been modeled around this very thing.
It's reasonable to increase the blocksize to a point where a reasonable
person, with reasonable equipment and internet access, can run a node or
even a miner with acceptable orphan rates. Most miners are SPV mining
anyway. The 8 MB or even 20 MB limits are within those parameters.

These are not mutually exclusive. We can design a blocksize increase
that addresses both demand exceeding the available space AND follows
technological evolution. Pieter's latest proposal is way too conservative
on that front.
Ryan Butler via bitcoin-dev
2015-08-07 19:07:28 UTC
Clarification...

These are not mutually exclusive. We can design a blocksize increase
that increases available space on chain AND follows technological
evolution. Pieter's latest proposal is way too conservative on that front.

And given Pieter's assertion that demand is infinite, there will still be
an ocean of off-chain transactions for the likes of Blockstream to address.
Mark Friedenbach via bitcoin-dev
2015-08-07 19:15:34 UTC
Surely you have some sort of empirical measurement demonstrating the
validity of that statement? That is to say, you've established some
technical criteria by which to determine how much centralization pressure
is too much, and shown that Pieter's proposal undercuts expected progress
in that area?
Post by Ryan Butler via bitcoin-dev
Clarification...
These are not mutually exclusive. We can design an increase to blocksize
that increases available space on chain AND follow technological
evolution. Peter's latest proposal is way too conservative on that front.
And given Peter's assertion that demand is infinite there will still be a
an ocean of off chain transactions for the likes of blockstream to address.
Post by Ryan Butler via bitcoin-dev
Who said anything about scaling bitcoin to visa levels now? We're
talking about an increase now that scales into the future at a rate that is
consistent with technological progress.
Peter himself said "So, I think the block size should follow
technological evolution...".
The blocksize increase proposals have been modeled around this very
thing. It's reasonable to increase the blocksize to a point that a
reasonable person, with reasonable equipment and internet access can run a
node or even a miner with acceptable orphan rates. Most miners are spv
mining anyways. The 8 or even 20 MB limits are within those parameters.
These are not mutually exclusive. We can design an increase to blocksize
that addresses both demand exceeding the available space AND follow
technological evolution. Peter's latest proposal is way too conservative
on that front.
Post by Mark Friedenbach via bitcoin-dev
Please don't put words into Pieter's mouth. I guarantee you everyone
working on Bitcoin in their heart of hearts would prefer everyone in the
world being able to use the Bitcoin ledger for whatever purpose, if there
were no cost.
But like any real world engineering issue, this is a matter of
tradeoffs. At the extreme it is simply impossible to scale Bitcoin to the
terrabyte sized blocks that would be necessary to service the entire
world's financial transactions. Not without sacrificing entirely the
protection of policy neutrality achieved through decentralization. And as
that is Bitcoin's only advantage over traditional consensus systems, you
would have to wonder what the point of such an endeavor would be.
So *somewhere* you have to draw the line, and transactions below that
level are simply pushed into higher level or off-chain protocols.
The issue, as Pieter and Jorge have been pointing out, is that technical
discussion over where that line should be has been missing from this debate.
On Fri, Aug 7, 2015 at 10:47 AM, Ryan Butler via bitcoin-dev <
Post by Ryan Butler via bitcoin-dev
Interesting position there Peter...you fear more people actually using
bitcoin. The less on chain transactions the lower the velocity and the
lower the value of the network. I would be careful what you ask for
because you end up having nothing left to even root the security of these
off chain transactions with and then neither will exist.
Nobody ever said you wouldn't run out of capacity at any size. It's
quite the fallacy to draw the conclusion from that statement that block
size should remain far below a capacity it can easily maintain which would
bring more users/velocity/value to the system. The outcomes of both of
those scenarios are asymmetric. A higher block size can support more users
and volume.
Raising the blocksize isn't out of fear. It's the realization that we
are at a point where we can raise it and support more users and
transactions while keeping the downsides to a minimum (centralization etc).
On Aug 7, 2015 11:28 AM, "Pieter Wuille via bitcoin-dev" <
On Fri, Aug 7, 2015 at 5:55 PM, Gavin Andresen <
On Fri, Aug 7, 2015 at 11:16 AM, Pieter Wuille <
Post by Pieter Wuille via bitcoin-dev
I guess my question (and perhaps that's what Jorge is after): do you
feel that blocks should be increased in response to (or for fear of) such a
scenario.
I think there are multiple reasons to raise the maximum block size,
and yes, fear of Bad Things Happening as we run up against the 1MB limit is
one of the reasons.
I take the opinion of smart engineers who actually do resource
planning and have seen what happens when networks run out of capacity very
seriously.
This is a fundamental disagreement then. I believe that the demand is
infinite if you don't set a fee minimum (and I don't think we should), and
it just takes time for the market to find a way to fill whatever is
available - the rest goes into off-chain systems anyway. You will run out
of capacity at any size, and acting out of fear of that reality does not
improve the system. Whatever size blocks are actually produced, I believe
the result will either be something people consider too small to be
competitive ("you mean Bitcoin can only do 24 transactions per second?"
sounds almost the same as "you mean Bitcoin can only do 3 transactions per
second?"), or something that is very centralized in practice, and likely
both.
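Pieter's 3 and 24 tx/s figures follow from simple arithmetic; a sketch, assuming an average transaction size of roughly 550 bytes (an assumption; real averages vary) and the 600-second target block interval:

```python
# Throughput implied by a block size cap: both of Pieter's figures fall
# out of the same linear formula, which is his point about the two
# limits sounding "almost the same".
AVG_TX_BYTES = 550       # assumed average transaction size
BLOCK_INTERVAL_S = 600   # target seconds between blocks

def tx_per_second(block_size_bytes):
    """Sustained transactions per second for a given block size cap."""
    return block_size_bytes / AVG_TX_BYTES / BLOCK_INTERVAL_S

print(round(tx_per_second(1_000_000), 2))  # 3.03  (1 MB)
print(round(tx_per_second(8_000_000), 2))  # 24.24 (8 MB)
```

Scaling is strictly linear in the cap, so any fixed limit produces the same shape of argument at a different magnitude.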
And if so, if that is a reason for increase now, won't it be a reason
Post by Pieter Wuille via bitcoin-dev
for an increase later as well? It is my impression that your answer is yes,
that this is why you want to increase the block size quickly and
significantly, but correct me if I'm wrong.
Sure, it might be a reason for an increase later. Here's my message
to in-the-future Bitcoin engineers: you should consider raising the
maximum block size if needed and you think the benefits of doing so (like
increased adoption or lower transaction fees or increased reliability)
outweigh the costs (like higher operating costs for full-nodes or the
disruption caused by ANY consensus rule change).
In general that sounds reasonable, but it's a dangerous precedent to
make technical decisions based on a fear of change of economics...
--
Pieter
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Ryan Butler via bitcoin-dev
2015-08-07 20:17:29 UTC
Permalink
Pieter's proposal undercuts matching blocksize growth to technological
progress, not limiting centralization pressure. They are somewhat related,
but I want to be clear on what I originally stated. I would also point out
that Pieter's proposal lacks such technical criteria as well.

That being said, I think designing any growth rate around theoretical
centralization pressure is not sensible, and Pieter's proposal rightly
excludes it, attempting instead a very gradual increase over time,
matching blocksize growth to technological bandwidth growth. The problem
is that it ignores the last 6 years (we are already "behind") and
underestimates bandwidth and storage growth (see Nielsen's law, which
observes roughly 50% annual growth in high-end bandwidth and has held for
30 years).
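To see the claimed gap, compare compounded growth rates (assuming Nielsen's ~50%/year figure and a 17.7%/year block size schedule for the proposal; treat the latter rate as an assumption here):

```python
# Compound-growth comparison: Nielsen's-law bandwidth growth (~50%/yr)
# vs an assumed 17.7%/yr block size schedule, over the six years the
# text says we are already "behind".
def compound(rate, years):
    """Growth multiple after compounding `rate` annually for `years`."""
    return (1 + rate) ** years

YEARS = 6
print(round(compound(0.50, YEARS), 1))   # 11.4x bandwidth
print(round(compound(0.177, YEARS), 1))  # 2.7x block size
```

On these assumed rates the schedule falls well behind bandwidth growth, which is the gap being argued here.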

Pieter seems to be of the belief that since Bitcoin will never be able to
handle all the world's transactions, we should instead "...decrease the need
for trust required in off-chain systems...". This is akin to basing all
the world's transactions on a small settlement layer; like balancing a
pyramid upside down, it will topple.

I'm of the belief that the "reasonable node" test is a simple enough test
to maintain decentralization.

A Raspberry Pi 2 node with a reasonable Internet connection and a
reasonable hard drive can easily run with 8 or 20 MB blocks.

As Peter's proposal indicates, "If over time, this growth factor is beyond
what the actual technology offers, the intention should be to soft fork a
tighter limit." I wholeheartedly agree, which is why we should plan to be
ahead of the curve...not behind it.
Post by Mark Friedenbach via bitcoin-dev
Surely you have some sort of empirical measurement demonstrating the
validity of that statement? That is to say you've established some
technical criteria by which to determine how much centralization pressure
is too much, and shown that Pieter's proposal undercuts expected progress
in that area?
Post by Ryan Butler via bitcoin-dev
Clarification...
These are not mutually exclusive. We can design an increase to blocksize
that increases available space on chain AND follows technological
evolution. Pieter's latest proposal is way too conservative on that front.
And given Pieter's assertion that demand is infinite, there will still be
an ocean of off-chain transactions for the likes of Blockstream to address.
Post by Ryan Butler via bitcoin-dev
Who said anything about scaling bitcoin to visa levels now? We're
talking about an increase now that scales into the future at a rate that is
consistent with technological progress.
Pieter himself said "So, I think the block size should follow
technological evolution...".
The blocksize increase proposals have been modeled around this very
thing. It's reasonable to increase the blocksize to a point where a
reasonable person, with reasonable equipment and Internet access, can run
a node or even a miner with acceptable orphan rates. Most miners are SPV
mining anyway. The 8 or even 20 MB limits are within those parameters.
These are not mutually exclusive. We can design an increase to
blocksize that addresses both demand exceeding the available space AND
follows technological evolution. Pieter's latest proposal is way too
conservative on that front.
Post by Mark Friedenbach via bitcoin-dev
Please don't put words into Pieter's mouth. I guarantee you everyone
working on Bitcoin in their heart of hearts would prefer everyone in the
world being able to use the Bitcoin ledger for whatever purpose, if there
were no cost.
But like any real world engineering issue, this is a matter of
tradeoffs. At the extreme it is simply impossible to scale Bitcoin to the
terabyte-sized blocks that would be necessary to service the entire
world's financial transactions. Not without sacrificing entirely the
protection of policy neutrality achieved through decentralization. And as
that is Bitcoin's only advantage over traditional consensus systems, you
would have to wonder what the point of such an endeavor would be.
So *somewhere* you have to draw the line, and transactions below that
level are simply pushed into higher level or off-chain protocols.
The issue, as Pieter and Jorge have been pointing out, is that
technical discussion over where that line should be has been missing from
this debate.
On Fri, Aug 7, 2015 at 10:47 AM, Ryan Butler via bitcoin-dev <
Post by Ryan Butler via bitcoin-dev
Interesting position there Pieter...you fear more people actually using
bitcoin. The fewer on-chain transactions, the lower the velocity and the
lower the value of the network. I would be careful what you ask for
because you end up having nothing left to even root the security of these
off chain transactions with and then neither will exist.
Dave Hudson via bitcoin-dev
2015-08-07 20:33:16 UTC
Permalink
A Raspberry Pi 2 node with a reasonable Internet connection and a reasonable hard drive can easily run with 8 or 20 MB blocks.
I'm curious, as I've not seen any data on this subject. How fast can an RP2 do the necessary cryptographic calculations to validate blocks of various sizes?

While everyone tends to talk in terms of 10 minutes per block, that is, of course, only an average, and doesn't account for situations in which 2 or more blocks are found in quick succession (which, of course, happens on a daily basis). At what point does, say, an RP2 node fail to validate a second or third block because it's still not finished processing the first?

If someone were to be playing games with the system and mining transactions without first broadcasting them to the network then how long would that take? This would in essence define the ability to DoS lower-performance nodes (ignoring all of the other usual considerations such as bandwidth, etc).
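Block arrival is well modeled as a Poisson process, which makes it easy to estimate how often a slow validator would face back-to-back blocks; a sketch:

```python
# Block discovery is approximately a Poisson process at 1 block / 600 s,
# so back-to-back blocks are routine. This estimates how often a
# validator gets less than a given time before the next block arrives.
import math

RATE = 1 / 600.0  # blocks per second

def prob_next_block_within(seconds):
    """P(next block arrives within `seconds`), exponential inter-arrivals."""
    return 1 - math.exp(-RATE * seconds)

def prob_k_blocks_in(k, seconds):
    """P(exactly k blocks in a window of `seconds`), Poisson."""
    mu = RATE * seconds
    return math.exp(-mu) * mu ** k / math.factorial(k)

print(round(prob_next_block_within(30), 4))   # 0.0488: ~5% of gaps under 30 s
print(round(prob_k_blocks_in(6, 600), 6))     # 0.000511: six blocks in 10 min
```

Multiplying the six-in-ten-minutes figure by the ~1008 ten-minute windows in a week gives roughly one such flurry every couple of weeks, the same order as Gavin's "once-a-week" event earlier in the thread.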
jl2012 via bitcoin-dev
2015-08-07 18:17:32 UTC
Permalink
On Fri, Aug 7, 2015 at 5:55 PM, Gavin Andresen
On Fri, Aug 7, 2015 at 11:16 AM, Pieter Wuille
Post by Pieter Wuille via bitcoin-dev
I guess my question (and perhaps that's what Jorge is after): do
you feel that blocks should be increased in response to (or for
fear of) such a scenario.
I think there are multiple reasons to raise the maximum block size,
and yes, fear of Bad Things Happening as we run up against the 1MB
limit is one of the reasons.
I take the opinion of smart engineers who actually do resource
planning and have seen what happens when networks run out of
capacity very seriously.
This is a fundamental disagreement then. I believe that the demand is
infinite if you don't set a fee minimum (and I don't think we should),
and it just takes time for the market to find a way to fill whatever
is available - the rest goes into off-chain systems anyway. You will
run out of capacity at any size, and acting out of fear of that
reality does not improve the system. Whatever size blocks are actually
produced, I believe the result will either be something people
consider too small to be competitive ("you mean Bitcoin can only do 24
transactions per second?" sounds almost the same as "you mean Bitcoin
can only do 3 transactions per second?"), or something that is very
centralized in practice, and likely both.
What if we reduce the block size to 0.125 MB? That will allow 0.375 tx/s.
If 3->24 sounds "almost the same", 3->0.375 also sounds almost the same.
We will have 50000 full nodes, instead of 5000, since it is so
affordable to run a full node.

If 0.125MB sounds too extreme, what about 0.5/0.7/0.9MB? Are we going to
have more full nodes?

No, I'm not trolling. I really want someone to tell me why we
should/shouldn't reduce the block size. Are we going to have more or
less full nodes if we reduce the block size?
Bryan Bishop via bitcoin-dev
2015-08-07 18:35:29 UTC
Permalink
On Fri, Aug 7, 2015 at 1:17 PM, jl2012 via bitcoin-dev <
Post by jl2012 via bitcoin-dev
No, I'm not trolling. I really want someone to tell me why we
should/shouldn't reduce the block size. Are we going to have more or less
full nodes if we reduce the block size?
Some arguments have floated around that, even in the absence of causing
an increase in the number of full nodes, a reduction of the max block
size might be beneficial for other reasons, such as bandwidth-saturation
benefits, and less time spent validating transactions because there are
fewer of them.

- Bryan
http://heybryan.org/
1 512 203 0507
Simon Liu via bitcoin-dev
2015-08-07 18:36:01 UTC
Permalink
That's a good question.

An argument has been put forward that a larger block size would reduce
the security of the network, so does the converse hold?
Post by jl2012 via bitcoin-dev
What if we reduce the block size to 0.125MB? That will allow 0.375tx/s.
If 3->24 sounds "almost the same", 3->0.375 also sounds almost the same.
We will have 50000 full nodes, instead of 5000, since it is so
affordable to run a full node.
If 0.125MB sounds too extreme, what about 0.5/0.7/0.9MB? Are we going to
have more full nodes?
No, I'm not trolling. I really want someone to tell me why we
should/shouldn't reduce the block size. Are we going to have more or
less full nodes if we reduce the block size?
Elliot Olds via bitcoin-dev
2015-08-11 23:20:21 UTC
Permalink
On Fri, Aug 7, 2015 at 9:28 AM, Pieter Wuille via bitcoin-dev <
Post by Pieter Wuille via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.
This is a fundamental disagreement then. I believe that the demand is
infinite if you don't set a fee minimum (and I don't think we should), and
it just takes time for the market to find a way to fill whatever is
available - the rest goes into off-chain systems anyway. You will run out
of capacity at any size, and acting out of fear of that reality does not
improve the system.
I think the case for increasing block size can be made without appealing to
fear of unknown effects of a fee market developing. I agree with you that
the most likely outcome is that fees will rise to a new equilibrium as
competition for block space increases, and some use cases will get priced
out of the market. If fees rise high enough, the effects of this can be
pretty bad though. I get the sense that you don't think high fees are that
bad / low fees are that good.

Can you let me know which of these statements related to low fees you
disagree with?

(0) Bitcoin's security will eventually have to be paid for almost entirely
via txn fees.

(1) A future in which lots of users are making on chain txns and each
paying 5 cents/tx is more sustainable than one in which a smaller number of
users are paying $3/tx, all else being equal (pretend the centralization
pressures are very low in both instances, and each scenario results in the
same amount of total tx fees).

(2) It's important that Bitcoin become widely used to protect the network
against regulators (note how political pressure from users who like Uber
have had a huge effect on preventing Uber from being banned in many
locations).

(3) There are potentially a lot of valuable use cases that can benefit from
Bitcoin's decentralization which can work at 5 cents / tx but are nonviable
at $3 / tx. Allowing fees to stay at $3 / tx and pricing out all the viable
use cases between $3 and 5 cents / tx would likely result in a significant
loss of utility for people who want these use cases to work.

(4) The Lightning Network will be a lot less appealing at $3 / tx than 5
cents / tx, because it'll require much larger anchor txn values to
sufficiently amortize the costs of the Bitcoin tx fees, and having to pay
$3 each time your counter-party misbehaves is somewhat painful.

(5) Assuming that Bitcoin is somewhat likely to end up in the "lots of
users, lower fees" situation described in (1), it's important that people
can experiment with low fee use cases now so that these use cases have time
to be discovered, be improved, and become popular before Bitcoin's security
relies exclusively on fees.
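Point (4) can be made concrete with back-of-envelope arithmetic (the channel parameters below are hypothetical):

```python
# Why on-chain fees shape off-chain channel economics: opening and
# closing a payment channel costs roughly two on-chain transactions,
# and that fixed cost must be amortized over the channel's payments.
def fee_overhead(onchain_fee, channel_value, payments):
    """Per-payment overhead of the two on-chain txns, as a fraction of
    the average payment size (channel_value / payments)."""
    total_fee = 2 * onchain_fee
    avg_payment = channel_value / payments
    return (total_fee / payments) / avg_payment

# A hypothetical $200 channel used for 100 payments:
print(fee_overhead(0.05, 200, 100))  # 0.0005 -> 0.05% overhead at 5-cent fees
print(fee_overhead(3.00, 200, 100))  # 0.03   -> 3% overhead at $3 fees
```

At $3 fees, keeping overhead low forces much larger anchor values (or many more payments per channel), which is the appeal gap being described.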


Finally, here's a type of question that devs on this list really don't like
answering but which I think is more informative than almost any other: If
you knew that hard forking to 4 MB soon would keep fees around 5 cents
(with a fee market) for the next two years, and that remaining at 1 MB
would result in fees of around $1 for the next two years, would you be in
favor of the 4 MB hard fork? (I know our knowledge of the decentralization
risks isn't very complete now, but assume you had to make a decision given
the state of your knowledge now).
Jorge Timón via bitcoin-dev
2015-08-07 17:33:34 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.

What are the other reasons?
Post by Gavin Andresen via bitcoin-dev
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.

When "the network runs out of capacity" (when we hit the limit) do we
expect anything to happen apart from minimum market fees rising (above
zero)?
Obviously any consequences of fees rising are included in this concern.
Thomas Zander via bitcoin-dev
2015-08-07 22:12:12 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
When "the network runs out of capacity" (when we hit the limit) do we
expect anything to happen apart from minimum market fees rising (above
zero)?
How many clients actually evict transactions from their mempool currently?
If the backlog grows without bound (more transactions coming in than going
out), that would be a problem.
How many wallets re-transmit their transaction when your local full node's
mempool no longer has it? Problem.
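The eviction behavior being asked about can be sketched with a toy fee-rate-limited mempool (a simplification; Bitcoin Core's actual policy differs):

```python
# The failure mode Thomas describes: a memory-limited mempool that
# evicts the lowest fee-rate transaction when full. A wallet that never
# rebroadcasts an evicted transaction sees it silently vanish.
import heapq

class LimitedMempool:
    def __init__(self, max_txs):
        self.max_txs = max_txs
        self.heap = []  # min-heap ordered by fee rate

    def add(self, txid, fee_rate):
        """Add a tx; returns the txid evicted to make room, or None."""
        heapq.heappush(self.heap, (fee_rate, txid))
        if len(self.heap) > self.max_txs:
            _, evicted = heapq.heappop(self.heap)
            return evicted
        return None

pool = LimitedMempool(max_txs=2)
pool.add("a", 10)
pool.add("b", 50)
print(pool.add("c", 1))  # "c": the lowest fee-rate tx is evicted at once
```

A low-fee transaction in a full pool is evicted as soon as higher-fee traffic arrives, so without wallet rebroadcast it simply disappears from the network's view.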

What will the backlash be when the people here pushing for off-chain
transactions fail to produce a properly working alternative, which
essentially means we have to say NO to more users? We can't service you,
sorry. Please go away.

At this time, and at this size of the Bitcoin community, my personal
experience (and I've been part of many communities) is that saying NO to
new customers will kill the product totally. Or, if we are lucky, just
make an altcoin that quickly becomes the de-facto standard.
--
Thomas Zander
Adam Back via bitcoin-dev
2015-08-07 23:06:28 UTC
Permalink
Please try to focus on constructive technical comments.

On 7 August 2015 at 23:12, Thomas Zander via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
What will the backlash be when people here that are pushing for "off-chain-
transactions" fail to produce a properly working alternative, which
essentially means we have to say NO to more users.
But > 99% of Bitcoin transactions are already off-chain. There are
multiple competing companies offering consumer & retail service with
off-chain settlement.

I wasn't clear, but in your previous mail you seemed to say you don't
mind trusting other people with your money, so presumably you are OK
using these services and have no problem?
Post by Thomas Zander via bitcoin-dev
At this time and this size of bitcoin community, my personal experience (and
I've been part of many communities) saying NO to new customers
Who said no to anything? The systems of off-chain transfer already
exist and are, by comparison to Bitcoin's protocol, simple and rapid to
adapt and scale.

Indications are that we can even do off-chain transactions at scale,
with Bitcoin-like trust minimisation, using lightning and duplex payment
channels; and people are working on that right now.

I think it would be interesting and useful for someone, with an
interest in low trust, high scale transactions, to work on and propose
an interoperability standard and API for such off-chain services to be
accessed by wallets, and perhaps periodic on-chain inter-service
netting.
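Such an interoperability standard might start as small as this (all names below are hypothetical, not an existing API):

```python
# Entirely hypothetical sketch of the wallet-facing API proposed above:
# a common interface off-chain services could expose, plus the periodic
# on-chain netting step between two services.
from abc import ABC, abstractmethod

class OffChainService(ABC):
    @abstractmethod
    def pay(self, dest: str, satoshis: int) -> str:
        """Send an off-chain payment; returns a service receipt id."""

    @abstractmethod
    def balance(self, account: str) -> int:
        """Current off-chain balance in satoshis."""

def net_settlement(owed_a_to_b: int, owed_b_to_a: int) -> int:
    """Signed net amount to settle on-chain between services A and B:
    positive means A pays B, negative means B pays A."""
    return owed_a_to_b - owed_b_to_a

print(net_settlement(500_000, 300_000))  # 200000: one tx nets many payments
```

The point of the netting step is that many inter-service payments collapse into a single periodic on-chain transaction.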

Adam
Dave Scotese via bitcoin-dev
2015-08-08 22:45:28 UTC
Permalink
I see value in lowering the block size or leaving it where it is. We expect
to run out of space, and I think it's a good idea to prepare for that,
rather than avoid it. When we run out of space and the block size is low,
we will see problems. If we raise the block size, we will NOT see these
problems until bitcoin is bigger and more important and the pressure is
higher.

Someone mentioned that when the backlog grows faster than it shrinks, that
is a real problem. I don't think it is. It is a problem for those who
don't wait for even one confirmation, but backlogs in the past have already
started training users to wait for at least one confirmation, or go
off-chain. I am comfortable leaving those zero-conf people in a little bit
of trouble. Everyone else can double-spend (perhaps that's not as easy as
it should be in Bitcoin Core) with a higher fee, thus competing for
block space. Yes, $5 transactions suck, but $0.15 is not so bad, and about
twice the average right now.

Meanwhile, the higher fees everyone starts feeling like paying, along with
the visibility of the problems caused by full-blocks, will provide
excellent justification and motivation for increasing the limit. My
favorite thing to do is to have a solution ready for a problem I expect to
see, see the problem (so I can measure things about it) and then implement
the solution.

In my experience, the single biggest reason not to run a full node has to
do with starting from scratch: "I used to run a full node, but last time I
had to download the full blockchain, it took ___ days, so I just use (some
wallet) now." I think that has been improved with headers-first, but many
people don't know it.

I have some ideas for how a "full node" could postpone being "full" but
still be nearly completely operational, so that the delay between startup
and having a full blockchain is nearly painless. It involves bonded
representation of important, not-so-large pieces of data (blocks that have
my transactions, the complete UTXO set as of some height, etc.). If I know
that I have some btc, I could offer it (say, 100 or 1000 transaction fees'
worth) to anyone who will guarantee good data to me, and then when I have
the whole blockchain, I will know if they were honest. If done right, the
whole network could know whether or not they were honest and enforce the
bond if they weren't. Credit the Lightning paper for parts of this idea.
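The bonded-data idea might be sketched as follows (a toy with hypothetical names; a real design would commit to the UTXO set far more carefully):

```python
# Toy of the "bonded data" idea: a peer hands a new node a claimed UTXO
# snapshot plus a bond; once the node finishes syncing the real chain it
# checks the snapshot hash and decides whether the bond is forfeit.
import hashlib

def snapshot_hash(utxos):
    """Deterministic hash of a UTXO snapshot (sorted for canonical form)."""
    data = "|".join(sorted(f"{txid}:{amount}" for txid, amount in utxos))
    return hashlib.sha256(data.encode()).hexdigest()

def settle_bond(claimed_hash, actual_utxos):
    """Return 'honest' if the claimed snapshot matches the synced chain,
    else 'forfeit' (the bond would be enforced)."""
    return "honest" if claimed_hash == snapshot_hash(actual_utxos) else "forfeit"

chain = [("tx1", 50), ("tx2", 25)]
claim = snapshot_hash(chain)
print(settle_bond(claim, chain))                  # honest
print(settle_bond(claim, chain + [("tx3", 10)]))  # forfeit
```

The key property is that dishonesty is detectable after the fact, so the bond can be enforced once the node has independently verified the chain.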

Dave

--
I like to provide some work at no charge to prove my value. Do you need a
techie?
I own Litmocracy <http://www.litmocracy.com> and Meme Racing
<http://www.memeracing.net> (in alpha).
I'm the webmaster for The Voluntaryist <http://www.voluntaryist.com> which
now accepts Bitcoin.
I also code for The Dollar Vigilante <http://dollarvigilante.com/>.
"He ought to find it more profitable to play by the rules" - Satoshi
Nakamoto
Alex Morcos via bitcoin-dev
2015-08-08 23:05:29 UTC
Permalink
I agree.
There are a lot of difficult technical problems introduced by insufficient
block space that are best addressed now, as well as problems that scale
will exacerbate, like bootstrapping, for which we should develop solutions
first.


Sent from my iPad
Hector Chu via bitcoin-dev
2015-08-09 05:52:37 UTC
Permalink
You people are the most selfish kind of people in the world. Blackmail
developers with overload of the system, to try to force them to urgently
come up with solutions to the problem. The solution is always going to
be... wait for it... "increase the block size". There is not enough time or
manpower to do anything else. We are witnessing a tragedy of the commons
before our very eyes.

On 9 August 2015 at 00:05, Alex Morcos via bitcoin-dev <
Post by Alex Morcos via bitcoin-dev
I agree
There are a lot of difficult technical problems introduced by insufficient
block space that are best addressed now. As well as problems that scale
will exacerbate like bootstrapping that we should develop solutions for
first.
Sent from my iPad
On Aug 8, 2015, at 6:45 PM, Dave Scotese via bitcoin-dev <
I see value in lowering the block size or leaving it where it is. We
expect to run out of space, and I think it's a good idea to prepare for
that, rather than avoid it. When we run out of space and the block size is
low, we will see problems. If we raise the block size, we will NOT see
these problems until bitcoin is bigger and more important and the pressure
is higher.
Someone mentioned that when the backlog grows faster than it shrinks, that
is a real problem. I don't think it is. It is a problem for those who
don't wait for even one confirmation, but backlogs in the past have already
started training users to wait for at least one confirmation, or go
off-chain. I am comfortable leaving those zero-conf people in a little bit
of trouble. Everyone else can double-spend (perhaps that's not as easy as
it should be in bitcoin core) and use a higher fee, thus competing for
block space. Yes, $5 transactions suck, but $0.15 is not so bad and about
twice the average right now.
Meanwhile, the higher fees everyone becomes willing to pay, along with
the visibility of the problems caused by full blocks, will provide
excellent justification and motivation for increasing the limit. My
favorite approach is to have a solution ready for a problem I expect to
see, see the problem (so I can measure things about it), and then implement
the solution.
In my experience, the single biggest reason not to run a full node has to
do with starting from scratch: "I used to run a full node, but last time I
had to download the full blockchain, it took ___ days, so I just use (some
wallet) now." I think that has been improved with headers-first, but many
people don't know it.
I have some ideas how a "full node" could postpone being "full" but still
be nearly completely operational so that the delay between startup and
having a full blockchain is nearly painless. It involves bonded
representation of important not-so-large pieces of data (blocks that have
my transactions, the complete UTXO as of some height, etc.). If I know
that I have some btc, I could offer it (say, 100 or 1000 transaction fees'
worth) to anyone who will guarantee good data to me, and then when I have
the whole blockchain, I will know if they were honest. If done right, the
whole network could know whether or not they were honest and enforce the
bond if they weren't. Credit the Lightning paper for parts of this idea.
Dave
On Fri, Aug 7, 2015 at 4:06 PM, Adam Back via bitcoin-dev <
Post by Adam Back via bitcoin-dev
Please try to focus on constructive technical comments.
On 7 August 2015 at 23:12, Thomas Zander via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
What will the backlash be when people here that are pushing for
"off-chain-
Post by Thomas Zander via bitcoin-dev
transactions" fail to produce a properly working alternative, which
essentially means we have to say NO to more users.
But > 99% of Bitcoin transactions are already off-chain. There are
multiple competing companies offering consumer & retail service with
off-chain settlement.
I wasn't clear, but in your previous mail you seemed to
say you don't mind trusting other people with your money, and so
presumably you are OK using these services, and so have no problem?
Post by Thomas Zander via bitcoin-dev
At this time and this size of bitcoin community, my personal experience
(and
Post by Thomas Zander via bitcoin-dev
I've been part of many communities) saying NO to new customers
Who said no to anything? The systems of off-chain transfer already
exist and are, compared to Bitcoin's protocol, simple and rapid to
adapt and scale.
Indications are that we can even do off-chain at scale with Bitcoin-like
trust minimisation using lightning and duplex payment
channels; people are working on that right now.
I think it would be interesting and useful for someone, with an
interest in low trust, high scale transactions, to work on and propose
an interoperability standard and API for such off-chain services to be
accessed by wallets, and perhaps periodic on-chain inter-service
netting.
Adam
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
--
I like to provide some work at no charge to prove my value. Do you need a techie?
I own Litmocracy <http://www.litmocracy.com> and Meme Racing
<http://www.memeracing.net> (in alpha).
I'm the webmaster for The Voluntaryist <http://www.voluntaryist.com>
which now accepts Bitcoin.
I also code for The Dollar Vigilante <http://dollarvigilante.com/>.
"He ought to find it more profitable to play by the rules" - Satoshi Nakamoto
Thomas Zander via bitcoin-dev
2015-08-09 10:32:01 UTC
Post by Alex Morcos via bitcoin-dev
I agree
There are a lot of difficult technical problems introduced by insufficient
block space that are best addressed now.
I agree that the problems caused by space restrictions should be solved, and the
sooner the better.
The side-effect of your statement, though, is that we will run into those problems
when the moment of insufficient block space comes *this* year instead of in 5 years.

I can practically guarantee that no proper solutions will be deployed in time
if natural growth of usage takes us to permanently full 1 MB blocks.
Having several more years to build such solutions would be very healthy.
Post by Alex Morcos via bitcoin-dev
As well as problems that scale
will exacerbate like bootstrapping that we should develop solutions for
first.
Notice that many people here have tried but have been unable to find a relation
between max-blocksize and full node-count.

Also, there are pretty good solutions already, like the bootstrap torrent and
headers-first sync. In the upcoming release the CPU load should also
improve, making the actual download much, much faster than in the 0.9 release.

Or, in other words, these problems have been solved in large part already,
and more is underway.
I don't expect them to be showstoppers when the network finally allows bigger
than 1 MB blocks. Natural growth has shown that blocks won't jump in size
significantly in one month anyway, so this scenario still has 6 months or so.
--
Thomas Zander
Thomas Zander via bitcoin-dev
2015-08-09 10:42:53 UTC
Post by Dave Scotese via bitcoin-dev
Someone mentioned that when the backlog grows faster than it shrinks, that
is a real problem. I don't think it is. It is a problem for those who
don't wait for even one confirmation
The mention you refer to was about the fact that the software doesn't cope
well with a continuously growing mempool.
If Bitcoind starts eating more and more memory, I expect lots of people that
run it now to turn it off.
Post by Dave Scotese via bitcoin-dev
but backlogs in the past have already
started training users to wait for at least one confirmation, or go
off-chain.
I am wondering how you concluded that? The only time we saw full blocks for a
considerable amount of time was when we had a spammer, and the only thing
we taught people was to use higher fees.
Actually, we didn't teach people anything, we told wallet developers to do it.
Most actual users were completely ignorant of the problem.

Full blocks will then mean that buying a beer or a coffee in person stops
being a supported use case for real humans. Waiting for a confirmation won't
work either for the vast majority of the current uses of Bitcoin in the real world.
Post by Dave Scotese via bitcoin-dev
I am comfortable leaving those zero-conf people in a little bit
of trouble. Everyone else can double-spend (perhaps that's not as easy as
it should be in bitcoin core) and use a higher fee, thus competing for
block space.
This is false: if you want to double-spend you have to do a lot of work and
have non-standard software. For instance, sending your newer transaction to a
random node will almost always get it rejected because it's a double spend.
Replace-by-fee (even the safe variant) is not supported in the vast majority of
Bitcoin land.
--
Thomas Zander
Dave Scotese via bitcoin-dev
2015-08-09 20:43:48 UTC
On Sun, Aug 9, 2015 at 3:42 AM, Thomas Zander via bitcoin-dev <
Post by Thomas Zander via bitcoin-dev
Post by Dave Scotese via bitcoin-dev
Someone mentioned that when the backlog grows faster than it shrinks,
that
Post by Dave Scotese via bitcoin-dev
is a real problem. I don't think it is. It is a problem for those who
don't wait for even one confirmation
The mention you refer to was about the fact that the software doesn't cope
well with a continuously growing mempool.
If Bitcoind starts eating more and more memory, I expect lots of people that
run it now to turn it off.
That is a real problem then. While emptying the mempool faster with bigger
blocks will help to reduce the occurrence of that problem, I propose a
user-configurable default limit to the size of the mempool as a permanent
solution regardless of block size. "This software has stopped consuming
memory necessary to validate transactions. You can override this by ..."
If anyone feels that protecting those running full nodes from bitcoind
eating more and more memory this way is a good idea, I can make a BIP out
of it if that would help.
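A cap of the kind proposed here is usually paired with an eviction policy, and the common choice is to drop the lowest fee-rate transactions first once the cap is hit. A minimal sketch with hypothetical names and numbers, not Bitcoin Core's actual API:

```python
import heapq

class BoundedMempool:
    """Toy mempool capped by total size in bytes; evicts lowest fee-rate
    transactions first. Names and numbers are illustrative only."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.by_rate = []  # min-heap of (fee_rate, txid, size)

    def add(self, txid, size, fee):
        """Accept a transaction, then evict until under the cap.
        Returns the list of txids dropped (possibly the new one itself)."""
        heapq.heappush(self.by_rate, (fee / size, txid, size))
        self.used += size
        evicted = []
        while self.used > self.max_bytes:
            _, worst_txid, worst_size = heapq.heappop(self.by_rate)
            self.used -= worst_size
            evicted.append(worst_txid)
        return evicted

pool = BoundedMempool(max_bytes=1000)
pool.add("a", 500, fee=1000)          # 2 sat/byte
pool.add("b", 500, fee=5000)          # 10 sat/byte
print(pool.add("c", 500, fee=2500))   # 5 sat/byte; evicts "a" -> ['a']
```

A design like this bounds memory regardless of block size, which is the property being asked for here.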
Post by Thomas Zander via bitcoin-dev
Post by Dave Scotese via bitcoin-dev
but backlogs in the past have already
started training users to wait for at least one confirmation, or go
off-chain.
I am wondering how you concluded that? The only time we saw full blocks for a
considerable amount of time was when we had a spammer, and the only thing
we taught people was to use higher fees.
I concluded that because I don't think I'm all that different than others,
and that is what I have done. The "training" of which I speak is not
always recognized by the bitcoiner on whom it operates. A similar
"training" is how we all learn to ignore teachers because governments force
our attendance at school.
Post by Thomas Zander via bitcoin-dev
Post by Dave Scotese via bitcoin-dev
Everyone else can double-spend (perhaps that's not as easy as
it should be in bitcoin core) and use a higher fee, thus competing for
block space.
This is false: if you want to double-spend you have to do a lot of work and
have non-standard software. For instance, sending your newer transaction to a
random node will almost always get it rejected because it's a double spend.
Replace-by-fee (even the safe variant) is not supported in the vast majority of
Bitcoin land.
I don't know what you meant to say is false. I agree with the other stuff
you wrote. Thanks for confirming that it is difficult.

I did some research on replace by fee (FSS-RBF) and on
Child-pays-for-parent (CPFP). You point out that these solutions to paying
too-low fees are "not supported in the vast majority...". Do you mean
philosophically or programmatically? The trend seems to me toward
improvements, just as I insinuated may be necessary ("perhaps that's not as
easy as it should be in bitcoin core"), so, once again, I have to reiterate
that transaction backlog has valuable solutions other than increasing the
block size.
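For reference on the two mechanisms researched above: replace-by-fee rebroadcasts a conflicting transaction with a higher fee, while child-pays-for-parent attaches a high-fee child transaction so that a miner evaluating the pair sees their combined fee rate. The package arithmetic is a one-liner (a sketch with illustrative numbers, not wallet code):

```python
def package_fee_rate(parent_fee, parent_size, child_fee, child_size):
    """Fee rate (sat/byte) a miner sees when the child cannot be mined
    without its parent, so the two are evaluated as one package."""
    return (parent_fee + child_fee) / (parent_size + child_size)

# A stuck 250-byte parent paying 250 sat (1 sat/byte) is rescued by a
# 250-byte child paying 4750 sat: the package now pays 10 sat/byte.
print(package_fee_rate(250, 250, 4750, 250))  # 10.0
```

This is why CPFP lets the *recipient* unstick a payment: only someone with an output of the parent can build the child.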

I also realized that we have already been through a period of full blocks,
so that tremendously reduces the value I see in doing it again. It was
that "spam" test someone ran that did it for us, and I love that. It seems
to have kicked the fee-increasability efforts in the butt, which is great.

I now place a higher priority on enabling senders to increase their fee
when necessary than on increasing the transactions per second that the network can
handle. The competition between these two is rather unfair because of how
easy it is to apply the "N MB blocks band-aid".

Dave
Jorge Timón via bitcoin-dev
2015-08-11 17:03:27 UTC
On Aug 9, 2015 10:44 PM, "Dave Scotese via bitcoin-dev" <
Post by Dave Scotese via bitcoin-dev
On Sun, Aug 9, 2015 at 3:42 AM, Thomas Zander via bitcoin-dev <
Post by Thomas Zander via bitcoin-dev
Post by Dave Scotese via bitcoin-dev
Someone mentioned that when the backlog grows faster than it shrinks, that
is a real problem. I don't think it is. It is a problem for those who
don't wait for even one confirmation
The mention you refer to was about the fact that the software doesn't cope
well with a continuously growing mempool.
If Bitcoind starts eating more and more memory, I expect lots of people that
run it now to turn it off.
That is a real problem then. While emptying the mempool faster with
bigger blocks will help to reduce the occurrence of that problem, I propose
a user-configurable default limit to the size of the mempool as a permanent
solution regardless of block size. "This software has stopped consuming
memory necessary to validate transactions. You can override this by ..."
If anyone feels that protecting those running full nodes from bitcoind
eating more and more memory this way is a good idea, I can make a BIP out
of it if that would help.

You are completely right: this problem has nothing to do with the consensus
block size maximum and it has to be solved regardless of what the maximum
is. No BIP is necessary for this. The "doing nothing side" has been working
on this too:
https://github.com/bitcoin/bitcoin/pull/6470
Jorge Timón via bitcoin-dev
2015-08-10 11:55:03 UTC
Gavin, I interpret the absence of response to these questions as a
sign that everybody agrees that there's no reason to increase
the consensus block size other than to keep minimum market fees from
rising (above zero).
Feel free to correct that notion at any time by answering the
questions yourself.
In fact if any other "big block size advocate" thinks there's more
reason I would like to hear their reasons too.
Post by Jorge Timón via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.
What are the other reasons?
Post by Gavin Andresen via bitcoin-dev
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.
When "the network runs out of capacity" (when we hit the limit) do we expect
anything to happen apart from minimum market fees rising (above zero)?
Obviously any consequences of fees rising are included in this concern.
Btc Drak via bitcoin-dev
2015-08-10 12:33:07 UTC
On Mon, Aug 10, 2015 at 12:55 PM, Jorge Timón <
Post by Jorge Timón via bitcoin-dev
Gavin, I interpret the absence of response to these questions as a
sign that everybody agrees that there's no other reason to increase
the consensus block size other than to avoid minimum market fees from
rising (above zero).
Feel free to correct that notion at any time by answering the
questions yourself.
In fact if any other "big block size advocate" thinks there's more
reason I would like to hear their reasons too.
Additionally, correct me if I am wrong, but the net effect of preventing
fees from rising above zero would be to guarantee miners no alternative
income from fees as the block subsidy dries up, and thus harm the incentives to
secure the chain.
Jorge Timón via bitcoin-dev
2015-08-10 13:03:06 UTC
Post by Btc Drak via bitcoin-dev
Additionally, correct me if I am wrong, but the net effect from preventing
fees rising from zero would be to guarantee miners have no alternative
income from fees as block subsidy dries up and thus harm the incentives to
secure the chain.
I don't think that's necessarily true. Theoretically urgent
transactions could fund hashing power on their own while there are
still some free non-urgent transactions being mined from time to time.
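For context on the subsidy point above: the block subsidy halves every 210,000 blocks, so whatever secures the chain in the long run must eventually come from fees. The schedule itself is fixed by consensus and easy to compute (this mirrors the right-shift in Bitcoin's subsidy code):

```python
def block_subsidy(height, initial=50 * 100_000_000, halving_interval=210_000):
    """Block subsidy in satoshis at a given height."""
    halvings = height // halving_interval
    if halvings >= 64:
        return 0  # shifting a 64-bit value by >= 64 bits; subsidy is simply zero
    return initial >> halvings  # one halving per 210,000 blocks

print(block_subsidy(0))        # 5000000000 (50 BTC)
print(block_subsidy(210_000))  # 2500000000 (25 BTC)
print(block_subsidy(420_000))  # 1250000000 (12.5 BTC)
```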
Thomas Zander via bitcoin-dev
2015-08-10 22:13:54 UTC
Post by Jorge Timón via bitcoin-dev
Gavin, I interpret the absence of response to these questions as a
sign that everybody agrees that there's no other reason to increase
the consensus block size other than to avoid minimum market fees from
rising (above zero).
Feel free to correct that notion at any time by answering the
questions yourself.
In fact if any other "big block size advocate" thinks there's more
reason I would like to hear their reasons too.
See my various emails in the last hour.
--
Thomas Zander
Jorge Timón via bitcoin-dev
2015-08-11 17:47:56 UTC
On Aug 11, 2015 12:14 AM, "Thomas Zander via bitcoin-dev" <
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Gavin, I interpret the absence of response to these questions as a
sign that everybody agrees that there's no other reason to increase
the consensus block size other than to avoid minimum market fees from
rising (above zero).
Feel free to correct that notion at any time by answering the
questions yourself.
In fact if any other "big block size advocate" thinks there's more
reason I would like to hear their reasons too.
See my various emails in the last hour.
I've read them. I have read Gavin's blog posts as well, several times.
I still don't see what else we can fear from not increasing the size, apart
from fees maybe rising and some problems that need to be solved
regardless of the size (like a dumb unbounded mempool design) becoming more visible.

This discussion is frustrating for everyone. I could also say "This has
been explained many times" and similar things, but that's not productive.
I'm not trying to be obstinate; please answer what else there is to fear, or
admit that all your fears are just potential consequences of rising fees.

At the risk of sounding condescending or aggressive... really, it's not that
hard to answer questions directly and succinctly. We should all be friends
with clarity. Only fear, uncertainty and doubt are enemies of clarity. But
you guys on the "bigger blocks side" don't want to spread FUD, do you?
Please, prove paranoid people like me wrong on this point, for the good of
this discussion. I really don't know how else to ask this without getting a
link to something I have already read as a response.
Michael Naber via bitcoin-dev
2015-08-11 18:46:43 UTC
Hi Jorge: Many people would like to participate in a global consensus
network -- which is a network where all the participating nodes are aware
of and agree upon every transaction. Constraining Bitcoin capacity below
the limits of technology will only push users seeking to participate in a
global consensus network to other solutions which have adequate capacity,
such as BitcoinXT or others. Note that lightning / hub and spoke do not
meet requirements for users wishing to participate in global consensus,
because they are not global consensus networks, since all participating
nodes are not aware of all transactions.




Mark Friedenbach via bitcoin-dev
2015-08-11 18:48:57 UTC
Michael, why does it matter that every node in the world process and
validate your morning coffee transaction? Why does it matter to anyone
except you and the coffee vendor?

Michael Naber via bitcoin-dev
2015-08-11 18:55:56 UTC
It generally doesn't matter that every node validate your coffee
transaction, and those transactions can and will probably be moved onto
offchain solutions in order to avoid paying the cost of achieving global
consensus. But you still don't get to set the cost of global consensus
artificially. Market forces will ensure that supply will meet demand there,
so if there is demand for access to global consensus, and technology exists
to meet that demand at a cost of one cent per transaction -- or whatever
the technology-limited cost of global consensus happens to be -- then
that's what the market will supply.

It would be like if Amazon suddenly said that they were going to be
charging $5 / gb / month to store data in s3. Can't do it. Technology
exists to bring about cloud storage at $0.01 / GB / month, so they don't
just get to set the price different from the capabilities of technology or
they'll get replaced by a competitor. Same applies to Bitcoin.
Jorge Timón via bitcoin-dev
2015-08-11 19:45:35 UTC
Post by Michael Naber via bitcoin-dev
It generally doesn't matter that every node validate your coffee
transaction, and those transactions can and will probably be moved onto
offchain solutions in order to avoid paying the cost of achieving global
consensus. But you still don't get to set the cost of global consensus
artificially. Market forces will ensure that supply will meet demand there,
so if there is demand for access to global consensus, and technology exists
to meet that demand at a cost of one cent per transaction -- or whatever
the technology-limited cost of global consensus happens to be -- then
that's what the market will supply.

Assuming we maintain any block size maximum consensus rule, the market will
adapt to whatever maximum size is imposed by the consensus rules.
For example, with the current demand and the current consensus block size
maximum, the market has settled on a minimum fee of zero satoshis per
transaction. That's why I cannot understand the urgency to raise the maximum
size.

In any case, the consensus maximum shouldn't be based on current or
projected demand, only on centralization concerns, which is what the
consensus rule is for (to limit centralization).
For example, Gavin advocates for 20 MB because he is not worried that it
could increase centralization; he believes it won't.
I can't agree with that, because I believe 20 MB could make mining
centralization (and centralization in general) much worse.

But if I have to choose between 2 "centralization safe" sizes, sure, the
bigger the better, why not.
In my opinion the main source of disagreement is that one: how the maximum
block size limits centralization.
Michael Naber via bitcoin-dev
2015-08-11 21:31:49 UTC
Re: "In my opinion the main source of disagreement is that one: how the
maximum block size limits centralization."

I generally agree with that, but I would add that limiting centralization is only a
goal insofar as it serves things like reliability, transaction integrity,
capacity, and accessibility. More broadly: how do you think that moving the
block size from 1MB to 8MB would materially impact these things?

Re: "That's why I cannot understand the urgency to rise the maximum size."

This issue is urgent because the difference between bitcoin being a success
and it being forgotten hinges on it being "better money" than other money.
If people want a money that can process lots and lots of transactions at
low cost, they're going to get it so long as technology can give it to
them. While it's not critical that we raise the block size this very moment,
since we're not hitting the capacity wall right now, based on the way
growth spikes in Bitcoin have occurred in the past, we may hit that
capacity wall soon and suddenly. And the moment we do, then Bitcoin may no
longer be "better money" since there's a big opportunity for other money
with higher throughput and lower fees to take its place.
Bryan Bishop via bitcoin-dev
2015-08-11 18:51:00 UTC
On Tue, Aug 11, 2015 at 1:46 PM, Michael Naber via bitcoin-dev
Note that lightning / hub and spoke do not meet requirements for users
wishing to participate in global consensus, because they are not global
consensus networks, since all participating nodes are not aware of all
transactions.
You don't need consensus on the lightning network because you are
using bitcoin consensus anyway. Commitment transactions are deep
enough in the blockchain history that removing them from
the history is impractical. The remaining guarantees are ensured by
the properties of the scripts in the transaction. You don't need to
see all the transactions, but you do need to look at the transactions
you are given and draw conclusions based on the details to see whether
their commitments are valid or the setup wasn't broken.
- Bryan
http://heybryan.org/
1 512 203 0507
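The "deep enough in the blockchain history" argument above can be made quantitative with the attacker catch-up probability from section 11 of the Bitcoin whitepaper. A minimal sketch; the hashrate share `q` and depth `z` used below are illustrative assumptions, not figures from this thread:

```python
import math

def attacker_success(q: float, z: int) -> float:
    """Probability that an attacker controlling a fraction q of the
    hashrate ever rewrites a transaction buried z blocks deep
    (formula from section 11 of the Bitcoin whitepaper)."""
    p = 1.0 - q           # honest hashrate share
    lam = z * (q / p)     # expected attacker progress while z blocks are found
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

# A commitment transaction 6 blocks deep against a 10% attacker:
print(attacker_success(0.10, 6))  # about 0.00024 -- already impractical
```

The probability falls off exponentially with depth, which is why a commitment transaction buried even a modest number of blocks is, for practical purposes, final.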
Michael Naber via bitcoin-dev
2015-08-11 18:59:26 UTC
Permalink
Lightning *depends* on global consensus in order to function. You can't use
it without a global consensus network at all. So given that there is
absolutely a place for a global consensus network, we need to decide
whether the cost to participate in that global consensus will be limited
above or below the capability of technology. In a world where anybody can
step up and fork the code, it's going to be hard for anyone to artificially
set the price of participating in global consensus at a rate higher than
what technology can deliver...
Post by Bryan Bishop via bitcoin-dev
On Tue, Aug 11, 2015 at 1:46 PM, Michael Naber via bitcoin-dev
Note that lightning / hub and spoke do not meet requirements for users
wishing to participate in global consensus, because they are not global
consensus networks, since all participating nodes are not aware of all
transactions.
You don't need consensus on the lightning network because you are
using bitcoin consensus anyway. Commitment transactions are deep
enough in the blockchain history that removing that transaction from
the history is impractical. The remaining guarantees are ensured by
the properties of the scripts in the transaction. You don't need to
see all the transactions, but you do need to look at the transactions
you are given and draw conclusions based on the details to see whether
their commitments are valid or the setup wasn't broken.
- Bryan
http://heybryan.org/
1 512 203 0507
Jorge Timón via bitcoin-dev
2015-08-11 19:27:46 UTC
Permalink
Post by Michael Naber via bitcoin-dev
Hi Jorge: Many people would like to participate in a global consensus
network -- which is a network where all the participating nodes are aware
of and agree upon every transaction. Constraining Bitcoin capacity below
the limits of technology will only push users seeking to participate in a
global consensus network to other solutions which have adequate capacity,
such as BitcoinXT or others. Note that lightning / hub and spoke do not
meet requirements for users wishing to participate in global consensus,
because they are not global consensus networks, since all participating
nodes are not aware of all transactions.
Even if you are right, first fees will rise and that will be what pushes
people to altcoins, no?
Can we agree that the first step in any potentially bad situation is
hitting the limit and then fees rising as a consequence?
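Jorge's sequencing (the limit is hit first, and fees rise as a consequence) can be illustrated with a toy greedy-miner model. Everything below (the capacity figure, the made-up fee distribution) is an illustrative assumption, not data from the network:

```python
def min_confirmed_fee(num_txs: int, block_capacity: int) -> int:
    """Toy model: a miner fills one block with the highest-fee
    transactions and returns the lowest fee that still got in.
    The fee distribution (0..100 repeating) is a made-up assumption."""
    fees = [i % 101 for i in range(num_txs)]            # pretend fee rates
    confirmed = sorted(fees, reverse=True)[:block_capacity]
    return confirmed[-1]

# Below the limit every tx fits, so the market minimum fee is zero;
# once demand exceeds capacity, a nonzero fee floor appears.
print(min_confirmed_fee(500, 1000))    # 0  (demand below capacity)
print(min_confirmed_fee(5000, 1000))   # 80 (demand above capacity)
```

The point of the sketch is only the ordering: while demand fits under the capacity limit, the clearing fee stays at zero; the fee floor only becomes positive after the limit binds.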
Michael Naber via bitcoin-dev
2015-08-11 19:37:01 UTC
Permalink
Jorge, As long as Bitcoin remains the best global consensus network -- and
part of being best means being reasonably priced -- then no I don't think
people will be pushed into altcoins. Better money ultimately displaces
worse money, so I don't see a driving force for people to move to other
altcoins as long as Bitcoin remains competitive.
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
Post by Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hi Jorge: Many people would like to participate in a global consensus
network -- which is a network where all the participating nodes are aware
of and agree upon every transaction. Constraining Bitcoin capacity below
the limits of technology will only push users seeking to participate in a
global consensus network to other solutions which have adequate capacity,
such as BitcoinXT or others. Note that lightning / hub and spoke do not
meet requirements for users wishing to participate in global consensus,
because they are not global consensus networks, since all participating
nodes are not aware of all transactions.
Even if you are right, first fees will raise and that will be what pushes
people to other altcoins, no?
Can we agree that the first step in any potentially bad situation is
hitting the limit and then fees rising as a consequence?
Pieter Wuille via bitcoin-dev
2015-08-11 19:51:59 UTC
Permalink
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev <
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with
higher centralization pressure of various forms. There is discussion about
whether these centralization pressures are significant, but citing that
it's artificially constrained under the limit is IMHO a misrepresentation.
It is constrained to aim for a certain balance between utility and risk,
and neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply switch
to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus
changes only when they are uncontroversial. Even when you believe a more
invasive change is worth it, others may disagree, and the risk from
disagreement is likely larger than the effect of a small block size
increase by itself: the risk that suddenly every transaction can be spent
twice (once on each side of the fork), the very thing that the block chain
was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability
should be an issue: if fees are being paid, it means someone is willing to
pay them. If people are doing transactions despite being unreliable, there
must be a use for them. That may mean that some use cases don't fit
anymore, but that is already the case.
--
Pieter
Michael Naber via bitcoin-dev
2015-08-11 21:18:49 UTC
Permalink
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money". One
of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't
going to grow, and in fact Bitcoin itself will be replaced by better money
that is less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin. Because people want to transact with global consensus at
high volume, and because technology exists to service that want, that demand
is going to be met. This is basic supply and demand. I don't
necessarily disagree with your position on only wanting to support
uncontroversial commits, but I think it's important to get consensus on the
criticality of the block size issue: do you agree, disagree, or not take a
side, and why?
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev <
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with
higher centralization pressure of various forms. There is discussion about
whether these centralization pressures are significant, but citing that
it's artificially constrained under the limit is IMHO a misrepresentation.
It is constrained to aim for a certain balance between utility and risk,
and neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus
changes only when they are uncontroversial. Even when you believe a more
invasive change is worth it, others may disagree, and the risk from
disagreement is likely larger than the effect of a small block size
increase by itself: the risk that suddenly every transaction can be spent
twice (once on each side of the fork), the very thing that the block chain
was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability
should be an issue: if fees are being paid, it means someone is willing to
pay them. If people are doing transactions despite being unreliable, there
must be a use for them. That may mean that some use cases don't fit
anymore, but that is already the case.
--
Pieter
Adam Back via bitcoin-dev
2015-08-11 21:23:18 UTC
Permalink
I don't think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
On 11 August 2015 at 22:18, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money". One
of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't going
to grow, and in fact Bitcoin itself will be replaced by better money that is
less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin -- because people want to transact with global consensus at high
volume, and because technology exists to service that want, then it's going
to be met. This is basic rules of demand and supply. I don't necessarily
disagree with your position on only wanting to support uncontroversial
commits, but I think it's important to get consensus on the criticality of
the block size issue: do you agree, disagree, or not take a side, and why?
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with higher
centralization pressure of various forms. There is discussion about whether
these centralization pressures are significant, but citing that it's
artificially constrained under the limit is IMHO a misrepresentation. It is
constrained to aim for a certain balance between utility and risk, and
neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus changes
only when they are uncontroversial. Even when you believe a more invasive
change is worth it, others may disagree, and the risk from disagreement is
likely larger than the effect of a small block size increase by itself: the
risk that suddenly every transaction can be spent twice (once on each side
of the fork), the very thing that the block chain was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability should
be an issue: if fees are being paid, it means someone is willing to pay
them. If people are doing transactions despite being unreliable, there must
be a use for them. That may mean that some use cases don't fit anymore, but
that is already the case.
--
Pieter
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Angel Leon via bitcoin-dev
2015-08-11 21:30:42 UTC
Permalink
tell that to people in poor countries, or even in first world countries.
The competitive thing here is a deal breaker for a lot of people who have
no clue/don't care for decentralization; they just want to send money from
A to B, like email.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 5:23 PM, Adam Back via bitcoin-dev <
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
On 11 August 2015 at 22:18, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money". One
of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't going
to grow, and in fact Bitcoin itself will be replaced by better money that is
less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin -- because people want to transact with global consensus at high
volume, and because technology exists to service that want, then it's going
to be met. This is basic rules of demand and supply. I don't necessarily
disagree with your position on only wanting to support uncontroversial
commits, but I think it's important to get consensus on the criticality of
the block size issue: do you agree, disagree, or not take a side, and why?
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with higher
centralization pressure of various forms. There is discussion about whether
these centralization pressures are significant, but citing that it's
artificially constrained under the limit is IMHO a misrepresentation. It is
constrained to aim for a certain balance between utility and risk, and
neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus changes
only when they are uncontroversial. Even when you believe a more invasive
change is worth it, others may disagree, and the risk from disagreement is
likely larger than the effect of a small block size increase by itself: the
risk that suddenly every transaction can be spent twice (once on each side
of the fork), the very thing that the block chain was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability should
be an issue: if fees are being paid, it means someone is willing to pay
them. If people are doing transactions despite being unreliable, there must
be a use for them. That may mean that some use cases don't fit anymore, but
that is already the case.
--
Pieter
Pieter Wuille via bitcoin-dev
2015-08-11 21:32:25 UTC
Permalink
On Tue, Aug 11, 2015 at 11:30 PM, Angel Leon via bitcoin-dev <
Post by Angel Leon via bitcoin-dev
tell that to people in poor countries, or even in first world countries.
The competitive thing here is a deal breaker for a lot of people who have
no clue/don't care for decentralization,
Then they also don't need their transactions to be on the blockchain, right?
--
Pieter
Adam Back via bitcoin-dev
2015-08-11 21:34:46 UTC
Permalink
So if they don't care about decentralisation, they'll be happy using
cheaper off-chain systems, right?
Adam
tell that to people in poor countries, or even in first world countries. The
competitive thing here is a deal breaker for a lot of people who have no
clue/don't care for decentralization, they just want to send money from A to
B, like email.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 5:23 PM, Adam Back via bitcoin-dev
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
On 11 August 2015 at 22:18, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money". One
of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't going
to grow, and in fact Bitcoin itself will be replaced by better money that is
less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin -- because people want to transact with global consensus at high
volume, and because technology exists to service that want, then it's going
to be met. This is basic rules of demand and supply. I don't necessarily
disagree with your position on only wanting to support uncontroversial
commits, but I think it's important to get consensus on the criticality of
the block size issue: do you agree, disagree, or not take a side, and why?
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with higher
centralization pressure of various forms. There is discussion about whether
these centralization pressures are significant, but citing that it's
artificially constrained under the limit is IMHO a misrepresentation. It is
constrained to aim for a certain balance between utility and risk, and
neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus changes
only when they are uncontroversial. Even when you believe a more invasive
change is worth it, others may disagree, and the risk from disagreement is
likely larger than the effect of a small block size increase by itself: the
risk that suddenly every transaction can be spent twice (once on each side
of the fork), the very thing that the block chain was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability should
be an issue: if fees are being paid, it means someone is willing to pay
them. If people are doing transactions despite being unreliable, there must
be a use for them. That may mean that some use cases don't fit anymore, but
that is already the case.
--
Pieter
Michael Naber via bitcoin-dev
2015-08-11 21:39:33 UTC
Permalink
Sure, most people would probably be happy with cheaper off-chain systems.
There already are, and will probably continue to be, more transactions
happening off-chain partly for this very reason. That's not the issue we're
trying to address, though: the main chain is the linchpin of the whole
system. We've got to do a good job meeting the demand people have to
utilize the main chain, or else we risk being replaced by
some other main-chain solution that does it better.
Post by Adam Back via bitcoin-dev
So if they dont care about decentralisation, they'll be happy using
cheaper off-chain systems, right?
Adam
Post by Angel Leon via bitcoin-dev
tell that to people in poor countries, or even in first world countries. The
competitive thing here is a deal breaker for a lot of people who have no
clue/don't care for decentralization, they just want to send money from A to
B, like email.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 5:23 PM, Adam Back via bitcoin-dev
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
On 11 August 2015 at 22:18, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money". One
of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't going
to grow, and in fact Bitcoin itself will be replaced by better money that is
less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or die
for Bitcoin -- because people want to transact with global consensus at high
volume, and because technology exists to service that want, then it's going
to be met. This is basic rules of demand and supply. I don't necessarily
disagree with your position on only wanting to support uncontroversial
commits, but I think it's important to get consensus on the criticality of
the block size issue: do you agree, disagree, or not take a side, and why?
On Tue, Aug 11, 2015 at 2:51 PM, Pieter Wuille <
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with higher
centralization pressure of various forms. There is discussion about whether
these centralization pressures are significant, but citing that it's
artificially constrained under the limit is IMHO a misrepresentation. It is
constrained to aim for a certain balance between utility and risk, and
neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end up
ignoring you. These rules are there for a reason. You and I may agree about
whether the 21M limit is necessary, and disagree about whether we need a
block size limit, but we should be extremely careful with change. My
position as Bitcoin Core developer is that we should merge consensus changes
only when they are uncontroversial. Even when you believe a more invasive
change is worth it, others may disagree, and the risk from disagreement is
likely larger than the effect of a small block size increase by itself: the
risk that suddenly every transaction can be spent twice (once on each side
of the fork), the very thing that the block chain was designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability should
be an issue: if fees are being paid, it means someone is willing to pay
them. If people are doing transactions despite being unreliable, there must
be a use for them. That may mean that some use cases don't fit anymore, but
that is already the case.
--
Pieter
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Venzen Khaosan via bitcoin-dev
2015-08-12 06:10:34 UTC
Permalink
Your concern for adoption is valid, yet there are a few assumptions in
your discussion, and they are a common thread in the current wave of
"bigger blocksize" topics.

1) Supplying bigger blocks will meet the demand of more people:

Anyone can transact via Bitcoin. If we increase the block size and make
more transactions possible at low fees, what's to stop a large
corporation, bank or government from using the protocol as a cheap
settlement mechanism? They don't have to fund or develop their own network
(well, Ecuador has, for this exact use-case), and the utility and capacity
of the Bitcoin network may mean reliability and low fees (cheaper than
bank clearance, say) for their use-case. In the process they hog xMB of
space in each block, and discussion about a capacity limit continues in
this list. Increased supply *will* be utilized - by all kinds of entities -
not only the girl next door and the unbanked proletariat.

2) Dissatisfied users will move to alt-coins so Bitcoin better be
careful...

The assumption here is that the best skills and most able minds are
fairly evenly distributed amongst alt-coin dev teams. I doubt this is
true and the notion underestimates the quality of developer that is
attracted to Bitcoin Core to apply themselves to this project, often
self-funded. There are few (if any) comparable cryptocurrencies or cc
dev teams out there. Hence the Bitcoin market cap, the large
stakeholder industry, and the established brand.

3) Bitcoin is better money.

Yes, indeed. It's genius and revolution. Yet, it does not fit every
use-case. I know people don't like it when I make this example, but
it's the truth where I live, and by extension, in many places in the
world:

I live in rural Southeast Asia. Some houses have electricity and some
don't: by choice, because rural lifestyle in the tropics does not
always require you to have electricity. People charge their mobile
phones at the community eating house every other day. The electricity
supply is unreliable. I've had to rig a solar charging system to a
UPS, but most people around here have no choice but to deal with
intermittent power cuts. The local market has a diesel generator, so
constant electricity, but if a power cut lasts for long enough the
local cellular mast battery backup depletes and then there is no
cellular connectivity - the only means of accessing the internet.

Now, how does one expect this community to use or adopt
cryptocurrency? They are mostly unbanked, get paid fiat wages at the
end of the week and spend fiat on commodities, rent, food and
entertainment like the rest of the world. But Bitcoin is not a "better
money" in their case, and who knows for how long this condition will
remain true.

4) TBD

The notion that there be dragons at the capacity limit is unfounded
and reactionary. We have to make the journey and find out what is, in
fact, there at the edge - as many others have argued in the list. This
is our opportunity to make scientific observation and discovery for
the benefit of Bitcoin - while it is still in its early years and the
capacity limit untested.

Who knows? The outcome may be an informed decision to implement bigger
blocks. Informed. Based not on fear and uncertainty but on empirical
observation and facts.
Post by Michael Naber via bitcoin-dev
Sure, most people probably would be happy with cheaper off-chain
systems. There already are and will probably continue to be more
transactions happening off-chain partly for this very reason.
That's not the issue we're trying to address though: The main chain
is the lynch-pin to the whole system. We've got to do a good job
meeting demand that people have for wanting to utilize the
main-chain, or else we'll risk being replaced by some other
main-chain solution that does it better.
So if they dont care about decentralisation, they'll be happy
using cheaper off-chain systems, right?
Adam
Post by Angel Leon via bitcoin-dev
tell that to people in poor countries, or even in first world countries.
The competitive thing here is a deal breaker for a lot of people who have
no clue/don't care for decentralization, they just want to send money from
A to B, like email.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 5:23 PM, Adam Back via bitcoin-dev
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
Angel Leon via bitcoin-dev
2015-08-11 22:06:52 UTC
Permalink
So if they dont care about decentralisation, they'll be happy using cheaper
off-chain systems, right?

You betcha! Just talk to regular people and try to sell them on the
different scenarios.

They will start using something cheaper/faster the minute it comes along
from the banking industry. To give you a real-world example: this week
I've been dreading the idea of having to go to the bank to make a couple
of cash deposits. If only I could open my bank's web page right now and do
a very simple interbank transaction (without having to convince them to
let me link external accounts to mine, via a process that takes about two
days while they verify two small test deposits...), right here within the
backward US banking system - which has clearly realized the threat from
cryptocurrencies, as evidenced at many banker conferences this year.

They will come up with ways to allow us to do person-to-person transfers,
but those will surely be limited to transactions within the country.
International remittances still have a great chance of being disrupted by
Bitcoin - but only if it remains cheap; otherwise the Western Unions and
Xooms of the world will still rule.

Please get out of your academic cocoon for a bit, talk to real people,
try to convince them to use Bitcoin, and think how hard it will be to make
the sale if on top of that you tell them... "it costs more... but it's
decentralized!" LOL

http://twitter.com/gubatron
So if they dont care about decentralisation, they'll be happy using
cheaper off-chain systems, right?
Adam
Post by Angel Leon via bitcoin-dev
tell that to people in poor countries, or even in first world countries.
The competitive thing here is a deal breaker for a lot of people who have
no clue/don't care for decentralization, they just want to send money from
A to B, like email.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 5:23 PM, Adam Back via bitcoin-dev
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
Michael Naber via bitcoin-dev
2015-08-11 21:35:52 UTC
Permalink
Bitcoin would be better money than current money even if it were a bit more
expensive to transact, simply because of its other great characteristics
(trustlessness, limited supply, etc). However... it is not better than
something else sharing all those same characteristics but which is also
less expensive. The best money will win, and if Bitcoin doesn't increase
capacity then it won't remain the best.
Post by Adam Back via bitcoin-dev
I dont think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
Adam
On 11 August 2015 at 22:18, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
The only reason why Bitcoin has grown the way it has, and in fact the only
reason why we're all even here on this mailing list talking about this, is
because Bitcoin is growing, since it's "better money than other money".
One of the key characteristics toward that is Bitcoin being inexpensive to
transact. If that characteristic is no longer true, then Bitcoin isn't
going to grow, and in fact Bitcoin itself will be replaced by better money
that is less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or
die for Bitcoin -- because people want to transact with global consensus
at high volume, and because technology exists to service that want, then
it's going to be met. This is basic rules of demand and supply. I don't
necessarily disagree with your position on only wanting to support
uncontroversial commits, but I think it's important to get consensus on
the criticality of the block size issue: do you agree, disagree, or not
take a side, and why?
Post by Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
The question is not what the technology can deliver. The question is what
price we're willing to pay for that. It is not a boolean "at this size,
things break, and below it, they work". A small constant factor increase
will unlikely break anything in the short term, but it will come with
higher centralization pressure of various forms. There is discussion about
whether these centralization pressures are significant, but citing that
it's artificially constrained under the limit is IMHO a misrepresentation.
It is constrained to aim for a certain balance between utility and risk,
and neither extreme is interesting, while possibly still "working".
Consensus rules are what keeps the system together. You can't simply
switch to new rules on your own, because the rest of the system will end
up ignoring you. These rules are there for a reason. You and I may agree
about whether the 21M limit is necessary, and disagree about whether we
need a block size limit, but we should be extremely careful with change.
My position as Bitcoin Core developer is that we should merge consensus
changes only when they are uncontroversial. Even when you believe a more
invasive change is worth it, others may disagree, and the risk from
disagreement is likely larger than the effect of a small block size
increase: the risk that suddenly every transaction can be spent twice
(once on each side of the fork), the very thing that the block chain was
designed to prevent.
My personal opinion is that we should aim to do a block size increase for
the right reasons. I don't think fear of rising fees or unreliability
should be an issue: if fees are being paid, it means someone is willing to
pay them. If people are doing transactions despite being unreliable, there
must be a use for them. That may mean that some use cases don't fit
anymore, but that is already the case.
--
Pieter
Pieter Wuille via bitcoin-dev
2015-08-11 21:51:55 UTC
Permalink
Post by Michael Naber via bitcoin-dev
Bitcoin would be better money than current money even if it were a bit
more expensive to transact, simply because of its other great
characteristics (trustlessness, limited supply, etc). However... it is not
better than something else sharing all those same characteristics but which
is also less expensive. The best money will win, and if Bitcoin doesn't
increase capacity then it won't remain the best.
If it is less expensive, it is harder for it to be reliable (because it is
easier for a sudden new use case to outbid the available space), which
makes it less useful as a payment mechanism.

If it has better scale (with the same technology), it will have higher
centralization pressure. The higher price you potentially pay (in fees) to
get your transactions on a smaller block chain is the price of higher
security and independence. Perhaps the compromise is not at the optimal
place, but please stop saying "below what the technology can do". The
technology can "do" gigabyte blocks, I'm sure, if you accept that you need
a small cluster to keep up with validation, and that all blocks are
produced by a single miner cartel.

IMHO, Bitcoin (or any cryptocurrency) on-chain as a payment system is:
* Expensive: there is a (known in advance and agreed upon) inflation that
we're using to pay miners. But by holding Bitcoin you're paying for the
security of the system, even if it is not in fees.
* Unreliable: you never know when suddenly there will be more higher-fee
transactions that outbid you.
* Slow, unless you already trust the sender to not double spend (in which
case you don't actually need the security of the blockchain).

I don't know the future, and I don't know what use cases will develop and
what they'll want to pay or what reliability they need. But let's please
not throw out the one quality that Bitcoin is still good at: lack of
centralized parties to trust.
--
Pieter
Elliot Olds via bitcoin-dev
2015-08-12 03:35:43 UTC
Permalink
On Tue, Aug 11, 2015 at 2:51 PM, Pieter Wuille via bitcoin-dev <
Post by Pieter Wuille via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Bitcoin would be better money than current money even if it were a bit
more expensive to transact, simply because of its other great
characteristics (trustlessness, limited supply, etc). However... it is not
better than something else sharing all those same characteristics but which
is also less expensive. The best money will win, and if Bitcoin doesn't
increase capacity then it won't remain the best.
If it is less expensive, it is harder to be reliable (because it's easier
for a sudden new use case to outbid the available space), which is less
useful for a payment mechanism.
It depends on which use case's reliability you focus on. For any specific
use case of Bitcoin, that use case will be more reliable with a larger
block size (ignoring centralization effects).

The effect that I think you're talking about is that with lower fees, some
use cases will exist that otherwise wouldn't have been possible with
higher fees / smaller blocks, and these "low fee only" use cases will not
be as reliable as the use cases you'd see with high fees. But that puts
you in the position of arguing that it's better for low-fee use cases
never to exist at all than to exist at some high risk of eventually being
priced out. Do we know with high confidence how high tx fees will be in
the future? Should it be up to us to discourage low-fee use cases from
being tried, because we think the risk that they'll later be priced out is
too great? Shouldn't we let the people developing those use cases make
that call? Maybe they don't mind the unreliability. Maybe it's worth it to
them if their use case only lasts for a few months.

The important point to note is that the reliability of a use case is
determined by the fees that people are willing to pay for that use case,
not the fees that are actually paid. If big banks are willing to pay $1 /
tx for some use case right now, but they only need 200 of these txns per
block, then they might be paying only 5 cents / tx because no one is
forcing them to pay more. The fact that they're only paying 5 cents / tx
now doesn't make them any more vulnerable to new use cases than if they
were paying $1 / tx now. If a new use case started bidding up tx fees, the
banks would just increase their tx fees as high as they needed to (up to
$1).

The reason that larger block sizes increase reliability for any given use
case is that (a) You will never be priced out of blocks by a use case that
is only willing to pay lower fees than you. This is true regardless of the
block size. At worst they'll just force you to pay more in fees and lose
some of your consumer surplus. (b) If a use case is willing to pay higher
fees than you, then they're basically stepping ahead of you in line for
block space and pushing you closer to the edge of not being included in
blocks. The more space that exists between your use case and the marginal
use cases that are just barely getting included in blocks, the less
vulnerable you are to getting pushed out of blocks by new use cases.

If this is tricky to understand, here's an example that will make it clear:

Assume blocks can hold 2000 txns per MB. Before the new use case is
discovered, demand looks like this:

500 txns will pay $1 fees
1000 txns will pay 50 cent fees
2000 txns will pay 5 cent fees
8000 txns will pay 2 cent fees
15,000 txns will pay 1 cent fees.
100,000 txns will pay 0.01 cent fees.

So at a block size of 1MB, fees are 5 cents and user surplus is $925 per
block ($0.95 * 500 + 0.45 * 1000).
At a block size of 8 MB, fees are 1 cent and user surplus is $1,145 per
block ($0.99 * 500 + 0.49 * 1000 + $0.04 * 2000 + $0.01 * 8000).

Now a new use case comes into play and this is added to demand:

3000 txns will pay $5 / tx

That demand changes the scenarios like such:

At 1 MB fees jump to $5, user surplus is $0, and the $925 of value the
previous users were getting is lost. All existing use cases are priced out,
because there wasn't enough room in the blocks to accommodate them plus
this new use case.

At 8 MB, fees would stay at 1 cent, user surplus would be $16,115, and $0
in value would be lost (3000 users who were paying 1 cent for txns that
they valued only at 1 cent would stop making txns). All use cases
corresponding to the txns that were willing to pay at least 2 cents are
still viable, because there was enough space in blocks to accommodate them
plus the 3000 new high fee txns.

Let's say you're running the service that represents the 2000 txns willing
to pay 5 cents each on the demand curve specified above. Let's say you're
worried about being priced out of blocks. Which situation do you want to be
in, the one with 1 MB blocks or 8 MB blocks? It's pretty clear that your
best chance to remain viable is with larger blocks.
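The arithmetic in this example can be checked with a short sketch (a hypothetical highest-fee-first clearing model; fees are integers in cents, and the tier numbers are the ones from the example above, with the 0.01-cent tier omitted since it never fits at these capacities):

```python
def clearing_fee_and_surplus(tiers, capacity):
    """Fill block space highest-fee-first; the marginal included tier
    sets the clearing fee. Fees are integers in cents. Returns
    (clearing_fee, consumer_surplus) for one block."""
    included, filled = [], 0
    for willing, count in sorted(tiers, reverse=True):
        take = min(count, capacity - filled)
        if take <= 0:
            break
        included.append((willing, take))
        filled += take
    fee = included[-1][0]  # willingness of the marginal included tier
    surplus = sum((w - fee) * n for w, n in included)
    return fee, surplus

# Demand tiers from the example: (fee willing to pay in cents, txn count)
BASE = [(100, 500), (50, 1000), (5, 2000), (2, 8000), (1, 15000)]
NEW_USE_CASE = [(500, 3000)]  # 3000 txns willing to pay $5

# 1 MB holds 2000 txns, 8 MB holds 16000 txns
print(clearing_fee_and_surplus(BASE, 2000))                  # 5c fee, $925 surplus
print(clearing_fee_and_surplus(BASE, 16000))                 # 1c fee, $1145 surplus
print(clearing_fee_and_surplus(BASE + NEW_USE_CASE, 2000))   # $5 fee, $0 surplus
print(clearing_fee_and_surplus(BASE + NEW_USE_CASE, 16000))  # 1c fee, $16115 surplus
```

The outputs reproduce the $925, $1,145, $0, and $16,115 figures from the email.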
Venzen Khaosan via bitcoin-dev
2015-08-12 04:47:43 UTC
Permalink
Post by Elliot Olds via bitcoin-dev
On Tue, Aug 11, 2015 at 2:51 PM, Pieter Wuille via bitcoin-dev
On Tue, Aug 11, 2015 at 11:35 PM, Michael Naber
Bitcoin would be better money than current money even if it were a
bit more expensive to transact, simply because of its other great
characteristics (trustlessness, limited supply, etc). However...
it is not better than something else sharing all those same
characteristics but which is also less expensive. The best money
will win, and if Bitcoin doesn't increase capacity then it won't
remain the best.
If it is less expensive, it is harder to be reliable (because it's
easier for a sudden new use case to outbid the available space),
which is less useful for a payment mechanism.
It depends on which use case's reliability you focus on. For
any specific use case of Bitcoin, that use case will be more
reliable with a larger block size (ignoring centralization
effects).
I read through your message and see the point you're trying to make,
but would like to point out that it is not useful to talk about
hypothetical scenarios involving Bitcoin that include the supposition
"ignoring centralization effects".

Decentralization concerns are fundamental to this innovation, else it
loses its meaning and value. And that's the trade-off that Pieter,
Jorge, Martin, Adam and others have been referring to during the past 24
hours: in order to have a secure Bitcoin that is not vulnerable to
centralization, certain sacrifices have to be made and the Consensus
Rule of a relatively small blocksize is the main protection we
currently have.

There are a lot of "larger blocks, more transactions" arguments being
made that overlook this core axiom of decentralization. That is why
the developers and thinkers with the deepest understanding of this
protocol are pointing out the need for another layer on top of
Bitcoin. That is where the scaling can take place to cater for the
use-cases of more txns, quicker txns, remittance, etc. and with it
increased adoption.
Elliot Olds via bitcoin-dev
2015-08-14 21:47:02 UTC
Permalink
Post by Venzen Khaosan via bitcoin-dev
Post by Elliot Olds via bitcoin-dev
It depends on which use case's reliability you focus on. For
any specific use case of Bitcoin, that use case will be more
reliable with a larger block size (ignoring centralization
effects).
I read through your message and see the point you're trying to make,
but would like to point out that it is not useful to talk about
hypothetical scenarios involving Bitcoin that include the supposition
"ignoring centralization effects".
Pieter was arguing for the existence of an effect on reliability that was
orthogonal to centralization risk. When arguing that this effect doesn't
really exist, it's appropriate to hold centralization risk constant.
Tom Harding via bitcoin-dev
2015-08-12 00:56:05 UTC
Permalink
Post by Adam Back via bitcoin-dev
I don't think Bitcoin being cheaper is the main characteristic of
Bitcoin. I think the interesting thing is trustlessness - being able
to transact without relying on third parties.
That rules out Lightning Network.

Lightning relies on third parties all over the place. Many things must
be done right, and on time, by N intermediate third parties (subject to
business pressures and regulation) or your payment will not work.

Lightning hubs can't steal your money. Yay! But banks stealing your
payment money is not a problem with today's payment systems. Some real
problems with those systems are:

- Limited ACCESS to payment systems
- High FEES
- Transaction AMOUNT restrictions
- FRAUD due to weak technology
- CURRENCY conversions

Plain old bitcoin solves all of these problems.

Bitcoin does have challenges. THROUGHPUT and TIME-TO-RELIABILITY are
critical ones. DECENTRALIZATION and PRIVACY must not be degraded.
These challenges can be met and exceeded.
Eric Voskuil via bitcoin-dev
2015-08-12 01:18:10 UTC
Permalink
Hi Michael,
One of the key characteristics toward that is Bitcoin being
inexpensive to transact.
What you seem to be missing is *why* bitcoin is better money. Have you
considered why is it comparatively inexpensive to transact in a medium
that is based on such a highly inefficient technology?

You might want to consider that these two considerations are not
independent. The reduced cost of transacting (and carrying) Bitcoin is a
direct consequence of its trustless nature. Any compromise in that
nature will eliminate that advantage, and therefore Bitcoin.

Bitcoin is designed to solve only one problem that other systems do not.
To accomplish this it makes significant compromises in other areas. The
benefit of this solution is that it cannot be effectively controlled by
the state. As a result, all of the associated overhead is eliminated.
Hence the net cost benefit despite high technical costs.

So this is a case where you should be careful what you wish for.

e
The only reason why Bitcoin has grown the way it has, and in fact the
only reason why we're all even here on this mailing list talking about
this, is because Bitcoin is growing, since it's "better money than other
money". One of the key characteristics toward that is Bitcoin being
inexpensive to transact. If that characteristic is no longer true, then
Bitcoin isn't going to grow, and in fact Bitcoin itself will be replaced
by better money that is less expensive to transfer.
So the importance of this issue cannot be overstated -- it's compete or
die for Bitcoin -- because people want to transact with global consensus
at high volume, and because technology exists to service that want,
it's going to be met. These are basic rules of supply and demand. I don't
necessarily disagree with your position on only wanting to support
uncontroversial commits, but I think it's important to get consensus on
the criticality of the block size issue: do you agree, disagree, or not
take a side, and why?
Thomas Zander via bitcoin-dev
2015-08-12 08:10:45 UTC
Permalink
Post by Pieter Wuille via bitcoin-dev
If people are doing transactions despite being unreliable, there
must be a use for them.
That's one sense of "unreliable."
Yes, if people start getting their transactions thrown out because of full
blocks or full memory pools, then it's unreliable to send stuff.

More important is that the software itself is unreliable at such loads. Bitcoin
Core will continue to grow in memory consumption, and eventually crash. Or,
worse, crash the system it's running on.
We know of some issues in the software with regards to running at > 100%
capacity, I'm sure we'll find more when it actually happens.

IT experts are serious when they say that they avoid maxing out a system like
the plague.

This, btw, is a good scenario where more centralization ends up happening when
blocks are always full and people need to upgrade their client every week to
keep up with the bugfixes.
Jorge Timón via bitcoin-dev
2015-08-12 09:00:29 UTC
Permalink
On Aug 12, 2015 10:11 AM, "Thomas Zander via bitcoin-dev" <
Post by Thomas Zander via bitcoin-dev
Post by Pieter Wuille via bitcoin-dev
If people are doing transactions despite being unreliable, there
must be a use for them.
That's one sense of "unreliable."
Yes, if people start getting their transactions thrown out because of full
blocks or full memory pools, then it's unreliable to send stuff.
More important is that the software itself is unreliable at such loads. Bitcoin
Core will continue to grow in memory consumption, and eventually crash. Or,
worse, crash the system it's running on.
We know of some issues in the software with regards to running at > 100%
capacity, I'm sure we'll find more when it actually happens.
Don't fear this happening at 1 MB, fear this happening at any size. This
needs to be solved regardless of the block size.
Don't worry, the "doing nothing side" is already taking care of this. I
will give the link for the second time...

https://github.com/bitcoin/bitcoin/pull/6470
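For reference, the idea behind that pull request (as I understand it) can be sketched as a size-bounded pool that evicts the lowest-feerate transactions first. This is a toy illustration of the eviction policy, not Bitcoin Core's actual implementation:

```python
import heapq

class LimitedMempool:
    """Toy mempool with a byte cap: when over the cap, the lowest-feerate
    transactions are evicted first. Illustrative only; ignores tx
    dependencies, replacement, and the dynamic min-relay-fee part."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.heap = []  # min-heap of (feerate, txid, size): cheapest on top
        self.txs = {}   # txid -> size

    def add(self, txid, size, fee):
        heapq.heappush(self.heap, (fee / size, txid, size))
        self.txs[txid] = size
        self.used += size
        evicted = []
        while self.used > self.max_bytes:
            _, victim, vsize = heapq.heappop(self.heap)
            if victim not in self.txs:
                continue  # stale heap entry for an already-evicted tx
            del self.txs[victim]
            self.used -= vsize
            evicted.append(victim)
        return evicted

pool = LimitedMempool(max_bytes=1000)
pool.add("a", size=400, fee=400)             # 1.0 per byte
pool.add("b", size=400, fee=200)             # 0.5 per byte -> cheapest
evicted = pool.add("c", size=400, fee=800)   # over cap: cheapest tx "b" goes
print(evicted)
```

Note that a sufficiently cheap incoming transaction can evict itself, which is exactly the "set the min relay fee to the evicted feerate" behaviour the PR builds on.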
Thomas Zander via bitcoin-dev
2015-08-12 09:25:46 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
Don't fear this happening at 1 MB, fear this happening at any size. This
needs to be solved regardless of the block size.
I know, everyone knows.

There is a lot of work that needs to be done to be able to use bitcoind with a
forever-growing backlog. And since I've been doing software for some decades,
I can tell you this won't be done or fixed in 6-12 months.
We probably haven't found the majority of the issues yet.

We need more time and a bigger blocksize gives us more time.
--
Thomas Zander
Jorge Timón via bitcoin-dev
2015-08-11 19:53:56 UTC
Permalink
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.

You didn't answer the 2 questions...
Anyway, if we don't care about centralization at all, we can just remove
the limit: that's what "technology can provide".
Maybe in that case it is developers who move to a decentralized
competitor...
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hi Jorge: Many people would like to participate in a global consensus
network -- which is a network where all the participating nodes are aware
of and agree upon every transaction. Constraining Bitcoin capacity below
the limits of technology will only push users seeking to participate in a
global consensus network to other solutions which have adequate capacity,
such as BitcoinXT or others. Note that lightning / hub and spoke do not
meet requirements for users wishing to participate in global consensus,
because they are not global consensus networks, since all participating
nodes are not aware of all transactions.
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Even if you are right, first fees will rise and that will be what
pushes people to other altcoins, no?
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Can we agree that the first step in any potentially bad situation is
hitting the limit and then fees rising as a consequence?
Michael Naber via bitcoin-dev
2015-08-11 20:56:45 UTC
Permalink
I'm not sure whether removing the limit at the protocol-level would lead to
government by miners who might reject blocks which were too big, but I
probably wouldn't want to take that risk. I think we should probably keep a
block size limit in the protocol, but that we should increase it to be as
high as "technology can provide." Toward that: I don't necessarily think
that node-count in and of itself should be the metric for evaluating what
technology can provide, as much as the goal that the chain be inexpensive
to validate given the capabilities of present technology -- so if I can
lease a server in a datacenter which can validate the chain and my total
cost to do that is just a few dollars, then we're probably ok.

Of course there's also the issue that we maintain enough geographic /
political distribution to keep the network reliable, but I think we're far
from being in danger on the reliability front. So maybe my criteria that
the chain be validated at low cost is the wrong focus, but if it is than
what's the appropriate criteria for deciding whether it's safe by standards
of "today's technology" to raise the limit at the protocol level?
Post by Michael Naber via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hitting the limit in and of itself is not necessarily a bad thing. The
question at hand is whether we should constrain that limit below what
technology is capable of delivering. I'm arguing that not only we should
not, but that we could not even if we wanted to, since competition will
deliver capacity for global consensus whether it's in Bitcoin or in some
other product / fork.
You didn't answer the 2 questions...
Anyway, if we don't care about centralization at all, we can just remove
the limit: that's what "technology can provide".
Maybe in that case it is developers who move to a decentralized
competitor...
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Post by Michael Naber via bitcoin-dev
Hi Jorge: Many people would like to participate in a global consensus
network -- which is a network where all the participating nodes are aware
of and agree upon every transaction. Constraining Bitcoin capacity below
the limits of technology will only push users seeking to participate in a
global consensus network to other solutions which have adequate capacity,
such as BitcoinXT or others. Note that lightning / hub and spoke do not
meet requirements for users wishing to participate in global consensus,
because they are not global consensus networks, since all participating
nodes are not aware of all transactions.
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Even if you are right, first fees will rise and that will be what
pushes people to other altcoins, no?
Post by Michael Naber via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Can we agree that the first step in any potentially bad situation is
hitting the limit and then fees rising as a consequence?
Thomas Zander via bitcoin-dev
2015-08-12 07:54:24 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
Can we agree that the first step in any potentially bad situation is
hitting the limit and then fees rising as a consequence?
Fees rising due to scarcity has nothing to do with the problem. It's a
consequence that is irrelevant to me.

Bad situations are roughly divided into two parts;
* technical
* marketing.

The technical part is that we already know of several technical solutions we
will need when we have a forever-growing backlog. Without them, nodes will
crash.
On top of that, we can expect a lot of new problems we don't know yet.

IT experts are serious when they say that they avoid maxing out a system like
the plague.


Marketing-wise, full blocks mean we can only serve 3 transactions a
second, which is beyond trivial. All the banks, Nasdaq, countries,
businesses etc. now contemplating using Bitcoin itself will see this as a
risk too big to ignore, and the 1MB Bitcoin will lose 99% of its perceived
value.

If you want fees to rise, then Bitcoin should be viable, within 6
months, for something bigger than the economic size of Iceland (= the
random smallest country I know).
--
Thomas Zander
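The "3 transactions a second" figure above follows from simple arithmetic (the ~500-byte average transaction size is an assumption, roughly typical for 2015-era transactions):

```python
# ~1 MB of block space every ~10 minutes, at an assumed ~500 bytes per tx
block_bytes = 1_000_000
block_interval_s = 600
avg_tx_bytes = 500          # assumed average; real transactions vary widely

tps = block_bytes / block_interval_s / avg_tx_bytes
print(round(tps, 1))        # roughly 3.3 transactions per second
```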
Thomas Zander via bitcoin-dev
2015-08-12 08:01:57 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
On Aug 11, 2015 12:14 AM, "Thomas Zander via bitcoin-dev"
Post by Thomas Zander via bitcoin-dev
See my various emails in the last hour.
I've read them. I have read gavin's blog posts as well, several times.
I still don't see what else we can fear from not increasing the size, apart
from fees maybe rising and making some problems that need to be solved
regardless of the size more visible.
[]
Post by Jorge Timón via bitcoin-dev
This discussion is frustrating for everyone. I could also say "This has
been explained many times" and similar things, but that's not productive.
I'm not trying to be obstinate. Please, answer what else there is to fear, or
admit that all your fears are just potential consequences of rising fees.
Since you replied to me;

I have to admit I find that a little depressing.
I put forward about 10 reasons in the last 24 hours and all you remember is
something with fees. Which, that's the funny part, I never wrote as being a
problem directly.
Post by Jorge Timón via bitcoin-dev
With the risk of sounding condescending or aggressive... Really, it's not that
hard to answer questions directly and succinctly.
I would really like to avoid putting blame. I'd like to avoid the FUD
accusation and calling people paranoid, even yourself, sounds rather bad
too...

Personally I think it's a bad idea to write the way you do, which is that
some people have to prove that bad things will happen if we don't make a
certain change. It polarizes the discussion and puts people into camps. People
have to choose sides.

I've been reading the blocksize debate for months now and have been wondering
why people here are either for or against; it makes no sense to me.
Neither camp is right, and everyone knows this!
Everyone knows that bigger blocks don't solve the scalability problem.
Everyone knows that you can't get substantial growth using lightning or higher
fees in, say, the next 12 months.

please reply to this email;
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-August/010129.html
--
Thomas Zander
Jorge Timón via bitcoin-dev
2015-08-12 08:51:57 UTC
Permalink
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
On Aug 11, 2015 12:14 AM, "Thomas Zander via bitcoin-dev"
Post by Thomas Zander via bitcoin-dev
See my various emails in the last hour.
I've read them. I have read gavin's blog posts as well, several times.
I still don't see what else we can fear from not increasing the size, apart
from fees maybe rising and making some problems that need to be solved
regardless of the size more visible.
[]
And again, you dodge the question...
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
This discussion is frustrating for everyone. I could also say "This has
been explained many times" and similar things, but that's not productive.
I'm not trying to be obstinate. Please, answer what else there is to fear, or
admit that all your fears are just potential consequences of rising fees.
Since you replied to me;
I have to admit I find that a little depressing.
I put forward about 10 reasons in the last 24 hours and all you remember is
something with fees. Which, that's the funny part, I never wrote as being a
problem directly.
It's not that I don't remember, it's that for all your "reasons" I can
always say one of these:

1) This could only be an indirect consequence of rising fees (people will
move to a competitive system, cheap transactions will become unreliable,
etc).
2) This problem will appear with other sizes too and it needs to be solved
permanently no matter what (dumb mempool design, true scalability, etc)
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
With the risk of sounding condescending or aggressive... Really, it's not that
hard to answer questions directly and succinctly.
I would really like to avoid putting blame. I'd like to avoid the FUD
accusation and calling people paranoid, even yourself, sounds rather bad
too...
Personally I think it's a bad idea to write the way you do, which is that
some people have to prove that bad things will happen if we don't make a
certain change. It polarizes the discussion and puts people into camps. People
have to choose sides.
Whatever; even suggesting that you may just want to spread FUD, and that's why
you don't respond directly to the questions, made you respond directly to
the question: you answered with "[]".
I just give up trying to get people worried about a non-increase in the short
term to answer that question. I will internally think that they just
want to spread FUD, but not be very vocal about it.
It just seems strange to me that you don't want to prove to me that's not
Post by Thomas Zander via bitcoin-dev
Everyone knows that bigger blocks don't solve the scalability problem.
I'm not so sure, people keep talking about the need to scale the system by
increasing the consensus maximum...
But I'm happy that, indeed, many (possibly most?) people understand this.
Post by Thomas Zander via bitcoin-dev
Everyone knows that you can't get substantial growth using lightning or higher
fees in, say, the next 12 months.
I disagree with this.
In any case, how can future demand be easier to predict than software
development times?
Thomas Zander via bitcoin-dev
2015-08-12 09:23:13 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
Personally I think it's a bad idea to write the way you do, which is that
some people have to prove that bad things will happen if we don't make a
certain change. It polarizes the discussion and puts people into camps. People
have to choose sides.
Whatever,
No, please don't just say "whatever". Show some respect, please.

If you have the courage to say people are spreading FUD you really should
have already exhausted all possible avenues of cooperation.
Now you look like you give up and blame others.
Post by Jorge Timón via bitcoin-dev
I just give up trying to get people worried about a non-increase in the short
term to answer that question. I will internally think that they just
want to spread FUD, but not be very vocal about it.
Again, I've been trying really hard to give you answers, straight answers.
It saddens me if you really are giving up trying to understand what people
equally enthusiastic about this technology may see that you don't see.
Post by Jorge Timón via bitcoin-dev
It just seems strange to me that you don't want to prove to me that's not
In the evolution of Bitcoin over the next couple of years we need bigger
blocks for a lot of different reasons. One of them is that LN isn't here.
The other is that we have known bugs that we have to fix, and that will take
time. Time we are running out of.
To buy more time, get bigger blocks now.

Anyway, I dislike your approach, as I said in the previous mail.
It's not about people spreading FUD or sidestepping the question; it is about
keeping the discussion civilised. You are essentially the one that asks,
"if you are not beating your wife, please prove it to me",
and then you get upset when I try to steer the conversation into less
black/white situations...
And, yes, that analogy is apt because you can't prove either.
--
Thomas Zander
Jorge Timón via bitcoin-dev
2015-08-12 09:45:53 UTC
Permalink
On Wed, Aug 12, 2015 at 11:23 AM, Thomas Zander via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
Personally I think it's a bad idea to write the way you do, which is that
some people have to prove that bad things will happen if we don't make a
certain change. It polarizes the discussion and puts people into camps. People
have to choose sides.
Whatever,
No, please don't just say "whatever". Show some respect, please.
If you have the courage to say people are spreading FUD you really should
have already exhausted all possible avenues of cooperation.
Now you look like you give up and blame others.
I feel people aren't being respectful with me either, but what I feel
doesn't matter.
I really feel I am very close to exhausting all possible avenues for getting
that question directly answered.
Suggesting that the answer doesn't come because the goal is just to
spread FUD was one of my last hopes. And it didn't work!
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
I just give up trying to get people worried about a non-increase in the short
term to answer that question. I will internally think that they just
want to spread FUD, but not be very vocal about it.
Again, I've been trying really hard to give you answers, straight answers.
It saddens me if you really are giving up trying to understand what people
equally enthusiastic about this technology may see that you don't see.
This question had been dodged repeatedly (one more time in this last response).
I could list all the times I have repeated the question in various
forms in the last 2 weeks and the "answers" I received (when I
received any answer at all) but I'm afraid that will take too much
time.
Then we could go one by one and classify them as:

1) Potential indirect consequence of rising fees.
2) Software problem independent of a concrete block size that needs to
be solved anyway, often specific to Bitcoin Core (ie other
implementations, say libbitcoin may not necessarily share these
problems).

If you think there's more "problem groups", please let me know.
Otherwise I don't see the point in repeating the question. I have not
received a straight answer but you think you've given it.
Seems like a dead end.

On Wed, Aug 12, 2015 at 11:25 AM, Thomas Zander via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
Post by Jorge Timón via bitcoin-dev
Don't fear this happening at 1 MB, fear this happening at any size. This
needs to be solved regardless of the block size.
I know, everyone knows.
I don't think everybody knows, but thank you for saying this
explicitly! Now I know for sure that you do.
Now I know that you are ok with classifying this concern under group 2
in my above list.
Thomas Zander via bitcoin-dev
2015-08-12 16:24:24 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
This question had been dodged repeatedly (one more time in this last response).
This "last response" had a very direct answer to your question; why do you
think it was dodged?
I wrote: "To buy more time, get bigger blocks now." (quoted from
parent-of-parent)
But also said here;
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-August/010129.html
and here;
http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-August/010186.html
Post by Jorge Timón via bitcoin-dev
1) Potential indirect consequence of rising fees.
I have not made any arguments that fall within this section.
Post by Jorge Timón via bitcoin-dev
2) Software problem independent of a concrete block size that needs to
be solved anyway
I'd like to suggest that number two is not described narrowly enough. This
includes everything that we need, ever will need and want in the future...

2) Testing and architectural improvements related to nodes that get more
transactions than can be handled for a considerable time.

This includes problems we don't know of yet, since we haven't run under
these conditions before.
Post by Jorge Timón via bitcoin-dev
If you think there's more "problem groups", please let me know.
Otherwise I don't see the point in repeating the question. I have not
received a straight answer but you think you've given it.
I quoted one such answer above; I would be interested in knowing how you
missed it.
Here is another one from 2 hours before that email;
"All the banks, Nasdaq, countries, businesses etc.
"now contemplating using Bitcoin itself will see this as a risk too big to
"ignore, and the 1MB Bitcoin will lose 99% of its perceived value."

source; http://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-August/010180.html
Post by Jorge Timón via bitcoin-dev
Seems like a dead end.
After repeating some answers you said were missing, it would be nice to
know where the connection drops.

Maybe you don't understand what people answer, if so, please ask to explain
instead of saying people are dodging the question. ;)
--
Thomas Zander
BitMinter operator via bitcoin-dev
2015-08-17 14:49:21 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
1) Potential indirect consequence of rising fees.
2) Software problem independent of a concrete block size that needs to
be solved anyway, often specific to Bitcoin Core (ie other
implementations, say libbitcoin may not necessarily share these
problems).
I don't think rising fees is the issue.

Imagine that the government is worried because airlines are selling
tickets cheaply and may run themselves out of business. So their
solution is passing a new law that says only one commercial airplane is
allowed to be in the air at any given time.

This should help a ticket market to develop and prevent airlines from
giving away almost free tickets. In this way the government can protect
the airlines from themselves.

I would not classify all issues that would come out of this as
"potential indirect consequences of rising ticket prices."

It would just make air travel unusable.

That's the problem we may face in the short term.

It would be unwise to go all-in on a solution that doesn't exist yet,
which may or may not arrive in time, and may or may not do the job that
is needed. We need to use the solution we already have so that we can
get by in the short term.

I don't think mining pools will immediately make blocks as big as
possible if the hard limit is raised. Remember that mining pools had to
be coaxed into increasing their block size. Mining pools were making
small blocks to reduce the rate of orphaned blocks. Block propagation is
faster today, but this issue still exists. You need a lot of transaction
fees to make up for the danger of losing 25 BTC. Many pools don't even
pay out transaction fee income to their miners.
--
Regards,
Geir H. Hansen, Bitminter mining pool
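Geir's point about orphan risk can be made quantitative with a back-of-the-envelope sketch (assuming Poisson block arrivals; the model and numbers are illustrative, not a measurement of the real network):

```python
import math

def expected_orphan_cost(extra_delay_s, reward_btc=25.0, interval_s=600.0):
    """Expected reward lost to orphaning if your block reaches the rest of
    the hashrate extra_delay_s seconds later: with Poisson arrivals, a
    competing block is found in that window with probability
    1 - exp(-t / interval)."""
    p_orphan = 1.0 - math.exp(-extra_delay_s / interval_s)
    return p_orphan * reward_btc

# Each extra second of propagation delay costs roughly 0.04 BTC in
# expectation, so the marginal transactions that cause the delay must
# pay at least that much in fees to be worth including.
print(round(expected_orphan_cost(1.0), 4))
print(round(expected_orphan_cost(10.0), 3))
```

This is why pools historically kept blocks small: with fees far below ~0.04 BTC per second of added propagation delay, bigger blocks were a net loss.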
Peter Todd via bitcoin-dev
2015-08-17 15:01:30 UTC
Permalink
Post by BitMinter operator via bitcoin-dev
I don't think mining pools will immediately make blocks as big as
possible if the hard limit is raised.
Note that XT includes a patch that sets the soft limit to be the same as the hard limit by default, so if miners did use the defaults, "as big as possible" blocks would be produced.
Gavin Andresen via bitcoin-dev
2015-08-10 14:12:05 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.
What are the other reasons?
Post by Gavin Andresen via bitcoin-dev
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.
When "the network runs out of capacity" (when we hit the limit) do we
expect anything to happen apart from minimum market fees rising (above
zero)?
Obviously any consequences of fees rising are included in this concern.
It is frustrating to answer questions that we answered months ago,
especially when I linked to these in response to your recent "increase
advocates say that not increasing the max block size will KILL BITCOIN"
false claim:
http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent
https://medium.com/@octskyward/crash-landing-f5cc19908e32

Executive summary: when networks get over-saturated, they become
unreliable. Unreliable is bad.

Unreliable and expensive is extra bad, and that's where we're headed
without an increase to the max block size.

RE: the recent thread about "better deal with that type of thing now rather
than later": exactly the same argument can be made about changes needed
to support a larger block size -- "better to do that now than to do that
later." I don't think either of those arguments is very convincing.
--
--
Gavin Andresen
Alex Morcos via bitcoin-dev
2015-08-10 14:24:18 UTC
Permalink
Gavin,
They are not analogous.

Increasing performance and making other changes that will help allow
scaling can be done while at small scale or large scale.
Dealing with full blocks and the resultant feedback effects is something
that can only be done when blocks are full. It's just too complicated a
problem to solve without seeing the effects first hand, and unlike the
block size/scaling concerns, its binary, you're either in the situation
where demands outgrows supply or you aren't.

Fee estimation is one example, I tried very hard to make fee estimation
work well when blocks started filling up but it was impossible to truly
test and in the small sample of full blocks we've gotten since the code
went live, many improvements made themselves obvious. Expanding mempools
is another issue that doesn't exist at all if supply > demand. Turns out
to also be a difficult problem to solve.
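As a rough illustration of why full blocks are a prerequisite for testing this (this is a toy, not Bitcoin Core's actual estimatesmartfee logic), a fee estimator of this kind boils down to picking the cheapest feerate bucket that has historically confirmed often enough, and that question only becomes interesting once some buckets start failing:

```python
# Toy fee estimator in the spirit of what Alex describes: observe which
# feerate buckets actually confirmed within a target number of blocks, then
# pick the cheapest bucket that succeeded often enough. Simplified
# illustration only; not Bitcoin Core's algorithm.

from collections import defaultdict

def estimate_feerate(observations, target_success=0.95):
    """observations: list of (feerate_bucket, confirmed_within_target)."""
    stats = defaultdict(lambda: [0, 0])      # bucket -> [confirmed, total]
    for bucket, ok in observations:
        stats[bucket][1] += 1
        stats[bucket][0] += ok
    # Return the cheapest bucket meeting the success threshold.
    for bucket in sorted(stats):
        confirmed, total = stats[bucket]
        if confirmed / total >= target_success:
            return bucket
    return None  # nothing reliable observed -- the "blocks are full" case

# While blocks are mostly empty, even the lowest bucket always succeeds,
# so the data tells you nothing about behavior under congestion:
empty_era = [(1, True)] * 20 + [(5, True)] * 20
print(estimate_feerate(empty_era))  # -> 1
```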

Nevertheless, I mostly agree that these arguments shouldn't be the reason
not to expand block size, I think they are more just an example of how
immature all of this technology is, and we should be concentrating on
improving it before we're trying to scale it to world acceptance levels.
The saddest thing about this whole debate is how fundamental improvements
to the science of cryptocurrencies (things like segregated witness and
confidential transactions) are just getting lost in the circus debate
around trying to cram a few more users into the existing system sooner
rather than later.



On Mon, Aug 10, 2015 at 10:12 AM, Gavin Andresen via bitcoin-dev <
Post by Gavin Andresen via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
I think there are multiple reasons to raise the maximum block size, and
yes, fear of Bad Things Happening as we run up against the 1MB limit is one
of the reasons.
What are the other reasons?
Post by Gavin Andresen via bitcoin-dev
I take the opinion of smart engineers who actually do resource planning
and have seen what happens when networks run out of capacity very seriously.
When "the network runs out of capacity" (when we hit the limit) do we
expect anything to happen apart from minimum market fees rising (above
zero)?
Obviously any consequences of fees rising are included in this concern.
It is frustrating to answer questions that we answered months ago,
especially when I linked to these in response to your recent "increase
advocates say that not increasing the max block size will KILL BITCOIN"
http://gavinandresen.ninja/why-increasing-the-max-block-size-is-urgent
Executive summary: when networks get over-saturated, they become
unreliable. Unreliable is bad.
Unreliable and expensive is extra bad, and that's where we're headed
without an increase to the max block size.
RE: the recent thread about "better deal with that type of thing now
rather than later" : exactly the same argument can be made about changes
needed to support a larger block size-- "better to do that now than to do
that later." I don't think either of those arguments are very convincing.
--
--
Gavin Andresen
_______________________________________________
bitcoin-dev mailing list
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Thomas Zander via bitcoin-dev
2015-08-10 22:12:38 UTC
Permalink
Post by Alex Morcos via bitcoin-dev
think they are more just an example of how
immature all of this technology is, and we should be concentrating on
improving it before we're trying to scale it to world acceptance levels.
Would it be an idea to create a generator of transactions on the test network
that a large number of people can run? Using some randomization as well as the
actual estimation code would generate some reasonably useful data.

I'd volunteer my node to run that.
--
Thomas Zander
Pieter Wuille via bitcoin-dev
2015-08-10 14:34:55 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
Executive summary: when networks get over-saturated, they become
unreliable. Unreliable is bad.
Unreliable and expensive is extra bad, and that's where we're headed
without an increase to the max block size.
I think I see your point of view. You see demand for on-chain transactions
as a single number that grows with adoption. Once the transaction creation
rate grows close to the capacity, transactions will become unreliable, and
you consider this a bad thing.

And if you see Bitcoin as a payment system where guaranteed time to
confirmation is a feature, I fully agree. But I think that is an
unrealistic dream. It only seems reliable because of lack of use. It costs
1.5 BTC per day to create enough transactions to fill the block chain at
the minimum relay fee, and a small multiple of that at actual fee levels.
Assuming that rate remains similar with an increased block size, that
remains cheap.
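The arithmetic behind that "1.5 BTC per day" figure, assuming Bitcoin Core's 2015 default minimum relay feerate of 0.00001 BTC per kilobyte, works out roughly as follows:

```python
# Back-of-the-envelope check of the "1.5 BTC per day" figure, assuming
# Bitcoin Core's 2015 default minimum relay feerate (minrelaytxfee).

MIN_RELAY_FEE = 0.00001   # BTC per kB (assumed 2015 default)
BLOCK_SIZE_KB = 1000      # the 1 MB limit
BLOCKS_PER_DAY = 24 * 6   # one block per ~10 minutes

cost_per_day = MIN_RELAY_FEE * BLOCK_SIZE_KB * BLOCKS_PER_DAY
print(round(cost_per_day, 2))  # -> 1.44 (BTC), roughly the 1.5 BTC quoted
```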

If you want transactions to be cheap, it will also be cheap to make them
unreliable.
--
Pieter
Thomas Zander via bitcoin-dev
2015-08-10 22:04:52 UTC
Permalink
Post by Pieter Wuille via bitcoin-dev
Post by Gavin Andresen via bitcoin-dev
Executive summary: when networks get over-saturated, they become
unreliable. Unreliable is bad.
Unreliable and expensive is extra bad, and that's where we're headed
without an increase to the max block size.
I think I see your point of view. You see demand for on-chain transactions
as a single number that grows with adoption. Once the transaction creation
rate grows close to the capacity, transactions will become unreliable, and
you consider this a bad thing.
Everyone in any industry will consider that a bad thing.

There is no doubt that on-chain transactions will grow, absolutely no doubt.
You can direct many people to off-chain systems, but that will not stop growth
of on-chain transactions. Bitcoin economy is absolutely tiny and there is a
huge amount of growth possible. It can only grow.
Post by Pieter Wuille via bitcoin-dev
And if you see Bitcoin as a payment system where guaranteed time to
confirmation is a feature, I fully agree.
Naturally, that is a use case, but not really one that enters my mind. It
certainly is not a requirement to have a guaranteed time.

The situation is much simpler than that.
We have maybe 0.007% of the world population using Bitcoin once a month (half
a million people). And I'm being very optimistic with that number...
This should give you an idea of how much growth is possible.

There is no doubt at all that the 1MB blocks will get full, continuously, if
we get to a higher rate of usage. Even with the vast majority of users using
Bitcoin off-chain.

As such it's not about a guaranteed time to confirmation. It's about getting a
confirmation before I die.
Post by Pieter Wuille via bitcoin-dev
If you want transactions to be cheap, it will also be cheap to make them
unreliable.
It's not about transactions being cheap. The fee market is completely
irrelevant to the block size. If you think otherwise you are delusional.
The reason it is irrelevant is that the system starts consistently
dropping transactions when user count goes up, and when that happens the
Bitcoin network loses value, because people don't put value in something that
is unreliable.

This is simple economy 101.
Look at history: so many great companies made great products that had more
features, but didn't make it because their competition, while perhaps slower
to market, was actually reliable.
--
Thomas Zander
Will Madden via bitcoin-dev
2015-08-20 14:40:50 UTC
Permalink
And if you see Bitcoin as a payment system where guaranteed time to confirmation is a feature, I fully agree. But I think that is an unrealistic dream. It only seems reliable because of lack of use. It costs 1.5 BTC per day to create enough transactions to fill the block chain at the minimum relay fee, and a small multiple of that at actual fee levels. Assuming that rate remains similar with an increased block size, that remains cheap.
Apologies, this is going to be long but please read it...

For starters, if the “minimum relay fee” is 0.0001 BTC, and the proven throughput from the recent stress tests was 2.3 trx/second, it’s 0.0001 x 2.3 x 60 x 60 x 24 or 19.872 BTC to fill up the block chain for 24 hours, not 1.5 BTC.
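For reference, the arithmetic as stated above. Note that it treats the minimum relay fee as a flat amount per transaction; Bitcoin Core's minrelaytxfee default is per kilobyte, which is where the earlier, lower per-day figure comes from:

```python
# Arithmetic behind the 19.872 BTC figure above: a flat 0.0001 BTC per
# transaction at 2.3 tx/s sustained for 24 hours. (This assumes a
# per-transaction fee; Bitcoin Core's minimum relay fee is per kB.)

FEE_PER_TX = 0.0001   # BTC, as assumed in the post
TX_PER_SECOND = 2.3   # throughput observed in the stress tests
SECONDS_PER_DAY = 60 * 60 * 24

daily_cost = FEE_PER_TX * TX_PER_SECOND * SECONDS_PER_DAY
print(round(daily_cost, 3))  # -> 19.872
```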

The math error isn’t important, because the premise of what you are saying is based on a misconception. No one is advocating that we should price fix transaction fees at some tiny amount to guarantee cheap transactions. Furthermore, it’s perfectly realistic to believe bitcoin can scale reliably without constraining the block size to 1MB, given certain constraints.

A quick disclosure, I run a small bitcoin startup that benefits from higher bitcoin prices and more people buying bitcoin with their fiat currency. I also have an inadvisably high percentage of my remaining personal savings denominated in bitcoin.

Back to the point, limiting block size to impose fee pressure is a well intentioned idea that is built on a misconception of how bitcoin's economics work. The price of bitcoin is based on the perceived value of units inside the protocol. Keeping transaction volumes through the bitcoin protocol capped at around 2.3 transactions / second limits the number of new people who can use bitcoin to around 100,000 people performing a little under 2 transactions daily. This is only a tiny bit more use than where we are presently. It’s forced stagnation.

Please, read and understand: constraining the network effect and adoption of bitcoin lowers its overall value, and by extension reduces its price as denominated in other currencies. The only present alternatives to on-blockchain transactions are centralized bank ledgers at exchanges or similar companies. Yes, while capping the bitcoin max_block_size to a level that restricts use will drive transaction fees higher, it will also reduce the underlying price of bitcoin as denominated in other currencies, because the outside observer will see bitcoin stagnating and failing to grow like a nascent but promising technology should. Higher fees combined with a lower bitcoin price negate the value of the higher fees, and the only side effect is to stymie adoption in the process, while putting more focus on layer protocols that are nowhere near ready for mainstream, stable use!

Removing the cap entirely is also a catastrophically poor idea, because some group of jerks out there will absolutely make sure that every block is 32 MB, making it a real PITA for a hobbyist to get interested in bitcoin. Yes, some miners limit blocksize in order to balance propagation times against transaction fee revenue, so there is already a mechanism in place to push transaction fees higher by limiting size or not including transactions without fees that could offset a spam happy bad actor or group of actors, but we cannot leave that to chance. The damage is too high to allow the risk. Bitcoin is going to grow because each new curious, technically savvy kid who learns about it can download and participate as a full node. We’re not anywhere close to mainstream adoption or at a level of maturity where the protocol is fully baked, so we have an obligation to keep full nodes within the grasp of a starving college kid’s budget. That is the barometer here in my mind.

40% should be our best guess for keeping bitcoin in reach of hobbyists, and safer from more napsteresque node centralization. It's simple, which makes it less prone to failure and being picked apart politically as well. It may be too fast, or it may be too slow, but it’s probably a good guess for 5 years, and if history holds true, it will work for a long time and make the cost of running a node lower gradually as well. No one can predict the future, but this is the best we have. No one knows if it will be radio propagation of blocks, quantum spin liquid based storage or data transmission, or some other breakthrough that drives down costs, but something always seems to appear that keeps the long term trends intact. So why wouldn’t we use these trends?

8MB is about 40% annually from January 2009 to today. I can buy a 5TB external hard drive right now online for $130.00 in the US. The true block time is just over 9 minutes, so that’s 160 blocks a day x 8MB x 365.25 days a year, or around 467.52GB of new block size annually. This is 10.69 years of storage for $130.00, or a little over $12 a year - which is darn close to what the cost was back in late 2010 when I first learned about this stuff... I fail to see the “centralization" issue here, and when we contrast $12/year for hobbyists against the centralization risks of mining pools, we should all be ashamed to have wasted so much energy and time talking about this specific point. The math does not add up, and it’s not a significant centralization risk when we put an 8MB cap on this with 40% annual average growth. The energy we’ve blown on this topic should have been put into refining privacy, and solving mining pool centralization. There are so many more important problems.
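The storage arithmetic above checks out; as a sketch (using the post's own figures of a ~9-minute block interval, hence ~160 blocks/day, and a 5 TB drive at $130):

```python
# Checking the hobbyist-storage arithmetic above: 8 MB blocks, ~160
# blocks/day (the post's observed ~9-minute interval), against a 5 TB
# ($130) drive. All inputs are the post's own figures.

BLOCK_MB = 8
BLOCKS_PER_DAY = 160
DAYS_PER_YEAR = 365.25
DRIVE_TB = 5
DRIVE_COST_USD = 130.0

gb_per_year = BLOCK_MB * BLOCKS_PER_DAY * DAYS_PER_YEAR / 1000  # -> 467.52
years_of_storage = DRIVE_TB * 1000 / gb_per_year                # -> ~10.69
cost_per_year = DRIVE_COST_USD / years_of_storage               # -> ~$12.16
print(gb_per_year, round(years_of_storage, 2), round(cost_per_year, 2))
```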

Let's talk about other ideas for a moment. First, while lightning is really cool and I believe it will be an exponential magnifier for bitcoin someday, that day is NOT today. Waiting for layers over bitcoin to solve a self-imposed limit of 1MB is just a terrible, horrible idea for the health of the protocol. Lightning is really well thought through, but there are still problems that need to be solved. Forced channel expiration and transaction malleability are the theoretical issues that must be solved. There WILL be issues that aren’t anticipated and known today when it goes out “into the wild”. Protocols of this complexity do not go from white paper to stability in less than one to two years. Remember, even the bitcoin reference client had a catastrophic bug 1.5 years after its January 2009 launch, in August 2010. I read here that the "deepest thinkers" believe we should wait for overlays to solve this bottleneck; well, bluntly, that is far from practical or pragmatic, and is “ivory tower” thinking. Discussing approaches like this is worse than agreeing to do nothing, because it drains our attention away from other more pressing issues that have a time limit on our ability to solve them before the protocol crystallizes from the scale of the network effect (like privacy, mining centralization, etc.), on top of accomplishing little of immediate value other than academic debates.

I’m not winning any popularity contests today… but it was a bad idea to approach this as we did with XT. We should have put in a solution that addressed just the cap size issue and nothing more. Other changes, pork, and changing the nature of the community management around the XT client is just too much political baggage to work without fracturing the support of the community. And guess what happened? We have the major community forum moderators actively censoring posts, banning users, and things are looking to the outside observer as if the entire system is starting to fall in on itself. Truth is, ladies and gentlemen, our egos and economic interests are creating a tragedy of the commons that will hurt the lot of us far more than it will help the best off of us. Yeah, I get that no one wants to code in a hostile environment, and this community has definitely turned caustic and behaves like a mob of petulant children, but sometimes you have to suck it up and get things done.

So… what do we do? We should get our @#$@ together, stop the academic grandstanding and ego-driven debates, raise the cap to 8MB and permit an average growth of 40% a year, then get back to solving real problems and working on layer and side chain magnifiers. Allowing bitcoin to grow reasonably allows adoption to spread and the price to rise, which creates more demand, higher prices, and more fees. Again, because the fees and coinbase rewards are denominated in bitcoin, this increases the return to miners. This, combined with allowing for growth, will encourage the price to rise and increase stability for layers and side chains later on down the road when the technology is stable and mature.

For the love of whatever it is you care about, can we please just get this done? Do I have to go brush up on my C++ and start asking everyone obnoxious, amateur questions? Trust me, no one wants that. Let’s just get the cap raised to 8MB + 40% on average annualized and fight viciously about privacy or mining centralization. Something more important.

Thanks.
Jorge Timón via bitcoin-dev
2015-08-10 14:55:40 UTC
Permalink
Post by Gavin Andresen via bitcoin-dev
Executive summary: when networks get over-saturated, they become
unreliable. Unreliable is bad.
Post by Gavin Andresen via bitcoin-dev
Unreliable and expensive is extra bad, and that's where we're headed
without an increase to the max block size.

I'm not trying to be obstinate but I seriously can't see how they are
different.
When you say unreliable I think you mean "unreliable for cheap fee
transactions". Transactions with the highest fees will always confirm
reliably. For example, a 1 btc fee tx will probably always confirm very
reliably even if capacity never increases and demands increases a lot.
Thomas Zander via bitcoin-dev
2015-08-10 22:09:14 UTC
Permalink
Post by Jorge Timón via bitcoin-dev
I'm not trying to be obstinate but I seriously can't see how they are
different.
When you say unreliable I think you mean "unreliable for cheap fee
transactions". Transactions with the highest fees will always confirm
reliably. For example, a 1 btc fee tx will probably always confirm very
reliably even if capacity never increases and demands increases a lot.
The actual fee is irrelevant; the number of transactions is relevant.

Have you ever been to a concert that was far away from public transport? They
typically set up bus shuttles, or taxis to get people back into town
afterwards.
The result is that you always end up waiting forever, and it may actually be
easier to just walk instead of waiting.
The amount you pay is irrelevant if everyone is paying it. There still is more
demand than there is capacity.

At the concert the amount of people will stop after some time, and you'd get
your bus. But in the scenarios created here the queues will never stop.

So, no, it's not just unreliable for cheap-fee transactions.
It's unreliable for all types of transactions.
--
Thomas Zander
Pieter Wuille via bitcoin-dev
2015-08-10 22:52:23 UTC
Permalink
On Aug 11, 2015 12:18 AM, "Thomas Zander via bitcoin-dev" <
Post by Thomas Zander via bitcoin-dev
Have you ever been to a concert that was far away from public transport? They
typically set up bus shuttles, or taxis to get people back into town
afterwards.
The result there is always you end up waiting forever and it actually may be
easier to just walk instead of wait.
The amount you pay is irrelevant if everyone is paying it. There still is more
demand than there is capacity.
That's an incorrect analogy. You choose the rate you pay, and get higher
priority when you pay more. Taxi drivers can't pick out higher-paying
customers in advance.

A better comparison is Uber, which charges more in places with high demand,
and you can accept or refuse in advance. And yes, it remains reliable if
you're among those with the highest willingness to pay.
Post by Thomas Zander via bitcoin-dev
So, no, its not unreliable for cheap free transactions.
Its unreliable for all types of transactions.
If 2500 transactions fit in the block chain per day (assuming constant
size) and there are less than 2500 per hour that pay at least 0.001 BTC in
fee, then any transaction which pays more than 0.001 BTC will have a very
high chance of getting in a small multiple of one hour, since miners
prioritize by feerate.

If there are in addition to that 5000 transactions per hour which pay less,
then yes, they need to compete for the remaining space and their
confirmation will be unreliable.
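A minimal sketch of the feerate prioritization Pieter describes: miners sort waiting transactions by feerate and fill the block top-down, so a transaction's reliability depends only on how much higher-paying demand exists, not on its absolute fee. The sizes and fees below are made-up illustrative values:

```python
# Toy model of feerate-sorted block building. Miners take the
# highest-feerate transactions first until the block is full.

def fill_block(mempool, capacity_kb):
    """mempool: list of (txid, size_kb, fee_btc). Returns included txids."""
    by_feerate = sorted(mempool, key=lambda tx: tx[2] / tx[1], reverse=True)
    included, used = [], 0.0
    for txid, size, fee in by_feerate:
        if used + size <= capacity_kb:
            included.append(txid)
            used += size
    return included

mempool = [("high", 0.5, 0.001), ("mid", 0.5, 0.0005), ("low", 0.5, 0.00001)]
print(fill_block(mempool, capacity_kb=1.0))  # -> ['high', 'mid']
```

The "low" transaction only confirms when capacity exceeds the higher-paying demand, which is exactly the competition-for-remaining-space point above.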

The whole point is that whether confirmation at a particular price point is
reliable depends on how much demand there is at that price point. And
increasing the block size out of fear of what might happen is failing to
recognize that it can always happen that there is a sudden change in demand
that outcompetes the rest.

The point is not that evolution towards a specific higher feerate needs to
happen, but an evolution to an ecosystem that accepts that there is never a
guarantee for reliability, unless you're willing to pay more than everyone
else - whatever that number is.
--
Pieter
Pieter Wuille via bitcoin-dev
2015-08-10 23:11:14 UTC
Permalink
Post by Pieter Wuille via bitcoin-dev
On Aug 11, 2015 12:18 AM, "Thomas Zander via bitcoin-dev" <
Post by Thomas Zander via bitcoin-dev
Have you ever been to a concert that was far away from public transport? They
typically set up bus shuttles, or taxis to get people back into town
afterwards.
The result there is always you end up waiting forever and it actually may be
easier to just walk instead of wait.
The amount you pay is irrelevant if everyone is paying it. There still is more
demand than there is capacity.
That's an incorrect analogy. You choose the rate you pay, and get higher
priority when you pay more. Taxi drivers can't pick out higher-paying
customers in advance.

I'm sorry, I missed your "if everyone is paying it". This changes a lot. I
agree with you: if everyone wants to pay much then it becomes unreliable.

But I don't think that is something we can avoid with a small constant
factor block size increase, and we don't do the world a service by making
it look like it works for longer.

Let's grow within boundaries set by technology and centralization pressure
that we can agree on. Let the market decide how it will value low-volume
reliable transactions and/or high-volume unreliable ones.
--
Pieter
Thomas Zander via bitcoin-dev
2015-08-11 05:34:11 UTC
Permalink
Post by Pieter Wuille via bitcoin-dev
The whole point is that whether confirmation at a particular price point is
reliable depends on how much demand there is at that price point. And
increasing the block size out of fear of what might happen is failing to
recognize that it can always happen that there is a sudden change in demand
that outcompetes the rest.
The point is not that evolution towards a specific higher feerate needs to
happen, but an evolution to an ecosystem that accepts that there is never a
guarantee for reliability, unless you're willing to pay more than everyone
else - whatever that number is.
I'm going to go with this one, since we are seeking common ground and all of
this makes sense to me. And I bet to Gavin would agree to this too.

The question I want to ask is this;

How do you expect to get from the current situation to the one outlined above?

There are several market forces at work;

* people currently expect near-free payments.
* people currently expect zero-confirmations.
* Bitcoin is seeing a huge amount of uptake, popularity, etc.
* With Greece still in flux, there is a potential enormous spike of usage set
to come when (not if) the Euro falls.


I conclude that we need;

* to create and ship working solutions like LN, sidechains, etc.
This should allow people to get their fast confirmation time. Who cares that
it's of a different nature; the point is that the coffeeshop owner lets you
leave with your coffee. We can sell that.

* To buy ourselves time: LN is not done, and Bitpay and friends work on-chain.
That won't change for a year at least.
We need to move the max block size to a substantially bigger size to allow
Bitcoin to grow.


Unfortunately for us all, Bitcoin is oversold. We don't have a sales
department, but the word-of-mouth leaves us in a bad situation. And we need to
react to make sure the product isn't killed by bad publicity. What Gox didn't
manage can certainly happen when people find out that Bitcoin can't currently
do any of the things everyone is talking about.
So, while LN is written, rolled out and tested, we need to respond with bigger
blocks. 8Mb - 8Gb sounds good to me.

Can everyone win?
--
Thomas Zander
Mark Friedenbach via bitcoin-dev
2015-08-11 06:03:39 UTC
Permalink
On Mon, Aug 10, 2015 at 10:34 PM, Thomas Zander via bitcoin-dev <
Post by Thomas Zander via bitcoin-dev
So, while LN is written, rolled out and tested, we need to respond with bigger
blocks. 8Mb - 8Gb sounds good to me.
This is where things diverge. It's fine to pick a new limit or growth
trajectory. But defend it with data and reasoned analysis.

Can you at least understand the conservative position here? "1MB sounds
good to me" is how we got into this mess. We must make sure that we avoid
making the same mistakes again, creating more or worse problems than we are
solving.
Thomas Zander via bitcoin-dev
2015-08-11 06:31:11 UTC
Permalink
Post by Mark Friedenbach via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
So, while LN is written, rolled out and tested, we need to respond with bigger
blocks. 8Mb - 8Gb sounds good to me.
This is where things diverge. It's fine to pick a new limit or growth
trajectory. But defend it with data and reasoned analysis.
We currently serve about 0.007% of the world population sending maybe one
transaction a month.
This can only go up.

There are about 20 currencies in the world that are unstable and showing early
signs of hyperinflation. If even a small percentage of these people cash out
and get Bitcoins for their savings, the number of people using Bitcoin for
savings would go from maybe half a million to 10 million in the space of a
couple of months. Why so fast? Because all the world currencies are linked.
Practically all currencies follow the USD, and while that one may stay robust
and standing, the linkage has been shown in the past to cause chain-effects.

It is impossible to predict how much uptake Bitcoin will take, but we have
seen big rises in price as Cyprus had a bailin and then when Greece first
showed bad signs again.
Let's do our due diligence and agree that in the current world economy there
are sure signs that people are considering Bitcoin on a big scale.

A bigger number of people holding Bitcoin savings won't make the transaction
rate go up very much, but if you have feet on the ground you already see that
people are going back to barter in countries like Poland, Ireland, Greece, etc.
And Bitcoin will be an alternative too good to ignore. Then transaction rates
will go up. Dramatically.

If you are asking for numbers, that is a bit tricky. Again: we are at
0.007%... That's like a f-ing rounding error in the world economy. You can't
reason from that. It's like using a float to do calculations that you should
have done in a double and getting weird output.


Bottom line is that a maximum size of 8MB blocks is not that odd, because a
20-times increase is very common in a "company" that is about 6 years old.
For instance, Android was about that age when it started to get shipped by
non-Google companies. There the increase was substantially bigger, and the
company backing it was definitely able to change direction faster than the
Bitcoin oil tanker can change direction.


On the other side, 3TB hard drives are sold which handle 8MB blocks without
problems.
You can buy broadband in every relevant country that easily supports the
bandwidth we need. (Remember, we won't jump to 8MB in a day; it will likely
take at least 6 months.)
We should get the invertible Bloom lookup table (IBLT) stuff (or competing
products) working at least on a one-to-one basis so we can solve the block
propagation time problem. There frankly is a huge amount of optimization that
can be done in that area; we don't even use locality (ping time) to optimize
distribution. From my experience you can expect a two-order-of-magnitude
speedup in that same 6-month period by focusing some research there.



Another metric to remember: if you follow Hacker News (well, the incubator more
than the linked articles) you'd be exposed to the thinking of these startups.
Their only criterion is growth, and rather substantial growth at that, like
150% per month. Naturally, most of these build on top of HTML or other
existing technologies. But the point is that exponential growth is expected
in any startup. They typically have a much more aggressive timeline,
though: every month instead of every year.
Having exponential growth in the blockchain is really not odd, and even if we
have LN or sidechains or the next ChangeTip, this space will be used. And we
will still have scarcity.

Remember: 8GB/block still doesn't support VISA/Mastercard.
--
Thomas Zander
Mark Friedenbach via bitcoin-dev
2015-08-11 07:08:42 UTC
Permalink
On Mon, Aug 10, 2015 at 11:31 PM, Thomas Zander via bitcoin-dev <
Post by Thomas Zander via bitcoin-dev
Post by Mark Friedenbach via bitcoin-dev
This is where things diverge. It's fine to pick a new limit or growth
trajectory. But defend it with data and reasoned analysis.
We currently serve about 0.007% of the world population, sending maybe one
transaction a month.
This can only go up.
There are about 20 currencies in the world that are unstable and showing early
signs of hyperinflation. If even a small percentage of these people cash out
and get Bitcoins for their savings, you'd have the number of people using
Bitcoin for savings go from maybe half a million to 10 million in the space of
a couple of months. Why so fast? Because all the world currencies are linked.
Practically all currencies follow the USD, and while that one may stay robust
and standing, the linkage has been shown in the past to cause
chain-effects.
It is impossible to predict how much uptake Bitcoin will see, but we have
seen big rises in price as Cyprus had a bail-in and then when Greece first
showed bad signs again.
Let's do our due diligence and agree that in the current world economy there
are sure signs that people are considering Bitcoin on a big scale.
A bigger number of people holding Bitcoin savings won't make the transaction
rate go up very much, but if you have feet on the ground you already see that
people go back to barter in countries like Poland, Ireland, Greece etc.
And Bitcoin will be an alternative too good to ignore. Then transaction rates
will go up. Dramatically.
If you are asking for numbers, that is a bit tricky. Again, we are at
0.007%... That's like a f-ing rounding error in the world economy. You can't
reason from that. It's like using a float to do calculations that you should
have done in a double and getting weird output.
Bottom line is that a maximum size of 8MB blocks is not that odd, because a 20
times increase is very common in a "company" that is about 6 years old.
For instance, Android was about that age when it started to get shipped by
non-Google companies. There the increase was substantially bigger, and the
company backing it was definitely able to change direction faster than the
Bitcoin oil tanker can.
...
Another metric to remember: if you follow Hacker News (well, the incubator more
than the linked articles) you'd be exposed to the thinking of these startups.
Their only criterion is growth, and rather substantial growth at that. Like
150% per month. Naturally, most of these build on top of HTML or other
existing technologies. But the point is that exponential growth is expected
in any startup. They typically have a much more aggressive timeline,
though: every month instead of every year.
Having exponential growth in the blockchain is really not odd, and even if we
have LN or sidechains or the next ChangeTip, this space will be used. And we
will still have scarcity.
I'm sorry, I really don't want to sound like a jerk, but not a single word
of that mattered. Yes we all want Bitcoin to scale such that every person
in the world can use it without difficulty. However if that were all that
we cared about then I would be remiss if I did not point out that there are
plenty of better, faster, and cheaper solutions to finding global consensus
over a payment ledger than Bitcoin. Architectures which are algorithmically
superior in their scaling properties. Indeed they are already implemented
and you can use them today:

https://www.stellar.org/
http://opentransactions.org/

So why do I work on Bitcoin, and why do I care about the outcome of this
debate? Because Bitcoin offers one thing, and one thing only which
alternative architectures fundamentally lack: policy neutrality. It can't
be censored, it can't be shut down, and the rules cannot change from
underneath you. *That* is what Bitcoin offers that can't be replicated at
higher scale with a SQL database and an audit log.

It follows then, that if we make a decision now which destroys that
property, which makes it possible to censor bitcoin, to deny service, or to
pressure miners into changing rules contrary to user interests, then
Bitcoin is no longer interesting. We might as well get rid of mining at
that point and make Bitcoin look like Stellar or Open-Transactions because
at least then we'd scale even better and not be pumping millions of tons of
CO2 into the atmosphere from running all those ASICs.

Post by Thomas Zander via bitcoin-dev
On the other side, 3TB hard drives are sold which handle 8MB blocks without
problems.
Straw man, storage is not an issue.
Post by Thomas Zander via bitcoin-dev
You can buy broadband in every relevant country that easily supports the
bandwidth we need. (Remember we won't jump to 8MB in a day; it will likely
take at least 6 months.)
Neither one of those assertions is clear. Keep in mind the goal is to have
Bitcoin survive active censorship. Presumably that means being able to run
a node even in the face of a hostile ISP or government. Furthermore, it
means being location independent and being able to move around. In many
places the higher the bandwidth requirements the fewer the number of ISPs
that are available to service you, and the more visible you are.

It may also be necessary to be able to run over Tor. And not just today's
Tor which is developed, serviced, and supported by the US government, but a
Tor or I2P that future governments have turned hostile towards and actively
censor or repress. Or existing authoritarian governments, for that matter.
How much bandwidth would be available through those connections?

It may hopefully never be necessary to operate under such constraints,
except by freedom-seeking individuals within existing totalitarian regimes.
However the credible threat of doing so may be what keeps Bitcoin from
being repressed in the first place. Lose the capability to go underground,
and it will be pressured into regulation, eventually.

To the second point, it has been previously pointed out that large miners
stand to gain from larger blocks, for the same basic underlying reasons as
selfish mining. The incentive is to increase blocks, and miners are able to
do so at will and without cost. I would not be so certain that we wouldn't
see large blocks sooner than that.
Post by Thomas Zander via bitcoin-dev
We should get the inverted bloom filters stuff (or competing products) working
at least on a one-to-one basis so we can solve the propagation time problem.
There frankly is a huge amount of optimization that can be done in that area,
we don't even use locality (pingtime) to optimize distribution.
From my experience you can expect a two-order-of-magnitude speedup in that
same six-month period by focusing some research there.
This is basically already deployed thanks to Matt's relay network. Further
improvements are not going to have dramatic effects.
Post by Thomas Zander via bitcoin-dev
Remember, 8GB/block still doesn't support VISA/Mastercard.
No, it doesn't. And 8GB/block is ludicrously large -- it would absolutely,
without any doubt destroy the very nature of Bitcoin, turning it into a
fundamentally uninteresting reincarnation of the existing financial system.
And still be unable to compete with VISA/Mastercard.

So why then the pressure to go down a route that WILL lead to failure by
your own metrics?

I humbly suggest that maybe we should play to the strengths of Bitcoin instead
-- its trustlessness via policy neutrality.

Either that, or go work on Stellar. Because that's where it's headed
otherwise.
Thomas Zander via bitcoin-dev
2015-08-11 08:38:06 UTC
Permalink
You asked to be convinced of the need for bigger blocks. I gave that.
What makes you think bitcoin will break when more people use it?

Sent on the go, excuse the brevity.
_______________________________________________
bitcoin-dev mailing list
bitcoin-***@lists.linuxfoundation.org
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
Angel Leon via bitcoin-dev
2015-08-11 09:14:07 UTC
Permalink
- policy neutrality.
- It can't be censored.
- it can't be shut down
- and the rules cannot change from underneath you.

except it can be shut down by its own inability to scale the minute it
actually gets used.

what's the point of having all this if nobody can use it?
what's the point of going through all that energy and CO2 for a mere 24,000
transactions an hour?

It's clear that it's just a matter of time before it collapses.

Here's a simple proposal (concept) that doesn't pretend to set a fixed
block size limit, since you can't ever know the demands the future will bring:
https://gist.github.com/gubatron/143e431ee01158f27db4
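
The gist itself isn't reproduced here, but as a purely hypothetical sketch of a demand-driven cap (my illustration of the general concept, not the rule from the linked proposal), a limit might grow whenever recent blocks run near full:

```python
from statistics import median

def next_limit(recent_sizes, current_limit, threshold=0.9, growth=1.1):
    """Hypothetical adaptive cap: if the median of recent block sizes
    exceeds 90% of the current limit, raise the limit by 10%; otherwise
    keep it. Illustrative only -- not the rule from the linked gist."""
    if median(recent_sizes) > threshold * current_limit:
        return int(current_limit * growth)
    return current_limit

# Blocks running ~95% full push the cap up; mostly-empty blocks do not.
print(next_limit([950_000] * 11, 1_000_000))  # 1100000
print(next_limit([100_000] * 11, 1_000_000))  # 1000000
```

A rule of this shape removes the need to guess future demand up front, at the cost of having to argue about the threshold and growth rate instead.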

We don't need to go as far as countries with hyperinflation trying to use
the technology and making it collapse. Anybody here who has distributed
commercial/free end-user software knows that any small company out there
installs more copies in a couple of weeks than all the Bitcoin users we have
at the moment. All we need is a single company/project with a decent number
of users who are now enabled to transact directly on the blockchain to
screw it all up. (Perhaps OpenBazaar this winter could make this whole thing
come down; hopefully they'll consider this debate and the current limitations
before their release, and boy are they coding nonstop on it now that they
got funded.) The last of your fears should be a malicious government trying
to shut you down; for that to happen you must make an impact first. For now
this is a silly game in the grand scheme of things.

And you did sound pretty bad. All of his points were very valid, and they
share the concern of many people: investors and entrepreneurs putting a
shitload of money, time and their lives into a much larger vision than that
of a network that does a mere 3,500 tx/hour. But some people seem able to
live in impossible or useless ideals.

It's simply irresponsible not to want to give the network a chance to grow
a bit more. Miner centralization is inevitable given the PoW-based
consensus; hobbyist mining is only viable in countries with very cheap
energy.

If things remain this way, this whole thing will be a massive failure, and
it will probably take another decade before we can open our mouths about
cryptocurrencies, decentralization and whatnot, and this stubbornness will
be the one policy that censored everyone, that shut down everyone, that made
the immutable rules not matter.

Perhaps it will be Stellar that ends up delivering, at this stubborn pace.

http://twitter.com/gubatron

On Tue, Aug 11, 2015 at 4:38 AM, Thomas Zander via bitcoin-dev <
Post by Mark Friedenbach via bitcoin-dev
Post by Mark Friedenbach via bitcoin-dev
It follows then, that if we make a decision now which destroys that
property, which makes it possible to censor bitcoin, to deny service, or to
pressure miners into changing rules contrary to user interests, then
Bitcoin is no longer interesting.
You asked to be convinced of the need for bigger blocks. I gave that.
What makes you think bitcoin will break when more people use it?
Sent on the go, excuse the brevity.
*From: *Mark Friedenbach
*Sent: *Tuesday, 11 August 2015 08:10
*To: *Thomas Zander
*Cc: *Bitcoin Dev
*Subject: *Re: [bitcoin-dev] Fees and the block-finding process
...
Mark Friedenbach via bitcoin-dev
2015-08-11 19:00:46 UTC
Permalink
More people using Bitcoin does not necessarily mean more transactions being
processed by the block chain. Satoshi was forward-thinking enough to
include a powerful script-signature system, something which has never
really existed before. Though suffering from some limitations to be sure,
this smart contract execution framework is expressive enough to enable a
wide variety of new features without changing bitcoin itself.

One of these invented features is micropayment channels -- the ability for
two parties to rapidly exchange funds while only settling the final balance
to the block chain, and to do so in an entirely trustless way. Right now
people don't use scripts to do interesting things like this, but there is
absolutely no reason why they can't. Lightning network is a vision of a
future where everyone uses a higher-layer protocol for their transactions
which only periodically settle on the block chain. It is entirely possible
that you may be able to do all your day-to-day transactions in bitcoin yet
only settle accounts every other week, totaling about 13kB per year. 1MB
blocks could support that level of usage by 4 million people, which is many
orders of magnitude more than the number of people presently using bitcoin on
a day-to-day basis.
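
A quick back-of-envelope check of those figures (the ~500-byte settlement transaction size is my assumption, not stated in the thread; it is what makes the 13kB number come out):

```python
# Back-of-envelope check of the settlement-capacity claim above.
TX_BYTES = 500                    # assumed size of one settlement transaction
SETTLEMENTS_PER_YEAR = 26         # "every other week"
BLOCK_BYTES = 1_000_000           # 1MB blocks
BLOCKS_PER_YEAR = 6 * 24 * 365    # one block per ~10 minutes

bytes_per_user_year = TX_BYTES * SETTLEMENTS_PER_YEAR          # 13,000 B = 13kB
supported_users = (BLOCK_BYTES * BLOCKS_PER_YEAR) // bytes_per_user_year
print(bytes_per_user_year, supported_users)  # 13000 4043076
```

So roughly 4 million users fit in 1MB blocks under these assumptions, matching the figure above.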

And that, by the way, is without considering as-yet uninvented applications
of existing or future script which will provide even further improvements
to scale. This is very fertile ground being explored by very few people.
One thing I hope to come out of this block size debate is a lot more people
(like Joseph Poon) looking at how bitcoin script can be used to enable new
and innovative resource-efficient and privacy-enhancing payment protocols.

The network has room to grow. It just requires wallet developers and other
infrastructure folk to step up to the plate and do their part in deploying
this technology.
Post by Angel Leon via bitcoin-dev
...
Michael Naber via bitcoin-dev
2015-08-11 19:26:48 UTC
Permalink
All things considered, if people want to participate in a global consensus
network, and the technology exists to do it at a lower cost, then is it
sensible or even possible to somehow arbitrarily make participation
expensive? Can someone please walk me through how that's expected to play
out, because I'm really having a hard time understanding how it could work.



On Tue, Aug 11, 2015 at 2:00 PM, Mark Friedenbach via bitcoin-dev <
Post by Mark Friedenbach via bitcoin-dev
More people using Bitcoin does not necessarily mean more transactions
being processed by the block chain. Satoshi was forward-thinking enough to
include a powerful script-signature system, something which has never
really existed before. Though suffering from some limitations to be sure,
this smart contract execution framework is expressive enough to enable a
wide variety of new features without changing bitcoin itself.
One of these invented features is micropayment channels -- the ability for
two parties to rapidly exchange funds while only settling the final balance
to the block chain, and to do so in an entirely trustless way. Right now
people don't use scripts to do interesting things like this, but there is
absolutely no reason why they can't. Lightning network is a vision of a
future where everyone uses a higher-layer protocol for their transactions
which only periodically settle on the block chain. It is entirely possible
that you may be able to do all your day-to-day transactions in bitcoin yet
only settle accounts every other week, totaling 13kB per year. A 1MB block
could support that level of usage by 4 million people, which is many orders
of magnitude more than the number of people presently using bitcoin on a
day to day basis.
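The arithmetic behind the 13 kB / 4-million-user figures above checks out;
a quick sketch (the 500-byte settlement-transaction size is an assumption
implied by 26 settlements totaling 13 kB):

```python
# Back-of-envelope check of the settlement figures in the paragraph above.
BLOCK_SIZE = 1_000_000          # bytes per block (1 MB limit)
BLOCKS_PER_YEAR = 6 * 24 * 365  # one block per ~10 minutes
SETTLEMENTS_PER_YEAR = 26       # settling every other week
TX_SIZE = 500                   # assumed bytes per settlement transaction

bytes_per_user = SETTLEMENTS_PER_YEAR * TX_SIZE  # 13,000 bytes = 13 kB/year
capacity = BLOCK_SIZE * BLOCKS_PER_YEAR          # ~52.6 GB of block space/year
users = capacity // bytes_per_user               # ~4 million people
```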
And that, by the way, is without considering as-yet uninvented
applications of existing or future script which will provide even further
improvements to scale. This is very fertile ground being explored by very
few people. One thing I hope to come out of this block size debate is a lot
more people (like Joseph Poon) looking at how bitcoin script can be used to
enable new and innovative resource-efficient and privacy-enhancing payment
protocols.
The network has room to grow. It just requires wallet developers and other
infrastructure folk to step up to the plate and do their part in deploying
this technology.
Post by Angel Leon via bitcoin-dev
- policy neutrality.
- It can't be censored.
- it can't be shut down
- and the rules cannot change from underneath you.
except it can be shut down, by its own inability to scale, the minute it
actually gets used.
what's the point of having all this if nobody can use it?
what's the point of going through all that energy and CO2 for a mere
24,000 transactions an hour?
It's clear that it's just a matter of time before it collapses.
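For context, the 24,000/hour figure follows directly from the 1 MB limit;
a quick sketch, assuming an average transaction size of 250 bytes (the
true average varies):

```python
BLOCK_SIZE = 1_000_000   # bytes (1 MB limit)
AVG_TX_SIZE = 250        # assumed average transaction size in bytes
BLOCKS_PER_HOUR = 6      # one block per ~10 minutes on average

tx_per_block = BLOCK_SIZE // AVG_TX_SIZE      # 4,000 transactions per block
tx_per_hour = tx_per_block * BLOCKS_PER_HOUR  # 24,000 transactions per hour
tx_per_sec = tx_per_hour / 3600               # roughly 7 tx/s network-wide
```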
Here's a simple proposal (concept) that doesn't pretend to set a fixed
block size limit as you can't ever know the demands the future will bring
https://gist.github.com/gubatron/143e431ee01158f27db4
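The linked gist isn't reproduced here, but proposals in this family
typically derive the next limit from recent block sizes. A toy sketch with
hypothetical grow/shrink rules (NOT the gist's actual parameters, nor any
BIP's):

```python
def next_block_limit(recent_sizes, current_limit, floor=1_000_000):
    """Toy dynamic limit: grow 10% when recent blocks run mostly full,
    shrink 5% when they run mostly empty. Purely illustrative."""
    median = sorted(recent_sizes)[len(recent_sizes) // 2]
    if median * 10 > current_limit * 9:    # median above 90% full: grow
        return current_limit * 11 // 10
    if median * 2 < current_limit:         # median below 50% full: shrink
        return max(floor, current_limit * 19 // 20)
    return current_limit
```

Integer arithmetic keeps the consensus rule exact across platforms; the
floor prevents the limit from ratcheting below today's 1 MB.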
We don't need to go as far as countries with hyperinflation trying to
use the technology to make it collapse. Anybody here who has distributed
commercial/free end-user software knows that any small company out there
installs more copies in a couple of weeks than all the bitcoin users we
have at the moment. All we need is a single company/project with a decent
number of users who are now enabled to transact directly on the blockchain
to screw it all up (perhaps OpenBazaar this winter could make this whole
thing come down; hopefully they'll take this debate and the current
limitations into account before their release, and boy are they coding
nonstop on it now that they got funded). The last of your fears should be
a malicious government trying to shut you down; for that to happen you
must make an impact first, and for now this is a silly game in the grand
scheme of things.
And you did sound pretty bad. All of his points were very valid, and they
share the concern of many people -- many investors and entrepreneurs
putting a shitload of money, time, and their lives into a much larger
vision than that of a network that does a mere 3,500 tx/hour -- but some
people seem able to live in impossible or useless ideals.
It's simply irresponsible not to want to give the network a chance to
grow a bit more. Miner centralization is inevitable given the PoW-based
consensus; hobbyist mining is only viable in countries with very cheap
energy.
If things remain this way, this whole thing will be a massive failure and
it will probably take another decade before we can open our mouths about
cryptocurrencies, decentralization and what not, and this stubbornness
will be the one policy that censored everyone, that shut down everyone,
that made the immutable rules not matter.
Perhaps it will be Stellar that ends up delivering, at this stubborn pace.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 4:38 AM, Thomas Zander via bitcoin-dev <
Post by Mark Friedenbach via bitcoin-dev
Post by Mark Friedenbach via bitcoin-dev
It follows then, that if we make a decision now which destroys that
property, which makes it possible to censor bitcoin, to deny service, or to
pressure miners into changing rules contrary to user interests, then
Bitcoin is no longer interesting.
You asked to be convinced of the need for bigger blocks. I gave that.
What makes you think bitcoin will break when more people use it?
Sent on the go, excuse the brevity.
*From: *Mark Friedenbach
*Sent: *Tuesday, 11 August 2015 08:10
*To: *Thomas Zander
*Cc: *Bitcoin Dev
*Subject: *Re: [bitcoin-dev] Fees and the block-finding process
On Mon, Aug 10, 2015 at 11:31 PM, Thomas Zander via bitcoin-dev <
Post by Mark Friedenbach via bitcoin-dev
Post by Mark Friedenbach via bitcoin-dev
This is where things diverge. It's fine to pick a new limit or growth
trajectory. But defend it with data and reasoned analysis.
We currently serve about 0.007% of the world population, sending maybe one
transaction a month.
This can only go up.
There are about 20 currencies in the world that are unstable and showing
early signs of hyperinflation. If even a small percentage of these people
cash out and get Bitcoins for their savings, you'd have the number of
people using Bitcoin as savings go from maybe half a million to 10 million
in the space of a couple of months. Why so fast? Because all the world
currencies are linked. Practically all currencies follow the USD, and
while that one may stay robust and standing, the linkage has been shown in
the past to cause chain effects.
It is impossible to predict how much uptake Bitcoin will see, but we have
seen big rises in price when Cyprus had a bail-in and then when Greece
first showed bad signs again.
Let's do our due diligence and agree that in the current world economy
there are sure signs that people are considering Bitcoin on a big scale.
A bigger number of people holding Bitcoin savings won't make the
transaction rate go up very much, but if you have feet on the ground you
already see that people go back to barter in countries like Poland,
Ireland, Greece etc. And Bitcoin will be an alternative too good to
ignore. Then transaction rates will go up. Dramatically.
If you are asking for numbers, that is a bit tricky. Again: we are at
0.007%... That's like a f-ing rounding error in the world economy. You
can't reason from that. It's like using a float to do calculations that
you should have done in a double and getting weird output.
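The float/double analogy describes a real effect; a quick illustration
(using struct to emulate 32-bit floats, since Python's own float is a
64-bit double):

```python
import struct

def as_float32(x):
    # round-trip a value through IEEE-754 single precision
    return struct.unpack('f', struct.pack('f', x))[0]

f32_total, f64_total = 0.0, 0.0
for _ in range(1_000_000):
    f32_total = as_float32(f32_total + as_float32(0.1))
    f64_total += 0.1

# f64_total stays within a hair of 100000.0;
# f32_total drifts off by a clearly visible amount.
```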
Bottom line is that a maximum size of 8MB blocks is not that odd, because
a 20-times increase is very common in a "company" that is about 6 years
old. For instance Android was about that age when it started to get
shipped by non-Google companies. There the increase was substantially
bigger, and the company backing it was definitely able to change direction
faster than the Bitcoin oil tanker can change direction.
...
Another metric to remember: if you follow Hacker News (well, the incubator
more than the linked articles) you'd be exposed to the thinking of these
startups. Their only criterion is growth, and rather substantial growth at
that -- like 150% per month. Naturally, most of these build on top of HTML
or other existing technologies. But the point is that exponential growth
is expected in any startup. They typically have a much, much more
aggressive timeline, though: every month instead of every year.
Having exponential growth in the blockchain is really not odd and even if we
have LN or sidechains or the next changetip, this space will be used. And we
will still have scarcity.
I'm sorry, I really don't want to sound like a jerk, but not a single
word of that mattered. Yes we all want Bitcoin to scale such that every
person in the world can use it without difficulty. However if that were all
that we cared about then I would be remiss if I did not point out that
there are plenty of better, faster, and cheaper solutions to finding global
consensus over a payment ledger than Bitcoin. Architectures which are
algorithmically superior in their scaling properties. Indeed they are:
https://www.stellar.org/
http://opentransactions.org/
So why do I work on Bitcoin, and why do I care about the outcome of this
debate? Because Bitcoin offers one thing, and one thing only which
alternative architectures fundamentally lack: policy neutrality. It can't
be censored, it can't be shut down, and the rules cannot change from
underneath you. *That* is what Bitcoin offers that can't be replicated at
higher scale with a SQL database and an audit log.
It follows then, that if we make a decision now which destroys that
property, which makes it possible to censor bitcoin, to deny service, or to
pressure miners into changing rules contrary to user interests, then
Bitcoin is no longer interesting. We might as well get rid of mining at
that point and make Bitcoin look like Stellar or Open-Transactions because
at least then we'd scale even better and not be pumping millions of tons of
CO2 into the atmosphere from running all those ASICs.
Post by Mark Friedenbach via bitcoin-dev
On the other side, 3TB hard drives are sold, which take 8MB blocks without
problems.
Straw man, storage is not an issue.
Post by Mark Friedenbach via bitcoin-dev
You can buy broadband in every relevant country that easily supports the
bandwidth we need. (Remember we won't jump to 8MB in a day; it will likely
take at least 6 months).
Neither one of those assertions is clear. Keep in mind the goal is to
have Bitcoin survive active censorship. Presumably that means being able to
run a node even in the face of a hostile ISP or government. Furthermore, it
means being location independent and being able to move around. In many
places the higher the bandwidth requirements the fewer the number of ISPs
that are available to service you, and the more visible you are.
It may also be necessary to be able to run over Tor. And not just
today's Tor which is developed, serviced, and supported by the US
government, but a Tor or I2P that future governments have turned hostile
towards and actively censor or repress. Or existing authoritarian
governments, for that matter. How much bandwidth would be available through
those connections?
It may hopefully never be necessary to operate under such constraints,
except by freedom-seeking individuals within existing totalitarian regimes.
However the credible threat of doing so may be what keeps Bitcoin from
being repressed in the first place. Lose the capability to go underground,
and it will be pressured into regulation, eventually.
To the second point, it has been previously pointed out that large
miners stand to gain from larger blocks, for the same basic underlying
reasons as selfish mining. The incentive is to increase blocks, and miners
are able to do so at will and without cost. I would not be so certain that
we wouldn't see large blocks sooner than that.
Post by Mark Friedenbach via bitcoin-dev
We should get the inverted bloom filters stuff (or competing products) working
at least on a one-to-one basis so we can solve the propagation time problem.
There frankly is a huge amount of optimization that can be done in that area,
we don't even use locality (pingtime) to optimize distribution.
From my experience you can expect a two-orders-of-magnitude speedup in that
same 6-month period by focusing some research there.
This is basically already deployed thanks to Matt's relay network.
Further improvements are not going to have dramatic effects.
Post by Mark Friedenbach via bitcoin-dev
Remember 8GB/block still doesn't support VISA/Mastercard.
No, it doesn't. And 8GB/block is ludicrously large -- it would
absolutely, without any doubt destroy the very nature of Bitcoin, turning
it into a fundamentally uninteresting reincarnation of the existing
financial system. And still be unable to compete with VISA/Mastercard.
So why then the pressure to go down a route that WILL lead to failure by
your own metrics?
I humbly suggest that maybe we should play to the strengths of Bitcoin
instead -- its trustlessness via policy neutrality.
Either that, or go work on Stellar. Because that's where it's headed
otherwise.
Adam Back via bitcoin-dev
2015-08-11 20:12:43 UTC
Permalink
I think everyone is expending huge effort on design, analysis and
implementation of the lowest cost technology for Bitcoin.

Changing parameters doesn't create progress on scalability fundamentals --
there really is an inherent cost and security/throughput tradeoff in
blockchains. Security is quite central to this discussion. It is
unrealistic in my opinion to suppose that everything can fit directly
on-chain at the fullest Bitcoin adoption across cash payments, internet of
things, QoS, micropayments, share trading, derivatives etc. Hence the
interest in protocols like lightning (I encourage you and others to read
the paper, blog posts and implementation progress on the lightning-dev
mailing list).

Mid-term, different tradeoffs can happen that are all connected to and
building on Bitcoin. But whatever technologies win out for scale, they all
depend on Bitcoin security -- anything built on Bitcoin requires a secure
base. So I think it is logical that we strive to maintain and improve
Bitcoin security. Long-term tradeoffs that significantly weaken security
for throughput or other considerations should be built on top of Bitcoin,
avoiding a one-size-fits-all compromise that weakens Bitcoin to the lowest
common denominator of centralisation, insecurity and throughput tradeoffs.
This pattern (secure base, other protocols built on top) is already the
status quo -- probably > 99% of Bitcoin transactions are off-chain already
(in exchanges, web wallets etc). And there are various things that can and
are being done to improve the security of those solutions, with provable
reserves, periodic on-chain settlement, netting, lightning-like protocols
and other things probably still to be invented.

Some of the longer-term things we probably don't know yet, but the future
is NOT bleak. Lots of scope for technology improvement.

Adam


On 11 August 2015 at 20:26, Michael Naber via bitcoin-dev <
Post by Michael Naber via bitcoin-dev
All things considered, if people want to participate in a global consensus
network, and the technology exist to do it at a lower cost, then is it
sensible or even possible to somehow arbitrarily set the price of
participating in a global consensus network to be expensive? Can someone
please walk me through how that's expected to play out because I'm really
having a hard time understanding how it could work.
odinn via bitcoin-dev
2015-08-12 00:32:20 UTC
Permalink
Hey Angel,
-policy neutrality. - It can't be censored. - it can't be shut
down - and the rules cannot change from underneath you.
except it can be shutdown the minute it actually gets used by its
inability to scale.
what's the point of having all this if nobody can use it? what's
the point of going through all that energy and CO2 for a mere
24,000 transactions an hour?
It's clear that it's just a matter of time before it collapses.
Here's a simple proposal (concept) that doesn't pretend to set a
fixed block size limit as you can't ever know the demands the
future will bring
https://gist.github.com/gubatron/143e431ee01158f27db4
This seems to be a really good idea... May I add in here something
that's been dismissed before but I will mention it again anyway...

http://is.gd/DiFuRr "dynamic block size adjustment"
My sense has been that something like this could be coupled with
Garzik's BIP 100. For some reason I keep getting attacked for saying
this.

/RantOff
We don't need to go as far as countries with hyper inflation trying
to use the technology to make it collapse, anybody here who has
distributed commercial/free end user software knows that any small
company out there installs more copies in a couple weeks than all
the bitcoin users we have at the moment, all we need is a single
company/project with a decent amount of users who are now enabled
to transact directly on the blockchain to screw it all up (perhaps
OpenBazaar this winter could make this whole thing come down,
hopefully they'll take this debate and the current limitations
before their release, and boy are they coding nonstop on it now
that they got funded), the last of your fears should be a malicious
government trying to shut you down, for that to happen you must
make an impact first, for now this is a silly game in the grand
scheme of things.
And you did sound pretty bad, all of his points were very valid and
they share the concern of many people, many investors,
entrepreneurs putting shitload of money, time and their lives on a
much larger vision than that of a network that does a mere 3,500
tx/hour, but some people seem to be able to live in impossible or
useless ideals.
It's simply irresponsible to not want to give the network a chance
to grow a bit more. Miners centralizing is inevitable given the POW
based consensus, hobbists-mining is only there for countries with
very cheap energy.
If things remain this way, this whole thing will be a massive
failure and it will probably take another decade before we can open
our mouths about cryptocurrencies, decentralization and what not,
and this stubornness will be the one policy that censored everyone,
that shutdown everyone, that made the immutable rules not matter.
Perhaps it will be Stellar what ends up delivering at this stubborn pace.
http://twitter.com/gubatron
On Tue, Aug 11, 2015 at 4:38 AM, Thomas Zander via bitcoin-dev
Post by Mark Friedenbach via bitcoin-dev
It follows then, that if we make a decision now which destroys
that property, which makes it possible to censor bitcoin, to deny
service, or to pressure miners into changing rules contrary to
user interests, then Bitcoin is no longer interesting.
You asked to be convinced of the need for bigger blocks. I gave
that. What makes you think bitcoin will break when more people use
it?
*Tuesday, 11 August 2015 08:10 *To: *Thomas Zander *Cc: *Bitcoin
Dev *Subject: *Re: [bitcoin-dev] Fees and the block-finding
process
On Mon, Aug 10, 2015 at 11:31 PM, Thomas Zander via bitcoin-dev
On Monday 10. August 2015 23.03.39 <tel:2015%2023.03.39> Mark
Post by Mark Friedenbach via bitcoin-dev
This is where things diverge. It's fine to pick a new limit or
growth trajectory. But defend it with data and reasoned
analysis.
We currently serve about 0.007% of the world population, each sending
maybe one transaction a month. This can only go up.
There are about 20 currencies in the world that are unstable and
showing early signs of hyperinflation. If even a small percentage of
these people cash out and put their savings into Bitcoin, you'd have
the number of people using Bitcoin for savings go from maybe half a
million to 10 million in the space of a couple of months. Why so
fast? Because all the world's currencies are linked. Practically all
currencies follow the USD, and while that one may stay robust and
standing, the linkage has been shown in the past to cause
chain effects.
It is impossible to predict how much uptake Bitcoin will see, but
we have seen big rises in price when Cyprus had a bail-in and then
when Greece first showed bad signs again. Let's do our due diligence
and agree that in the current world economy there are sure signs
that people are considering Bitcoin on a big scale.
A bigger number of people holding Bitcoin savings won't make the
transaction rate go up very much, but if you have feet on the
ground you already see that people are going back to barter in
countries like Poland, Ireland, Greece etc. Then Bitcoin will be an
alternative too good to ignore, and transaction rates will go up.
Dramatically.
If you are asking for numbers, that is a bit tricky. Again: we are
at 0.007%... That's like a f-ing rounding error in the world
economy. You can't reason from that. It's like using a float to do
calculations that you should have done in a double and getting
weird output.
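For what it's worth, the 0.007% figure can be turned into a quick back-of-the-envelope check (a sketch; the world-population number and the one-transaction-per-month rate are assumptions taken from the discussion above):

```python
# Back-of-the-envelope check of the usage figures above.
# Assumptions: ~7.3 billion people (2015) and the 0.007% adoption figure
# from the thread, each user sending about one transaction per month.
world_pop = 7.3e9
users = world_pop * 0.007 / 100         # roughly half a million users
tx_per_month = users * 1                # one tx per user per month
tx_per_hour = tx_per_month / (30 * 24)  # spread over a 30-day month

print(round(users), round(tx_per_hour))
```

On those assumptions that is roughly 511,000 users and on the order of 700 transactions an hour -- consistent with the "rounding error" characterization.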
Bottom line is that a maximum size of 8 MB blocks is not that odd,
because a 20-times increase is very common in a "company" that is
about 6 years old. For instance, Android was about that age when it
started to get shipped by non-Google companies. There the increase
was substantially bigger, and the company backing it was definitely
able to change direction faster than the Bitcoin oil tanker can
change direction.
...
Another metric to remember: if you follow Hacker News (well, the
incubator more than the linked articles) you'd be exposed to the
thinking of these startups. Their only criterion is growth, and
rather substantial growth at that, like 150% per month. Naturally,
most of these build on top of HTML or other existing technologies,
but the point is that exponential growth is expected in any startup.
They typically have a much more aggressive timeline, though:
every month instead of every year. Having exponential growth in the
blockchain is really not odd, and even if we have LN or sidechains
or the next ChangeTip, this space will be used. And we will still
have scarcity.
I'm sorry, I really don't want to sound like a jerk, but not a
single word of that mattered. Yes, we all want Bitcoin to scale
such that every person in the world can use it without difficulty.
However, if that were all we cared about, then I would be
remiss if I did not point out that there are plenty of better,
faster, and cheaper solutions for finding global consensus over a
payment ledger than Bitcoin -- architectures which are
algorithmically superior in their scaling properties. Indeed, they
are already implemented and deployed:
https://www.stellar.org/ http://opentransactions.org/
So why do I work on Bitcoin, and why do I care about the outcome
of this debate? Because Bitcoin offers one thing, and one thing
only which alternative architectures fundamentally lack: policy
neutrality. It can't be censored, it can't be shut down, and the
rules cannot change from underneath you. *That* is what Bitcoin
offers that can't be replicated at higher scale with a SQL
database and an audit log.
It follows then, that if we make a decision now which destroys
that property, which makes it possible to censor bitcoin, to deny
service, or to pressure miners into changing rules contrary to
user interests, then Bitcoin is no longer interesting. We might as
well get rid of mining at that point and make Bitcoin look like
Stellar or Open-Transactions because at least then we'd scale even
better and not be pumping millions of tons of CO2 into the
atmosphere from running all those ASICs.
On the other side, 3 TB hard drives are sold which handle 8 MB blocks
without problems.
Straw man, storage is not an issue.
You can buy broadband in every relevant country that easily
supports the bandwidth we need. (Remember we won't jump to 8 MB in a
day; it will likely take at least 6 months.)
Neither one of those assertions is clear. Keep in mind the goal is
to have Bitcoin survive active censorship. Presumably that means
being able to run a node even in the face of a hostile ISP or
government. Furthermore, it means being location independent and
being able to move around. In many places the higher the bandwidth
requirements the fewer the number of ISPs that are available to
service you, and the more visible you are.
It may also be necessary to be able to run over Tor. And not just
today's Tor which is developed, serviced, and supported by the US
government, but a Tor or I2P that future governments have turned
hostile towards and actively censor or repress. Or existing
authoritarian governments, for that matter. How much bandwidth
would be available through those connections?
It may hopefully never be necessary to operate under such
constraints, except by freedom seeking individuals within existing
totalitarian regimes. However the credible threat of doing so may
be what keeps Bitcoin from being repressed in the first place. Lose
the capability to go underground, and it will be pressured into
regulation, eventually.
To the second point, it has been previously pointed out that large
miners stand to gain from larger blocks, for the same basic
underlying reasons as selfish mining. The incentive is to increase
blocks, and miners are able to do so at will and without cost. I
would not be so certain that we wouldn't see large blocks sooner
than that.
We should get the inverted bloom filter stuff (or competing
proposals) working at least on a one-to-one basis so we can solve
the propagation-time problem. There is frankly a huge amount of
optimization that can be done in that area; we don't even use
locality (ping time) to optimize distribution.
Post by Mark Friedenbach via bitcoin-dev
From my experience you can expect a two-order-of-magnitude speedup in
that same 6-month period by focusing some research there.
This is basically already deployed thanks to Matt's relay network.
Further improvements are not going to have dramatic effects.
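For context, the core idea behind this class of optimization (a simplified short-ID sketch; this is not Matt's actual relay-network protocol nor a real IBLT implementation): peers already share most of the mempool, so a block can be announced as a list of truncated transaction IDs and only the unmatched ones fetched in full.

```python
import hashlib

def short_id(txid: bytes, n: int = 6) -> bytes:
    # Truncated hash as a compact stand-in for a full 32-byte txid.
    return hashlib.sha256(txid).digest()[:n]

def announce(block_txids):
    # Sender: ship short IDs instead of full transactions.
    return [short_id(t) for t in block_txids]

def reconstruct(short_ids, mempool):
    # Receiver: match short IDs against the local mempool; anything
    # unmatched must be requested from the sender in full.
    by_short = {short_id(t): t for t in mempool}
    known = [by_short[s] for s in short_ids if s in by_short]
    missing = [s for s in short_ids if s not in by_short]
    return known, missing

mempool = [b"tx-%d" % i for i in range(1000)]
block = mempool[:99] + [b"tx-not-seen-yet"]
known, missing = reconstruct(announce(block), mempool)
print(len(known), len(missing))  # 99 matched locally, 1 left to fetch
```

The bandwidth saved is roughly the ratio of full transaction size to short-ID size, which is why block propagation time can be mostly decoupled from block size for well-connected peers.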
Remember 8Gb/block still doesn't support VISA/Mastercard.
No, it doesn't. And 8GB/block is ludicrously large -- it would
absolutely, without any doubt destroy the very nature of Bitcoin,
turning it into a fundamentally uninteresting reincarnation of the
existing financial system. And still be unable to compete with
VISA/Mastercard.
So why then the pressure to go down a route that WILL lead to
failure by your own metrics?
I humbly suggest that maybe we should play to the strengths of
Bitcoin instead -- its trustlessness via policy neutrality.
Either that, or go work on Stellar, because that's where it's
headed otherwise.
_______________________________________________ bitcoin-dev mailing
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
_______________________________________________ bitcoin-dev mailing
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
- --
http://abis.io ~
"a protocol concept to enable decentralization
and expansion of a giving economy, and a new social good"
https://keybase.io/odinn
Thomas Zander via bitcoin-dev
2015-08-11 11:10:48 UTC
Permalink
Post by Mark Friedenbach via bitcoin-dev
On Mon, Aug 10, 2015 at 11:31 PM, Thomas Zander via bitcoin-dev <
So why do I work on Bitcoin, [] It can't
be censored, it can't be shut down, and the rules cannot change from
underneath you.
Fully agreed, and I like that a lot as well.
Post by Mark Friedenbach via bitcoin-dev
It may hopefully never be necessary to operate under such constraints,
except by freedom seeking individuals within existing totalitarian regimes.
I think remembering the Internet architecture here is viable.
There is a saying that censorship on the Internet is seen as a defect
and routed around. Bitcoin follows the same concept, and is arguably
even better at it, since transactions don't have to be delivered to the
network in real time.
They can be shipped by carrier pigeon in the extreme case ;)
Or through smileys over Skype chat...
Post by Mark Friedenbach via bitcoin-dev
However the credible threat of doing so may be what keeps Bitcoin from
being repressed in the first place. Lose the capability to go underground,
and it will be pressured into regulation, eventually.
I understand your point, its a good one.
Here is my counter-argument: countries (or states) that fail to legally
get the bandwidth to do mining are not an indicator for the success of
Bitcoin. Tor will work fine with a full node (or GNUnet, if you want);
just make sure you take the transmission delays into account.
And naturally, there is the point that actual end users don't need a full
node. The system as a whole will work just fine for people in totalitarian
regimes as long as 100% of the world doesn't reach that point.
With various nodes in Sealand (near the UK) and miners in China, the system
would still work for users in New York.
Post by Mark Friedenbach via bitcoin-dev
Post by Thomas Zander via bitcoin-dev
Remember 8Gb/block still doesn't support VISA/Mastercard.
No, it doesn't. And 8GB/block is ludicrously large -- it would absolutely,
without any doubt destroy the very nature of Bitcoin, turning it into a
fundamentally uninteresting reincarnation of the existing financial system.
And still be unable to compete with VISA/Mastercard.
So why then the pressure to go down a route that WILL lead to failure by
your own metrics?
Naturally, I was referring to the existing proposal in which 8 GB blocks
would be reached only after many years. It's a really long way away.
And if you read my previous replies on this thread you can see a more
substantial argument, which I'll make brief here:
I'm not suggesting we scale the blocksize to accommodate the next 10
years of growth.
Instead I'm suggesting that we use solutions like Lightning and
sidechains and anything people can invent as soon as possible. But we
need bigger blocks as well, because no single solution is the answer;
we need a combination of multiple.
There really is no reason to suspect we can't actually increase the
blocksize in a few months as the first thing we do.
--
Thomas Zander
odinn via bitcoin-dev
2015-08-12 00:18:45 UTC
Permalink
Hello, I thought these were good points, but I have a couple questions..
.
Post by Mark Friedenbach via bitcoin-dev
On Mon, Aug 10, 2015 at 11:31 PM, Thomas Zander via bitcoin-dev
On Monday 10. August 2015 23.03.39 <tel:2015%2023.03.39> Mark
Post by Mark Friedenbach via bitcoin-dev
This is where things diverge. It's fine to pick a new limit or
growth trajectory. But defend it with data and reasoned
analysis.
We currently serve about 0,007% of the world population sending
maybe one transaction a month. This can only go up.
There are about 20 currencies in the world that are unstable and
showing early signs of hyperinflation. If even small percentage of
these people cash-out and get Bitcoins for their savings you'd have
the amount of people using Bitcoin as savings go from maybe half a
million to 10 million in the space of a couple of months. Why so
fast? Because all the world currencies are linked. Practically all
currencies follow the USD, and while that one may stay robust and
standing, the linkage has been shown in the past to cause
chain-effects.
It is impossible to predict how much uptake Bitcoin will take, but
we have seen big rises in price as Cyprus had a bailin and then
when Greece first showed bad signs again. Lets do our due diligence
and agree that in the current world economy there are sure signs
that people are considering Bitcoin on a big scale.
Bigger amount of people holding Bitcoin savings won't make the
transaction rate go up very much, but if you have feet on the
ground you already see that people go back to barter in countries
like Poland, Ireland, Greece etc. And Bitcoin will be an
alternative to good to ignore. Then transaction rates will go up.
Dramatically.
If you are asking for numbers, that is a bit tricky. Again; we are
at 0,007%... Thats like a f-ing rounding error in the world
economy. You can't reason from that. Its like using a float to do
calculations that you should have done in a double and getting
weird output.
Bottom line is that a maximum size of 8Mb blocks is not that odd.
Because a 20 times increase is very common in a "company" that is
about 6 years old. For instance Android was about that age when it
started to get shipped by non- Google companies. There the increase
was substantially bigger and the company backing it was definitely
able to change direction faster than the Bitcoin oiltanker can
change direction.
...
Another metric to remember; if you follow hackernews (well, the
incubator more than the linked articles) you'd be exposed to the
thinking of these startups. Their only criteria is growth. and this
is rather substantial growth. Like 150% per month. Naturally, most
of these build on top of html or other existing technologies. But
the point is that exponential growth is expected in any startup.
They typically have a much much more agressive timeline, though.
Every month instead of every year. Having exponential growth in the
blockchain is really not odd and even if we have LN or sidechains
or the next changetip, this space will be used. And we will still
have scarcity.
I'm sorry, I really don't want to sound like a jerk, but not a
single word of that mattered. Yes we all want Bitcoin to scale such
that every person in the world can use it without difficulty.
However if that were all that we cared about then I would be remiss
if I did not point out that there are plenty of better, faster, and
cheaper solutions to finding global consensus over a payment ledger
than Bitcoin. Architectures which are algorithmically superior in
their scaling properties. Indeed they are already implemented and
https://www.stellar.org/ http://opentransactions.org/
So why do I work on Bitcoin, and why do I care about the outcome of
this debate? Because Bitcoin offers one thing, and one thing only
which alternative architectures fundamentally lack: policy
neutrality. It can't be censored, it can't be shut down, and the
rules cannot change from underneath you. *That* is what Bitcoin
offers that can't be replicated at higher scale with a SQL database
and an audit log.
It follows then, that if we make a decision now which destroys
that property, which makes it possible to censor bitcoin, to deny
service, or to pressure miners into changing rules contrary to user
interests, then Bitcoin is no longer interesting. We might as well
get rid of mining at that point and make Bitcoin look like Stellar
or Open-Transactions because at least then we'd scale even better
and not be pumping millions of tons of CO2 into the atmosphere from
running all those ASICs.
On the other side, 3Tb harddrives are sold, which take 8Mb blocks
without problems.
Straw man, storage is not an issue.
You can buy broadband in every relevant country that easily
supports the bandwidth we need. (remember we won't jump to 8Mb in a
day, it will likely take at least 6 months).
Neither one of those assertions is clear. Keep in mind the goal is
to have Bitcoin survive active censorship. Presumably that means
being able to run a node even in the face of a hostile ISP or
government. Furthermore, it means being location independent and
being able to move around. In many places the higher the bandwidth
requirements the fewer the number of ISPs that are available to
service you, and the more visible you are.
It may also be necessary to be able to run over Tor. And not just
today's Tor which is developed, serviced, and supported by the US
government, but a Tor or I2P that future governments have turned
hostile towards and actively censor or repress. Or existing
authoritative governments, for that matter. How much bandwidth
would be available through those connections?
It may hopefully never be necessary to operate under such
constraints, except by freedom seeking individuals within existing
totalitarian regimes. However the credible threat of doing so may
be what keeps Bitcoin from being repressed in the first place. Lose
the capability to go underground, and it will be pressured into
regulation, eventually.
Bitcoin (as well as the Internet and the World Wide Web) is already
regulated around the world.
There used to be a map that documented this fairly well, called
BitLegal (bitlegal.net), but the site is now parked or offline. There
is an alternative visual overview of the subject at:
http://is.gd/vFgYrf
The upshot of that is that web wallets (which are popular due to
convenience) cannot be considered to provide users with control over
their money -- extended discussion on this in the bitcoin.org
repository:
https://github.com/bitcoin-dot-org/bitcoin.org/issues/996
My question, therefore, is this:
When you say "it will be pressured into regulation, eventually," what
do you mean? Are you implying that even hardware wallets and
desktop wallets for bitcoin will be regulated to the point where they
cannot be used by individuals who actually care about being able to
circumvent financial censorship? If so, explain how that is the case.
I could see an argument that this might be the case to some degree
in the context of services like Chainalysis and other companies that
are setting up "virtual currency compliance" services for
corporation-states. For example: if activity can be scanned, if a
state requires you to be registered with it to use virtual currency
at some level, and if companies scan the blockchain and report this
information to states (as they now do), then, as you say, the only
reasonable method of using virtual currencies would be one in which
location is masked and information about the nodes and history of
transactions can be hidden. That requires more privacy and
anonymity effort in bitcoin development, but I don't think it would
actually keep people from using hardware and desktop wallets
(although I do think people will, as they are beginning to now, be
gradually censored more and more out of using web wallets for
activities they desire).
It seems to me that the existence of various tools and conditions
(apart from the whole issue of legal constraints) is also very
important. For example, various companies have recently made public
announcements that they will leave (or not operate in) New York due to
the BitLicense -- Kraken, ShapeShift, Poloniex, and others. It's
understandable given the extreme nature of NY's approach (I personally
oppose any regulation of virtual currencies). But technically, they
didn't have to cease operating, did they? This was a failure in their
business model. They made a big statement about how evil NY was and
fled the scene, perhaps never to return to serve NY. (Note: I don't
live in NY, so I'm not personally being left out in the cold when
Kraken etc. leave, for the record.) As said, this was a failure in
Kraken's, Poloniex's, ShapeShift's, etc., business model. Why? Because
here we are talking about web-based services. A distributed,
decentralized, peer-to-peer model in which the user has a piece of
software on their computer (think OpenBazaar or Bitsquare) is far and
away better than services which are web-based, because systems set up
that way don't have to worry about the constraints of state boundaries
or shifting legal regimes, nor do the people who made the software end
up holding keys on behalf of the users. If Kraken, Poloniex, etc., had
bothered to develop contingency software (similar to what is being
worked on with https://bitsquare.io/, for example) for users who might
be affected in jurisdictions where corporation-states Just Don't Get
It (read: NY, and possibly CA, as examples), then maybe they wouldn't
have to worry about it at all. It would just be a matter of users
installing a piece of software from their website, discontinuing the
web-based exchange for that state, and directing users in highly
regulated states to a download page for the decentralized exchange
software. Not that hard, really -- but something we haven't seen done
yet by the big exchanges.
I have another question for you below...
Post by Mark Friedenbach via bitcoin-dev
To the second point, it has been previously pointed out that large
miners stand to gain from larger blocks, for the same basic
underlying reasons as selfish mining. The incentive is to increase
blocks, and miners are able to do so at will and without cost. I
would not be so certain that we wouldn't see large blocks sooner
than that.
We should get the inverted bloom filters stuff (or competing
products) working at least on a one-to-one basis so we can solve
the propagation time problem. There frankly is a huge amount of
optimization that can be done in that area, we don't even use
locality (pingtime) to optimize distribution.
Post by Mark Friedenbach via bitcoin-dev
From my experience you can expect a 2-magnitude speedup in that
same 6 month period by focusing some research there.
This is basically already deployed thanks to Matt's relay network.
Further improvements are not going to have dramatic effects.
Remember 8Gb/block still doesn't support VISA/Mastercard.
No, it doesn't. And 8GB/block is ludicrously large -- it would
absolutely, without any doubt destroy the very nature of Bitcoin,
turning it into a fundamentally uninteresting reincarnation of the
existing financial system.
Why do you say 8 GB blocks would destroy the very nature of Bitcoin?
I don't see that they would destroy bitcoin... but also, what would
cause 8 GB blocks to happen very soon? As I understood it, any such
process would phase in increases. Explain how that would destroy the
very nature of Bitcoin?
Blocksize is extremely likely to get bigger. (I'm supposing here that
some version of Garzik's BIP 100, or something very close to it, is
what will likely be adopted, and that the blocksize would be voted on
something like this:
https://bitcoin.stackexchange.com/questions/37943/bip-100-what-votes-are-possible )
Am I wrong? Please let me know if I'm dumb.
I am aware consensus isn't precisely there yet, but I'm just curious.
Post by Mark Friedenbach via bitcoin-dev
And still be unable to compete with VISA/Mastercard.
So why then the pressure to go down a route that WILL lead to
failure by your own metrics?
I humbly suggest that maybe we should play the strengths of
Bitcoin instead -- it's trustlessness via policy neutrality.
Either that, or go work on Stellar. Because that's where it's
headed otherwise.
_______________________________________________ bitcoin-dev mailing
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
- --
http://abis.io ~
"a protocol concept to enable decentralization
and expansion of a giving economy, and a new social good"
https://keybase.io/odinn
Jim Phillips via bitcoin-dev
2015-08-07 21:30:40 UTC
Permalink
On Fri, Aug 7, 2015 at 10:16 AM, Pieter Wuille via bitcoin-dev <
But perhaps there is some "use" for ultra-low-priority unreliable
transactions (... despite DoS attacks).
I can think of a variety of protocols that broadcast information and
don't really care whether it gets delivered; think of everything that
uses UDP on TCP/IP networks. The most basic thing I can think of would
be low-priority notifications that are sent to the entire Bitcoin
universe but don't need to persist. The protocol provides for a signed
and thus verifiable message, and a method for broadcasting it to every
node that might be interested in seeing it. If it never makes it into
a block, so be it. If it does, so be it.
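A minimal sketch of what such a fire-and-forget notice could look like (everything here is a hypothetical illustration; HMAC over a shared key stands in for the ECDSA signing a real deployment would use):

```python
import hashlib
import hmac
import json
import time

def make_notice(payload: str, key: bytes) -> dict:
    # Build a broadcast notice: signed so receivers can verify origin,
    # but with no delivery, persistence, or confirmation guarantees.
    body = {"payload": payload, "ts": int(time.time())}
    msg = json.dumps(body, sort_keys=True).encode()
    # HMAC is a stand-in for a real public-key signature scheme.
    body["sig"] = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return body

def verify_notice(notice: dict, key: bytes) -> bool:
    body = {k: v for k, v in notice.items() if k != "sig"}
    msg = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(notice["sig"], expected)

key = b"demo-key"
notice = make_notice("low-priority announcement", key)
print(verify_notice(notice, key))  # True: intact and correctly signed
```

A node relaying such notices can verify and forward them on a best-effort basis, exactly the UDP-style semantics described above: useful if delivered, harmless if dropped.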
--
*James G. Phillips IV*
<https://plus.google.com/u/0/113107039501292625391/posts>
<http://www.linkedin.com/in/ergophobe>
*"Don't bunt. Aim out of the ball park. Aim for the company of immortals."
-- David Ogilvy*
*This message was created with 100% recycled electrons. Please think twice
before printing.*
Anthony Towns via bitcoin-dev
2015-08-07 18:22:50 UTC
Permalink
On 8 August 2015 at 00:57, Gavin Andresen via bitcoin-dev <
Post by Gavin Andresen via bitcoin-dev
1. If you are willing to wait an infinite amount of time, I think the
minimum fee will always be zero or very close to zero, so I think it's a
silly question.
That's not what I'm thinking. It is just an observation based on the fact
that blocks are found at random intervals.
Every once in a while the network will get lucky and we'll find six blocks
Post by Gavin Andresen via bitcoin-dev
in ten minutes. If you are deciding what transaction fee to put on your
transaction, and you're willing to wait until that
six-blocks-in-ten-minutes once-a-week event, submit your transaction with a
low fee.
All the higher-fee transactions waiting to be confirmed will get confirmed
Post by Gavin Andresen via bitcoin-dev
in the first five blocks and, if miners don't have any floor on the fee
they'll accept (they will, but lets pretend they won't) then your
very-low-fee transaction will get confirmed.
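The once-a-week figure can be checked under a simple Poisson model of block arrivals (a sketch; overlapping windows, hashrate changes, and difficulty adjustment are all ignored):

```python
import math

def prob_at_least(n: int, lam: float = 1.0) -> float:
    # P(N >= n) for N ~ Poisson(lam); lam = 1 expected block per 10 minutes.
    return 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                     for k in range(n))

p = prob_at_least(6)            # chance of >= 6 blocks in one 10-minute window
windows_per_week = 7 * 24 * 6   # 1008 non-overlapping 10-minute windows
print(p, p * windows_per_week)  # ~0.0006 per window, ~0.6 streaks per week
```

That works out to roughly 0.06% per window, or about 0.6 such streaks across a week's worth of ten-minute windows -- i.e., about once a week, matching Gavin's description.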
That depends a bit on how rational miners are, doesn't it? Once the block
subsidy is retired, hashpower is only paid for by fees -- and if there are
no fee-paying transactions in the queue, then there's no reward for
applying hashpower, so mining a block won't even pay for your costs. In
that case, better to switch to hashing something else (an altcoin with
lower fees than bitcoin has on average but more than nothing, e.g.), or
put your hashing hardware into a low-power mode so you at least cut costs.
That will only be needed for a short while, though -- presumably enough
transactions will come in over the next five or ten minutes for a block
to be worth mining again, so maybe implementing that decision process is
more costly than the money you'd save.
(Conversely, when the queue is over-full because no blocks have been
found for a while, you should be able to fill a block with
higher-than-average-fee transactions, so I'd expect some miners to
switch hashpower from altcoins and sidechains to catch the temporary
chance of higher-revenue blocks.
Both tendencies would help reduce the variance in block times, compared
to a steady hashrate, which would probably be a good thing for the
network as a whole.)
I think the same incentives apply when mining is paid for by assurance
contracts rather than directly by transaction fees -- if you get a bunch
of blocks done quickly, the existing assurance contracts are satisfied
just as well as if it had taken longer, so you want to wait until new
ones come in rather than spend your hashpower for no return.
All of this only applies once fees make up a significant portion of the
payment for mining a block, though.
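The switch-off logic above can be made concrete with a toy expected-value calculation (all numbers are made up purely for illustration; this is not a model of real miner economics):

```python
def expected_profit(mempool_fees_btc: float, hash_share: float,
                    power_cost_btc_per_10min: float) -> float:
    # Expected reward over one 10-minute interval, post-subsidy: your
    # share of the chance of finding the next block times the fees
    # currently available to collect, minus the electricity burned trying.
    return hash_share * mempool_fees_btc - power_cost_btc_per_10min

# Right after a lucky streak has emptied the queue: mining loses money.
print(expected_profit(0.02, hash_share=0.1, power_cost_btc_per_10min=0.01))
# After a long gap has filled the queue with high-fee transactions:
print(expected_profit(0.80, hash_share=0.1, power_cost_btc_per_10min=0.01))
```

The sign flip between the two cases is the whole argument: with no subsidy, rational hashpower follows the mempool, throttling down after streaks and ramping up after droughts.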
Cheers,
aj
--
Anthony Towns <***@erisian.com.au>
Peter R via bitcoin-dev
2015-08-07 18:36:32 UTC
Permalink
...blocks are found at random intervals.
Every once in a while the network will get lucky and we'll find six blocks in ten minutes. If you are deciding what transaction fee to put on your transaction, and you're willing to wait until that six-blocks-in-ten-minutes once-a-week event, submit your transaction with a low fee.
All the higher-fee transactions waiting to be confirmed will get confirmed in the first five blocks and, if miners don't have any floor on the fee they'll accept (they will, but lets pretend they won't) then your very-low-fee transaction will get confirmed.
In the limit, that logic becomes "wait an infinite amount of time, pay zero fee."
...
Gavin Andresen
Yes, I see this as correct as well. If demand for space within a particular block is elevated (e.g., when the network has not found a block for 30 minutes), the minimum fee density for inclusion will be greater than when demand for space is low (e.g., when the network has found several blocks in quick succession, as Gavin pointed out). Lower-fee-paying transactions will just wait to be included during one of the lulls where a bunch of blocks were found quickly in a row.

The feemarket.pdf paper ( https://dl.dropboxusercontent.com/u/43331625/feemarket.pdf ) shows that this will always be the case so long as the block space supply curve (i.e., the cost in BTC/byte to supply additional space within a block [rho]) is a monotonically increasing function of the block size (refer to Fig. 6 and Table 1). The curve will satisfy this condition provided the propagation time for block solutions grows faster than log Q, where Q is the size of the block. Assuming that block solutions are propagated across physical channels, and that the quantity of pure information communicated per solution is proportional to the amount of information contained within the block, the communication time will always grow asymptotically like O(Q) as per the Shannon-Hartley theorem, and the fee market will be healthy.

Best regards,
Peter
Corey Haddad via bitcoin-dev
2015-08-12 01:56:00 UTC
Permalink
On Tue, Aug 11, 2015 at 07:08 AM, *Mark Friedenbach* via bitcoin-dev <
bitcoin-dev at lists.linuxfoundation.org
<https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev>>
Post by Mark Friedenbach via bitcoin-dev
Neither one of those assertions is clear. Keep in mind the goal is to have
Bitcoin survive active censorship. Presumably that means being able to run
a node even in the face of a hostile ISP or government. Furthermore, it
means being location independent and being able to move around. In many
places the higher the bandwidth requirements the fewer the number of ISPs
that are available to service you, and the more visible you are.
It may also be necessary to be able to run over Tor. And not just today's
Tor which is developed, serviced, and supported by the US government, but a
Tor or I2P that future governments have turned hostile towards and actively
censor or repress. Or existing authoritative governments, for that matter.
How much bandwidth would be available through those connections?
It may hopefully never be necessary to operate under such constraints,
except by freedom seeking individuals within existing totalitarian regimes.
However the credible threat of doing so may be what keeps Bitcoin from
being repressed in the first place. Lose the capability to go underground,
and it will be pressured into regulation, eventually.
I agree on the importance of having the credible threat of being able to
operate in the underground, and for the reasons you outlined. However, I
see that threat as being inherent in the now-public-knowledge that a system
like Bitcoin can exist. The smart governments already know that
Bitcoin-like systems are unstoppable phenomena, that they can operate over
Tor and I2P, that they can and do run without central servers, and that
they can be run on commodity hardware without detection. Bitcoin itself
does not need to constantly operate in survival-mode, hunkered down, and
always ready for big brother’s onslaught, to benefit from the protection of
the ‘credible threat’.
It’s important to accurately assess the level of threat the Bitcoin system
faces from regulation, legislation, and government ‘operations’. If we are
too paranoid, we are going to waste resources or forgo opportunities in the
name of, essentially, baseless fear. When I got involved with this project
in 2012, no one really knew how governments were going to react. Had an
all out war-on-Bitcoin been declared, I think it’s pretty safe to say the
structure of the network would look different than it does today. We would
probably be discussing ways to disguise Bitcoin traffic to look like VoIP
calls, not talking about how to best scale the network. In light of the
current regulatory climate surrounding Bitcoin, I believe the best security
against a state-sponsored / political crackdown to be gained at this time
comes from growing the user base and use cases, as opposed to hardening and
fortifying the protocol. Uber is a great example of this form of
security-through-adoption, as was mentioned earlier today on this mailing
list.
If there are security or network-hardening measures that don’t come at the
expense of growing the user base and use cases, then there is no reason not
to adopt them. The recent improvements in Tor routing are a great example
of a security improvement that in no meaningful way slows Bitcoin’s
potential growth. How does this relate to the Blocksize debate? Let’s
accept that 8 MB blocks might cause a little bit, and perhaps even a
‘medium bit’ (however that is measured), of centralization. Although the
network might be slightly more vulnerable to government attack, if millions
more people are able to join the system as a result, I’d wager the overall
security situation would be stronger, owing to the greatly decreased risk of
attack.
-Corey (CubicEarth)