The Bitcoin Second Layer

The Origins of the Blocksize Debate

On May 4, 2015, Gavin Andresen wrote on his blog:
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release.
I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further.
On May 5 (one day after Gavin published his first blog post), Mike Hearn published The capacity cliff on his Medium page. Two days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash.
...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that’s valid without any limit a node crash is eventually inevitable.
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature[1][2]:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
While Hearn made these predictions, Gavin continued to publish the series of blog posts he had announced. [1][2][3][4][5][6][7]
Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasnt been any discussion on this mailing list in several years as far as I can tell...
So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler).
This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo
Shortly thereafter, Corallo explained further:
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are no where near compatible with what allows the network to continue to run in a decentralized manner.
Tier Nolan considered possible extensions and modifications that might improve Gavin's proposal and argued that soft caps could be used to mitigate the dangers of a blocksize increase. Tom Harding voiced support for Gavin's proposal.
Peter Todd mentioned that a limited blocksize provides the benefit of protecting against the "perverse incentives" behind potential block withholding attacks.
Slush didn't have a strong opinion one way or the other, and neither did Eric Lombrozo, though Eric was interested in developing hard-fork best practices and wanted to:
explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.
Matt Whitlock voiced his opinion:
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Bryan Bishop strongly opposed Gavin's proposal, and offered a philosophical perspective on the matter:
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and the absence of discussion on this mailing list, that I feel perhaps there are other subtle and extremely important technical details that are completely absent from this--and other-- proposals.
Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed.
There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe.
There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100 million fold, without modifying max block size and without potentially compromising secured decentralization.
The MIT group should listen up and get to work figuring out how to measure decentralization and its security.. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
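Bishop's two-party example is, at bottom, state netting. Here is a minimal sketch of the idea in Python (purely illustrative; a real payment-channel protocol also needs signatures and dispute handling, none of which is modeled here):

    # Toy version of the aggregation example: many back-and-forth payments
    # collapse to two on-chain records, the opening state and the final state.

    def settle_off_chain(alice_start, bob_start, payments):
        """payments: signed amounts; +x pays Alice -> Bob, -x pays Bob -> Alice."""
        alice, bob = alice_start, bob_start
        for amount in payments:          # these updates never touch the chain
            alice -= amount
            bob += amount
        return (alice_start, bob_start), (alice, bob)

    # Stand-in for the 100 million round trips (kept small so it runs instantly).
    start, end = settle_off_chain(50, 50, [1, -1] * 1000)
    print(start, end)                    # 2,000 transfers, two recorded states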
Gregory Maxwell echoed and extended that perspective:
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers...
There are non-trivial number of people who hold extremes on any of these general belief patterns; Even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace.
there is a at least a two fold concern on this particular ("Long term Mining incentives") front:
One is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible.
For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity.
The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward...
tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today
the procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk. Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal". Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating
many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs. Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth. The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
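The flexible-cap idea Maxwell mentions in passing (miners buying block space above the median by mining at quadratically harder difficulty) can be sketched as a simple rule. The function below is an illustration under assumed names and constants, not the exact formula of any past proposal:

    # Hypothetical "flexcap"-style rule: blocks up to the median size pay
    # normal difficulty; larger blocks, up to a hard maximum, require a
    # quadratically harder difficulty target.

    def required_difficulty(base, block_size, median_size, hard_max):
        if block_size > hard_max:
            raise ValueError("block exceeds the hard maximum")
        if block_size <= median_size:
            return base
        overshoot = block_size / median_size
        return base * overshoot ** 2     # quadratic penalty above the median

    # A block 1.5x the median must meet a 2.25x harder target.
    print(required_difficulty(1.0, 1_500_000, 1_000_000, 2_000_000))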
Peter Todd also summarized some academic findings on the subject:
In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize.
Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
Some developers (e.g. Aaron Voisine) voiced support for Gavin's proposal, repeating Mike Hearn's "crash landing" arguments.
Pieter Wuille said:
I am - in general - in favor of increasing the size blocks...
Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".
The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, and be honest about them, and present them as a trade-off made we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.
Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas. Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is).
Ability to use a full node.
Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem.
Fees and long-term incentives.
I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large bank's settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify is also not useful, as it would be trivially out-competed by a VISA-like design. The usefulness needs in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ...
Choose wisely.
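Wuille's miner-centralization point, that larger blocks favor better-connected hash power, can be made concrete with a back-of-the-envelope model: fee revenue grows with block size, orphan risk grows with propagation time, and a better-connected miner pays less propagation time per megabyte. Every constant below is an illustrative assumption:

    import math

    def expected_revenue(size_mb, secs_per_mb, fee_per_mb=0.1, subsidy=25.0):
        propagation = size_mb * secs_per_mb              # naive linear relay model
        orphan_prob = 1 - math.exp(-propagation / 600)   # 600 s mean block interval
        return (subsidy + fee_per_mb * size_mb) * (1 - orphan_prob)

    for secs_per_mb in (2, 10):                          # well vs poorly connected
        best = max(range(1, 101), key=lambda s: expected_revenue(s, secs_per_mb))
        print(f"{secs_per_mb} s/MB -> revenue-maximizing block ~{best} MB")

In this toy model the well-connected miner maximizes revenue with blocks roughly fifty times larger than the poorly connected one, which is exactly the feedback loop Wuille warns about.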
Mike Hearn responded:
this list is not a good place for making progress or reaching decisions.
if Bitcoin continues on its current growth trends it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan, that fits in the available time window.
I no longer believe this community can reach consensus on anything protocol related.
When the money supply eventually dwindles I doubt it will be fee pressure that funds mining
What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
Peter Todd then pointed out that, contrary to Mike's claims, developer consensus had been achieved within Core plenty of times recently. Btc-drak asked Mike to "explain where the 12 months timeframe comes from?"
Jorge Timón wrote an incredibly prescient reply to Mike:
We've successfully reached consensus for several softfork proposals already. I agree with others that hardfork need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment all I'm ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gaving and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete.
Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about how it will look like then" it's "no bitcoin long term it's broken long term but that's far away in the future so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future.
this is my plan: we wait 12 months... and start having full blocks and people having to wait 2 blocks for their transactions to be confirmed some times. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems contradictory with his current efforts to avoid that from happening). Having a true fee market seems clearly an advantage. What are supposedly disastrous negative parts of this plan that make an alternative plan (ie: increasing the block size) so necessary and obvious. I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanation are missing because it should be somehow obvious how the sky will burn if we don't increase the block size soon. But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just an price estimator for a market that doesn't even really exist) would be a disaster.
Some suspected Gavin/Mike were trying to rush the hard fork for personal reasons.
Mike Hearn's response was to demand a "leader" who could unilaterally steer the Bitcoin project and make decisions unchecked:
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership
I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment...
we need to hear something like that from Wladimir, or whoever has the final say around here.
Jorge Timón responded:
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though.
it seems that some people would like to see that happening before the subsidies are low (not necessarily null), while other people are fine waiting for that but don't want to ever be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question "what's wrong with hitting the block limits in 12 months?"
Mike Hearn again asserted the need for a leader:
There must be a single decision maker for any given codebase.
Bryan Bishop attempted to explain why this did not make sense given git's distributed architecture.
Finally, Gavin announced his intent to merge the patch into Bitcoin XT to bypass the peer review he had received on the bitcoin-dev mailing list.
submitted by sound8bits to /r/Bitcoin

To the OG bitcoiners (UASF)

We had to make this thing against all odds. We were fighting (and still are fighting) one of the largest, most powerful industries on the planet: banking. It was never going to be easy and we were here because we could handle disruption and inconvenience to make a better world in the long run.
Of course this never made sense to any rational person who had the luxury of using a fairly stable currency like the dollar or the euro, but we didn't care about that. We knew the benefits that bitcoin promised were long term and that - for now - the on and off ramps to bitcoin made it the hobby of rich kids. But that didn't mean that we couldn't make something the entire world could use.
We had a strong ideology - one that put massive emphasis on agency and explicitly rejected the notion that money supply, inflation, and transfer should have anything to do with central authorities. We understood that separating these things was perhaps as important as separating church and state.
Uncensorable, instant transfer of wealth over IP, from users who had saved with confidence, not having to pay unpredictable stealth taxes that would slowly rob them of what they had worked for. Instead they would be rewarded for their faith and respect for their own agency.
It would teach responsibility as you were now your own bank. The cost society has to pay collectively to these institutions who keep our money (sometimes) safe for us would soon be unnecessary, and the involvement of third parties when you communicated your understanding of the value of something to another was no longer anyone's business except you, and the person on the other end of the exchange. Because a truly fluid medium now existed.
Would it succeed? Certainly by some metrics.
The most obvious to use would be the price.
It would serve as an easy indicator that all was well with the software, the community and the project as a whole. However this was not a proper way to gauge the health of the technology. More subtle observers and participants were interested in how competent the devs were, how high the difficulty was, how many nodes there were and most importantly, how distributed the hashrate and nodes were. Without these qualities we would begin our death spiral, even if the price continued to fail to reflect this.
The hashrate continued to grow, but there was a problem. It all seemed to be in one place: China. This seemed to be due to the subsidizing of electricity by the Chinese authorities (hmmm, weren't those meant to be kept separate from this ecosystem?).
Those that understood how to assess a cryptocurrency's health stated as a matter of fact that this was not a healthy scenario.
The Bitcoiner community had become diluted and had collectively lost sight of what bitcoin is, and what paypal, visa and SEPA are not. A very simple rule was slowly being forgotten amidst wave after wave of euphoria as the gold rush continued and bubble after bubble would elate us all.
The understanding of what made bitcoin resilient and anti-fragile seemed to be irrelevant: we were - by far - still the world's best investment. Consistently. Year after year (with the one exception of 2014) we would leave everyone else in the dust.
The empty phrase "the technology behind bitcoin" was thrown around, again and again, by those who didn't have a clue.
If we ever knew any better, we might not any more.
Either you have a healthy, diverse blockchain maintained by as diverse a group as possible, or you have the world's most inefficient public database.
We are well on our way to becoming the latter. It should not be possible to name the person responsible for almost half of the new blocks added to our blockchain every day. But it is.
This was an unacceptable development in its own right, but to compound the issue, this person has incentives that directly oppose those of the users.
Users like myself had told hundreds of people about bitcoin for years, and we were now complete hypocrites: we had explained that this miracle technology had solved the double-spend problem, when in fact General A and General B now increasingly had to rely on the reliability of one millionaire to allow them to communicate.
Those with malicious intent flooded the space and chipped away at bitcoin's principles stating that the risk of becoming PayPal 2.0 was a risk worth taking, not realising that when you cut out the principles on which you build something, the existence of the whole thing will be put in jeopardy.
Eventually a false narrative became entrenched. Everyone wanted bitcoin to scale but apparently they just disagreed about how to do it.
However, this was in fact a conflict between those who understood why bitcoin works and those who didn't - or did, but didn't care.
This was framed as a "debate" (a laughable bastardization of the word) between those who would scale bitcoin by making blocks larger (thus chipping away yet another hurdle for the miners who wanted even greater control over the blockchain) and those who would wait for a better solution.
The former was represented by those who were either flippant, misguided or malevolent with regard to the health of the network. The latter however understood that this would spell disaster.
SegWit came along, which promised to scale bitcoin without making the bad situation worse. It is hard to find words to explain how flawless SegWit is; suffice it to say that malevolent parties have given up trying to discredit it and instead now only agree to allow it into bitcoin should the users swallow a poisoned pill at the same time.
Those who wanted SegWit the least were initially given power of veto. A painfully naive approach which led to years of frustration for the users of bitcoin. Transactions were getting more expensive, and those who would bring others into the ecosystem could no longer talk about its utility for the world's poorest. What was once a hobby for egalitarian rich kids was now crippled by ignorant gold rushers.
Inevitably some in the community would attempt to fight back. A loosely affiliated group of highly principled users who were sick of being denied the bitcoin we all knew was possible started a grass roots movement - UASF.
This would force the miners to stop their blockade. What was being framed as a debate was in fact a battle for dominance: either the principles that made bitcoin the success story it was would win, or the eventual surrender to central control would.
There are many proposals on the table but in terms of power they all boil down to the same thing: Can those who represent the larger, more diverse side of bitcoin win over the oligarchy that has emerged?
If they can the UASF will be successful.
If they can't, it won't.
For this reason I hope that on 2017-08-01 at 00:00, bitcoin turns on SegWit without a hitch. If it doesn't, then my continued faith in this community will no longer be justified.
TL;DR: Forcing the miners to activate SegWit is the same battle that brought us to bitcoin in the first place.
submitted by violencequalsbad to /r/Bitcoin

Stability of the difficulty. A weakness you may not be aware of.

EDIT: Here's a TL;DR
Sorry if I rambled on a bit there. I'll try to make my point a bit more concise here.
TL;DR: If bitcoin starts to gain mainstream success, eventually a large percentage of the miners will be in it for the profit and not for the good of bitcoin. My fear is that a crash in bitcoin's value after achieving general acceptance as a form of payment could cause a crash in hash rate from miners shutting down when the boss sees a drop in profitability. A crash in hash rate could possibly destroy bitcoin's usefulness as a form of exchange, driving the price down further, killing the currency. In my post I describe a possible solution to make scaling down the difficulty to match a sudden drop in hash rate smoother. If this interests you, then please read my post and share your thoughts. I feel like this is an obvious flaw in the protocol that was overlooked. Am I wrong? If so, please tell me, because I'd like to have my worries eased.
ASIC miners and the rapid increase in difficulty have created a new point of failure in the bitcoin network that was never that large of a problem before. No, I'm not talking about the increased centralisation of mining that you see so many people complaining about. I'm talking about the rising difficulty itself and the way the network scales the difficulty. It allows for smooth transitions in the upward direction, but a sharp decline in hash rate could kill bitcoin completely. The readjusting of the difficulty doesn't really happen every two weeks; it's really every 2,016 blocks, which should be about two weeks in theory. If we were to lose a large chunk of the hash rate, let's say 66%, block discovery time would increase to 30 minutes per block. Maybe that's not a huge deal, but it definitely impacts the convenience and usability of bitcoin as a means of exchange and would certainly impact the price negatively. Also, it's possible that we would be stuck at this slow confirmation rate for 6 weeks in this example.

Perhaps it is an unlikely scenario that such a large amount of hash rate is lost within a short span, but think about it this way. Imagine a point in the future where ASIC miners account for over 98% of the hash rate. Maybe bitcoin is well on its way into mainstream acceptance in this future. There may be large corporations that own large mining farms. These corporations may be publicly traded companies with a responsibility to maximize profits for their shareholders. If the value of bitcoin were to crash, there's a good chance that some of these corporations would shut down their mining operations because mining no longer has a good ROI in the opinion of their CEO. Maybe the next reward halving is coming up soon; that could also cause miners to shut down. This would slow down transaction confirmations, impacting bitcoin's usefulness as a means of exchange and driving the price down even lower, driving even more miners to shut down. This would continue exponentially until the only remaining hashing power is us: the early adopters, true believers, and ASIC manufacturers. We could very easily end up in a situation with hour-long confirmation times or longer, and the next difficulty adjustment months away. That would essentially kill off bitcoin completely.

It's not logical to assume there will never be a large drop in hash rate between now and 2140; it's not possible to predict political events that far into the future. Maybe World War 3 happens and China decides to unplug their whole country from the outside internet. Maybe Butterfly Labs successfully ships out 1,000,000-terahash miners that run on sunshine dust and unicorn farts, they quickly become the de facto standard, and then it's discovered that they are a fire hazard. Many people are killed, homes are lost, and people just start turning them off out of fear. You just don't know what will happen that far into the future.
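The arithmetic in that scenario is easy to check. A minimal sketch, assuming the drop lands right after an adjustment (the worst case for the 2,016-block window):

    def after_hashrate_drop(loss_fraction, target_minutes=10, window=2016):
        remaining = 1 - loss_fraction
        block_time = target_minutes / remaining          # minutes per block
        days_to_retarget = block_time * window / (60 * 24)
        return block_time, days_to_retarget

    bt, days = after_hashrate_drop(0.66)
    print(f"66% loss: ~{bt:.0f}-minute blocks, ~{days:.0f} days to retarget")

This prints roughly 29-minute blocks and about 41 days to the next adjustment, matching the 30-minute blocks and six-week wait described above.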
I have an idea for how the protocol could be modified to protect against this sort of scenario and allow for a sharp decrease in hash rate without losing its usefulness as a means of exchange. There should be an emergency mode that drastically cuts the difficulty. This emergency mode can be requested by any node in the network but will only occur if the network has consensus from the nodes. The request could be triggered by two possible events. The first trigger event should be if no blocks are discovered for a certain span of time, let's say an hour and a half. The second event should be if a certain number of consecutive blocks take over 25 minutes to discover. The reason for using 25 minutes as the trigger is that it would require a loss of roughly 60% of the hash rate, so in theory a single person couldn't trigger it without having over 50%, and in that scenario we have worse problems than the falling hash rate.

Once the event is triggered, the difficulty should be slashed by a factor of 10x and the block reward reduced to 0. If it's impossible to have no-reward blocks, then use a dust amount of bitcoin, like a few satoshis. If the event was triggered by 25-minute confirmation times, then the new emergency block discovery time will be 2.5 minutes. This will have the immediate effect of speeding up transactions that have possibly been waiting a while for confirmations. The rapid block time will have the secondary effect of quickly and more accurately calculating the new hash rate of the network. (More data points over a shorter spread of time.) The removal of the block reward serves to disincentivize miners with lots of hashing power from trying to trigger the event on purpose so they could make more bitcoins at a lower difficulty, and it also prevents the creation of a large amount of bitcoins while the network is moving through an unknown block discovery rate.

This emergency event should not last long, because miners would just shut down even faster with no block reward. Let's say it only lasts 96 blocks. Assuming 2.5-minute confirmation times, 96 blocks would take 4 hours to mine, but that number could vary depending on how much hash rate was lost. Let's put this in perspective: a sudden loss of 60% of the hash rate would result in a loss of 4 hours of block rewards; a loss of 90% would be 16 hours with no block rewards. After 96 blocks have been mined, the network can estimate the new hash rate based on how quickly those blocks were found, and generate a new difficulty to resume normal 10-minute confirmation times and normal block rewards. However, such a short span of time may not be enough to generate an accurate difficulty for the network, so the network should recalculate the difficulty again after 432 blocks (3 days), and then resume the normal 2-week schedule.

Each time an emergency event is triggered, the next scheduled halving of the block reward should be moved back by 96 blocks. That way these events have no effect on the mathematical total of bitcoins that can possibly be mined by 2140.
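Here is a simulation-style sketch of the proposed emergency mode. The triggers and constants follow the post; the toy difficulty units, the three-block slow streak, and the mine_blocks simulator are illustrative assumptions, not a real implementation:

    import random

    EMERGENCY_FACTOR = 10     # slash difficulty 10x
    EMERGENCY_BLOCKS = 96     # mined with no (or dust) reward

    def mine_blocks(n, difficulty, hashrate):
        """Toy model: difficulty 1.0 at hashrate 1.0 gives 10-minute blocks."""
        mean_minutes = 10 * difficulty / hashrate
        return [random.expovariate(1 / mean_minutes) for _ in range(n)]

    def should_trigger(last_block_times, minutes_since_last):
        # No block for 90 minutes, or several consecutive blocks over 25
        # minutes ("a certain number" in the post; three is assumed here).
        slow_streak = len(last_block_times) >= 3 and \
            all(t > 25 for t in last_block_times[-3:])
        return minutes_since_last >= 90 or slow_streak

    def run_emergency(difficulty, hashrate):
        emergency_diff = difficulty / EMERGENCY_FACTOR
        times = mine_blocks(EMERGENCY_BLOCKS, emergency_diff, hashrate)
        observed = sum(times) / len(times)      # minutes per emergency block
        return emergency_diff * 10 / observed   # retarget to 10-minute blocks

    print(should_trigger([30, 28, 26], minutes_since_last=5))  # True: slow streak
    # After a 60% hash-rate crash (1.0 -> 0.4), emergency blocks arrive about
    # every 2.5 minutes and the recovered difficulty settles near 0.4,
    # restoring 10-minute blocks in these toy units.
    print(round(run_emergency(difficulty=1.0, hashrate=0.4), 2))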
Even if these events are very unlikely and this scenario may never play out, having fail-safes in place for huge losses in hash rate would add security to the value of bitcoin. It could only add to bitcoin's strength, giving it the robustness it needs to survive a crash in price in a world where miners are mostly interested in profits and ROI, not the good of bitcoin.
EDIT: Here's another TL;DR of my proposed solution
Basically, the nodes can request an emergency drop in difficulty if they notice block confirmation times rising above 25 minutes, which would require a loss of 60% of the hash rate. If the nodes reach consensus on this request, the difficulty gets slashed by a factor of 10 and the network mines 96 blocks with no block reward. That will take 4 hours at a 60% hash rate loss and 16 hours at a 90% hash rate loss. After the 96 blocks are mined, the network calculates the appropriate difficulty based on the rate at which those blocks were discovered, and normal block confirmations and block rewards resume. What makes it nice is that it will only be triggered in extreme cases. Also, mining for 4 to 16 hours for the good of bitcoin is an easier pill to swallow than mining indefinitely into a possibly dying system for the good of bitcoin.
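As a rough illustration of the proposal (a sketch, not a patch to any real client): the 25-minute and 90-minute triggers, the 10x cut, and the 96-block zero-reward window come from the post, while the function names and the number of consecutive slow blocks are assumptions.
```python
# Minimal sketch of the emergency-mode logic described in the post.
# Constants come from the post; the surrounding machinery is assumed.

TARGET_SECS = 600             # normal 10-minute block target
TRIGGER_SECS = 25 * 60        # consecutive slow-block trigger
STALL_SECS = 90 * 60          # no-block-at-all trigger
EMERGENCY_BLOCKS = 96         # length of the zero-reward window
DIFFICULTY_CUT = 10           # emergency difficulty divisor

def should_trigger(recent_intervals, secs_since_last_block, consecutive=3):
    """True if either trigger event from the post has occurred.
    The post leaves the number of consecutive slow blocks open,
    so `consecutive` is an assumed parameter."""
    if secs_since_last_block >= STALL_SECS:
        return True
    recent = recent_intervals[-consecutive:]
    return len(recent) == consecutive and all(t >= TRIGGER_SECS for t in recent)

def emergency_difficulty(difficulty):
    """Difficulty during the 96-block zero-reward window: slashed 10x."""
    return difficulty / DIFFICULTY_CUT

def post_emergency_difficulty(old_difficulty, emergency_window_secs):
    """Re-estimate difficulty from how fast the 96 emergency blocks
    actually arrived, so normal 10-minute blocks resume afterwards."""
    observed = emergency_window_secs / EMERGENCY_BLOCKS
    return (old_difficulty / DIFFICULTY_CUT) * (TARGET_SECS / observed)
```
With a 60% hash-rate loss, the emergency blocks arrive every 150 seconds, so `post_emergency_difficulty` returns 0.4x the old difficulty, exactly the share of hash rate still online.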
submitted by testing1567 to Bitcoin [link] [comments]

Battlecoin [BCX]: a new (and ambitious) game changer in the world of cryptocurrency - Interview with JackofAll

"When it gets too hard for mortals to generate 50BTC, new users could get some coins to play with right away." - Satoshi Nakamoto, 2010
Time moves fast when one talks about IT, but when talking about cryptocurrencies, it moves faster still.
How much time has passed from the mythic characters of the early cypherpunk era – John Gilmore, Eric Hughes and Tim May – through the early days of digital currency with Wei Dai, Satoshi Nakamoto (real or not, it doesn't matter) and others, past the current Bitcoin and cryptocurrency whales – now experts or well-established entrepreneurs – down to us, the Fourth Generation?
Twenty-two years.
However, if we consider that the real deal only began in 2009 with Satoshi's Genesis Block, then the perspective on the quantum leaps we are making becomes clearer.
2009 - Satoshi Nakamoto deploys the tools and the strategy for people to regain control
2011 - FPGA Bitcoin miners appear; regular GPU miners start to struggle more and more for their bitcoins
2013 - ASIC miners enter the market and an official Bitcoin mining industry arises, pushing small miners – regular people – completely out of the Bitcoin mines and forcing them to mine – and to create – many different altcoins with many different purposes
2014 - Now, five years after Bitcoin's golden dawn and almost four years after the white ninja Satoshi's disappearance, another cryptowarrior arises on the battlefield to give (hash) power back to the people.
Was Satoshi foreseeing what was about to come? This time, however, the newcomer is not so philanthropic…
Meet JackofAll, Head Developer of Battlecoin [BCX].
Andre Torres: Jack, are you there?
JackofAll: Yes.
AT: Thanks for talking to Cryptonerd.co and also to Criptonauta.net (in Portuguese and Spanish). Shall we begin?
JoA: Yes.
AT: With ASIC TH/s mining hardware invading the Bitcoin mining pools as FPGAs once did - and are about to do again, this time in the Litecoin mines - ordinary people are having more and more trouble mining the big cryptocurrencies and seeing any ROI.
The cryptomines, crowded with dead and injured miners, have become - literally - a battlefield.
So my question is: why the name Battlecoin?
JoA: Battlecoin is by design, of course, just like everything that we do. And yes, it is for the reasons that you might think. It has become a "Battle" out there to get coins made, to get coins listed, and just to be able to mine against the early guys that have all the hardware. So I came up with the name Battlecoin to solidify what we are doing. We are battling for hashpower, and BCX will give anyone (with enough battlecoins) the opportunity to control an amount of hashpower comparable to what the elite crypto whales have.
We also have a little controversy that surrounds us, as some people would say that we are waging war on altcoins by the nature of what we are doing. People will battle it out to keep control of our hashpower. It could possibly lock up some of the weaker coins... for instance, if we are paid to switch to a coin with low difficulty, or to one that has low network support on average, and we are paid to drive the difficulty up.
There are several things that could happen to some coins... bad things; they might not react how their developers intended. We might fork coins, or even lock coins up when our pool stops mining. So not everyone will like us, and here again is another battle: trying to walk a public-relations line. In short, I would say that Battlecoin represents all of the battles we have gone through and the many more we have in front of us.
We also have other app ideas that would complement the Battlecoin Brand.
AT: Nice :)
Do you watch anime or read manga - Naruto, more specifically? I mean, the Battlecoin project is like an army that will do more than just fight - it will direct the course of the wars by influencing or disrupting the market, at its own will or under contract?
JoA: Actually, I do not, as my schedule does not permit such luxury... but my sister does, and as a matter of fact she is a very good cartoon artist and most of her subjects are anime. Any relation to a specific object or character is purely coincidental.
I will check it out now though =)
AT: I made the connection to Naruto because there is an organization in it called Akatsuki, and when you answered my first question, it immediately reminded me of them.
JoA: Nice.
AT: However, you did not reply to my earlier comment... did I get the idea right... or not?
JoA: Ask the question again and I will try to sum it up… (Jack recalls the question) "The Battlecoin project is like an army that will do more than just fight - it will direct the course of the wars by influencing or disrupting the market, at its own will or under contract?"
Yes, exactly. It will do all of the above, and it will have a big influence on the market.
AT: I was just thinking about that.
JoA: Yes that is why such a controversy. There are people that don’t want to see this happen but I feel it is part of altcoin evolution.
AT: Indeed. It's like a powerful new ninja/warrior coming into the game, the likes of which we haven't seen since Satoshi's era.
JoA: Yes... You Get it.
AT: What about competitors? Is there anyone on the same level?
JoA: No, not really. We are the only ones I know of taking the multipool concept to this level. Others have the hashpower, but they do not let the public decide where to put it; they just follow an algorithm that directs it to the "most profitable" coin. We will offer that, plus add some human power to the equation. As far as I know, we are the only ones working on a voting system that controls it.
Giving it that human element that other concepts lack.
AT: But one swallow doesn't make a summer... You do have some other strong companions, don't you? Also, you speak as "we".
Are there other generals in the Battlecoin army? Who are they?
JoA: Well, it was originally my concept, but I could not embark on this project alone. I have one partner who is above board, Mr. Big. We kind of met through a mutual acquaintance and formed a solid partnership. Mr. Big has several projects that I am not sure I am at liberty to discuss, but I do know one of them will be to provide hosting services, and as I said before, we are working on some application ideas that are still in concept.
I also have a private backer who would like to remain nameless. In addition, I work with a few consultants, and of course I consider them part of the team too.
Our team is growing daily... and you have to remember this is a project that involves the community, so in my eyes they are part of US too.
AT: Yes... otherwise the whole strategy might fall apart, since the project will require a LOT of hashpower...
JoA: I am hoping camaraderie will develop and rivalries will form over this concept. I want people to be talking in war rooms about what coin they want to hit... strategy for pump-and-dump coins, etc.
Yes, it will require a lot of hashpower, and I hope that people will want to give us that hashpower, because they will get paid top $$ for it.
We won't be making revenue from the battlecoins that get spent... That money will be split among the miners in our pool as a subsidy, to make sure they continue to make as much as or more than they could mining anywhere else.
AT: Now that strategy has been covered and we are moving onto the battlefield itself: when will the battles begin?
JoA: I cannot confirm a release date for Phase 3, which includes the "arena", but Phase 1 will be open to the public this Friday, 9 minutes after 9 pm. The wallets will be linked on our website first, then we will post on BCT, and then we will have a big giveaway starting shortly after to kick it all off.
I will also provide a mirror on Google Drive. We should have a block crawler and a faucet too, if all goes well.
AT: That sounds great. So, during Phase 1, you will gather the ranks that will battle when Phase 3 starts... What does Phase 2 consist of?
JoA: Phase 2 is where we determine the market value of a Battlecoin. It will need to be listed on an exchange to determine the coin's fair market value. We were originally going to dictate the price on our own exchange, but to keep with the nature of crypto, we feel it is best to let the free market decide. We have been in touch with a couple of exchanges that have interest in our idea, since ours is one that would form a close relationship with, and provide an elevated amount of trade volume to, the exchange that carries our brand.
AT: Does Battlecoin already have a preferred exchange? Or has some exchange perhaps already expressed interest in trading BCX exclusively?
JoA: I wish to decline to answer as negotiations are still going on.
AT: A wise decision. (laughs)
AT: Now, from the battlefield to the weapons of combat... could you talk a bit about the mechanics of Battlecoin?
JoA: It is pretty straightforward. We did a small premine to make sure we had enough coin for the third phase, and we are doing a small bonus-block mine in the beginning to give all of our supporters plenty of Battlecoin to play with for the Phase 3 opening. From there it is a solid 50 coins per block, with a block found every 2 minutes.
Difficulty adjusts every block with a 10-block lookback. I think this will make for a very smoothly operating coin, providing plenty of coin to the market for the use of our services.
There will also be a proof-of-stake reward of 1% every 10 days, with a maturity of 20 days. This rewards users for holding our coin, so they will have plenty to use when the time comes.
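The interview doesn't show Battlecoin's actual retargeting code, but a per-block adjustment with a 10-block lookback of the kind Jack describes might look roughly like this minimal sketch; the function names and the clamp on per-block swings (a common safety measure in altcoins) are assumptions:
```python
# Sketch of a per-block difficulty retarget with a 10-block lookback;
# names and clamp bounds are assumed, not taken from Battlecoin's code.

TARGET_SECS = 120    # Battlecoin's 2-minute block target
LOOKBACK = 10        # number of recent block intervals to average

def next_difficulty(difficulty, block_timestamps):
    """Scale difficulty so the average of the last 10 block intervals
    moves back toward the 2-minute target."""
    intervals = [b - a for a, b in zip(block_timestamps, block_timestamps[1:])]
    recent = intervals[-LOOKBACK:]
    if not recent:
        return difficulty  # not enough history yet; keep current difficulty
    avg = sum(recent) / len(recent)
    # Clamp per-block swings to avoid wild oscillation (assumed bounds)
    ratio = max(0.25, min(4.0, TARGET_SECS / avg))
    return difficulty * ratio
```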
AT: As we finish this interview, any other comments you might like to add?
JoA: I think we have covered quite a bit and we have much more to come in the future. I appreciate all of your time and hard work!
AT: Me too. I am very glad we had this talk, and that I had the opportunity to speak beforehand with the mastermind of a project that can be a huge game changer in the cryptocurrency world.
JoA: Yes, it is nice to be able to talk directly to the people who make it happen. I wish I'd had that back in the day... lol. The advantages and tools that newcomers have…
AT: Let us make new days :)
This interview was conducted on 01-07-13 in the #CryptoNerd mIRC channel. Portuguese and Spanish versions are available on criptonauta.net.
BCX refers to BattleCoinEXchange. It is not related in any form to BitcoinEXpress.
submitted by criptonauta to CryptoCurrency [link] [comments]
