The little-blockians want the block sizes of Bitcoin to remain
small, and thus for it to be a pure decentralised currency that
can be used by anyone with a computer. This would maintain it as a
peer-to-peer currency, but would limit its growth.
By comparison, the big-blockians believe Bitcoin should grow into
a universal currency, expanding the block size to accommodate
absolutely every transaction. The downside is that this is beyond
the computational limits of domestic machines, meaning
that Bitcoin could only be regulated by banks, governments, and
other large organisations, thereby moving it away from a
libertarian idyll into something more akin to a regular currency.
Oh my god, I finally understand the block size debate. I've been hearing about it forever but never understood why the size mattered so much to so many people, or why there was a debate at all.
That's only one of many aspects of the whole block size debate.
And I'm not sure there's a lot of merit to this specific aspect. From what I can tell[0], the "computational limit" consists primarily of the increased bandwidth and storage requirements of running a full node (2.8 GB per day for a block size of 20 MB).
While that's not trivial, it hardly limits adoption to institutions like banks. 2.8 GB is about 45 minutes of 1080p video, which people seem to manage even on domestic connections. At the same time, even with the current block size, very few people (less than 1%?) currently run a full node; so maybe it's not the "computational limits" stopping people?
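For concreteness, the back-of-the-envelope arithmetic (a rough sketch assuming the nominal 10-minute block interval, counting only raw block data and no relay overhead):

    # Raw block-data volume for a given block size (rough sketch).
    block_size_mb = 20
    blocks_per_day = 24 * 60 // 10            # 144 blocks, one per 10 minutes
    gb_per_day = block_size_mb * blocks_per_day / 1000
    print(gb_per_day)                         # ~2.9 GB/day of block data alone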
Well, the 2.8 GB is just for downloading the confirmed blocks themselves, say a copy of the blocks that were added ten days ago.
But the actual bandwidth significantly exceeds that. Transactions that never make it into actual blocks (like bad transactions or double spends) and blocks that end up orphaned count toward the bandwidth too.
More importantly, even if you were only sending and receiving the transactions that end up in the final longest blockchain, you'd hopefully be communicating not with one other node, as that would mostly defeat the purpose, but with tens of nodes, meaning you yourself may be seeding the new blockchain data to tens of other peers.
In fact, if you look at nodes today, the average daily bandwidth is already pretty much the 2.8 GB you mentioned for blocks 20x the current size, e.g. http://213.165.91.169/
So if we went 20x, you'd be talking about 50 GB a day, or 1,500 GB a month. And that just doesn't fly with most consumer ISPs. It's something we could certainly move towards by 2020 or 2025, though.
Now you can limit your node, of course, but that limits its utility too: too few connections means you're just receiving data and not passing it through, which actually hurts the health of the network. This node transmits only about 20% more than it receives, so it's not some kind of super node that delivers data to lots of others. There are nodes like that which already rack up 800 GB of monthly data today; this one is closer to 100 GB per month, which is already a lot just 'on the side' in countries with shitty bandwidth caps (US).

Plus, a lot of the traffic is burst traffic, so it's not as if 3 GB daily = 35 KB/s (which sounds fine) spread evenly 24/7. For example, that node has had days reaching 30 GB of traffic, and again this isn't some industrial node. You have to be able to deal with that as well. If you cut your node off when the going gets tough on days like these, it defeats the point and means the blocks are too big to handle.
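Putting rough numbers on the scaling (illustrative only; today's ~2.8 GB/day at 1 MB blocks already includes relaying unconfirmed and orphaned data and seeding peers, i.e. a large multiple of the raw ~0.14 GB/day of block data):

    # Rough node-traffic estimate at 20x blocks (illustrative numbers).
    todays_traffic_gb = 2.8                   # per day, at 1 MB blocks
    scale = 20                                # proposed block-size multiplier
    daily = todays_traffic_gb * scale         # ~56 GB/day, the ~50 GB ballpark above
    monthly = daily * 30                      # ~1,700 GB/month
    avg_rate_kb_s = daily * 1e6 / 86400       # ~650 KB/s sustained average
    print(daily, monthly, avg_rate_kb_s)      # and real traffic is bursty on top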
So I think we can go a lot bigger with nodes, but we shouldn't downplay the traffic increases too much: 20 MB blocks, for example, aren't trivial and can't just be compared to some 1080p video (although I make similar comparisons from time to time). At some point, you have to give away autonomy if you increase blocks too much. Even with 1 MB blocks today, lots of peeps run nodes on a VPS so they can offload the hassle of a home setup. But this means lots of nodes are already being run by commercial providers, which are susceptible to government influence. Now, I'm in no way anti-government, but the whole point of Bitcoin is decentralisation: gateless, disintermediated, resistant to power, etc. Removing this leaves you with a database that can be manipulated by those with power, just like the ordinary financial system we already have.
Tricky debate this!
edit: just to add for others who are interested... If the blockchain has value, people will use it, and keeping blocks small just forces transactions off-chain, where they can themselves be controlled by various gatekeepers and intermediaries. So on both sides you're losing out. A balance is necessary; everyone agrees there. Where the balance is, they don't. And it's made difficult by the need to 'get it right' right away, as changing the protocol in five years becomes far harder: every day that goes by, more actors move in, making change a political process. So simply making blocks 2 MB and saying 'let's get back to it in two years when blocks are getting full again' could actually be worse than a short delay now that forces a good permanent solution. At the same time, every single day, Bitcoin is treading uncharted waters; none of this has been done before in this particular context, so the foresight needed for a long-term solution is incredibly hard to come by and already incredibly political.
Thanks for the link; it seems the post I was going by underestimated the traffic quite a bit (though to be fair, they allude to it, and it was probably my reading that was at fault).
That said, I stand by my point that the traffic ("computational") requirements hardly limit running a node to large organisations: depending on where you live, a domestic connection might not do it -- though it very well might! -- but a cheap server hosted somewhere on the "right" side of the last mile will.
> a cheap server hosted somewhere on the "right" side of the last mile will.
I think the idea is that if a large majority of the network is on Linode, then shutting down Linode would leave the majority of whoever is left able to dictate what's canon?
Why is 2MB the perfect size? Will we run into the same problems in three years? Or thirty?
I highly doubt that expanding the block size by a factor of 2 or 4 or even 8 will change it into something normal computers cannot handle.
The only reason things don't change is that miners don't change. The control lies with miners, full stop. IIRC, they agreed to some change by signalling a flag in mined blocks. The idea was that at a certain level of consensus, the new behaviour would activate. Instead, the miners set the flag, then ignored it when the consensus threshold was passed.
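For anyone unfamiliar, that signalling mechanism works roughly like this (a sketch in the spirit of BIP 9 version bits; the window and threshold here are illustrative, not the exact deployment parameters):

    # Sketch of version-bit signalling (illustrative parameters).
    WINDOW = 2016        # blocks per evaluation window
    THRESHOLD = 0.95     # fraction of signalling blocks needed to lock in

    def locked_in(signals):
        # signals: one bool per block in the window, True if the block's
        # version field set the "I support this change" bit.
        return sum(signals) / len(signals) >= THRESHOLD

    # The failure mode described above: miners set the bit, the change
    # "locks in", but they then ignore it instead of enforcing the new
    # behaviour.
    print(locked_in([True] * 1950 + [False] * (WINDOW - 1950)))   # True: ~96.7% signalled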
I wouldn't be surprised if it was just an issue of some miners having some minor difficulty in increasing the block size, and deciding it's not worth the hassle/lost profits.
Hate to confuse you again, but it's a lot more complicated than that. Keeping the blocks small will also lead to centralization as transactions are forced off-chain. Making blocks bigger (within reason) will not increase computing, storage, or bandwidth requirements beyond what most people have sitting idle at home.
Well, there's another wrinkle to this, which is that having a small block size and lower throughput may drive transaction costs up. What I've read suggests that the current 1 MB block size works out to about 7 transactions per second, which is not great. It's also probably fair to say we're already past the point of a "free for all" Bitcoin; from what I've read, mining is controlled by a small number of entities. So that is one way to characterize the debate, but I'm not entirely sure it's completely fair.
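The 7 tx/s figure falls out of simple division (the ~250-byte average transaction size here is an assumption on my part; real averages move around):

    # Where the ~7 tx/s estimate comes from (average tx size is assumed).
    block_size_bytes = 1_000_000      # 1 MB block size limit
    avg_tx_bytes = 250                # assumed average transaction size
    block_interval_s = 600            # nominal 10-minute block interval
    tps = block_size_bytes / avg_tx_bytes / block_interval_s
    print(tps)                        # ~6.7 tx/s; Visa-scale is thousands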
Both sides think their solution is decentralized and the other is centralized.
Small blockers want limited growth that still works on domestic machines. Big blockers believe in using Bitcoin as-is; small blockers want to "spam filter" (i.e. control) some transactions, or push them onto some third-party thing like Lightning.
Nice write-up explaining the problems, but the killer for me is a hint as to how others may have been fooled:
Note that the antepenultimate line says ‘signiture’ instead
of ‘signature’, so the script doesn’t do what is claimed. In
particular, it reads the signature from the environment
variable ‘signiture’ rather than from the command-line
argument. Hence, if you populate the environment variable
with your own public-key, rather than Satoshi’s, you can
cause the test to pass!
Subtle and clever, if that is indeed what has happened.
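To spell out the trick (the original was reportedly a shell script driving OpenSSL; this Python analogue only illustrates the env-var-versus-argument slip, and verify_signature is a made-up stand-in):

    # Python analogue of the 'signiture' slip (names are illustrative).
    import os

    def verify_signature(signature, message):
        # Stand-in for the real OpenSSL check: pretend it returns True
        # iff `signature` matches `message` under the supplied key.
        return signature == "signature-made-with-our-own-key"

    message = "allegedly Sartre"
    # Intended: signature = sys.argv[1]   (taken from the command line)
    # Actual (note the misspelling): read from the environment, which the
    # demonstrator can quietly pre-populate with a signature made by a key
    # they control.
    signature = os.environ.get("signiture", "")
    print("PASS" if verify_signature(signature, message) else "FAIL")

Run normally, it prints FAIL; run with the environment variable pre-set, it prints PASS, which is exactly the behaviour described in the quoted note.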
Without getting into Sicilian poison cup arguments around "I knew they'd think that," I find it hard to believe that someone would go to all this trouble, then innocently publish their magic trick to the world in a screenshot like that.
Edit: to be clear, I'm inclined to believe Kaminsky et al that this is a ruse. I just don't think it's that ruse.
It isn't that ruse. I replicated the file that is verified in the screenshots - the ruse is that it contains an old bitcoin transaction from Satoshi rather than Sartre's writings.
What's interesting to me is how blatantly wrong Wright's post is. It seems like he went out of his way to make it wrong (typos in the actual code). If he were a scammer wanting to prove something, wouldn't he go out of his way to make sure there were at least no typos?
So what is his motive other than completely fucking up his reputation?
One freaky, and completely unfounded, hypothesis is that Wright is Satoshi, and was pressured by a government to have experts prove he was Satoshi (for tax reasons), but is willingly playing stupid publicly and releasing a blatantly foolish post to keep his anonymity and let the world think he's not Satoshi.
Since everyone is getting called out on nits today, I'm going to point out that he calls "Combinatorial Problems and Exercises" a paper, which means that (a) he's not into combinatorics, because it's an important _book_ in the field; and (b) he can't be bothered to google the thing even though he's about to rip on Wright for referencing it.
I have no idea (nor do I really care) who is right, but there's clearly epic trolling going on here.
At the same time, Wright is obviously a chimp himself. I doubt anyone I know with any combinatorics knowledge would describe it as "a process," mistake permutations for subsets, and then cite Lovasz to define binomial coefficients.
Here is another test: show your mom this video and ask her whether she thinks the guy is a genius mathematician or a con artist:
http://www.bbc.com/news/technology-36168863
I find it weird that he says he doesn't want anything from this and wants to be left alone, yet he is the one coming out as Satoshi. He could have stayed private and would have had all of what he wanted. So his claiming to be Satoshi seems to run counter to what he says he wants.
Nice writeup. I know people say not to focus on Wright's writing style, but he sure doesn't sound like someone extremely knowledgeable about bitcoin or cryptography. He sounds like he's writing an undergraduate paper. Given enough time, anyone can cobble together enough references to describe something accurately. But what he writes reads nothing like https://bitcoin.org/bitcoin.pdf
If you look past the glow of history and the greatness of the underlying idea, the Bitcoin whitepaper itself reads like an undergraduate paper. The original abstract had flagrant spelling errors, and the discussion and C implementation of the Poisson distribution will raise eyebrows among statisticians or data scientists, who would either dismiss the proof as too trivial to bother with or provide a one-liner in R.
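For context, the calculation in question is the attacker-catch-up probability from section 11 of the whitepaper, which the paper works out with a hand-rolled C loop. In a stats-friendly language it really is nearly a one-liner; a Python/scipy equivalent (parameters taken from the paper's q = 0.1 table):

    # Whitepaper section 11: probability an attacker with hashpower share q
    # ever catches up from z confirmations behind. scipy supplies the
    # Poisson pmf that the paper's C snippet computes by hand.
    from scipy.stats import poisson

    q, p, z = 0.1, 0.9, 6
    lam = z * q / p
    prob = sum(poisson.pmf(k, lam) * ((q / p) ** (z - k) if k <= z else 1.0)
               for k in range(200))
    print(prob)   # ~0.0002, matching the paper's table for q=0.1, z=6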
Good to know. I'm willing to accept that Satoshi was not a statistician or a great writer (or even a great programmer) and yet he was still able to succinctly describe a problem few people in the world even knew existed at the time.
Wright, on the other hand, spends 5 paragraphs giving a repetitive and incorrect summary of hash functions, specifically SHA-256.
I don't know how anyone in the world can look at that and think these are the same person or even colleagues.
What's interesting is that, given enough practice, one could convincingly mimic Satoshi's writing style as ancillary evidence of validity (unimpeachable technical proof aside). It seems Wright hasn't even gone that far.
Can you imagine being pulled over by a police officer, and saying:
"I have my drivers license in my pocket, but there's no reason for me to show you. I showed Steve my license 10 minutes ago, and Steve will vouch for my identity"
If you think that's fishy but don't think this is fishy... consider that in this metaphor, the person pulled over spent a great deal of his life advocating the use of physical licenses, rather than social authority, to prove identity.
Is it possible to transfer bitcoin from and to the same wallet? If so, why not do that with one satoshi's worth from one of the earliest blocks and include metadata proving the claim? Unless the private keys for the earliest mined blocks are irretrievably lost (or were deliberately destroyed), what's with all the weak hand-waving proofs?
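Moving a satoshi would work, but the simpler form of the same proof is signing a verifier-chosen message with one of the early keys. A minimal sketch using the python ecdsa package (a freshly generated key stands in for a genesis-era key; Bitcoin uses the same secp256k1 curve, and the mechanics are identical):

    # Key-possession proof sketch (pip install ecdsa).
    import ecdsa

    signing_key = ecdsa.SigningKey.generate(curve=ecdsa.SECP256k1)
    verifying_key = signing_key.get_verifying_key()   # the public half

    # The verifier picks the message, so an old signature can't be replayed.
    challenge = b"2016-05-02: prove you control the key from block 1"
    signature = signing_key.sign(challenge)

    # Anyone can check this against the long-published public key;
    # verify() raises BadSignatureError if the signature doesn't match.
    assert verifying_key.verify(signature, challenge)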
As I see it, there are two main competing philosophies for Bitcoin.
1. Digital Gold. Pseudo-anonymous value storage that can't be tampered with by third parties.
2. A payment network with transaction levels that match Visa's.
It appears the two are not compatible.
For 1 we need more nodes, and node counts, from my understanding, are on the way down.
For 2 we need to increase the block size, but this might decrease the number of people willing to run nodes.
I think we have to look at the way we use Bitcoin today; after all, the experiment has been running for a few years now. I don't know anyone who is using Bitcoin for day-to-day spending, but I do know people who are using it as an investment or for value storage.
So in my view the market has decided.
The miners have also decided: they were offered a solution with an increased block size, and they voted no.
What do you think Craig Wright gets out of this? I can't imagine he's trying to draw investor money five years after letting Bitcoin go; everyone knows that in the startup world ideas matter less than execution, and the first Bitcoin software was far from stellar. Really, what do people think his motivation could be?
> The use of a factorial to count the number of binary strings should immediately trigger alarm bells in anyone with a rudimentary undergraduate-level understanding of discrete mathematics.
Is it possible the "!" was used in an English sense, indicating an exclamation and not a factorial?
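Hard to say without the original text in front of us, but the critique reads it as a genuine factorial, and the counting fact behind the alarm bell is standard:

    % binary strings of length n: two independent choices per position
    \[ \underbrace{2 \cdot 2 \cdots 2}_{n \text{ factors}} = 2^n \]
    % n! instead counts the orderings (permutations) of n distinct items:
    \[ n! = n(n-1)(n-2)\cdots 1 \]

For n = 3, say, there are 2^3 = 8 binary strings but only 3! = 6 permutations; the two expressions count different things.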